How can I sqldump a huge database?

Posted by meder on Stack Overflow
Published on 2011-03-12T08:07:31Z

Filed under: php | mysql

SELECT COUNT(*) FROM table gives me 3,296,869 rows.

The table has only 4 columns and stores dropped domains. I tried to dump the SQL with:

    $backupFile = $dbname . date("Y-m-d-H-i-s") . '.gz';
    $command = "mysqldump --opt -h $dbhost -u $dbuser -p $dbpass $dbname | gzip > $backupFile";
    system($command);

However, this just produces a 20 KB gzipped file with no data in it. My client is on shared hosting, so the server specs and resource limits aren't top of the line.
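One thing I'm now suspicious of is the space after -p: apparently mysqldump then prompts for a password interactively and treats $dbpass as the database name, which would explain an empty dump. Here's a sketch of what I plan to try next, with --password=, shell escaping, and an exit-code check added (same variables as above):

    // Sketch of the next attempt: --password= avoids the -p/space pitfall,
    // escapeshellarg() guards against shell metacharacters, and the exit
    // code is checked (note: in a pipeline it reflects gzip, not mysqldump).
    $backupFile = $dbname . date("Y-m-d-H-i-s") . '.sql.gz';
    $command = sprintf(
        'mysqldump --opt -h %s -u %s --password=%s %s | gzip > %s',
        escapeshellarg($dbhost),
        escapeshellarg($dbuser),
        escapeshellarg($dbpass),
        escapeshellarg($dbname),
        escapeshellarg($backupFile)
    );
    system($command, $exitCode);
    if ($exitCode !== 0) {
        echo "backup command failed with exit code $exitCode";
    }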

I'm not even given SSH access or direct access to the database, so I have to run queries through PHP scripts that I upload via FTP (again, SFTP isn't an option).

Is there some way I can perhaps sequentially download portions of it, or pass an argument to mysqldump that will optimize it?
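The closest I can think of to "sequential portions" is paging through the table by primary key from a PHP script and streaming gzipped CSV. A rough sketch, assuming PDO is enabled on the host and the table has an auto-increment id column ("dropped_domains" and "id" are placeholder names):

    // Rough sketch: export the table in keyed chunks so no single query
    // has to hold millions of rows. Assumes PDO and an auto-increment id;
    // "dropped_domains" and "id" are placeholder names.
    $pdo = new PDO("mysql:host=$dbhost;dbname=$dbname", $dbuser, $dbpass);
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $out    = gzopen('dropped_domains.csv.gz', 'wb9');
    $lastId = 0;
    $chunk  = 10000; // rows per query; tune to the host's memory limit

    do {
        // WHERE id > ? pagination stays fast where large OFFSETs get slow.
        $stmt = $pdo->prepare(
            "SELECT * FROM dropped_domains WHERE id > ? ORDER BY id LIMIT $chunk"
        );
        $stmt->execute(array($lastId));
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

        foreach ($rows as $row) {
            // No CSV escaping here; fine for plain domain strings.
            gzwrite($out, implode(',', $row) . "\n");
            $lastId = $row['id'];
        }
    } while (count($rows) === $chunk);

    gzclose($out);

Writing one file per chunk instead would let me pull the pieces down over FTP one at a time.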

I came across http://jeremy.zawodny.com/blog/archives/000690.html, which mentions the -q (--quick) flag, but trying it didn't seem to change anything; as far as I can tell, --opt already implies --quick, which would explain why.
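The other flag that looks relevant is --where, which should let the dump itself be split into id ranges so each piece stays small. A sketch along the lines of the script above (the range size is arbitrary, and it assumes ids are roughly contiguous):

    // Sketch: dump the table in id ranges via mysqldump --where, producing
    // several smaller gzipped files instead of one huge one. Assumes the
    // same placeholder auto-increment id; with gaps in the id sequence,
    // the upper bound would need to be extended past the row count.
    $step = 500000;
    for ($start = 0; $start < 3296869; $start += $step) {
        $end  = $start + $step;
        $file = sprintf('%s-part-%07d.sql.gz', $dbname, $start);
        $cmd  = sprintf(
            'mysqldump --opt -h %s -u %s --password=%s --where=%s %s dropped_domains | gzip > %s',
            escapeshellarg($dbhost),
            escapeshellarg($dbuser),
            escapeshellarg($dbpass),
            escapeshellarg("id > $start AND id <= $end"),
            escapeshellarg($dbname),
            escapeshellarg($file)
        );
        system($cmd, $exitCode);
        if ($exitCode !== 0) {
            break; // stop on the first failed piece
        }
    }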

