I have some forum software that needs updating and I've tried to back up the large database using SSH. I've done it several times before and had no problems. This time it said that the "file size limit was exceeded". What is the limit and what can I do?
There is a per-file limit in place for security, DoS mitigation, and resource reasons:
Soft File Limit: 300MB
Hard File Limit: 400MB
As you are running into the limit due to the size of your database, you will need to proceed in a slightly different fashion.
There are a few ways to solve this:
If you can use Solution #1, we would much prefer that, as it cuts out the intermediate file creation on the servers.
1. Dump your database to a file, which can be downloaded via FTP later, by issuing the following command:
mysqldump -q -uxmysqlusername -p -hmysql.example.com xdatabase | gzip -9c > filename.txt.gz
The backup file filename.txt.gz will then be in the current directory and can be downloaded to your computer via FTP.
To restore a backup made with Solution #1:
zcat filename.txt.gz | mysql -q -uxmysqlusername -p -hmysql.example.com xdatabase
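Before pulling the dump down via FTP, it's worth confirming it actually came in under the 300MB soft limit. A quick sketch (using a tiny stand-in file here in place of the real dump):

```shell
# Stand-in for the real dump; replace with your actual backup file
echo "SQL dump contents" | gzip -9c > filename.txt.gz

# Show the file size in human-readable form
ls -lh filename.txt.gz

# find prints the name only if the file is larger than 300MB
if [ -z "$(find . -maxdepth 1 -name filename.txt.gz -size +300M)" ]; then
    echo "within the soft limit"
else
    echo "over the soft limit"
fi
```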
2. Use a creative workaround, such as splitting the stream on the fly. Example:
mysqldump -q -uxmysqlusername -p -hmysql.example.com xdatabase | gzip -9c | split -b 250m - filename.txt.gz-
That will create multiple pieces of the compressed dump, which can be used later to restore via:
cat filename.txt.gz-* | zcat | mysql -uxmysqlusername -p -hmysql.example.com xdatabase
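The split pieces are just consecutive segments of one gzip stream, so the round trip can be sanity-checked without touching the database at all. A sketch using locally generated data in place of the mysqldump output:

```shell
# Stand-in for the dump stream: 1MB of random (incompressible) data
head -c 1048576 /dev/urandom > dump.sql

# Compress and split into pieces (small pieces here, just for the demo)
gzip -9c dump.sql | split -b 200k - dump.sql.gz-

# Reassemble in order, decompress, and compare against the original
cat dump.sql.gz-* | zcat > restored.sql
cmp dump.sql restored.sql && echo "round trip OK"
```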
Solution #2 will handle enormous files with ease...
3. Only back up a subset of the tables, into multiple archives... You can read more in the mysqldump documentation.
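As a sketch of the subset-of-tables approach, assuming your forum has tables named `users` and `posts` (hypothetical names; substitute your own, which you can list with `SHOW TABLES`), the per-table dumps would look like:

```shell
# Hypothetical table names; each archive stays small as long as
# no single table's dump exceeds the limit
mysqldump -q -uxmysqlusername -p -hmysql.example.com xdatabase users | gzip -9c > users.sql.gz
mysqldump -q -uxmysqlusername -p -hmysql.example.com xdatabase posts | gzip -9c > posts.sql.gz
```

Each archive can then be restored individually with zcat piped into mysql, just as in Solution #1.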
There are also plenty of tools available to work directly with gzip files, such as zcat, zless, zgrep, etc....
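For example, a compressed dump can be inspected or searched without ever writing the uncompressed SQL to disk. A small sketch with a stand-in file:

```shell
# Stand-in compressed dump (two lines of SQL)
printf 'CREATE TABLE users (id INT);\nINSERT INTO users VALUES (1);\n' | gzip -9c > sample.sql.gz

# Stream the contents without decompressing to disk
zcat sample.sql.gz

# Search inside the archive directly; prints the number of matching lines
zgrep -c 'CREATE TABLE' sample.sql.gz
```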
Hopefully the above will provide you with a solution to work within the confines of the protective server limit...