Linux backup using CRON to local directory

As many have pointed out, I am on a backup and disaster recovery kick lately. Some would say it is about time; others are simply glad to see that data is now being backed up. I have found that it is easiest to zip up files on the local machine before moving them to their final destination. So let’s get started:

I have multiple Linux servers, each hosting many websites as well as databases. So I created a script that simply tars the files, then gzips them with the date in the filename for archiving.

Here is the file, named ‘backupall.sh’, which I save somewhere reachable by the user that will run the cron job:

#!/bin/sh
date
echo "############### Backing up files on the system... ###############"

# Date-stamped archive name, e.g. server_file_backup_2007-05-14
backupfilename=server_file_backup_`date '+%Y-%m-%d'`

echo "----- First do the sql by deleting the old file and dumping the current data -----"
rm -f /tmp/backup.sql
mysqldump --user=mysqluser --password=password --all-databases --add-drop-table > /tmp/backup.sql

echo "----- Now tar, then zip up all files to be saved -----"
tar cvf /directory/to/store/file/${backupfilename}.tar /home/* /var/www/html/* /usr/local/svn/* /etc/php.ini /etc/httpd/conf/httpd.conf /tmp/backup.sql /var/trac/*
# gzip replaces the .tar with a .tar.gz, so there is nothing left to rm afterwards
gzip /directory/to/store/file/${backupfilename}.tar
chmod 666 /directory/to/store/file/${backupfilename}.tar.gz

echo "############### Completed backing up system... ###############"
date
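
To actually schedule it, make the script executable and then edit the crontab of the user that should run it (the paths here are placeholders for illustration):

# chmod +x /home/backupuser/backupall.sh
# crontab -e

Then add a line like this to run the backup every night at 2 a.m. (again, the time and log file are just examples) and capture the output for troubleshooting:

0 2 * * * /home/backupuser/backupall.sh >> /tmp/backupall.log 2>&1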


mysqldump export and MySQL import

To export data as a backup you can’t beat mysqldump.

# mysqldump -u username -p database > database_2007-05-14.sql
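
If the dump is large, you can compress it on the fly by piping through gzip (the filename is just an example):

# mysqldump -u username -p database | gzip > database_2007-05-14.sql.gz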

Then to restore a database you would use:

# mysql -u username -p database < database_2007-05-14.sql
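
And if you compressed the dump as above, you can feed it straight back in without unzipping it to disk first:

# gunzip < database_2007-05-14.sql.gz | mysql -u username -p database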

It doesn’t get any easier than that!