Backing Up Your Website and Database

Getting the files
Now that the database is backed up, what’s the best way to pull the files down from the hosting provider?

There’s another tool called rsync which can synchronize a set of files between two machines over a network. It’s quite easy to use. Here’s the command that I use on my local machine at home to grab the files:

rsync -avz --delete -e ssh <username>@<hostname>:~/backups ~/localbackups/

Just replace <username> and <hostname> with the proper values. One thing to point out is that I use the --delete parameter. It tells rsync that if a file no longer exists on the remote machine, it should be deleted from the local machine as well. So, when the cleanup-files.cron script runs and cleans out the old files, rsync will mirror those changes locally. Otherwise I would keep filling up my hard drive with old database backups.

If you ran the rsync command, I’m sure you noticed that it prompted you for your password on the remote machine. That makes it difficult to automate in a script. However, there is a solution for this as well. Basically, we need to instruct the remote machine to “trust” the local machine on a permanent basis. This is done by creating an ssh key and uploading the public half to the remote server (your web host).

On your local machine, run this command:

ssh-keygen -t rsa

This will create a key pair: a private key at ~/.ssh/id_rsa and a public key at ~/.ssh/id_rsa.pub.

You will need to append the contents of the public key to the ~/.ssh/authorized_keys file on the remote host. Keep in mind the ~/.ssh directory and authorized_keys file may not exist there yet.
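If your system has ssh-copy-id (it ships with OpenSSH), it automates this step; otherwise you can append the key manually over ssh. A sketch, using the same <username> and <hostname> placeholders as above:

```shell
# Easiest route: ssh-copy-id handles the append and permissions for you
ssh-copy-id <username>@<hostname>

# Manual alternative: pipe the public key over ssh and append it,
# creating ~/.ssh on the remote host if it does not exist yet
cat ~/.ssh/id_rsa.pub | ssh <username>@<hostname> \
  'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'
```

Either way, the next ssh or rsync connection from this machine should go through without a password prompt.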

If all goes as expected, you can run the rsync command again and you should not be prompted for your password. Now we just copy the rsync command into a file, mark it as executable and schedule the cronjob on the local machine.

mkdir ~/cronjobs

echo 'rsync -avz --delete -e ssh <username>@<hostname>:~/backups ~/localbackups/' > ~/cronjobs/get-db-backups.cron

chmod +x ~/cronjobs/get-db-backups.cron

crontab -e
(in vi: press a to append, paste the line below, press ESC, then type :wq and press ENTER)
0 2 * * * ~/cronjobs/get-db-backups.cron

This will grab the files at 2 AM every day.

So, we’ve gone through all of this just to grab the SQL backup files, but this post is about backing up the whole website. Well, it’s no stretch of the imagination: you can simply modify the get-db-backups.cron script to sync all of the files instead of just the ones in the ~/backups directory.

Better yet, why not create another script that runs once a week (instead of every day) to sync all of the files?
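Following that idea, a weekly script might look like this. A sketch only — the sync-site.cron name and the assumption that the site’s files live under ~/public_html on the host are mine, not from the post; adjust the remote path to match your account’s layout:

```shell
# ~/cronjobs/sync-site.cron -- hypothetical weekly full-site mirror.
# Assumes the web root is ~/public_html on the host (an assumption --
# change the path to whatever your web host actually uses).
rsync -avz --delete -e ssh <username>@<hostname>:~/public_html ~/localbackups/site/
```

Mark it executable with chmod +x as before, then add a second line to the same local crontab, e.g. 0 3 * * 0 ~/cronjobs/sync-site.cron to run it every Sunday at 3 AM.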

For Windows users, there are some good websites that show how to set this up.



3 Responses to Backing Up Your Website and Database

  1. Mauriat says:

    Nice. You almost have verbatim how I have my site-backup setup. 🙂 Although for the db, an alternative some users may have is if their provider gives a link to a gzipped dump of their databases, which is then easy to grab using ‘wget’.

  2. The Ty says:

    That is a pretty good idea.

    I have been using a gzip dump of the database and storing it onsite and offsite from osCommerce, but an always-on solution may be a more comprehensive answer, especially when I am days away from going online with 3 more websites.

    May have to get the York fix! Later bro

    – T

  3. For those web hosts that don’t allow you to access their DB servers via mysqldump, I created a PHP file that does the same thing and returns the SQL output. Unfortunately the version that I posted on my blog is slightly out of date…
