Automatically Back Up Your Web Site Every Night

By Gina Trapani

Earlier this week, web site owners with sites hosted at a service called Cornerhost got a big scare: The service appeared to be closing without notice, and its owner was nowhere to be found. Terrifying, right? Unless you back up your web site regularly, that is. Here's how to set up automated backups for your web site so if worst came to worst, your data would remain safely in your hands.

If you pay for web hosting in order to run any kind of web-based application—from your WordPress blog to a nameplate site to a file-sharing service to a social media data archive—you need to back up your web server's data the same way you back up your computer's data. On database-driven web sites, there are two kinds of data you want to preserve and restore in case of disaster: the files that make up your site (the PHP/Perl/Python, JavaScript, and CSS files, etc.), and the contents of your database. Further, any good backup system should make both a local copy and a remote copy of the backed-up data.

I run several database-driven sites and applications, including this blog, so my backup system has to be solid. Here's how I have it set up.

This method assumes a few things:

  • You're running a LAMP-based web site (Linux, Apache, MySQL and PHP/Perl/Python).

  • You have command line access to your web server via SSH.

  • You know how to make new folders and chmod permissions on files.

  • You're comfortable with running bash scripts at the command line on your server and setting up cron jobs.

  • You know where all your web server's files are stored, what databases you need to back up, and what username and password you use to log into MySQL.

  • In order to have remote data backup, you need access to another server that's available via SSH in addition to your site's server. I asked a friend of mine for an account on his server to store some backup files and he kindly obliged. If you don't have a friend with a server at a different host, you can run an always-on server at home and back up to there. I prefer not to have a computer on at all times in my home, where bandwidth speeds can be slow, so I'd recommend finding a friend to back up to (and you can offer your friend the same courtesy).

All systems go? Let's get your backup system set up.

First: Local Backup

In order to back up your web site, your script has to back up two things: all the files that make up the site, and all the data in your database. In this scheme you're not backing up the HTML pages that your PHP or Perl scripts generate; you're backing up the PHP or Perl source code itself, which accesses the data in your database. This way if your site blows up, you can restore it on a new host and everything will work the way it does now.

First, SSH into your web server, and in your home directory, create a folder named backups. Inside this folder, create a file named backup.sh. Then, create a folder named files.

Here's what the result should look like:

your_home_directory/
|
+ - backups/
    |
    + - backup.sh
    |
    + - files/
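
If you'd rather create that structure in one shot at the command line, these two commands, run from your home directory, do the whole job:

mkdir -p backups/files
touch backups/backup.sh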

The file we care about right now is backup.sh. This file will be the script that zips up your data and saves it in the files folder.

The script I run is heavily based on an example I found on The How-To Geek's wiki. Here's the source code of backup.sh that takes care of smarterware.org's files and database:

#!/bin/sh

THESITE="smarterware.org"
THEDB="my_database_name"
THEDBUSER="my_database_user"
THEDBPW="my_database_password"
THEDATE=`date +%d%m%y%H%M`

mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /var/www/vhosts/$THESITE/backups/files/dbbackup_${THEDB}_${THEDATE}.bak.gz

tar cf /var/www/vhosts/$THESITE/backups/files/sitebackup_${THESITE}_${THEDATE}.tar -C / var/www/vhosts/$THESITE/httpdocs
gzip /var/www/vhosts/$THESITE/backups/files/sitebackup_${THESITE}_${THEDATE}.tar

find /var/www/vhosts/$THESITE/backups/files/site* -mtime +5 -exec rm {} \;
find /var/www/vhosts/$THESITE/backups/files/db* -mtime +5 -exec rm {} \;

Copy and paste this source code into your backup.sh file. To successfully run this script in a setup similar to mine, on lines 3 through 6, you must replace smarterware.org, my_database_name, my_database_user, and my_database_password with the right values for your web site.
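
One precaution worth taking: backup.sh now contains your database password in plain text, so it's wise to make the file readable and executable by you alone:

chmod 700 backup.sh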

This version of the script makes two assumptions about file locations. On my web server (and many, but not all, setups), my home directory is a path that looks like /var/www/vhosts/example.com/ (where example.com is your web site's domain), and all of the public, web-accessible files are located in /var/www/vhosts/example.com/httpdocs/.

Your web site file path may vary. If it does, in the script's source code, replace /var/www/vhosts/$THESITE/backups/ with the path to your backups folder location, and replace /var/www/vhosts/$THESITE/httpdocs/ with the location of your site's web-accessible files.
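
As an example, on a host that keeps everything under your home directory (using the hypothetical paths /home/you/backups and /home/you/public_html), lines 9 and 11 would change to the following, and lines 12, 14, and 15 would get the same backups-path substitution:

mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB | gzip > /home/you/backups/files/dbbackup_${THEDB}_${THEDATE}.bak.gz
tar cf /home/you/backups/files/sitebackup_${THESITE}_${THEDATE}.tar -C / home/you/public_html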

Let's walk through what this script is doing. After setting some variables in lines 3 through 7, line 9 runs a mysqldump of all the data in the database named on line 4, compresses it, and stores it in the files directory using a filename that looks like dbbackup_my_database_name_1402120101.bak.gz. Lines 11 and 12 archive the site's source code files in the httpdocs directory and store them in the files directory, using a filename that looks like sitebackup_example.com_1402120101.tar.gz. Notice both these filenames include the date, so you can see when each backup was made.
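
If disaster does strike, restoring works in reverse. Here's a quick sketch using the sample filenames above, run from the files directory, assuming the (empty) database already exists on the new host:

# reload the database dump (mysql prompts for your password)
gunzip < dbbackup_my_database_name_1402120101.bak.gz | mysql -u my_database_user -p my_database_name

# unpack the site files (they extract under var/www/vhosts/... relative to where you run this)
tar xzf sitebackup_example.com_1402120101.tar.gz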

Finally, lines 14 and 15 delete any backups made more than 5 days ago. You're going to run this backup script nightly, and the files will take up a lot of space quickly; that's why these last two commands delete older backups. You can change the number 5 to however many days' worth of backups you want to keep.
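For instance, to keep two weeks of backups instead, lines 14 and 15 would become:

find /var/www/vhosts/$THESITE/backups/files/site* -mtime +14 -exec rm {} \;
find /var/www/vhosts/$THESITE/backups/files/db* -mtime +14 -exec rm {} \;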

In order to run this script, you must chmod +x backup.sh. Run it manually to make sure it generates the backup files you expect. Finally, schedule it to run as often as you like in your crontab. To run it at 1:01 am every morning, your crontab would look like this:

1       1       *       *       *      /var/www/vhosts/example.com/backups/backup.sh
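
If you haven't edited a crontab before, the usual way in is crontab -e, which opens your crontab in an editor; paste in the line above (adjusting the path to match your setup) and save. After the first scheduled run, a quick sanity check never hurts:

ls -lh /var/www/vhosts/example.com/backups/files/

You should see fresh dbbackup and sitebackup files stamped with last night's date.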

Make sure you are running this script for every web site and database you care about.

Once this backup script has run a few nights while you're sleeping soundly in your bed, your files directory will fill up with at least 5 days' worth of file and database backups. Nice job.

But you're not done yet.

Next: Remote Backup

Having backups on your web server will save your bacon if a WordPress update goes awry or you accidentally delete a blog post from your database. However, it doesn't help if your web server is unreachable or dies in a fire. That's why you want to send copies of this data to a remote server automatically.

Once you've got access to a remote server thanks to a generous friend or at home, you're going to set up an rsync job which transfers all your web server's backups over to it in case of total disaster. I ran down how to mirror files across systems with rsync years ago, so I won't rehash it, but you're going to use that same approach here.

In short, on the remote server, create a folder called offsitebackups. To rsync your new web site backup files to your remote host, SSH into that host, and cron a job which looks something like:

rsync -e ssh -a --delete you@example.com:/var/www/vhosts/example.com/backups/files/ /your/path/to/offsitebackups/

Replace the username, web site name, and paths with your information.

That command will sync all the files in your host's backups folder to your remote server's offsitebackups folder. Run it to make sure it works. It should prompt you for the password to log into your web server when you do. When it's done syncing, you should see your backup files in the offsitebackups folder.
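
Once the manual run succeeds, add it to the remote server's crontab, staggered a couple of hours after the backup script itself so the night's files exist before the sync; for example, to pull at 3:01 am:

1       3       *       *       *      rsync -e ssh -a --delete you@example.com:/var/www/vhosts/example.com/backups/files/ /your/path/to/offsitebackups/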

The problem is, you won't be around to enter the password every night when cron tries to run it. To run it without intervention, you'll need to set up passwordless login into your web server. This excellent tutorial on automating backups with rsync runs down those steps as well.
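
A minimal sketch of that setup, assuming ssh-copy-id is available on your remote server: generate a key pair on the remote machine (the one running the rsync job), leave the passphrase empty so cron can log in unattended, then copy the public key over to your web server.

# on the remote backup server; press Enter for an empty passphrase
ssh-keygen -t rsa
# appends your public key to ~/.ssh/authorized_keys on the web server
ssh-copy-id you@example.com

After that, rsync over SSH logs in without prompting, and the nightly cron job can run on its own.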

Setting up local and remote backup of your web server's database and files requires upfront time and effort, but once it's in place, you can forget about it. Using this system you can blog away, keep your blogging software up to date, and manually edit files directly on your web server without having to worry about losing changes or never being able to restore your data.


