Webmaster's nightmare #2 - Server and backup failure

Last week was a bad week: following on from one of my blogs being hacked, I also had one of my forum accounts accidentally deleted by our provider. And it wasn't just my host that was completely bloody incompetent; my own backups of that site had also failed, so in the end I had to resort to a backup from 2005. Not good. So, on the subject of backups, here are the methods I use on my other sites (and on the old one now), along with some other suggestions.

Static vs Dynamic Sites

A static site is one that, once uploaded, doesn't change. That means you already have one backup on your local machine, and to protect against multiple failures this should be copied onto other machines or media; if the site does fail, it's just a case of uploading it again. Nowadays, however, most sites are dynamic in some way and continually being updated, so you need a more proactive backup regime.

Let the Host Do It

Many (more competent) web hosts offer their own backup service, either included in the package or as an optional extra. As your server/site is already with your host, they have direct access to the data, so the backups don't clog your available bandwidth. Even if you do use your host's backup service, I would advise also taking your own just to be on the safe side: some hosts that do backups do not guarantee their effectiveness.

Backup Scripts on Your Server

You can either use a third-party script or a plugin for your CMS. I use WP DB Backup (now part of the standard install) to back up my WordPress databases on a daily basis and email the resulting files to my Gmail address (a great way of using all those free GBs). Another method is to FTP the files to another server, which you can do in a few ways (see the sketch after this list):

  • Use multiple hosting accounts with different providers (the backups will use bandwidth on both); this is useful for people with a few sites
  • Set up your own local FTP server (for Windows, check out XAMPP) and use a service such as DynDNS to give your home connection its own hostname on the net (you will need to mess with the forwarding rules on your firewall, so this is probably the most techie option)
  • “Phone a friend” and ask if you can have an FTP account on their server for backups, and you’ll return the favor
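Whichever destination you choose, the core of a server-side backup script is much the same: dump the database, compress it, and push the file offsite. Here's a minimal sketch in Python, assuming a MySQL database and an FTP account on a second server; the hostnames, credentials and paths are all placeholders, and swapping the FTP step for smtplib would give you the email-to-Gmail variant mentioned above:

```python
#!/usr/bin/env python3
"""Nightly backup sketch: dump a MySQL database, compress it, and push it to
an offsite FTP server. Every hostname, credential and path below is a
placeholder -- adjust for your own setup."""
import gzip
import os
import shutil
import subprocess
from datetime import date
from ftplib import FTP

DB_NAME = "myblog"  # placeholder database name
DUMP_FILE = "/tmp/%s-%s.sql" % (DB_NAME, date.today().isoformat())

# 1. Dump the database; mysqldump picks up credentials from ~/.my.cnf,
#    which keeps the password off the command line.
with open(DUMP_FILE, "w") as out:
    subprocess.check_call(["mysqldump", DB_NAME], stdout=out)

# 2. Compress the dump to save transfer time and remote disk space.
with open(DUMP_FILE, "rb") as src, gzip.open(DUMP_FILE + ".gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# 3. Upload the compressed dump to the offsite FTP account.
ftp = FTP("backup.example.com")    # placeholder host
ftp.login("backupuser", "secret")  # placeholder credentials
with open(DUMP_FILE + ".gz", "rb") as f:
    ftp.storbinary("STOR " + os.path.basename(DUMP_FILE) + ".gz", f)
ftp.quit()
```

Run it from a nightly cron job during your quiet hours, and prune old dumps occasionally so the remote account doesn't fill up.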
Backup Locally

Instead of (or as well as) running scripts on your server to back up your data, there are also applications you can run locally that automatically download backups for you. The excellent MySQL Administrator has a schedule option which can be used to download your databases on a regular basis.
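If MySQL Administrator isn't an option, or you'd rather script it, the same pull can be done from your own machine with a few lines of Python run by cron (or Task Scheduler on Windows). This is only a sketch, and it assumes your host allows remote MySQL connections; the host, user, password and database names are placeholders:

```python
#!/usr/bin/env python3
"""Local pull-backup sketch: run mysqldump on your own machine against the
remote database server. Assumes your host permits remote MySQL connections;
the host, user, password and database names are placeholders."""
import os
import subprocess
from datetime import date

REMOTE_HOST = "mysql.myhost.example.com"  # placeholder
DB_NAME = "myblog"                        # placeholder
OUT_DIR = "backups"

os.makedirs(OUT_DIR, exist_ok=True)
out_file = os.path.join(OUT_DIR, "%s-%s.sql" % (DB_NAME, date.today().isoformat()))

with open(out_file, "w") as out:
    subprocess.check_call(
        ["mysqldump",
         "--host=" + REMOTE_HOST,
         "--user=backupuser",   # placeholder
         "--password=secret",   # placeholder; better kept in ~/.my.cnf
         DB_NAME],
        stdout=out)

print("Saved " + out_file)
```

Schedule that daily and you have a local copy without touching the server at all.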

Dedicated Backup Companies

There are various companies that specialise in backing up your data. I’ve not had any experience with these, but if you have, please let us know.

I’m very interested to hear other people’s backup regimes, recommended scripts, providers, etc. If we get some good suggestions I’ll put a full guide together.

About Al Carlton

Al quit the 9-to-5 rat race in January of 2007; before then he was a software engineer and systems architect of financial systems. Nowadays Al spends his days running his various businesses and experimenting with different ideas and opportunities.
Al can be found on Twitter at AlCarlton.

Comments

  1. And, test your backups.

    Extract them somewhere and check a few random files. Check the size of the archive against the previous one. Check the number of files. (A sketch of automating this follows below.)
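    A minimal sketch of that check in Python, assuming gzipped tar archives; the file names are placeholders:

    ```python
    #!/usr/bin/env python3
    """Backup sanity-check sketch: compare today's archive with yesterday's.
    The archive file names are placeholders."""
    import os
    import tarfile

    NEW = "backup-today.tar.gz"      # placeholder
    OLD = "backup-yesterday.tar.gz"  # placeholder

    with tarfile.open(NEW) as t:
        new_names = t.getnames()
    with tarfile.open(OLD) as t:
        old_names = t.getnames()

    # A healthy backup shouldn't shrink dramatically in size or file count.
    size_ratio = os.path.getsize(NEW) / float(os.path.getsize(OLD))
    print("files: %d (was %d), size ratio: %.2f"
          % (len(new_names), len(old_names), size_ratio))
    if len(new_names) < 0.9 * len(old_names) or size_ratio < 0.9:
        print("WARNING: today's backup looks suspiciously small")
    ```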

  2. StephenB says:

    I use a combination of mysqldump and mysqlhotbackup to back up a VB forum and CMS, 2.5-3GB in total (MySQL), daily to a second hard drive on a dedicated server, along with a weekly full-file (5GB) offsite backup via FTP and daily changed-file backups.

    As the site is global, I have to pick the quietest time to do it, staggered over an hour. The MySQL part takes about 15 mins.

    S

  3. Hi Al!

    I’m hosted with WiredTree at the moment, and I use a few extra GBs (on another FTP server) to automatically back up my sites on a daily basis.

    Also, I retain weekly and monthly backups and download a copy to my own office at least once per week.

    Seriously people…

    If you can’t afford $10-$20 to do a decent backup, get another biz…

  4. After reading your article I have backed up all my sites. Cheers for the heads-up!

  5. Basically, I do a routine backup every now and then. The compressed data comes to close to 2GB each time, so it's too time-consuming to perform a full backup on a daily basis.

    However, a full backup is definitely a must: you never know when you might have changed a line of code on any particular day.

  6. I’ve had backups running for over 12 months now, using my own custom script that backs up all files and databases every day and downloads them to my personal RAID0 file server. I’ve had to rely on it a few times; it has definitely saved me lots of headaches.

  7. Online backup services are always recommended for your websites, and Apple Time Machine for your local HDD data on a Mac 😉

  8. Strictly speaking, Time Machine is the backup software; Time Capsule is Apple's wireless backup appliance that works with it.

  9. I would add to this that if you have cPanel hosting, you can easily automate full backups (which cover dynamic and static sites alike) to a remote FTP server. This can be your home computer, but more safely you should just pay for a secure FTP backup solution.

  10. I use bqbackup for remote storage: rsync runs from my server and moves all the data offsite. I also back up to a secondary hard drive in case the mirrored drives my sites sit on go south. From the secondary drive I move the data to SAN storage at my provider (ThePlanet.com).

  11. I come from a background of high-uptime systems; for example, we had dedicated fibre laid for 30km to link a hot-swap backup site to the live system. This may seem extreme, but when you have 5,000 terminals hooked up to the one server, downtime becomes pretty expensive.

    I run a dedicated server in Canada, which is my main server; I also have another server (much cheaper and lower spec) located in the US.

    On the main server I run a daily cron job to dump all the MySQL databases to a file. Then, from my spare machine, I run rsync over ssh to copy over all the website files and database dumps on a daily basis (a sketch of the pull side is at the end of this comment). This ensures that at most I will lose 24 hours of data.

    Just for safety's sake I also back this up to my Mac mini at home using cron and rsync every day.

    So if my main machine goes down with no chance of recovering quickly, I can just point all the domains at my backup server and run my sites from there until the main server is back up.

    I wouldn’t recommend this setup for everyone, though, as it isn’t cheap; but losing all your data isn’t cheap either.
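    For anyone wanting to copy the pull side of this, here is a rough sketch in Python of what the spare server's daily job might look like; it assumes key-based ssh login is already configured, and the hostname, user and paths are placeholders:

    ```python
    #!/usr/bin/env python3
    """Spare-server pull sketch: mirror the main server's website files and
    database dumps with rsync over ssh. Assumes key-based ssh login is set up;
    the hostname, user and paths are placeholders."""
    import subprocess

    MAIN_SERVER = "main.example.com"  # placeholder
    JOBS = [
        ("/var/www/", "/backups/www/"),           # website files
        ("/var/backups/mysql/", "/backups/db/"),  # nightly mysqldump output
    ]

    for remote_dir, local_dir in JOBS:
        # --delete keeps a true mirror, so the spare can serve the sites as-is;
        # note it also means deletions on the main server propagate here.
        subprocess.check_call([
            "rsync", "-az", "--delete", "-e", "ssh",
            "backup@%s:%s" % (MAIN_SERVER, remote_dir),
            local_dir,
        ])
    ```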
