The server that I currently run this site on is a VPS from FastHosts which comes with Windows 2008 R2 (64bit) installed. As you might imagine it doesn't have a database server installed on it by default and so I installed SQL Server 2008 R2 Express.
Key to safely running a server is backing up your data so that in the event of a catastrophe, you can quickly and easily get back up and running. This isn't so important if you're only running a blog and not a business, but I still wouldn't like to lose the posts I've got on this blog, and it makes for good practice as a professional to always be on top of these things. You never know what you might learn while playing that saves your ass between the hours of 9 and 5:30.
One of the main differences between SQL Express and its much more costly big brothers is the lack of SQL Server Agent. SQL Server Agent is responsible for, amongst other things, running scheduled jobs against the database server, such as periodic backups. Fortunately, there is a tool I found which fills this void.
ExpressMaint is a free utility available on CodePlex which allows you to script and schedule a bunch of tasks you want to run if you want to keep your data safe and tidy. These tasks include:
- Backing up.
Full, differential and log backups are supported.
- Expiring backups.
ExpressMaint can automatically delete backup files after an arbitrary period of time.
- Rebuilding and reorganizing indexes.
- Checking database integrity.
- Writing text file reports on all of the above activities.
It works like a dream. I have two scheduled tasks set up; one runs every Sunday at midnight and does a full backup of all the databases (including the system databases, because why not). The command runs with the following arguments:
expressmaint -S (local)\SQLExpress -D ALL -T DB -R D:\Dropbox\Backup\Reports -RU WEEKS -RV 1 -B D:\Dropbox\Backup\Data -BU WEEKS -BV 4 -V -C
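If you want to register that task from the command line rather than clicking through Task Scheduler, the built-in schtasks tool will do it. This is just a sketch; the task name and the path to expressmaint.exe are assumptions, so adjust them to suit your setup:

```shell
rem Run the full backup every Sunday at midnight under the SYSTEM account.
rem The task name and C:\Tools path are assumptions; change them to taste.
schtasks /Create /TN "ExpressMaint Full Backup" ^
  /TR "C:\Tools\expressmaint.exe -S (local)\SQLExpress -D ALL -T DB -R D:\Dropbox\Backup\Reports -RU WEEKS -RV 1 -B D:\Dropbox\Backup\Data -BU WEEKS -BV 4 -V -C" ^
  /SC WEEKLY /D SUN /ST 00:00 /RU SYSTEM
```

Running under SYSTEM means nobody needs to be logged in for the backup to fire.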
The second task is scheduled to run twice a day, at 12 noon and midnight. This is complete overkill, but again, why not? My databases are small and my sites low-traffic, disk space is cheap, and I might learn something from having more backups to deal with rather than fewer.
expressmaint -S (local)\SQLExpress -D ALL -T DIF -R D:\Dropbox\Backup\Reports -RU WEEKS -RV 1 -B D:\Dropbox\Backup\Data -BU days -BV 7 -V -C
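One simple way to get the twice-daily schedule with schtasks is to register two daily tasks, one for each run. Again a sketch, with the task names and expressmaint.exe path assumed:

```shell
rem Two daily tasks cover the twice-a-day schedule: one at midnight, one at noon.
rem Task names and the C:\Tools path are assumptions.
schtasks /Create /TN "ExpressMaint Diff Midnight" ^
  /TR "C:\Tools\expressmaint.exe -S (local)\SQLExpress -D ALL -T DIF -R D:\Dropbox\Backup\Reports -RU WEEKS -RV 1 -B D:\Dropbox\Backup\Data -BU days -BV 7 -V -C" ^
  /SC DAILY /ST 00:00 /RU SYSTEM
schtasks /Create /TN "ExpressMaint Diff Noon" ^
  /TR "C:\Tools\expressmaint.exe -S (local)\SQLExpress -D ALL -T DIF -R D:\Dropbox\Backup\Reports -RU WEEKS -RV 1 -B D:\Dropbox\Backup\Data -BU days -BV 7 -V -C" ^
  /SC DAILY /ST 12:00 /RU SYSTEM
```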
These are fairly simple scripts, but being the lazy developer that I am, I followed a link from the ExpressMaint page to a bunch of samples covering all the functionality. Check it out.
I never said this was enterprise class!
As you might have guessed from the above snippets I'm utilising Dropbox to provide an off-site backup solution. I spent some time pondering the best way to get my backups into a safe place and some of the ideas that crossed my mind included:
- Batched FTP.
This is often a first thought in situations like this, and it isn't a bad one when you can guarantee that you'll always have a machine available to log in and download the backups. I only have a laptop, which is also my workstation during the week, so I can't guarantee it will always be in a position to retrieve the files. Additionally, the hard disk might go pop, or the laptop could get flushed down the toilet, stolen, or anything else you can imagine, none of which gives it a great score on the data redundancy chart.
- Amazon S3.
Using something like TntDrive, this would be as easy as using Dropbox: just change the output path to the mounted S3 bucket and away you go. I didn't go for this in the end, as both TntDrive and S3 cost more money than my basic (free) Dropbox account.
In the end I went for Dropbox for a variety of reasons. I already had an account with a bunch of extra space from referrals. It puts my data in the cloud (and therefore safe?) just like S3, and it automatically syncs the backups down to my laptop whenever it can, so if I want to restore my local development database to a pretty recent representation of a live database, it's usually just two restore operations away and I don't have to mess about fetching files from the server!
Dropbox did pose one issue: it usually places a shortcut in the Startup folder on the Windows Start Menu, but these shortcuts are only executed when a user logs in. Windows servers often run for days, if not weeks, at a time without an active desktop session, and as such my whole Dropbox plan would be worth nothing. I really didn't want to leave a user logged in all the time just to enable this, especially as a desktop session can easily consume 100 MB of RAM or more, and on a cheap VPS, RAM is not a commodity, so I'd prefer to leave it for SQL Server or IIS.
Luckily there are a few utilities that take a run-of-the-mill executable and wrap it in a Windows Service allowing you to start the application when the system boots.
I settled on Any2Service. It's a bit retro-1995 looking, but it does the job quickly and easily. Unfortunately, the website in the application's About page seems to have been eaten by some new tool, so the only way to get it is by Googling. Using it is a simple case of picking the EXE you wish to run and giving the service a name. Click Apply and you're off to the races.
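If you can't track down Any2Service, the same wrapping trick can be pulled off with Microsoft's own srvany.exe from the Windows Server 2003 Resource Kit Tools, which registers as a service and then launches whatever EXE you point it at. A sketch only; the service name and both paths here are assumptions:

```shell
rem Register srvany.exe as a service, then tell it which EXE to launch via the
rem service's Parameters registry key. Paths and service name are assumptions.
sc create DropboxSvc binPath= "C:\Tools\srvany.exe" start= auto
reg add "HKLM\SYSTEM\CurrentControlSet\Services\DropboxSvc\Parameters" /v Application /t REG_SZ /d "C:\Users\Administrator\AppData\Roaming\Dropbox\bin\Dropbox.exe"
sc start DropboxSvc
```

Bear in mind that Dropbox stores its configuration per-user, so the account the service runs under needs to be the one you linked Dropbox with.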
This setup has been running perfectly for about two months now. Every morning when I fire up my laptop, I get a little notification from Dropbox that a bunch of files have been updated.
The only problem is that each diff is in its own file, or in SQL Server parlance, its own media set, which means that when restoring to the most recent diff you have to run through the restore process twice. The first time, you have to Overwrite the existing database (WITH REPLACE) and Leave the database non-operational (RESTORE WITH NORECOVERY). The second time round, you have to uncheck the Overwrite option so you don't replace the database, and Leave the database ready to use (RESTORE WITH RECOVERY).
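Those two restore operations boil down to a pair of RESTORE statements, which you can fire off with sqlcmd rather than clicking through Management Studio. A sketch, with the database name and backup file names made up for illustration:

```shell
rem First restore the last full backup, overwriting the existing database and
rem leaving it non-operational so the diff can still be applied...
sqlcmd -S "(local)\SQLExpress" -Q "RESTORE DATABASE MyBlog FROM DISK = 'D:\Dropbox\Backup\Data\MyBlog_full.bak' WITH REPLACE, NORECOVERY"
rem ...then apply the most recent differential and bring the database online.
sqlcmd -S "(local)\SQLExpress" -Q "RESTORE DATABASE MyBlog FROM DISK = 'D:\Dropbox\Backup\Data\MyBlog_diff.bak' WITH RECOVERY"
```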
This isn't ideal, but seeing as it's only one extra step (not ten), it's fine for now, especially as I haven't spent any time looking into sorting it.