I've been working with our team for a while to continually refine our DB backup procedures. We have a fairly complex database environment, with quite a few SQL Servers in various locations, and prior to SQL Server SP2 we did backups primarily through scheduled nightly jobs. An update in SP2 broke our (basic) model, which re-energized my dreams of DB backup automation and a more granular, manageable process.
Well, I'm nearly 90% complete with my new model – check back soon for Part 2, where I'll offer my scripts for download. Highlights include a single table that lets you manage your backup jobs. Briefly, this table includes the databases you want to back up, the backup engine to use (native, or Quest, Red Gate, Idera, etc.), and the schedule for backing up databases, including log backups and even filegroup backups. Next I wrote a stored procedure that loops through this table looking for backup jobs to run, using the respective backup engine.

The table/SP also includes a feature to write copies of the backup both to the local DB server and to a remote NAS device (or any path you provide in the setup table). One nice feature is that you can keep the past 7 days of backups local on the machine, in case you need to do a quick restore, while backups older than 7 days get archived to a NAS device – still recoverable, but depending on the location they may take additional time to recover. This helped us address storage space, both locally and remotely, while managing encryption and compression during the process. While the scope of this little backup ditty grew, its flexibility and usability increased, and we were able to pack in a lot of functionality that I thought others could eventually benefit from.

I wrapped up the function with some baselining capabilities, logging the start/end times of each backup to a secondary table, which provides two things: first, it helps us verify that a backup runs every night, and second, it helps us see backup size and file growth over time, enabling better storage/archival projections. Lastly, it was important to me to make this entire process portable. Deployment is as easy as populating the tables with the DBs you want to back up and then scheduling the SP to run at some nightly interval. That's it!
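To give you a feel for the design before Part 2 lands, here's a rough sketch of what the control table and the baselining table might look like. These are illustrative only – the table and column names are my placeholders here, not necessarily the schema I'll publish:

```sql
-- Hypothetical sketch of the backup control table described above;
-- the actual schema in Part 2 may differ.
CREATE TABLE dbo.BackupConfig (
    ConfigID           int IDENTITY(1,1) PRIMARY KEY,
    DatabaseName       sysname       NOT NULL,
    BackupEngine       varchar(20)   NOT NULL DEFAULT 'NATIVE',  -- NATIVE, QUEST, REDGATE, IDERA, ...
    BackupType         varchar(10)   NOT NULL DEFAULT 'FULL',    -- FULL, LOG, FILEGROUP
    LocalPath          nvarchar(260) NOT NULL,                   -- kept for quick restores
    ArchivePath        nvarchar(260) NULL,                       -- e.g. a NAS share for older copies
    LocalRetentionDays int           NOT NULL DEFAULT 7
);

-- Secondary table used for baselining: verifies a backup ran each
-- night and tracks size/growth over time for capacity planning.
CREATE TABLE dbo.BackupHistory (
    HistoryID    int IDENTITY(1,1) PRIMARY KEY,
    ConfigID     int      NOT NULL REFERENCES dbo.BackupConfig(ConfigID),
    StartTime    datetime NOT NULL,
    EndTime      datetime NULL,
    BackupSizeMB decimal(12,2) NULL,
    Succeeded    bit      NOT NULL DEFAULT 0
);
```

The stored procedure then just cursors over `dbo.BackupConfig`, builds the appropriate backup command for each row's engine, and writes a row to `dbo.BackupHistory` on start and completion.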
Now, before I share the code (hopefully it will be worth the wait), here are a few things to consider when designing your backup strategy:
- Understand recovery models (simple vs. bulk-logged vs. full) and backup types (full, differential, log)
- Know where your backups are (you don't want to go searching in a time of need)
- Encrypt/compress/password-protect your backups (a standard .bak file can be read with Notepad!)
- DB backups don't contain everything (logins, jobs, etc. live in master and msdb)
- Take care of system DBs (master, msdb)
- Plan for physical vs. logical failures (are nightly backups good enough, or is every-5-minute log shipping required?)
- Ensure recoverability of a backup (media, network, data – run RESTORE VERIFYONLY and also test your entire recovery model)
- Forecast backup trends (do capacity planning and create a formal db recovery plan)
- Consider use of direct-attached or online (NAS) volumes for backup – tapes are no longer the standard for backups!
- Keep control – if you are the DBA, don't let others with missing documentation jeopardize your restore.
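To put a couple of those bullets into code, here's the native T-SQL flavor of a compressed, checksummed backup plus a quick media verification. The database name and paths are just examples; note that WITH COMPRESSION requires SQL Server 2008 or later, and third-party engines (Quest, Red Gate, Idera) each have their own syntax for compression and encryption:

```sql
-- Native full backup with compression and page checksums (paths are examples)
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks_Full.bak'
WITH COMPRESSION, CHECKSUM, INIT;

-- Quick sanity check that the backup media is readable.
-- This is NOT a substitute for testing a full restore!
RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\AdventureWorks_Full.bak'
WITH CHECKSUM;
```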
Some initial questions to ask: How much data can be lost? Are there any SLAs you have to support? What data security considerations are required? Did you remember to back up indexes? Some best practices include running DBCC CHECKDB, then a nightly full backup, with incremental log backups periodically throughout the day as the business requires.
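As a minimal sketch of that cadence (again, database name and paths are placeholders, and the log backup assumes the database is in the full or bulk-logged recovery model):

```sql
-- Integrity check before taking the nightly full backup
DBCC CHECKDB (N'AdventureWorks') WITH NO_INFOMSGS;

-- Nightly full backup
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks_Full.bak'
WITH INIT, CHECKSUM;

-- Periodic log backups throughout the day (full/bulk-logged recovery only)
BACKUP LOG AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks_Log.trn'
WITH CHECKSUM;
```

How often you run the log backup is exactly the "how much data can be lost?" question – every 15 minutes means you can lose at most 15 minutes of work.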
Hopefully that helps get you into the backup mindset. There are lots of things to consider, and there's certainly no one-size-fits-all solution. I'll be posting the source code for my SQL scripts in an upcoming post – which I hope can help you save time modernizing your backup efforts!