Sanity is a headless CMS with an amazing developer experience that makes most things really easy. For small agencies serving clients with Sanity, however, one thing is lacking.
Clients often ask for scheduled backups as part of ongoing maintenance arrangements. What that means for indie agencies and solo developers, who likely don’t have projects on the Enterprise tier, is running backups locally and storing them on cloud storage for safekeeping.
For us, the checklist for every backup looks like this:
- Run `sanity dataset export`
- Rename the file to include a timestamp
- Upload the file and wait for it to complete
- Add an entry to some sort of maintenance log
Steps 3 & 4 are a bit of a pain (having to keep the computer awake), but we’ve found that steps 1 & 2 can be simplified with these two shortcuts in our `package.json`, assuming a folder called `backups` in the package root:
```json
// package.json
{
  ...
  "scripts": {
    ...
    "backup:assets": "sanity dataset export production backups/`echo $(date '+%Y%m%d')`.tar.gz",
    "backup": "sanity dataset export --no-assets production backups/`echo $(date '+%Y%m%d')`--data-only.tar.gz"
  }
}
```
Then, to run a data-only backup, just do `npm run backup`, and if you want the assets as well, `npm run backup:assets`. The backups will be written to the `backups` folder, named with the date and marked when they are data-only. These can then be uploaded to any cloud storage or offsite disk for safekeeping.
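For example, on a hypothetical run date of 15 January 2024, the two scripts would produce files like these:

```sh
# data only (no assets)
npm run backup          # → backups/20240115--data-only.tar.gz

# full export including assets
npm run backup:assets   # → backups/20240115.tar.gz
```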
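Step 3 can also be scripted if your cloud storage has a CLI. Here’s a minimal sketch, assuming the AWS CLI is installed and configured, and that the S3 bucket (hypothetical name below) already exists:

```sh
# copy anything in the local backups folder that isn't already in the bucket
aws s3 sync backups/ s3://my-client-sanity-backups/production/
```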
Even with the above covered, steps 3 and 4 are still major time sinks, in the sense that someone has to context switch and babysit those tasks every week. We set out to solve those with automation in Baccup. If running manual backups locally is getting a bit too tedious, give Baccup a try.
Alternatively, if you fancy running your own cloud setup to automate this entire process, check out Backup Sanity to AWS S3 by Simeon Griggs from Sanity.