A newer version of this article was written in 2023. It includes new details about my local storage and some more general updates.


Over the past few weeks, I’ve seen quite a few threads on Twitter from folks trying to solve an ageless challenge: how to back up their personal computer. At a time when personal documents, photos, and other files are increasingly distributed across third-party services in the cloud, finding a secure, no-fuss, cost-effective backup solution is difficult.

Here’s what I’m doing.

Requirements

When I set this up a few years ago, I had some pretty specific requirements:

  • Multiple machines: we have a family iMac and my personal machine that need to be backed up, and they both run macOS
  • Large storage: our family Photos.app library is over 1TB. We’ve begun to use iCloud Library to improve access across devices. However, I’m unwilling to trust this data solely to iCloud, so we keep full-resolution versions on the iMac, and thus need destinations with at least that much capacity
  • Automatic: this needs to run continuously and cannot depend on plugging in USB drives or remembering to run a script
  • Encrypted: data stored on third-party systems must be encrypted
  • Offsite: I want to be sure we have a copy offsite (ideally in another city/state) in case of natural or other disaster that could affect our home in the future
  • Immediate Access: if something happens to my computer, I want to be able to restore immediately, without having to wait for a recovery disk to be shipped to me
  • No-fuss: I don’t want to mess around with complex scripts, fiddle with config files, or maintain other fragile DIY contraptions. While I love *nix, I don’t want to have to read a man page to figure it out.
  • Versioning: being able to rewind time and get a past version of a file is great. In reality, I use this to peek in on backups occasionally… just to double-check recent updates have been backed up properly

Backup Data

I don’t back up applications, OS files, or anything else I can reproduce from the App Store or internet; my backups only contain data I can’t (easily) replace. A good backup allows me to treat a computer and its OS as a commodity.

I keep some scripts in my $HOMEDIR that install the apps I use and set preferences with defaults, so I can also be selective about config backups, since I can recreate a machine with a fresh OS and my restored data. I only choose a couple of directories within ~/Library for inclusion (Mail.app mailboxes, notably).
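
To give a rough idea, here’s the kind of thing those scripts do (the apps and preference keys below are just examples, not my actual list):

    #!/bin/sh
    # Sketch of a machine-bootstrap script; the apps and preference
    # keys here are examples, not the real list.

    # Install apps with Homebrew
    brew install --cask firefox
    brew install ripgrep

    # Set preferences with `defaults`
    defaults write com.apple.finder AppleShowAllFiles -bool true
    defaults write NSGlobalDomain AppleShowAllExtensions -bool true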

Software

I use Arq 5.

Arq’s feature list is extensive and easily covers all of my requirements. I gave their trial a spin a few years ago and was very happy with how it worked. I paid a one-time cost for the full version of the application.

The application runs quietly and offers plenty of flexibility. This version of the app has been solid, efficient, and stable. I have yet to upgrade to the latest release, due to common issues the developer is well aware of, but I look forward to the improvements in version 7 (expected late this year or early next). I’ll be happy to pay for an upgraded version when it’s ready.

Destinations

I’m using multiple backup destinations: one cloud-based and one onsite. I chose to have a mix for redundancy, flexibility, and cost.

Offsite Storage

I subscribe to Wasabi, an affordable, S3-like service for offsite storage. Arq supports Wasabi natively, so setup is a breeze. Wasabi’s current pricing is about $11/month for 2TB of immediately-available storage.

Both of our computers push to buckets configured on Wasabi (each with different keys to limit data exposure, should the keys be compromised). Arq encrypts data before sending it to Wasabi, so there are multiple levels of data protection here.
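
Since Wasabi is S3-compatible, its access policies follow the familiar AWS grammar, so scoping each machine’s key to its own bucket takes only a small policy per key. A rough sketch - the bucket and file names here are made up, not my real configuration:

    # Hypothetical policy for one machine's access key, limiting it to
    # a single backup bucket (bucket and file names are placeholders)
    cat > imac-backup-policy.json <<'EOF'
    {
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Action": "s3:*",
        "Resource": [
          "arn:aws:s3:::family-imac-backups",
          "arn:aws:s3:::family-imac-backups/*"
        ]
      }]
    }
    EOF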

Our initial push from the family computer to Wasabi took a few weeks. Arq dutifully chipped away at the upload process without ever saturating our home internet. I was initially concerned about uploading this much data, but it went without a hitch. Once a backup is seeded, Arq pushes incremental changes to its destinations, so there is only a swell of data pushed when we import a large number of new photos or other large media.

Onsite Storage

I also have a local backup destination. This one flirts with fiddly, but since setting it up, I’ve done zero admin.

I bought a Dell Optiplex off Craigslist a couple years ago for $40. It’s one of those thin PCs like they have at the bank; it’s barebones and bulletproof. It was originally listed for $50, but I haggled it down since I didn’t need the Windows license it came with.

I picked up a 4TB spinning drive from Amazon and installed Ubuntu on the machine. The BIOS is configured to power on after power failure, so it’s always running. It’s plugged into my router with Ethernet, so its connectivity is solid.
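
If the data lives on its own disk rather than the OS drive, the only extra step is making sure it mounts on every boot. A sketch under that assumption - the device, filesystem, and mount point are placeholders:

    # Look up the backup drive's UUID, give it a permanent mount point,
    # and add it to /etc/fstab so it comes back after every reboot
    sudo blkid /dev/sda1
    sudo mkdir -p /srv/backups
    echo 'UUID=<uuid-from-blkid> /srv/backups ext4 defaults,nofail 0 2' | sudo tee -a /etc/fstab
    sudo mount -a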

It’s basically a super-cheap NAS on commodity PC hardware. It’s pretty awesome.

To push data to the Optiplex, I set up SFTP destinations in Arq and use the machine’s mDNS hostname (so I don’t have to bother with static IPs). Installing avahi-daemon on Linux machines lets them participate in mDNS/zeroconf networking with Macs seamlessly.
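
Getting that working on the Ubuntu side is a one-liner, and then the Arq destination just points at the .local name (the hostname and user below are made up):

    # On the Ubuntu box: join mDNS/zeroconf so Macs can find it by name
    sudo apt install avahi-daemon

    # From a Mac, the box is then reachable at <hostname>.local,
    # which is the address the Arq SFTP destination uses
    sftp backup@optiplex.local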

Each instance of Arq on our Macs pushes to its own folder inside the home directory of the SFTP user. The initial data push took some time (a couple days for the iMac’s photo library), but incremental updates are quick and happen a couple times a day.
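
On the server side, the layout is nothing fancier than a dedicated user with one folder per machine; roughly this (names here are illustrative):

    # A dedicated SFTP user whose home directory holds the backups,
    # with one folder per machine for each Arq instance to target
    sudo adduser backup
    sudo -u backup mkdir -p /home/backup/imac /home/backup/macbook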

Final Comments

Having onsite storage is great - it’s fast, rock solid, and available if I need quick access to an old version. Since I have redundancy with my offsite, cloud-based backups, I didn’t invest in a RAID-based NAS system, which is a complicated product selection process in itself.

I’ve had one opportunity to test this system: I recently sent my machine in for keyboard repairs, and was able to restore successfully to a spare computer before confidently wiping my machine for shipping. Remember - test your backups!

Have some questions? Reach out to me on Twitter. Good luck, and may your backups be safe, up-to-date, and unneeded!