New backup solution

Backups are important.  Everybody in IT knows this.  After all, whenever someone tells you they lost some important file, the first question is always, "Well, do you have a backup?"

Nevertheless, many of us are lax in setting up our own backup solutions.  I know I was for many years.  And while that might not be forgivable for people who really should know better, it is understandable.  After all, implementing a proper backup solution is a lot of work and the benefits are not immediately obvious.  Think about it: it requires an ongoing investment of both time and money which, in the best-case scenario (i.e. nothing goes wrong), will never pay off.  Is it any wonder that even IT professionals aren't eager to put a lot of effort into something they actively hope never to need?

But about a year ago, I finally put a home backup solution in place.  That solution was CrashPlan.  But now CrashPlan is discontinuing their home plans, which means I've been forced to figure out something else.

The old setup

CrashPlan's "home" plan was actually pretty nice.  Perhaps too nice, since apparently it wasn't profitable enough for them to keep offering it.  The subscription was priced at $150 per year and that included up to ten computers in your household and unlimited cloud storage.  On top of that, their agent software supported not only backing up to the cloud, but also backing up to external hard drives and other computers running the CrashPlan agent.  And it ran on Windows, Linux, and Mac!

Currently, I have three computers in active use in my house: my laptop, my wife's laptop, and my desktop, which is really more of a home server at this point.  I had both of the laptops backing up to the cloud, while the desktop backed up to both the cloud and an external hard drive.  I've got close to a terabyte of data on the desktop, so the external drive is important.  I do want a copy of that data off-site just in case of a catastrophe, but I'd rather not have to suck that much data down from the cloud if I can avoid it.

I was happy with this setup and wanted to keep something equivalent to it.  I feel like it provided sufficient protection and flexibility while not requiring me to buy extra hardware or pay for more cloud storage than I needed.

The alternatives

Turns out this setup isn't quite as easy to replicate as I had hoped.  There are plenty of home backup services out there, but most of them don't offer the same range of options.  For instance, many are Mac and Windows only - no Linux.  Many offer limited storage - usually a terabyte or less, which I'm already pushing with my largest system.  Many are cloud backup only - no local drives.  And when you add up the cost of three systems, most of them are more expensive than CrashPlan Home was.

On their transition page, CrashPlan recommends that existing home customers either upgrade to their small business plan or switch over to Carbonite.  I briefly considered upgrading to the small business plan, but the price is $10/month/device.  So I'd go from $150/year to $360/year for basically the same service.  That doesn't sound like a great deal to me.

Carbonite, on the other hand, is one of the options I considered the first time around, before I settled on CrashPlan.  They're offering a 50% discount to CrashPlan customers, so the initial price would only be $90.  Presumably that's only for the first year, but even after that $180/year is only slightly more than CrashPlan.  However, from what I can see Carbonite doesn't support Linux on their home plan - they only do Linux servers on their office plan.  I also don't see an option to back up to an external drive.  Although it does support backing up external drives to the cloud...for another $40/year.

Plan B

After doing some research and thinking about it for a while, I eventually decided to skip the all-in-one services.  Yeah, they're nice and require almost zero work on my part, but I wanted some more flexibility and didn't want to pay an arm and a leg for it.  However, I didn't want to completely roll my own solution.  Writing simple backup scripts is easy, but writing good backup scripts with proper retention, special file handling, logging, notifications, etc. is a lot of work.

If you don't want a full-service backup solution, the alternative is to go a la carte and get a backup program and separate cloud storage provider.  There are a number of options available for both, but after some research and experiments, I decided to go with CloudBerry Backup for my backup program and Backblaze B2 as my storage provider.

The choice of Backblaze B2 was kind of a no-brainer.  Since this is long-term backup storage, performance is not a huge concern - it was mostly a question of capacity and price.  B2 has standard "pay for what you use" billing and the prices are extremely reasonable.  Currently, storage is $0.005 per gigabyte and there's no charge for uploads.  So for the volume of data I have, I'm looking at storage costs of $5 or $6 per month, which is pretty cheap.
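That estimate is easy to sanity-check with a bit of shell arithmetic.  The 1,000 GB figure below is an assumption based on my own data volume; the rate is the one quoted above.

```shell
# Back-of-the-envelope B2 storage cost estimate.
# STORAGE_GB is an assumption (roughly my ~1 TB of data);
# RATE is B2's quoted price of $0.005 per GB per month.
STORAGE_GB=1000
RATE=0.005

monthly=$(awk -v gb="$STORAGE_GB" -v rate="$RATE" \
  'BEGIN { printf "%.2f", gb * rate }')
echo "Estimated storage cost: \$${monthly}/month"   # → $5.00/month
```

Uploads are free on B2, so for a backup workload the storage line item is essentially the whole bill.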

The backup program was a different story.  I tried out several options before settling on CloudBerry.  Most of the options I tried were...not user friendly.  For me, the CloudBerry UI had the right balance of control and ease of use.  Some of the other solutions I tried were either too arcane or simplified things too much to do what I wanted.  CloudBerry uses a wizard-based configuration approach that makes it relatively painless to figure out each step of your backup or restore plan.  I find that this allows them to expose all the available options without overwhelming the user.

As far as capabilities go, CloudBerry pretty much has what I wanted.  It supports Windows, Mac, and Linux and can back up to multiple storage destinations, including local file systems, network file systems, and various cloud providers.  Beyond that, the feature set depends on the version you use.  There's a free version, but I went with the paid desktop version because it supports compression and encryption.  The licenses are per-system and they're a one-time charge, not a subscription.  The basic "home" licenses currently run $50 for Windows and $30 for Linux, which I think is pretty reasonable.

Results

So far, the combination of CloudBerry and B2 for my backup solution is working well.  I've been using it for about five months and have all three of my systems backing up to the cloud and my desktop also backing up to a USB hard drive.  The process was largely painless, but there were a few bumps along the way.

As I mentioned in a previous post, as part of this process I moved my desktop from Windows to Linux.  As it turns out, setting up the two laptops to have CloudBerry back up to B2 was completely painless.  Literally the only annoyance I had in that process was that it took quite a while for the confirmation e-mail that contained my license key to arrive.  So if you're a Windows user, I can recommend CloudBerry without reservation.

Linux wasn't quite as simple, though.  I had a number of problems with the initial setup.  The interface is very stripped-down compared to the Windows version and doesn't offer all of the same options.  I also had problems getting the backups to run correctly - they were repeatedly stalling and hanging.  Fortunately, the paid license comes with a year of support and I found the CloudBerry support people to be very helpful.  In addition, they seem to be actively working on the Linux version.  I initially installed version 2.1.0 and now they're up to 2.4.1.  All of the issues I had have been resolved by upgrading to the newer versions, so things are working well now.

I had initially been a little concerned about the per-gigabyte and per-transaction pricing, but so far it hasn't been an issue.  I found Backblaze's storage cost calculator to be pretty accurate and the per-transaction charges are not significant.  The cost has been basically what I initially estimated and I haven't had any surprises.

Overall, I'm very happy with this solution.  The price is reasonable and, more importantly, it provides me with lots of flexibility.  Hopefully I'll be able to keep this solution in place for years to come.

Apparently CrashPlan eats inotify watchers

As a brief follow-up to my last post, I had a problem trying to remote into my newly installed Ubuntu box today.  As I mentioned in that post, I'm using xRDP for my remote GUI access and I'm tunneling it over SSH.  When I connected to xRDP, I logged in and was greeted by the Ubuntu lock screen for my running session.  So I entered my password, hit "enter", and waited.  And waited.  And waited some more.  My session just never unlocked.
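For context, the tunnel itself is just standard SSH local port forwarding - something like the sketch below, where the hostname and the local port choice are placeholders rather than my actual setup.

```shell
# Forward a local port to the server's RDP port (3389) over SSH.
# "user@mediaserver" and local port 3390 are placeholder choices.
ssh -L 3390:localhost:3389 user@mediaserver

# Then point the RDP client at localhost:3390 and the traffic
# rides the encrypted SSH connection to xRDP on the server.
```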

OK, so apparently xRDP is freaking out.  I already had an SSH session open, so I figured I'd just restart the service and maybe that would clear up the problem.  So I ran service xrdp restart and it told me:

Failed to add /run/systemd/ask-password to directory watch: No space left on device

That didn't seem right.  I had plenty of disk space and plenty of RAM - what could possibly be out of space?  So naturally I asked the internet, which led me to this Stack Exchange post.  Turns out that CrashPlan has a habit of eating up inotify watchers, which is what I was actually out of.  Stopping the CrashPlan service fixed the problem.  Fortunately, I don't need CrashPlan anymore, so that shouldn't be a problem again.
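If you hit this yourself and still need CrashPlan running, the usual fix is to raise the watch limit rather than stop the service.  The numbers below are common recommendations from that Stack Exchange thread and similar sources, not anything CrashPlan documents officially.

```shell
# See the current inotify watch limit (older Ubuntu defaults to 8192):
cat /proc/sys/fs/inotify/max_user_watches

# Raise it for the running system (524288 is a commonly suggested value):
sudo sysctl fs.inotify.max_user_watches=524288

# Persist the setting across reboots:
echo 'fs.inotify.max_user_watches=524288' | sudo tee -a /etc/sysctl.conf
```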

Of backups and reinstalls

I finally decided to do it - reinstall my home media server.  I switched it from Windows 8.1 to Ubuntu 17.10.

The reinstall

Believe it or not, this is actually a big thing for me.  Fifteen years ago, it would have been par for the course.  In those days, I didn't have a kid or a house, so rebuilding my primary workstation from scratch every few months was just good fun.  But these days...not so much.  Now I have a house, a kid, and a career to focus on.  Installing the Linux distribution of the week isn't fun anymore - it's just extra time I could be spending on more important and enjoyable things.

But, in a fit of optimism, I decided to give it a shot.  After all, modern Linux distributions are pretty reliable, right?  I'll just boot up an Ubuntu DVD, it'll run the installer for 15 minutes or so, and I'll have a working system.  Right?

Well...not quite.  Turns out Ubuntu didn't like my disk setup.  Things went fine until it tried to install Grub and failed miserably.  I'm not sure why - the error message the installer put up was uninformative.  I assume it had something to do with the fact that I was installing to /dev/sdb and /dev/sda had an old Windows install on it.  After a couple of tries, I decided to just crack open the case and swap the SATA cables, making my target drive /dev/sda, and call it a day.  That did the trick and Ubuntu installed cleanly.  I did have to update my BIOS settings before it would boot (apparently the BIOS didn't automatically detect the change of drives), but it worked fine.

The only real problem I had was that apparently Wayland doesn't work properly on my system.  I tried several times to log into the default GNOME session and after a minute or two, system load spiked to the point that the GUI and even SSH sessions became unresponsive and eventually the GUI just died.  And I mean died - it didn't even kick me back to the GDM login screen.

I suspect the problem is my system's archaic video card - an integrated Intel card from the 2010 era which I have absolutely no intention of ever upgrading.  I mean, it apparently wasn't good enough to run Windows 10, so I wouldn't be surprised if Wayland had problems with it.  But in any case, Ubuntu still supports X.org and switching to the GNOME X.org session worked just fine.  I don't really intend to use the GUI on this system anyway, so it's not a big deal.

The restore

Once I got Ubuntu installed, it was on to step two of the process: getting my data drive set up.  It's a 1TB drive that used to serve as the Windows install drive.  It was divided into a couple of partitions, both formatted as NTFS.  Since I'm switching to Linux and really just wanted one big data drive, this was a sub-optimal setup.  Therefore I decided to just blow away the partition table and restore from a backup.

I currently use CrashPlan to back up this computer.  It's set to back up to both the cloud and a local USB hard drive.  So my plan was to repartition the disk, install CrashPlan, and restore from the local hard drive.

This was fairly easy.  Installing the CrashPlan client was the first task.  There's no .deb package, but rather a .tgz that contains an installer script.  It was actually pretty painless - just kick off the script and wait.  It even installs its own copy of the JRE that doesn't conflict with the system version.  Nice!

Next was actually restoring the data.  Fortunately, CrashPlan has their process for restoring from a USB drive well documented, so there wasn't much to figure out.  The process of connecting a particular backup dataset on the drive to the CrashPlan client was slightly confusing because the backup interface (as far as I can remember) doesn't really make it obvious that a USB drive can have more than one dataset.  But it just boils down to picking the right directory, which is easy when you only have one choice.

The only surprise I ran into was that running the restore took a really long time (essentially all day) and created some duplicate data.  My data drive contained several symlinks to other directories on the same drive and CrashPlan apparently doesn't handle that well - I ended up with two directories that had identical content.  I'm not sure whether this was a result of having symlinks at all, or if it was just moving from Windows symlinks to Linux symlinks.  In any case, it's slightly inconvenient, but not a big deal.
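Cleaning that up was mostly a matter of confirming the two copies really were identical and then re-creating the symlink.  The paths below are hypothetical stand-ins for my actual directories.

```shell
# Confirm the restored copy and the original really match
# (/data/Music and /data/shared/Music are hypothetical paths):
if diff -r /data/Music /data/shared/Music > /dev/null; then
    # Identical trees: drop the duplicate and restore the symlink.
    rm -rf /data/shared/Music
    ln -s /data/Music /data/shared/Music
fi
```

The diff -r check is important - you only want to delete the duplicate once you're sure nothing in it is unique.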

Other services

While the restore ran, I started setting up the system.  Really, there were only a handful of packages I cared about installing.  Unfortunately for me, most of them are proprietary and therefore not in the APT repositories.  But the good news is that most of them were pretty easy to set up.

The first order of business, since this box is primarily a media server, was setting up Plex.  This turned out to be as simple as installing the .deb package and firing up the UI to do the configuration.  From there, I moved on to installing Subsonic.  This was only marginally more difficult, as I also had to install the OpenJDK JRE to get it to run.  I also followed the instructions here to make the service run as a user other than root, since running it as root seemed like not such a great idea.
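The gist of that change, as I remember it, is creating a dedicated account and pointing the package's defaults file at it.  The sketch below assumes the stock .deb layout (data in /var/subsonic, settings in /etc/default/subsonic) and may not match newer releases; the username is my own choice.

```shell
# Create a dedicated system account for the service (name is my choice):
sudo adduser --system --group --no-create-home subsonic

# Give it ownership of Subsonic's data directory:
sudo chown -R subsonic:subsonic /var/subsonic

# In /etc/default/subsonic, change the user the service runs as:
#   SUBSONIC_USER=subsonic
sudo systemctl restart subsonic
```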

The only thing I wanted to install that I couldn't get to work was TeamViewer.  This wasn't a deal-breaker, though, because the only reason I was using TeamViewer under Windows was that I was running Windows 8.1 Home and was too cheap to pay for the upgrade to Professional just to get the RDP server.  But since this is Ubuntu, there are other options.  Obviously SSH is the tool of choice for command-line access and file transfers.  For remote GUI access, I tried several variations on VNC, but it eventually became clear that xRDP was the best solution for me.  It's not quite as straightforward to get working, but this guide provides a nice step-by-step walkthrough.  There are also a few caveats to using it, like the fact that the same user can't be logged in both locally and remotely, but those weren't big issues for my use case.

Results

For the most part, the transition went pretty smoothly.  The setup had a few more hiccups than it could have, but it wasn't too bad.  In any event, what I really cared about was getting Plex and Subsonic up and running, and that was pretty painless.  So I'm back to where I was originally, but on an up-to-date operating system, which is all I really wanted.