The dangers of using old stuff

I was reminded the other day of the dangers of using old software.  And by "old" I mean, "hasn't been updated in a couple of years".  So not really that old, just not new.

For my personal projects, I use the open-source version of a program called The Bug Genie.  It's a web-based issue tracker written in PHP.  I picked it mostly because it was easy to install on my hosting account (which didn't allow SSH access at the time) and sucked less than the alternatives I had tried.  It's actually not a bad program - a little confusing to administer, but it has a decent feature set and UI.

The problem is that the last "official release" of the open-source version was in 2015.  In and of itself, this is not a problem - there are lots of super-useful programs out there that haven't been updated in far longer than that.  The problem is that this is a web application and the web, as an ecosystem, is not at all shy about breaking your stuff if you go more than six months without updating it.

So I tried to log into my Bug Genie instance the other day and ran into two issues.  The first, and most serious, was a fatal error saying that a parameter to the count() function must be either an array or a countable.  After a little debugging, it turned out that this was due to a change in PHP 7.2, to which my web host had recently upgraded.  One of the Composer packages contained a bug that didn't always pass an appropriate parameter to count().  In older versions of PHP, this would pass silently, but PHP 7.2 raises a warning, which in turn was causing the error I saw.  The fix was simply to update to a more recent version of the package, which fixed the underlying bug.

The second issue was client-side.  The project pages have a number of panels on them that are loaded via AJAX calls, and none of them were loading.  Turned out the problem was that a JavaScript file related to the Mozilla Persona support wasn't loading and this was causing subsequent scripts on the page to fail.  A quick search revealed that there was a good reason it wasn't loading - Mozilla discontinued its Persona service in 2016, so the URL the page was trying to load no longer existed.  Fortunately, this was easily fixed by turning off that feature.

So we have two things broken by the passage of time.  One was a change in platform semantics, the other a change in the surrounding ecosystem.  Both were theoretically foreseeable, but both were also very easy to overlook.

For a software developer, this is a parable on building for longevity.  There are dangers in relying on external dependencies.  There are dangers in being even slightly out of spec.  If we want our software to last, we need to be vigilant in our validation and establish clear boundaries.  You can't trust the testing of today to reflect the world of tomorrow.

New backup solution

Backups are important.  Everybody in IT knows this.  After all, whenever someone tells you they lost some important file, the first question is always, "Well do you have a backup?"

Nevertheless, many of us are lax in setting up our own backup solutions.  I know I was for many years.  And while that might not be forgivable for people who really should know better, it is understandable.  After all, implementing a proper backup solution is a lot of work and the benefits are not immediately obvious.  Think about it: it requires an ongoing investment of both time and money which, in the best-case scenario (i.e. nothing goes wrong), will never pay off.  Is it any wonder that even IT professionals aren't eager to put a lot of effort into something they actively hope never to need?

But about a year ago, I finally put a home backup solution in place.  That solution was CrashPlan.  But now CrashPlan is discontinuing their home plans, which means I've been forced to figure out something else.

The old setup

CrashPlan's "home" plan was actually pretty nice.  Perhaps too nice, since apparently it wasn't profitable enough for them to keep offering it.  The subscription was priced at $150 per year and that included up to ten computers in your household and unlimited cloud storage.  On top of that, their agent software supported not only backing up to the cloud, but also backing up to external hard drives and other computers running the CrashPlan agent.  And it ran on Windows, Linux, and Mac!

Currently, I have three computers in active use in my house: my laptop, my wife's laptop, and my desktop, which is really more of a home server at this point.  I had both of the laptops backing up to the cloud, while the desktop backed up to both the cloud and an external hard drive.  I've got close to a terabyte of data on the desktop, so the external drive is important.  I do want a copy of that data off-site just in case of a catastrophe, but I'd rather not have to suck that much data down from the cloud if I can avoid it.

I was happy with this setup and wanted to keep something equivalent to it.  I feel like it provides me with sufficient protection and flexibility while not requiring me to buy extra hardware or pay for more cloud storage than I need.

The alternatives

Turns out this setup isn't quite as easy to replicate as I had hoped.  There are plenty of home backup services out there, but most of them don't offer the same range of options.  For instance, many are Mac and Windows only - no Linux.  Many offer limited storage - usually a terabyte or less, which I'm already pushing with my largest system.  Many are cloud backup only - no local drives.  And when you add up the cost of three systems, most of them are more expensive than CrashPlan Home was.

On their transition page, CrashPlan recommends that existing home customers either upgrade to their small business plan or switch over to Carbonite.  I briefly considered upgrading to the small business plan, but the price is $10/month/device.  So I'd go from $150/year to $360/year for basically the same service.  That doesn't sound like a great deal to me.

Carbonite, on the other hand, is one of the options I considered the first time around, before I settled on CrashPlan.  They're offering a 50% discount to CrashPlan customers, so the initial price would only be $90.  Presumably that's only for the first year, but even after that, $180/year is only slightly more than CrashPlan.  However, from what I can see, Carbonite doesn't support Linux on their home plan - they only do Linux servers on their office plan.  I also don't see an option to back up to an external drive, although it does support backing up external drives to the cloud...for another $40/year.

Plan B

After doing some research and thinking about it for a while, I eventually decided to skip the all-in-one services.  Yeah, they're nice and require almost zero work on my part, but I wanted some more flexibility and didn't want to pay an arm and a leg for it.  However, I didn't want to completely roll my own solution either.  Writing simple backup scripts is easy, but writing good backup scripts with proper retention, special file handling, logging, notifications, etc. is a lot of work.

If you don't want a full-service backup solution, the alternative is to go a la carte and get a backup program and separate cloud storage provider.  There are a number of options available for both, but after some research and experiments, I decided to go with CloudBerry Backup for my backup program and Backblaze B2 as my storage provider.

The choice of Backblaze B2 was kind of a no-brainer.  Since this is long-term backup storage, performance is not a huge concern - it was mostly a question of capacity and price.  B2 has standard "pay for what you use" billing and the prices are extremely reasonable.  Currently, storage is $0.005 per gigabyte per month and there's no charge for uploads.  So for the volume of data I have, I'm looking at storage costs of $5 or $6 per month, which is pretty cheap.
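
For the record, the arithmetic behind that estimate is trivial.  Here's a quick Python sketch, using the per-gigabyte rate quoted above and assuming roughly a terabyte of data (the data size is just my rough figure):

# Back-of-the-envelope B2 storage cost estimate.
STORAGE_PRICE_PER_GB_MONTH = 0.005   # USD per GB per month, as quoted above
data_gb = 1000                       # roughly a terabyte (my own rough figure)

monthly_cost = data_gb * STORAGE_PRICE_PER_GB_MONTH
print(f"~${monthly_cost:.2f} per month, ~${monthly_cost * 12:.2f} per year")
# Prints: ~$5.00 per month, ~$60.00 per year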

The backup program was a different story.  I tried out several options before settling on CloudBerry.  Most of the options I tried were...not user friendly.  For me, the CloudBerry UI had the right balance of control and ease of use.  Some of the other solutions I tried were either too arcane or simplified things too much to do what I wanted.  CloudBerry uses a wizard-based configuration approach that makes it relatively painless to figure out each step of your backup or restore plan.  I find that this allows them to expose all the available options without overwhelming the user.

As far as capabilities go, CloudBerry pretty much has what I wanted.  It supports Windows, Mac, and Linux, and it can back up to multiple storage destinations, including local file systems, network file systems, and various cloud providers.  Beyond that, the feature set depends on the version you use.  There's a free version, but I went with the paid desktop version because it supports compression and encryption.  The licenses are per-system and they're a one-time charge, not a subscription.  The basic "home" licenses currently run $50 for Windows and $30 for Linux, which I think is pretty reasonable.

Results

So far, the combination of CloudBerry and B2 for my backup solution is working well.  I've been using it for about five months and have all three of my systems backing up to the cloud and my desktop also backing up to a USB hard drive.  The process was largely painless, but there were a few bumps along the way.

As I mentioned in a previous post, as part of this process I moved my desktop from Windows to Linux.  As it turns out, setting up the two laptops to have CloudBerry back up to B2 was completely painless.  Literally the only annoyance I had in that process was that it took quite a while for the confirmation e-mail that contained my license key to arrive.  So if you're a Windows user, I can recommend CloudBerry without reservation.

Linux wasn't quite as simple, though.  I had a number of problems with the initial setup.  The interface is very stripped-down compared to the Windows version and doesn't offer all of the same options.  I also had problems getting the backups to run correctly - they were repeatedly stalling and hanging.  Fortunately, the paid license comes with a year of support and I found the CloudBerry support people to be very helpful.  In addition, they seem to be actively working on the Linux version.  I initially installed version 2.1.0 and now they're up to 2.4.1.  All of the issues I had have been resolved by upgrading to the newer versions, so things are working well now.

I had initially been a little concerned about the per-gigabyte and per-transaction pricing, but so far it hasn't been an issue.  I found Backblaze's storage cost calculator to be pretty accurate and the per-transaction charges are not significant.  The cost has been basically what I initially estimated and I haven't had any surprises.

Overall, I'm very happy with this solution.  The price is reasonable and, more importantly, it provides me with lots of flexibility.  Hopefully I'll be able to keep this solution in place for years to come.

Database URIs are a pain

Note to self: when using SQLite with SQLAlchemy, the URI has three slashes after the colon - and that's before you count the leading slash of an absolute path.

I only post this because I've forgotten this at least three times and had to spend way too much time figuring it out.  I have a project with a test SQLite database in the local project directory, with a URI of sqlite:///../file.db.  And that's fine.  But then I forget and try to change it to an absolute path and can't figure out why sqlite:///path/to/file.db doesn't work.  Of course it doesn't: it should be sqlite:////path/to/file.db, with four slashes at the beginning - the last one being part of the actual path.  Hopefully this time I won't forget.
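
For future me, here's a minimal sketch of the difference (the engine setup and the absolute path are just for illustration):

from sqlalchemy import create_engine

# Relative path: three slashes, then the path, resolved against the
# current working directory.
engine = create_engine("sqlite:///file.db")

# Absolute path: still three slashes for the URI itself, plus the path's
# own leading slash, for four in total.
engine = create_engine("sqlite:////path/to/file.db")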

New SSD for me!

I've been going back and forth on getting a new laptop for a few months.  My Lenovo IdeaPad U310 is about four years old and it's been starting to drag.  However, while it was never top-of-the-line, the specs are still decent compared to what I could buy for a reasonable price (translation: "a price I wouldn't feel uncomfortable communicating to my spouse").  So eventually I settled on just upgrading the spinning disk to a solid state drive.

I was slightly apprehensive about this choice.  I've upgraded laptops before, but the IdeaPad officially has "no user-serviceable parts", so it clearly wasn't going to be as easy as the last time.  I found a good guide, and the short version is that replacing the drive was simple.  The hard part was getting the stupid case open.

The difference between the SSD and the rotational disk is actually pretty dramatic.  I knew there would be a big performance difference, but it's interesting to see just how big it is on otherwise identical hardware.  Boot time went from "forever" to 15 or 20 seconds.  Running backups and other disk-intensive activity no longer grinds the entire system to a halt.  It's fantastic!  Makes me wish I'd sprung the extra cash for an SSD when I originally bought the system.

Apparently CrashPlan eats inotify watches

As a brief follow-up to my last post, I had a problem trying to remote into my newly installed Ubuntu box today.  As I mentioned in that post, I'm using xRDP for my remote GUI access and I'm tunneling it over SSH.  When I connected to xRDP, I logged in and was greeted by the Ubuntu lock screen for my running session.  So I entered my password, hit "enter", and waited.  And waited.  And waited some more.  My session just never unlocked.

OK, so apparently xRDP is freaking out.  I already have an SSH session open, so I'll just restart the service and maybe that will clear up the problem.  So I ran service xrdp restart and it told me:

Failed to add /run/systemd/ask-password to directory watch: No space left on device

That didn't seem right.  I had plenty of disk space and plenty of RAM - what could possibly be out of space?  So naturally I asked the internet, which led me to this Stack Exchange post.  Turns out that CrashPlan has a habit of eating up inotify watches, which is what I was actually out of.  Stopping the CrashPlan service fixed the problem.  Fortunately, I don't need CrashPlan anymore, so that shouldn't be a problem again.
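
For what it's worth, if you want to see which process is hogging the watches before you start killing things, something like this rough Python sketch will tally them per process by walking /proc.  This is just my own diagnostic, not anything CrashPlan or xRDP ships; it's Linux-only and you'll likely need root to see other users' processes:

#!/usr/bin/env python3
"""Tally inotify watches per process by scanning /proc (rough sketch)."""
import glob
import os
from collections import Counter

watch_counts = Counter()

for fd_path in glob.glob("/proc/[0-9]*/fd/*"):
    try:
        # inotify instances show up as anonymous inodes.
        if "inotify" not in os.readlink(fd_path):
            continue
        parts = fd_path.split("/")
        pid, fd = parts[2], parts[4]
        # Each watch is reported as its own "inotify wd:..." line in fdinfo.
        with open(f"/proc/{pid}/fdinfo/{fd}") as fdinfo:
            watches = sum(1 for line in fdinfo if line.startswith("inotify"))
        with open(f"/proc/{pid}/comm") as comm:
            name = comm.read().strip()
        watch_counts[f"{name} (pid {pid})"] += watches
    except OSError:
        # Processes come and go (or deny access) while we scan; skip them.
        continue

for proc, count in watch_counts.most_common(10):
    print(f"{count:6d}  {proc}")

The per-user ceiling lives in /proc/sys/fs/inotify/max_user_watches, so bumping that with sysctl is the other way out if you actually need whatever is doing all the watching.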