Reinstalling Ubuntu

I finally got around to re-installing my desktop/home server the other day.  I upgraded it to Ubuntu 20.04.  It had been running Ubuntu for several years, and in the course of several upgrades, it somehow got...extremely messed up.

Of course, it doesn't help that the hardware is really old.  But the system had accumulated a lot of cruft, to the point that it wouldn't upgrade to 20.04 without serious manual intervention - which, frankly, I didn't feel like taking the time to figure out.  And something had gone wrong in the attempted upgrades, because several programs (including Firefox) had just stopped loading.  As in, I would fire them up, get a gray window for a second, and then they'd segfault.  So it was time to repave.

Sadly, the process was...not great.  Some of it was my fault, some of it was Ubuntu's fault, and some of the fault lands on third parties.  But regardless, what should have been a couple of hours turned into a multi-day ordeal.

Let's start with a list of the things I needed to install and configure.  Most of these I knew going in, though a few I would have expected to be installed by default.  Note that I started with a "normal" install from the standard x64 Ubuntu Desktop DVD.  I completely reformatted my root drive, but left my data drive intact.  I had to install:

  • Vim.  I'm not sure why this isn't the standard vi in Ubuntu.
  • OpenSSH server.  I sort of get why this isn't included by default in the desktop version, but it kinda feels like it should be the default for everything except maybe laptops.
  • Trinity Desktop, because I really liked KDE 3, damn it!
  • The Vivaldi browser, because I really liked old-school Opera, damn it!
  • Cloudberry Backup, for local and off-site backups.
  • libdvdcss, because I've been ripping backups of all my old DVDs before they go bad (which some already are).
  • OwnCloud, which I use to facilitate sharing various types of media files between my devices.
  • The LAMP stack, for my own apps as well as for ownCloud.

The up side is that Cloudberry was super-easy to reinstall.  I had made a copy of its files in /opt before reformatting, so I just restored those and installed the .deb for the latest 3.0 release over top of them (I'd been running 2.x).  Worked like a charm!  Since it's a proprietary package with license validation, I'd been expecting to have to contact support and get them to release and refresh the license, but it turns out that wasn't necessary.

On the down side, a lot of other things didn't go well.  Here's a brief summary of some of the issues I came up against, for posterity.

  1. The video was corrupted.  This happened in both the installer and the main GNOME desktop.  It wasn't quite bad enough to be completely unusable, so I was able to get in and work around it, but it was close.  Fortunately, the issue went away when I switched to Trinity Desktop.
  2. The sound didn't work.  It seems that the system picked up the wrong sound card.  I'm not sure why.  I was able to find and fix that by installing PulseAudio Volume Control.
  3. The scroll buttons on my Logitech Marble Mouse trackball don't work.  I still haven't figured this out.  Of course, there's no graphical utility to re-map mouse buttons that I've found, so it's all editing config files.  I found several possible configs online, but they don't seem to work.  I might just have to live with this, because I'm not sure I care enough to devote the time it would take to dig into it.
  4. I forgot to take a dump of my MySQL databases before reinstalling.  Of course, this was completely my fault.  But the down side is that, from what I read, you can't just "put back" the old database files for InnoDB databases.  Apparently it just doesn't work that way.  Luckily I didn't have anything important in those databases (it was all just dev testing stuff), but it was still annoying.
  5. Re-installing ownCloud did not go as smoothly as anticipated.  In addition to the MySQL issue, it seems that Ubuntu 20.04 ships with PHP 7.4 out of the box, which is great.  However, ownCloud apparently doesn't support 7.4 yet, so it refused to run.  A quick search suggested that the "not working" parts were mostly in some tests, so I was able to comment out the version checks and get it to work.  I wasn't running anything but the standard apps on this instance, so it might not work in the general case, but it seems to be OK for what I need.
  6. Plex was annoying.  When I installed it, I got a brand new server that I had to claim for my account, which is fine, but the old instance was still present in my account - understandable, but annoying.  I had to re-add my libraries to my managed user accounts, but I was able to remove the dead instance from my authorized devices without much trouble.  It just took a bit to figure out what was going on.  It probably didn't help that both servers had the same name.  I also had to re-do all the metadata changes I'd manually made to some of my media files.  Next time I'll need to figure out how to back up the Plex database.
  7. I probably should have thought of this, but I didn't move my user and group files in /etc over to the new install.  Not a big deal, since I don't have that many, but it meant that several of the groups I created weren't present and were recreated with different GIDs than on the old install.  This was mainly an annoyance because it meant that some of the group ownerships on my data drive ended up wrong.
  8. I had some trouble getting Linga back up and running.  After setting up a new virtualenv, I kept getting errors from the Pillow image manipulation library that there was "no module named builtins".  This was really puzzling, because, as the name suggests, "builtins" is built into Python 3.  I initially assumed this was just my Python-fu being weak and that I had an error in my code somewhere.  But no - it was my environment.  After some Googling (well, actually Duck Duck Go-ing), I realized that this was a common problem with Python 2 to 3 compatibility and was reminded of this post that I wrote six months ago.  The short version is that Apache was running the Python 2 WSGI module.  That seems weird, given that the system only ever had Python 3, but the WSGI module Apache loads is compiled against a specific Python version, and the one I had installed was the Python 2 build.  Anyway, I installed libapache2-mod-wsgi-py3 and everything was fine.  (See the sketch just after this list for a quick way to check which interpreter mod_wsgi is actually running.)
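
A note on that last item, since it took me a while to figure out: an easy way to check which interpreter mod_wsgi is actually running is to point it at a trivial WSGI app that reports its Python version.  This is just a rough diagnostic sketch - the file name, port, and standalone test server are assumptions of mine, not anything specific to Linga or my Apache config:

```python
# report_python.py - minimal WSGI app that reports the interpreter serving it.
# Point an Apache WSGIScriptAlias at this file and load the URL: if it shows a
# 2.x version, the mod_wsgi module is built against Python 2.
import sys

def application(environ, start_response):
    body = ("Served by Python %s\n" % sys.version).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # Quick standalone sanity check outside Apache; under mod_wsgi only the
    # `application` callable above is used.
    from wsgiref.simple_server import make_server
    make_server("127.0.0.1", 8000, application).serve_forever()
```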

All in all, this experience reminds me why I don't do this sort of thing very often.  All this tinkering and debugging might have been kind of fun and interesting when I was 25 and had nothing better to do with my time, but these days it's just tedious.  Now I'd much rather things "just work", and if I have to forgo some customization or flexibility, I'm kinda fine with that.

Holy crap, Let's Encrypt is super easy!

Well, I just set up Let's Encrypt on my home server for the first time.  When I was finished, my first thought was, "Damn, that was awesome!  Why didn't I set that up a long time ago?"

Let's Encrypt logo

If you're not familiar with Let's Encrypt, it's a non-profit project of the Internet Security Research Group to provide website operators with free SSL certificates.  The idea is to make it easy for everyone to have SSL properly enabled for their website, as opposed to the old days when you had to either buy an SSL certificate or use a self-signed one that browsers would complain about.

I didn't really know much about Let's Encrypt until recently, other than the fact that they provide free SSL certs which are actually trusted by browsers.  And really, that was all I needed to know to be interested.  So I decided to try it out on my home server.  I was already using them on this website, but that was a slightly different situation: my web host integrated Let's Encrypt into their control panel, so all I had to do to set up a cert for one of my subdomains was click a button.  Super convenient, but not really any learning process there.

It turns out that setting up my home server to use the Let's Encrypt certs was pretty painless.  The recommended method is to use certbot, which is a tool developed by the EFF.  It basically automates the entire process of setting up the certificate.  Seriously - the entire process.  It's actually way easier to set up a Let's Encrypt cert with certbot than it is to make your own self-signed cert.  You just need to run a command, answer a couple of questions, and it will get the certs for each of your sites, install them, and keep them updated.  The only catch is that you need root shell access and your web server has to be accessible via port 80 (for verification purposes).

Compared to the old self-signed cert I was using, this is way easier.  You don't have to generate any keys, or create a CSR (Certificate Signing Request), or edit your server config files.  Running certbot takes care of everything for you.  So if you haven't tried Let's Encrypt and you're running a site that could use some SSL, I definitely recommend it.
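
One small follow-up that I find reassuring: since certbot is supposed to keep renewing the certs on its own, it's easy to spot-check the expiration date from time to time.  Here's a quick sketch using just the Python standard library - the example.com hostname is a placeholder, so substitute your own domain:

```python
# check_cert.py - connect to a site over TLS and report when its certificate
# expires.  Handy for spot-checking that automatic renewal is actually working.
import socket
import ssl
from datetime import datetime, timezone

def cert_expiry(hostname, port=443):
    context = ssl.create_default_context()  # validates against the system trust store
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return expires.replace(tzinfo=timezone.utc)

if __name__ == "__main__":
    expires = cert_expiry("example.com")  # placeholder hostname
    print("Certificate expires:", expires)
    print("Days remaining:", (expires - datetime.now(timezone.utc)).days)
```

Let's Encrypt certificates are only valid for 90 days, so if the days-remaining number keeps getting topped back up, the automatic renewal is doing its job.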

Isn't SanDisk considerate

Author's note: Here's another article from the archives. This one was written way back on March 17th, 2007. At the time I was still working for the county government and using Linux on all my home computers. This rant came out of frustration at the interaction of not-fantastic bespoke software and annoying semi/anti-features in a commercial flash drive.

One thing we don't have to deal with in the Linux world is bundled junk. Most of our software is free or open-source and is installed from a single trusted distributor, so we don't have to worry about whether a package comes with spyware, a crappy browser toolbar, or a useless systray icon.

This was brought home to me today when I had to deal with a user's flash drive at work. His department has a number of laptops that can't connect to our network, but this crappy application that the state foisted on us requires them to move data back and forth between our file server and the laptops. The easy solution to this is just to use flash drives.

Well, this user's flash drive broke and the department didn't have any spares, so he went out and bought a 1GB Cruzer Micro. Nice enough drive, plenty of capacity for his purposes. The only problem was, it didn't work with The Crappy Software Foisted On Us By The State (TCSFOUBTS, pronounced 'ticks-fouts').

You see, TCSFOUBTS is one of those half-baked, government consultingware type of systems. It has a bazillion features and is highly customizable. On the down side, it's slow as all hell, has a terrible user interface, and is difficult to configure. For example, some customization options require manually editing MS Access database tables and quite a number involve hand-editing INI files or other text files.

Now, having planned ahead, the TCSFOUBTS people built in a feature to let users automatically copy their data onto a removable drive with just a couple of clicks. However, by "removable drive," what they really meant was "a fixed drive letter." For this feature to work, you have to configure TCSFOUBTS with the drive letter of your removable device.

By now, you probably see where I'm going with this. TCSFOUBTS needs the drive letter, which has to be configured by the system administrator. However, unlike the old flash drive, the Cruzer Micro has a bootable partition that displays as a CD-ROM drive. (Note from the present: this was before booting from USB was a common thing.) Of course, that's the first partition, and so it gets the drive letter where TCSFOUBTS wants to copy its files.

What was on that first partition? I don't remember. Some useless utility software that nobody asked for. One of those "value-adds" that they put on flash drives because they want to use the space for something. But it meant that the drive was stuck with an unusable partition that I didn't immediately have the ability to remove. So for the user to use this drive, they would have to hand-edit some INI file, and then edit it back when they got a different drive. And since the users for this system were all non-technical, that sounded less than appealing.

Eventually the user decided to just not bother and requisition an "official" drive. It was just a frustrating series of events. It goes to show how a few misplaced assumptions can completely mess up a user experience.

Mobile phone woes

Something strange happened the other week.  I was expecting a call at a certain time.  I was sitting at my desk, with my phone directly in front of me.  It was turned on, it had reception, and the ringer was on with the volume turned up.  But it never rang.

A few minutes after the time I was expecting the call, I got a visual voicemail notification.  Apparently the person had called and it had gone to voicemail.  Yet my phone didn't ring, and there was no indication of a missed call.  Weird.  So I tried to call the person back.  And when I did...nothing happened.  As in, the call didn't connect.  I mean, it didn't even ring.  And it wasn't just this one person.  It was everything.  I tried a bunch of calls to a bunch of numbers and only a handful - maybe 1 out of 10 - even rang, and when they connected the quality was pretty bad.  Same thing when I had other people try to call me - not even a ring.

And it wasn't just me.  It turned out my wife was experiencing the same issue. And the really strange part was that the problem was just with voice calls.  We could both still send and receive SMS messages and use mobile data.  It was just voice calls that didn't work.

OnePlus 5

We're both using basically the same setup - OnePlus 5s on Cricket Wireless.  So if there was some issue, it's not surprising that it might affect both of us.  But nothing had changed on our end before this started.  There was no software update for our phones, or anything like that.  Our phones had been working just fine up until then.

So, of course, I contacted Cricket's tech support.  Obviously I couldn't call them, so I used the online chat.  The person I chatted with was pleasant enough, but ultimately couldn't help and recommended I either get a new phone or go to a store and see if swapping out the SIM card would fix it.  So the next day I went to the store.  The people were, again, very pleasant and tried a bunch of things, including a SIM swap and the usual litany of restarting and resetting things on the phone, but they ultimately came up with nothing.  The only things they could suggest were to try a factory reset of the phone or call tech support again and see if they can reset something on the back-end that might fix it.  So I had another web chat with support and they created a support ticket and said that a technician would contact me to help resolve the problem.

There was one red flag at this point - the person I was chatting with wasn't clear on how the technician would contact me.  They asked for my phone numbers, but neither of our phones could reliably receive calls.  When I pointed this out, the representative didn't seem to get it and simply repeated that they would contact me within 24 hours.

Fast forward to the next morning.  While we were on our way to do some shopping at the public market, a notification goes off on my wife's phone.  It's visual voicemail, with a message from the Cricket technician whose call we apparently just missed.  Because our phones can't receive voice calls.  Which was the entire reason he was calling us in the first place! 

Don't get me wrong - I know tech support is a tough job and I don't mean to put down the people who do it.  But come on!

Simple Mobile SIM kit

Anyway, I was so disgusted by that episode that I decided to take matters into my own hands and see if I could definitively pin down whether it was the network or the phones.  So, to that end, I went over to the local Walmart and bought a Simple Mobile SIM starter kit for $10.

I don't know much about Simple Mobile - just that it's a pre-paid cellular provider that runs on T-Mobile's network.  And I know that Cricket uses AT&T's network, so, by process of elimination, if I could make calls from my phone using the Simple Mobile SIM, that would mean my phone was fine and it was something in Cricket's network that was broken.

The setup process was pretty painless.  I just stuck the Simple Mobile SIM card into the second SIM port (the OnePlus 5 supports dual SIM cards, which is actually pretty cool and it was nice to have a reason to test that out), signed up for the basic $25/month service online, and waited a while.  After a bit, I got a bunch of text messages from Simple Mobile and I was in business.

Naturally, the first thing I did was try to call a few numbers from my phone book using the new SIM.  And, of course, the calls went through just fine.  So my phone is officially not broken, which is good news.  Also, I now had a secondary phone number that I could give to the Cricket tech support people.  Which is what I did.

Well, the next day I got another call from the Cricket technician.  He just asked if I was still having the issue, asked a couple of basic questions, and said they'd get back to me.  They never did.  But the next day I tried making a few calls and my phone was working again.  I have no idea what the problem was, or what caused it, but apparently they fixed it, which is all I really care about.

While this experience does not incline me to go get a land-line, it does make me a little hesitant about having a single phone provider.  Fortunately, the ready availability of cheap, pre-paid SIMs helps mitigate that concern.  It's nice to know that if my current provider becomes unreliable, short-term solutions are easy to come by.