The joy of WSL and desktop Linux

I had a workflow revelation on my work machine the other week: Corporate IT pushed a Windows 10 update that brought me up to a version that actually supports running Linux GUI apps under WSL!

I've had WSL GUI apps on my home Win11 system for a while, but having that on my work machine is huge.  I use WSL extensively there.  See, we've been standardizing our dev environment setup using Lando, which is basically a convenience layer on top of docker-compose.  It's a handy tool, but the relevant thing to note here is that it works by directly mounting your project code directory inside the Docker container.  This is convenient for Mac and Linux users, but is kind of a problem for Windows users.
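
Lando hides the plumbing, but the effect is essentially this, in docker-compose terms (a sketch of the idea, not Lando's actual generated config):

services:
  appserver:
    image: php:7.4-apache   # example image
    volumes:
      - ./:/app             # project directory bind-mounted into the container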

As you may (or may not) know, WSL2 is great and out-performs WSL1 in every way except one: cross-filesystem performance.  As long as you keep everything inside the WSL filesystem, you're golden and everything is fast.  But as soon as you try to cross from the Windows filesystem to the Linux one, or vice versa, performance just falls off a cliff.  Sure, it's not a big deal if you just want to edit a file or something like that, but anything that does any remotely significant amount of filesystem access (e.g. running an npm install) is just painful.  In my experience, it's not unheard of for the performance penalty to be on the order of 10x. 
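
To make that concrete, here's the kind of comparison I mean (paths are just examples):

$ cd /mnt/c/Users/me/projects/myapp && time npm install   # Windows-hosted: painful
$ cd ~/projects/myapp && time npm install                 # WSL-native: fast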

Clearly that's not something you want to endure as part of your normal workflow.  The simple fix is to do all your work inside WSL, which for me means installing Lando in WSL and hosting my code inside WSL.  The only problem is managing the code.  If you want to do that in Windows, you need to do it over a network share, which works, but isn't exactly a great experience.  It also causes various permissions and file ownership issues for apps like Git that actually care about that.

That's where the WSL GUI apps come in.  Rather than dealing with the network share hassle, you can just install your favorite GUI tools inside WSL and run them just like you're on native Linux.  Problem solved!

Well, mostly solved.  Sadly, not everything runs on Linux.  In particular, there's no Linux port of SourceTree, which is currently my graphical Git client of choice.  But it's not that I particularly like SourceTree - it's just that I hate it less than all the other Git clients I've tried.  So I was forced to try some other options.

This part did not go well.  I tried a few different options, including GitFiend (which was nice, but would randomly crash under WSL) and Git-Cola, which was also decent, but which I had to drop because if left alone it would occasionally lock up and then somehow take down the entire system if I tried to close it.  I have no idea how it managed to do that (presumably some bug in WSL's GUI layer), but that's a different problem.  I also attempted to try Gittyup, but I couldn't, because it only offered Flatpak packages.  And, of course, the Flatpak daemon (or whatever it's called) won't install on WSL because it's missing some of the system-level stuff that it uses.  But that's a different post.

Eventually, I declared GUI bankruptcy and decided to actually learn how to use Fugitive, the Git plugin for Vim.  Turns out that Fugitive is actually pretty good and figuring it out was easier than finding a good graphical Git client.  But that's also a story for another post.

In any event, having WSL GUI apps is pretty nice.  Now I can do all my work in WSL, and still have both GVim and PHPStorm, if I need it, without having to pay the cross-OS performance price.  So I can have nice things and good performance.  Yay!

Installing PHPStorm under WSL2

The other week I tried to install PHPStorm under WSL2.  Because that's a thing you can do now (especially since Linux GUI apps now work in recent Windows 10 updates).  The installation process itself was pretty simple.

  • Download PHPStorm for Linux from the JetBrains website.
  • Extract the tarball and run the bin/phpstorm.sh script (see the commands below).
  • PHPStorm should start up.
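
In shell terms, that amounts to something like this (version numbers will vary):

$ mkdir -p ~/opt
$ tar xzf PhpStorm-*.tar.gz -C ~/opt
$ ~/opt/PhpStorm-*/bin/phpstorm.sh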

The next step is to configure your license.  In my case, I was using a corporate license server.  The issue with this is that you need to log into JetBrains' website using a special link to activate the license.  Unfortunately:

  • By default, WSL doesn't have a browser installed.
  • Firefox can't be installed because the default build uses a snap image, and WSL apparently doesn't support snap.
  • PHPStorm doesn't appear to be able to properly deal with activating via a Windows browser (I tried pointing it to the Windows Chrome executable and got an error page that points to a port on localhost).

So how do we get around this?  Well, we need to install a browser in WSL and configure PHPStorm to use it.  So here's what we do:

  • Skip the registration for now by starting a trial license.
  • Download the Vivaldi for Linux DEB package from Vivaldi's website.  You could use a different browser, but I like Vivaldi and it offers a convenient DEB package, so I used that.
  • Install the Vivaldi DEB.  WSL will be missing some packages, so you have to run apt install --fix-broken after installing it (see the commands after this list).
  • Go into the PHPStorm settings and configure your web browsers to include Vivaldi and set it as the default browser.
  • Go back to the registration dialog and try again.  This time, PHPStorm should start up Vivaldi and direct you to the appropriate link.
  • Log into your JetBrains account and follow the instructions.  The web-based portion should succeed and registration should complete when you click "activate" in PHPStorm again.
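
The Vivaldi part of that, roughly (the .deb filename will vary):

$ sudo dpkg -i vivaldi-stable_*_amd64.deb
$ sudo apt install --fix-broken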

There we go - PHPStorm is registered and works.  Mildly annoying setup, but not actually that bad.

OneDrive for Linux

As I mentioned a while ago, I replaced my desktop/home server this past summer.  In the process, I switched from my old setup of Ubuntu running Trinity Desktop to plain-old Ubuntu MATE, so I've been getting used to some new software anyway.  As part of this process, I figured it was time to take another look for OneDrive clients for Linux.

See, I actually kind of like OneDrive.  I have an Office 365 subscription, which means I get 1TB of OneDrive storage included, so I might as well use it.  I also happen to like the web interface and photo-syncing aspects of it pretty well.

However, I'm slightly paranoid and generally distrustful of cloud service providers, so I like to have local copies and offline backups of my files.  This is a problem for me, because my primary Windows machine is a laptop, and I don't want to pay the premium to put a multi-terabyte drive in my laptop just so I can sync my entire OneDrive, and scheduled backups to a USB disk are awkward for a laptop that's not plugged in most of the time.  Now, I do have a multi-terabyte drive connected to my Linux desktop, but for a long time there were no good OneDrive sync clients for Linux.  In the past, I had worked around this by using one-off sync tools like Unison (which...mostly worked most of the time) or by setting up an ownCloud sync on top of the OneDrive sync (which worked but was kind of janky).  However, those depended on syncing from my Windows laptop, which was OK when I had 20 or 30 gigabytes of data in OneDrive, but at this point I'm well over 100GB.  Most of that is archival data like family photos and just eats up too much space on a 500GB SSD.

Enter InSync.  InSync is a third-party file sync tool that runs on Windows, Mac, and Linux and supports OneDrive, Google Drive, and Dropbox.  It has all the bells and whistles you'd expect, including file manager integrations, exclusions, directory selection, and other cool stuff.  But what I care about is the basics - two-way syncing.  And it does that really well.  In fact, it totally solves my problem right out of the box.  No more janky hacks - I can just connect it to my OneDrive account and it syncs things to my Linux box.

The only real down-side to InSync is that the licensing is confusing.  (It's also proprietary, but I don't mind that.)  The up side is that it's not actually that expensive - currently, the pricing page lists licenses at $30 USD per cloud account.  So if you only want to sync OneDrive, it's $30 and you're done.  However, there's also an optional support contract and there's some difference between "legacy" licenses (which I think is what I have) and their new subscription model.  Frankly, I don't fully understand the difference, but as long as it syncs my OneDrive and doesn't cost too much, I don't really care.

So if you're a OneDrive user and a Linux user, InSync is definitely worth a try.  I don't know about the other platforms or services (I assume they're all similar), but OneDrive on Linux works great.

On WSL performance

As somebody who does a lot of work in a Linux environment, WSL (the Windows Subsystem for Linux) has become almost a required tool for me.  A while back, I looked up ways to share files between native Windows and WSL.  For various reasons, the most convenient workflow for me is to do much of my work on the code from within Windows, but then run various tests and parts of the build process in Linux.  So I wanted to see what my options were.

The option I had been using was to use the mount point that WSL sets up in Linux for the Windows filesystem.  In addition to that, it turns out there are a couple of ways to go the other direction and read Linux files from Windows.  There's the direct, unsupported way or the supported way using a network share.  Sadly, it turns out none of these are really good for me.
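
For reference, the two directions look like this (assuming the distro is named Ubuntu):

/mnt/c/Users/me/project         <- Windows files, as seen from Linux
\\wsl$\Ubuntu\home\me\project   <- Linux files, as seen from Windows (the network share)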

My main problem and motivation for looking into this was simple: performance.  When crossing the Windows/Linux filesystem boundary, performance takes a nose-dive.  And I'm not just talking about an "I notice it and it's annoying" performance hit - I'm talking a "this is actively reducing my productivity" hit.  For filesystem-intensive processes running in Linux, a conservative estimate is that things take at least 2 to 3 times as long when the files are hosted in Windows as when they're hosted in Linux.  And it's frequently much worse than that.  For one project I was working on, the build process took upwards of 20 minutes when the files were on Windows, but when I moved them to Linux it was around 3 minutes.  And it's not just that project.  Even for smaller jobs, like running PHPStan over a different project, the difference is still on the order of several minutes vs. 30 seconds or so.  Perhaps this has improved in more recent versions, but I'm still stuck on Windows 10 and this is seriously painful.

My solution?  Go old-school: I wrote a quick script to rsync my project code from Windows to Linux.  Not that rsync is super-fast either, but it's not bad after the initial sync.  I just set it up to skip external dependencies and run NPM et al. on Linux, so even when there's "a lot of files", it's not nearly as many as it could be.  Of course, then I need to remember to sync the code before running my commands, which is not ideal.  But still, the time difference is enough that I can run the command, realize I forgot to sync, do the sync, and run the command again in less time than just running it once on the Windows-hosted code.
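
The script is nothing fancy - a minimal sketch of the idea, with example paths and excludes:

#!/bin/sh
# One-way sync from the Windows-hosted checkout to a WSL-native copy.
# Dependency directories are skipped; those get rebuilt on the Linux side anyway.
rsync -a --delete \
    --exclude node_modules/ \
    --exclude vendor/ \
    /mnt/c/Users/me/projects/myapp/ ~/projects/myapp/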

Changing the display manager in Ubuntu

Just as a quick note to my future self, if you want to change the display manager in Ubuntu, you just need to run the following:

$ sudo dpkg-reconfigure gdm3
$ sudo systemctl restart display-manager.service

The reconfigure will bring up a menu that allows you to choose from the installed display managers.  There's a nice summary with pictures here.
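
The choice ends up recorded in /etc/X11/default-display-manager, which is a quick way to check what's currently active (the output below is an example for TDM):

$ cat /etc/X11/default-display-manager
/opt/trinity/bin/tdm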

If anyone cares, the context here is that I run Ubuntu 20.04 with the Trinity Desktop Environment on my home desktop/server.  The problem is that pretty much every time I run an upgrade on that box it resets the display manager from TDM back to the default GDM.

This is actually a really big problem because GDM doesn't work on this box.  Aside from the disk drives, most of the hardware in this box is about 10 years old.  So the video card is sufficiently archaic that GNOME just can't deal with it.  When I try to log into GNOME or even just use GDM, I end up with massive display corruption and the desktop is basically unusable.

One of these days, I should really replace that computer.  Or maybe rebuild it.  Or possibly just relegate it to purely headless server duty and get a different box to use as a desktop.  One day....

Reinstalling Ubuntu

I finally got around to re-installing my desktop/home server the other day.  I upgraded it to Ubuntu 20.04.  It had been running Ubuntu for several years, and in the course of several upgrades, it somehow got...extremely messed up.

Of course, it doesn't help that the hardware is really old.  But it had accumulated a lot of cruft, to the point that it wouldn't upgrade to 20.04 without serious manual intervention - which, frankly, I didn't feel like taking the time to figure out.  And something had gone wrong in the attempted upgrades, because several programs (including Firefox) had just stopped loading.  As in, I would fire them up, get a gray window for a second, and then they'd segfault.  So it was time to repave.

Sadly, the process was...not great.  Some of it was my fault, some of it was Ubuntu's fault, and some of the fault lands on third parties.  But regardless, what should have been a couple of hours turned into a multi-day ordeal.

Let's start with a list of the things I needed to install and configure.  Most of these I knew going in, though a few I would have expected to be installed by default.  Note that I started with a "normal" install from the standard x64 Ubuntu Desktop DVD.  I completely reformatted my root drive, but left my data drive intact.  I had to install:

  • Vim.  I'm not sure why this isn't the standard VI in Ubuntu.
  • OpenSSH server.  I sort of get why this isn't included by default in the desktop version, but it kinda feels like it should be the default for everything except maybe laptops.
  • Trinity Desktop, because I really liked KDE 3, damn it!
  • The Vivaldi browser, because I really liked old-school Opera, damn it!
  • Cloudberry Backup, for local and off-site backups.
  • libdvdcss, because I've been ripping backups of all my old DVDs before they go bad (which some already are).
  • OwnCloud, which I use to facilitate sharing various types of media files between my devices.
  • The LAMP stack, for my own apps as well as for ownCloud.

The up side is that Cloudberry was super-easy to reinstall.  I created a backup of the files in /opt before reformatting, then just restored those and installed a .deb for the latest release over top of them.  Worked like a charm!  Since it's a proprietary package with license validation, I'd been expecting to have to contact support and get them to release and refresh the license, but it turns out that wasn't necessary.

On the down side, a lot of other things didn't go well.  Here's a brief summary of some of the issues I came up against, for posterity.

  1. The video was corrupted.  This happened in both the installer and the main GNOME desktop.  It wasn't quite bad enough to lock me out entirely, so I was able to get in and work around it, but it came close.  Fortunately, the issue went away when I switched to Trinity Desktop.
  2. The sound didn't work.  It seems that the system picked up the wrong sound card.  I'm not sure why.  I was able to find and fix that by installing PulseAudio Volume Control.
  3. The scroll buttons on my Logitech Marble Mouse trackball don't work.  I still haven't figured this out.  Of course, there's no graphical utility to re-map mouse buttons that I've found, so it's all editing config files.  I found several possible configs online, but they don't seem to work.  I might just have to live with this, because I'm not sure I care enough to devote the time it would take to dig into it.
  4. I forgot to take a dump of my MySQL databases before reinstalling.  Of course, this was completely my fault.  But the down side is that, from what I read, you can't just "put back" the old database files for InnoDB databases.  Apparently it just doesn't work that way.  Luckily I didn't have anything important in those databases (it was all just dev testing stuff), but it was still annoying.
  5. Re-installing ownCloud did not go as smoothly as anticipated.  In addition to the MySQL issue, it seems that Ubuntu 20.04 ships with PHP 7.4 out of the box, which is great.  However, ownCloud apparently doesn't support 7.4 yet, so it refused to run.  A quick search suggested that the "not working" parts were mostly in some tests, so I was able to comment out the version checks and get it to work.  I wasn't running anything but the standard apps on this instance, so it might not work in the general case, but it seems to be OK for what I need.
  6. Plex was annoying.  When I installed it, I got a brand new server that I had to claim for my account, which is fine, but the old instance was still present in my account.  Which is understandable, but annoying.  I had to re-add my libraries to my managed user accounts and was able to remove the dead instance from my authorized devices without much trouble.  It just took a bit to figure out what was going on.  Probably didn't help that both servers had the same name.  I also had to re-do all the metadata changes I'd manually made to some of my media files.  Next time I'll need to figure out how to backup the Plex database.
  7. I probably should have thought of this, but I didn't move my user and group files in /etc over to the new install.  Not a big deal, since I don't have that many, but it meant that several of the groups I created weren't present and were recreated with different GIDs than on the old install.  This was mainly an annoyance because it meant that some of the group ownerships on my data drive ended up wrong.
  8. I had some trouble getting Linga back up and running.  After setting up a new virtualenv, I kept getting errors from the Pillow image manipulation library that there was "no module named builtins".  This was really puzzling, because, as the name suggests, "builtins" is built into Python 3.  I initially assumed this was just my Python-fu being weak and that I had an error in my code somewhere.  But no - it was my environment.  After some Googling (well, actually Duck Duck Go-ing), I realized that this was a common problem with Python 2 to 3 compatibility and was reminded of this post that I wrote six months ago.  The short version is that Apache was running the Python 2 WSGI module.  That seems weird, given that the system only ever had Python 3, but apparently that's how the Apache module works.  Anyway, I installed libapache2-mod-wsgi-py3 and everything was fine.

All in all, this experience reminds me why I don't do this sort of thing very often.  All this tinkering and debugging might have been kind of fun and interesting when I was 25 and had nothing better to do with my time, but these days it's just tedious.  Now I'd much rather things "just work", and if I have to forgo some customization or flexibility, I'm kinda fine with that.

No KDE4 for me

Author's note: Welcome to another episode of "From the Archives". This is the stub of an article that I wrote twelve years ago, on April 25, 2008. At the time, KDE 4.x was freshly into stable release. I was a user and fan of KDE at the time, and there had been a lot of hype about how awesome version 4 was going to be. My initial reaction, however, was...not so great.

This is actually slightly relevant because I have resurrected the GUI on my "home desktop", by which I mean my "home server". This is the box sitting under my desk in the basement that runs Ubuntu 18.04 and hosts various web and media server software. It does have a GUI installed, but I hadn't really used it in years - in part because GNOME didn't really work well on it. This isn't super surprising, since it's an old box with just the integrated graphics chip. But it's got more than enough memory and processing power for the workload I want it to do, so there's not really any point in upgrading.

Anyway, due to the current COVID-19 pandemic I'm now working from home and sitting at that desk all day every day, so I decided to fix up my desktop. Part of my cleanup process was to axe GNOME and install the Trinity Desktop Environment (TDE). This is a project I just recently discovered and immediately fell in love with. It's essentially a fork and continuation of KDE 3.5. It's since evolved into its own thing, apparently, but it's still noticeably KDE-like, so I'm very comfortable in the UI. Just like the original KDE 3.5, TDE is powerful, intuitive, generally pleasant to use, and works quite well on my not-so-new system. It doesn't have the fancy graphical bells and whistles, but I never really cared much about that anyway. I would definitely recommend it to any old-school KDE 3.x fans.

Anyway, here are my thoughts from the time. To be fair, I haven't used "real" KDE to any extent since this, so I'm sure all of my complaints have been addressed. But then again, I don't really care. I'm happy with TDE. Enjoy!

Kubuntu 8.04 was released yesterday (note: again, this was twelve years ago). That means it's upgrade time for me again.

This time around, Kubuntu comes in 2 varieties: the "rock-solid" KDE 3 version, and the KDE 4 remix. I had been intending to get on the leading edge and install the KDE 4 version. However, just to be on the safe side, I decided to give the live CD a try first. And after messing around with it for half an hour or so, I'm glad I did.

Bottom line: I think I'm going to wait for KDE 4.1. Or maybe 4.2.

I just don't care for 4.0.3. It definitely looks different...but not better. I just didn't see any new features that looked even remotely interesting, let alone compelling. The splash screens were kind of nice, and the plasma widget effects on the desktop were pretty neat, but that's about it.

There seemed to be a lot more down sides. Of course, I'm not sure how many of these are KDE 4 issues and how many are Kubuntu issues, but I found them annoying either way. Here's my list:

  1. The window style for the default desktop theme is unbearably ugly. It's too dark and too monochromatic. I guess somebody must like it, but it really bothers me.
  2. Where's the control panel? I don't see it and the context-sensitive configuration panels don't have all the options in them.
  3. In fact, where the heck are the other options?
  4. What the heck happened to Amarok and the other applications? The UI is completely different and it feels like half the features are missing.

I could go on, but why bother? There's just no reason for me to upgrade at this point.

Sudoku on Linux in days past

Here's another "from the archives" post that's been sitting in my drafts folder since March 27, 2007.  At the time I'd been getting into playing Sudoku and was running Kubuntu on my home desktop and laptop.  All I wanted to do was play Sudoku on my computer.  And apparently it didn't go very well.  At the time I wrote:

Why is it that there don't seem to be any good desktop sudoku games for Linux? At least, I haven't found any that meet my requirements. The three in Ubuntu's repositories sure don't.

My requirements for a sudoku game are pretty simple. Beyond the basics of creating a new puzzle and telling me if I'm right when I complete it, I only want one thing: notes. I just want an easy way to add notes to unfilled squares so that I can quickly reference the possibilities.

You wouldn't think that would be such a tall order. However, of the three sudoku games in the Ubuntu repositories, none of them do this. Gnusudoku doesn't even claim to support notes. GNOME-sudoku claims to support them "simply" by clicking at the corner of a box. However, it doesn't work. Maybe you have to run it under GNOME?

And last but not least, we have ksudoku. This one does support notes, and they do work, but they suffer from the same problem as the rest of this app: the user interface makes absolutely no freaking sense at all. Even this might be forgivable if there were a manual or even some kind of guide on the website, but there isn't. The status bar does flash back and forth between "helpful" tips, one of which says to use "RMB" to make notes, but that's it. It took me a few minutes and some experimentation to figure out that "RMB" is supposed to be an abbreviation for "Right Mouse Button". But even with that, I still can't figure out how the hell notes are supposed to work. I can't seem to get more than one number into a note and things disappear if I right-click too many times.

I assume those problems have probably been fixed by now.  I mean, it's been almost 13 years, so either they're fixed or the project is dead.  But I seem to recall this being sort of emblematic of open-source games at the time: there weren't that many of them and even fewer were any good.

By now, I'm sure that the situation is much better, if only because enough time has passed for more games to mature.  But, unfortunately, I don't really care anymore.  These days I don't want to sit down in front of a computer just to play video games.  I just don't have that kind of time.  If I'm going to play a game, it's the kind that runs on my phone and which can be dropped at a moment's notice.

Coincidentally, one such game that I just downloaded the other day is Open Sudoku, which is actually pretty nice as Android Sudoku apps go.  It's open-source and you can find it on F-Droid as well as in the Play store.  Like most open-source games, it doesn't have a lot of frills or fancy graphics, but it's easy to play and does the job.

Random Python and WSGI tip

In light of the fact that Python 2.x loses support...now, I've been upgrading some of my personal projects that I haven't done much with recently to Python 3.  One of them is a little web-based comic book reader I call Linga (I don't remember why) that I run on my home network.

Since my home server runs Ubuntu 18.04 LTS, where Python 2.7 is the default, this has been a bit of a pain.  Before doing the conversion to Python 3.x, I had Linga set up and working using MySQL and Apache with mod_wsgi.  But once I pushed the code for the 3.x version, everything broke.  After digging in, there were several problems I needed to correct:

  1. The default mod_wsgi install uses Python 2.7, not 3.6 like I want.
  2. My WSGI script uses execfile(), which no longer exists in Python 3.x.
  3. SQLAlchemy can't load the MySQLdb module.

Luckily, the solutions to these problems weren't particularly hard to find or implement.  With a little Googling (well, DuckDuckGo-ing really, but that doesn't have the same ring to it) and a little experimentation, I found the following fixes:

  1. The version of Python to use is hard-coded into the Apache module.  So to use Python 3.6 I just needed to install the Python 3 WSGI module with apt install libapache2-mod-wsgi-py3.
  2. Apparently the recommended replacement for execfile("filename") is exec(open("filename").read()).
  3. It seems that the Zen of Python's saying that "There should be one-- and preferably only one --obvious way to do it" breaks down when it comes to connecting to MySQL, as there are apparently several client libraries available.  The solution that worked for me was to pip install pymysql and run pymysql.install_as_MySQLdb() in my WSGI script.  Having to call a method to turn on MySQLdb compatibility is annoying and seems a little hacky, but it works and it's not really a big deal.  (There's a sketch of these fixes after this list.)
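
For reference, the relevant bits of the fixed WSGI script end up looking something like this (paths and file names are just examples, not Linga's actual code):

# Make SQLAlchemy's "import MySQLdb" resolve to PyMySQL.
import pymysql
pymysql.install_as_MySQLdb()

# Python 3 replacement for the old execfile() call:
exec(open("/var/www/linga/linga.wsgi.conf").read())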

With that, Linga now works just fine.  Maybe this will save someone else having to Google the answers to these questions as well.

Apparently CrashPlan eats inotify watchers

As a brief follow-up to my last post, I had a problem trying to remote into my newly installed Ubuntu box today.  As I mentioned in that post, I'm using xRDP for my remote GUI access and I'm tunneling it over SSH.  When I connected to xRDP, I logged in and was greeted by the Ubuntu lock screen for my running session.  So I entered my password, hit "enter", and waited.  And waited.  And waited some more.  My session just never unlocked.

OK, so apparently xRDP is freaking out.  I already have an SSH session open, so I'll just restart the service and maybe that will clear up the problem.  So I ran service xrdp restart and it told me:

Failed to add /run/systemd/ask-password to directory watch: No space left on device

That didn't seem right.  I had plenty of disk space and plenty of RAM - what could possibly be out of space?  So naturally I asked the internet, which led me to this Stack Exchange post.  Turns out that CrashPlan has a habit of eating up inotify watchers, which is what I was actually out of.  Stopping the CrashPlan service fixed the problem.  Fortunately, I don't need CrashPlan anymore, so that shouldn't be a problem again.
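
If you want to check for yourself, the watch limit lives in /proc, and you can get a rough count of inotify instances in use (instances, not individual watches, but it's usually enough to spot the culprit):

$ cat /proc/sys/fs/inotify/max_user_watches
$ find /proc/*/fd -lname 'anon_inode:inotify' 2>/dev/null | wc -l

And if you actually need the offending app, the other workaround is to raise the limit:

$ sudo sysctl fs.inotify.max_user_watches=524288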

Of backups and reinstalls

I finally decided to do it - reinstall my home media server.  I switched it from Windows 8.1 to Ubuntu 17.10.

The reinstall

Believe it or not, this is actually a big thing for me.  Fifteen years ago, it would have been par for the course.  In those days, I didn't have a kid or a house, so rebuilding my primary workstation from scratch every few months was just good fun.  But these days...not so much.  Now I have a house and a kid and career focus.  Installing the Linux distribution of the week isn't fun anymore - it's just extra time I could be spending on more important and enjoyable things.

But, in a fit of optimism, I decided to give it a shot.  After all, modern Linux distributions are pretty reliable, right?  I'll just boot up an Ubuntu DVD, it'll run the installer for 15 minutes or so, and I'll have a working system.  Right?

Well...not quite.  Turns out Ubuntu didn't like my disk setup.  Things went fine until it tried to install Grub and failed miserably.  I'm not sure why - the error message the installer put up was uninformative.  I assume it had something to do with the fact that I was installing to /dev/sdb and /dev/sda had an old Windows install on it.  After a couple of tries, I decided to just crack open the case and swap the SATA cables, making my target drive /dev/sda, and call it a day.  That did the trick and Ubuntu installed cleanly.  I did have to update my BIOS settings before it would boot (apparently the BIOS didn't automatically detect the change of drives), but it worked fine.

The only real problem I had was that apparently Wayland doesn't work properly on my system.  I tried several times to log into the default GNOME session and after a minute or two, system load spiked to the point that the GUI and even SSH sessions became unresponsive and eventually the GUI just died.  And I mean died - it didn't even kick me back to the GDM login screen.

I suspect the problem is my system's archaic video card - an integrated Intel card from the 2010 era which I have absolutely no intention of ever upgrading.  I mean, it apparently wasn't good enough to run Windows 10, so I wouldn't be surprised if Wayland had problems with it.  But in any case, Ubuntu still supports X.org and switching to the GNOME X.org session worked just fine.  I don't really intend to use the GUI on this system anyway, so it's not a big deal.

The restore

Once I got Ubuntu installed, it was on to step two of the process: getting my data drive set up.  It's a 1TB drive that used to serve as the Windows install drive.  It was divided into a couple of partitions, both formatted as NTFS.  Since I'm switching to Linux and really just wanted one big data drive, this was a sub-optimal setup.  Therefore I decided to just blow away the partition table and restore from a backup.

I currently use CrashPlan to back up this computer.  It's set to back up to both the cloud and a local USB hard drive.  So my plan was to repartition the disk, install CrashPlan, and restore from the local hard drive.

This was fairly easy.  Installing the CrashPlan client was the first task.  There's no .deb package, but rather a .tgz that contains an installer script.  It was actually pretty painless - just kick off the script and wait.  It even installs its own copy of the JRE that doesn't conflict with the system version.  Nice!

Next was actually restoring the data.  Fortunately, CrashPlan has their process for restoring from a USB drive well documented, so there wasn't much to figure out.  The process of connecting a particular backup dataset on the drive to the CrashPlan client was slightly confusing because the backup interface (as far as I can remember) doesn't really make it obvious that a USB drive can have more than one dataset.  But it just boils down to picking the right directory, which is easy when you only have one choice.

The only surprise I ran into was that running the restore took a really long time (essentially all day) and created some duplicate data.  My data drive contained several symlinks to other directories on the same drive and CrashPlan apparently doesn't handle that well - I ended up with two directories that had identical content.  I'm not sure whether this was a result of having symlinks at all, or if it was just moving from Windows symlinks to Linux symlinks.  In any case, it's slightly inconvenient, but not a big deal.

Other services

While the restore ran, I started setting up the system.  Really, there were only a handful of packages I cared about installing.  Unfortunately for me, most of them are proprietary and therefore not in the APT repository.  But the good news is that most of them were pretty easy to set up.

The first order of business, since this box is primarily a media server, was setting up Plex.  This turned out to be as simple as installing the .deb package and firing up the UI to do the configuration.  From there, I moved on to installing Subsonic.  This was only marginally more difficult, as I also had to install the OpenJDK JRE to get it to run.  I also followed the instructions here to make the service run as a user other than root, since running it as root seemed like not such a great idea.

The only thing I wanted to install that I couldn't get to work was TeamViewer.  This wasn't a deal-breaker, though, because the only reason I was using TeamViewer under Windows was that I was running Windows 8.1 Home and was too cheap to pay for the upgrade to Professional just to get the RDP server.  But since this is Ubuntu, there are other options.  Obviously SSH is the tool of choice for command-line access and file transfers.  For remote GUI access, I tried several variations on VNC, but it eventually became clear that xRDP was the best solution for me.  It's not quite as straight-forward to get working, but this guide provides a nice step-by-step walk through.  There are also a few caveats to using it, like the fact that the same user can't be logged in both locally and remotely, but those weren't big issues for my use case.
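
The SSH tunneling part, at least, is just standard port forwarding (the host name is an example):

$ ssh -N -L 3389:localhost:3389 me@mediaserver
# ...then point the RDP client at localhost:3389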

Results

For the most part, the transition went pretty smoothly.  The setup didn't go quite as well as it could have, but it wasn't too bad.  In any event, what I really cared about was getting Plex and Subsonic up and running, and that was pretty painless.  So I'm back to where I was originally, but on an up-to-date operating system, which is all I really wanted.

VirtualBox shared folders

Here's a little issue I ran across the other day.  I was setting up a VirtualBox VM with an Ubuntu guest and I wanted to add a shared folder.  Simple, right?  Just install the VirtualBox guest additions, configure the shared folder to mount automatically, and it "just works".

The only problem is the permissions.  By default, VirtualBox mounts the shared folder with an owner of root.vboxsf and permissions of "rwxrwx---".  So if you add your user account to the vboxsf group, you get full access.  Everyone else...not so much.  Of course, this isn't inherently a problem.  However, I wanted the Apache process to have read access to this shared folder because I have a web app that needs to read data off of it.  And I didn't really want to give it write access (which it doesn't need), so just adding it to the vboxsf group wasn't a good option.  What I really needed to do was change the permissions with which the share was mounted.

As far as I can tell, there's no way to get VirtualBox to change the permissions.  At least, I didn't see anything in the guest OS and there's no setting in the management UI.  Fortunately, you can pretty easily bypass the auto-mounting.  Since it's a Linux guest, you can just turn off the auto-mounting in the VirtualBox management console and add a line to your /etc/fstab.

There is one issue, though: you have to make sure the vboxsf kernel module is loaded before the system auto-mounts the share.  If you don't, the mounting will fail.  However, forcing the module to load is easily accomplished by adding a vboxsf line to your /etc/modules file.
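
That part is a one-liner:

$ echo vboxsf | sudo tee -a /etc/modules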

As for the line in /etc/fstab, this seems to work pretty well for me:
SHARE_NAME   /media/somedir   vboxsf   rw,gid=999,dmode=775,fmode=664   0   0

Which desktop OS to use

The last week or so, I've been going back and forth over what to run as the primary OS on my upgraded home desktop. Right now I'm running Ubuntu 10.04 on it, but I keep thinking maybe I should switch to Windows 7. I'm using it to run Win7 under VMWare anyway, so I'm wondering if I should just invert the arrangement and run Ubuntu under VMWare instead.

So to help myself decide, I'm going to list out some of the various pros and cons. Of course, this is just for my case - this is not universally applicable, your mileage may vary, etc. I'm basing this list on my usage patterns over the last 2 years, since I started running Windows on some of my personal (i.e. non-company) machines again.

Every-day desktop software

This is obviously one of the biggest categories, and one of the reasons I'm leaning toward Windows 7 in the first place. For my every-day non-development desktop computing needs, I need the following tools:

  1. Web browser
  2. E-mail client (no, I don't want to use a web client, even if it's GMail)
  3. Desktop RSS aggregator with multi-system syncing (no, I don't particularly like Google Reader, except as a back-end)
  4. Multi-format universal video player
  5. Music player/manager
  6. Podcatcher
  7. Comic book reader

For items 1, 2, and 4, I already have favorite apps that are cross-platform - Opera, Thunderbird, and VLC. So those requirements are a wash.

Pro-Windows

On Windows, I'm currently using FeedDemon for RSS and MediaMonkey for music, and I quite like both of them.

On Linux, I don't have favorite apps for those. Recently, I've been using Exaile for music, but I'm not particularly attached to it. I just haven't found anything good since Amarok 2 came out (which I absolutely hate, despite really liking Amarok 1.x). As for an RSS reader, I've tried Liferea, but didn't particularly care for it. Mostly I've been either reading my feeds on a Windows box or just using Google Reader, which I'm not crazy about.

Pro-Linux

On the Linux side, I do have a favorite comic reader and podcatcher: ComiX and gPodder respectively. However, I don't think either of them is really irreplaceable. While I don't have any complaints about gPodder, my podcatching needs are fairly basic - download and transfer to my MP3 player. I have a feeling the podcasting features of MediaMonkey would be just fine for this. The same is probably true of ComiX - I don't really need any complicated management features, just a good viewer for comic archives. I've played with a few other comic readers, and there are probably several that could do the job. For instance, HoneyView seems like it would fit my needs quite well.

Verdict: I have to go with Windows on this point. Most of the every-day stuff I really care about is either cross-platform or Windows-only, while I'm not really so attached to my current Linux tools.

Development tools

The development tools are a little different from the every-day tools. In part because, in many cases, there isn't really much of a choice. In the case of things like language-specific IDEs, you either use a particular tool and just use whatever platform it's supported on, or you make things waaaaay harder on yourself than they need to be. In other words, if you want to be idealistic in your platform choice, you suffer for it.

In my case, I'm currently doing mostly LAMP and FLEX work, and want to get into more .NET stuff on the side. That means that I need not only the IDEs and development tools, but also the supporting servers for those environments. I also need good command-line and scripting utilities as well as a good desktop virtualization package.

Currently, I favor Komodo Edit and gVim for PHP, Python, and other general-purpose coding, with a little Eclipse mixed in on occasion. Both of those are cross-platform, as are PHP and Python. For supporting servers, I also need MySQL, PostgreSQL, SQLite (more of a library than a server, but I'll list it with the other databases), and Apache, all of which are also cross-platform. So in terms of what I can use, it's a wash on all of that.

Pro-Linux

However, while most (if not all) of the LAMP stack runs on Windows, it's kind of a second-class citizen. In my experience, it's much easier to install and administer this stuff on Linux. Of course, that could just be because I'm more familiar with them on Linux, but that's not really the point.

In addition, Linux gets some points for the command-line. Granted, now that Powershell has come along it's not nearly as many as Linux used to get, but it still wins on ease of installing packages and having things like FFMPEG.

Pro-Windows

Obviously, Windows wins on everything dot-NET. IIS, Visual Studio, SQL Server - all of them are Windows-only and must-haves for anyone wanting to do "professional" .NET work, by which I mean work that "counts" on your resume. (No, Mono with MySQL isn't good enough, unfair as that may be.) It also wins on the FLEX front. While the FLEX SDK is cross-platform and runs on Linux, the IDE, Flash Builder, only runs on Windows and Mac. It's Eclipse-based, so you'd think it'd run on Linux, but it doesn't. Don't ask me why.

Windows also gets a small win on virtualization. Mostly, I use VMWare, plus a bit of VirtualBox, both of which are cross-platform. However, anybody who ever has to do IE compatibility testing will tell you how handy VirtualPC is. Not in and of itself, but rather because Microsoft puts out pre-configured images for testing IE 6 and 7. Just download, extract, and run - no hunting down copies of Windows with the appropriate IE version. Granted, it's annoying that the Windows installs on those images expire every few months, but they're still useful.

Verdict: I think I have to give Windows the edge on this one too, much as it pains me. I just need the Windows-only tools, and it's easier to run the resource-sucking ones like Visual Studio and Eclipse natively than in a VM. And as for the LAMP server tools, I can easily set up an Ubuntu server VM to run in the background without assigning it too many resources.

Remote access

I've gotten very used to being able to access my home desktop from anywhere. Therefore, I need some kind of secure, easy to use method for connecting to it. I also need it to be something I can set up on someone else's computer in 5 minutes without admin access. I do use Opera Unite for some of this, but that doesn't cover everything - I want something a little more robust and general purpose.

Pro-Linux

Let's face it - SSH rocks. There's just no getting around it. It's secure, simple to install, simple to set up, and simple to use. Also, it gets you full access to the command line, which is pretty powerful. I also tunnel VNC over SSH for nice, easy graphical access to my machine.
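
The tunnel itself is one line of standard SSH port forwarding (port 5901 corresponds to VNC display :1; the host name is an example):

$ ssh -N -L 5901:localhost:5901 me@home-desktop
# ...then point the VNC viewer at localhost:5901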

Pro-Windows

Let's face it - Remote Desktop beats the pants off of VNC. I mean, it's not even close. Granted, VNC is cross-platform, but RDP offers more features and it's a lot faster.

Also, there are SSH servers available for Windows. They're not as "native" as SSHD under Linux, but they still offer remote command-line shell (usually based on Cygwin, so it's Linux-like) as well as SCP and SFTP.

Verdict: It's pretty much a draw on this one. The SSH support for Linux is better, but Windows has some SSH and better remote graphical capabilities.

Inertia

When switching platforms, you need to account for overcoming the inertia of your current environment. That means coming up with new tools, new customizations, new data organization, etc.

Pro-Linux

Eight years is a long time. That's about how long I've been running Linux exclusively on my home desktop. I've been building up my environment since the bad-old-days when Netscape Navigator 4 was the best browser available and you had to recompile your kernel if you wanted to use a CD burner. I have scripts that won't run on Windows, symlinked configurations that won't work on Windows, and, of course, lots of things organized in a very UNIXy way.

Pro-Windows

The nice thing about Windows 7, as opposed to previous versions, is that it's a bit easier to organize things in a UNIXy fashion. Plus, there's always Cygwin and so forth. But really, there's not much to say here. The only saving grace here is that most of my existing Linux investment isn't irreplaceable.

Verdict: Linux, obviously. But while the margin is pretty good, it's not overwhelming. I don't do that much in the way of heavy customization anymore, so it's actually not as big an issue as it would have been a few years ago.

Final Verdict

Well, I'd say it's pretty obvious at this point. The preponderance of the pros seems to be on the Windows side.

On the one hand, the idea of this transition makes me a little uncomfortable. I've been a Linux user for a long time, and will continue to be one on the server-side. But at the same time, the more I think about it, the more sense this transition makes. While I'm still a Linux fan, I've grown less concerned with the desktop side of it. These days I'm more interested in things "just working" than in twiddling with my desktop or finding free software to do something I could as easily do with freeware. By the same token, distance has dulled any animosity I may have harbored toward Windows.

So we'll see how it goes. I actually installed Win7 on my desktop yesterday afternoon (I started this entry a week ago - just didn't get around to posting it). I'll be posting updates on my difficulties and migration issues. Hopefully it will be a pretty smooth process.

Post-upgrade fixes

It's that time again - time to fix things after upgrading!

This time, I did something unusual for me - rather than copying my existing Ubuntu install over to the new drive, I just did a fresh install. This was partly because I'd been carrying around that same install for at least 3 years (probably more like 5) and partly because it was the 32-bit version, and I wanted to use the x86_64 version, what with the new CPU and all. Of course, I did copy across my home directory, so I'm probably going to see some config problems there, but that's just how these things go.

My first two problems were a little unexpected. The first was that the numeric keypad on my keyboard didn't work. I mean, it worked, in the sense that certain keys responded in approximately the desired way, but it didn't actually type any numbers. Fortunately, a little Googling turned up this post, which had a very quick fix - turn off the "mouse keys" in the keyboard config dialog. Not something I would have expected to be on by default.

The second problem took a few minutes for me to figure out, probably because it's still early. When I fired up Thunderbird and opened Lightning, none of my calendars were there. And I couldn't add them back - the dialog just didn't respond. The solution dawned on me when I tried to install the newer version of Lightning. The install bombed out, complaining that the package didn't support x86_64. Duh! I was still running the old x86 pre-upgrade Lightning binaries that were in my profile directory. Unfortunately, the main Lightning page didn't have any x64 binaries linked, so I had to go hunting. Turns out you can grab them from the contrib directory, so that was an easy fix too.

We'll see what else is broken over the next few days. Hopefully I won't hit too many more snags.

Making my games work

Despite not considering myself a "gamer" (with the exception of Battle for Wesnoth), I do have a bit of a weakness for "vintage" games, by which I mean the ones I played when I was in high-school and college. While I don't have much time for games anymore, what with full-time employment and home ownership, I still like to come back to those old games every now and then.

Well, when I tried to fire up my PlayStation emulator, ePSXe, to mess around with my old copy of Final Fantasy Tactics, I ran into a problem - I no longer have GTK+ 1.2 installed! Apparently it was removed when I upgraded to Ubuntu 10.04 (or possibly even 9.10, I'm not sure). However, according to this LaunchPad bug, this particular issue is by design and will not be fixed. That kind of stinks, because I have several old closed-source Linux-based games that depend in some way on GTK+ 1.2 (often for the Loki installer).

This sort of thing is a little bit of a sore spot for me, and has been for some time. On the one hand, I can understand the Ubuntu team's position: GTK+ 1.2 is really old and has not been supported for some time. You really shouldn't be using it anyway, so there's not much reason for them to expend any effort on it.

On the other hand, how much effort is it to maintain a package that's no longer being updated? Why not at least make it available for those who need it? This is the sort of user-hostile thinking that's always bothered me about the open-source community. There's hardly any compatibility between anything. Binary compatibility between library versions is sketchy, as is package compatibility between distributions. Even source compatibility breaks every few years as build tools and libraries evolve. Ever try to compile a 10-year-old program with a non-trivial build process? Good luck.

And that seems to be the attitude - "Good luck! You're on your own." It's open-source, so you can always go out and fix it yourself, if you're a programmer, or hire someone to do it for you otherwise. And while it's absolutely great that you can do that, should that really be an excuse for not giving users what they want or need? Should the community have to do it themselves when it's something that would be relatively easy for the project maintainers to set up?

Not that I can blame them. As frustrating as decisions like this can be, you can't look a gift horse in the mouth. The Ubuntu team is providing a high-quality product at no cost and with no strings attached. They don't owe me a thing. Which, I suppose, is why it's so great that members of the community can fix things themselves. The circle is complete!

But anyhow, that's enough venting. That LaunchPad thread had several posts describing how to install GTK+ 1.2 on 10.04. I chose to use a PPA repository.

sudo apt-add-repository ppa:adamkoczur/gtk1.2
sudo apt-get update
sudo apt-get install gtk+1.2

Ta-da! I'm now back at the point I was at a year or so ago when all of my old games actually worked.

Annoying mtools inconsistency

I hate upgrading Ubuntu, because every time I upgrade, something breaks. Sometimes it's small things, sometimes not so small. For instance, the last time I tried upgrading to a newer version of KDE 4, I ended up irreparably borking the installation and just switching to GNOME. (And by "irreparably", I mean messing it up bad enough that I had neither the time nor inclination to repair it.) Interestingly, the change was much less disruptive than I thought. Apparently I just don't care that much about desktop integration anymore. Chalk it up to switching back and forth between platforms all the time.

Anyway, this time the breakage was in the permission fixing script for my Sansa e280 MP3 player. The short version is that when the script ran mattrib -h to clear the "hidden" attribute on a directory, it just printed out the mattrib usage message, despite the fact that there was no error in the command syntax. After all, the same script had been working flawlessly for over a year in Ubuntu 8.10 and 9.04.

Well, after an hour or two of messing around and fruitless Googling, I finally stumbled upon the answer. It turns out that in the build of mtools 4.0.10 that ships with Ubuntu 9.10, the option to remove the hidden attribute has changed from -h to -H. Of course, the usage message still says -h and the man page still says -h, but it's -H that actually works. So, basically, I lost an hour of my life because somebody, whether upstream or the package maintainer in universe, screwed up and forgot to update the documentation.
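
In other words (assuming the player is configured as drive x: in mtools.conf):

$ mattrib -h x:/MUSIC   # what the docs say - now just prints the usage message
$ mattrib -H x:/MUSIC   # what actually clears the hidden attribute in 4.0.10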

Anyway, I did my duty and filed a bug report. We'll see if anything comes of it. I'm sure it's not a high priority, though - if the Google results are any indication, I'm about the only person in the world who uses mattrib on a regular basis anymore.

Kubuntu Intrepid: Another failed upgrade

Well, that sucked.

I upgraded my Kubuntu box at work from 8.04 to 8.10 on Monday morning. It did not go well. Not only did the experience waste several hours of my time getting my system back to a state where I could actually do some work, it left me feeling bitter and fed-up.

Not that the upgrade failed or anything - on the contrary. The upgrade process itself was relatively fast and painless. So, in contrast to some of my previous upgrade experiences - which have left systems completely inoperable - this wasn't that bad. It's just that, once the upgrade was done, nearly every customization I'd made to my desktop was broken.

Broken Stuff

As for the breakages, they were legion - at least it felt that way. The 2 most annoying were the scrolling on my Logitech Marble Mouse trackball and KHotKeys. It turns out the mouse scrolling was fixable by adding a line to my xorg.conf to disable some new half-working auto-configuration feature.
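
For the record, the usual way to disable that feature at the time was a ServerFlags option, something along these lines:

Section "ServerFlags"
    Option "AutoAddDevices" "off"   # turn off the new input hotplugging
EndSection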

KHotKeys, on the other hand, was a lost cause. From what I've read, it just plain doesn't work right in KDE 4. So, since key bindings are an absolute must-have feature for me, I worked around it by installing xbindkeys. This works well enough, but it's a huge pain in the neck. Now, not only do I have to recreate all my key bindings, but I have to look up the DBUS commands for all those built-in KDE functions rather than just picking them from a list.
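
At least the config format is simple; a minimal ~/.xbindkeysrc entry looks like this (a plain command binding - the KDE-specific ones need the matching DBUS command substituted in):

# Launch a terminal with Ctrl+Alt+T
"konsole"
    control+alt + t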

Another annoying one was that the upgrade somehow broke the init scripts for my MySQL server. I don't know how the heck that happened. I tried uninstalling it, wiping the broken init scripts, and reinstalling, but they weren't recreated, which seemed odd to me. I eventually ended up just doing a dpkg --extract on the MySQL package and manually copying the scripts into place.

On another weird note, KDE and/or X11 has been randomly killing the buttons on my mouse. I'll be working along fine and suddenly clicking a mouse button will no longer do anything. It still moves, and the keyboard still responds, but clicking does nothing. Restarting the X server resolves the problem, but that's cold comfort. It seems to happen randomly - except for when I try to run Virtual Box, in which case it happens every time the VM loses focus. Fortunately I'm more of a VMware person, so that's not a big deal, but it's still disquieting.

KDE4 In General

The other big pain-point is KDE 4. To be perfectly blunt, I don't like it. It has a few neat new features, but so far it doesn't seem worth the effort to upgrade.

The good parts that I've noticed so far seem to be small. For instance, Dolphin has a couple of nice enhancements. The one that sticks out is the graphical item-by-item highlighting. It allows you to click a little plus/minus icon to select/deselect an item, so that you no longer need to hold the control key to do arbitrary multiple selects. The media manager panel applet is nice too. It pops up a list of inserted storage devices and allows you to mount and eject them. I have to admit that I also really like the new "run" dialog. It does program searching much like Katapult, but makes it easier to run arbitrary commands and select commands with similar names. While it doesn't have some of the cool features supplied by Katapult's plugins, it's still quite good.

On the other hand, there are a lot of things I don't like (not counting the breakage). For one, I think the new version of Konsole is a huge step backward. I can't access the menus with keyboard shortcuts, the "new tab from bookmark" feature is MIA, the session close buttons are gone, and generally everything I had gotten used to is missing.

And then there's the new "kickoff" application menu. I'm getting slightly more used to it, but I still don't like it. It just feels a lot slower to access items using it. This is only made worse by the "back" button for browsing sub-menus, which is extremely hard to click when you're in a hurry (hint: Fitts's law doesn't help you on a multi-monitor setup, where the "edge" of one screen is really just the border with the next one).

As for the "cool" new look of KDE 4...I'm not a fan. Maybe it's just because I don't have any of the fancy desktop effects turned on on my system (a side-effect of the crappy integrated video card that's part of my tri-monitor setup), but I just don't think it looks good. Yeah, the bare desktop itself is kind of nice looking, but the window theme is ugly as sin. It's one of those "brushed metal" sort of looks, which I find even more depressing than Windows 95 gray. It's too dark for my taste and far too monochromatic. I also find the active window highlighting to be way too subtle to be helpful. The icons also leave something to be desired. They look nice, but they don't look distinct - even after a week, it takes me a second to figure out what some of them are supposed to represent. It kind of defeats the entire point of icons.

As for the much touted Plasma, I'll grant them this - it is pretty. The panel and desktop plasmoids do pretty much all look nice. Not that it matters to me, though, because I never see my desktop - it's always covered with work. And while the various applets and widgets may look pretty, approximately 90% of them are completely useless. That's the problem with all desktop widgets for any platform. I find that if a desktop widget actually provides enough valuable functionality to justify leaving a space open for it on the desktop, its job is probably better served by a full-fledged application. And if it's not important enough to make constantly visible, then why bother to put it on the desktop at all? I'm never going to see it, so I might as well save the RAM and CPU cycles.

Conclusion

Overall, I guess Kubuntu 8.10 and KDE 4 aren't bad systems. But to be honest, I'm not impressed. For the first time, I think that the new Kubuntu is not an improvement. In fact, I have no plans to upgrade the three Kubuntu boxes I have at home any time in the foreseeable future.

The thing that's most disappointing to me about the upgrade to KDE 4 is that it totally defeats my purpose in switching to KDE in the first place. When I switched from the ROX desktop to KDE back in 2005, my main reason was that I was tired of having to build my own desktop. ROX was great, but it was a small community and just didn't have the range of applications and degree of integration that KDE had. You see, I always had this crazy idea that I could just use all KDE applications and everything would be tightly integrated and work well together and there would be harmony throughout my desktop.

However, more and more I've been finding that that just isn't true. Part of the problem is that lots of KDE applications just aren't that good - many of them are missing functionality and have stability problems. I find myself using fewer KDE applications all the time. I dropped Quanta+ for Komodo Edit; I tried to like Konqueror, but it just doesn't hold a candle to Firefox or Opera; I recently tried to become a KPilot user, but was almost immediately forced to switch to JPilot; I finally got fed-up with Akregator and am just using the RSS reader in Opera's M2 mail client; I still use KMail, but not because I particularly like it - I just dislike it less than M2 or Thunderbird. In fact, I think the only KDE app I would actually miss is Amarok. (K3B is very good too, but I don't burn enough disks to care what program I use, just so long as it works.)

So now I'm starting to wonder: What's the point of using KDE? If I'm not using many KDE applications, and most of the ones I am using could be easily swapped out, it seems like there's nothing keeping me with it. Maybe I should just switch to GNOME. Or maybe Windows. I have been wanting to get more into .NET development, and my tolerance for things not working has been falling over the years, so Windows is sounding better all the time.

I think next week I'm going to have to reinstall my work machine. Maybe a fresh install and a fresh KDE profile will give me a better experience. Or perhaps I'll ditch Kubuntu and go for straight Ubuntu with GNOME. Or perhaps I could take another look at ROX. I don't know. And while I'm at it, I think I might reinstall that old Windows partition I still have on that machine. Maybe some time playing with a nice clean install of XP, or even Vista, if we have a spare copy, will give me a little perspective.

Random binary breakage and a rant on compatibility

I enjoy an occasional video game. However, I am by no means a "gamer", as evidenced by the fact that I don't have a copy of a single proprietary game published later than 2002. Rather, I enjoy open-source games like Battle For Wesnoth, vintage games such as Bandit Kings of Ancient China and Wing Commander, and the occasional old strategy game, such as my old Loki games for Linux. I also have a soft spot for emulated console games for the NES and Super NES. I even break out an emulator for my old PlayStation disks every now and then.

Well, the other day the mood struck me to play one of my old PSX games, so I clicked the icon for the ePSXe PlayStation emulator in my application menu and waited...and waited...and waited. And it never came up. So I tried running it from a command prompt and...nothing happened. And when I say "nothing", I mean nothing - no error message or output of any kind. I just got my command prompt back immediately.

Mind you, it had been a while since I'd used ePSXe, but there was no immediately obvious reason why it should fail. It's installed in my home directory and has been sitting there, fully configured, for over a year. I used it regularly for a few weeks back in September and October and it worked perfectly. Absolutely nothing has changed with it.

Fortunately, a little Googling turned up this thread in the Ubuntu forums. Apparently the ePSXe binary is compressed with UPX. After installing the upx-ucl-beta package via apt-get and running upx -d /path/to/epsxe to decompress the binary, it worked as expected. Apparently something about running UPX-compressed binaries changed between Ubuntu Feisty and Gutsy. I have no idea what, though.
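
For anyone hitting the same wall, the entire fix boils down to two commands (adjust the path to wherever your ePSXe binary lives):

# Install the UPX unpacker, then decompress the emulator binary in place
sudo apt-get install upx-ucl-beta
upx -d ~/epsxe/epsxe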

This actually leads into one of the things that really annoys me about Linux: binary compatibility. It's also one of the reasons I prefer to stick with open-source software on Linux when at all possible.

In the Windows world, binary compatibility between releases is pretty good. Granted there are always some applications that break, but given the sheer volume of code out there, Microsoft does a good job keeping that number relatively small. In fact, if you've ever heard any of Raymond Chen's stories of application breakage between releases, you know that the Windows app compatibility team sometimes goes to truly heroic lengths to enable badly broken applications, many of which never should have worked in the first place, to keep functioning when a bug they depended on is fixed. The sample chapter (PDF) from Raymond's book has some really great examples of this.

In the Linux world, on the other hand, nobody seems to give a damn about maintaining compatibility. If you have a binary that's a few years old, it may or may not work on a current system. And if it doesn't, sometimes you can massage it into working, as was the case with ePSXe this time, and sometimes you can't. Not that this should be surprising: some developers in the Linux world are so lazy they won't even allow you to change the paths to application support files - they just hard-code them into the binary at compile-time with preprocessor defines! If they don't care if you can install the same binary in /usr or $HOME, why should they care if it works between distributions or even releases of the same distro? The attitude seems to be, "Well, it's open-source anyway, so who cares how compatible the binaries are?"

But if we're going to be honest, even being open-source only goes so far. Actively-maintained apps are usually OK, but have you ever tried to build an application that hasn't been maintained in 7 or 8 years from source? It's pretty hit and miss. Sure, if I really needed the app, had lots of spare time on my hands, and was familiar with the programming language and libraries it used, I could always fix it to build in an up-to-date environment. But for a regular user, that's simply not an option. (And even for a programmer it may well be more trouble than it's worth.)

But as annoying as I find the general lack of compatibility, as much as I wish I could just run a damn executable without having to cross my fingers, I can understand why things are the way they are. Quite simply, maintaining compatibility is hard. It takes care and diligence and it can make it hard to fix certain problems or make certain types of improvements. And really, when you're not getting paid for your work and have no real obligation to your users, you have to ask yourself if it's worth the effort. Heck, even many commercial vendors aren't that serious about backward-compatibility. Is it really reasonable to expect a loose association of unpaid volunteers to be any better?

But that's enough ranting for tonight. There are ups and downs to every software system. I'm just disgruntled that everything in my personal Linux-land seems to be 5 times more difficult than it needs to be lately.

Strigi: What the hell?!?

As I mentioned the other day, Kubuntu now ships with Strigi as the default desktop search engine. So, I decided to give it a try. I started the daemon a few days ago and left it to build an index.

My reaction, when I came back last night, was - and I quote - "What the %$@#?!?" (And yes, I actually said, "What the percent dollar at pound?!?")

You see, I had noticed that I now had only 500MB of free space left on my home partition. This was surprising because, while I knew the partition was filling up, I hadn't moved any substantial amounts of data around lately and should have had at least 5GB left.

A quick du -sch `ls -A` revealed the culprit. My ~/.strigi directory had ballooned up to a whopping 7.5GB.

Let me repeat that. The Strigi index was up to 7.5 gigabytes. How? Why? What the hell was it doing? My Beagle index was only about 950MB. Why the heck does Strigi need so much more space?

This will definitely merit a little research. I'm hoping this was some freak malfunction. But if not, then I guess I won't be using Strigi. I mean, 7.5 gigs? Come on!

Upgrading to Gutsy

It's that time again. Ubuntu 7.10, code-named "Gutsy Gibbon", came out last week, and so it was time for me to upgrade all my systems. So here is my report.

In the past, Ubuntu upgrades have been a pretty hit and miss process. Earlier releases involved manually editing your sources.list file and running apt-get commands in a particular order. The last upgrade, however, was a much less painful procedure, and this one followed in its footsteps. However, it still wasn't nearly as seamless a process as I would have liked.

Currently I have 4 machines running Kubuntu Linux: my home desktop, Sarah's desktop, my laptop, and my work desktop. So far I have upgraded my home and work desktop and done a clean install on my laptop. Sarah's desktop will also be a clean install when I get around to it, because she's still on release 6.06, which can't be upgraded directly.

First, I'll start with the good news. I experienced no hardware problems whatsoever when upgrading. Everything "just worked" after the upgrade. This was no surprise on my home desktop, which is now using a wired LAN connection and has no exotic hardware. It wasn't a big surprise on my work desktop either, as the only weird hardware it has is the tri-head display setup, the configuration for which should have carried over from before the upgrade. With my laptop, however, I was pleased to discover that the integrated Broadcom card now works without NDISwrapper. Kubuntu's new restricted driver manager allowed me to automatically download and install the binary firmware the card needed to run, and after that it worked perfectly. The process was entirely graphical and consisted of just a few clicks. No fuss, no muss!

Now the bad news. While the laptop install was pretty smooth, the desktop upgrades weren't that great. My home desktop just had a slight hiccup, as the installer bombed out after complaining that it couldn't verify the gutsy-security repository. This was easily fixed by commenting that repository out of my sources.list and re-running the upgrade. After that, the only problem was that the entire process was really slow and required sporadic user interaction. That part actually caused the upgrade to drag out over the entire day, as I would answer one prompt, go away for two hours, come back to another, and so forth. However, it wasn't as bad as the last upgrade.
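
The "fix" was literally just commenting out one line and re-running the upgrade tool (the repository URL here is approximate):

# /etc/apt/sources.list - disabled until the verification failure goes away
# deb http://security.ubuntu.com/ubuntu gutsy-security main restricted universe multiverse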

My work desktop, on the other hand, was a huge pain. The system functioned properly once I was done, but it was way harder than it should have been. But on the up side, the dual-core processor, 2GB of RAM, and gigabit LAN with a fiber connection made things pretty fast.

Basically, my problem was that the upgrade tool just kept crashing and hanging. For instance, I had to kill it after it spent over half an hour configuring an OpenOffice package. This might have been due to our internet connection going down briefly, but it's hard to say for sure, as the upgrade tool wouldn't give me any terminal output at that point. After that, I had another, similar hang, followed by several crashes. The one crash which kept recurring turned out to be caused by the mono-xsp package. Apparently, when the install script tried to shut down the service, the xsp script in /etc/init.d was bombing out with an invalid file descriptor error. This, in turn, was killing Adept. To cut a long story short, I forced the upgrade on xsp, ran dpkg --configure -a a few times, and everything was eventually fixed. But it was an ugly, command-line intensive process.
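
If you land in the same boat, the incantations that got me out were roughly these (the exact --force flag is from memory):

# Force the stuck xsp package past its broken init script, then let dpkg
# finish configuring everything the crashed upgrade left half-done
sudo dpkg --install --force-depends /var/cache/apt/archives/mono-xsp_*.deb
sudo dpkg --configure -a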

Going back to the brighter side, I'm liking Kubuntu Gutsy so far. It's not a radical departure from Feisty, but the changes I've seen so far are nice. I've already mentioned the restricted driver manager, which is great. The boot time seems to have dropped noticeably as well. They've also switched from Konqueror to Dolphin as the default file manager. I didn't care for Dolphin when I had tried it in the past, but it's starting to grow on me. It has the features I use on a regular basis and is certainly less complicated than Konqueror.

Another little nicety is that they've moved to a KDM theme with a user list. I believe that, in the past, KDM themes didn't work correctly with user lists, but it seems that that's no longer a problem. I always liked that feature - it just makes the login a little more personal, I think, what with the little avatars and such. It will also be nice for when I migrate my mother's system from Xandros 3.0 (hey, it seemed like a good idea at the time) to Kubuntu. Having a list of usernames means one less thing to remember.

Lastly, I see that they've added Strigi as the default desktop search engine. I don't have any experience with Strigi, but I have had some not-so-great experiences with Beagle. In particular, it has a tendency to just hammer your system, sometimes causing serious drag. It also generates pretty large indexes. I've read that Strigi is supposed to fix both of those problems, but haven't used it long enough to say for sure.

The one problem I have with Strigi is the user interface. To call it "spartan" would be an understatement. It's a browser-based interface that makes Google's front page look lavish. Honestly, it feels like the UI is an afterthought. Compare this to Kerry Beagle, which is simple to use, but very pretty and very functional. Hopefully there's a Strigi front-end someplace that's comparable.

So, to summarize, Kubuntu keeps getting better. Its evolution is gradual, but visible. We've certainly come a long way from release 5.10, which I thought was pretty good at the time. Now if only KDE 4 was done....

Screw encryption!

On Friday, I said I was finally going to secure my wireless LAN. As you can probably tell from the title of this post, that didn't go so well. As of this writing, I am still running an open system because that's the only configuration I can get to work with all three of my computers.

I've spent several hours messing with this today, and it's put me in a really foul mood. There was a time when I enjoyed messing around with my system configuration, but I just can't do it anymore. I don't care that much about networking. I have too many other things I want to spend my time on. I just want my damn network to function and not let anyone who drives by eavesdrop on all my traffic. Is that too much to ask?

My upgrade process started with a firmware update to my D-Link DI-524 C wireless router. This update included WPA2 support, which was a nice bonus. So my encryption options were now: nothing, WEP, WPA, WPA2, and something called WPA2-auto. On the down side, it included no additional documentation, so I have no clue what this "WPA2-auto" is supposed to be. But "auto" sounded promising, so I decided to go with that mode.

Turns out this was a bad idea. According to this forum thread, WPA2-auto doesn't seem to work consistently. Unfortunately, I didn't discover this until I had spent a considerable amount of time trying to get my PC configuration right. You see, I was misled because my laptop was able to connect one time while the router was in WPA2-auto mode. That led me to assume that the problem was with my PCs, not the router. Guess I should have Googled first.

So, eventually, I ended up going with plain-old WPA. The client configuration was a bit tricky for this. You see, my laptop uses NDISwrapper, so I could just use KNetworkManager to enter the pre-shared key. However, my desktops both have RaLink cards and use the rt2500 driver. This driver does not use the Linux wireless extensions and hence does not work with NetworkManager. To configure these cards, you need to add some lines to your /etc/network/interfaces file, as described here. It works, but the down side is that it breaks NetworkManager. However, since these are desktop PCs with 1 WiFi card connecting to 1 access point, that's not really a big deal.
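
For reference, the stanza for the rt2500 cards looks something like this (parameter names per the rt2500 driver's documentation, as best I recall; your interface name, SSID, and key will obviously differ):

# /etc/network/interfaces - WPA-PSK via the rt2500 driver's private ioctls
auto ra0
iface ra0 inet dhcp
    pre-up iwconfig ra0 essid "MyNetwork"
    pre-up iwpriv ra0 set AuthMode=WPAPSK
    pre-up iwpriv ra0 set EncrypType=TKIP
    pre-up iwpriv ra0 set WPAPSK="my-pre-shared-key"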

While the desktops weren't that difficult (once I got the right router settings, that is), the laptop was another story. I still haven't figured that one out yet. Of course, I was out of energy by the time I got around to it, so I wasn't exactly in peak form.

The laptop has an integrated Broadcom card which, as I said before, is configured to use NDISwrapper. This means it works with KNetworkManager. However, I couldn't get KNetworkManager to connect to the access point with WPA enabled. I selected the encryption mode, entered the pre-shared key, and then the connection progress bar would hang at 28%. The iwconfig output said that the card was associated with my access point, but I never got an IP address.

My current suspicion is that the laptop is using stale configuration data from my failed WPA2-auto attempt. I had some problem with stale configuration on the desktops too. For those, I just did a /etc/init.d/networking stop and then unloaded the driver module, then reloaded and restarted. That cleared everything up. In this case, however, I'm thinking it's the data stored by KNetworkManager. The only problem is, I have no clue whatsoever where I would look to find out. The interface is really spartan and there's no obvious way to delete stale configurations.

There is still one big functionality question I'm left with: how do I get NetworkManager to centrally configure an access point for all users? Both Sarah and I have our own accounts on the laptop, and I'd really like NetworkManager to automatically detect when our home network is present and connect to the access point at system start-up. I'm thinking there must be a way to do that, but there's nothing obvious in any of the configuration tools.

My wireless insecurity

I have a confession to make: it's 2007 and I still haven't set up any encryption on my wireless router. <Wince>

I know I should. I've known I should since I bought the thing two years ago. Every now and then I think I'll set it up, but then I just never get around to it. There just never seems to be a good time. However, as recent events have shown, even the best of us can have security problems, so it's time for me to get my butt in gear.

The real problem is two-fold. First, I'm a programmer, not a network admin. I know the basics of how networking works, and I can set up a basic home LAN without any problems, but I'm hardly an expert. I also know very little about WiFi, so I don't exactly have a lot of confidence that I'll get it right the first time.

This is a problem because I'm married to a lovely woman with absolutely no interest in geeky computer stuff. She also has a very low tolerance for things being temporarily broken. This goes double for "the internet," since web browsing and e-mail are 90% of what she does. This means that my only window of opportunity to mess with our LAN is when she isn't home. However, I usually have other chores to do at those times, so it never gets to the top of the priority list.

The second problem is technical. I wanted to set up encryption when I first installed my wireless hardware. However, the WiFi cards in my desktops are RaLink 2500 chipsets, and at the time, the drivers had only recently been open-sourced. The upshot is that I started out using beta releases of the community-supported driver and I was lucky to get it working at all. Encryption isn't a high priority when the network interface is just barely functional.

On top of the driver, both the router and my software were little help. My router is a D-Link DI-524 C1, which only included WPA-PSK support in a firmware upgrade (I'll be damned if I'm going to set up a RADIUS server) and included little to no documentation on it. And at the time I was using Slackware 10 and Xandros 3, neither of which had much in the way of helpful wireless configuration tools. So setting it up would have been a command-line, /etc/*, and vi hackathon of the type for which I have long since lost my enthusiasm.

Today, however, I'm using Kubuntu 7.04. It includes a nice, stable RaLink driver and the KNetworkManager utility, which allows you to easily connect to any number of wireless networks. I essentially have no excuse anymore.

So, tomorrow afternoon, I'm going to give it a try. I've already downloaded the latest firmware upgrade from D-Link in preparation. I'll just need to dig out my 25-foot network cable, wire up one of my computers, and go to work. With any luck, it will go smoothly and I'll be done in half an hour. If I'm not so lucky...I may still be running an open network next week.

DHCP fails on boot in Feisty

On Friday, I upgraded my laptop to Kubuntu Feisty. For the most part, it was uneventful. My only immediate complaint was that this upgrade, too, took 24 hours to complete. There are just too many places where the upgrade tool prompts for user input, so if you walk away, it ends up taking five times as long as it should.

However, later on, I did find one problem: my wireless network wasn't working. It's an integrated Broadcom BCM4318 which uses the Windows drivers through NDISwrapper. NDISwrapper was still working and the wireless extensions seemed to be functioning properly. Moreover, the interface was up. It's just that it never got an address from the DHCP server.

After some experimentation, it turned out that just restarting the network devices with sudo /etc/init.d/networking restart got things working. Same thing if I did an ifdown and ifup or even manually ran dhclient. Once the system is booted, pretty much any network re-initialization will fix things. It's just on initial boot that I don't get an IP.

Another thing is that if I use KNetworkManager to reconnect to my wireless access point, I lose my IP again. For some reason, the progress never gets past 28%, the "Activation stage: Configuring device" stage. I suspect there's some relationship, but I really have no idea what it is. I'm not that deep into the bowels of Ubuntu configuration.

The immediate fix was to simply add /etc/init.d/networking restart to my /etc/rc.local. That gets us an IP at login time and keeps Sarah from calling me in because she can't check her e-mail. In the long term, I'd like to figure out what the heck is causing this. There's a bug in Launchpad that looks similar, but I didn't see a fix.
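
The hack itself is a one-liner - just make sure it lands before the exit 0 at the end of the file:

# /etc/rc.local - runs once at the end of the boot process
/etc/init.d/networking restart
exit 0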

Hiding any file in KDE

I recently started taking advantage of a handy little feature utilized by Kubuntu: the .hidden file. It provides a simple way to hide any file in a directory.

As you may know, the traditional UNIX method for hiding files is to start the name with a period, such as ".vimrc" or ".profile". That works well enough, but there's one obvious problem: you have to rename the file. You can't just go renaming any old file or directory willy-nilly. That's how you break your system.

Hence the .hidden file. You just create a .hidden file in a directory and in it you put the names of any files or directories you want to hide, one per line. That's it. Once you have that, KDE will treat those files/directories as hidden.

This is quite useful for my home directory, because I have my home directory set as my desktop. For the most part, I like this setup. However, there are always a few directories that I seldom, if ever, need to see in the file manager, like ~/Mail or ~/texmf. Using a .hidden file conveniently frees up those extra pixels without the hassle of dealing with symlinks or shortcuts and without the need to reconfigure any software.
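
So, for example, a ~/.hidden that tucks away the two directories I just mentioned contains nothing but:

Mail
texmf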

Feisty upgrade

Kubuntu Feisty is out and I've upgraded my desktop. All in all, it was a fairly painless process.

The upgrade process for Feisty was much improved over earlier versions. The first time I upgraded, from Hoary to Breezy, it didn't go so well. In fact, the first couple of upgrades were fairly risky affairs. The upgrade from Breezy to Dapper was an unmitigated disaster, with the upgrade simply removing some key packages, like all of KDE.

This time, it went pretty smoothly. I did have to start over again, because the upgrade stalled while trying to contact the PLF repositories, which are now offline, but after I removed that entry from my sources.list, things went very smoothly.

The best part was probably that this upgrade was all graphical. You use Adept to install all the available updates, and when it's done, a little wizard pops up telling you that a new version of Kubuntu is available and asking if you'd like to install it. After that, you click next a few times, watch a few progress bars, and click through a few confirm prompts. That's about it. The installer even reboots the system for you at the end.

The actual upgrade took several hours to complete. In my case, it actually ran for around 12 hours, but I'm sure much of that was just sitting there waiting for input. You see, downloading the update packages took a couple of hours, but then it prompted me to install. And then, throughout the installation, there were several prompts for configuring various packages. So, unfortunately, it's not something you can just leave for a few hours and come back to an upgraded system. It needs a little babysitting. I had to come back periodically and do things like accept the default options for some RAID software that I don't even know why it's installed (I don't have a RAID array on my system) or tell it what to do with configuration files that had changed in some trivial way. Thus, since I was busy, I ended up spreading it out over the entire day.

So far, Feisty is nice enough. I haven't used it too much yet, but there doesn't seem to be much worthy of comment so far. It seems like just new versions of everything and a few incremental improvements. Which is as it should be. If I wanted a revolutionary change, I'd buy a copy of Windows Vista. Kubuntu seems to be getting very solid now, which is just how I like things.

Amarok upgrade = disappearing podcasts

The other day, I made the mistake of actually paying attention to that "updated packages available" icon in my system tray and actually installed those updates. I really should know better by now. Kubuntu is great, but I swear every time I install updates, something breaks.

This time, something happened to Amarok. I'm not sure what, but the next day, my podcasts were just gone. The previously downloaded files were still there, but the "Podcasts" folder in my playlist panel was empty.

The worst part was that, when I tried to re-add the few podcasts I subscribe to, things didn't work properly. I used the "configure children" option to set the defaults to download under a particular directory and to download when available. However, when I added child podcasts, the settings reverted to the system defaults. And for some reason, it took several tries to get the "download when available" option to stick.

So my question is: what the hell?!? That's just weird and I can't figure out, for the life of me, why that would happen. But on the other hand, Ubuntu Feisty is supposed to be coming out in the next week, so hopefully this won't be an issue for much longer.

Laptop battery life

I finally got around to looking up a solution to my laptop power problem. I'd actually never really worried about it, since I almost always use the laptop plugged in, but I read a blog post complaining about battery life under Ubuntu, so I figured I'd look into it.

My Dell Inspiron B120 normally got about an hour and ten minutes on battery. I never actually ran Windows on it, so I have no basis for comparison. However, I'd read other comments about people getting two to three times as much battery life on Windows. For instance, my brother's Inspiron 1501 gets about 3 hours.

The good news is that there's "laptop mode." By flipping on the ENABLE_LAPTOP_MODE setting in /etc/default/acpi-support, I gained about 20 or 30 minutes.
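
That is, the whole tweak comes down to one variable:

# /etc/default/acpi-support
# Turn on laptop mode (disk spin-down and friends) when running on battery
ENABLE_LAPTOP_MODE=true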

The bad news is that there's not an awful lot of other tweaking to be done. There are some other tips on the Ubuntu wiki, but I haven't found anything that seems to make a dramatic difference. At least, not anything I can configure without a truly intimate knowledge of the system.

So basically, it seems like we're kind of out of luck for the time being. But I'm not going to complain too loud. I'm just glad all the hardware in the laptop actually functions.

Kubuntu wins again

I collected a little more proof that Linux is better than Windows today. Or, at the very least, Kubuntu is better than Windows for the work I do. In particular, installing new software is so much more convenient, despite the crap that the nay-sayers spew.

One of the Senior Programmers (SP for short) came to me with a problem related to viewing multi-page TIFF files. Basically, the SP had written a program to normalize the canceled check scans we get from various banks. Of course, each bank has their own data format and associated viewing application, but nobody wants to have half a dozen programs to do the same thing. Hence this system.

Anyway, one of the banks had changed the format of the data they were sending us, and while each file contained images of several checks, the SP's third-party TIFF tools and libraries could only access the first one. It seems the internal pointers in the TIFF file were either not present or not correct. So we put our heads together and the SP gave me a copy of some sample data to play with.

Now, since I was at work and running Windows XP, I could have scoured the internet for TIFF tools, downloaded a half-dozen MSI or EXE file, and installed them. Instead, I fired up a Kubuntu Edgy virtual machine and started up Adept.

The first thing I installed was the libtiff tools. They had the same limitation as the SP's tools. So I installed KHexEdit to look at the raw TIFF file data. Each of these was selected and installed in a matter of seconds.

Meanwhile, the SP had managed to dig out an old DOS-based hex editor from the bad old days of QBasic. Why? Because that was the only thing lying around. Needless to say, it didn't hold a candle to KHexEdit and was, in fact, totally inadequate.

I, however, was investigating the suspicious "offset" and "length" attributes in the XML file that came with these images. I jumped to the first offset value in KHexEdit and noticed that, like the beginning of the file, it started with a TIFF byte order marker. And so did the next offset. So I set bookmarks at those offsets, copied the binary between them, and pasted it to a new file. KHexEdit made it almost painfully easy. And, as I suspected, the resulting file was, in fact, its own valid TIFF file. The bastard bankers had just concatenated a bunch of TIFF files together!

And so the problem was solved. And with the help of Kubuntu, I did it before the SP even had time to hunt down and install a proper hex editor.
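
As an aside: once you know the offsets, you don't even need a hex editor to do the carving - dd will happily cut the embedded files out for you (the offset and length values here are made up; the real ones come from the XML):

# Carve one embedded TIFF out of the concatenated blob.
# skip = the image's "offset" attribute, count = its "length" attribute.
dd if=checks.dat of=check-0001.tif bs=1 skip=51234 count=48120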

I guess it was monodoc

Well, I just figured out why installing MonoDevelop installs Lynx on Ubuntu. Turns out it's monodoc-base that depends on Lynx, not Monodevelop. I'm still not sure why monodoc-base (or anything else, for that matter) depends on Lynx, but at least now I know where the dependency is coming from.

There's no pleasing some people

Here's an odd one. A Slashdot story last week linked to a summary of Mark Shuttleworth's comments on Dell and Linux. The Slashdot summary said the article contained a response to Shuttleworth that was "amusing and telling at the same time."

I found that link odd because I'm not sure if that was meant to be ironic or not. The "amusing and telling" response was, and I quote:

"i'm getting so sick and tired of hearing excuses and rationalizations. just put the cd in the cupholder, install it and sell it. period. there's no need to analyze or certify. what is so hard about this?"


As far as I can tell, the only thing that tells us is that I was right and the community really is full of people who need to grow the hell up.

You almost have to feel sorry for Dell when people say things like that. It's a classic case of "damned if you do, damned if you don't." First, people complain that they don't pre-install Linux because, after all, you just have to stick in the CD and wait. But if they did that, people would complain that the system came unconfigured and they had to just start over from scratch anyway.

It reminds me of an interview with someone from Loki Games that I read years ago. (Wish I could find the link....) For those who don't remember, Loki was in the business of porting popular PC games to Linux. They went out of business in late 2001.

Apparently, they got complaints from some people that they couldn't download their games for free. Sure, the game engines were free and ran on Linux, but the media were all licensed from the original publisher, so Loki couldn't give them away even if they wanted to. But people didn't care about that. They said they wanted games that were open-source and ran natively on Linux, but not enough to pay for them. And yet some still felt entitled to complain about it. I guess it's just human nature....

A fair review

Last week, HConsumer published an article entitled 30 Days with Linux, in which the reviewer spent a month using Ubuntu. I have to say that this is probably the single best Linux review I've ever read.

There were two things I really loved about this article. First, I thought it was extremely fair. The reviewer seems to understand and accept that when 90% of the world uses Windows, there is a price associated with choosing to use anything else. In particular, he didn't blame Linux or Ubuntu for things like not supporting his favorite software or doing things differently than Windows, which you see far too often in this kind of review. That little bit of understanding really made it clear to me that the reviewer was trying to give an honest assessment, as opposed to just cranking out enough words to fill a column.

The second thing I loved about this article was the length of the review period: 30 days. I think it's difficult, if not impossible, to fairly evaluate an operating system/environment in much less than that. Every system is different, and it takes time to adjust when switching from one to another. You need time to figure out how things are done in the new system, to find the shortcuts, and to really appreciate the more advanced features. A few days or a week might suffice for a shallow overview, but those aren't really useful to anybody.

I hope we see more articles like this in the future. The computing trade press has a nasty tendency to be shallow and trashy, so it's nice to have some thoughtful, open-minded criticism.

Firefox feeds fixed

I've discovered why Firefox hates me. Apparently it's not just me, but broken code in Firefox.

I found the solution in this Mozillazine thread. It contained no mention of the error I got in the console, or of external feed readers simply not launching, but the patch to FeedConverter.js did fix my problem.

All I had to do was change the indicated section of /usr/lib/firefox/components/FeedConverter.js, start Firefox, and add the strings browser.feeds.handlers.application.args and browser.feeds.handlers.application.uriPrefix in the about:config dialog.

So now external feed readers work. In fact, they work better than they're supposed to, because this patch adds support for setting arguments to the external command, so no wrapper script is required. For Akregator, I just set "--addfeed" as the browser.feeds.handlers.application.args and /usr/bin/akregator as the command. Bingo!
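
To put the whole Akregator setup in one place, the prefs end up looking like this (the first two values are the ones I mentioned; as far as I can tell, the .uriPrefix string just has to exist):

browser.feeds.handlers.application           /usr/bin/akregator
browser.feeds.handlers.application.args      --addfeed
browser.feeds.handlers.application.uriPrefix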

MonoDevelop depends on Lynx?

I just installed MonoDevelop for the first time and I have a question: why does the Ubuntu MonoDevelop package depend on Metacity and Lynx? It makes no sense.

The Lynx part really gets me. What possible use could a graphical IDE have for a text-mode browser? Especially given that it depends on Firefox as well.

And what's worse, I can't even see where Lynx is getting pulled in. I don't see it in MonoDevelop's list of dependencies, but Adept doesn't want to install MonoDevelop without it. What's the deal?

Weird "BadDevice" error

Today I finally got around to looking up the solution to that weird X error I kept getting in Ubuntu. It's the one where running pretty much any X application from a virtual terminal results in errors like "X Error: BadDevice, invalid or uninitialized input device 168".

Turns out this is a pretty easy fix. These messages are apparently due to lines in xorg.conf to support some kind of tablet PC, or something. So, just comment them out, restart the X server, and you're good to go.
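
If memory serves, the culprits are the stock tablet sections that ship in Ubuntu's default xorg.conf - something like the block below, repeated for "stylus", "eraser", and "cursor". Comment them out (along with the matching InputDevice lines in the ServerLayout section) and the errors go away:

# Section "InputDevice"
#     Driver      "wacom"
#     Identifier  "stylus"
#     Option      "Device" "/dev/wacom"
#     Option      "Type"   "stylus"
# EndSection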

Card reader and flash drive

I got the rest of my order from NewEgg today. I ordered an internal multi-card reader and two of the 1GB SD card/mini USB card reader combos. Oddly enough, I got the two SD cards last week, but the three card readers shipped separately from the cards, despite the combo deal.

Anyway, there is good news here. First, the multi-reader, a Rosewill RCR-100/101, works flawlessly with Kubuntu. Card inserts are detected correctly and writing is fairly fast, unlike with the reader built into my printer. Now I can fill up that 1GB card for my MP3 player without having to wait forever! This reader also gives me a front USB port, which I had been missing. (Due to an oversight in ordering the parts for my PC, my case has no front USB.)

In fact, it's even better than that. It turns out that my Edgy USB drive problem no longer occurs when I use the new front USB port. Apparently it only happens when I plug the pen drive into my Walmart-special Belkin external hub. I'm not sure why, but it was the cheapest USB hub they had on the shelf when I bought it, so I guess I can't be too surprised.

Edgy swap bug

Hey, I found another Edgy bug! This one only happened on the laptop. I didn't notice it until today because it happens during boot, and until an fsck ran this morning, it was hidden behind the "user-friendly" startup screen.

Aside: am I the only one who thinks the new boot screen really sucks? Sure, it looks nice, but I kind of like knowing what's actually going on when the system boots. I don't need to see all the gory details, but when the progress bar stalls for a really long time, I'd like to at least know what subsystem it stalled on.

This particular problem was simple: my swap partition wasn't getting turned on. A manual sudo swapon -a resulted in an error that said my long-uuid-for-device was an invalid parameter.

Second aside: What's with the device UUIDs in Edgy? Where did that come from and why did they switch to it? I don't necessarily think it's a bad thing - in fact, it seems like a good idea. I just wonder where it came from.

Anyway, the solution was simple. Just run sudo mkswap /dev/some-long-uuid followed by sudo swapon -a and everything was great. I still don't know what caused the problem, though. My initial guess was a bad UUID symlink, but the symlink turned out to be correct. I guess I'll see what happens the next time I boot it.
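
One thing I should keep in mind for next time: mkswap normally writes a brand-new UUID onto the partition, so if /etc/fstab refers to the swap space by UUID, re-creating it the way I did could break the next boot. Assuming your mkswap supports the -U option, the safer version is presumably:

# Re-create swap but keep the UUID that /etc/fstab already references
sudo mkswap -U the-long-uuid /dev/sda5   # device name is just an example
sudo swapon -a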

Edgy and NDISwrapper

Another Edgy problem popped up the other day. I upgraded my laptop and suddenly NDISwrapper no longer worked. The driver wasn't getting loaded and attempting to do it manually with modprobe resulted in an error about an invalid parameter to the driver.

For some reason, just as with the flash drive thing, this problem went away when I booted into an older kernel. (Side note: the USB drive does work normally with the 2.6.17 kernel on the laptop.) Strangely enough, booting with kernel 2.6.15-25 worked perfectly, but 2.6.15-23 gave me the old boot screen and hung on "waiting for root filesystem."

The problem seemed to be that, somewhere in the process, my ndiswrapper-utils package got seriously screwed up. I had the ndiswrapper-utils package, but it seems that this actually depends on ndiswrapper-utils-1.1, which wasn't installed. After looking at the available packages, I actually ended up removing ndiswrapper-utils and installing ndiswrapper-utils-1.8. (I don't know why the default is ndiswrapper-utils-1.1 - you'd think the newer version would be the one to use.) This did the trick and my network card is now working normally.

Edgy USB bug

Today I discovered an extremely inconvenient bug in Kubuntu Edgy. It's a problem with a USB mass storage device that only occurs under the 2.6.17 kernel.

I have this USB flash drive that I bought this summer. It's a little unusual in that it came with two partitions on it. One is a bootable (I think) 1.4MB partition and the other is allocated the remainder of the 1GB drive capacity. On Windows, the boot partition is detected as a USB floppy drive, while the main data partition is picked up as a standard USB mass storage device. In Kubuntu Dapper, both partitions were detected as USB mass storage volumes and mounted when the drive was inserted.

When I plugged that drive into my external USB hub today, it didn't work. And by "didn't work," I mean it didn't work at all. No auto-mount, no desktop icon - udev didn't even create a freakin' device node. I checked the output of dmesg and lshal, and it seems that the drive was detected as some kind of generic USB device. In fact, dmesg only reported a "new full speed USB device using uhci_hcd and address 12." No mention of any device nodes, drivers, or anything else. Just a notice that the device was not plugged into a high-speed hub.

I don't yet know what the problem is, but it seems to be at least partly in the kernel. How do I know this? Because, on a hunch, I tried rebooting and running with the 2.6.15 kernel I still had installed from Dapper. When I did that, the device worked. It seemed a little slow, but it worked.

It's important to note that this problem only occurs with the one drive. I have an old single-partition 32MB Lexar flash drive and that functions perfectly under Edgy with kernel 2.6.17. I'm guessing the problem is something about that weird two-partition setup. I don't know why that would cause a problem, but it's all I can think of at the moment.

It finally worked!

I upgraded to Kubuntu 6.10, Edgy Eft, this evening. I was shocked by the results - it worked flawlessly!

I started using Kubuntu with Hoary and this is literally the first time I've had an uneventful upgrade. Every other time something has gone wrong and left my system partially, if not totally, borked. But not this time. The upgrade instructions were simple and complete, unlike the upgrade to Breezy, and there was no massive breakage, unlike the upgrade to Dapper. Everything was nice and smooth.

The weird part is that I'm very ambivalent about my reaction to the smooth upgrade. Part of me wants to say, "Well done, Kubuntu team! Keep up the good work!" However, another part of me wants to say, "It's about freaking time! Why did it take so long to get non-sucky upgrades?"

Apart from the nice upgrade, I have yet to see anything truly remarkable in the new version. Things look slightly nicer and now I have Digikam, which seems pretty nice. That's about it.

I have noticed one problem, though. Some of my hotkeys are no longer working. I had all my multimedia keys set up in KHotKeys, but two of them have stopped working. KDE and X.org still recognize the keys, but pressing them doesn't run the desired command. I have other key combinations for the same commands, so it's not critical, but it is annoying, as those are the two I use most often (the keys for Konqueror and Konsole).

Plus I see that KMixer still can't mute my sound card, a CMI8738 chipset. I don't think it's ever worked on this system and I don't know why. I must admit I'm a little curious, but I almost never need mute and when I do I just hit the power button on my speakers, so it hasn't been worth the effort to figure it out.

MPIO pain

Why is my USB so messed up on Ubuntu? First my cell phone data cable doesn't work properly, and now my MPIO FL100 MP3 player doesn't work right. What's the deal?

Let me rewind and give some background. A few years ago, my parents gave me a 128MB MPIO FL100 MP3 player for my birthday. It's a nice enough player, with a display and decent controls, but it's not Linux-friendly. In other words, it doesn't work as a block device, so you can't just mount it as a USB hard drive. It requires special software.

Fortunately, such software exists for Linux. The MPIO project over at SourceForge hosts it. It has both a command-line application (similar to an FTP client, for some reason) and a simple KDE-based front end. Of course, this project is now almost completely dead save for a post to the mailing list every month or two, so any hope of bug fixes or new features seems unfounded, but the software still works.

Or, it did until recently, when I noticed that this player no longer works properly under Dapper. I'm not sure exactly when the problem started - if it was the upgrade to Dapper or some upgrade since then. Either way, things are definitely no longer normal.

I'm now having trouble connecting to the device. I experimented this evening, and it seems I need to be root in order to connect to it. When connecting as a normal user, which used to work perfectly, I now get a message that the device cannot be found. Connecting as root, however, seems fine. The first time, at least. If I disconnect the software and try again, I get a message like this:
mpio: src/io.c(766): mpio_io_read: libusb returned error: (ffffff92) "No error"
mpio: src/io.c(816): mpio_io_version_read: Failed to read Sector.(nread=0xffffff92)
mpio: src/mpio.c(409): mpio_init: Unknown version string found!
Please report this to: mpio-devel@lists.sourceforge.net
mpio: src/mpio.c(410): mpio_init: data=0x8052820 len=64
mpio: 0000: 00 00 00 00 00 00 00 00 00 00 00 00 20 20 20 20 ............
mpio: 0010: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
mpio: 0020: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
mpio: 0030: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ................
mpio: src/mpio.c(171): mpio_init_internal: WARNING: no internal memory found

It seems to work again if I physically disconnect the device.

This is growing to be a pain. At this point, I guess the main question is, how much effort do I want to put into fixing support for a now obsolete device? I'm guessing the answer is going to be "not that much."

Random Kubuntu complaints

It's been a month since I posted anything, so here's my random list of complaints about Kubuntu. Some of them are probably hardware related, but I'll post them anyway.

  1. Adept is great - except for proprietary software. For example, the install scripts for vmware-player and the Sun JDK both require you to accept the license agreement. However, Adept's terminal emulator doesn't seem to accept input. I can see the license agreement (or at least part of it), but I can't actually hit OK to agree to it. The only way to do it is to forget Adept and install using apt-get from the command line.
  2. Why does my CD burner not exist when I boot up after turning off the PC? The drive has power and everything, but /dev/hdc just doesn't get created. But if I reboot, then when the box comes back up, /dev/hdc is there and functioning normally. What's up with that?
  3. Sorry, but the default desktop settings suck. Call me a heretic if you want, but I would rather the default settings be like Windows. Yeah, I can and do change them, but it's just a pain. Maybe it would be better if they did what Xandros does and just ran the settings wizard on the user's first login. Or something.
  4. The kernel is buggy. Unplugging my cell phone from the USB/serial data cable crashes something in the kernel's USB subsystem. I'm not sure what, but the exact error message from the log includes the line "kernel BUG at kernel/workqueue.c:109!" so it's definitely a kernel bug. And whatever it is keeps my phone or USB thumb drives from working until I reboot.
  5. Probably related to the above, shutdown sometimes hangs at stopping Bluetooth services. Note that I don't actually own any Bluetooth devices, so I'm not sure what's happening there.

Upgrading the laptop

In a fit of optimism, I decided to upgrade my laptop to Kubuntu Dapper last night.  As usual, the upgrade process really sucked.  This time I knew what to look for, so it didn't suck as hard as the last time, but it still sucked.  Ubuntu may be a really great distribution, but they really need to work on the upgrade procedure.  I don't want to waste my time rebuilding a working system every six months.

After playing around with possible dependencies for way, way too long, I was finally able to get around that "upgrading uninstalls KDE" problem I had last time.  I ended up removing KOffice (which I wasn't using anyway), Amarok, and k3b-mp3.  I'm not sure exactly which package caused the problem, but after that I was able to mark all upgrades and then kubuntu-desktop, which gave me a proper upgrade.  Of course, I still had to go through a few dozen packages that Adept wanted to remove rather than upgrade (like kdegames and kile), but I eventually got everything I wanted marked for upgrade and the package installs went smoothly.

The main problem I had was that my WiFi card stopped working after I rebooted.  It took a disturbingly long time to figure out the problem, but I've got it sorted now.  It seems that Dapper includes a driver it thinks works with my integrated Broadcom BCM4318 card, but really doesn't. Naturally, this conflicted with the working NDISwrapper installation. All I had to do to fix it was add the bcm43xx driver to my /etc/modprobe.d/blacklist file. After that, ndiswrapper was free and clear and my WiFi connection was fixed.
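
For the record, the whole fix is one line:

# /etc/modprobe.d/blacklist - keep the broken in-kernel driver away from
# the card so NDISwrapper can claim it instead
blacklist bcm43xx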

And now scrolling is working

I mentioned earlier that the scrolling buttons on my Logitech Marble Mouse weren't working since the upgrade to Dapper. Well, now they are. Apparently, all I had to do in my old configuration was change the "EmulateWheelButton" variable from 5 to 9, which is the new number for the same button I was using before. Not sure why the change happened, but to be honest, at this point I really don't much care.
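
For any other Marble Mouse owners, the relevant chunk of the mouse section in my xorg.conf now reads something like this:

# Hold the (newly renumbered) button 9 and roll the ball to scroll
Option "EmulateWheel"       "true"
Option "EmulateWheelButton" "9"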

At least Opera is working

At least one of my Dapper problems is fixed now. I got Opera 9.0 beta 2 installed last night.

It turns out the problem was with the dependencies in the Opera .deb package. Apparently, the Breezy Opera .deb lists xlibs as a dependency, but Dapper has changed naming conventions, so there no longer is a package named xlibs. The program still works perfectly if you force the installation, but then APT sees the package as "broken" and won't let you install anything else without removing it.

Well, there were two possibilities to fix this. The first was to make a dummy xlibs package to satisfy APT. However, I didn't know how to do that, so I took the brute-force approach: explode the package, remove the dependency, and rebuild it. It wasn't pretty, but it worked like a charm. I figure the next stable release will have a Dapper .deb, so I don't imagine I'll have to be repeating this later.
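
For anyone who needs the brute-force approach, it's a few dpkg-deb commands (filenames are examples, and the sed expression assumes xlibs appears mid-list in the Depends line):

# Unpack the payload and the control files, drop the bogus dependency,
# then rebuild the package and install the fixed version
dpkg-deb --extract opera_9.0-beta2_i386.deb opera-tmp
dpkg-deb --control opera_9.0-beta2_i386.deb opera-tmp/DEBIAN
sed -i 's/, xlibs//' opera-tmp/DEBIAN/control
dpkg-deb --build opera-tmp opera-fixed.deb
sudo dpkg --install opera-fixed.deb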

Upgrading to Dapper

Well, I installed Kubuntu 6.06 "Dapper Drake" from scratch the other night. I didn't actually intend to, though. I actually wanted to upgrade, but that didn't go so well. In fact, the upgrade completely hosed my system. Either I made a really bad mistake following the directions on the Kubuntu web site, or the directions were just plain bad. (To be fair, they were at least incomplete.) At any rate, Adept stopped responding in the middle of configuring autofs and I had to kill it. I tried to recover by running apt-get, which told me to run dpkg, which seemed to complete the upgrade. However, after rebooting, I found that I was without KDE (as in the upgrade removed it and never installed the new version) and without a network connection. So, since I had wanted to fix my brain-damaged legacy partition scheme anyway, I just reinstalled.

I must say, I'm extremely impressed with the new installer. Instead of booting into a crummy command-line installer, the install CD is now a live CD. You boot into a working graphical environment and then run a graphical installer program to put Kubuntu on your hard drive. You answer a few questions, configure your hard drive with QtParted, wait a while, and reboot. The mount point selection screen could have used a few more visual cues, but other than that it was the easiest install I've ever done. Bravo!

However, it wasn't all wine and roses. After booting into my new installation, I found that I no longer had sound, despite the fact that I'd gotten the system sounds when I booted from the live CD. There didn't seem to be any obvious reason for it. After flailing around in vain for a while, I found the answer in this Kubuntu Forums thread, which basically said to turn off a few switches in KMix. I certainly never would have thought of that, since I never actually set any of those switches. In fact, I don't even know what they do. But, at any rate, it worked.

It also took me a few minutes to remember that Ubuntu doesn't come with MP3 support out of the box. It would have been a lot faster if Kaffeine or Amarok had just told me that they couldn't play MP3s. But nooooo, that would be too easy. Instead, I sat there for fifteen minutes wondering why Amarok was zooming through a two hour recording in three seconds without making a sound. I probably would have figured it out sooner if I wasn't so flustered from the previous failure of sound to work at all. At any rate, I eventually remembered and followed the new instructions to enable the multiverse repository and install libxine-extracodecs.

In other bad news, I still haven't gotten the extra buttons on my Marble Mouse to work yet. Just copying the mouse setup from my old xorg.conf file didn't work. The really weird thing about that is that when I try to remap the buttons with xmodmap, I get errors saying that I need to define 11 buttons instead of 7. But the trackball only has four physical buttons and I've only set the buttons option to 7 in my xorg.conf file. So where is it getting 11 from?

We'll see how the rest goes. I've still got a bunch of other things to install and configure, so I'm sure there will be some rough spots. Hopefully I'll get them all worked out and be able to post the solutions for posterity.

The Laptop Saga: Cut Short

Well, that was easy. The new laptop is up and running with Kubuntu Breezy. What's more, I worked on it for less than two hours and everything I've tried is working perfectly.

So here's the deal. On Saturday morning, my neighbor brought over the box with my new Dell Inspiron B120 in it. The delivery had left $500 worth of laptop sitting on the front steps, so my neighbors picked it up for safekeeping. (Thanks, guys!) I was out of town until Sunday night, so I didn't get a chance to play with the new system until this afternoon.

By the way, while I was away, I went shopping for a case and a USB mouse. I found a nice padded case with lots of pockets at Staples for $30 and a Logitech optical mouse for $15. That beats the $44 Dell wanted for a laptop case.

The install was pretty painless, as usual. It took a little longer than expected because the battery died halfway through the installation. (Apparently they don't come pre-charged.) I could have tried to rescue the install, but I figured it would just be faster to plug the laptop into the wall and start over, so that's what I did. After the initial install, everything worked except WiFi. Even the integrated sound works. Of course, I haven't tried suspending it yet, but that's not a huge concern for me at the moment.

Getting the integrated Broadcom WiFi card to work was actually surprisingly easy. I pretty much just had to follow the instructions in the Ubuntu NDISwrapper how-to, and it worked. No compiling necessary! Based on the lspci output, the integrated WiFi card was the first Broadcom card on the supported cards list. Basically, all I had to do was download the two linked files and follow the directions in the how-to, and everything worked. In fact, the whole thing was so easy that the Ubuntu team could probably automate it. Wouldn't that be sweet?
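
The how-to boils down to a handful of commands. Roughly (the .inf file name is whatever comes with the Windows driver; bcmwl5.inf is a common one for these Broadcom cards, but don't quote me on that):
# Install the Windows driver into NDISwrapper
sudo ndiswrapper -i bcmwl5.inf
# Confirm the driver and hardware were recognized
ndiswrapper -l
# Load the module now and set it up to load at boot
sudo modprobe ndiswrapper
sudo ndiswrapper -m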

So, I am glad to report that my quest to put Kubuntu on my laptop was cut short when everything unexpectedly worked. I guess it turns out I didn't need to order those Windows CDs after all. In fact, I wish I hadn't, because not only did I not need the driver disk, the CDs are those OEM restore disks, not real Windows, so they're basically useless for anything except restoring the system you bought them for.

Let's try a little more swap

I'm still having intermittent problems with my Kubuntu box just dying on me. This has been going on for a while, and I can't figure out what the problem is. See, every now and then, the hard drive will just start thrashing like crazy and the system will go completely unresponsive. If I leave it alone for five or ten minutes, sometimes it will come back to life. Then again, sometimes it won't, and my only option is to hit the reset button.

The problem appears to be something with X. If I catch it before it goes completely non-responsive, I can sometimes hit Ctrl+Alt+Backspace and restart X. After that, it's just fine. It appears to be some kind of memory leak, as I once found messages in the system logs indicating that processes were being killed due to insufficient memory. It's probably a slow leak, since it usually only gives me problems after I've been logged in and working for a long time. I haven't been able to connect the lock-ups to any particular application, but it most often happens when I'm using Opera or Kontact. However, that doesn't necessarily mean anything, because I'm nearly always running Opera and Kontact.
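
If you want to check your own logs for the same thing, the kernel's out-of-memory killer leaves fairly distinctive messages. Something like this should turn them up (the exact wording varies by kernel version):
# Look for OOM-killer activity in the kernel and system logs
dmesg | grep -i "out of memory"
grep -i "killed process" /var/log/syslog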

Anyway, I decided to mitigate the problem by adding some more swap space to my system. I've got half a gig of RAM in my current system but only had about 256MB of swap space, thanks to an old partition scheme. So I decided to take the old 200MB partition I had been using as /var under an old Slackware installation and repurpose it as swap.

I'd never actually manually configured a swap partition before, but it turned out to be surprisingly easy. I had assumed I would need to change the partition type, but according to the mkswap man page, Linux doesn't actually check the type of the partition. So, basically, all I did was run mkswap -c /dev/hda5, change the /etc/fstab entry to mark the device as a swap partition, and run swapon -a. Now I've got another 200MB of swap space. We'll see if it makes any difference. At the very least, I'm hoping it will buy me enough time to notice the problem before the system locks up.
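
For posterity, the whole process was just the following (the fstab line is shown as a comment):
# Initialize the old partition as swap, checking for bad blocks
sudo mkswap -c /dev/hda5
# /etc/fstab entry so it gets picked up at boot:
# /dev/hda5  none  swap  sw  0  0
sudo swapon -a
# Verify that the new space showed up
swapon -s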

Playing with prelink

Well, I finally got up the courage to try prelink today. OK, so it's more like I said, "Oh, what the hell," but the outcome is the same. I installed prelink from the Ubuntu repositories and ran it.

If you've not heard of it, prelink is a tool that, well, prelinks your binaries. I'm not exactly an expert on shared libraries, but my understanding is that it basically does the work of the dynamic linker ahead of time. So, instead of the linker relocating libraries at run-time, it's already done. The net result is that programs load faster. This is supposed to be especially true of C++ programs, as they apparently make the loader do a lot of work.

I had heard of prelink some time ago but was always a bit wary of it, as I am of anything that modifies system files. But I read through a thread on it at Ubuntu Forums, and there seemed to be few people who had problems, so I decided to give it a go. I also read the documentation and saw that prelink has an undo option, and that libraries should revert to their normal loading if, for some reason, prelinking fails. That eased my mind a little.
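
The commands themselves are simple. This is roughly what I ran; the flags are the commonly recommended ones, so treat them as a starting point rather than gospel:
sudo apt-get install prelink
# Prelink all binaries, conserving memory and randomizing base addresses
sudo prelink -amR
# If anything goes wrong, this undoes it all
sudo prelink -ua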

It's only been a few hours, but so far I haven't been disappointed. I haven't actually timed application start-up with and without prelinking, but KDE applications are definitely loading faster. I don't know if they're loading in half the time, as some people claimed, but the difference is definitely noticeable. In particular, I noticed that Kontact is coming up much faster. Also, Konqueror is loading with almost no wait. Basically, things are now starting in what feels like the right amount of time. I know my Sempron 2500 with 512MB of RAM isn't exactly a high-end system, but it's got enough horsepower that I shouldn't have to wait three seconds for the file manager to start up. And now I don't.

VMware woes

Wow, two weeks since the last entry. I guess I haven't had much time for blogging lately.

Anyway, I finally fixed my problem with VMware Player today. After hosing my system a month ago, I hadn't bothered to try installing VMware Player again until about two weeks ago. I didn't figure it would be an issue, since I'd used it under Breezy before. But I was oh so very wrong.

VMware Player works perfectly on a vanilla Kubuntu Breezy installation. However, I had been keeping up with the updates. That was my mistake. It seems that the kernel 2.6.12-10 package is not compatible with VMware's kernel drivers. I had kernel 2.6.12-10 installed, and every time I tried to use a bridged network connection (I'm too lazy to figure out how to use NAT), something bad would happen. If I was lucky, the player would just crash. If I was unlucky, I'd get a kernel panic and the whole system would go down.
The solution? Downgrade to kernel 2.6.12-9. Yes, that's right - "downgrading" to a previous build of the same kernel version fixed the problem.
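
If anyone else hits this, the downgrade itself is just a matter of installing the older image package and picking it from the GRUB menu at boot. The exact package name depends on your kernel flavor; mine was something like:
# Install the previous kernel build (flavor suffix may be -686, -k7, etc.)
sudo apt-get install linux-image-2.6.12-9-386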

I think I've learned a valuable lesson from this. That lesson is: screw updates. I don't need to be on the bleeding edge. In fact, I don't need to be on any edge. It's safer to take a Windows kind of approach and use the same damned software until the next OS upgrade. In fact, come to think of it, I should write an entry on Windows and compatibility some time.

So that adventure was a huge waste of time and energy. I still don't know whose fault it was - Ubuntu's or VMware's - and, frankly, I really don't care. It's working now and I'm going to stop doing kernel updates because of this. Between this and my under-sized /boot partition that I don't feel like fixing, kernel updates just aren't worth the trouble.

Making HPLIP work

I finally got around to figuring out how to make HPLIP work under Kubuntu Breezy today. It turned out to be absurdly simple.

My problem was that the HPLIP toolbox refused to run because there were no HP printers set up in CUPS. Of course, that's not really correct, because I had set up my HP PhotoSmart 7760 in CUPS and it was working perfectly. However, because it wasn't set up using the hp:/ protocol, HPLIP couldn't access the advanced functionality, like the card reader, ink monitor, and so forth.

The solution was simply to remove and re-add the printer in the KDE print manager. Of course, this wasn't immediately obvious, as the choice for the hp:/ protocol was lumped in under the "other printer" options on the printer type screen. I would have just used the CUPS web administration tool, but for some reason the Ubuntu people saw fit to disable the administrative capabilities of the web front-end. Apparently it's for "security reasons," which doesn't make much sense to me, because the default configuration limits access to the local host anyway. Oh well, I'm sure it seemed like a good idea to them.
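
Incidentally, you can check which protocol a printer is using with lpstat. You want to see an hp:/ device URI rather than a plain usb:/ one (the URI below is illustrative, not my actual output):
# Show the device URI for each configured printer
lpstat -v
# e.g. "device for PhotoSmart: hp:/usb/PhotoSmart_7760?serial=..."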

Of course, if I hadn't hosed my system with a routine kernel upgrade last week, I wouldn't have had to go through this in the first place, but that's a story for another day.

Synaptic and GTK-Qt

I ran across this thread on Ubuntu Forums the other day regarding the GTK-Qt theme engine and kdesu. As you may know, the GTK-Qt theme engine is a handy bit of software that basically causes GTK+ to call Qt functions to draw its widgets instead of drawing them itself. The result is that your GTK+ applications look exactly like native Qt applications.

The only problem I've had while using this in Kubuntu is that it doesn't work correctly when you start an application using kdesu, KDE's graphical version of su or sudo. For instance, if I start Synaptic from the Kmenu, my KDE theme doesn't take effect. In fact, in some cases, Synaptic seems to revert to some phenomenally ugly theme that I don't think I've ever seen before. However, if I start it using sudo from a terminal, it behaves as it should.

It turns out you can fix this with a couple of simple symlinks that point root's Qt settings at your own (you may need to create /root/.qt first):
sudo mkdir -p /root/.qt
sudo ln -s ~/.qt/qt_plugins_3.3rc /root/.qt/qt_plugins_3.3rc
sudo ln -s ~/.qt/qtrc /root/.qt/qtrc

That's it. Now Synaptic works as it should and all is right with the world. My only remaining question is whether this problem is a bug or a feature.