The Mythical Man-Month

I've decided it's time to do a little professional development.  In other words, stop putzing around with whatever new technology is "cool" these days (grumble, grumble kids with their Snapchats, and their Node.js, and their Dan Fogelberg) and get back to fundamentals.  That means design and architecture, development process, project management, quality control, documentation, and all those other things that are important, but not even remotely "hip".

To that end, I'll be doing some reading.  The first book on my list has been on my bookshelf for about 13 years - The Mythical Man-Month, by Fred Brooks.  I bought and started reading the 20th anniversary edition over a decade ago, but never got around to finishing it. 

In retrospect, I think I was just too inexperienced at the time to really get the significance of the book.  After all, it might be a classic, but it's an old book (written in 1975) about an old project (much of it relates to developing OS/360).  How relevant could it be?  Especially once you get past the punch line of why the man-month is mythical - which is revealed in chapter two.

Now, after almost 14 years in tech, it's easy to see just how brilliant a book it is.  While there is some discussion of obsolete methods and technologies, that's beside the point.  The Mythical Man-Month is still relevant because it isn't really about specific development techniques or technologies.  It's about the complications of managing and communicating with lots of people on a large project.  And while we've come a long way as an industry in the last 40 years, the "human nature" part of the equation hasn't really changed appreciably.

One of the nice things about this book is that Brooks clearly gets what it means to be a developer.  He starts the book with an essay on the joys and woes of the craft of programming, all of which are as relevant today as they were 40 years ago.  There are a number of other such insights sprinkled throughout the book.  My personal favorite is his statement that all programmers are optimists.  When you think about the process of estimation and how that usually turns out, this is the sort of insight that seems so blindingly obvious that you're surprised you didn't think of it yourself.

The third chapter, "The Surgical Team," was particularly interesting to me.  The proposal is essentially to treat building software less like building a bridge and more like performing surgery.  So, in other words, you'd have one senior developer act as the "surgeon," and he and a more junior assistant perform all of the deliverable work.  The rest of the team supports him, doing the project management work, building supporting tools, serving as expert consultants, etc.  Since one top-notch programmer is supposedly ten times as productive as a mediocre programmer, having one of them do the work and nine other people support him gives you the same total productivity while reducing communication problems and maintaining design consistency.

One of Brooks' themes throughout the book is that the way to manage complexity is to maintain consistency of architectural vision.  That is, the overall system architecture should reflect as few viewpoints as possible, rather than being a mish-mash of the viewpoints of everyone who worked on it.  This plays into another major issue Brooks discusses: communication cost.  This is part of the reason the man-month is mythical - because not only does adding new people to a project add ramp-up time, it also increases the communication cost because you now have more people to keep in the loop.
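
Brooks actually quantifies the communication problem: if every pair of people on a project has to coordinate, a team of n people has n(n-1)/2 communication channels.  So a team of 10 has 45 channels, while a team of 15 has 105 - better than double the coordination overhead for 50% more people.  That little bit of arithmetic goes a long way toward explaining why adding manpower to a late project makes it later.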

I think one of the best things about the 20th anniversary edition is the added chapters.  For starters, they include Brooks' classic paper No Silver Bullet - Essence and Accident in Software Engineering, which is a good read.  It also includes a chapter of analysis and reflection, in which Brooks discusses not only the legacy of the book, but also what he got right and wrong.  The things he got wrong are particularly interesting.  They include the classic "build one to throw away" quote, which Brooks says is wrong "not because it is too radical, but because it is too simplistic," being rooted in the waterfall model that was popular in the 1970s, as well as his notion of communicating the full project workbook to everyone.

Overall, The Mythical Man-Month is a very engaging and surprisingly easy read, especially given the volume of references.  While it contains some details that now seem like quaint digressions into the history of computing, the majority of the material is still relevant.  It definitely contains useful insights for anyone who is interested in the dynamics of running a large project.  I'm very glad I came back to it.

Swag shipment

My shipment of Komodo IDE swag arrived today.  I won a Twitter contest to find an Easter egg in the new Commando feature of Komodo 9.  Turns out that if you type "kitt" or "michael" into the Commando box, it will play an animation of KITT's Cylon-style scanner light from Knight Rider, complete with whooshing sound.

So my prize was a Komodo IDE T-shirt, some Komodo and Stackato stickers, and a pack of Komodo IDE playing cards.  The shirt is actually pretty nice.  It just has the komodoide.com URL on the back and the Komodo logo on the front.  A nice little swag selection, at any rate.  Thanks, ActiveState!

Upgrade time again

It's upgrade time again.  As I mentioned in my last entry, my desktop/home server had been getting short on disk space - and by "short on disk space," I mean I started to see Explorer reporting the free space as less than 1GB on my C: drive.  So it was time for a new hard drive.

Since the only real problem was my 100GB C: partition, I decided to go with more of a "start over" approach and just got a modest 120GB SSD.  My Windows 7 installation was about 5 years old, so it had accumulated a fair amount of cruft.  For instance, the winsxs folder alone had swollen to almost 14GB.  So I reasoned that a clean installation would probably fix most of my problems.
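
As an aside, if you'd rather try to reclaim winsxs space than reinstall, there are a couple of built-in cleanup commands worth knowing about - the first works on Windows 7 SP1 (it removes files superseded by the service pack), the second on Windows 8 and later.  Your mileage may vary on how much they actually recover, so consider this a mitigation rather than a fix.  From an elevated command prompt:
Dism.exe /Online /Cleanup-Image /SpSuperseded
Dism.exe /Online /Cleanup-Image /StartComponentCleanup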

Along with the new hard drive, it was also time for some new software.  I started that out with an OS upgrade from Windows 7 Pro to Windows 8.1.  Yes, I know - everybody hates Windows 8.  But I think it's a good OS, damn it!  Sure, I'll grant you that the initial release had some rough edges, but aside from the questionable decision to remove the start menu (which is trivially fixed by installing Classic Start), the 8.1 update fixes every complaint I had.

As a change of pace, this time I decided to try the downloadable purchase of Windows 8.1 from NewEgg rather than waiting for physical media to ship.  It turns out that this process is actually pretty simple.  You place your order and then get an e-mail with a license key and a link to a downloader program.  You run that, give it your key, and it gives you several options for downloading the installation files.  One of these is to just download a bootable ISO image that you can burn to disc.  So it's actually not as weird as I initially feared.  Of course, the one catch is that the downloader runs under Windows, so this probably doesn't work so well if you're a Mac or Linux user.

The one thing of note here is that this time I decided to save myself a few bucks and drop down from the Professional edition to the standard one.  I made this decision after considering the non-transferability of the OEM version and looking at the feature comparison matrix.  It turns out that the Pro version contained exactly one feature that I cared about: the Remote Desktop server.  So I reasoned that if I could find a suitable remote access solution to replace RDP, then there was no need to buy the Professional edition.  And after playing around with TeamViewer for a few days, I decided I'd found it.

It turns out that TeamViewer actually works quite well.  For starters, it's free for non-commercial use and it's in Ninite, so there's basically zero overhead to getting it set up.  Registering for an account is also free and helpful, though not strictly necessary.  The performance seems to be pretty good so far and it has the ability to start LAN-only connections or go through their servers for over-the-Internet access.  After using TeamViewer for a couple of days, I was more than convinced that I could do without the Windows RDP server.

Next on the list was a service to run VirtualBox.  As you may know, VirtualBox is a free system virtualization package.  It works pretty well, but it doesn't come with the ability to run a VM on boot (at least in Windows) - you have to wait until you log in.  To fix that, I installed VBoxVmService.  This is a little Windows service that will run selected VMs on boot without anyone having to log in and also offers a systray app that allows you to manage those VMs.  Previously, I had been using the similarly named VirtualBoxService, which does essentially the same thing but isn't quite as nice to use.  Of course, there are some limitations, but for the most part it works well enough for my setup.  All I really wanted to do was have a Linux VM run on boot to serve as a web server, because setting that stuff up on Windows was just too much of a pain.
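
For context, services like these are essentially wrappers around VirtualBox's VBoxManage command-line tool.  As a rough sketch (the VM name here is made up), starting a VM with no GUI and asking it to shut down cleanly looks like this:
VBoxManage startvm "WebServerVM" --type headless
VBoxManage controlvm "WebServerVM" acpipowerbutton
What VBoxVmService adds is mostly the plumbing to run those commands as a Windows service at boot and shutdown, without anyone logged in.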

While I was at it, I also decided to give Plex a try.  I'd previously been a little turned off by the requirement to create an account, even though I only wanted to use it on my LAN, but it turns out that actually isn't necessary.  The account registration is really only needed for remote access.  And with Android and Roku apps, Plex provided more functionality and required less work than my home-grown solution of using PHP scripts to generate customized web server directory listings for Roksbox.  That was all well and good, but just using Plex is much easier and seems to work better.

So far, things seem to be going pretty well and I'm happy with my new setup.  Granted, there are no radical changes here, but back in my days of Linux on the desktop, painful upgrades were a fairly common occurrence, so I guess I'm a little battle-scarred.

One last thing to note is that I'm actually kind of impressed with Windows 7.  In days gone by, a Windows install lasting for 5 years of heavy use would have been unheard of.  And even with seriously limited hard drive space, the system was rock-solid and actually performed pretty well.  If I'd been inclined to migrate it to the new drive, I probably could have kept that installation going for much longer.  Switching back to Windows was definitely a good move.

Forget Cloud Drive, let's try OneDrive

This entry started out as a quick discussion of consolidating my photos onto Amazon Cloud Drive.  However, that didn't happen.  So now this entry is about why.

This started out as an attempt to clean up my hard drive.  I have a 1TB hard drive in my desktop, divided into two partitions: 100GB for my system "C: drive" and the rest of it allocated to assorted data.  The problem is that my C: drive was down to less than 5GB free, so it was time to do some clean-up.

Part of my problem was that my locally synced cloud storage was all in my home directory on the C: drive, including Cloud Drive.  So the plan was to move my Cloud Drive folder to the D: drive and, in the process, move a bunch of my older photos into Cloud Drive.  After all, I've complained about Cloud Drive, but now that Amazon offers unlimited photo storage to Prime members, why not take advantage of it?

Well, I'll tell you why - because it doesn't work, that's why.  For starters, the desktop Cloud Drive app doesn't provide any way to set the local sync directory.  Luckily, I did find that there's a registry key for it that you can set manually.  So I did that, moved my Cloud Drive folder, and restarted the Cloud Drive app.  And then I waited.  And waited.  And waited some more for Cloud Drive to upload the newly added photos.  However, when I came back the next morning, the systray icon said that Cloud Drive was up to date, but when I looked at the website, my new photos weren't there.

OK, so plan B: try downloading the latest version of the Cloud Drive desktop app.  My version was from March, so maybe there were some improvements since then.

And here's problem number two: the new app isn't the same as the one I have.  As far as I can tell, Amazon no longer offers a desktop sync app.  Now they just have a "downloader/uploader" app.  It's not sync, though - the process is totally manual.  And I can't find any link to the version I have.  Presumably it's been replaced by this lame uploader app.  I notice that the Cloud Drive website now omits any mention of sync and talks about accessing your data on your computer through the website.

OK, so no upgrade.  Plan C: try to get the version I have working.  That didn't work out, though.  I didn't have the installer anymore, so reinstalling was out of the question.  I tried deregistering the app and resyncing my cloud data, but now that was just broken.  Cloud Drive synced part of my documents folder, then just stopped and reported that it was up-to-date.

At that point, I decided to just give up.  I'd been thinking about switching to OneDrive anyway.  It works well enough and fixes the problems I have with Cloud Drive.  It also gives me 30GB of free storage and has pretty reasonable rates for upgrades - $2/month for 100GB, or $7/month for 1TB with an Office 365 subscription included.  Plus I've already got it set up on my desktop, laptop, phone, and tablet, so it's just a matter of getting my data into it.

So that's what I did.  I changed my OneDrive location to my D: drive by unlinking and relinking the app (which is required for Windows 7 - Windows 8.1 is much easier) and then moved over the pictures from my Cloud Drive as well as the old ones I wanted to consolidate.  Hopefully that will work out.  It's going to take OneDrive a while to sync all that data - over 10GB - but it seems to be going well so far.  And unlike Cloud Drive, at least OneDrive has a progress indicator so you can tell that something is actually happening.

As for Cloud Drive, I think I'm pretty much done with that.  I'll probably keep the app on my phone, as it provides a convenient second backup of my photos and also integrates with my Kindle, but I'm not even going to try to actively manage the content anymore.  It seems that Amazon is moving away from the desktop to an all-online kind of offering.  That's all well and good, but it's not really what I'm looking for at the moment.  Too bad.

Site outage

Well, that sucked.  My domain was MIA today.  Fixed what I could, but it's not totally working yet.

I discovered the problem this morning, when I was thwarted in my regularly scheduled RSS feed checking.  The page didn't load.  And neither did any of the other pages.  Or the DNS alias that I had pointed at my home server.  So after confirming that my hosting provider was in fact up, I checked my DNS.

Somehow - I'm still not 100% sure how - my domain's nameservers got reset to defaults for my registrar.  I'm not sure if I fat-fingered something while confirming my contact info in the domain manager, or if there was some change on their end, or what.  At any rate, they were wrong.  I was able to change the settings back to my hosting provider's nameservers without any issues, but that still requires waiting for the change to finish propagating.  What a pain.
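
For future reference, it's easy to check which nameservers a domain is actually delegated to, and whether a change has propagated, by querying more than one resolver and comparing the answers (the domain here is just a placeholder):
dig +short NS example.com
dig +short NS example.com @8.8.8.8
If the answers disagree, the change is still propagating.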

Reference project root in command

Continuing on the Komodo macro theme for this week, here's another little macro template that might come in handy.  This one is just a quick outline of how to reference your project's root directory when running a command.

As you may know, Komodo's commands support a variety of interpolation variables to do things like insert file paths and other input into your commands.  The problem is that there's no variable to get the base directory of your current project - by which I mean the "project base directory" that you can set in your project properties.  Oh, sure, there are the %p and %P variables that work on the current project, but they don't get the project base path.  They get the path to the project file and the directory in which the project file is contained, respectively.  That's fine when your project file lives in the root of your source tree, but if you want to put your project files in another location, it doesn't help.

Sadly, there is currently no way to get this path using the interpolation variables.  However, it's pretty easy to get with a macro.  The only problem with that is that the macro syntax is a little odd and the documentation is somewhat lacking.  The documentation for ko.run.runCommand() does little more than give the function signature, which is bad because there are 17 parameters to that function, and it's not entirely clear which are required and what the valid values are.  Luckily, when you create a command, the data is stored in JSON format with keys that more or less match the parameter names to runCommand(), so you can pretty much figure it out by creating the command as you'd like it and then opening the command file up in an editor tab to examine the JSON.

Anyway, here's the macro template.  Needless to say, you can substitute in the project base directory at the appropriate place for your needs.  In my case, I needed it in the working directory.

// Look up the current project's base directory via Komodo's part service.
var partSvc = Cc["@activestate.com/koPartService;1"].getService(Ci.koIPartService),
    baseDir = partSvc.currentProject.liveDirectory,
    // Substitute your own working directory and command here.
    dir = baseDir + '\\path\\to\\OpenLayers\\build',
    cmd = 'python build.py';
// Run the command in dir, with output going to the command output window.
ko.run.runCommand(window, cmd, dir, undefined, false, false, true, "command-output-window");

Better project commit macro

Several months ago, I posted a Komodo IDE macro to run a source control commit on the current project.  That was nice, but there was an issue with it: it only sort of worked. 

Basically, in some cases the SCC type of the project directory was never set.  In particular, if you focused on another window and then double-clicked the macro to invoke it, without touching anything else in Komodo, it wouldn't work.  While this scenario sounds like an edge-case, it turns out to be infuriatingly common, especially when you use multiple monitors.  The obvious example is:

  1. Make some changes to your web app in Komodo.
  2. Switch focus to a browser window and test them out.
  3. See that everything works correctly.
  4. Double click the commit macro to commit the changes.
  5. Wait a while and then curse when the commit window never comes up.

I'm no Komodo expert, so I'm not sure exactly what the problem was.  What I did eventually figure out, though, is that Komodo's SCC API doesn't seem to like dealing with directories.  It prefers to deal with files.  And it turns out that, if you're only dealing with a single file, the "commit" window code will search that file's directory for other SCC items to work with.

So here's an improved version of the same macro.  This time, it grabs the project root and looks for a regular file in it that's under source control.  It then proceeds in the same way as the old one, except that it's much more reliable.

(function() {
    "use strict";
    
    // Find a file in the project root and use it to get the SCC type.  If we
    // don't find any files, just try it on the directory itself.
    // TODO: Maybe do a recursive search in case the top-level has no files.
    function getSccType(url, path) {
        var os = Components.classes["@activestate.com/koOs;1"]
                           .getService(Components.interfaces.koIOs),
            ospath = Components.classes["@activestate.com/koOsPath;1"]
                               .getService(Components.interfaces.koIOsPath),
            fileSvc = Components.classes["@activestate.com/koFileService;1"]
                                .getService(Components.interfaces.koIFileService),
            files = os.listdir(path, {}),
            kofile = null;
        // First look for a file, because that always seems to work
        for (var i = 0; i < files.length; i++) {
            var furi = url + '/' + files[i],
                fpath = ospath.join(path, files[i]);
            if (ospath.isfile(fpath)) {
                kofile = fileSvc.getFileFromURI(furi);
                if (kofile.sccDirType) {
                    return kofile.sccDirType;
                }
            }
        }
        // If we didn't find a file, just try the directory.  However, this
        // sometimes fails for no discernable reason.
        kofile = fileSvc.getFileFromURI(url);
        return kofile.sccDirType;
    }
    
    var curr_project_url = ko.projects.manager.getCurrentProject().importDirectoryURI,
        curr_project_path = ko.projects.manager.getCurrentProject().importDirectoryLocalPath,
        count = 0;
    
    // HACK: For some reason, the SCC type on directories doesn't populate
    // immediately.  I don't know why.  However, it seems to work properly on
    // files, which is good enough.
    var runner = function() {
        var scc_type = getSccType(curr_project_url, curr_project_path),
            cid = "@activestate.com/koSCC?type=" + scc_type + ";1",
            fileSvc = Components.classes["@activestate.com/koFileService;1"]
                                .getService(Components.interfaces.koIFileService),
            kodir = fileSvc.getFileFromURI(curr_project_url),
            sccSvc = null;
            
        if (scc_type) {
            // Get the koISCC service object
            sccSvc = Components.classes[cid].getService(Components.interfaces.koISCC);
            
            if (!sccSvc || !sccSvc.isFunctional) {
                alert("Didn't get back a functional SCC service. :( ");
            } else {
                ko.scc.Commit(sccSvc, [curr_project_url]);
            }
        
        } else if (count < 50) { // Just in case this never actually works....
            count += 1;
            setTimeout(runner, 100);
        } else {
            alert('Project directory never got a valid SCC type.');
        }
    };
    
    runner();
}());

Resize Subsonic frames

I upgraded Subsonic today and I had to remember the little hack I put in to enable resizing of the playlist frame. This is the second time I've had to look up how to do this, so I'm just going to document it here.

By default, each piece of the Subsonic web interface is a frame, and all the frames are borderless and not resizable.  This is kind of a pain because sometimes I have a long playlist and I want to see all of it at once.  The ideal way to do that would be to just increase the size of the playlist frame - but I can't do that.  Fortunately, fixing this is relatively easy.  You can just modify the frameset markup in the following file:
C:\subsonic\jetty\<number>\webapp\WEB-INF\jsp\index.jsp

For my case, the solution was to change the third frameset tag by turning on frame borders and increasing the border size.  Here's the markup:
<frameset rows="75%,25%" border="5" framespacing="0" frameborder="1">

Using RewriteBase without knowing it

So here's an interesting tidbit that I discovered this afternoon: you can use RewriteBase in a .htaccess file without knowing the actual base URL.  This is extremely useful for writing a portable .htaccess file.

In case you don't know, the RewriteBase directive to Apache's mod_rewrite is used to specify the base path used in relative rewrite rules.  Normally, if you don't specify a full path, mod_rewrite will just rewrite the URL relative to the current directory, i.e. the one where the .htaccess file is.  Unfortunately, this isn't always the right thing to do.  For example, if the .htaccess file is under an aliased directory, then mod_rewrite will try to make the URL relative to the filesystem path rather than the path alias, which won't work.

Turns out that you can account for this in four (relatively) simple lines:

RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond $1#%{REQUEST_URI} ([^#]*)#(.*)\1$
RewriteRule ^(.*)$ %2index.php [QSA,L]

All you need to do is substitute your rewrite target for "index.php" and it "just works".  No changes to server configuration required and no need to edit the RewriteBase for the specific server.  The trick is in the second RewriteCond: $1 holds the requested path relative to the .htaccess directory, while %{REQUEST_URI} holds the full URI path.  Joining them with a "#" separator and requiring the back-reference \1 at the end forces %2 to capture whatever prefix precedes the relative path - i.e. the real base URL - which the rule then prepends to the target.
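
To make that concrete, here's a hypothetical trace (the alias and path are made up) for a .htaccess file living in a directory that's aliased to /myapp/:
Request URI:          /myapp/foo/bar
$1 (per-dir match):   foo/bar
RewriteCond test:     foo/bar#/myapp/foo/bar
Captures:             %1 = foo/bar, %2 = /myapp/
Rewritten to:         /myapp/index.php
Whatever the base actually is on a given server, %2 picks it up automatically, so the same .htaccess works everywhere.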

VirtualBox shared folders

Here's a little issue I ran across the other day.  I was setting up a VirtualBox VM with an Ubuntu guest and I wanted to add a shared folder.  Simple, right?  Just install the VirtualBox guest additions, configure the shared folder to mount automatically, and it "just works".

The only problem is the permissions.  By default, VirtualBox mounts the shared folder with an owner of root.vboxsf and permissions of "rwxrwx---".  So if you add your user account to the vboxsf group, you get full access.  Everyone else...not so much.  Of course, this isn't inherently a problem.  However, I wanted the Apache process to have read access to this shared folder because I have a web app that needs to read data off of it.  And I didn't really want to give it write access (which it doesn't need), so just adding it to the vboxsf group wasn't a good option.  What I really needed to do was change the permissions with which the share was mounted.

As far as I can tell, there's no way to get VirtualBox to change the permissions.  At least, I didn't see anything in the guest OS and there's no setting in the management UI.  Fortunately, you can pretty easily bypass the auto-mounting.  Since it's a Linux guest, you can just turn off the auto-mounting in the VirtualBox management console and add a line to your /etc/fstab.

There is one issue, though: you have to make sure the vboxsf kernel module is loaded before the system auto-mounts the share.  If you don't, the mount will fail.  However, forcing the module to load is easily accomplished by adding a vboxsf line to your /etc/modules file.
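
That is, just append the module name on its own line:
vboxsf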

As for the line in /etc/fstab, this seems to work pretty well for me:
SHARE_NAME   /media/somedir   vboxsf   rw,gid=999,dmode=775,fmode=664   0   0
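
One note on that line: gid=999 should match the actual numeric id of the vboxsf group on your guest, and it's worth test-driving the options with a manual mount before baking them into fstab.  Something like this, using the same example share and mount point:
getent group vboxsf
sudo mount -t vboxsf -o rw,gid=999,dmode=775,fmode=664 SHARE_NAME /media/somedir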