Using PuTTY inside ConEmu

This is yet another one of those "post this so I can refer back to it later" things.  So if you're not a Windows user, or if you don't use ConEmu, then I suggest you go get a cup of coffee or something.

So for a while now I've been using ConEmu as my Windows console app.  It supports multiple tabs, transparency (ooooh), customizable hotkeys, customizable sessions, Far manager integration and a whole bunch of other nifty stuff.  

A couple of months ago, I saw that it was possible to embed PuTTY, the popular Windows-based SSH client, directly in a ConEmu tab.  So I tried it out and found it to be pretty slick.  The only down side was some key binding weirdness.

First, there's the general PuTTY issue that if you accidentally press Ctrl+S - you know, the key combination that means "save" in just about every editor in existence - it effectively locks the terminal (it's actually the XOFF flow-control signal) and it's not obvious how to get control back.  The second issue is that, when you embed an application like PuTTY in ConEmu, it steals most of your keyboard input, so the standard key bindings for switching between ConEmu tabs don't work.

Luckily, these problems are easily fixed.  The fixes are just hard to remember, which is why I'm writing them down.  For the Ctrl+S issue, you can just hit Ctrl+Q (the XON signal) to turn input back on.  For the tab-switching issue, you can use ConEmu's global key bindings - namely Win+Q, Win+Shift+Q, and Win+<Number> to switch consoles, as well as Win+Z to toggle focus between ConEmu and PuTTY.


Fixing Lenovo IdeaPad function keys

Note to self: you actually can turn the function keys on your Lenovo IdeaPad back into actual function keys.  It's just a BIOS setting.

For anyone else who comes across this, Lenovo's IdeaPad series of laptops (at least the version I have) does weird things with the function keys.  You've probably seen those keyboards where the F1 through F12 keys have alternate functions.  They often have a "function lock" button that's equivalent to caps lock or scroll lock and you can turn it off to access the alternate functionality. 

What Lenovo does is similar, except that the alternate functionality is on by default and there's no function lock button.  There's just an "Fn" button that you have to press and hold to access the normal function button keystrokes.  So on my laptop, F11 and F12 turn the screen brightness up and down and I need to press Fn+F12 to do a regular F12 keystroke.

This is probably great for "regular" users, most of whom wouldn't miss the function keys if they went away completely.  On the other hand, for a developer, this adds a second keystroke to a row of what were previously absurdly convenient hotkeys.  Granted the volume and brightness keys are convenient, but when I'm coding I use F11 and F12 for my IDE keybindings much more often than I need to change the screen brightness.

But luckily, as I found in the link above, you can turn that off in a BIOS setting.

PSP Break-down, part 4: Results

Welcome to the end of my series of PSP posts.  In part one, I started with an overview of the Personal Software Process.  Part two covered the process of learning about the PSP and how to apply it.  Part three was the sales pitch of nice things that the PSP is supposed to enable you to do.  Now, in part four, we'll get down to brass tacks and talk about how well the PSP actually works for me.

I'll start with a quick overview of the good, bad, easy, and hard parts.  Then I'll dive into a deeper discussion of my experiences and the details of what did and didn't work.

PSP at a Glance

This table gives you a nice little overview of my experience.  It rates various parts of using the PSP on two axes - whether I judge them to be useful or not (good/bad) and how difficult they are to apply in practice (easy/hard).

        Good                            Bad
Easy    Maintaining process             Reviewing on paper
        Defect tracking                 Setting up tool support
        Time tracking                   Test report template
        Size tracking                   Coding standards
        Code reviews
        PROBE estimates
Hard    Planning process                Design templates
        Design reviews                  Design verification methods
        Postmortem analysis
        Defect data analysis
        Creating relative size tables
        Creating review checklists

As you can see, I find the benefits of the PSP to outweigh the drawbacks.  For me, it's useful enough that I plan to keep using it, in some form, for the foreseeable future. 

Using the PSP

Despite what you might read about how cumbersome the PSP is to use, I actually didn't find it that difficult at all.  Granted, it takes a little getting used to - every process change does.  But once I had the basics down, actually using and sticking to the process wasn't that hard.  The use of written scripts with well defined entry and exit criteria helps keep you honest and disciplined.

Likewise, with the help of Process Dashboard, I found the entire data collection process to be relatively painless.  There are a few pain points, of course.  For me, the biggest one was simply configuring a line-counting tool properly so that you can measure project size.  This is actually more annoying than you'd think.  I use the integrated line counter in Process Dashboard to count change size and cloc to measure initial size for estimation purposes, mostly because I've integrated cloc into my IDE.  The Process Dashboard tool has some nice features, but will require you to write custom language definitions for pretty much anything that doesn't use C-style syntax.  It uses a fairly easy-to-follow XML format for configuration, but still...  Cloc has much better language support, but is harder to customize.  As a further annoyance, while Process Dashboard does have VCS diff support, it currently only supports Subversion.  So if you're using Git, Mercurial, or anything else reasonably modern, you'll have to set up two copies of your local repo for before-and-after comparison.

Code Review

Personal pre-commit code review is one of those things that every responsible developer does, but hardly anyone seems to talk about.  I know I've been doing an informal version of it for years.  It simply consisted of looking over the diff of my changes before hitting the "commit" button and making sure that I hadn't made any obvious mistakes, that I wasn't checking in any changes I didn't mean to, and so on.  It's something you quickly learn to do after making a few embarrassing mistakes.

Doing an organized code review with a checklist really takes this to the next level.  Instead of being a CYA thing, code review becomes a way to preemptively find problems in your code.  And at its best, it can be hugely effective.  As any experienced developer knows, there are classes of problem that are hard to find in testing, but stick out like a sore thumb when you actually stop and read the code.

The only hard thing about code review is actually customizing your review checklist.  I find it difficult to do that simply by looking at the defect categories in my data, because those seldom tell you anything actionable.  A few defect categories readily translate into checklist items, but many defects are more subtle and are either difficult to categorize or difficult to generalize into checklist items.

The one thing I really didn't like about the PSP code review process was the recommendation to do it on paper.  Using paper does have the benefit that you can get more code in front of you at a time, and it's easier to make annotations.  However, it also bypasses the navigational and analysis power baked into modern IDEs.  For instance, Komodo lets me easily navigate between functions, access standard library documentation at a click, and search for uses of an identifier.  Those things are much more tedious to check on paper.

But the big kicker for me was that trying to review a diff on paper is just painful.  It's hard enough on screen with color highlighting, but it really sucks on paper.  And on paper I don't have things like the Komodo diff viewer's feature to jump from a diff item to that location in the file to view the context.  It might work well to review new code on paper, but for changes to existing code it feels really clunky.

Design Reviews

While we're on the topic of reviews, let's talk about design reviews for a minute.  Again, this is a really good idea, for the same reasons that code review is a good idea.  And the PSP does offer some productive advice in the recommendation to adopt a design standard and a checklist for common errors.

However, the specific methods the PSP recommends just don't work for me.  The four standard design templates are a nice idea, but they feel very repetitive and clunky to work with.  And even if I switched to the UML equivalent, the recommended list of artifacts is just too much for a lot of the things I do.  And at the risk of having my CS degree rescinded, I have to admit that I have trouble constructing a state machine for many of the projects I do - at least, one that's even remotely enlightening.  In general, I just find them painful to work with and biased toward "new program" development rather than incremental enhancement.

And the design verification techniques are even worse!  They're time-consuming and tedious - you're basically executing your program on paper.  It's a nice idea, and might come in handy occasionally, but frankly, I have less confidence in my ability to perform those verification exercises correctly than I do in my code being correct in the first place.  And, again, they're just way too heavy for most of the projects I do.

I'm still working on finding a design and design review approach that's sustainable for me.  Since my last few shops have used agile methodologies, most of the "projects" I do are fairly small enhancements to existing code - usually just two or three days, seldom more than a week.  So a heavy-weight design process with lots of templates or UML diagrams just isn't going to work. 

My current approach is as follows (note that I'm using a variant on the PSP3 process in Process Dashboard):

  1. Sketch out a high-level design in a word processor.  This is a refinement of the conceptual design used for planning, usually in the form of plain prose and bullet lists.
  2. Review that primarily for feasibility, completeness, and requirements coverage.
  3. For each component I've broken out, do a more detailed "transient design" (I'll describe that in a moment).
  4. Review the transient design primarily for completeness, correctness, and requirements coverage.

I refer to the detailed designs as "transient designs" because I don't actually create separate documents for them.  I blend implementation and design and actually do the design right in the source code.  I generally stub out things like classes and methods and fill in the details either with actual code (for simpler items) or with "design annotations", which are just comments that use a special formatting to mark them as design artifacts.  Sometimes they're pseudo-code, other times they're just descriptions of what needs to be done, whatever seems appropriate.  Then, in the code phase, I simply replace those annotations with the actual implementation.  It's certainly not perfect, but it seems to be working well enough for me so far.  As a next step, I'm going to look at a TDD-like approach and try incorporating unit test definitions as one of the design artifacts.
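To make that concrete, here's a rough sketch of what one of those stubbed-out transient designs might look like.  The class and the "DESIGN:" comment convention are just illustrative examples of my annotation idea, not anything from the PSP book:

```python
class ReportExporter:
    """Hypothetical example class, stubbed out during the design phase."""

    def header_fields(self):
        # Simple enough to just implement directly while designing.
        return ["date", "category", "hours"]

    def export(self, report, path):
        # DESIGN: verify the report is finalized; raise ValueError if not.
        # DESIGN: open `path` for writing, emit header_fields() as the
        # DESIGN: first row, then one row per entry in the report.
        # (These annotations get replaced with real code in the code phase.)
        raise NotImplementedError
```

The stub compiles and reviews like code, but the design annotations make it obvious what still has to be filled in during the code phase.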

Estimation

In my experience so far, PSP estimation using PROBE actually works remarkably well. In the data from my last job, I was eventually able to get to the point where my actual development times were generally within about 15% of my estimates.  I consider that to be pretty good, especially when you consider that the estimates were done in minutes and based on fairly sketchy user-stories.

Of course, estimation is a learned skill, and PROBE doesn't change that.  You still need to be able to accurately account for the possible changes when constructing the conceptual design.  And as with any data-driven approach, the results are going to be sensitive to the quality of your data.  So if your relative size tables are just made up rather than being based on your past work, then don't expect your estimates to be too accurate.

It's also important to note that there's a bias against project diversity here.  For example, line counts can differ wildly for different programming languages, different problem domains, etc.  So if you tend to work on projects that are generally very similar to each other, then PROBE will work much better than if all your projects are widely divergent.  My data from my last job is based largely on a single code-base, so while the purposes of the individual projects varied wildly, the technology stack was consistent.

The hardest part about estimation, at least for me, is coming up with those relative size tables.  It's one of those things that sounds easy, but actually isn't.  For one thing, I don't have a tool to automatically count lines and methods in classes - much less one that works across a diversity of languages.  For another, my data largely comes from work, which is a problem: I do mostly web development on a team, so I'd need to extract my method and class size data from a code base written in five different languages by five people.  When you wrote a quarter of the methods in one class, half of a third of the methods in another class, and so on, how do you count all that?  You can forget about attribution and just count the files you have, but then it's not really your data, so it's not clear how useful it will be.
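To be fair, once you do have the numbers, the mechanics of building a table are simple - the PSP's size buckets fall out of the log-normal distribution of your historical sizes.  A minimal sketch, with a made-up list of lines-of-code-per-method values for one category:

```python
import math

def relative_size_table(loc_per_method):
    """Derive the five PSP relative size buckets (VS, S, M, L, VL)
    for one category of historical lines-per-method values.

    Sizes tend to be log-normally distributed, so the buckets are
    taken at the log-mean and +/- 1 and 2 log-standard deviations.
    """
    logs = [math.log(x) for x in loc_per_method]
    mean = sum(logs) / len(logs)
    var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
    std = math.sqrt(var)
    return {
        "VS": math.exp(mean - 2 * std),
        "S":  math.exp(mean - std),
        "M":  math.exp(mean),
        "L":  math.exp(mean + std),
        "VL": math.exp(mean + 2 * std),
    }

table = relative_size_table([8, 12, 15, 20, 31])  # invented history
```

The hard part, as I said, is getting honest per-category data to feed into it, not the arithmetic.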

I've also found it challenging to come up with useful categorizations for my relative size tables.  Perhaps it's just the products I've been working on, but I end up with a whole lot of database-related classes and a smattering of other categories.  That's fine for those products, but it's hard to figure out how to extrapolate that to other kinds of projects.  My suspicion is that this is a result of sub-optimal system design.  I'm currently trying to adhere religiously to the SOLID principles, which should result in more, and more targeted, classes, which should solve that problem.

Other Bad Things

There are a few other annoying things about the PSP as Humphrey describes it.  While the general focus on templates and checklists is not bad in and of itself, their value tends to vary.  The test report template, for instance, is one that I've not found particularly valuable.  While it is useful to sketch out your testing strategy, or maybe make a quick checklist of your test cases, the test report template is more like something that you'd give to a QA team to do manual testing.  It has a bias towards verbosity that makes it seem like more effort than it's worth.

Likewise with the focus on coding standards.  We can all agree that having coding standards, and following them consistently, is a very good thing.  Everyone should do it.  However, I've been working as a software developer for a long time now and my "coding standard" is something I've long since internalized.  I don't need to spend time formalizing it or checking it in my code reviews.  Ditto the size counting standard.  You can get really fancy if you want to, but I'm not convinced that anything much more complicated than counting physical lines is likely to be helpful.  I suspect that, at least for my purposes, any elaborate counting standard would just serve to complicate measurement.
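Part of why I'm comfortable with plain physical lines is that the measurement is trivial to implement and hard to get wrong.  A sketch of the kind of counter I mean (blank lines excluded, everything else counted):

```python
def count_physical_lines(source):
    """Count non-blank physical lines of source code.

    Deliberately crude: no comment handling, no logical-line rules.
    For estimation data, consistency matters more than sophistication.
    """
    return sum(1 for line in source.splitlines() if line.strip())
```

Anything fancier than this means maintaining per-language rules, which is exactly the complication I'd rather avoid.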

And, of course, there's the simple fact of process overhead.  It's really not too bad when you use Process Dashboard, but it's still there.  For example, there's a non-trivial amount of work that goes into configuring Process Dashboard itself.  It's a useful and powerful tool, but it's not always simple to use.  There's also the analysis time used to assess and correct your process.  This is unquestionably valuable, but it's still some additional time that you need to plan for.  And, of course, there's just the time to actually follow the process.  This isn't actually that much, but for very small tasks (e.g. one or two hours), your estimates might be thrown off by the fact that your standard phase break-down results in a phase that's one or two minutes long, and it takes you longer than that just to type in the data you need.

And Some Good Things to End On

Last but not least, I wanted to highlight two more "good but hard" things from the table above: the planning and postmortem stages.  At first, these seemed like silly, pro forma phases to me, but they're actually quite valuable.  And the most valuable thing about them is something that doesn't show up in the script: they make you stop and reflect.

The planning phase forces you to think about what you're doing.  To construct an estimate, you have to think about the project you're trying to do, break it down, and define its scope.  Even if you don't believe there's value in the estimate itself, simply going through the process gives you lots of good insight into just what you're trying to do, which reduces the number of surprises later in the development cycle and makes everything go smoother in general.

Likewise, the postmortem stage prompts you to reflect on how you're doing and figure out how you can improve.  It's like a one-man sprint retrospective.  And like the sprint retrospective, it's actually the most important part of the process.  The simple task of looking at your statistics and filling out a Process Improvement Proposal forces you to stop and focus on your performance and what you can do to improve your work.  I find that simply looking at that PIP line in the postmortem exit criteria keeps me honest and makes me stop and think of something I could improve.  And if you can't think of at least one thing you could do better, you're just lying to yourself.

Conclusion

So there you have it.  In many ways, the PSP is working well for me.  Some of Humphrey's suggestions work well out of the box, some don't, and some need tweaking.  I do find that my process is evolving away from the "standard" PSP, but that's neither bad nor unexpected.  The basic techniques and ideas are still useful to me, and that's what matters.

Upgrading Mercurial on shared hosting

Disclaimer: This is yet another "note to self" post.  If you're not me, feel free to ignore it.

After God alone knows how many years (at least six, since I have posts related to it from 2010), it's finally time to upgrade the version of Mercurial that I have installed on my shared web hosting account.  This is a shared hosting account with no shell access - nothing but web-based tools and FTP.  I also don't know what OS it's running - just that it's some form of Linux.  So I've been putting this off for obvious reasons.

Unfortunately for me, the defaults for repository creation in Mercurial 3.7 turn on general delta support.  That isn't supported by the old version I was running (1.7), so my choices were to either use the now non-standard, older, and less efficient format for my repositories, or just bite the bullet and upgrade.  So I did the latter, since the version I had was pretty ancient and I was going to have to do it eventually anyway.

Fortunately, my hosting provider supports Python 2.7, which gets you most of Mercurial.  However, there are some C-based components to Mercurial.  Since I have no shell access to the hosting server, and there are probably no development tools installed even if I did, I had to try compiling on a VM.  I was able to do that by spinning up a Fedora 24 VM (on the assumption that they're running RHEL, or something close enough to it), and doing a local build.  The only caveat was that apparently my provider is running a 32-bit OS, because building on a 64-bit VM resulted in errors about the ELF format being incorrect.

Once the Fedora VM was up and running, I was able to do a build by running the following:
sudo dnf install python-devel
sudo dnf install redhat-rpm-config
cd /path/to/mercurial-3.8.x
make local

That's about it.  After I had a working build I was able to copy the Mercurial 3.8 folder to the server, right over top of the old version, and it just worked.  Upgrade accomplished!

Fixing Synaptics right-click button

Tonight I finally got around to fixing my trackpad.  Again.

So here's the thing: like most Windows-based laptops, my little Lenovo ultrabook has a trackpad with two "button" regions at the bottom.  They're not separate physical buttons, but they act as the left- and right-click buttons.  However, rather than force you to use the right-click pseudo-button, the Synaptics software that comes with the laptop lets you turn off the right-click button and use two-finger click as the right-click, a la a MacBook Pro.  I find this preferable, in particular because my laptop centers the trackpad under the space bar rather than in the actual physical center of the unit, which means that when you're right-handed, it's easy to hit the right-click button when you didn't mean to.

Prior to upgrading to Windows 10, I had this all set up and it was fine.  After the upgrade, not so much.  Sure, the same options were still there, but the problem was that the option to turn off the right-click button did just that - it turned it completely off!  So rather than clicks in that region doing a standard left-click, that part of the trackpad was just dead, which was even worse than the problem I was trying to fix.

Luckily, it turns out that the functionality is still there - the Synaptics people just need some better UX.  I found the solution in this forum thread on Lenovo's site.  It turns out you can just change the value of the registry key HKEY_CURRENT_USER\Software\Synaptics\SynTP\TouchPadPS2\ExButton4Action to 1, which is apparently the standard left-click action.  By default, it's 2, but when you turn off the "Enable Secondary Corner Click" (i.e. right-click to open context menu) feature in the Synaptics UI, that gets changed to 0, which is apparently the "do nothing" action.

Long story short: my mouse is much better now.

PSP Break-down, part 3: The Good Stuff

Welcome to part three of my PSP series.   Now that the introductory material is out of the way, it's time to get to the good stuff!  This time, I'll discuss the details of the PSP and what you can actually get out of it.  Theoretically, that is.  The final post in this series will discuss my observations, results, and conclusions.

PSP Phases

As I alluded to in a previous post, there are several PSP phases that you go through as part of the learning process.  Levels 0 and 0.1 get you used to using a defined and measured process; levels 1 and 1.1 teach planning and estimation; and levels 2 and 2.1 focus on quality management and design.  There is also a "legacy" PSP 3 level which introduces a cyclical development process, but that's not covered in the book I used (though there is a template for it in Process Dashboard).  The phases are progressive in terms of process maturity.  So PSP 0 defines your baseline process, and by the time you get to PSP 2.1 you're working with an industrial-strength process.

For purposes of this post, my discussion will be at the level of the PSP 2.1.  Of course, there's no rule that says you can't use a lower level, but the book undoubtedly pushes you toward the higher ones.  In addition, while the out-of-the-box PSP 2.1 is probably too heavy for most people's taste, there is definitely useful material there that you can adapt to your needs.

Estimation

One of the big selling points for all that data collection that I talked about in part 1 is to use it for estimation.  The PSP uses an evidence-based estimation technique called Proxy-Based Estimation, or PROBE.  The idea is that by doing statistical analysis of past projects, you can project the size and duration of the current project.

The gist of the technique is that you create a "conceptual design" of the system you're building and define proxies for that functionality.  The conceptual design might just be a list of the proxies such that, "if I had these, I would know how to build the system."  A proxy is defined by its type/category, its general size (from very small to very large), and how many items it will contain.  In general, you can use anything as a proxy, but in object-oriented languages, the most obvious proxy is a class.  So, for example, you might define a proxy by saying, "I'll need class X, which will be a medium-sized I/O class and will need methods for A, B, and C."

By using historical project data, you can create relative size tables, i.e. tables that tell you how many lines of code a typical proxy should have.  So in the example above, you would be able to look up that a medium-sized I/O class has, on average, 15.2 lines of code per method, which means your class X will have about 46 lines of code.  You can repeat that process for all the proxies defined in your conceptual design to project the total system size.  Once you have the total estimated size, PROBE uses linear regression to determine the total development time for the system.  PROBE allows for several different ways to do the regression, depending on how much historical data you have and how good the correlation is.
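The regression itself is nothing exotic - in the simplest case it's an ordinary least-squares fit of actual time against estimated size over your past projects.  A sketch with made-up historical data (the real PROBE method layers prediction intervals and data-quality checks on top of this):

```python
def probe_fit(est_sizes, actual_times):
    """Least-squares fit of actual_time = b0 + b1 * estimated_size,
    the core of a PROBE-style time projection."""
    n = len(est_sizes)
    mean_x = sum(est_sizes) / n
    mean_y = sum(actual_times) / n
    b1 = (sum(x * y for x, y in zip(est_sizes, actual_times)) - n * mean_x * mean_y) \
         / (sum(x * x for x in est_sizes) - n * mean_x ** 2)
    b0 = mean_y - b1 * mean_x
    return b0, b1

# Invented history: estimated LOC vs. actual minutes for past projects.
history_size = [100, 200, 300]
history_time = [60, 110, 160]
b0, b1 = probe_fit(history_size, history_time)
estimate = b0 + b1 * 400   # projected minutes for a new 400-LOC project
```

With this toy data the fit is time = 10 + 0.5 * size, so the 400-LOC project projects to 210 minutes.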

Planning

As a complement to estimation, the PSP also shows you how to use that data to do project planning.  You can use your historical time data to estimate your actual hours-on-task per day and then use that to derive a schedule so that you can estimate exactly when your project will be done.  It also allows you to track progress towards completion using earned value.

Quality Management

For the higher PSP levels, the focus is on quality management.  That, in and of itself, is a concept worth thinking about, i.e. that quality is something that can be managed.  All developers are familiar with assuring quality through testing, but that is by no means the only method available.  The PSP goes into detail on other methods and also analyzes their efficiency compared to testing.

The main method espoused by the PSP for improving quality is review.  The PSP 2.1 calls for both a design review and a code review.  These are both guided by a customized checklist.  The idea is that you craft custom review checklists based on the kinds of errors you tend to make, and then review your design and code for each of those items in turn.  This typically means that you're making several passes through your work, which means you have that many opportunities to spot errors.

The review checklists are created based on an analysis of your defect data.  Since the PSP has you capture the defect type and injection phase for each defect you find, it is relatively easy to look at your data and figure out what kind of defects you typically introduce in code and design.  You can use that to prioritize the most "expensive" defect areas and develop review items to try to catch them.
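That analysis doesn't need anything fancier than a tally.  A sketch, with an invented defect log (real PSP defect records carry more fields than this):

```python
from collections import Counter

# (defect type, phase injected, minutes to fix) - all made up.
defect_log = [
    ("wrong condition",  "code",   12),
    ("missing init",     "code",    5),
    ("interface misuse", "design", 25),
    ("wrong condition",  "code",    8),
    ("interface misuse", "design", 40),
]

# Rank defect types by total fix time; the most expensive ones are the
# best candidates for new review checklist items.
fix_cost = Counter()
for dtype, phase, minutes in defect_log:
    fix_cost[dtype] += minutes

checklist_candidates = [dtype for dtype, _ in fix_cost.most_common()]
```

Here "interface misuse" tops the list at 65 minutes of total fix time, so it would be the first thing to write a checklist item for.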

As part of design review, the PSP also advocates using standard design formats and using organized design verification methods.  The book proposes a standard format and describes several verification methods.  Using these can help standardize your review process and more easily uncover errors.

Process Measurement

Another big win of the PSP is that it allows you to be objective about changes to your personal process.  Because you're capturing detailed data on time, size, and defects, you have a basis for before and after comparisons when adopting new practices. 

So, for instance, let's say you want to start doing TDD.  Does it really result in more reliable code?  Since you're using the PSP, you can measure whether or not it works for you.  You already have historical time, size, and defect data on your old process, so all you need to do is implement your new TDD-based process and keep measuring those things.  When you have sufficient data on the new process, you can look at the results and determine whether there's been any measurable improvement since you adopted TDD.
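The comparison itself boils down to defect density before and after the change.  A sketch with invented per-project numbers:

```python
def defect_density(defects_found, new_and_changed_loc):
    """Defects per thousand lines of new and changed code."""
    return 1000.0 * defects_found / new_and_changed_loc

def mean_density(projects):
    densities = [defect_density(d, loc) for d, loc in projects]
    return sum(densities) / len(densities)

# Hypothetical per-project data: (defects found, new and changed LOC).
old_process = [(12, 800), (9, 650)]
tdd_process = [(5, 700), (6, 900)]

improved = mean_density(tdd_process) < mean_density(old_process)
```

With more projects' worth of data you'd want a real significance test, but even this crude comparison beats guessing.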

The same applies to any other possible change in your process.  You can leverage the data you collect as part of the PSP to analyze the effectiveness of a change.  So no more guessing or following the trends - you have a way to know if a process works for you.

Honesty

One of the least touted, but (at least for me) most advantageous aspects of using the PSP is simply that it keeps you honest.  You use a process support tool, configured with a defined series of steps, each of which has a script with defined entry and exit criteria.  It gives you a standard place in your process to check yourself.  That makes it harder to bypass the process by, say, skimping on testing or glossing over review.  It gives you a reminder that you need to do X, and if you don't want to do it, you have to make a conscious choice not to do it.  If you have a tendency to rush through the boring things, then this is a very good thing.

Next Up

So that's a list of some of the reasons to use the PSP.  There's lots of good stuff there.  Some of it you can use out-of-the-box, other parts of it offer some inspiration but will probably require customization for you to take advantage of.  Either way, I think it's at least worth learning about.

In the next and final installment in this series, I'll discuss my experiences so far using the PSP.  I'll tell you what worked for me, what didn't, and what to look out for.  I'll also give you some perspective on how the PSP fits in with the kind of work I do, which I suspect is very different from what Humphrey was envisioning when he wrote the PSP book.

Access to local storage denied

I ran into an interesting issue today while testing out an ownCloud installation on a new company laptop running Windows 10.  When trying to open the site in IE11, JavaScript would just die.  And I mean die hard.  Basically nothing on the page worked at all.  Yet it was perfectly fine in Edge, Firefox, and Chrome.

The problem was an "Access denied" message on a seemingly innocuous line of code.  It was a test for local storage support.  The exact line was:

if (typeof localStorage !== "undefined" && localStorage !== null) {

Not much that could go wrong with that line, right?  Wrong!  I double-checked in the developer console, and it turned out that typeof localStorage was returning "unknown".  So the first part of that condition was actually true.  And attempting to actually use localStorage in any way resulted in an "Access denied" error.

A little Googling turned up this post on StackOverflow.  It turns out this can be caused by an obscure error in file security on your user profile.  Who knew?  The problem was easily fixed by opening up cmd.exe and running the command:
icacls %userprofile%\Appdata\LocalLow /t /setintegritylevel (OI)(CI)L

PSP Break-down, part 2: Learning

This is part two of my evaluation of the Personal Software Process.  This time I'll be talking about the actual process of learning the PSP.  In case you haven't figured it out yet, it's not as simple as just reading the book.

Learning the PSP

The PSP is not the easiest thing to learn on your own.  It's a very different mindset than learning a new programming language or framework.  It's more of a "meta" thing.  It's about understanding your process: focusing not on what you're doing when you work, but rather how you're doing it.  It's also significantly less cut-and-dried than purely technical topics - not because the material is unclear, but simply because what constitutes the "best" process is inherently relative.

Given that, I suspect the best way to learn the PSP is through the SEI's two-part training course.  However, I did not do that.  Why not?  Because that two-part course takes two weeks and costs $6000.  If you can get your employer to give you the time and shell out the fee, then that would probably be great.  But in my case, that just wasn't gonna happen and the SAF (spouse acceptance factor) on that was too low to be viable.

Instead, I went the "teach yourself" route using the self-study materials from the SEI's TSP/PSP page.  You have to fill out a form with your contact information, but it's otherwise free.  These are essentially the materials from the SEI course - lecture slides, assignment kits, and other supplementary information.  It's designed to go along with the book, PSP: A Self-Improvement Process for Software Engineers, which serves as the basis for the course.  Thus my learning process was simply to read through the book and do the exercises as I went.

(As a side-note, remember that this is basically a college textbook, which means that it's not cheap.  Even the Kindle version is over $40.  I recommend just getting a used hardcover copy through Amazon.  Good quality ones can be had for around $25.)

Fair warning: the PSP course requires a meaningful investment of time.  There are a total of ten exercises - eight programming projects and two written reports (yes, I did those too, even though nobody else read them).  Apparently the course is designed around each day being half lecture, half lab, so you can count on the exercises taking in the area of four hours apiece, possibly much more.  So right there you're looking at a full work-week worth of exercises in addition to the time spent reading the book.

Personally, I spent a grand total of 55 hours on the exercises: 40 on the programming ones and 15 on the two reports (for the final report I analyzed not only my PSP project data, but data for some work projects I had done using the PSP).  While the earlier exercises were fairly straight-forward, I went catastrophically over my estimates on a couple of the later ones.  This was due partly to my misunderstanding of the assignment, and partly to confusion regarding parts of the process, both of which could easily have been averted if I'd had access to an instructor to answer questions.

Tools

As I mentioned in the last post, you'll almost certainly want a support tool, even when you're just learning the PSP.  Again, there's a lot of information to track, and trying to do it on spreadsheets or (God forbid) paper is going to be very tedious.  Halfway decent tool support makes it manageable.

I ended up using Process Dashboard, because that seems to be the main (only?) open-source option. In fact, I don't think I even came across any other free options in my searches.  I understand other tools exist, but apparently they're not public, no longer supported, or just plain unpopular. 

One of the nice things that Process Dashboard offers is the ability to import canned scripts based on the PSP levels in the book.  To use that, you have to register with the SEI, just like when you download the PSP materials, which is annoying but not a big deal.  (The author got permission from the SEI to use their copyrighted PSP material and apparently that was their price.)  This is really handy for the learning process because it puts all the scripts right at your fingertips and handles all of the calculations for you at each of the different levels.

In terms of capabilities, Process Dashboard is actually pretty powerful.  The UI is extremely minimal - essentially just a status bar with some buttons to access scripts, log defects, pause the timer, and change phases.  Much of the interesting stuff happens in a web browser.  Process Dashboard embeds the Jetty web server, which means that it includes a number of web-based forms and reports that do most of the heavy lifting.  This includes generating estimates, entering size data, and displaying stock and ad hoc reports.

Process Dashboard has fairly extensive customization support, which is good, because one of the basic premises of the PSP is that you're going to need to customize it.  You can customize all the process scripts and web pages, the project plan summary report, the built-in line counter settings, the defect categories, etc.  And that's all great.  The one down side is that the configuration is usually done by editing XML files rather than using a graphical tool. 

Since this is an open-source developer tool, I guess that's sort of fine.  And at least the documentation is good.  But it's important to realize that you will need to spend some time reading the documentation.  It's a powerful tool and probably has just about everything you need, but it's not always easy to use.  On the up side, it's the sort of thing that you can do once (or occasionally) and not have to worry about again.  Just don't expect that you'll be able to fire up Process Dashboard from scratch and have all the grunt work just done for you.  There's still some learning curve and some work to do.  But if you end up using the PSP, it's worth it.

PSP Break-down, part 1: Overview

As I mentioned in a couple of previous posts, I've been studying the PSP.  As promised, this is my first report on it.  After several failed attempts to write a single summary of the process, I've decided to divide my assessment up into a series of posts.  This first entry will be a basic overview of what the PSP actually is and how it's intended to work.

What is the PSP?

PSP stands for Personal Software Process.  It was developed by Watts Humphrey of the Software Engineering Institute (SEI) at Carnegie Mellon University (which you may know as the home of CERT). Humphrey is also the guy who came up with the Capability Maturity Model, which is the CMM in CMMI.  His intent was to take the industrial-strength process methodologies he worked on for the CMM and scale them down to something that could be useful to an individual developer.  The result is the PSP.

Despite the name, at its core the PSP isn't actually a development process itself.  Rather, it is a process improvement process.  To put it another way, the goal is not to tell you how to develop software, but rather to give you the intellectual tools to figure out the development process that works best for you and help you improve that process to make yourself more effective.

Now, to be clear, Humphrey actually does describe a particular development process in his PSP books.  And yes, that process is kind of "waterfally" and extremely heavy.  And if you look for PSP information online, you'll no doubt find a lot of people who take issue with that process.  And they're right to.

But try not to get hung up on that.

As Humphrey puts it, the process he describes is his process.  That doesn't mean it should be your process.  The process he describes is for pedagogical purposes as much as anything else.  It is intended to demonstrate the kinds of methods that apply to large-scale industrial systems, so that you have those tools in your toolbox if you need them.  If this process doesn't really work for you (and it almost certainly won't), then you should modify it or replace it with something that will work for you.

This is something that bears dwelling on.  Although pretty much all of the PSP focuses on Humphrey's process, he notes several times that it is important to pick a process that's sustainable for you.  To put it another way, it's important to be realistic.  Sure, his process might be objectively "better" by some measure than whatever you're doing, but that doesn't matter if you aren't going to follow his process.  Maybe his process conflicts with your company's process; maybe it's not effective for the kind of project you usually do; maybe you just plain don't have the will power to stick to it.  It doesn't matter.  The process you define needs to be in line with what you actually do, otherwise it's not going to help you.  So while you can learn from Humphrey's process, don't take it as gospel and don't think "this is what I need to be doing."

For me, the real take-away of studying the PSP is to be mindful of your development process and constantly try to improve it.  The goal is simple: quality and consistency.  Monitor your process and tune it such that you can consistently build a quality product in a predictable time-frame.  Ground yourself in reality and be guided by the data, not the latest fads.

The PSP Approach

The PSP is all about data-driven process changes.  As you work, you collect data on what you produce. The general idea is that you analyze that data and make changes to your process based on that analysis.  You then repeat this process, evaluating the results of your changes and making further tweaks as necessary.  So it's basically a plan-do-check-act cycle.  This is the same sort of thing that agile methods advocate (e.g. Scrum retrospectives).

Project Structure

The PSP structures a task or project into phases.  The exact phases will obviously depend on your particular process.  The "standard" (i.e. from Humphrey's process) phases are Planning, Development, and Postmortem.  In the Planning phase, you define what is to be built and construct estimates of the effort involved.  The Development phase is where you actually construct the software.  In the standard PSP process, this is divided into a number of sub-phases, depending on which level of PSP you're using.  Lastly, the Postmortem phase is where you record your final process measurements and document any problems and potential process improvements that came up in the course of the project.

The PSP levels I mentioned above are essentially maturity phases.  In the process of learning the PSP, you go through six of them: PSP0, PSP0.1, PSP1, PSP1.1, PSP2, and PSP2.1.  You start with PSP0, which is basically just "whatever you do now, but with measurements" and progress up to PSP2.1, which includes five sub-phases: Design (with standard design templates), Design Review, Code, Code Review, and Test.  I'll analyze this structure and the contents of the phases in a later post. 

Data Collection

Data collection in the PSP happens at a very fine-grained level.  The idea is that you do light-weight data capture as you work in order to minimize inaccuracies.  Data is collected on three axes: how much time is spent in each phase of the project, size of the product, and what defects were found.  In the standard process, you would track how many minutes you spend in each phase or sub-phase, how many lines of code you add or modify, and the number and type of bugs you find in each phase.

The general idea behind this is simply that the more detailed the data, the more opportunities you have to use it later on.  The PSP suggests a number of ways to use this data to drive and measure process change.  Two of the most visible are the use of historical size and time data to drive the estimation process and the use of defect phase, type, and time data to measure the efficiency of defect removal activities such as code review and testing.
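To make the estimation side a little more concrete, here's a rough sketch of the kind of linear-regression projection the PSP's estimation method (PROBE) is built on: fit your historical size-versus-time data, then project the time for a new estimate.  The function names and data values below are made up for illustration; they're not from any PSP tool.

```javascript
// Fit a simple linear regression time = b0 + b1 * size over historical data.
// These are the beta0/beta1 regression parameters the PSP literature uses.
function linearFit(xs, ys) {
  var n = xs.length;
  var xMean = xs.reduce(function (a, b) { return a + b; }, 0) / n;
  var yMean = ys.reduce(function (a, b) { return a + b; }, 0) / n;
  var num = 0, den = 0;
  for (var i = 0; i < n; i++) {
    num += (xs[i] - xMean) * (ys[i] - yMean);
    den += (xs[i] - xMean) * (xs[i] - xMean);
  }
  var b1 = num / den;            // slope: minutes per line of code
  var b0 = yMean - b1 * xMean;   // intercept: fixed overhead in minutes
  return { b0: b0, b1: b1 };
}

// Hypothetical historical data: program size (LOC) vs. development minutes.
var sizes = [100, 150, 200, 250];
var minutes = [220, 310, 400, 490];
var fit = linearFit(sizes, minutes);

// Projected time for a new task estimated at 180 LOC.
var estimate = fit.b0 + fit.b1 * 180; // 364 minutes with this data
```

The real PROBE method adds more machinery on top of this (size proxies, prediction intervals, rules for when you have too little data to regress), but the historical-data-driven projection is the heart of it.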

Interestingly, when Humphrey started developing and teaching the PSP, students tracked this stuff on paper.  He later introduced spreadsheets to track it, which is better, but still a little clumsy.  These days, there are PSP support tools you can use.  The main open-source one seems to be Process Dashboard, which I used from day one.  It will help you track PSP data and do most of the calculations you need for data analysis.  Frankly, I can't imagine trying to track this by hand or with Excel spreadsheets - it just seems like it would be painfully tedious.  But with decent tool support it's really not that bad.

Quality Management

One of the interesting things to note about the standard PSP process is that it explicitly takes a position on quality management.  This is something you don't tend to hear a lot about - in fact, it might not even have occurred to you that quality is something you can manage.

The position Humphrey takes is simple: the right way is always the fastest way.  And the right way is to get things right the first time to the maximum extent feasible.  That means finding defects as early as possible and taking steps to stop them from being injected in the first place.

I'll get into the details of this in a later post.  I mention it now simply because Humphrey devotes a lot of ink to it.  In fact, much of the new material in the higher PSP levels is devoted to quality control.  Quality management is one of the big uses of the defect data that you track as part of the PSP, so it's important to realize at the outset that this is a core part of the process.
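To give a taste of what "managing quality" means in practice, here's an illustrative calculation of two quality measures the PSP derives from your phase and defect data: defect density (defects removed per thousand lines of code) and the appraisal-to-failure ratio, A/FR (time spent in reviews versus time spent in compile and test).  The object shape and values below are made up for the sketch; a real tool like Process Dashboard computes these for you.

```javascript
// Hypothetical per-project data, as tracked during the PSP phases.
var project = {
  newAndChangedLOC: 500,
  minutesByPhase: { designReview: 30, codeReview: 45, compile: 10, test: 65 },
  defectsRemoved: 12
};

// Defect density: defects removed per KLOC of new and changed code.
var defectsPerKLOC = project.defectsRemoved / (project.newAndChangedLOC / 1000);

// A/FR: appraisal time (reviews) over failure time (compile + test).
// The PSP literature suggests aiming for an A/FR of 2.0 or higher,
// i.e. spending roughly twice as long finding defects as fixing fallout.
var appraisal = project.minutesByPhase.designReview + project.minutesByPhase.codeReview;
var failure = project.minutesByPhase.compile + project.minutesByPhase.test;
var aToFR = appraisal / failure;
```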

Should I Care?

I think you should.  You may not want to actually use the PSP, but it's still an interesting thing to learn about.  If you're like me, you probably never thought about how you build software in terms of hard data, so it's interesting to see how it can be done.  It's a very different and empowering way of understanding how you work - it makes programming feel more like a science and less like a black art.  So even if you don't want to go all the way and actually try to use the PSP, it's enlightening to read about the various statistical analysis techniques and quality improvement recommendations.  If nothing else, it broadens your perspective and gives you more intellectual tools to draw on.

In part two, I'll talk a little about what I did to learn about the PSP, including the resources and tools that I used.

Nope, I don't know JS

I've been writing software for a living for the last 15 years.  I've been doing mostly full-stack web development for nine of those.  That means I've written my fair share of JavaScript code.  But you know what?  It turns out I really don't know JS.  I thought I did.  But I don't.

I reached this conclusion after reading the first two and a half of the six books in Kyle Simpson's You Don't Know JS series.  Normally, I'd wait until I was finished with the series to blog about it, but seriously, this is good stuff.  If you think you have a decent grasp of JavaScript, you should read this to test your mettle.

Forget the W3Schools tutorials or jQuery guides you might have read to learn JavaScript in the first place.  This is way beyond that.  The goal of the You Don't Know JS series is not to teach you "how to code JavaScript" but rather to help you master JavaScript.  It's a deep-dive into the guts of JavaScript - not the subset most of us are used to, but what's actually spelled out in the ECMAScript specifications. 

The beautiful thing about this series is that it's not about expanding your catalog of technical tricks.  Most of it (well, of the 2.5 books I've read so far, anyway) is about understanding the fundamentals in a deep way.  For example, going beyond "oh, this in JavaScript is weird" and actually understanding the rules behind how the dynamic binding of this works and how it differs from the lexical scope used for everything else.  Things like that are easy to gloss over.  After all, you don't really need to know the gory details of how this is bound in order to write code and be productive.  But these kinds of things really are important.  They're the difference between "knowing" JavaScript and knowing JavaScript. 

To put it another way, there's more to the craft of building software than just "getting the job done".  For some people, just "getting it done" is sufficient - and that's fine: they're satisfied to remain journeymen.  But for some people, that's not enough - they want to be master craftsmen.  This series is for them.