So as I mentioned in a previous post, I've been taking a look at the Personal Software Process, or PSP (not to be confused with Sony's portable gaming console, which makes Googling for it loads of fun). It's been an eye-opening experience and I'll write more about it once I've officially finished the course.
At the same time, I've been reading about the #NoEstimates hashtag/movement/discussion/whatever-the-hell-it-is. I actually first heard about it around the time I was starting the PSP book. This makes for an interesting contrast, because one of the first things you learn about when studying the PSP is...how to do estimates.
The thing is, up until a few months ago, I would have completely agreed with the #NoEstimates people. In fact, I wrote about my feelings on estimates a couple of years ago. My feelings were pretty much in line with Woody Zuill's comments in .NET Rocks! episode 1160 - that estimates are useless because they're just made-up, grossly inaccurate numbers which can only be used for evil or to make even more inaccurate plans. So why bother? Just decide if something is important enough to do, and then either do it or not. Stop pretending.
Then I started learning about the PSP. I started accumulating data on my actual productivity and learning how to leverage it. And you know what? It turns out it actually is possible to do estimates with a reasonable degree of accuracy - and it doesn't even take huge amounts of up-front planning. Granted, there are caveats - you need historical data that's representative of the technology and problem domain you're working in. But once you've got a few projects under your belt to establish a baseline, it really does work.
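The PSP's approach to this boils down to fitting a line through your historical data. As a rough illustration (not the PSP's actual PROBE procedure, and with made-up numbers), here's a minimal sketch of the idea: regress actual effort against your initial size estimates from past projects, then use the fitted line to predict effort for a new task.

```python
import statistics

def linear_regression(xs, ys):
    # Ordinary least-squares fit: y = b0 + b1 * x
    x_mean = statistics.mean(xs)
    y_mean = statistics.mean(ys)
    b1 = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
         sum((x - x_mean) ** 2 for x in xs)
    b0 = y_mean - b1 * x_mean
    return b0, b1

# Hypothetical historical data: estimated size (LOC) vs. actual hours spent
estimated_sizes = [120, 200, 150, 300, 250]
actual_hours    = [10,  18,  13,  27,  22]

b0, b1 = linear_regression(estimated_sizes, actual_hours)

# Predicted effort for a new task you've sized at roughly 180 LOC.
# The regression automatically corrects for your personal bias:
# if you habitually underestimate size, the slope absorbs it.
new_estimate = b0 + b1 * 180
```

The point isn't the arithmetic; it's that the prediction is anchored to what actually happened on your last few projects rather than to wishful thinking, which is exactly the baseline the caveat above is about.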
So now I have some cognitive dissonance. Was I wrong to take the time to learn this estimating stuff? Or was I wrong before when I agreed with the #NoEstimates people? Where do I go with this?
What got me thinking about this is an article by Henri Karhatsu in Methods and Tools. Towards the end of the article, he discusses the #NoEstimates alternative to providing an estimate: forecasting. This is what you use when you really do need to figure out when something will be done (e.g. so that you can prepare marketing materials for a new product or feature). While this may sound like estimating, according to Karhatsu, there is a key difference: forecasts are based on data, while estimates are not.
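The kind of forecasting the #NoEstimates folks describe is typically throughput-based: take the team's recent delivery rate and project it forward. A common way to do that (my illustration, not something from Karhatsu's article, with hypothetical numbers) is a small Monte Carlo simulation over historical weekly throughput:

```python
import random

def forecast_weeks(remaining_items, weekly_throughput, trials=10000, seed=42):
    # Monte Carlo forecast: repeatedly sample from historical weekly
    # throughput until the backlog is exhausted, then report the
    # 85th-percentile number of weeks as a conservative answer to
    # "when will it be done?"
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        done, weeks = 0, 0
        while done < remaining_items:
            done += rng.choice(weekly_throughput)
            weeks += 1
        results.append(weeks)
    results.sort()
    return results[int(trials * 0.85)]

# Hypothetical data: items completed in each of the last 8 weeks
history = [3, 5, 2, 4, 6, 3, 4, 5]
print(forecast_weeks(40, history))
```

Note that no one here sat down and guessed a number; the answer falls out of measured data, which is precisely the distinction Karhatsu draws between a forecast and an estimate.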
I had several reactions to this revelation. The first was, "Well, that's a crock - 'forecasting' is just estimating by a different name." Then I realized, "Hey, 'forecasting' is exactly what you're doing in the PSP! Cognitive dissonance resolved!" Which prompted the question, "If PSP 'estimating' is the same thing as #NoEstimates 'forecasting', then what the heck does #NoEstimates even mean?"
And that's when I read this article analyzing the #NoEstimates movement. It's an interesting read and pretty well summarizes my impressions of #NoEstimates based on what I've heard and read. The short version is that #NoEstimates doesn't actually mean "no estimates ever". It doesn't even mean that estimates are inherently bad in all cases. Rather, it seems to be a reaction against people using estimates in stupid and fruitless ways. This came up a number of times in the .NET Rocks! interview with Woody Zuill. You have managers using estimates as a weapon against developers, or clueless project managers demanding estimates because they need to plug the numbers into Microsoft Project so they can finish their plan, even though nobody really thinks the plan reflects reality. This ends in the demoralizing and unproductive result of developers fabricating numbers, without any clue whether they're even vaguely realistic, for no other purpose than to keep the managers happy.
I don't think anybody could reasonably argue that such a situation is a good thing.
But on the other hand, when you're being paid to do a project, whether it's building software or building a house, the questions "How long is it going to take?" and "How much is it going to cost?" are not inherently unreasonable things for the customer to ask. And while it's sometimes true that you're working on an unprecedented project and have no clue how long it will take, it's more often true that you're on reasonably familiar ground and have a pretty good idea of the effort involved in a project. And let's be honest - while a big-deal consultant like Woody Zuill might be able to get away with telling the customer that they don't really need an estimate, that's not going to fly in every organization. So #NoEstimates as a general recommendation is probably not realistic.
And that brings me back around to the first chapter of the PSP book. In it, Watts Humphrey relates a story from one of his first TSP teams (that's "Team Software Process" - the team equivalent of the PSP). They had just come off a two-year death-march project and now management had a new project for them - with a nine-month timeline. The dev team was sure they couldn't make that schedule. They didn't really know how long the project would take, but figured it would probably be closer to two years again.
So what normally happens in that situation? Well, the dev team says that the schedule is impossible to meet, management insists that the date is firm, and the developers eventually cave and say they'll "do their best". I've been in that situation, as I'm sure a lot of other people have. Why do we do that? Because, well, what else can we do? We don't really know how long the project will take, and even if we have an idea, we usually don't have anything concrete to back up our estimate. And as Humphrey puts it, when we don't know and management doesn't know, they'll win every time.
But rather than go the #NoEstimates route and try to convince people that coming up with an estimate isn't a useful exercise, Humphrey goes the opposite direction. He uses the power of data to give estimates teeth. Remember - even if management wants the new project done yesterday, it's still important for them to know when it's really going to be done. And while it's easy to write off a guess-work estimate as "schedule padding", it's not as simple when you have a detailed plan based on historical data. When you reach that point, estimates can become a weapon for the dev team rather than one used against them.
So where does this leave me? Pretty much back where I started. Estimates are good when they're needed and accurate. They're bad when they're unneeded or inaccurate. They should be based on data, not guess-work, and should be used responsibly. So no earth-shattering news. Just an interesting detour.