Don't Do Retrospectives Unless You're Going to Do Them Right

The odds are good that you've been involved in some sort of retrospective meeting. It may have been called something else, such as the popular 'post-mortem', but the purpose is generally the same for any given software release, sprint, or iteration: figure out what went well and what didn't. Why, then, do so many retrospectives go awry? In my experience, there are three very common reasons: there are too many items in the 'must improve' list, there is no follow-up on the items in that list, and the list isn't very good to begin with.

Too Many Items 

One common problem I've seen is for a team to put too many items on the list of things to be improved for the next iteration. Ever had one thing to do? How was that? Even if it was a difficult task, at least you could wrap your head around it. Ever had a thousand things to do? How was that? Overwhelming, right? Having too many items on a 'must improve' list is arguably as bad as not having a list at all. While it is often important to document all the ideas on what could be improved, it is best to focus on a handful of items (ideally, one or two) that could be improved for the next iteration. If your team improves one or two things every iteration, then that is continuous improvement. 


Overwhelming.

Lack of Follow-Up 

Even if you manage to decide on a small number of improvements, you can still run into trouble by failing to follow up on the tasks that will implement them. If no one is responsible for ensuring that the tasks get done, then it's quite likely that they won't get done. It doesn't really matter whether the responsible party is your manager, scrum master, team lead, or intern; what matters is that someone is making sure the tasks get completed. By the way, the person responsible for ensuring that the tasks get done doesn't have to be the same person who actually implements the improvement. They just need to make sure that the tasks don't fall through the cracks in the heat of battle.

Nothing But Whining 

Okay, so you have a small list of improvement items and someone is assigned to make sure that those items are completed. Everything is great, right? No. You can still have problems if your list isn't very good to start with. While it is natural to focus on things that went wrong during an iteration, it is important to remember the things that went well too. It's easy to get caught up in 'improvements' that revolve around negative things (e.g. "must make sure that we get the specifications from the customer") and lose sight of the positive things that could be taken further (e.g. "integrating our source control with our bug tracking system was great; maybe we can integrate that with our help desk"). Improving on your improvements is allowed.

Folks, I'm not going to claim that this is an exhaustive list of things that can be done to make sure that your retrospectives are fruitful. What I will claim, however, is that committing to a small list of well-thought-out improvements will make your software development life better.

You Must Know What Your Project is All About

Brian Welcker, writing for his own blog Direct Reports, had a good post about knowing when a project is headed for trouble. The lens for his post is his experience on Microsoft's file system project named WinFS. The best part:

I suspected from early on that the project was doomed to failure. What made me think this? Because when I would ask anyone involved with the project the question "What is it?", I would get a very different answer.

If your project's goal can't be answered relatively consistently or relatively easily by the people involved in the project, then you're probably in trouble. This is indicative of either a project that is too big/unfocused/gnarly to be completed successfully, or a project that is too ill-defined for everyone involved to be effective in delivering on its goal. I've been involved in both types of projects, and trust me, it isn't pretty.

The Perils of Technical Debt in Software

Joe McKendrick, writing for ZDNet:

As Jones put it: "If you skimp on quality before you deliver software, you end up paying heavy interest downstream after the software is released for things you could have gotten rid of earlier, had you been more careful." Cunningham adds that applications themselves have a way of taking on a life of their own -- turning into "its own little bureaucracy where you can’t actually on a daily basis create value." The result is technical debt that "has piled up and you’re paying that interest."

In my career, I've seen first-hand how technical debt can impact software projects. I've seen it cause a project to be woefully over budget, behind schedule, and ultimately disappointing to users. I've also seen it completely destroy a project (at the expense of many jobs).

However, that isn't to say that technical debt must be avoided at all costs. It isn't always possible or practical to get things right the first time. Sometimes software needs to 'bake' a bit more, especially when blazing new trails. It is also naive to think that shipping is unimportant (real artists ship, right?).

In short, technical debt should be minimized where possible but understood as a reality of software projects.