Data Driven Retrospectives

Searching Google for “Data Driven Retrospectives” yields very little of what I’m looking for. The results contain a single reference to a paragraph in David Anderson’s Kanban book; apart from that, nothing relevant. Perhaps I’m searching for the wrong keywords?

What I’m after are examples or tips on using data to provide insight into improvement activity for development teams. I only have my own experiences to draw upon, and I’ll share some of those with you in this post. If data plays a significant role in your continuous improvement activity, then please share your experience.

The Data Driven Retrospective

[Figure: weekly average cycle time chart]

The format of the retrospective goes something like this:

  1. Present the data to the team
  2. Drill down into the data looking for patterns or outliers
  3. Ask why, ask who, ask how, and take action!

The chart above was generated in Mingle (ThoughtWorks Studios). It shows average weekly cycle time over a 6 month period. There are many issues with this tool, particularly the fact that it averages, but that’s for another post. Putting all its shortcomings aside, it still serves as an excellent catalyst for analysing sources of variation.
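Mingle generates this chart for you, but the underlying roll-up is simple enough to sketch. Below is a minimal illustration, assuming each completed card records a start and finish date (the field names and sample data are mine, not Mingle’s format). I’ve printed an 85th percentile alongside the weekly average, because the average alone hides exactly the variation we want to explore:

```python
from collections import defaultdict
from datetime import date
from statistics import mean

# Hypothetical card records; a real team would export these from their tool.
cards = [
    {"started": date(2013, 3, 4),  "finished": date(2013, 3, 8)},
    {"started": date(2013, 3, 5),  "finished": date(2013, 3, 6)},
    {"started": date(2013, 2, 25), "finished": date(2013, 3, 14)},  # a long one
]

# Bucket cycle times by the ISO week in which each card finished.
weekly = defaultdict(list)
for card in cards:
    cycle_time = (card["finished"] - card["started"]).days
    year, week, _ = card["finished"].isocalendar()
    weekly[(year, week)].append(cycle_time)

for (year, week), times in sorted(weekly.items()):
    p85 = sorted(times)[int(0.85 * (len(times) - 1))]
    print(f"{year}-W{week:02}: avg={mean(times):.1f}d 85th pct={p85}d n={len(times)}")
```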

[Figure: cycle time chart with drill-down]
Drilling down on the outliers allows further exploration of the dataset.

Teams who track cycle time and use this data in retrospectives learn to spot patterns of what good and bad look like. Where teams measure and manage this metric, we consistently see both cycle times and the variation in cycle time reduce.
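The drill-down step can be automated too. A rough sketch: flag any card whose cycle time sits well above the rest, and bring those cards to the retro for the why/who/how questions. The two-times-median threshold here is an arbitrary choice of mine, not a rule:

```python
from statistics import median

def flag_outliers(cards, factor=2.0):
    # Cards whose cycle time exceeds `factor` times the median.
    times = [c["cycle_time"] for c in cards]
    threshold = factor * median(times)
    return [c for c in cards if c["cycle_time"] > threshold]

# Illustrative data: cycle times in days.
cards = [
    {"id": 101, "cycle_time": 3},
    {"id": 102, "cycle_time": 4},
    {"id": 103, "cycle_time": 17},  # worth asking why, who, and how
]
for card in flag_outliers(cards):
    print(f"Card {card['id']}: {card['cycle_time']} days")
```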

Dwell Time

Some teams use what they call a dwell time chart, shown below, which aims to identify excessive queuing time:

[Figure: dwell time chart]
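For teams whose tool records when a card enters each column, dwell time falls out of the transition history: the time spent in a column is simply the gap between successive transitions. A sketch under that assumption (the column names and timestamps are illustrative):

```python
from collections import defaultdict
from datetime import datetime

# One card's column history: (column entered, timestamp). Illustrative only.
history = [
    ("Ready for Dev",  datetime(2013, 3, 4, 9, 0)),
    ("In Dev",         datetime(2013, 3, 6, 14, 0)),
    ("Ready for Test", datetime(2013, 3, 7, 10, 0)),
    ("In Test",        datetime(2013, 3, 11, 9, 0)),
    ("Done",           datetime(2013, 3, 11, 16, 0)),
]

# Dwell in a column = time between entering it and entering the next one.
dwell = defaultdict(float)
for (column, entered), (_, left) in zip(history, history[1:]):
    dwell[column] += (left - entered).total_seconds() / 86400  # days

for column, days in dwell.items():
    print(f"{column:<15} {days:.1f} days")
```

Summed across every card, the queue columns (“Ready for…”) reveal where the excessive waiting accumulates.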

Unblocked Blockers

[Photo: unblocked blocker stickies]

I encourage all teams to keep unblocked blockers in an envelope at the end of their card wall. Basically, when you unblock a card, don’t throw the magenta post-it away; stick it in the envelope. You can then lay these stickies out on the table in your retrospective and group them. Again, another source of data points.
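If the blockers are also captured digitally (a one-line note per sticky is enough), the grouping exercise reduces to a frequency count. The causes below are invented for illustration:

```python
from collections import Counter

# One entry per unblocked blocker sticky; descriptions are made up.
unblocked_blockers = [
    "waiting on DBA", "environment down", "waiting on DBA",
    "third-party API outage", "environment down", "waiting on DBA",
]
for cause, count in Counter(unblocked_blockers).most_common():
    print(f"{count} x {cause}")
```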

Technical Quality

I covered measures of effectiveness in a previous blog post. The use of these metrics in retrospectives is becoming more prevalent, particularly the technical measures. Tech leads are exploring stats such as test coverage trends, and the teams are responding with probing questions.
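As a rough sketch of the kind of check a tech lead might run, here’s a comparison of recent test coverage figures against a baseline. The numbers would come from your CI tool; these figures, and the half-point threshold, are made up:

```python
coverage_by_build = [81.2, 81.5, 81.4, 80.9, 80.1, 79.6]  # percent, oldest first

baseline = coverage_by_build[:-3]
recent = coverage_by_build[-3:]
drop = sum(baseline) / len(baseline) - sum(recent) / len(recent)
if drop > 0.5:  # arbitrary illustrative threshold
    print(f"Coverage trending down by {drop:.1f} points -- worth a probing question")
```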

Successes & Feedback

I’ve been experimenting with data driven retros across many of the teams I coach. At a rough guess, 30-40 teams are now using this form of retrospective in preference to any other format. So why is it proving so popular?

The buzz and energy in the room during these retros is almost deafening. Participants are no longer expressing opinions but exploring data for insight. Cognitive biases are certainly reduced compared to the common (works well / could improve / puzzles) format of previous retrospectives.

Some teams have taken a blended approach, for example the first 30 minutes data driven, followed by the works well / could improve / puzzles format. Others have adopted an alternating approach whereby every other retro is data driven.

[Photo: an Ops Review]
In this photo, multiple delivery teams come together to run an Ops Review.

In addition to becoming more data focused, I’ve been coaching teams to avoid waiting for the next retro to look for improvements. Most of this data is available in real time, so why not retrospect in real time? Scrum Masters (or equivalent job titles!) should be watching queues for long waiting times, tech leads should be looking at code quality metrics daily, and management should be spotting patterns of recurring blockers.
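As a concrete (if simplified) example of watching queues in real time, a daily script could flag any card that has sat in a queue column beyond a threshold. The board data and the three-day limit are illustrative assumptions:

```python
from datetime import date

# A snapshot of the board: where each card is and when it got there.
board = [
    {"id": 201, "column": "Ready for Test", "entered": date(2013, 3, 4)},
    {"id": 202, "column": "In Dev",         "entered": date(2013, 3, 12)},
]
QUEUE_COLUMNS = {"Ready for Dev", "Ready for Test"}
MAX_QUEUE_DAYS = 3  # arbitrary threshold; tune to your context

today = date(2013, 3, 14)  # in practice, date.today()
for card in board:
    waited = (today - card["entered"]).days
    if card["column"] in QUEUE_COLUMNS and waited > MAX_QUEUE_DAYS:
        print(f"Card {card['id']} has waited {waited} days in {card['column']}")
```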

Please share your experiences, good or bad, of using data driven retrospectives.