
Data-Driven Retrospectives

By Abdul Basit

Technical Architect

April 09, 2019

In business, we fundamentally believe that data has to drive the decision-making process. Companies like Amazon and Netflix are known for relentlessly leveraging their user data to grow to new heights. Needless to say, they devote much of their resources to collecting, analyzing, and generating insights from data.

But using data to drive business decisions is not restricted to large organizations – in fact, if done correctly, data can be used to improve almost any process. What makes it even more enticing is that the cost of data collection, analysis, and reporting can be kept to a minimum: you don't have to pay ongoing licensing fees or run big data clusters to generate insights. With a few modest tools, you can put together quite a sophisticated dashboard to assess and monitor progress.

Here at Jonah Group, we use an agile approach to manage our projects. Projects are broken into sprints that vary between one and three weeks. On the last day of the sprint, the entire team gets together and talks about how they did and what they can improve in the next sprint. This is called a retrospective and is a common practice in the industry.


Typically, we go around the table and ask everyone working on the project what they would like to share and what they think can improve. Because people know a retrospective is coming, they make a note of issues as they stumble on them during the sprint, so they can raise them then. But what about all the things they don't stumble upon? Perhaps there are deeper underlying issues, and the items being discussed are just symptoms of them. Perhaps other things are missed altogether.

But if you look at the process outlined above, aren't we really just collecting data, analyzing it, and producing a report? It's just more fluid and short-lived. The answer is absolutely yes! The following is an example dashboard built in Excel using a few simple tools:


We will go through each chart and describe what it indicates:

Estimates vs Actual Effort

Tickets are broken down by development and testing effort – roughly 2/3 development and 1/3 testing (indicated by grey and yellow respectively). The blue line indicates the initial estimate, and the orange line indicates how far the actual effort was from that estimate. Notice how issues on either extreme are worth talking about!
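As a rough sketch of how this chart's data could be derived, the snippet below flags tickets whose actual effort (development plus testing) strays far from the estimate. The ticket field names and the 25% threshold are illustrative assumptions, not part of the original dashboard:

```python
def effort_outliers(tickets, threshold=0.25):
    """Return tickets where |actual - estimate| exceeds threshold * estimate."""
    outliers = []
    for t in tickets:
        estimate = t["estimate_hours"]
        # Actual effort combines the dev and test portions (~2/3 and ~1/3).
        actual = t["dev_hours"] + t["test_hours"]
        if estimate and abs(actual - estimate) > threshold * estimate:
            outliers.append((t["key"], estimate, actual))
    return outliers

tickets = [
    {"key": "PRJ-101", "estimate_hours": 12, "dev_hours": 8, "test_hours": 4},
    {"key": "PRJ-102", "estimate_hours": 10, "dev_hours": 12, "test_hours": 6},
    {"key": "PRJ-103", "estimate_hours": 8, "dev_hours": 3, "test_hours": 1},
]
print(effort_outliers(tickets))  # tickets on "either extreme"
```

Both the badly over-budget ticket and the badly over-estimated one surface, matching the point that both extremes are worth discussing.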

Comment Tags


Any development effort goes through a review process – but what are the reviewers saying? Are there common themes we need to focus on? Thus in our process, every comment receives one or more tags classifying it. Clearly in the above, the #minor comment is the winner. Whether this is good or bad is subject to debate.
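Tallying these tags is straightforward once comments carry them. Here is a minimal sketch, assuming tags are written inline as hashtags (the comment text and tag names below are made up; the real tags would come from your review tool):

```python
import re
from collections import Counter

def tag_counts(comments):
    """Count every #tag appearing in a list of review comment strings."""
    counts = Counter()
    for text in comments:
        counts.update(re.findall(r"#(\w+)", text))
    return counts

comments = [
    "#minor rename this variable",
    "#minor missing space",
    "#bug null check needed here",
]
print(tag_counts(comments).most_common(1))  # the "winning" tag
```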

Pull Request Comments

As developers open pull requests for code approval, how many comments are they receiving? Is it bogging them down and hurting project delivery? My earlier comment said that minor comments are subject to debate – if reviewers are leaving comments that don't necessarily add value (why else would they be minor?), then too many of them can put progress at risk. This is a very good example of a problem that would remain "hidden" unless brought to light by the dashboard above.

Returned Tickets


There are two rounds of QA in project delivery – orange indicates a second-level return and blue a first-level return. Obviously, orange is more expensive than blue. Luckily in the above, only a few tickets come back from verification.

Number of Sprints per Ticket


This speaks to sprint-over-sprint turnover: perhaps the second round of verification isn't complete and the development team moves on to the next set of tickets, so the pile-up continues to grow. Tickets really shouldn't linger for more than a couple of sprints.
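One plausible way to compute this metric, assuming your tracking tool can export (ticket, sprint) work records, is to count distinct sprints per ticket and flag the lingerers (the two-sprint cutoff mirrors the guideline above; field names are illustrative):

```python
from collections import defaultdict

def sprints_per_ticket(worklog):
    """worklog: (ticket, sprint_number) pairs from the tracking tool."""
    sprints = defaultdict(set)
    for ticket, sprint in worklog:
        sprints[ticket].add(sprint)
    return {ticket: len(s) for ticket, s in sprints.items()}

def lingering(worklog, max_sprints=2):
    """Tickets that have been worked in more than max_sprints sprints."""
    return [t for t, n in sprints_per_ticket(worklog).items() if n > max_sprints]

worklog = [("PRJ-101", 1), ("PRJ-101", 2), ("PRJ-101", 3), ("PRJ-102", 3)]
print(lingering(worklog))  # tickets open for more than two sprints
```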

For any dashboard to be truly effective, users need to be able to interact with it and zoom in on particular data points. Excel solves this problem with slicers. The next section describes how users can slice the dashboard:

Sprint Number – Focus on your current sprint and see how it compares to the previous sprint or sprints.

login – Lets individual users focus on themselves and see what they would like to improve upon.

project – The project may not be delivered from a single repository – typically, the UI repository is separate from the service repository. The comments may differ based on the type of project being worked on.

Estimate vs Actual Difference – Helps isolate the data points of interest. Hopefully, most stories have their actual effort close to the estimate, but the ones that are far off are the ones we would like to zoom in on.

Number of Sprints per Ticket – Lastly, a pile-up of tickets can skew the results, so a developer may want to zoom in on tickets worked on for only one sprint, while a project manager may want to know about the tickets that have lingered for more than two sprints.

From the above, it is hopefully quite evident how much power a dashboard can bring to the retrospective process. On my current project, prior to having the dashboard, I would occasionally have to pry information out of people to get a sense of areas to improve upon. Now we have more information than the allotted time allows us to discuss, and the team as a whole has benefited tremendously from it.

Don’t be intimidated by the metrics collected above. The dashboard started out very different from the picture you see, with fewer charts and slicers. Over time, the team as a whole became interested in other measures, which were then collected and added to the dashboard for further analysis. If you adopt this, chances are that your dashboard will be very different from the one above!

The dashboard itself was made in Excel and required two Microsoft add-ins – Power Query and Power Pivot – both of which are free to download and use. The data was sourced through scripts that simply ran queries against the tools we were already using. There were no licensing costs, and the scripts and dashboard were built to work over the entire project life. As sprints completed, we would just run the scripts again and "refresh" the dashboard. Pretty powerful stuff!
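The article doesn't show the scripts themselves, but the "refresh" step might look something like the sketch below: a script dumps query results to a CSV that Power Query imports on refresh. The field names and file name are placeholders, not the original implementation:

```python
import csv

def export_tickets(tickets, path="tickets.csv"):
    """Write ticket rows to a CSV file that the Excel dashboard imports."""
    fields = ["key", "sprint", "estimate_hours", "actual_hours"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(tickets)

# In practice the rows would come from queries against your issue tracker
# and review tool; here they are hard-coded for illustration.
export_tickets([
    {"key": "PRJ-101", "sprint": 3, "estimate_hours": 12, "actual_hours": 12},
    {"key": "PRJ-102", "sprint": 3, "estimate_hours": 10, "actual_hours": 18},
])
```

Pointing Power Query at the CSV means each sprint-end refresh is a single script run followed by Excel's Refresh All.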

I would like to conclude with one caution: "with great power comes great responsibility" – Uncle Ben from Spider-Man. The dashboard above is just a tool to help you find insights on your project, but it can quickly be misused as a finger-pointing exercise. During the retrospective, our advice is to talk about the bottlenecks and issues. People on the project need to be able to speak about them as well, but they should not be ambushed. If someone on the team is struggling, it will consistently show on the dashboard (their tickets will be delayed, the number of comments will likely be high, etc.). Chances are that you and the team are already aware of this, but the dashboard gives them an opportunity to improve and gives you and the team a way to see to their success.

Good luck in creating and using the dashboard to run your own projects!

About Jonah Group

Jonah Group is a digital consultancy that designs and builds high-performance software applications for the enterprise. Our industry is constantly changing, so we help our clients keep pace by making them aware of the possibilities of digital technology as it relates to their business.

  • 24,465 sq. ft office in downtown Toronto
  • 126 team members in our close-knit group
  • 17 years in business, and counting