Data-Driven Retrospectives: a must for high-performing agile teams

It’s true. Agile delivery teams get ‘review’ fatigue. We certainly do. Retros that repeat the same actions, lack structure or become venting playgrounds can deplete the will to live some days. Organisational cultural issues beyond a team’s control can also act as a drag.

And then there’s the pushback:

“I think we’re doing really well, we don’t need to do this.”
(code for ego talking: call it out! There’s always room for improvement.)


“We’re too busy to do this, we’ve just got to get the work done.”

(code for lack of focus and drowning – all the more reason to take stock and get things back under control!)


“This is a bore of a chore.”
(code for apathy!)

Then there are the days when you get the a-ha moments! The shared insights that lift everyone’s blinkers and reignite the spark and joy that come from getting in sync and progressing together. These are the moments of magic that make it worth it.

Maintaining a tempo of continuous improvement takes effort and discipline. It can be a delicate balance to find the sweet spot between affirming your collective strengths and addressing the things that need facing up to, in an ever-fresh and stimulating way.

We all know the Sprint Retrospective or ‘retro’ is Scrum’s ceremony to underpin a great reflective practice. It builds your team’s culture of continuous improvement and accountability for being self-directed in your collective learning and growth.  But what does a good retro practice look like and how do you build repeatable high performance?  To reenergise the practice for our own delivery team, I wrote the following playbook and am happy to share it here.

What’s a retro anyway?

The Retrospective, or ‘retro’ as it’s more affectionately known, is usually the last of the ceremonies in a sprint cycle. Once you’ve reviewed the sprint’s progress with a showcase to demonstrate completed work, a team will typically spend an hour to stop, take stock and learn for learning’s sake.

Ultimately, a retro is as simple as people coming together to solve problems about their own way of working. It’s an opportunity to intervene in bad habits and ensure your team doesn’t repeat them in future sprints. It’s the space to call out mediocrity and the status quo. Fundamentally, it’s the art of creating awareness and being intentional about how to get in sync and level up, continuously. The outcome is alignment and focus on a set of actions to experiment and iterate upon for the next sprint cycle, observing the impact (negative or positive) those experiments have on the way you deliver. Whilst the focus is on team improvement, under great facilitation the opportunity should also spark individual reflection.

Whose job is it?

In short, everyone’s! The retro is usually facilitated by your team’s Scrum Master, or the equivalent role responsible for building and maintaining your agile practice. This could be either the Product Manager or Tech Lead if you don’t have the support of an SM. We like to assign different team members to facilitate the retrospective. This brings a fresh approach and ‘flavour’ to the review and prevents the retro from becoming the ‘Scrum Master’s show’. Because it’s an all-hands meeting, I expect all team members to attend. This gives everyone the opportunity to participate in building a psychologically safe space, giving and receiving quality feedback centred on growing our teammates.

Why be data-driven?

If you’re applying the Scrum method, you’re implicitly signing up to the three pillars of empirical process control: Transparency, Inspection, Adaptation. Specifically, that means working in a fact-based, experience-based and evidence-based manner. This is built into the method to support teams adapting to the changing requirements of the customer.


Despite this, few teams apply data to inform a stronger retrospective practice, and many continue to review progress and practices based on fictitious plans and gut feel.


Without data,

  • how will you know whether your actions are actually leading to stronger team practices and customer outcomes?
  • how will you know whether your practice experiments positively or negatively impact other agility outcomes?
  • do you know what your baseline is, and whether you’re tracking above or below it?
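To make the baseline question concrete, here’s a minimal sketch of establishing a baseline from previous sprints and checking whether the latest sprint is tracking above or below it. The metric and numbers are purely illustrative, standing in for whatever your own tools export:

```python
# Hypothetical sprint metrics exported from your delivery tool.
# Average cycle time in days per item, ordered oldest -> newest.
cycle_times = [4.2, 3.8, 5.1, 4.0, 4.6, 3.5]

# Baseline: the average of every sprint before the current one.
baseline = sum(cycle_times[:-1]) / len(cycle_times[:-1])
current = cycle_times[-1]
delta = current - baseline

print(f"baseline: {baseline:.2f} days, current: {current:.2f} days")
# For cycle time, lower is better.
print("tracking below baseline (improving)" if delta < 0 else "tracking above baseline")
```

The same pattern works for any metric your tools can export, such as velocity, escaped defects or review wait times.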


When it comes to the crunch, only data provides the common ground of reality. All teams already have a treasure trove of data at their fingertips, with insights into how they can improve sitting within their delivery and collaboration tools.

Get clear on what data to collect

Define an agility metrics framework that’s flexible enough to help you focus on improving different agility outcomes over time. Collect metrics around your team’s workflow, review, collaboration and communication practices.  Strengthening your agility outcomes depends on it.

Scrum Masters should be watching queues for long waiting times (lead time and cycle time), Tech Leads should be looking at quality metrics, and Product Managers need to be spotting patterns of recurring blockers. Everyone should be across the same set of shared agile delivery metrics. This ensures a holistic, real-time team health check. We believe it’s really important to build a common understanding and appreciation of each participating discipline, where insights from the metrics tracked by each discipline are shared and discussed.
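As a concrete example of the queue metrics above, here’s a small sketch that computes lead time (ticket creation to delivery) and cycle time (work starting to delivery) from issue timestamps. The issue records and field names are hypothetical, standing in for whatever your issue tracker exports:

```python
from datetime import datetime

# Hypothetical issue records, e.g. from a CSV export of your issue tracker.
# "created" -> ticket raised, "started" -> work began, "done" -> delivered.
issues = [
    {"key": "TEAM-101", "created": "2023-05-01", "started": "2023-05-03", "done": "2023-05-08"},
    {"key": "TEAM-102", "created": "2023-05-02", "started": "2023-05-02", "done": "2023-05-04"},
]

def days_between(a, b):
    """Whole days elapsed between two ISO-format dates."""
    return (datetime.fromisoformat(b) - datetime.fromisoformat(a)).days

for issue in issues:
    lead = days_between(issue["created"], issue["done"])    # the customer's wait
    cycle = days_between(issue["started"], issue["done"])   # the team's active work
    print(f'{issue["key"]}: lead time {lead}d, cycle time {cycle}d')
```

A persistent gap between lead time and cycle time is often the first signal of work waiting in queues before anyone picks it up.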

Gather the data

Get the information flowing and build a platform of common understanding around the facts! You’ll be able to gather reports from your delivery tools, such as your issue tracker and CI/CD tools. Alternatively, find a tool that can take the pain out of collecting and aggregating your data into agility insights by automating the end-to-end process for you and presenting insights into your team’s communication, process and quality practices. This also gives you time back to focus on your core delivery tasks.

Set the right behaviours & tone

Generally, people don’t like changing behaviours. There can be fear and uncertainty about why data is being introduced into retros and how it is going to be applied.

When facilitating, be tough on adjudicating the conversation when required. Reinforce a culture of listening, vulnerability, trust and support. Stamp out blaming and venting. Ensure language adopts a tone of collective responsibility, favouring ‘we’ and ‘our’. If individuals are called out, ensure it’s constructive. Define your team’s Ground Rules and have a team member read them aloud at the beginning of every retro. Believe that everyone has done their best job given what they knew at the time, with the skills, abilities and resources available for the situation at hand. Your goal as a facilitator is to move the team away from sitting on the fence and towards embracing data-informed conversations in an active way.

Guiding Principles

The guiding principle for the retro is to review ‘on game’ performance for the cycle and identify ways to make the next ‘game’ cycle more effective and enjoyable. As a team, you want to bring value to the customer as fast as possible. This principle helps to frame why you want to improve your practices.

There’s no one to compare yourselves against except your own track record. Take-aways are rarely wholesale changes, and more often iterative adaptations that lead to improvements in behaviours, process, quality and, ultimately, better outcomes for customers. It’s a practice that builds with maturity as you go, strengthening your team’s adaptability and resilience over time.

Principles for building successful retro habits

  1. Don’t ditch the retro – ever! Host it even if it’s a retro-lite and runs for 15 minutes.
  2. Be prepared with a consistent agenda, a clear set of ground rules and actions from the last meeting to get an update on
  3. Frame the use of data in your ground rules: it’s a ‘carrot’ to improve, surface the facts for focussed growth and reinforce accountability; not a ‘stick’ to trash team morale and motivation.
  4. Collect data from your team’s delivery and collaboration tools before you start – it’s what they’ll trust most
  5. Drill down into the data, spot the patterns and outliers (manually, or let your tools do this for you)
  6. Affirm what you expect to see and shout out the wins
  7. Address what you didn’t expect to see and set a path (including actions, experiments and ownership) to remediate
  8. Compare your practices against your own track record and not against other teams’ practices
  9. Set the action you want to take; name a guardrail (target metric) that will focus your attention and keep you on track
  10. Make sure your team’s action list is not too short, and not too laborious, but just right for you!
  11. Iterate small and often; if it’s a wholesale change, break it down into cycle-sized improvements
  12. Get early runs on the board and demonstrate the impact of experiments (with data of course)
  13. Encourage dialogue; don’t shut it down unless it’s unproductive. If you’re uncomfortable, be vulnerable, lean in and guide the discussion to where it needs to go, hopefully an outcome. It’s good to be vulnerable; don’t be afraid if you don’t know where it will take you
  14. Let the team choose what matters most to them; feel the energy in the room to determine where team members want to improve.
  15. Check for bias; are you leaning on the facts?
  16. As a team lead, pull rank only when you need to reset the discussion, and re-frame with ‘agree to disagree’
  17. Monitor experiments for the next several iterations until you collectively feel (and your metrics show) you’ve reached the desired outcome
  18. Monitor experiments for any undesired impacts on other agile metrics.
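Principles 9, 17 and 18 above can be sketched in a few lines: check the experiment’s guardrail metric against its target, then scan the rest of your metrics for regressions. All metric names, numbers and thresholds here are hypothetical:

```python
# Hypothetical sprint metrics before and after a practice experiment.
baseline = {"cycle_time_days": 4.5, "escaped_defects": 2, "review_wait_hours": 6.0}
after_experiment = {"cycle_time_days": 3.8, "escaped_defects": 4, "review_wait_hours": 5.5}

# The guardrail: the experiment succeeds if cycle time drops to 4 days or below.
guardrail_metric = "cycle_time_days"
guardrail_target = 4.0
met_target = after_experiment[guardrail_metric] <= guardrail_target

# Flag any other metric that regressed by more than 20% against baseline
# (all of these metrics are "lower is better" in this sketch).
side_effects = [
    name for name in baseline
    if name != guardrail_metric
    and after_experiment[name] > baseline[name] * 1.2
]

print(f"guardrail met: {met_target}, side effects: {side_effects}")
```

In this example the experiment hits its cycle-time target, but escaped defects have doubled: exactly the kind of undesired impact principle 18 asks you to watch for.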

Review Format

There are a number of different review formats that teams can apply to set the right tone and guide your team’s discussion towards great outcomes. You can find a host of them online, or in Atlassian’s playbook. These can be values-based or ‘well/not so well/do differently’, as an example. All of these are good and helpful. As a team, we like to keep the reflective process fresh by adopting different formats after a period of time (usually every quarter).

Note, however, that these formats default to a qualitative discussion. Make sure that a portion of your time is allocated to that qualitative discussion, and that an equal amount of time is focussed on reviewing data-driven insights from your agile metrics dashboard. This will provide the concrete facts that underpin your team’s performance and lead to a stronger conversation about how you continue to level up.

Retro Agenda

The agenda we adopt, and the supporting dashboard we use includes:

Team Principles (5 minutes)

  1. Team’s Product Vision
  2. Ground Rules
  3. Definition of Done

Sprint Practices (20 mins)

  1. Sprint Goals
  2. Items completed and incomplete
  3. Sprint metrics and mid-range trends

Review (20 mins)

Well / not so well / do differently

Actions (15 mins)

  1. Past sprint’s actions
  2. New sprint’s actions


Benefits from embracing data-driven retros

By embedding data-driven ways of working into your retros, you’ll be building a transparent and adaptive team culture.

Data-driven retros will also help promote a culture of ownership, accountability and self-directed learning for your team’s growth in all aspects of their agile practice and process.

Remember, if you’re taking data into your retros for the first time, you might actually be changing team behaviours. Go gently and build a repeatable habit of presenting the insights from your data every retro, so the team’s resistance reduces. It will be much easier to embrace the use of data in retros as familiarity builds.

Finally, in the spirit of fast feedback loops, avoid waiting for the next retro to look for improvements.  Your data should be available in real-time.  Why not retrospect in real-time?


Umano is on a mission to support agile delivery teams perform at their best through automated metrics, reporting and insights.

Sign up here to access your complimentary Umano account and see how your team’s agile sprint practices are tracking.


Shoutout to Mael BALLAND for the hero image.