Summary

In this article, I describe a User Story Maturity Model to help teams assess their usage of the practice and identify areas for improvement. You won't find this anywhere else, because this model is a Charles Bradley original. I base this model on the most credible User Story literature as well as on my experiences coaching Scrum Teams. The model's best practices are listed in a particular order that represents a strategic path for teams looking to advance their User Story Maturity.

Disclaimers!

  • Modeling is not an exact science. That's why it's called a "model." As such, the purpose of the model is to help teams assess and improve. In particular, a bad faith or poorly executed attempt at implementing a best practice (or role) can have bad consequences.
  • The model is not intended as a predictor of success, by any means. It is simply meant as a way for teams to assess and improve their usage of the practice.
    • In particular, if the PO is not doing a good job prioritizing or gathering requirements, then the team might suffer really bad consequences even though their User Story Maturity is high.
    • On the flip side, if the developers are not good at delivering stories that pass the story tests and/or getting PO signoff, then the team might suffer really bad consequences even though their User Story Maturity is high.
    • I'm sure numerous other examples of this exist, so be forewarned!
  • Topics specifically not covered yet by the model (generally because they are not easy to self-assess):
    • Prioritizing User Stories for value or ROI
    • Estimation of story size (Story Points, Ideal Days, Counting Stories, etc.)
    • Velocity


The Model

Points  Best Practice
------  -----------------------------------------------------
3*      3 Components and 2 Must Haves
2       No Stories more than 5 days in size**
2       Constant PO interaction
1       PO 100% Allocated
1       PO Co-located (Talking Distance)
2       Weekly Story Grooming
2       Story Tests Defined before Development Begins
1       Immediate Story Signoff
2       All Stories Sized to 2-3 days
4       Story Tests 90+% automated (for all current stories)

* This best practice is required.
** This is more of a "good practice" than a best practice. More details below.

Notes

  • No partial credit: a team either exhibits a practice well enough to earn all of its points (see the Common Sense Rule below), or it earns no points for that practice.
  • While I use the Scrum term "Product Owner", this term translates pretty directly to other processes and concepts. Other names for essentially the same role with respect to User Stories: Onsite Customer (XP), User Proxy, Product Manager, etc.
  • Common Sense Rule: Since no team is perfect, if you feel like your team is exhibiting the best practice 90+% of the time, go ahead and give your team the points for that practice.
  • Note the importance of the Product Owner's availability in terms of the model. The availability of the PO to the dev team is crucial in the User Story practice.
  • The best practices are listed in the model above in the rough order in which I believe teams should attempt them. If a team is unable to execute or make progress on a practice, then the team should move on to the next practice in the list and focus on attaining that one instead.
    • For instance, if you can't get your PO 100% allocated, then move on to trying to get the PO co-located. If that doesn't work, then work on having Weekly Story Grooming, and so on.

Maturity

Score   Team User Story Maturity
-----   ------------------------
5-7     Beginning Team
8-15    Intermediate Team
16-17   Advanced Team
18-20   Expert Team

Notes

  • Note that, in order to even be a beginning team, you have to be exhibiting the "3 Components and 2 Must Haves" practice plus at least 2 more points' worth of best practices. This reflects my belief that you are not even doing User Stories if you don't have at least the required practice and 5 total points. Anything less is a really poor attempt at User Stories, and it doesn't give the practice a fair shake.
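
If it helps to make the scoring concrete, here is a minimal sketch of how a team might tally its own score. The point values, the required practice, and the maturity thresholds come straight from the tables above; the function and variable names are just my own illustration, not part of the model.

```python
# Point values from the model above. "3 Components and 2 Must Haves" is required.
PRACTICE_POINTS = {
    "3 Components and 2 Must Haves": 3,          # required
    "No Stories more than 5 days in size": 2,
    "Constant PO interaction": 2,
    "PO 100% Allocated": 1,
    "PO Co-located (Talking Distance)": 1,
    "Weekly Story Grooming": 2,
    "Story Tests Defined before Development Begins": 2,
    "Immediate Story Signoff": 1,
    "All Stories Sized to 2-3 days": 2,
    "Story Tests 90+% automated": 4,
}

REQUIRED_PRACTICE = "3 Components and 2 Must Haves"

def user_story_maturity(practices_exhibited):
    """Return (score, maturity label) for the set of practices a team
    exhibits 90+% of the time (no partial credit)."""
    if REQUIRED_PRACTICE not in practices_exhibited:
        return 0, "Not yet doing User Stories (required practice missing)"
    score = sum(PRACTICE_POINTS[p] for p in practices_exhibited)
    if score < 5:
        return score, "Not yet doing User Stories (fewer than 5 points)"
    if score <= 7:
        return score, "Beginning Team"
    if score <= 15:
        return score, "Intermediate Team"
    if score <= 17:
        return score, "Advanced Team"
    return score, "Expert Team"

# Example: required practice plus constant PO interaction -> 5 points, Beginning Team
print(user_story_maturity({"3 Components and 2 Must Haves", "Constant PO interaction"}))
```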

The Best Practices

3 Components and 2 Must Haves
This is a required practice. All stories have the 3 Components (Card, Conversation, Confirmations) and the 2 Must Have Characteristics (Direct Value to an External Stakeholder, and the Story describes a change in the System Under Development). Confirmations are also known as Story Tests.
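
Purely as an illustration of the required practice (this sketch is mine, not part of the model), you can think of the 3 Components and 2 Must Haves as a checklist attached to every story:

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    # Component 1: the Card, the short story statement itself
    card: str
    # Component 2: the Conversation, notes or links from ongoing discussion with the PO
    conversation_notes: list = field(default_factory=list)
    # Component 3: the Confirmations, i.e. the Story Tests agreed with the PO
    confirmations: list = field(default_factory=list)
    # Must Have 1: the story delivers direct value to an external stakeholder
    direct_value_to_external_stakeholder: bool = False
    # Must Have 2: the story describes a change in the System Under Development
    describes_change_in_system: bool = False

    def meets_required_practice(self) -> bool:
        """True only if all 3 Components are present and both Must Haves hold."""
        return (bool(self.card)
                and bool(self.conversation_notes)
                and bool(self.confirmations)
                and self.direct_value_to_external_stakeholder
                and self.describes_change_in_system)
```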
No Stories more than 5 days in size
This isn't really a best practice. It's a good practice that is a stepping stone to the best practice (Stories 2-3 days in size). I include it here because it is a good first goal in terms of keeping stories small. The 5 days here means it would take a person (or pair, if you're pair programming) 5 days to fully implement, fully test, and get the story accepted by the PO. Said another way, 5 ideal person days (or 5 ideal pair days if you're pair programming).
Constant PO interaction
The Product Owner should be constantly interacting with the development team, with the only possible exception of when the PO is working with external customers and stakeholders to gather and prioritize stories. Even then, it would probably benefit all involved if the PO brought someone from the dev team as a (mostly silent) observer of those interactions with external stakeholders. As a rule of thumb, the PO should be interacting with one or more developers at least once an hour or so, and hopefully numerous times every hour.
PO 100% Allocated
The person playing the Product Owner type role spends 100% of their time playing that role for that one team.
PO Co-located (Talking Distance)
The person playing the Product Owner type role sits within talking distance of the development team.
Weekly Story Grooming Meeting
A minimum of 1-2 hours each week is set aside for a meeting to groom User Stories. Your team may need more than that, but having at least 1-2 hours set aside is a best practice. This typically means time to have conversations, create Story Tests, estimate the size of stories, and split stories when necessary. The entire team participates in this activity. Note that, while this time is set aside specifically for the entire team to groom stories, all user stories are constantly being groomed by the team as necessary, and not all story grooming has to involve the whole team. In other words, don't assume that the weekly story grooming meeting is the only time the team will spend grooming user stories in a given week.
Story Tests Defined before Development Begins (ATDD)
The team defines story tests up front, before code development on the story begins. Note that we're speaking of "conceptual" tests, or test confirmations. Defining story tests up front is consistent with Acceptance Test Driven Development (ATDD), because story tests are essentially acceptance tests. Automating the story tests is a whole different practice, so don't confuse defining story tests with automating them. Capturing story tests well is sometimes aided by knowledge of various story testing styles.
Immediate Story Signoff
Stories are "accepted", or signed off on, by the Product Owner as soon as the dev team thinks they are complete: within one day at most, and preferably within one hour.
All Stories Sized to 2-3 days
All stories are sized to 2-3 days. This means all stories larger than 3 days are split into 2-3 day stories. It also means that any stories smaller than 2 days are combined with other stories, keeping sizes fairly constant at 2-3 days. This practice requires pretty advanced story splitting skills, as well as the ability to keep an eye on the "bigger picture" as you aggregate several small related stories (sometimes called a Theme) into a logical and shippable feature. The 2-3 days here means it would take a person (or pair, if you're pair programming) 2-3 days to fully implement, fully test, and get the story accepted by the PO. Said another way, 2-3 ideal person days (or 2-3 ideal pair days if you're pair programming). As an aside, some teams abandon Story Points once they reach this point, because all stories are very close in size. Instead of estimating stories, they just count the number of stories, and use that count to measure and forecast velocity.
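
As a hypothetical illustration of the "count the stories" approach just described, here is a tiny sketch. The sprint history and backlog count below are made up for illustration only.

```python
import math

# Hypothetical sprint history: number of 2-3 day stories accepted per sprint
# (all numbers below are made up for illustration).
stories_accepted_per_sprint = [9, 11, 10]
remaining_stories_in_release = 42

# Velocity is simply the average count of accepted stories per sprint.
velocity = sum(stories_accepted_per_sprint) / len(stories_accepted_per_sprint)

# Forecast: roughly how many more sprints until the release backlog is done?
sprints_remaining = math.ceil(remaining_stories_in_release / velocity)
print(f"Velocity ~{velocity:.1f} stories/sprint; about {sprints_remaining} sprints to go.")
```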
Story Tests 90+% automated (for all current stories)
90+% of all story tests for current stories are automated. This can be difficult for many teams to get to, but it is possible. Note that we don't say how they're automated here, just that they are automated. The best method of automation will depend on a lot of team-specific factors.
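
To make the distinction between defining and automating story tests concrete, here is a hedged sketch of one automated story test using pytest. The story, the register_user() function, and its behavior are entirely hypothetical; the point is only that a Confirmation agreed on with the PO ends up as an executable test.

```python
# A hypothetical story: "As a new visitor, I can register with my email address
# so that I can access my order history."
#
# One of its Confirmations (Story Tests), defined with the PO before development:
#   "Registering with an email address that is already taken is rejected."
#
# Below, that confirmation is automated as a pytest test against a hypothetical
# register_user() function. Names and behavior are illustrative only.

import pytest

class DuplicateEmailError(Exception):
    pass

_registered = set()

def register_user(email: str) -> None:
    """Hypothetical system-under-development code."""
    if email in _registered:
        raise DuplicateEmailError(email)
    _registered.add(email)

def test_duplicate_email_is_rejected():
    register_user("pat@example.com")
    with pytest.raises(DuplicateEmailError):
        register_user("pat@example.com")
```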
