
Why velocity burndowns can be poor predictors for agile projects

Johanna Rothman, President, Rothman Consulting Group, Inc.

Does your agile team review its velocity burndown to predict what it can do in the next iteration? Many teams do, and sometimes they encounter problems.

This happens for a variety of reasons: The stories are not roughly the same size, managers don't know what velocity really means, and velocity doesn't show the true project state. Fortunately, there are other measures that can show your project's real progress.

 

Velocity is a capacity measure

One of the problems with velocity is that it is a measure of capacity, but it sounds as if it measures speed.

When teams calculate velocity (either as a burndown or a burnup), they sum the story points for the work they complete in a given iteration. That total is a rough approximation of capacity.

I call it a rough approximation because, unless your team works on one-day stories, it has variations in the number of points a story is worth. If your stories are small, the variation doesn’t matter so much. But when the variation is large, teams encounter problems.
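As a quick illustration of how story-size variation undermines the measure, here's a minimal sketch of the velocity calculation in Python. The numbers are made up: two iterations with similar totals, one built from small stories and one from large, widely varying stories.

```python
# Velocity: the sum of story points the team completed in an iteration.
# Illustrative data only; a real team would pull this from its tracking tool.
iterations = [
    [3, 2, 5, 1, 3, 2, 8, 3, 5, 8],  # small stories, sizes cluster together
    [13, 5, 20],                     # large stories, sizes vary widely
]

for number, stories in enumerate(iterations, start=1):
    velocity = sum(stories)
    print(f"Iteration {number}: velocity = {velocity} points "
          f"({len(stories)} stories, sizes {min(stories)}-{max(stories)})")
```

The totals look comparable, but the second iteration's velocity rests on only three estimates, so one badly estimated story swings the whole number.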

One team that had this problem used relative estimation of story points for deciding on story sizes. It worked with its product owner to create these relatively small stories. It had a pretty stable velocity of about 40 points per iteration.

Then the product owner changed roles in the company, and a new product owner, Dan, started to work with the team. Dan had to wind down his work with his previous team, which took several weeks, and meanwhile, he had to be a product owner for two teams, working on totally different products.

Since Dan was new to this product, he didn’t know how to make the stories as small as the previous product owner had done. And he didn’t have the time to workshop stories with both teams. So Dan’s stories were larger than the team was used to.

Instead of stories of sizes 1, 2, or 3 (in a Fibonacci sequence), Dan’s stories ranged from 5 to 20 points. The team had trouble using velocity as a measure because story sizes had changed so much. Even if it accepted only four 5-point stories (half of its usual 40 points per iteration), the team had trouble finishing the work.

That's because the estimates were accurate but not precise. With stories that much larger, a point no longer represented a consistent unit of measure from story to story, so the estimation variation was too large.

The team realized that its velocity was no longer stable and that it could not use it as a predictor of project progress. So it started to use other measures. Here's how it worked out.

Show working product

The team started to demo every day, to each other and to Dan, to show its progress. The demos had a couple of terrific side effects: Dan had to be present and accept or reject stories, and he learned about the team's product much faster than he otherwise would have.

Demos are terrific when the people who need to see them actually attend. Unfortunately, the management team was not willing to. Instead, the team created a product dashboard so managers could see what the team had completed and what still remained to be done.

Show the product backlog burnup chart

One of the team’s managers was famous for asking, “Are you done yet?” When the team showed the manager the product backlog burnup chart, the manager had more questions.  

This chart helped the manager realize that as long as people kept adding features, the team would not be done. Yes, the team made progress, but someone kept adding features that kept the finish line out of sight. It turned out to be several someones: Dan was not yet sure of himself on this particular product, so he accepted too many requests for the project.
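If you want to sketch a similar product backlog burnup for your own project, here's a minimal example in Python with matplotlib. The story counts are invented for illustration; real numbers would come from your backlog tool.

```python
import matplotlib.pyplot as plt

# Illustrative data: cumulative story counts per iteration.
iterations    = list(range(1, 9))
total_stories = [50, 52, 55, 60, 64, 70, 75, 80]  # scope keeps growing
done_stories  = [ 5, 11, 17, 24, 30, 36, 41, 47]  # cumulative completed

plt.plot(iterations, total_stories, marker="o", label="Total stories in backlog")
plt.plot(iterations, done_stories, marker="o", label="Stories complete")
plt.xlabel("Iteration")
plt.ylabel("Number of stories")
plt.title("Product backlog burnup")
plt.legend()
plt.show()
```

When the two lines never converge, the chart answers the manager's question by itself: the team isn't done because the backlog keeps growing.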

Show the feature growth chart

The team thought that was a great chart, but a different manager wanted to know how much the team had completed compared to all the work. The team used a product feature chart to show that manager the overall feature growth (the green line), the progress against the features (the red line), and the remaining features (the blue line).  

Clearly, the “Features Complete” line doesn’t show the entire picture. 

You might be wondering how to count the features here and in the product backlog burnup chart. I count the number of stories. That’s it. The reason I count the stories is that customers only buy or use stories. Customers don’t use points. Points are activity. Stories have value.
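Here's a matching sketch of a feature growth chart, again with invented counts. It plots the three lines the team used, counting stories rather than points.

```python
import matplotlib.pyplot as plt

# Illustrative story (feature) counts per iteration.
iterations = list(range(1, 9))
total      = [50, 52, 55, 60, 64, 70, 75, 80]          # overall feature growth
complete   = [ 5, 11, 17, 24, 30, 36, 41, 47]          # features complete
remaining  = [t - c for t, c in zip(total, complete)]  # features remaining

plt.plot(iterations, total, color="green", label="Total features")
plt.plot(iterations, complete, color="red", label="Features complete")
plt.plot(iterations, remaining, color="blue", label="Features remaining")
plt.xlabel("Iteration")
plt.ylabel("Number of stories")
plt.title("Feature growth")
plt.legend()
plt.show()
```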

Show cumulative flow

I like to know whether anyone on the team is ahead of or behind the rest of the team. Cumulative flow can show you that.  

This chart shows that the developers and testers are staying close to finishing work without too much delay between development and testing. However, there is a fair amount of work in the "ready" and "analysis" states. This particular team had a product owner who liked to have a "ready" backlog that was several months ahead of the team. One of the problems with that was that the team discussed work (in "analysis") long before it ever took any of that work. Once the team and the product owner saw this chart, the product owner realized that his prep might not be what the team needed.
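One simple way to sketch a cumulative flow diagram is to plot, for each state, the cumulative number of stories that have reached that state; the vertical gap between adjacent lines is the work sitting in a state. The data below is invented to mirror the situation described: a large buffer in "ready" and "analysis," and little delay between development and testing.

```python
import matplotlib.pyplot as plt

# Illustrative cumulative story counts, sampled once per iteration.
iterations = list(range(1, 9))
states = {
    "Ready":    [20, 26, 32, 38, 44, 50, 56, 62],
    "Analysis": [14, 19, 24, 30, 36, 42, 47, 52],
    "In dev":   [ 6, 11, 16, 22, 28, 34, 39, 44],
    "In test":  [ 4,  9, 14, 20, 26, 31, 36, 41],
    "Done":     [ 3,  8, 13, 18, 24, 29, 34, 39],
}

for label, series in states.items():
    plt.plot(iterations, series, label=label)

plt.xlabel("Iteration")
plt.ylabel("Cumulative stories")
plt.title("Cumulative flow")
plt.legend()
plt.show()
```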

Show multitasking requests

Sometimes, teams have trouble making progress because someone asks a team member to do something on another project. That was certainly the case for Dan. I have found that when a team creates a multitasking chart like this one, the managers understand why the team is not making the progress everyone expected.
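Here is one plausible way to build such a chart, assuming the team records how many team-member days per iteration get pulled onto other projects; the numbers are illustrative only.

```python
import matplotlib.pyplot as plt

# Illustrative data: where the team's days went each iteration.
iterations   = ["Iter 1", "Iter 2", "Iter 3", "Iter 4"]
this_project = [9, 7, 6, 5]  # days spent on this project's stories
other_work   = [1, 3, 4, 5]  # days pulled onto other projects

plt.bar(iterations, this_project, label="This project")
plt.bar(iterations, other_work, bottom=this_project, label="Other projects")
plt.ylabel("Team-member days per iteration")
plt.title("Multitasking requests")
plt.legend()
plt.show()
```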

Show what’s done but not released

Sometimes, teams release partially complete work, often behind flags. Sometimes, the team waits for customer feedback. Sometimes, the team has to wait for another team to deploy the work. Sometimes, it’s not safe for a team to release partial work—the customers need the entire feature set. In that case, it’s useful to see the state of all the work.     

This particular team has a lot of work at the “waiting for release” status. Is it literally work in progress? It is. The WIP might not be under this team’s control, and it’s useful to know what the work is and why it’s not released.
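One simple way to see that state, assuming you track a status per story, is to tally stories by status; the statuses and counts below are illustrative only.

```python
from collections import Counter

# Illustrative story statuses. "Waiting for release" is finished work the
# team cannot ship yet, so it still counts as work in progress.
story_status = [
    "in dev", "in dev", "in test",
    "waiting for release", "waiting for release", "waiting for release",
    "waiting for release", "waiting for release",
    "released", "released", "released",
]

for status, count in Counter(story_status).most_common():
    print(f"{status:>20}: {count}")
```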

Velocity is personal and can change

Velocity is personal to a team. A team’s velocity is based on its familiarity with the code, the amount of automated (and useful) testing the team has to support its work, and the story size. If the stories become too large, the team’s estimation variance will show itself in a varying velocity.

If you need to show your project’s progress, use other measures. I’ve never been a fan of traffic lights, because they don’t have enough granularity to show the real project state. And the cynical part of me says that all projects are "red" until proved otherwise.

You can show your project’s progress with a product backlog burnup chart and a feature chart. A team can monitor its cumulative flow to make sure the team is in that sweet spot of looking just far enough ahead and making sure no one is behind. And when you report multitasking data and what’s not released to your management team, it can help you remove obstacles that prevent you from finishing your project.

Velocity is a useful measure of capacity. Use other data to show your project’s progress.

All images are from Johanna’s upcoming book: Create Your Successful Agile Project: Collaborate, Measure, Estimate, Deliver.

 
