Optimizing your team’s velocity (of learning)

Posted on May 8, 2017.
Here is what people are saying about Sense & Respond:
“some great case studies and an extraordinarily clear case for the application of lean/agile methods across business. I’ll certainly be recommending Sense & Respond to every manager I know.”
Don’t have a copy yet?
Grab a copy here and buy one for your boss too.

This post was originally published to my newsletter subscribers (40k of them now). If you’d like to get these updates via email, sign up here.

“Fail fast, fail often” or, as I prefer to say it, “learn fast, learn often” is the mantra du jour in organizations embracing digital transformation and adopting agile and lean practices. Yet, as we covered last week, if you’re not managing to outcomes, building in that learning work is going to be tough. There’s another reason why teams struggle to build continuous learning cadences: they are being managed to increase the velocity of delivery.

The velocity of delivery is an Agile term used, incorrectly in many cases, as a measure of a team’s productivity. Different teams’ velocities (usually measured as points completed per sprint) are compared against each other as if there were some kind of defined standard for what a “point” is and what a “good” velocity should be (hint: there isn’t). With incentives tied to velocity improvement, teams optimize the production of software rather than the quality of the customer experience. Changing the incentive to outcomes is one way to alleviate this pressure. However, many mature teams that consider outcomes as one definition of success are still measuring velocity of delivery.

I propose that teams that can’t wean themselves off velocity as a measure of success add a second velocity metric: the velocity of learning. In the same way the team optimizes how much it delivers, the velocity of learning asks the team to measure how much discovery work is taking place in each sprint.

IMPORTANT: It’s critical to point out here that measuring how much discovery work is happening is a vanity metric*. While I don’t want to promote the use of vanity metrics (i.e., measures of activity that make us feel good but don’t actually tell us whether we’re making progress), without some kind of incentive to do the discovery work, the team will always optimize for delivery.

The unit of velocity of learning can be whatever makes sense to your team. For example, if you’re estimating with points (and I am NOT opening the estimation can of worms here), it can certainly be the number of points dedicated to discovery. If you’re counting completed stories, it can be the number of discovery or experiment stories the team completed during the iteration. Whatever your unit of measure may be, velocity of learning helps ensure that some kind of discovery work is taking place (a sketch of the bookkeeping follows below). Our goal, ultimately, is not to quantify the discovery work but to get teams into the habit of doing it. Once this becomes a regular part of the team’s daily practice, the metric can be removed.
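If it helps to see the bookkeeping, here’s a minimal sketch in Python. Everything in it is illustrative: the Story structure, the “discovery”/“delivery” labels, and the point values are my assumptions, not a prescription for how your tracker should model the work.

```python
# Minimal sketch: counting delivery and learning velocity per sprint.
# The Story shape and the "discovery"/"delivery" labels are hypothetical.
from dataclasses import dataclass

@dataclass
class Story:
    points: int
    kind: str  # "delivery" or "discovery" (illustrative labels)

def sprint_velocities(stories):
    """Return (delivery_velocity, learning_velocity) in points."""
    delivery = sum(s.points for s in stories if s.kind == "delivery")
    learning = sum(s.points for s in stories if s.kind == "discovery")
    return delivery, learning

# Example: one two-week sprint with a mix of work
sprint = [Story(5, "delivery"), Story(3, "delivery"),
          Story(2, "discovery"), Story(1, "discovery")]
print(sprint_velocities(sprint))  # (8, 3)
```

If your team counts stories rather than points, it’s the same loop with 1 in place of s.points.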

Now, here’s the catch. Iterations are finite units of time (e.g., one week, two weeks). There is only so much work of any kind that can fit into this defined time box. As your velocity of learning increases, the velocity of delivery decreases. There is no way for one team to maintain a consistent velocity of delivery and build up its discovery activities at the same time. This is the tradeoff. Doing discovery work means doing LESS delivery work. Teams and their managers will need to accept that there is value in discovery work, and that this value is worth the delivery work they give up to get it. The most successful teams find the right balance between their discovery and delivery work and then use the learning to determine how best to proceed within each iteration.
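The arithmetic here is zero-sum, and it’s worth seeing plainly. Continuing the illustrative Python from above (the capacity number is made up):

```python
SPRINT_CAPACITY = 20  # illustrative: total points this team completes per sprint

def delivery_points(discovery_points, capacity=SPRINT_CAPACITY):
    """In a fixed timebox, every discovery point comes out of delivery."""
    return capacity - discovery_points

for discovery in (0, 4, 8):
    print(discovery, "discovery ->", delivery_points(discovery), "delivery")
# 0 -> 20, 4 -> 16, 8 -> 12: learning velocity rises, delivery velocity falls
```

There is no value of discovery that raises one number without lowering the other. That’s the tradeoff in one line.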

Do you measure your learning velocity? If yes, how? If not, why not? Hit reply and let me know.

[Jeff]
@jboogie
jeff@gothelf.co

(*) Here’s why the velocity of learning is ultimately a vanity metric. Teams can run insightful, well-executed experiments. They can conduct customer interviews regularly. They can build remarkable prototypes. None of those activities actually guarantees the team will learn anything. The quantity of activity doesn’t mean the team is getting smarter or putting that insight to good use. The best teams understand which discovery tools to use and when, and, most importantly, how to leverage the data they collect. That is the ultimate measure of the value of discovery work, but it’s much harder to quantify.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Book News

Sense & Respond continues its 100% 5-star review streak. Once you’ve had a chance to read it, we’d be grateful for your review on Amazon.

Research Archive Made Public: Head on over here and find over 1000 links to primary and secondary research we collected over the 2 years we spent writing Sense & Respond.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Upcoming Events

“Dude. You’ve got the whole company on fire. Even one of the accountants said to me, ‘great speaker!’ :)” — one of my workshop clients from last week.

Barcelona — May 31, 2017: 1-day Lean UX in the Enterprise Public Workshop
This is my first time working in Barcelona. If you’re not in Spain, it’s a short hop from pretty much anywhere in Europe and you can’t beat the weather and food. Also, we’re 2/3 sold out. Don’t sleep on this one. Did I mention this was in Barcelona? You know you want to go.

New York City — June 20–21: 2-day Certified Scrum Product Owner Course with Jeff Patton
This is our 3rd time teaching this course in NYC. It has sold out weeks in advance both previous times. Don’t wait. Early bird tickets are now gone and we’re over halfway sold out.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

As always, if you want me to work directly with your company on training, coaching, or workshops on the topics of organizational agility, digital transformation, product discovery, and agile leadership, don’t hesitate to reach out.