In my previous blog post about the “Real Danger of Quick-and-Dirty Programming”, I criticized some Agile practices such as measuring Velocity and drawing Burndown charts. In this post I would like to extend and clarify that criticism, and bring in some very relevant references.
Here is the definition of Velocity from Wikipedia:
“Velocity is a capacity planning tool sometimes used in Agile software development. The velocity is calculated by counting the number of units of work completed in a certain interval, the length of which is determined at the start of the project. The main idea behind Velocity is to help teams estimate how much work they can complete in a given time period based on how quickly similar work was previously completed.”
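To make the definition concrete, here is a minimal sketch of that calculation, with hypothetical sprint data and a function name of my own choosing:

```python
# Minimal sketch of how Velocity is typically computed (hypothetical data).
def velocity(completed_points_per_sprint):
    """Average number of story points completed per sprint."""
    return sum(completed_points_per_sprint) / len(completed_points_per_sprint)

history = [21, 24, 19, 22]  # story points completed in the last four sprints
avg = velocity(history)
print(avg)  # 21.5
```

The average is then used to estimate how much similar work the team can take on in the next interval.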
In my opinion, the main problem with Velocity arises when it becomes a goal in itself. When software development teams are focused on keeping or increasing their Velocity, they will be ready to make sacrifices.
Having Velocity as a goal introduces trade-offs:
- Should we make it faster or should we make it better?
- Do we have enough time for Refactoring?
- Why not accumulate some Technical Debt to increase our Velocity?
In contrast to the Agile term, here is the definition of the physical concept of velocity, also from Wikipedia:
“Velocity is a physical vector quantity; both magnitude and direction are needed to define it. The scalar absolute value (magnitude) of velocity is called ‘speed’, a quantity that is measured in metres per second (m/s) in the SI (metric) system.”
Thus, apparently, what we call Velocity in Agile should actually be called Speed, because there is no clear definition of the “direction” dimension. Of course, any User Story that is implemented should satisfy some basic quality attributes such as Correctness. But what about other desirable quality attributes, both functional and non-functional, such as low coupling, high cohesion, efficiency and robustness? Who is measuring them? Who is tracking them? Unfortunately, I have never heard of an Agile team drawing charts to track the average Cyclomatic Complexity of their code.
When the only thing you are really measuring is your Speed, you will attempt anything to “make progress” as fast as possible. Anything that causes you to move slower may be considered a burden. Including improving your code. Including Refactoring. Including avoiding Technical Debt.
To conclude this article, here are two opinions about Velocity from people we should listen to.
Sacrificing Quality For Predictability by Joshua Kerievsky
“Technical practices like test-driven development and refactoring are often the first things to be dropped when someone is trying to ‘make their estimate.’
We see this behavior emerge from the outset, starting in training: a team that completes all work at the end of a timebox (say, two hours) often does so by sacrificing quality.
For years, we addressed this common problem by helping students experience what it was like to deliver on commitments while also producing quality code.
We explained how great teams ultimately learned to maintain technical excellence while also sustaining a consistent velocity (the same number of story points completed per sprint).
Yet the fact is, such teams are rare and their consistency is often fleeting as their team size changes or other pressures take hold.
For too many years, I thought it was important to maintain a consistent velocity, since it was supposed to be helpful in planning and predicting when a product/system could be completed.
Yet I’ve seen too many teams sacrifice quality and face compounding technical debt solely to make or exceed their estimates and keep their burndown charts looking pretty.”
Velocity is Killing Agility by Jim Highsmith
Jim Highsmith is an executive consultant with ThoughtWorks and author of several books on Agile software development, including “Agile Software Development Ecosystems” and “Agile Project Management: Creating Innovative Products”. Here is an extract from his article “Velocity is Killing Agility!”:
“Over emphasis on velocity causes problems because of its wide use as a productivity measure. The proper use of velocity is as a calibration tool, a way to help do capacity-based planning, as Kent Beck describes in ‘Extreme Programming: Embrace Change‘. Productivity measures in general make little sense in knowledge work—but that’s fodder for another blog. Velocity is also a seductive measure because it’s easy to calculate. Even though story-points per iteration are calculated on the basis of releasable features, velocity at its core is about effort.
While Agile teams try to focus on delivering high value features, they get side-tracked by reporting on productivity. Time after time I hear about comments from managers or product owners, ‘Your velocity fell from 24 last iteration to 21 this time, what’s wrong? Let’s get it back up, or go even higher.’ In this scenario velocity has moved from a useful calibration tool (what is our capacity for the next iteration?) to a performance (productivity) measurement tool. This means that two things are short-changed: the quality of the customer experience (quantity of features over customer experience) and improving the delivery engine (technical quality).”
What do you think? Should we continue measuring our Velocity? Please share your opinions in the comments below.