Many developers are opposed to the idea of capturing metrics and being compared to their teammates or to other teams. Metrics are nonetheless an important part of continuous improvement: if you want to know whether you are actually improving, you need some kind of indicator telling you how you are doing. You also can’t escape the fact that management wants metrics to make informed decisions.
A good metric first of all needs to be simple – not just simple to understand, but also simple to collect. If a developer has to go through a tedious manual process to collect data for your metrics, the metric will probably do more harm than good. As far as possible, find a way to automate data collection. For most teams there’s a good chance your agile project management tool already captures the data you need, and it may even track the metric you are looking for out of the box. If the metric isn’t tracked in the tool, you can probably still query the collected data through the tool’s API. Having a decent tool matters for keeping metrics simple.
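As a sketch of what working with such an API could look like, here is a small Python example that parses a hypothetical JSON response and derives an average cycle time. The field names (`key`, `started`, `done`) are invented for the illustration – real tools like Jira or Azure DevOps use their own schemas.

```python
import json
from datetime import datetime

# Hypothetical payload shaped like what an agile tool's REST API might
# return for completed issues; field names are made up for this sketch.
payload = json.loads("""
[
  {"key": "PROJ-1", "started": "2024-03-01", "done": "2024-03-05"},
  {"key": "PROJ-2", "started": "2024-03-02", "done": "2024-03-08"}
]
""")

def cycle_time_days(issue):
    """Days from work started to work done for one issue."""
    fmt = "%Y-%m-%d"
    started = datetime.strptime(issue["started"], fmt)
    done = datetime.strptime(issue["done"], fmt)
    return (done - started).days

times = [cycle_time_days(issue) for issue in payload]
average = sum(times) / len(times)
print(average)  # average cycle time in days
```

The point is that once the data is reachable programmatically, collecting the metric costs the team nothing per sprint.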
Any metric you choose should also be comparable – that is, comparable with itself over time. Consistency is important: a metric that isn’t comparable with itself over time isn’t going to help you much, so don’t bother collecting data for it. An important aspect of metrics is that we aren’t necessarily focusing on a single data point, but rather on the trend. Code coverage, for example, doesn’t provide much value if you only measure it once. Yes, you get a percentage, but you don’t know whether it is going up or down – and that is what we want to know. We want trends, and to see trends we need metrics that are comparable with themselves over time. Having said that, it’s important to avoid common pitfalls such as comparing velocity across teams. Yes, velocity is comparable with itself over time – but only within a team. Don’t compare velocity across teams; just don’t do it. In fact, try to avoid comparing metrics across teams altogether. Instead, look at each team’s trends as indicators of potential problem areas.
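As a minimal illustration of why the trend matters more than any single data point, here is a Python sketch that classifies a coverage trend by comparing the last few sprints to the ones before them. The coverage numbers are invented for the example.

```python
# Made-up code coverage percentages, one per sprint, oldest first.
coverage_by_sprint = [70.0, 71.0, 72.5, 72.0, 74.0, 75.5]

def trend(values, window=3):
    """Compare the mean of the last `window` values to the window
    before it; needs at least 2 * window data points."""
    recent = sum(values[-window:]) / window
    earlier = sum(values[-2 * window:-window]) / window
    if recent > earlier:
        return "improving"
    if recent < earlier:
        return "declining"
    return "flat"

print(trend(coverage_by_sprint))
```

A single reading of 72% tells you nothing; six consistent readings tell you which way you are heading.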
A good metric must provide information that can be acted on – that is, it should be actionable. A clear plan of action and a clear causal relationship are key to successful metrics tracking. A metric that merely counts finished actions, with no connection to ongoing activity, is not that interesting. Metrics should, as mentioned, be read as trends and must trigger appropriate actions. The issue with many data points is that they are lagging indicators: they show what happened in the past. A metric that, when analyzed, can forecast the direction future actions should take is called a leading indicator.
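One simple way to turn lagging data points into a leading signal is to look at their slope. The sketch below fits a least-squares line to made-up per-sprint escaped-defect counts; a clearly positive slope is an early warning worth acting on, even though each individual count only describes the past.

```python
def slope(values):
    """Least-squares slope of values over their index (sprint number)."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    numerator = sum((x - mean_x) * (y - mean_y)
                    for x, y in enumerate(values))
    denominator = sum((x - mean_x) ** 2 for x in range(n))
    return numerator / denominator

# Invented escaped-defect counts per sprint, oldest first.
defects = [2, 3, 3, 5, 6]
print(slope(defects))  # defects are trending up by about one per sprint
```

The raw counts are lagging; the fitted direction is what lets the team act before the trend gets worse.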
Another key consideration is that management will both track and use the metrics. Metrics should therefore be honest – or, put differently, you should be honest about how management uses them. The metrics need to be aligned with the business, so that when a metric trends upward (if that’s the direction you want it to go), business value is also moving in the right direction. Finding truly honest metrics can be really hard. Take lines of code as a measure of productivity: LOC is a terrible indicator because it is easy to pad the code with extra lines if individual performance is measured by how much code a developer writes. Code coverage is just as easily manipulated – simply leave out the assert statements. Velocity too: a team under pressure to “work faster” can inflate its story point estimates to please management. Honest metrics are hard to find when management uses metrics the wrong way. If metrics are instead used as probes – indicators of pain points and starting points for team discussions – the incentive to manipulate them probably goes away. This means management has to understand the limitations of metrics, which might be easier said than done, right?
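To make the coverage-gaming point concrete, here is a contrived Python example. The assert-free test executes every line of the function, so a coverage tool reports the code as fully covered, even though the test verifies nothing and would pass against a broken implementation.

```python
def discounted_price(price, rate):
    """Apply a percentage discount to a price."""
    return price * (1 - rate)

def test_discount_no_assert():
    # Executes every line of discounted_price -> 100% coverage,
    # but passes no matter what the function returns.
    discounted_price(100.0, 0.2)

def test_discount_honest():
    # Actually checks the behavior, not just that the code ran.
    assert discounted_price(100.0, 0.2) == 80.0
```

Coverage measures which lines ran, not whether anything was checked – which is exactly why it is so easy to game when it becomes a target.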
So which metrics should you use? I don’t need to go too deep into that, seeing as Ron Jeffries has already done an excellent job of describing his opinions on metrics. Just remember: you get what you measure.