Performance Measurement: When Good Data Goes Bad


Were you thrilled with the lower price you got by waiting to buy your iPhone… until you found out the hot new 3G would be coming out at the lowest price yet, only $199? Have you ever rushed through dinner to join a snaking line in front of the movie theater, only to find there was a second, less-crowded showing of your movie starting 20 minutes later? What about that great sale on water heaters you saw – the week before yours gave out? Making good decisions that take advantage of opportunities requires more than getting the right kind of information. What value is good data at the wrong time?

Designing your performance measurement system to provide good data with meaningful context at the right time does more than simply provide great information. It makes that information useful and easy to absorb. So what do we mean by context?

  • When do you need to see the information to identify opportunities and make changes?
  • How do you need to see the information? Do other elements need to accompany your data to make it meaningful?

So how can you keep your good data from going bad? Consider the following five questions.

  1. How should the information be grouped?

Chunking information around a few basic concepts allows you to consider all of the information in context at the same time. This means you can spot patterns and relationships more readily, and develop better intuition about root causes and opportunities. Let’s test the power of ‘chunking’. Look, for as long as you like, at the letters below. Then look away and see if you can write them all down flawlessly.

Did you spot the pattern? Did you wonder why three of those letters are included? Or are you still just trying to remember them all?

Now let’s see what happens when you have the same letters, but with context. To keep this exercise realistic, we’ve placed the image at the end of this article.

Now are you able to look away and record the series flawlessly? In this example, most people find it much easier to remember the letters, which means now they can focus on the meaning instead.

  2. Is your reporting actionable?

A report showing declining revenue doesn’t tell you:

    • Whether revenue typically declines during this time of year
    • Where you thought revenue would be, compared to where it is
    • Whether any changes have occurred in marketing or sales that may have caused the shift
    • Whether the change is due to a particular customer, product, lower prices, or lower volumes
    • Whether the revenue decrease is offset by decreased costs.

If you chose to report the changes in profitability within the context of what’s happening in your top 10% of customers or products, however, you’d target the discussion to the few elements that make up 80% – 90% of the results. Add some trend and target lines, and your opportunities can become crystal clear, rather than clear as mud.
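To make the “top 10% of customers” idea concrete, here’s a minimal sketch in Python. The customer names and revenue figures are entirely invented for illustration; the point is only the mechanics of ranking customers and seeing what share of results the top slice drives:

```python
# Hypothetical revenue-by-customer figures (names and numbers are invented).
revenue = {
    "Acme": 420_000, "Beta Corp": 310_000, "Gamma LLC": 95_000,
    "Delta Inc": 60_000, "Epsilon": 40_000, "Zeta": 25_000,
    "Eta": 20_000, "Theta": 15_000, "Iota": 10_000, "Kappa": 5_000,
}

# Rank customers by revenue, largest first.
ranked = sorted(revenue.items(), key=lambda kv: kv[1], reverse=True)
total = sum(revenue.values())

# Share of total revenue contributed by the top 10% of customers
# (with ten customers, that's the single largest one).
top_count = max(1, len(ranked) // 10)
top_share = sum(v for _, v in ranked[:top_count]) / total
print(f"Top {top_count} customer(s) drive {top_share:.0%} of revenue")
```

In real data the concentration is often even stronger than in this toy set, which is exactly why reporting on the top slice first targets the discussion where it matters.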

  3. Are you reflecting what’s meaningful? Or just what’s easy?

Did you know that the Yankee Group estimates 40% – 80% of new business leads are lost, not followed up, or otherwise dropped? How much business is that? Yet when you set out to measure specifics, you find it’s not always easy to measure net close rates attributable to a specific campaign – particularly when you haven’t defined the context that gives those numbers meaning. If people are spending less time solving the problem than they spend arguing about which version of “the truth” is right, it’s time for a time-out.

Which “truth” you select usually isn’t nearly as important as simply selecting one, and insisting that the one you’ve chosen measures only the final collaborative outcome. It will transform the level of attention your teams give to resolving those communication and process issues.

Whether it’s collaboration, innovation, trust or expertise, don’t shy away from measuring more nebulous concepts that are crucial to your success. In most cases, the trend is more important than the absolute measurement. Provide the appropriate context to interpret and act prudently, and then measure away. Chances are, you’ll see significant improvements simply from providing a clear and public definition of what you value and the results you expect it will create.

  4. Are you distinguishing trends from facts, or just massaging the facts?

If you’re really measuring what matters, some measurements won’t be absolute and objective. For softer information, keep your presentation focused on trends, not exact values. At the same time, be alert for attempts to muddy the facts when results such as profitability aren’t favorable.

If profit from your top customers is dwindling, for example, changing the overhead allocation will only divert resources, not increase profits. Be open to changes that make your decision-making more effective, but stand firm against tweaks that change the look of the trends without changing the facts.

  5. What other contextual elements will make your measures most valuable? And more meaningful?

We work in a wide variety of business types, and what provides context for one business is simply distracting chatter for another. Our advice: think about the decisions you can make about people, processes, products and customers.

For example, let’s say that you think you’ve started to see your sales pulling back, perhaps from a weakening economy. But your sales typically don’t occur in a straight line. So what you see when you look at your sales trend is this:

Are sales going up? Or down? You may see a weekly pattern in this example, but it’s difficult to make out more of a trend than lows going lower and highs rising higher. Without more context it’s difficult to understand what’s going on, and even more difficult to determine what you might want to do about it.

What happens if we add a little more context? Let’s try looking at it within the context of a weekly cycle.


This picture makes it easier to see what’s changing. With so many lines, however, it’s beginning to look like spaghetti. Imagine if you wanted to look back over the last quarter rather than just eight weeks. Would this chart be able to provide much context? Would it give you any intuitive sense of what might be the cause of those lower lows, or the opportunity you might be able to take advantage of with those higher highs?

Let’s see if a cycle plot can add the context we’re missing. Again, this is the same data, now shown in a different context.


Now is it easier to tell what’s happening? Is your mind filling with ideas about what you might check? For example, is that new part-time salesperson having an impact on sales? Or are you running out by Tuesday of a key product that typically arrives Wednesday morning? And to return to that earlier question: would a full quarter’s data be just as easy to understand as these eight weeks? Might it even add more value?
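If you’d like to try this yourself, here’s a minimal sketch of the data reshaping behind a cycle plot, using invented daily sales figures (Monday through Friday over eight weeks – not the actual data behind the charts above). The whole trick is to regroup the series by its position in the weekly cycle, so the trend within each weekday stands on its own:

```python
from collections import defaultdict

# Hypothetical eight weeks of daily sales (Mon-Fri), flattened week by week.
daily_sales = [
    120, 95, 90, 130, 150,   # week 1
    118, 92, 88, 128, 152,   # week 2
    115, 90, 85, 131, 155,   # week 3
    114, 88, 83, 129, 158,   # week 4
    112, 86, 80, 132, 160,   # week 5
    110, 85, 78, 130, 162,   # week 6
    108, 83, 75, 133, 165,   # week 7
    107, 82, 73, 131, 167,   # week 8
]

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]

# A cycle plot groups the series by position in the cycle: one mini-series
# per weekday, so the trend *within* each weekday becomes visible.
by_weekday = defaultdict(list)
for i, sales in enumerate(daily_sales):
    by_weekday[WEEKDAYS[i % 5]].append(sales)

for day in WEEKDAYS:
    series = by_weekday[day]
    change = series[-1] - series[0]
    print(f"{day}: {series}  (net change over 8 weeks: {change:+d})")
```

Even without drawing the chart, the regrouped numbers tell the cycle plot’s story: the early-week lows are drifting lower while the Friday highs climb higher – exactly the pattern that gets lost in a single flat time series.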

Compare again the first and last pictures in this series. Can you see how even simple contextual changes such as this will help you make better decisions, faster, and less painfully?

Also consider the value you might gain by presenting the answers to some of your earlier questions as contextual elements in the same picture. For example, looking at coverage ratios such as average sales per person might reveal under-staffing on Tuesdays, while Mondays and Wednesdays might reveal the secret to just the right amount of coverage.

If you’re interested in finding more ideas for providing context to your performance measurement, try this list to jump-start your thinking:

    • Review Frequency. Should sales or production data be reviewed weekly or even daily? Or should you focus on staff training or a marketing campaign that will take a quarter or more to play out? Different elements are meaningful at different reporting frequencies.

Magazines have editorial calendars; we recommend your performance measurement system have a calendar too. Don’t waste time preparing and reviewing information just so the report looks the same each time. Your team will understand weekly, monthly, and quarterly schedules.

    • Is your information for learning or for monitoring? For example, are you split-testing a marketing concept, a management approach, or a financial investment to see which works better? Are you learning whether a new idea or process is viable, or more successful than the old way? Or are you monitoring for problems with service, product defects, or unintended consequences of the new sales compensation plan? Match your measurements and presentation to what you need to know.
    • Are relationships more important than measurements? If you need to see how differing inputs change results, don’t force yourself to use your imagination. Show it all in a single picture, then use your energy to figure out what to do about it instead.
    • What should you compare it to? Planned results, planned inputs, or what you forecast as results for that level of input? For example, if one 727 has 10% lower fuel efficiency for the same weight as another, will United want to pull that plane in to improve its performance? Or if your sales team has 1,800 leads, will you be more interested in knowing that you closed 215 of them, or that your close rate increased 7.5% over the typical 200 you’ve closed in the past?
    • Would it be more meaningful to show relationships to sister elements? In the sales lead example above, if you’ve closed 7.5% more sales, is it important also to show that sales shifted to products with lower profitability? In a changing economy, product preferences often change, which often changes your productivity and efficiency priorities, too.
    • Is it useful to show projected results? In sales, showing projected results from your current pipeline of opportunities can provide early-warning and manage the shift of work between building relationships and closing deals.
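The lead-closing comparison above comes down to the same three numbers framed two ways – a raw close rate versus an improvement against your typical baseline. Here’s a quick sketch, using the figures from the example:

```python
# Lead-conversion figures from the example above.
leads = 1800
closed = 215
typical_closed = 200  # your historical baseline for a comparable period

# Framing 1: the raw outcome, with no baseline.
close_rate = closed / leads

# Framing 2: the same result, compared to what you typically achieve.
improvement = (closed - typical_closed) / typical_closed

print(f"Close rate: {close_rate:.1%}")
print(f"Improvement vs. typical: {improvement:.1%}")
```

The first number answers “how did we do?”; the second answers “are we getting better?” – and it’s usually the second framing that points to an action.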

Vetting your measures for context can make the difference between a performance measurement system that’s interesting, and one that drives results. Will you make the investment to improve your results?

Letters with Context:

Now are you able to look away and write the series of letters flawlessly? Did you experience a sense of relief as you were able to let go of trying to decrypt the message and focus on the meaning instead? And why is IBM the only for-profit on this list?

Note: We’d like to credit Naomi Robbins, Ph.D. for the cycle plot demonstration. See Naomi’s web site for a tutorial on creating your own cycle plots in Excel.