Performance Measurement: Much More Than Meets The Eye


On Monday, July 14, eBay’s stock began the week at $27.49. On July 15, the front pages of newspapers announced eBay’s crucial legal win over Tiffany’s assertions of trademark infringement. And on July 16, the Wall Street Journal sent a subscriber alert: eBay had posted a 22% gain in second quarter earnings, and 13% growth in its auction business revenue, rather upbeat results in a gloomy market. That Friday, eBay’s stock closed at $23.48.

When eBay had a week of such positive news, why did its stock drop by nearly 15%? Because although eBay's second-quarter results were strong in absolute terms, investors had a different context for interpreting them: they had expected more, and they were uncertain about eBay's future.

In fact, Apple had a similar reception to its quarterly earnings report the week of July 21. Though it posted revenue of $7.5 billion and profit of $1.1 billion, its stock dropped 5% on disappointment about the company’s financial forecast.

Now, we all understand the importance of context. For example, non-profit organizations know we are more likely to support research on a disease affecting 3 million Americans than one affecting only 1% of our population, even though this is the same number of people. They also understand that we’re much more likely to give if they contrast their requested donation with something a bit more frivolous, such as a weekly cup of coffee at Starbucks, or a family evening at the movies. Through arguments of context, we are all swayed.

Although we all understand the importance of context, very few organizations make a conscious effort to ensure that performance measures are presented in an appropriate context. Our minds have limited memory and processing capacity, yet we continually choose, by default, to tax them even more: to remember data, relationships, and context. To the extent that you can relieve your mind of that processing burden, your brain is free to focus on meaning and action. Instead of starting with what it is, you recognize what it means. You move to action.

If you’re interested in performance reporting with the context to move you and your team to profitable action, consider the following tips.

  1. Tables or Graphics? While visual design expert Edward Tufte now argues for data tables when you're analyzing fewer than 100 values, our thinking aligns more closely with that of Stephen Few:

Tables are great for looking up and comparing individual values. They also do the job when precision is needed. They fail miserably, however, when the point is to see relationships, other than between only two individual values at a time.

In our view, the goal of your presentation is to create meaning and illuminate action. If numerical tables might hide relationships, for heaven’s sake, use a graph. Consider this.

Do you see any meaningful patterns in these eight numbers? If you did, it probably took considerable concentration – and you knew a pattern was there to find. What about when you don't know to look for it?

Now take a look at this example.

Here, instead of wondering whether you've captured the meaning, you grasp it immediately. When Few uses this example in his classes, no more than one in 50 students spots this fictitious data's decline in job satisfaction for the one group before seeing the graph.

Remember, you can’t expect everyone on your team to spot relevant relationships and patterns in the mountains of data they review every day. And you don’t need to. Which brings us to our first of two rules of thumb:

Rule of Thumb #1 If you need to look up or compare individual values, tables work best. But when you need to see patterns, exceptions or other relationships, use a graph – no matter how many numbers there are.
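Even the crudest visual beats a row of digits. As a toy illustration (our own hypothetical sketch, not from Few's example), this Python function renders a series as a one-line Unicode sparkline, so a decline like the job-satisfaction pattern jumps out at a glance:

```python
def sparkline(values):
    """Map each value to one of eight block characters, scaled to the
    series' own min and max, so a trend reads as a tiny in-line graph."""
    bars = "▁▂▃▄▅▆▇█"
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(bars[int((v - lo) / span * (len(bars) - 1))] for v in values)

# Hypothetical quarterly job-satisfaction scores for one group:
print(sparkline([75, 74, 72, 68, 63, 55, 48, 40]))  # the decline is unmistakable
```

The same eight numbers in a table would take careful study; as bars, the downward slide registers before you read a single digit.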

  2. Meaning and Distortion. Remember the saying Mark Twain popularized?

There are three kinds of lies: lies, damned lies, and statistics.

While data and visuals can illuminate and inform, they can also distort and misinform. The presenter must be careful to avoid not only intentional bias, but also unintentional bias. Take a look at the tables below.

Unless you've seen Shepard's Tables before, it's unlikely you realized these tabletops are the same size. In fact, many people are so convinced that the tabletops differ in length and width that they get out a ruler to check before accepting that their brains have been fooled. (Still a skeptic? Check out the online demonstration here.)

In addition to perceptual challenges like those above, research by William Cleveland confirms our poor judgment in gauging most angles and relative volumes, leading Cleveland, Tufte, Few and other experts to decry pie charts for their natural ability to mislead.

What are some of the other ways presentations in performance measurement reports often unintentionally distort or hide meaning?

    • Unequal or Uneven Scales. A few years ago the NEA publicized a graph showing the decline in reading skills. As you can see, the NEA’s graph (top) showed reading declines far more dramatic than those from a zero-based full scale (bottom).

These shortened, unequal or uneven scales are among the most common unintentional (or sometimes intentional) distortions we see in performance graphs. For those creating reports in Excel this is especially common, because Excel automatically scales graph axes by default.

If your axis does not start at zero, carefully assess whether you're informing or misinforming your reader. It's not uncommon to see separate, automatically scaled Excel graphs displaying a 2% drop and an 8% drop in sales as if they were similar in proportion, or showing significantly different changes in critically linked factors, such as materials costs rising faster than margin growth, as if they moved in step.

When constructing your performance reports, the tendency is to work extremely hard on the visual for each piece of the presentation. When you're done, however, remember to step back and ensure that all of the scales still make sense in comparison to one another, to the whole, and to the message.
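The distortion from a clipped axis is easy to quantify. The sketch below (hypothetical numbers, not the NEA data) computes the fraction of the plot height a bar occupies under a zero-based axis versus an auto-scaled one, showing how a 2% drop can be drawn as if it were a one-third collapse:

```python
def plotted_height(value, axis_min, axis_max):
    """Fraction of the vertical axis a bar of `value` fills."""
    return (value - axis_min) / (axis_max - axis_min)

sales = [100.0, 98.0]  # a 2% drop in sales

# Zero-based axis: the two bars look nearly identical.
zero_based = [plotted_height(v, 0, 100) for v in sales]

# Auto-scaled axis clipped to just below the data, as Excel might choose:
auto_scaled = [plotted_height(v, 97, 100) for v in sales]

print(zero_based)   # first bar full height, second at 98% of it
print(auto_scaled)  # first bar full height, second at only one-third of it
```

Same data, same "2% drop" label, but the clipped axis draws the second bar two-thirds shorter than the first.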

    • Mind-Bending Altered Aspect Ratios. Extensive research by William Cleveland shows that the human eye judges a 45-degree angle with considerable accuracy. Accordingly, most visualization experts advocate aspect ratios that show your peaks and valleys at close to 45 degrees.

As an example, using annual trends from 1750–1924, Cleveland demonstrated that sunspot-data graphs using this aspect ratio clearly revealed the 11-year cycle.

With a second graph at a 0.55 aspect ratio, however, he also demonstrated that altering the aspect ratio can determine whether your graph reveals the secrets of its data: the cycle's rapid rise and extended decline were not evident until the 0.55 ratio was used (bottom picture).
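Cleveland's "banking to 45°" idea reduces to a small calculation. The hypothetical sketch below picks the plot height-to-width ratio at which the typical line segment rises at about 45 degrees on screen, using the median absolute slope (one of several banking rules Cleveland proposed):

```python
import statistics

def bank_to_45(xs, ys):
    """Return the plot height/width ratio at which the median line
    segment is drawn at roughly 45 degrees (|screen slope| = 1)."""
    slopes = [(y1 - y0) / (x1 - x0)
              for x0, x1, y0, y1 in zip(xs, xs[1:], ys, ys[1:])]
    med = statistics.median(abs(s) for s in slopes)
    x_range = max(xs) - min(xs)
    y_range = max(ys) - min(ys)
    # On screen: slope = data_slope * (height * x_range) / (width * y_range).
    # Setting the median |screen slope| to 1 and solving for height/width:
    return y_range / (x_range * med)
```

Feed it your series before choosing a chart size, and it suggests how tall (relative to its width) the plot should be for the rises and falls to read most accurately.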

    • Whiz-Bang, Distracting Graphics. When developing visuals to enhance your performance reporting, it's easy to become fascinated with glitzy or iconic presentations. But distracting glitz and shine take time for the reader to process, and attractive shading or graduated colors usually distort perception. For example, in the picture to the right, although the center bar appears to darken from left to right, it's actually the same color all the way across. Just as the bar's background distorts its color, your tool's fancy graduated colors, shading, highlighting and other glitz will distort and distract from the meaning in your visuals. Glitz and glamour have their place, but when you're looking for maximum meaning with minimal distortion and time, just say "no."
    • Color Confusion. Sometimes the colors chosen for graphs may be attractive, but instead of enhancing the information, they confuse or distract from it.

For example, if a graph is meant to contrast the performance of a single element with its group, such as your market share versus that of your external or internal competitors, the colors you select will influence how quickly you understand the results. In the first example here, the colors differentiate the regions, but they distract from the analysis more than they enhance it.

The second picture reveals the same losses, but the color change moves our eye and mind more quickly to the gaining region and the largest losing region. Which graph did you understand more quickly?

When choosing colors for your graphs, you'll also want to remember that about 10% of your male readers and 1% of your female readers are color blind. This means red-green or blue-yellow color-coding may be indistinguishable to a significant portion of your audience. Avoid schemes that will be difficult for your audience to interpret.

If you're having trouble selecting a color scheme, try the colorblind-safe schemes at ColorBrewer, which provides multiple color schemes for sequential (series), diverging (two ends of a scale) and qualitative data, with tips to help you decide which scheme is right for you.
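One simple way to build "gray for context, color for the point" into your own reporting tools: give every series a muted neutral and reserve a single accent for the element you want the eye to find. A hypothetical sketch (the accent hex value is taken from the widely cited Okabe-Ito colorblind-safe palette):

```python
# Okabe-Ito vermillion: remains distinguishable under common color-vision
# deficiencies, unlike many default red-green pairings.
ACCENT = "#D55E00"
MUTED = "#BBBBBB"  # neutral gray for the context series

def series_colors(regions, highlight):
    """One color per region: accent the region under discussion, mute the rest."""
    return [ACCENT if region == highlight else MUTED for region in regions]

print(series_colors(["East", "West", "North", "South"], "West"))
```

Applied to the market-share example above, this pushes the reader's eye straight to the one region you're discussing instead of asking it to decode a rainbow.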

If you know that you have color blind members in your audience, you can also test your visuals to determine whether they’ll be effective for everyone at Vischeck.

Which brings us to our final Rule of Thumb for this issue.

Rule of Thumb #2 While you want your presentation to be attractive, be careful to weed out glitzy graphics, poor color choices, or perspective-altering elements that distort your data and hide its secrets. Simplicity and context reduce processing effort, creating faster, more accurate conclusions and quicker progress to action.

Stay tuned! Rules of Thumb #3 and #4 will be featured in our next issue of Grey Matter.