When good data go bad.

Bob Lutz, a longtime car guy who held senior leadership positions at GM, BMW, Ford, and Chrysler, tells a story about the feedback that David Davis, an auto industry expert, received on a speech he delivered at GM:

Sometime in the early ’80s, he’d accepted a gig as speaker to a large group of GM executives. The speech appeared to go well, and the applause felt genuine. David went home pleased and thought no more about it until he received the following letter:

Dear David:

You asked for feedback on your remarks at our recent conference. The data is just now available.

The rating scale was zero to ten with ten being “best.” The five non-GM speakers had scores ranging from zero to ten. Yours ranged from three to ten. The five “outside speakers’” average scores ranged from 5.25 to 8.25.

Your average was 7.35.

Two speakers had higher scores than yours. Your standard deviation from the mean was 1.719 and ranked second among the variances, showing that most people had a similar opinion about your remarks.

I personally enjoyed your remarks very much. Your refreshing candor, coupled with your broad understanding of people, product, and the market, gave us exactly what we asked you for—”widened competitive awareness.”

Thank you for your participation.

Absurd, right? Hopefully you didn’t snort the milk from your Cheerios out your nose as you read this. It’s a miracle that GM survived as long as it did with this kind of bureaucratic plaque clogging its organizational arteries.

But before you sprain your shoulder patting yourself on the back for how much smarter you and your company are than big, stupid GM, think about the birth of that colossal dysfunction. At some point, a diligent, well-meaning employee—or her manager—probably wanted to help improve the quality of presentations. And she probably read in business school that what gets measured gets managed, so she created a simple 10-point rating scale.

[Stop here. Does your organization use one of these scales to evaluate speakers, or training sessions, or the selection of deli meats in the company cafeteria?]

It’s a short—very short—step from a 10-point rating of an individual event to a comparison of multiple events. And an even shorter step from that comparison to a deeper, more thorough statistical analysis, replete with r²-values and more Greek letters than you’ve seen since your last purchase of foreign yogurt.

Organizations, and individuals within organizations, drive themselves to the land of absurdity all the time because they don’t ask the first question that lean thinkers focus on: What is customer value?

Learning that a speech was well received with a score of 7.35 out of 10 is valuable, important, and worthwhile for the customers (in this case, the speaker and the people who invited the speaker). The other data, not so much. The Outside Speaker Effectiveness Analysis Group could have identified that value by simply (gasp!) asking the customers what information would be helpful to them. Hell, there probably wouldn’t even be a need for an Outside Speaker Effectiveness Analysis Group in the first place had GM focused on this question.

In my mind, this is where traditional approaches to productivity go wrong. These approaches focus on improving the efficiency of producing these reports without considering whether they should be produced in the first place. The lean approach—first, identify the value—is, to me, a far better way to operate. And once you’ve identified the value, you can apply the lean tool of 5S to the information: sort the value from the waste, set it in order, standardize the delivery of the information, and so on.

Of course, the waste from not focusing on customer value isn’t always as obvious as having an Outside Speaker Effectiveness Analysis Group. (The existence of a department like that is pretty much a dead giveaway.) Sometimes it’s subtler, like having the IT department generate dozens, or even hundreds, of reports per week, most of which go unread (as happened at one of my old employers).

Unless you continually evaluate your own generation of data, reports, and statistics, you run the risk of becoming the punch line to a joke and an object lesson in making good data go bad.
