BY ART VAN BODEGRAVEN AND
KENNETH B. ACKERMAN
basictraining
Dashing through the snow, sun,
surf, sand—whatever
THIS IS NOT ABOUT—SURPRISE, SURPRISE—A ONE-HORSE
open sleigh. It is about being able to show how good, or not-so-good, you have been in your supply chain operations.
Dashboarding, it seems, has become a four-season adventure.
Many, many organizations use “dashboards” to show how they are
doing. Publication might be daily, weekly, or monthly, with monthly
being the most frequently used at high levels, and daily being used to
give rapid feedback on high-velocity volume movement, among
other applications. Sometimes, daily visuals are posted as indicators
of performance of lower-volume, but higher-cost-impact operations,
and/or to provide recurring stimuli for performance motivation.
In theory, dashboards provide mission-critical information about
factors that are core drivers of organizational performance. In practice, they are overloaded with trivia, abused in their
motivational role, and simply too much for anyone—
CEOs or order pickers—to process and internalize.
THE USUAL SUSPECTS
Who has led us to a state in which a perfectly good
idea has been rendered less-than-useful, and even
misleading? We nominate an unholy alliance between
accountants and engineers. It’s not because they are
bad people; they are often just wired differently from
doers and decision-makers. They can get a little compulsive about filling in all the blanks or every cell in a
logical matrix, no matter the relevance or the potential usefulness.
It is far better, for example, to know at a glance that inventory
turns are approximately “x” than it is to be overwhelmed with half a
dozen different turn performance indicators for six categories of
inventory. It is useful to see that a weighted average productivity
value is “y” instead of having to sort out individual productivity
numbers for a couple dozen specific tasks.
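As an illustrative sketch only (the column itself contains no code, and the task names and figures below are hypothetical), the single weighted average productivity value “y” can be rolled up from task-level numbers while the detail stays available for drill-down:

```python
# Hypothetical task-level productivity (units per labor hour) and the
# labor hours spent on each task -- illustrative numbers, not real data.
tasks = {
    "picking":   {"rate": 120.0, "hours": 400},
    "packing":   {"rate":  90.0, "hours": 250},
    "receiving": {"rate":  60.0, "hours": 150},
}

def weighted_avg_rate(tasks):
    """One dashboard figure: task rates weighted by labor hours."""
    total_hours = sum(t["hours"] for t in tasks.values())
    return sum(t["rate"] * t["hours"] for t in tasks.values()) / total_hours

# The dashboard shows one number; the tasks dict is the drill-down level.
print(round(weighted_avg_rate(tasks), 1))
```

The point of the sketch is the shape, not the numbers: the top level carries one figure, and the underlying breakdown is consulted only when that figure gets out of whack.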
We will grant that being able to examine the next level of detail
can be important when the singular metric begins to get out of
whack, but it is the clarity and simplicity of the singular measure
that triggers an investigation into specific problem (or success) categories. And a laundry list of lower-level “dashboard” data is likely
to receive less attention than a highly visible single indicator of a key
driver of business success.
TO ILLUSTRATE THE CASE
The classic, even trite, analogy is comparing performance dashboards with automobile dashboards. Think about it. There are only a few things you really need to know about how the car is doing. And these are even fewer today than they were two or three decades ago. So, the speedometer, the gas gauge, and tire pressure indicators are vital. Warnings about engine temperature, oil pressure, and diverse on-board computer problems show up when a “check engine” light goes on.
The tachometer is only important if you are driving your Chevy Aveo four-banger around the track at Daytona, but the marketing team loves the concept. Still, there is no reason on earth to make the auto dashboard as complex as a 747’s cockpit.
Yet we try, and the information available in contemporary information systems gives us plenty of trivia to drool over. Our Dutch colleague once developed a warehouse performance reporting system that presented pages of “dashboard” dials. The idea was attractive, in that the icons were all little dials with red, yellow, and green segments based on performance history, targets, or imperatives, as the case might be. Each of the dozens of “key” indicators could be clicked on to reveal another level of detail. An analyst’s dream, as in this case, can become a manager’s nightmare.
So, developers of effective dashboards need to spend a lot of front-end time in deciding which are the right balls to be watching and which need to be let go for another day.
WATCHING THE BALLS THE RIGHT WAY
Some analysts, less inclined to focus on every little
nit, maintain that longer-term and aggregate metrics are more indicative of what is really going on
over time. We will cheerfully admit that it is easier
to look at a chart of trailing 12-month data
to get a clear picture of the aggregate effects of past
events. Certainly, it’s much easier than trying to
interpret a chart of specific data points over time,
with wild fluctuations and unexplained dips, and
snuffles and sneezes.
But the argument misses the point. Dashboards