Blog Series: Getting Ready for Smart Manufacturing

Check out our blog series to learn how other customers are simplifying access to data, improving plant performance and reducing costs.

September 09, 2016 by Robert Golightly / 0 Comment

How KPI Hierarchies Solve the Performance Visibility Problem

Let’s talk about Key Performance Indicator (KPI) hierarchies. KPIs have become not only an institutional buzzword, but the way in which each individual's performance is evaluated within an organization. Just as our employment is tied to meeting metrics, so too is performance within any manufacturing environment tied to a hierarchy of KPIs. I will posit the following thesis: within any manufacturing concern there is a hierarchy of KPIs that, once recognized, can be aligned throughout the layers of the organization such that good performance on lower-level KPIs will most likely produce good performance on the highest-level KPIs. This Druckeresque pronouncement sounds reasonable as a theory. The question is: how does one put it into practice?

Effective KPIs and KPI hierarchies are one of the 6 Keys to Unlocking Operational Excellence, along with data management, rapid problem solving, collaboration, IT/OT strategy and holistic process monitoring. According to one online source, a KPI is a type of performance measurement that organizations use to gauge the success of an activity in which they engage.

Three things become immediately clear:

1) A KPI must be numerically quantifiable.  In many cases this is expressed as a Score = 100*(value/target).  
2) There is some value that can either be measured directly or computed from direct measurements.
3) There exists some target with which this measurement can be compared.
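The score arithmetic in item 1 can be sketched directly. Here is a minimal Python illustration; the function name and the sample throughput numbers are my own, not from the post:

```python
def kpi_score(value, target):
    """Score a KPI as 100 * (value / target)."""
    if target == 0:
        raise ValueError("target must be nonzero")
    return 100 * value / target

# e.g. a measured throughput of 950 units against a target of 1,000
print(kpi_score(950, 1000))  # 95.0
```

The three characteristics map directly onto the signature: a numeric score, a measured (or computed) value, and a target to compare it against.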

The challenge is that KPIs are often lagging measures that don’t provide actionable information, and they can be numerous and conflicting. Furthermore, it can be difficult to ascertain root causes through KPI hierarchies, and information on KPIs is not always in a format that makes it easy for all relevant stakeholders to collaborate. Jeffrey Kluger, in his book “Simplexity,” gives several excellent examples of how the simplest of concepts often lead to highly complex systems in practice. KPIs are no different. Here are some examples:

Gasoline KPIs
Consider the KPI calculation. The Score, calculated as 100 times the ratio of value to target, seems very rational when the KPI is something like octane level when blending regular gasoline. Too low, and the batch is unsalable or must be sold as a lower grade. Too high, and the blender is losing value (giveaway). I will refer to this variety of KPI as Type I: it has both upper and lower boundaries to the Score.

Margin KPIs
Margin is another KPI; one could say it is the ultimate KPI, the one to which all others are subordinate. Margin effectively knows no upper restriction, so any upper boundary is superfluous. Scores in excess of 100 are not only permissible but encouraged, although continual overachievement will likely prompt management to raise the target. I will refer to this as Type IIa. The converse situation is illustrated by a KPI for contaminant levels: here there is no lower boundary, and scores below 100 are encouraged, especially if a premium can be charged. This will be Type IIb. (There is one other type of KPI, which I refer to as Type 0 (zero), that deals with EH&S events. It will be treated as its own sub-topic at a later point.)
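These score types can be sketched in a few lines of Python. The type labels follow the examples above, while the function names and the two-point tolerance for Type I are illustrative assumptions of mine:

```python
def kpi_score(value, target):
    """Score = 100 * (value / target)."""
    return 100 * value / target

def is_favorable(score, kpi_type, tolerance=2.0):
    """Judge a score according to its KPI type.

    Type I  : bounded both ways (e.g. octane)      -- only near 100 is good.
    Type IIa: no upper bound   (e.g. margin)       -- at or above 100 is good.
    Type IIb: no lower bound   (e.g. contaminants) -- at or below 100 is good.
    The 2-point tolerance for Type I is an arbitrary illustrative choice.
    """
    if kpi_type == "I":
        return abs(score - 100) <= tolerance
    if kpi_type == "IIa":
        return score >= 100
    if kpi_type == "IIb":
        return score <= 100
    raise ValueError(f"unknown KPI type: {kpi_type}")

print(is_favorable(kpi_score(101, 100), "I"))    # True: within tolerance
print(is_favorable(kpi_score(110, 100), "IIa"))  # True: margin overachieving
print(is_favorable(kpi_score(90, 100), "IIb"))   # True: contaminants below target
```

The point of the split is that the same raw score means different things depending on which direction (if any) is unbounded.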

The next series of complications is with respect to time, more specifically frequency and time horizon. Dealing with the frequency issue first, there are two questions:

1) How frequently does a KPI need to be reported or updated?
2) What is the highest frequency at which a KPI can practically be computed?

The answer to the first question depends on the audience or information consumer. The rule of thumb is that the closer one is to the manufacturing process, the more frequent the report or update. A vice president is likely to be happy with weekly reports on production; an operator will want to know the KPI for a catalytic cracker to within a minute. That leads to the second question of practical frequency. If a KPI relies on a measurement that is only provided once an hour, then evaluating the KPI every minute is unlikely to be very valuable: the score is accurate once every 60 evaluations and grows staler with each successive one. (There are ways around this using inferential sensors, but the example serves only to illustrate the point.)
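The staleness argument can be made concrete with a small sketch; the function name and the assumption that the evaluation period divides the measurement period evenly are mine:

```python
def stale_fraction(measurement_period_min, eval_period_min):
    """Fraction of KPI evaluations that merely repeat a stale measurement,
    assuming the evaluation period divides the measurement period evenly."""
    evals_per_measurement = measurement_period_min // eval_period_min
    return 1 - 1 / evals_per_measurement

# An hourly lab result scored every minute: 59 of every 60 scores are stale
print(stale_fraction(60, 1))
```

Evaluating faster than the underlying measurement updates buys nothing but repetition.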

The second issue is time horizon. Time horizons can range from instantaneous to quarterly roll-ups. Again, the answer depends on how close the information consumer is to the manufacturing process. Operators will want instantaneous values, while a plant manager is interested in daily or shift roll-ups.
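A roll-up along those time horizons can be sketched as simple time bucketing: group instantaneous samples into fixed periods and average each bucket. The function name, field layout, and use of a plain mean are assumptions for illustration:

```python
from statistics import mean

def roll_up(samples, period_seconds):
    """Group (timestamp_seconds, value) samples into fixed periods
    and average each one.

    Returns {period_start: mean_value}. A shift roll-up might use
    period_seconds=8*3600; a daily roll-up, 24*3600.
    """
    buckets = {}
    for ts, value in samples:
        start = ts - (ts % period_seconds)
        buckets.setdefault(start, []).append(value)
    return {start: mean(vals) for start, vals in sorted(buckets.items())}

# Half-hourly readings rolled into hourly averages
readings = [(0, 90.0), (1800, 110.0), (3600, 95.0), (5400, 105.0)]
print(roll_up(readings, 3600))  # {0: 100.0, 3600: 100.0}
```

The same raw stream serves every audience; only the bucket size changes as you move up the hierarchy.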

The bottom line is that you can’t improve what you don’t measure. Measurement is not easy, and data must be collected in a way that is easy to reference and share across the organization for improved decision making. We are dependent on KPIs that are mostly inferential measures. What is needed is to approach the design of KPI networks with the same engineering excellence applied to plant instrumentation and control systems.

In the next installment, I will discuss the issues around the measurement and computation of the value part of the KPI. 


