
Transforming KPI effectiveness – part two

Hydrocarbon Engineering


This is part two of a two-part article. Part one is available here.

How deeper digitalisation can help

As discussed in part one of this article, ‘shallow’ digitalisation can move data around and get KPIs displayed to the right people at the right time. However, this type of digitalisation does not address the fact that the KPIs and targets themselves may be weak. A ‘deeper’ level of digitalisation, utilising digital twins and data analytics, can address many of the pitfalls of KPI setting.

To illustrate the philosophy of digitally transformed KPIs, a furnace coil inlet temperature will be considered as an example of an energy KPI. The KPI is intended to indicate to maintenance or engineering whether the preheat exchangers are performing as expected, and to trigger actions such as exchanger cleaning. Some of the challenges associated with this KPI are:

  • The coil inlet temperature will decrease due to fouling in the exchangers, but is also affected by factors such as the type of feedstock, production rate and product mix. This means the KPI itself is ‘noisy’ and needs to be normalised.
  • The reduction in temperature implies heat exchangers need cleaning, but does not indicate which exchanger should be cleaned.
  • The temperature only reduces after fouling has occurred, so while the KPI is intended as a leading indicator, it is really a lagging indicator of performance.
  • The KPI may suggest that cleaning is required, but this is still somewhat qualitative, and the action is competing with many other possible maintenance activities.

KBC proposes a more robust philosophy for managing the exchanger cleaning decision making process (Figure 2).


Figure 2. Statistical analytics built on a foundation of first principles. 

The first step in the process is data assurance. This eliminates the ‘correct input/output’ problem by monitoring the data quality and reconciling it via a first principles digital twin. Using a physics-based model rather than purely statistical data reconciliation forces the data to obey physical and chemical laws, such as the conservation of mass and energy and fluid behaviour. The digital twin can thus pick up data errors and, to a certain extent, patch in missing data. The digital twin, in turn, generates a new set of KPIs around data quality (KBC refers to these as data quality parameters [DQPs]), which are then used to trigger instrumentation checks or maintenance.
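As a purely illustrative sketch of this kind of reconciliation (not KBC's implementation; the stream names, measurement uncertainties and flagging threshold below are assumed), the following Python snippet adjusts three flow measurements so that they obey a simple mass balance and flags any tag that needed a large correction:

```python
import numpy as np

# Illustrative data assurance step: reconcile noisy flow measurements so they
# obey a simple mass balance (feed = product_1 + product_2). Measurement
# uncertainties (sigma) and the flagging threshold are assumed values.
measured = np.array([100.0, 58.0, 45.0])   # t/h: feed, product_1, product_2
sigma    = np.array([2.0, 1.0, 1.0])       # assumed measurement standard deviations

# Constraint A @ x = 0 encodes feed - product_1 - product_2 = 0
A = np.array([[1.0, -1.0, -1.0]])

# Weighted least-squares reconciliation:
# x = m - V A^T (A V A^T)^-1 (A m), with V = diag(sigma^2)
V = np.diag(sigma**2)
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ measured)
reconciled = measured - correction

# A simple 'data quality parameter': how many sigmas each tag was adjusted by.
dqp = np.abs(correction) / sigma
for tag, m, r, q in zip(["feed", "product_1", "product_2"], measured, reconciled, dqp):
    flag = "CHECK INSTRUMENT" if q > 2.0 else "ok"
    print(f"{tag:10s} measured={m:6.1f} reconciled={r:6.1f} adjustment={q:4.2f} sigma  {flag}")
```

In a real application the constraint set would come from the full first principles model rather than a single balance equation, but the principle of adjusting measurements within their uncertainty to satisfy the physics is the same.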

The second step is using engineering simulation in the digital twin to calculate higher value engineering parameters. Whereas the original KPI was a directly measured temperature, the digital twin calculates parameters such as the fouling factor. The fouling factor normalises out changes in flow rate, fluid type, etc., and provides a pure indication of fouling rather than an indicator muddied by external factors. In this example, the fouling factor is the higher value parameter; similar examples for other applications include equipment efficiencies, distillation tray flooding and catalyst activities.
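A minimal sketch of that calculation is shown below, using the standard relationship between measured duty, surface area and log-mean temperature difference, with the fouling factor taken as 1/U_actual − 1/U_clean. The temperatures, duty, area and clean coefficient are assumed example values, not data from the article:

```python
import math

def lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counter-current exchanger."""
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    return (dt1 - dt2) / math.log(dt1 / dt2) if dt1 != dt2 else dt1

# Assumed example data for one preheat exchanger (kW, m2, degC).
duty_kw = 4200.0    # reconciled heat duty from the digital twin
area_m2 = 250.0     # exchanger surface area from the datasheet
u_clean = 0.45      # clean overall coefficient, kW/m2.K (design basis)

dt_lm = lmtd(210.0, 150.0, 95.0, 160.0)
u_actual = duty_kw / (area_m2 * dt_lm)            # current overall coefficient
fouling_factor = 1.0 / u_actual - 1.0 / u_clean   # m2.K/kW

print(f"U_actual = {u_actual:.3f} kW/m2.K, fouling factor = {fouling_factor:.2f} m2.K/kW")
```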

The third step is automated simulation of alternate futures, in this case the technical impacts of cleaning different exchangers in the network. This changes the KPI from a vague indication that fouling has occurred to a precise quantification of the impact of fouling in each exchanger. The simulation model also has KPIs governing its performance (KBC refers to these as model performance indicators [MPIs]), which are then used to trigger engineering checks to recalibrate and update the model when necessary.
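As a simplified illustration of that scenario screening (a rigorous digital twin run would replace the back-of-envelope duty estimate used here, and all exchanger data are assumed), cleaning each exchanger in turn can be ranked by the preheat duty and coil inlet temperature it would recover:

```python
# Illustrative 'alternate futures' loop: estimate the coil inlet temperature
# recovered by cleaning each exchanger individually. Exchanger names, areas,
# coefficients and feed properties are assumed example values.
feed_rate_kg_s = 80.0
cp_kj_kg_k     = 2.4

exchangers = {
    # name: (area m2, LMTD degC, U_clean kW/m2.K, U_dirty kW/m2.K)
    "E-101": (250.0, 52.0, 0.45, 0.32),
    "E-102": (180.0, 38.0, 0.50, 0.46),
    "E-103": (300.0, 45.0, 0.40, 0.28),
}

for name, (area, dt_lm, u_clean, u_dirty) in exchangers.items():
    recovered_duty_kw = area * dt_lm * (u_clean - u_dirty)
    delta_cit = recovered_duty_kw / (feed_rate_kg_s * cp_kj_kg_k)
    print(f"Clean {name}: +{recovered_duty_kw:6.0f} kW preheat, coil inlet +{delta_cit:4.1f} degC")
```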

The fourth step adds economics to the analysis, and calculates the cost/benefits of cleaning each exchanger. This economic analysis allows exchanger maintenance to be evaluated in a risk based work selection process alongside any other flagged maintenance activities.
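A hedged sketch of that economic calculation is shown below; the fuel price, furnace efficiency, cleaning cost and outage impact are all assumed values chosen only to illustrate the arithmetic:

```python
# Illustrative cost/benefit for cleaning one exchanger. The fuel price, furnace
# efficiency, cleaning cost and outage impact are all assumed example values.
recovered_duty_kw  = 1690.0      # extra preheat recovered by cleaning (from the what-if run)
fuel_price_usd_gj  = 6.0
furnace_efficiency = 0.85
cleaning_cost_usd  = 60_000.0
outage_loss_usd    = 40_000.0    # production impact while the bundle is out of service

# Heat recovered in the preheat train is heat no longer fired in the furnace.
gj_per_day = recovered_duty_kw * 86_400 / 1e6                      # kW -> GJ/day
saving_usd_per_day = gj_per_day * fuel_price_usd_gj / furnace_efficiency
payback_days = (cleaning_cost_usd + outage_loss_usd) / saving_usd_per_day

print(f"Fuel saving: {saving_usd_per_day:,.0f} US$/day, payback: {payback_days:.0f} days")
```

Expressing each candidate cleaning job as a payback or net benefit is what allows it to be compared directly against other flagged maintenance activities.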

The fifth step adds predictive analytics to forecast the future evolution of fouling, and thus indicate upcoming requirements for cleaning. The predictive analytics turns a lagging indicator into a leading indicator and allows action to be taken before losses are incurred.
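A minimal illustration of such a prediction is sketched below, fitting a straight line to an assumed fouling factor history and projecting when the cleaning threshold will be crossed; real applications typically use more sophisticated fouling models than a linear trend:

```python
import numpy as np

# Illustrative leading indicator: fit the fouling factor trend and estimate
# when it will cross the cleaning threshold. History and threshold are assumed.
days           = np.array([0, 30, 60, 90, 120, 150])
fouling_factor = np.array([0.10, 0.22, 0.33, 0.47, 0.58, 0.71])  # m2.K/kW
threshold      = 1.0

slope, intercept = np.polyfit(days, fouling_factor, 1)
days_to_threshold = (threshold - intercept) / slope

print(f"Fouling rate: {slope:.4f} per day, "
      f"cleaning needed in ~{days_to_threshold - days[-1]:.0f} days")
```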

The final step is adding artificial intelligence (AI) or knowledge-base analytics that can generate correct advice, taking in all elements of the situation, and then provide clear actions to resolve it. The action may require human intervention or, where the process is trusted, may be automated, such as raising maintenance requests automatically.
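As an illustrative sketch only (the thresholds, exchanger tag and trust flag are assumed, and a real system would interface with the site's maintenance management software), a simple rule set combining the earlier analytics outputs might look like this:

```python
# Illustrative knowledge-base step: combine the analytics outputs into a single
# clear action. Thresholds and the 'trusted' flag are assumed example values.
def recommend_action(payback_days, days_to_threshold, automation_trusted):
    if payback_days > 365:
        return "No action: cleaning is not economic at current prices."
    if days_to_threshold > 90:
        return "Monitor: cleaning is economic but not yet urgent."
    action = "Schedule cleaning of E-101 within the next maintenance window."
    if automation_trusted:
        return action + " (maintenance request raised automatically)"
    return action + " (awaiting engineer approval)"

print(recommend_action(payback_days=97, days_to_threshold=72, automation_trusted=True))
```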

This work process adds a large amount of data processing and calculation, and generates a large amount of additional information. However, it does not increase the number of KPIs for frontline staff. The process is entirely automated, and the data and model performance KPIs only need to be addressed by exception. Frontline staff are presented with a simple dashboard that has extremely clear actions (Figure 3).


Figure 3. Final analytics dashboard. 

The dashboard requires no expert training or implicit knowledge to interpret. This automated work process leverages many aspects of digitalisation:

  • Multiple data sources (process data, price data, engineering data) are aggregated together.
  • A first principles digital twin of the process reconciles the data, simulates the scenarios and calculates the value of cleaning.
  • Data analytics predicts the future and changes the process from reactive to proactive.
  • Dashboards are used to visualise the end results in an accessible way.

KBC believes this type of digitalisation will greatly reduce the number of KPIs. Since decisions will be increasingly made by AI, and execution of those decisions will be automated, there is no longer any need to provide decision support KPIs for routine operation. Instead there will be KPIs monitoring the performance of the analytics and data to ensure the system is operating correctly.

Conclusion

KPIs are a key element of decision support in the energy and chemicals industry. Digitalisation can address many of the issues that have compromised KPI effectiveness in the past. In the future, routine operation will have very few KPIs due to the highly autonomous nature of operations; instead, KPIs will focus on identifying exceptions and anomalies. KPIs will be smart, with targets that adapt to variable situations and are optimised by rigorous modelling, and they will be increasingly future facing, leveraging predictive analytics rather than retrospectively measuring what went wrong. This will require changes in culture and organisation to get the most out of the new information, as well as rigorous and robust technology to calculate the correct course of action.

This is part two of a two-part article. Part one is available here.

Written by Duncan Micklem, KBC (A Yokogawa Company), USA.

Read the article online at: https://www.hydrocarbonengineering.com/special-reports/21052019/transforming-kpi-effectiveness-part-two/
