Big Data, Predictive Analytics and Reliability – Moving Beyond Better Maintenance

By Gary West posted 04-25-2019 10:06 AM


There can be no doubt that the Industrial Internet of Things (IIoT) and big data are reshaping how we think about maintenance and reliability. Often the push to capitalize on these technologies is driven by IT departments, who are happy to invest significant effort and funding to build the ‘data lake’ and then go looking for ‘swimmers’.

In this two-part article, we will consider specifically how the abundance of accessible and consistent data, combined with applications that support visualisation, modelling, analysis and machine learning, is supporting the four key pillars of modern reliability engineering, namely:

  • Reliability by Design
  • Operating for Reliability
  • Maintenance Tactics Optimisation
  • Defect Elimination

Big Data, Predictive Analytics and Maintenance

Much of the conversation surrounding IIoT, big data and predictive analytics has centred on the potential these developments offer for more timely and targeted maintenance. Most of today’s literature focuses on the progression from diagnostics to prognostics to prescriptive maintenance. In the diagnostics domain, businesses are making use of new sensor and analytics technology to enhance condition monitoring programs. They are using these technologies to predict imminent failure earlier and so plan maintenance better, or to track failure progression more closely and fine-tune the timing of a maintenance intervention.
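As a concrete illustration of the diagnostics domain, a simple condition-monitoring check might compare a rolling average of a sensor signal against an alarm limit, flagging a developing failure earlier than a single-sample spot check would. This is a minimal sketch; the window size, alarm limit and vibration readings are all illustrative assumptions, not figures from the article.

```python
from collections import deque

def rolling_alarm(readings, window=3, limit=4.0):
    """Yield (sample_index, rolling_average) for each window whose
    mean exceeds the alarm limit."""
    buf = deque(maxlen=window)          # keeps only the last `window` samples
    for i, r in enumerate(readings):
        buf.append(r)
        if len(buf) == window:
            avg = sum(buf) / window
            if avg > limit:
                yield i, round(avg, 2)

# Hypothetical hourly vibration readings (mm/s) from a bearing
vibration = [3.1, 3.2, 3.0, 3.6, 4.2, 4.5, 4.8]
for idx, avg in rolling_alarm(vibration):
    print(f"sample {idx}: rolling average {avg} mm/s exceeds limit")
```

In practice the alarm limit and smoothing window would be tuned to the failure modes of the specific asset, but the principle, averaging out noise to detect a genuine trend, is the same.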

In the prognostics domain, real-time sensor data is combined with an understanding of overall asset condition and failure characteristics not only to detect an imminent failure, but also to estimate remaining useful life.
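A very simple prognostic model makes this concrete: fit a linear trend to a degradation signal and extrapolate to a failure threshold to estimate remaining useful life (RUL). This is a hedged sketch only; real prognostics use far richer degradation models, and the readings, threshold and linear-trend assumption here are all illustrative.

```python
def estimate_rul(times, readings, failure_threshold):
    """Least-squares linear fit to the degradation signal, then
    extrapolate to the failure threshold. Returns remaining life in
    the same time units, or None if no upward trend is detected."""
    n = len(times)
    mean_t = sum(times) / n
    mean_r = sum(readings) / n
    slope = (sum((t - mean_t) * (r - mean_r) for t, r in zip(times, readings))
             / sum((t - mean_t) ** 2 for t in times))
    if slope <= 0:
        return None                      # signal is flat or improving
    intercept = mean_r - slope * mean_t
    time_at_failure = (failure_threshold - intercept) / slope
    return time_at_failure - times[-1]

# Hypothetical hourly vibration readings (mm/s) trending upward
hours = [0, 1, 2, 3, 4, 5]
vibration = [2.0, 2.3, 2.5, 2.9, 3.1, 3.4]
rul = estimate_rul(hours, vibration, failure_threshold=7.0)
print(f"Estimated remaining life: {rul:.1f} hours")
```

The value of combining sensor data with failure characteristics, as the paragraph above describes, is precisely in replacing the naive linear extrapolation with a model of how this failure mode actually progresses.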

The prescriptive domain is the ‘ultimate’ aim for today. Here we combine understanding of the failure and condition of the asset with an understanding of the consequences of failure to determine what to do and when to do it. At its simplest, this could be to trigger a fully populated work order for a maintenance intervention, but the prescriptive domain can mean so much more. Imagine being able to trade off failure risk, production options and maintenance costs on a real-time basis to make the best business decision about whether, when and how to intervene, continually refining that decision as the failure progresses.
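The trade-off described above can be sketched as an expected-cost calculation: for each candidate intervention time, weigh the probability of failing first against the costs of planned work, breakdown repair and lost production, then pick the cheapest option. All of the probabilities and cost figures below are purely illustrative assumptions.

```python
# Illustrative cost assumptions (not from the article)
PLANNED_REPAIR_COST = 10_000       # planned maintenance intervention
FAILURE_REPAIR_COST = 60_000       # unplanned breakdown repair
UNPLANNED_DOWNTIME_COST = 20_000   # production lost in a breakdown

def expected_cost(failure_prob, planned_outage_cost):
    """Expected cost of an option: fail before the planned work with
    probability `failure_prob`, otherwise complete the planned work."""
    fail_cost = FAILURE_REPAIR_COST + UNPLANNED_DOWNTIME_COST
    plan_cost = PLANNED_REPAIR_COST + planned_outage_cost
    return failure_prob * fail_cost + (1 - failure_prob) * plan_cost

# Hypothetical options: (probability of failing first, production lost
# by taking the outage).  Waiting for a scheduled shutdown avoids the
# production loss but raises the risk of failing in the meantime.
options = {
    "intervene now": (0.00, 5_000),
    "next shutdown (7 d)": (0.05, 0),
    "later shutdown (14 d)": (0.45, 0),
}

best = min(options, key=lambda name: expected_cost(*options[name]))
for name, args in options.items():
    print(f"{name}: expected cost {expected_cost(*args):,.0f}")
print("Best option:", best)
```

In a genuinely prescriptive system, the failure probabilities would come from a prognostic model and be refreshed continuously, so the recommended option can change as the failure progresses.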

As exciting as these advances seem, the focus on optimising maintenance decisions is only half the puzzle. The other half is how these emerging technologies and applications can support the elimination of failures in the first place.

Changing the game for reliability

So, what is it that the IIoT, big data and analytics bring to the table to enhance equipment reliability? 

Firstly, let's explore the key terms. When we use the acronym IIoT, we are referring to the Industrial Internet of Things, a network of physical objects or "things" embedded with electronics, software, sensors and network connectivity, which enables these objects to collect and exchange data. Big data refers to the large amounts of data that are available from disparate systems (such as enterprise resource planning systems, process control systems, condition monitoring systems, etc.).

Finally, analytics refers to the advanced information technologies that now make it possible to store, aggregate, analyze and visually represent data to provide a more complete picture of our assets, based on a complete set of data drawn from a variety of sources. Advances in the way we manage data and technologies enable us not only to improve maintenance but also to enhance reliability.


Advancing technologies have changed the way data can be used to add value to the business:

  • Open Standards – A key advance has been the move to more open data standards. In the past, sensors have been connected via proprietary communications protocols, and their outputs have usually only been available for viewing via proprietary control interfaces. These devices can now communicate using standard internet communications protocols, using open standards, and as a result, the data is now available for a broad range of uses.
  • Commonality – In the past, challenges around common naming and numbering conventions often limited our ability to connect the dots between data from disparate systems. Today’s technologies are breaking down these barriers, even enabling unstructured data such as text comments to be connected to structured data to support analytical capabilities that were unthinkable a decade ago.
  • Persistence – Advances in data storage have seen us progress from PLCs storing perhaps 24 hours of data, to plant historians that could store weeks or months of data, to ‘data lakes’ that have ever expanding shorelines. With this longer history of higher resolution data comes the ability to perform more detailed analysis.
  • Accessibility – The removal of barriers around data structures and communication protocols combined with advances in data storage means data can now be more readily shared between functions within an organization, and between organizations – for example, equipment manufacturer and customer.  


The applications available to make use of data have also improved dramatically. Reports and dashboards that were once the domain of IT departments are now within reach of most users thanks to advances in visualization tools. Business and process models can now be informed by real-time data to self-adapt, learn and better represent the real world. Powerful analytical tools that were once confined to supercomputers and a small community of specialists are now available to businesses as server and even desktop applications, usable by the ever-growing community of ‘data scientists’.

Check back in two weeks for the second installment of "Big Data, Predictive Analytics and Reliability – Moving Beyond Better Maintenance."