Unfortunately, there are many examples of master data being neglected. Master data is often poorly maintained because maintaining it is considered too expensive, yet every small error in this data can lead to recurring and costly issues in the operations phase.
Master data is defined as all business-critical information, which is typically scattered across the organization. It can be split into six categories: financial (operational costs, overhead, price lists, profits, etc.), contractual (suppliers, vendors, customers, etc.), organizational (business processes, org chart, employees, training, etc.), process (recipes, PFDs, safe operating windows, setpoints, etc.), asset-related (certifications, design and construction drawings, OEM information, and so on) and safety (scenarios, HAZOP, SIS, SIL, etc.). Apart from transactional data, such as work requests, work orders, purchase orders and sales orders, all information is master data; and it is this master data that the transactional processes rely on.
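The split described above can be sketched in code. This is a minimal, illustrative data model only (the class and field names are my own, not from any particular MDM system): the six categories become an enumeration, and a transactional record such as a work order references master data rather than owning it.

```python
from dataclasses import dataclass
from enum import Enum


class MasterDataCategory(Enum):
    """The six categories of business-critical information named in the text."""
    FINANCIAL = "financial"            # operational costs, overhead, price lists
    CONTRACTUAL = "contractual"        # suppliers, vendors, customers
    ORGANIZATIONAL = "organizational"  # business processes, org chart, training
    PROCESS = "process"                # recipes, PFDs, safe operating windows
    ASSET = "asset"                    # certifications, drawings, OEM information
    SAFETY = "safety"                  # HAZOP scenarios, SIS, SIL


@dataclass
class MasterDataRecord:
    """A single piece of master data, owned and maintained by a department."""
    record_id: str
    category: MasterDataCategory
    attributes: dict


@dataclass
class WorkOrder:
    """Transactional record: it consumes master data but does not own it."""
    order_id: str
    part: MasterDataRecord


# A hypothetical spare-part record consumed by a work order.
gasket = MasterDataRecord(
    record_id="GSK-001",
    category=MasterDataCategory.ASSET,
    attributes={"material": "PTFE", "size": "DN50"},
)
order = WorkOrder(order_id="WO-1001", part=gasket)
print(order.part.attributes["material"])  # -> PTFE
```

The point of the separation is the one the text makes: if the `MasterDataRecord` is wrong, every transaction that references it inherits the error.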
Master data needs to be maintained, and the more changes are made to the asset or organization, the more maintenance is required. And because master data is scattered across the organization, its maintenance is scattered as well.
The quality of master data therefore depends on how much attention each department manager gives it. And let's be real: few department managers or leaders consider master data essential, which leads to poor data quality. Bringing the master data back to an as-built state then seems too expensive.
When things go wrong due to incorrect master data (the wrong part delivered, product quality out of spec, etc.), the blame typically falls on the master data itself, seldom on the responsible manager, for a simple reason: no one is made accountable, so no one is.
Not keeping your master data up to date can lead to significant disruptions. For example, a chemical company ordered new gaskets to be installed after the cleaning of a reactor during a shutdown. The gaskets were ordered as specified in the system and delivered. During installation, an observing engineer noticed that the wrong type of gasket was being used: this particular gasket would not last long in the process, resulting in leakages or worse. It turned out the gaskets were as per the original design, but after earlier problems with that type, they had all been replaced by gaskets of a different material, with excellent results. The master data was never updated accordingly, so the wrong type of gasket was ordered. Luckily, the result was only rework; had it gone unnoticed, the consequences could have been far worse.
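The gasket incident is a detectable failure mode: a change was approved and executed, but the catalogue record was never updated. A simple consistency check, sketched below, can surface such gaps. This is a hypothetical illustration; the part IDs, field names and data sources are invented, and in practice the approved specification would come from a management-of-change (MOC) system rather than a dictionary.

```python
# Stale master data: the catalogue still lists the original gasket material,
# while the latest approved change record specifies a different one.
catalog = {
    "gasket-R12": {"material": "compressed fiber"},  # never updated
    "valve-V07": {"material": "stainless 316"},      # matches the change record
}
approved_changes = {
    "gasket-R12": {"material": "graphite"},
    "valve-V07": {"material": "stainless 316"},
}


def stale_parts(catalog: dict, approved: dict) -> list:
    """Return IDs of parts whose catalogue spec no longer matches the
    specification approved in the most recent change record."""
    return [
        part_id
        for part_id, spec in approved.items()
        if part_id in catalog and catalog[part_id] != spec
    ]


print(stale_parts(catalog, approved_changes))  # -> ['gasket-R12']
```

Run periodically, a check like this would have flagged the gasket record long before a shutdown put the wrong part in someone's hands.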
Since industry globally embraced digitization with Industry 4.0 around 2014, many companies have worked on its different elements, e.g. the digital twin, predictive maintenance, and AR/VR. Still, master data has again not received the attention it needs, even though many Industry 4.0 elements require high-quality master data. In the digital plant, master data is even more important than in the analogue plant.
To digitize equipment, build a digital twin, or compare equipment, the master data must be accurate. This data covers not only the supplier and model of the equipment and its spare parts, but also the business impact of failure, the changes made, the process conditions, and much, much more. The more data and the higher its quality, the better the model and the better the prediction.