Systems in the processing industry are often operated for decades, which means constant renovations, expansions, and modernizations. Aspirations to make processes more efficient are as old as the sector itself. The digital transformation of the economy is creating new possibilities for more efficient process configurations through numerous innovative approaches and technologies. There is now consensus that machines must communicate with each other in order to operate at maximum efficiency. The Internet of Things is the new reality, even if it continues to be interpreted and implemented in different ways. Data are provided, collected, evaluated, and used at many different levels. Yet how do companies in the processing industry actually profit from big data?
The oil and gas industry illustrates the relevance of efficiency-increasing measures quite clearly. In the past, high oil prices made it possible to finance technologically challenging projects with high investment costs. In times of low, volatile oil prices, more efficient and less expensive systems are required so that projects can be implemented more flexibly. Standardization is the watchword of the moment.
Standards for Higher Efficiency
Standardization includes concepts that unify function blocks, interfaces, and protocols for automating machines and systems. Major oil companies already rely on established standards in programming: every step of the software engineering is precisely documented and codified so that the same program modules can be reused each time. The same principle is applied to the visualization of systems and processes, where page layouts and visual forms are defined in detail. This means that everyone who works on the system can orient themselves quickly and easily. One such example is NORSOK i-500, established by the Norwegian oil industry, which defines basic safety regulations as well as standards that apply across manufacturers. Uniform interfaces between different systems reduce the error rate both during the design phase and in operation, and also increase system availability.
Companies in offshore oil production, in particular, thus rely on the advantages of standardization and manufacturer independence. Modular system design, in which individual process steps are encapsulated in separate system modules, is less relevant to these operators, since oil production systems in the offshore sector must be thoroughly and completely planned from the outset. While a modular design may facilitate later changes to the system architecture, this advantage must be weighed against transportation to the drilling platform and the resulting high installation costs. In addition, all available space on a drilling rig is fully utilized from the outset and cannot be expanded.
Adaptable Production through Modularity
The onshore situation appears radically different. Processing systems for oil and gas extraction can be flexibly adjusted using a modular design. This applies not only to secondary and ancillary systems, but across the entire process. Flexible production is also the main focus of system design in the pharmaceutical and chemical industries; processes here must be adapted even more quickly than in the oil industry. In the context of batch size 1, fluctuating volumes, and ever-shifting regional sales markets, companies must have manufacturing processes that are universal, mobile, scalable, modular, and compatible. This is exactly why modular systems have long been the norm in the processing industry. However, the advantages that should result from modular system design have not been consistently exploited. This is primarily because the individual system modules are connected to one another in the central process control system by the software. Removing, adding, or exchanging individual modules automatically leads to reprogramming in the control system, which generally costs time and money. With DIMA (Decentralized Intelligence for Modular Applications), WAGO has developed a manufacturer-independent solution for automating modular systems that significantly reduces the time required to configure production systems. DIMA moves beyond modular management of the system hardware and instead takes the entire system module into consideration, including all of its functionalities.
Each module is thus equipped with its own intelligence. The intelligent modules can communicate with one another and with the process control system, which allows the control level to access every parameter of the system. The automation components used for this must have a standardized interface, such as the MTP interface provided by DIMA. The MTP (Module Type Package) is the centerpiece of the DIMA approach: a new digital description for process technology application modules that contains all of the information required both for manufacturer-independent integration of a system module into the process control system and for autonomous operation of the system module. Only then can all components be networked and integrated across the entire lifecycle.
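The integration idea behind a module description can be sketched in a few lines of Python. This is a deliberately simplified stand-in, not the real MTP schema (which is a standardized XML-based format): the class names, fields, and services here are invented for illustration. The point is that the control system consumes the description instead of being reprogrammed for each module.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for an MTP-style module description.
# Field names ("services", "parameters") are illustrative, not the real schema.
@dataclass
class ModuleDescription:
    name: str
    services: dict = field(default_factory=dict)    # service name -> callable
    parameters: dict = field(default_factory=dict)  # parameter name -> value

class ProcessControlSystem:
    """Integrates modules via their description instead of custom code."""
    def __init__(self):
        self.modules = {}

    def integrate(self, desc: ModuleDescription):
        # No reprogramming: the description carries everything needed.
        self.modules[desc.name] = desc

    def call(self, module, service, **kwargs):
        return self.modules[module].services[service](**kwargs)

# An invented mixer module announces one service and one parameter.
mixer = ModuleDescription(
    name="mixer_01",
    services={"start": lambda rpm=100: f"mixing at {rpm} rpm"},
    parameters={"max_rpm": 300},
)

pcs = ProcessControlSystem()
pcs.integrate(mixer)
print(pcs.call("mixer_01", "start", rpm=250))  # mixing at 250 rpm
```

Exchanging the mixer for a different module would mean handing the control system a new description object, not rewriting control-system code.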
Increasing Availability, Minimizing Idle Times
Consistent modular system design is not the only method for achieving the efficiency increases that result from the potential of digitalization and networking. Modularity provides a foundation for versatility and reduced design times, and is, without a doubt, an essential component for increasing efficiency. However, data acquisition and cloud connectivity can be just as profitable for the processing industry as modular system design. Data transparency, for example, opens the door to predictive system maintenance. "Predictive maintenance" means the proactive maintenance and servicing of systems and machines. If a production system fails, the result can be high costs due to bottlenecks and follow-on failures. Efficiency analytics, i.e., the evaluation of relevant measured values such as output, pressure, temperature, vibration, and throughput, can be used to recognize maintenance requirements early. This enables the targeted replacement or maintenance of individual modules without negatively impacting system availability. It is, however, necessary to read and evaluate the collected machine data in higher-level systems. The system can then be monitored by local analysis tools or via the cloud. The latter possibility is particularly interesting in view of forward-looking technologies like self-learning algorithms. Specially programmed software tools could then autonomously calculate maintenance needs and potential failures using historical and current machine data, and thus predict the state of the machine without requiring human monitoring of the data.
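The core of such efficiency analytics can be as simple as comparing a new reading against the statistical band of a healthy baseline. The following is a minimal sketch of that idea; the signal choice (vibration) and all numeric values are invented for illustration.

```python
import statistics

# Minimal sketch of a predictive-maintenance check: flag a module when a
# measured value drifts beyond a statistical band around its healthy baseline.
def needs_maintenance(history, latest, sigmas=3.0):
    """history: baseline measurements from normal operation; latest: newest reading."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    return abs(latest - mean) > sigmas * stdev

# Invented vibration readings (mm/s) from a pump running normally.
baseline_vibration = [0.41, 0.39, 0.40, 0.42, 0.38, 0.40, 0.41]

print(needs_maintenance(baseline_vibration, 0.40))  # normal reading -> False
print(needs_maintenance(baseline_vibration, 0.95))  # strong drift   -> True
```

A real system would combine several signals (pressure, temperature, throughput) and trend analysis rather than a single threshold, but the principle of comparing live data against learned normal behavior is the same.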
Independent and Flexible due to the Cloud
Despite all efforts to participate in Industry 4.0, skepticism with respect to complete data transparency remains high. Only a few companies in the processing industry consider a "cloud first" policy to be feasible; the fear that process-relevant data could be attacked and manipulated by third parties is great. And yet, awareness of the advantages that cloud applications and solutions offer is increasing: they are independent of individual servers, because cloud operators guarantee constant availability by storing data on multiple servers. At the same time, the costs of vast storage capacity and its maintenance, as well as of establishing server farms, are dropping, because the principle behind the cloud is that each customer pays only for what is used. Cloud services are delivered and used exclusively through interfaces, protocols, or browsers over the Internet, without requiring software installations on local computers. In principle, process-relevant data can thus be accessed from anywhere in the world.
Location-Independent Access Enables Process Optimization in Real Time
All relevant data are available regardless of location. Processes can thus be improved in real time, and adjustment measures can be easily monitored. The software company M&M, for example, relies on the trend toward decentralized data acquisition: data from different machines, entire production lines, or even building and energy data from individual production facilities are gathered and evaluated in a central cloud service. In the cloud, all processes are controlled and coordinated, and current and historical data can be accessed independently of location. This allows a company's various facilities to compare key data among themselves and introduce the necessary optimization measures. The operating states and maintenance needs of machines and systems can be automatically determined through constant monitoring.
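The cross-facility comparison described above amounts to ranking sites by a shared key figure held in one central store. The sketch below illustrates this; the facility names, metrics, and figures are entirely invented.

```python
# Sketch of the central-cloud idea: key figures from several facilities are
# collected in one store and compared to find optimization candidates.
# Facility names and all numbers are invented for illustration.
cloud_store = {
    "facility_a": {"throughput_t_per_h": 12.4, "energy_kwh_per_t": 48.0},
    "facility_b": {"throughput_t_per_h": 11.9, "energy_kwh_per_t": 55.5},
    "facility_c": {"throughput_t_per_h": 12.1, "energy_kwh_per_t": 49.2},
}

def least_efficient(store, key="energy_kwh_per_t"):
    """Return the facility with the worst (highest) energy use per tonne."""
    return max(store, key=lambda site: store[site][key])

print(least_efficient(cloud_store))  # facility_b
```

With all sites reporting into the same store, the same comparison can run continuously rather than as a one-off audit.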
IT and Data Security as the Central Challenge
Especially in the processing industry, expertise in individual processes and methods is the greatest asset, and it must be protected at all costs. Security is therefore the primary topic in discussions about digitalization and networking. Security can be implemented at various levels in factories and IT systems: it begins with checks at the facility gate, extends through different authorization profiles for people with different functions, and includes firewalls, password protection, and data encryption in the controller itself. "Security plays a superordinate role in all of our solutions," explains Frank Schmid, Head of Business Unit System Solutions at M&M. "The selection of data that are sent to the cloud, and the connection through which the data arrive in the cloud, are of similar importance."
To fully secure the data, WAGO controllers from the PFC family can be used, which have comprehensive onboard security mechanisms. They secure data using SSL/TLS 1.2 encryption and transmit it directly from the controller over a VPN connection with IPsec.
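On the receiving side, the same principle (no connection below TLS 1.2, no unauthenticated peers) can be expressed with Python's standard `ssl` module. The controller runs its own stack; this client-side snippet only illustrates the policy, not the PFC implementation.

```python
import ssl

# Sketch of a client-side TLS policy matching the article's requirements:
# require at least TLS 1.2 and full certificate verification before any
# process data leaves the machine.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1
context.check_hostname = True                     # verify the peer's identity
context.verify_mode = ssl.CERT_REQUIRED           # reject unauthenticated peers

print(context.minimum_version)  # TLSVersion.TLSv1_2
```

A socket wrapped with this context (`context.wrap_socket(...)`) would refuse any endpoint that cannot negotiate TLS 1.2 or present a valid certificate.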
In addition, the controllers have a configurable firewall. They thus offer maximum protection from third-party access and guarantee data integrity; the PFC from WAGO already satisfies all relevant cybersecurity guidelines. In general, companies individually assess the criticality of their data. They decide for themselves which data should be available in the cloud, and which should remain in the factory, where they are transmitted only at the control level. Specific descriptions of processes can thus only be viewed by those people who have the correct authorizations.
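That "which data may leave the factory" decision can be modeled as a simple filter: every signal carries a criticality tag, and only non-critical signals are forwarded to the cloud. The signal names and tags below are invented for illustration.

```python
# Sketch of data-criticality selection: critical process know-how stays in
# the factory; only non-critical signals are forwarded to the cloud.
# Signal names, values, and tags are invented for illustration.
signals = [
    {"name": "pump_temperature", "value": 61.2, "critical": False},
    {"name": "recipe_dosage",    "value": 3.75, "critical": True},   # process know-how
    {"name": "motor_vibration",  "value": 0.42, "critical": False},
]

def cloud_payload(readings):
    """Keep critical process knowledge local; forward the rest."""
    return [r for r in readings if not r["critical"]]

print([r["name"] for r in cloud_payload(signals)])
# ['pump_temperature', 'motor_vibration']
```

In practice such a policy would live in the controller or an edge gateway, so the filtering happens before any data crosses the factory boundary.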
With the development of the Asset Management Cloud, M&M facilitates data evaluation in the cloud, which provides a central data service that forms the basis for all user-specific solutions. In addition to production, processes like supply chain management and logistics can be linked in, which enables the optimization of transport capacities and paths as well as warehouse utilization, in addition to system efficiency.
The Future Is Intelligent
The steadily increasing capabilities of algorithms offer an encouraging outlook for the development of cloud systems. Artificial intelligence is already used in many sectors, for example, in medical image recognition or the voice control of various systems. Clouds can also be equipped with artificial intelligence using specialized software. In this way, it would theoretically be possible for algorithms to calculate potential future errors on the basis of archived data. Predictive maintenance could thus operate in a completely new dimension, which would in turn redefine constant system availability.
Text: Benjamin Boehm, Julia Grobe
Photo: iStock.com, WAGO