Chemical Engineering

A Boundless Future for Process Control in the CPI

By Claudio Fayad

Emerging automation-system architectures will provide the foundation required for optimized operations across the enterprise

For decades, when a person mentioned process control, most engineers’ minds went to proportional-integral-derivative (PID) loop control, interlocks, safety and advanced process control. Perhaps even more so in the chemical process industries (CPI) than in other sectors, the ability to manage these aspects of control, and to configure them correctly, has been the key to improving operations, and as such, the main criterion for selecting control systems. Users were laser-focused on designing systems that ran individual plants as efficiently as possible, and these tools were integral to those efforts.
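The PID loop at the heart of that tradition can be sketched in a few lines. The following is a minimal, illustrative discrete-time implementation, not vendor code; the gains and the first-order process model are assumed values chosen only to show the loop converging to a setpoint.

```python
class PID:
    """Minimal discrete-time PID controller (illustrative sketch)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        # Proportional on current error, integral accumulates offset,
        # derivative damps fast changes.
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive an assumed first-order process toward a setpoint of 50.0
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
level = 0.0
for _ in range(500):
    output = pid.update(50.0, level)
    level += (output - 0.2 * level) * 0.1  # crude first-order process model
```

After 500 steps the simulated measurement settles near the setpoint, with the integral term removing the steady-state offset that proportional action alone would leave.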

Of course, part of keeping those systems efficient and effective was isolating them from intentional and unintentional interference. As a result, engineering teams employed the Purdue model and segregation of systems to ensure the control system was isolated in its own layer, and — perhaps most importantly as technologies advanced — isolated from the internet. At the time, plants were frequently instrumented for control only, and that model of control worked well for safe, efficient operation of individual facilities. Engineers focused on ensuring instruments could communicate with PID controllers. Control capabilities and optimal human-machine interfaces were therefore key criteria for selecting technologies.

Even in the most forward-thinking plants — those applying advanced process control strategies and technologies — the control system was still isolated within its own layer in the Purdue model. Communication occurred among different control technologies in different process units, but that communication was carefully managed and tightly controlled. Systems worked well and were safe, but they were not necessarily prepared for impending changes, particularly a connected future of self-optimizing plants, a shrinking workforce, and shifting business and public expectations. These and other trends upended process manufacturing across the span of only a few years.


New needs

For years, teams have considered sustainability initiatives, energy management and, in some cases, even predictive-reliability improvements to be burdens at odds with production. However, these needs are simply high-level optimization problems, with each requiring teams to look at process control and operations in a more holistic way.

Much of this struggle stems from the fact that, for decades, automation — often because of rigid design structure — has been implemented plant by plant or unit by unit, leaving a wide variety of systems all operating independently. Many process plant teams have long since maximized the gains from such a strategy. But sustainability, energy and reliability initiatives are refocusing efforts on optimizing the entire fleet or enterprise to capture previously unnoticed or undervalued synergies.

For a company with multiple plants manufacturing the same product, a focus on enterprise optimization typically results in significant gains from balancing production based on equipment status or green energy availability. Or, for companies with limited personnel, enterprise optimization can make it easier to move people and products from one site to another, creating consistency among plants to improve efficiency. But all these strategies require the implementation of modern and upcoming advanced automation technologies.


Changing needs, new era

Today, the process control landscape is rapidly changing. As shifts in corporate and public opinion have increased focus on reliability, sustainability, energy use and more, operations teams are rapidly transforming the way they design, build and operate their fleet of plants. With those changes come new needs (Figure 1).

FIGURE 1. New market needs have created an inflection point, along with a corresponding demand for a new automation paradigm

Increased production now rarely comes from simply deploying a new plant to manufacture more product, and costs are higher and demand is more volatile than ever before. As a result, companies are much more cautious about approving capital expenditures for new plants across a fleet. Moreover, building a new plant requires more people to operate it — and experienced people are becoming harder to find.

Instead, today’s successful companies are focused on optimizing existing assets. But because many plants have been optimizing their local operations for years, accomplishing a true paradigm shift in operations requires an expanded vision. The most successful organizations are optimizing from an enterprise level to drive improvements, with more confidence in the strategies their teams implement to achieve these objectives.

For example, more and more teams are identifying a need for advanced analytics to optimize operations across their fleets. By leveraging artificial intelligence (AI) and machine learning (ML), analytics technologies provide guidance for holistic optimization. Data from a robust suite of sensors, software and control technologies enable companies to optimize the business and sustainability performance of their plants and enterprise through advanced asset and business optimization software.

Similarly, many teams are finding a need to employ powerful simulation software to identify, design and test process changes to optimize operations. Simulation software empowers enterprise teams to use design models to identify potential changes and improvements across an organization’s facilities.

Many teams further improve on the gains of simulation with digital twin software, replicating entire plants to see how changes to improve reliability, sustainability and energy use will cascade across all areas. And when optimization is complete, those same digital twins are used to train operators on the new processes (Figure 2).
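Conceptually, a digital twin in this role is a validated process model run offline. The toy sketch below uses an assumed first-order tank model (not any real plant or product) purely to show the workflow: evaluate a candidate operating change against the model before touching live equipment.

```python
def simulate_tank(inflow, drain_coeff=0.2, steps=600, dt=0.1):
    """Run an assumed first-order tank model and return the final level."""
    level = 0.0
    for _ in range(steps):
        # Euler integration of d(level)/dt = inflow - drain_coeff * level
        level += (inflow - drain_coeff * level) * dt
    return level

# Evaluate a proposed operating change offline, not on the real plant:
baseline = simulate_tank(inflow=10.0)  # current operating point
proposed = simulate_tank(inflow=8.0)   # candidate energy-saving change
```

The same model that answers "what happens if we reduce the inflow?" can then be rerun interactively to train operators on the new operating point, which is the dual use described above.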

FIGURE 2. A digital twin can replicate an entire plant, empowering operations teams to test and train on new operations strategies to improve performance, without the risk or downtime of using live equipment

Typically, these new systems (and their associated integration with business systems) also create a need for cloud connectivity. But letting control systems touch the internet introduces a wide array of cybersecurity concerns. In the past, those concerns were alleviated by completely isolating the control system from the outside world. Today, however, as cloud connectivity becomes a need rather than a desire, operations teams are flattening the Purdue model to simplify and enable external connectivity. As they do so, they dramatically increase their need for more cybersecure, data-centric solutions.


Old strategies struggle

At the heart of all these new strategies are data, and operations teams are rapidly bringing in the technology necessary to gather those data, add value to them, and make them easy to access and use. Input/output (I/O) points are expanding, with many plants adding three to ten times the number of I/O points they had only five or ten years ago, many of them inputs from wireless instruments. A vast new array of intelligent field devices is providing companies with access to far more data than ever before, but only if they can harness the data effectively.

Creating value from data is difficult using legacy automation strategies. Often, teams are hesitant to connect intelligent field devices used outside process control into the existing distributed control system (DCS) because of the cost of the DCS infrastructure. To avoid the extra cost, operations teams instead design new infrastructure to collect and move data where they need it, which in turn increases complexity and creates new silos.

Creating new infrastructure introduces its own set of problems, however. Most importantly, when process data bypass the control system, they lose all the critical contextual metadata the control system can provide, which is often needed to turn raw data into valuable information. Data without context can still be helpful if used immediately and locally, but over time, and when aggregated, they become far less valuable.
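The "context" at stake here is concrete: units, timestamps, quality flags and asset hierarchy. A minimal sketch of a contextualized reading, using illustrative field names rather than any standard schema, shows what is lost when a raw value travels alone.

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class ContextualizedReading:
    """A raw value plus the metadata a control system would normally supply.
    Field names are illustrative, not a standard schema."""
    tag: str        # instrument tag, e.g. "TT-101" (hypothetical)
    value: float    # the raw measurement a bypass architecture would carry alone
    units: str      # without this, 182.4 is ambiguous
    timestamp: str  # ISO 8601; needed for any later aggregation
    quality: str    # e.g. "GOOD", "UNCERTAIN", "BAD"
    asset: str      # parent equipment, enabling enterprise roll-ups

reading = ContextualizedReading(
    tag="TT-101", value=182.4, units="degC",
    timestamp="2024-05-01T12:00:00Z", quality="GOOD", asset="Reactor-1",
)
record = asdict(reading)  # dict form, serializable for historians or cloud tools
```

A bypass architecture typically forwards only `value`; every downstream consumer then has to reconstruct the rest, which is exactly the aggregation problem described above.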

Moreover, every time the team creates a new infrastructure for moving data, they create a new data silo. Such a data set may be useful in a single reliability tool or edge analytics package, but it is typically not available across the enterprise to the other users and cross-functional teams that might otherwise be able to use it.

Relying on these legacy strategies puts operations teams in a no-win situation. They lose the critical context of their data and create additional silos, inhibiting access and making it harder to use data effectively for meaningful optimizations. And while teams can layer edge solutions over siloed architectures to help move data where it is needed, those types of solutions do not address the core problem of creating a truly connected system to seamlessly move data sets where teams need them without losing any context.

To meet those more holistic needs, today’s forward-thinking automation providers are breaking down the boundaries around automation technologies by expanding the next-generation control system to incorporate all essential operational domains.


Integrating operations

To truly get the most benefit from automation, organizations need an architecture that will seamlessly integrate the software, tools, technologies, data and controls of a next-generation automation stack across every operational domain, from the field to the edge to the cloud. Driving this type of holistic optimization, necessary to compete in a global market, will require more than complex independent edge architectures to collect, standardize and democratize data.

Innovators leveraging a boundless automation vision will use technology across every domain — reliability, sustainability, energy use, quality and more — to perform optimization across the enterprise. Much like enterprise business solutions, boundless automation will eliminate silos through a scalable, interoperable, and extensible unified data model (Figure 3).

FIGURE 3. As organizations embrace more automation connectivity across their enterprise, the isolated architectures of the past are making way for an integrated, cohesive boundless automation architecture based on the next-generation control system

To accomplish this vision, innovative organizations will shift their operations strategies to embrace digital transformation technologies that expand automation and control to encompass all process automation needs. They will begin implementing next-generation DCS technologies that have evolved to become enterprise automation platforms, without giving up on core process control. Control will still be a strong core competency, but it will be one of the functions manipulating and modifying contextualized field data.

While much of this boundless automation future is focused on advancements beyond the horizon, the building blocks are visible today, and they provide a roadmap to help organizations prepare for the brave new world of automation that is rapidly approaching.


Unified software ecosystem

New technologies and applications combined with market needs — including “born digital” companies, decentralized operating models, and the move toward self-optimized plants — have created demand for a new automation paradigm where a unified software environment streams data across the enterprise effortlessly, when and where it’s needed. Today’s operators expect intuitive, integrated software platforms, much like the solutions that revolutionized business software ten or fifteen years ago.

As teams begin to build an automation architecture that encompasses all domains, they will rapidly eliminate segmentation. All operational domains will be integrated into a single, cohesive software ecosystem, with information from intelligent field devices, edge systems, and the cloud all coming together as part of a cohesive data model. Essential data will be democratized — standardized in format and laden with context — for easy collaboration among cross-functional teams. Moreover, those same data will be securely available on demand to authorized people across the enterprise, anytime from anywhere in the world.


A single technology stack

Modern architectures, technologies and protocols — such as Ethernet advanced physical layer, 5G communication, OPC UA, hyperconverged infrastructure, containerization and Kubernetes — are creating a bridge between operational technology (OT) and information technology (IT), enabling easier connectivity and more data flow.

For example, smart sensors can perform more tasks right at the asset, pass large amounts of data directly to process control, and send data to reliability and analytics software hosted at the edge or in the cloud. Not only do these new technologies improve visibility and decision making at the enterprise level, they also assist operations teams in their efforts to improve operations. Teams will be able to use new, critical information from the field to make better decisions and continuously improve efficiency.


Multiple domain optimization

While most modern plants can optimize their own operations domain by domain, a boundless automation architecture creates a framework for balancing multiple domains against one another simultaneously to deliver enterprise-wide optimization. A unified software platform combined with next-generation automation systems will help organizations continue to digitally transform their operations through an intuitive ecosystem of seamlessly interconnected technologies from the intelligent field across the edge and into the cloud. Every piece of operations-relevant data will flow securely from any operations domain to all relevant OT functions in the ecosystem.

Cross-functional teams from all levels of the organization will be able to instantly access that democratized and contextualized data. When needed, these data can be fed to artificial intelligence and machine learning systems, simulations and digital twins, reliability and energy monitoring, and other solutions hosted on premises or in the cloud. Using this deep well of critical data, teams will be able to develop the control strategies necessary to ensure peak performance, drive more sustainable operation, lower energy use, and begin the move toward autonomous operation.


Preparing for the future

Today’s automation technologies are purpose-built for improved performance, but often leave out domains critical to modern, efficient, fully optimized operation. In most cases, this disconnect happens because data in those domains are stranded, or accessible only through custom overlay architectures built at the edge. But the newest automation technologies on the market today, as well as those coming in the years ahead, will change that paradigm.

Operators want comprehensive, unified solutions that deliver consistency and ease of use, and forward-thinking automation providers are modernizing control systems to meet that need. Tomorrow’s automation architecture will dramatically expand to encompass all domains, providing highly contextualized data that are standardized for use by any authorized person or system, anywhere and at any time.

As more companies modernize their automation systems to deliver the flexibility, standardization, and visibility necessary to meet a future more focused on malleable, sustainable operation, they are changing the criteria by which they evaluate the technologies that will carry them into the future. Efficient, effective control is no longer enough. Fully optimized operation across the enterprise will require an automation vision for industrial architecture that integrates all of operations, using next-generation tools to make the most of data.

Edited by Gerald Ondrey



All figures courtesy of Emerson



Claudio Fayad serves as vice president of technology of Emerson’s Process Systems and Solutions business (1100 W. Louis Henna Blvd., Round Rock, TX 78681-7430; Phone: 512-835-2190; Email: [email protected]). Prior to this role, Claudio held a variety of positions within Emerson, from sales and marketing director to vice president of software. He joined Emerson as director of Process Systems and Solutions in May of 2006, based in Brazil. Claudio holds a bachelor’s degree in engineering from the Universidade Estadual de Campinas and a master’s degree in business administration from Northwestern University.