The field of data analytics is evolving quickly, and an increasing number of tools are being designed with chemical engineers in mind.
Big data, smart devices, Internet of Things (IoT) — these are terms heard more and more frequently in the chemical process industries (CPI). But as connectivity proliferates, questions arise about how CPI facilities can best make use of the wealth of data they are collecting (Figure 1). “Data analytics are here to stay — those who don’t use it will lose out to competitors who do,” says Petri Vasara, vice president of management consulting at Pöyry plc (Helsinki, Finland; www.poyry.com).

The benefits reach beyond process improvements into areas such as safety and sustainability. Vasara lists minimizing raw-materials consumption and reducing emissions as two specific environmental benefits realized by data analytics. “Modern process-control equipment provides significant amounts of data that can be used for proactive disturbance elimination and process stabilization, promoting not only safety, but higher efficiency and sustainability,” explains Tom Lind, vice president of technology and new solutions at Pöyry. Introducing these principles into CPI sites will require continued dedication and diligence, integrating new elements piece by piece and improving upon existing solutions, says Vasara. “It’s an evolution rather than a revolution,” he emphasizes.
Designing tools for end users
Although data availability and analysis technologies have made massive strides in recent years, these platforms are limited if end users cannot easily apply them. TrendMiner (Hasselt, Belgium; www.trendminer.com) provides an advanced analytics platform that engineers can use to analyze, monitor and make predictions from massive volumes of time-series data. “We are aware that the engineers we work with are not data scientists, but chemical engineers. They have a different background and different questions arise for them,” says Thomas Dhollander, chief technology officer and co-founder of TrendMiner (Figure 2). “We identified a number of day-to-day questions that process experts are confronted with, and we found ways to answer them,” he explains.
Giving engineers the ability to interpret their own data in ways that make sense to them reduces the bottleneck of bringing in data scientists to provide insight. For instance, an engineer looking to diagnose a fouling problem could search for similar instances in the history of a unit and develop and validate a hypothesis for future operational anomalies. The system can recommend certain parameters to suggest root causes and compare events, and it can even reveal early indicators for adverse asset behavior. “You can basically create communication channels across the plant about possible early indicators so operators can intervene before abnormal situations occur,” says Dhollander. Another benefit, explains Edwin van Dijk, TrendMiner’s vice president of marketing, is ensuring that assets operate within an optimal, less stressful operating zone so that they can run longer.
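A historical pattern search of this kind can be pictured with a short sketch. The code below is a hypothetical illustration, not TrendMiner's actual method or API: it slides a window over a unit's history and ranks windows by z-normalized distance to the query pattern, so the engineer sees the episodes that most resemble the anomaly under study.

```python
import numpy as np

def find_similar_windows(history, query, k=3):
    """Rank historical windows by z-normalized Euclidean distance
    to a query pattern; return start indices of the k best matches."""
    m = len(query)
    q = (query - query.mean()) / (query.std() + 1e-9)
    scored = []
    for i in range(len(history) - m + 1):
        w = history[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-9)
        scored.append((np.linalg.norm(w - q), i))
    scored.sort()
    return [i for _, i in scored[:k]]

# Toy data: a fouling-like drift at hour 100 resembles the event
# being diagnosed at hour 400
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.1, 500)
series[100:120] += np.linspace(0.0, 2.0, 20)
series[400:420] += np.linspace(0.0, 2.0, 20)
matches = find_similar_windows(series[:390], series[400:420], k=3)
```

Production tools use far more scalable indexing, but the core idea, matching the shape of a signal against its own history, is the same.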
The company recently worked with Covestro AG (Leverkusen, Germany; www.covestro.com) to initiate an energy-savings project at Covestro’s site in Antwerp, Belgium (Figure 3). The team logged energy consumption and compared it against historical values, looking for unexpected spikes. This led to an examination of steam use, and the team reconfigured a control scheme to prevent reactor temperature fluctuations caused by spontaneous heating and cooling. These types of comparative analytics are also useful in other improvement areas, such as the reduction of cycle times in batch processes, mentions Van Dijk.
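A comparison against a rolling historical baseline can be reduced to a few lines. The thresholds and data below are invented for illustration, not Covestro's actual method: each reading is flagged when it exceeds the mean of the preceding window by several standard deviations.

```python
import numpy as np

def flag_spikes(consumption, window=24, n_sigma=4.0):
    """Flag samples that exceed the rolling mean of the preceding
    `window` samples by more than `n_sigma` standard deviations."""
    flagged = []
    for i in range(window, len(consumption)):
        baseline = consumption[i - window:i]
        if consumption[i] > baseline.mean() + n_sigma * baseline.std():
            flagged.append(i)
    return flagged

# Toy hourly steam-consumption log with one unexpected spike
hourly = 100.0 + np.random.default_rng(1).normal(0.0, 2.0, 200)
hourly[150] = 130.0  # the anomaly a comparative analysis should surface
spikes = flag_spikes(hourly)
```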
Dhollander believes that the growing awareness of data’s value will be transformative for the CPI. “Data science evolution is happening in different areas in parallel, and this will lead to a better understanding of processes and make new products or business models possible,” he adds. The self-service approach lends itself to faster adoption, he explains, because there is no steep learning curve for process experts to begin using the tools. The next generation of data analytics may become even more accessible, as TrendMiner developers are looking to other markets for inspiration, including the creation of a “recommendation engine” modeled after e-commerce websites. “When working with measurement data, you will be presented with relevant suggestions of early indicators that potentially relate to an anomaly you’re studying,” he says.
Michael Risse, vice president of Seeq (Seattle, Wash.; www.seeq.com), further underlines the importance of direct interaction between engineers and their process data without requiring assistance from data scientists, and the ability to share those data across organizations and enterprises. “This makes it easier to collaborate, in realtime if required, across teams and sites by providing a productive and organization-friendly approach,” says Risse. Seeq provides a network of connectors to organize and analyze process data housed in historians and other data sources, without moving or copying the data. This enables engineers to query and interact with the data without modifying the source itself (Figure 4). Seeq has seen success using its data cleansing and capsule technologies to eliminate noise from sensors, allowing users to quickly identify critical operating conditions contributing to adverse situations, such as fouling. A focus on root causes eliminates the false positives that are sometimes seen in other analytical approaches, making it easier to determine optimal conditions and avoid outages, explains Risse.
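The capsule idea, isolating the time periods in which a condition of interest holds, can be sketched simply. This is an illustration of the concept only, not Seeq's implementation: contiguous samples satisfying a condition are grouped into (start, end) periods that can then be compared or counted.

```python
def capsules(timestamps, values, condition):
    """Group contiguous samples satisfying `condition` into
    (start, end) periods, the core idea behind capsule analysis."""
    periods, start = [], None
    for t, v in zip(timestamps, values):
        if condition(v) and start is None:
            start = t                      # a new period opens
        elif not condition(v) and start is not None:
            periods.append((start, prev))  # the period just closed
            start = None
        prev = t
    if start is not None:
        periods.append((start, prev))      # period still open at the end
    return periods

# Toy hourly reactor temperatures; capsules where temperature exceeds 85
temps = [70, 71, 86, 88, 87, 72, 90, 91, 70]
hours = list(range(len(temps)))
hot = capsules(hours, temps, lambda v: v > 85)
```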
Looking forward, Risse believes that to truly embrace modern data-analytics capabilities, companies need to break the mold. “The big opportunity in data analytics is still to move beyond using spreadsheets,” he says. He notes that while data analytics is possible using spreadsheets, it is cumbersome and more time-consuming than modern analytics software. He believes the CPI should look to the same underlying technologies utilized by more consumer-facing sectors to facilitate their data analytics, including the principles employed by large entities like Amazon, Uber and Google. “It’s time to get beyond spreadsheets as the everyday tool for investigation and reporting. Using a spreadsheet to perform data analytics is like using a wrench to drive a nail.”
Analyzing the human element
Advances in data analytics have enabled wide-ranging, cross-industry studies that previously would have been impossible to conduct. A recently completed project led by Texas A&M University’s (TAMU; College Station, Tex.; www.tamu.edu) school of public health, in collaboration with the college of engineering, investigated the application of written procedures at several different companies’ operating sites across the globe and the ramifications on process safety. The team conducted extensive interviews with numerous plant operators across multiple sites. The research led by Farzan Sasangohar, an assistant professor of industrial and systems engineering at TAMU, used qualitative data analysis (QDA) to generate hypotheses and recommendations on a broad range of topics affecting safety procedures, from the effects of weather to the provision of handheld devices. “There are many systemic issues in industry procedures that still need to be uncovered, and a lot of them are difficult to uncover because they are deep in the system’s design, and they are societal or psychological,” says Sasangohar. The team used a QDA method known as grounded theory, which enabled interviewees’ responses to be coded and analyzed to reveal commonalities and patterns on different environmental, organizational, team and task levels. “These levels all interact. There are personal issues, there are team-level issues, organizational and societal issues, and they all interact in really complex ways. The only way to uncover these issues is to conduct a large-scale holistic study to understand the contributors to safety culture,” he continued.
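The mechanics of this kind of coding can be pictured with a toy tally. The codes and excerpts below are invented for illustration and are not the study's actual data: each interview excerpt carries analyst-assigned codes, and counting codes and code pairs is one simple way patterns and commonalities surface across levels.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded interview excerpts: each carries the codes an
# analyst assigned and the level at which it was tagged
excerpts = [
    {"codes": {"automaticity", "time_pressure"}, "level": "task"},
    {"codes": {"automaticity", "staffing"}, "level": "team"},
    {"codes": {"reactive_updates", "staffing"}, "level": "organizational"},
    {"codes": {"automaticity", "reactive_updates"}, "level": "organizational"},
]

# Tally individual codes and co-occurring code pairs, the kind of
# pattern-surfacing step that grounded-theory coding enables
code_counts = Counter(c for e in excerpts for c in e["codes"])
pair_counts = Counter(
    pair for e in excerpts for pair in combinations(sorted(e["codes"]), 2)
)
```

Real QDA software layers memoing, hierarchy and inter-rater checks on top of this, but frequency and co-occurrence of codes remain the basic quantities analyzed.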
One pattern the work revealed is a phenomenon the team dubbed “automaticity,” meaning that operators become so experienced in doing certain tasks that they complete them simply via rote repetition rather than referencing the official procedures. It is in situations like this, says Sasangohar, that human error and distraction can become increasingly problematic. Another issue revealed in the study is the prevalence of a “reactive” mindset with regard to updating procedures. “Even though organizations know a problem may exist, they still do not intervene by making procedural changes until an incident actually occurs,” explains Sasangohar. “There are still systematic barriers to making the ‘safety first’ mindset a reality. Recent advances in training for safety and human factors engineering have resulted in engineers who are more prepared and resilient at a personal level, but organizations themselves must be resilient. This does not change overnight. Systematic intervention must happen,” he emphasizes.
The sheer amount of information required for this study has taken years to collect, codify and analyze, and the team believes that many more articles and recommendations will be generated from this study. According to Sasangohar, an area of opportunity in data science that will help facilitate similar studies is improving probability and risk assessment with regard to human error and decision-making. “Some of the current activities to analyze human reliability require superficial quantification of qualitative data. We try very hard to quantify risk for human decision-making and human error, but much work still remains. Multiple efforts are moving forward to address this challenge,” he says.
Predicting the future with big data
A key part of effectively implementing data analytics is collecting a large enough volume of the most relevant data. “Chemical plants have used production-driven data, such as temperatures, pressures and flowrates, to manage processes for decades, but an explosion in data volumes is coming as the costs of sensors and data-acquisition systems drop, and the value of data for other functions, such as maintenance and reliability, is proven,” says Bob Miller, director of predictive diagnostics at John Crane Group (Chicago, Ill.; www.johncrane.com). However, he says that most plants do not currently capture the data required to properly drive a predictive maintenance strategy. “Process data alone cannot tell the whole story about the health of an asset. Sensing and data acquisition must be a part of the solution,” he points out.

John Crane is currently piloting a new predictive-diagnostics platform for mechanical seals and other rotating-machinery components at several CPI facilities. The company expects a full commercial rollout of this technology in 2018. The platform features proprietary algorithms created with input from rotating-machinery experts. “The goal of our algorithms is to have enough specificity on what’s going wrong to give sufficient lead time to recommend some kind of minimally invasive maintenance action to prevent failure,” Miller explains. “We are selling people the knowledge of when something may fail and what can be done to prevent it.” While some of the operational and safety benefits realized by predictive diagnostics may seem obvious, Miller points out that one of the most significant parameters is still overlooked. “Even the most sophisticated users may not be tracking the impact of lost production due to an unplanned failure on a routine basis,” he adds.
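The lead-time idea, projecting when a degrading indicator will cross an alarm limit, can be sketched with a simple linear trend fit. This is a generic illustration, not John Crane's proprietary algorithm, and real condition monitoring uses far richer models and sensor fusion.

```python
import numpy as np

def hours_to_threshold(hours, readings, limit):
    """Fit a linear trend to a degrading health indicator and estimate
    how many hours remain before it crosses the alarm limit.
    Returns None if the indicator is not trending upward."""
    slope, intercept = np.polyfit(hours, readings, 1)
    if slope <= 0:
        return None
    t_cross = (limit - intercept) / slope
    return max(0.0, t_cross - hours[-1])

# Toy vibration trend (mm/s) sampled every 10 hours, drifting upward
t = np.arange(0.0, 100.0, 10.0)
vibration = 2.0 + 0.05 * t
lead = hours_to_threshold(t, vibration, limit=12.0)
```

A lead time like this is what turns a raw sensor stream into a maintenance recommendation: enough warning to schedule a minimally invasive intervention before failure.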
A major challenge in implementing these types of strategies, however, is the current practice of “siloing” plant data. According to Miller, many sites will run predictive diagnostics on a single machine, rather than looking at the entire process, sometimes resulting in unexpected maintenance burdens. “Sometimes we observe that an operational decision designed to increase production in one unit can have an impact on the maintenance of a downstream unit. Bridging gaps in those data silos can unlock another level of optimization for plant operations,” he states (Figure 5). Miller has observed a few market leaders in the CPI who are proactively implementing integrated data-analytics strategies. There is, however, still much work to be done in moving toward a more plant-wide model of data analytics. Miller believes that as value is proven for analytics in specific applications, more users will become receptive to integrated, overarching platforms.
In January, Black & Veatch (Overland Park, Kan.; www.bv.com) launched a new subsidiary focused on data analytics called Atonix Digital. According to Fred Ellermeier, president of Atonix Digital, the group’s focus is deploying operational intelligence and adaptive planning capabilities through the Asset360 platform, a predictive cloud-based artificial intelligence system. “The big change we’ve seen is the openness to allow very mission-critical systems to be monitored, diagnosed and planned with cloud-based solutions,” says Ellermeier. While cloud-based solutions are very powerful, people are still learning to be comfortable with data residing in the cloud. However, according to Ellermeier, through encryption and anonymization, data can be stored very securely in the cloud, and in many cases more securely than on premises.
“The operational intelligence space is becoming more crowded, but the predictive side of things is quite cutting-edge — being able to tell the future using large streams of data,” explains Ellermeier. This includes being able to incorporate new streams of data — weather patterns, for instance — and overlay them with detailed historical data.
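Overlaying a sparse external stream, such as weather observations, onto dense historical data amounts to an "as-of" join: each historical sample picks up the most recent external reading. The sketch below uses invented data and the standard library only; production platforms do the same alignment at scale.

```python
from bisect import bisect_right

def overlay_latest(base_times, feed_times, feed_values):
    """For each base timestamp, attach the most recent value from a
    second (e.g., weather) stream; None if no earlier observation.
    Both time lists must be sorted ascending."""
    out = []
    for t in base_times:
        i = bisect_right(feed_times, t)
        out.append(feed_values[i - 1] if i > 0 else None)
    return out

hours = [0, 1, 2, 3, 4, 5]      # hourly asset-history timestamps
weather_hours = [0.5, 3.5]      # sparser weather-feed timestamps
temps_c = [-5.0, -12.0]         # temperature at those feed times
aligned = overlay_latest(hours, weather_hours, temps_c)
```

Once aligned, the combined record (asset behavior plus ambient conditions) is what a predictive model trains on.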
Atonix has successfully used predictive analytics for leak prevention in water-treatment sites. “We use advanced tools to predict where leaks will happen next. New adaptations in data sciences have allowed us to be able to do that,” says Ellermeier. “You can begin to predict when problems are going to occur and look at how to take care of assets in a controlled manner before problems arise,” he continues. Atonix has also used advanced analytics to optimize water-treatment pumping stations. Work has also been done supporting analytics for waste-to-energy and combined heat and power projects. Atonix is looking to expand its footprint further into the CPI, and plans are underway to branch into the food-and-beverage market with new predictive tools.
A number of other groups from across the CPI have announced business efforts in the digitalization and analytics realm. In January, Koch-Glitsch, LP (Wichita, Kan.; www.koch-glitsch.com) entered a partnership with EFT Analytics focused on applying machine learning to enable comprehensive visibility of live plant operations for refining and petrochemical plants. BASF SE (Ludwigshafen, Germany; www.basf.com) invested in startup company Ahrma Holding B.V. to accelerate analytics-based logistics and supply-chain capabilities. Last year, thyssenkrupp Industrial Solutions (Essen, Germany; www.thyssenkrupp-industrial-solutions.com) began offering data-based services to the mining sector, including a nearly universally compatible radar technology to perform online, realtime measurements. SUEZ (Paris, France; www.suez-environnement.fr) recently introduced a cloud-based, analytics-driven asset management system with geomapping technology for water-treatment sites. These are surely only a fraction of the analytics efforts coming down the CPI pipeline in the coming months.
The 2018 Connected Plant Conference (www.connectedplantconference.com), focused on digitalization and connectivity trends, will take place February 26–28 in Charlotte, N.C.