
Online Exclusive: Q&A on AI and digitalization with Honeywell’s Tathagata Basu

By Scott Jenkins

The following questions and answers come from a conversation between Chemical Engineering magazine and Tathagata Basu, General Manager – Refining and Petrochemical at Honeywell Process Solutions (Houston, Texas; process.honeywell.com).  The discussion revolves around the use of artificial intelligence (AI) technologies in chemical manufacturing operations, as well as the deployment of other digitalization technologies and the progress of digital initiatives. The Q&A follows:

Question from Chemical Engineering magazine: What are the most common use cases for artificial intelligence in chemical process facilities that you have observed?

Answer from Honeywell’s Tathagata Basu: In our experience, the most common uses of artificial intelligence (AI) and machine learning (ML) have been in the following areas:

  • Pattern Recognition: Using ML tools such as clustering on real-time process and alarm data to find repeatable patterns in the conditions that cause abnormal situations in plant operations (see the clustering sketch after this list).
  • Leak Detection: AI is used to classify thousands of pixels of hyperspectral images, detecting and identifying gas leaks in a process plant so that operators can be proactively alerted.
  • Convergence of physics-based and data-driven models for:
    • Soft Sensor: Using both physics-based and data-driven ML models to predict the quality of product streams (especially in the reactor), with bias adjustments from intermittent lab readings (see the bias-update sketch after this list). This is extremely useful for operators and engineers: they get a reasonably good approximation immediately and can act to correct the process, rather than waiting hours for a lab analysis.
    • Process Optimization: Using hybrid models (an ensemble of first-principles physics-based models and ML models) to identify multi-variability in a complex unit operation with intermittent nonlinearity, and to execute multivariable predictive control with a model-based controller.
    • Asset Reliability: Using heuristics and machine learning-based classification algorithms to diagnose faults from real-time symptoms and alarms.
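
For illustration, the clustering idea in the pattern-recognition bullet can be reduced to a minimal sketch: encode alarm activity as count vectors per time window and group similar windows. This is not Honeywell's implementation; the window encoding, cluster count and synthetic data below are assumptions for demonstration only.

```python
# Minimal sketch: cluster alarm-activity windows to surface repeatable
# patterns that precede abnormal situations (hypothetical encoding).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Each row is one 10-minute window; each column counts alarms on one tag.
# In practice these would come from the plant historian / alarm journal.
alarm_windows = rng.poisson(lam=2.0, size=(500, 8)).astype(float)

X = StandardScaler().fit_transform(alarm_windows)

# Group the windows into a handful of recurring alarm "signatures".
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Windows in one cluster share a similar alarm pattern; an engineer can
# then check whether any cluster consistently precedes an upset.
for label in range(kmeans.n_clusters):
    count = int(np.sum(kmeans.labels_ == label))
    print(f"pattern {label}: {count} windows")
```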
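
The soft-sensor bullet can be sketched in the same spirit: a fast model prediction corrected by a bias that is nudged whenever an intermittent lab reading arrives. The linear stand-in model and the smoothing factor below are illustrative assumptions, not the actual product logic.

```python
# Minimal sketch: soft sensor with bias adjustment from intermittent
# lab readings (illustrative; the underlying model is a toy stand-in).
class SoftSensor:
    def __init__(self, model, alpha=0.3):
        self.model = model      # fast physics-based or ML predictor
        self.alpha = alpha      # smoothing factor for the bias update
        self.bias = 0.0

    def predict(self, process_inputs):
        # Fast estimate available every control cycle.
        return self.model(process_inputs) + self.bias

    def update_from_lab(self, process_inputs, lab_value):
        # The intermittent lab analysis (which may arrive hours later)
        # nudges the bias toward the observed model error.
        error = lab_value - self.model(process_inputs)
        self.bias = (1 - self.alpha) * self.bias + self.alpha * error

# Usage with a toy linear "model" standing in for the real predictor.
sensor = SoftSensor(model=lambda x: 0.8 * x[0] + 0.1 * x[1])
print(sensor.predict([10.0, 5.0]))           # raw estimate
sensor.update_from_lab([10.0, 5.0], 9.0)     # lab reading arrives
print(sensor.predict([10.0, 5.0]))           # bias-corrected estimate
```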

CE: What are some emerging use cases for AI that you are anticipating for fast growth in 2024-2025?

TB: With the proliferation of powerful edge devices, ML coupled with AI is emerging as the new accelerant for deploying AI in chemical plant operations. Some use cases are enumerated below:

  • Emission Calculation: Using machine learning algorithms to calculate emissions on the fly at the equipment level by triangulating several types of real-time and discrete data (a simplified factor-based roll-up is sketched after this list).
  • Operator Guidance: Using heuristics and sophisticated ML, it is now possible to predict future plant performance from historical information and guide operators through difficult conditions while they navigate a tight operating window.
  • Root Cause Analysis: Using knowledge graphs and ML algorithms to autonomously contextualize data, identify the causes of a KPI variance, and generate actionable insights by ranking the major causes for future action.
  • Vision Sensing for Inspection: The convergence of computer vision, drone technology and AI has enabled multiple use cases for inspecting and monitoring confined spaces in chemical facilities that are extremely dangerous for humans to enter.
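
The ML-based "triangulation" described in the emission-calculation bullet is more than a short example can show, but the classical factor-based roll-up it builds on is easy to sketch. All emission factors, equipment tags and flow values below are invented placeholders.

```python
# Minimal sketch: equipment-level CO2 roll-up from real-time fuel flows
# and per-fuel emission factors (all numbers are placeholders).
FUEL_CO2_FACTORS = {          # kg CO2 per kg of fuel burned (illustrative)
    "natural_gas": 2.75,
    "fuel_oil": 3.15,
}

def co2_rate(fuel: str, fuel_flow_kg_h: float) -> float:
    """Instantaneous CO2 emission rate (kg/h) for one piece of equipment."""
    return FUEL_CO2_FACTORS[fuel] * fuel_flow_kg_h

# Real-time readings, e.g. from the historian (hypothetical tags).
equipment = [
    ("furnace_F101", "natural_gas", 1200.0),
    ("boiler_B201", "fuel_oil", 800.0),
]

total = 0.0
for name, fuel, flow in equipment:
    rate = co2_rate(fuel, flow)
    total += rate
    print(f"{name}: {rate:,.0f} kg CO2/h")
print(f"site total: {total:,.0f} kg CO2/h")
```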

CE: How has the availability of AI-powered large language models changed the approach to using AI at plant sites? 

TB: Natural language processing (NLP) built on large language models (LLMs), along with techniques such as sentiment analysis, has the potential to change the landscape of how generative AI can be used in the industry. Some of the use cases we are working on are as follows:

  • NLP using LLMs is driving some of the advisory-service-based applications, for example using generative AI to provide a “Siri”-like feature in the operator station with conversational UI capabilities to support operator actions during abnormal situations. Once actions are taken, the generative AI (with support from other systems) could generate a report autonomously, saving much of the time spent collating and generating shift reports.
  • NLP is also being used as an intelligent search tool for operators and engineers to look for information, saving thousands of man-hours spent analyzing and deciding on an identified issue (a simplified retrieval sketch follows this list). Generative AI algorithms can be trained on multiple documents and systems to find answers to questions that would ordinarily require multiple departments and agencies to answer. This is typically used by functional teams that need to get information and coordinate activities with the plant.
  • Generative AI with reinforcement learning is also being used to autonomously drive workflows that previously required a human to make decisions, by ensuring that human-like factors such as emotion and cultural dimensions are captured in the model as it learns from its predictions.
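
As a simplified stand-in for the intelligent search described above, the sketch below retrieves the most relevant plant document for a free-text question using TF-IDF similarity; in a real deployment the retrieved passages would be handed to an LLM to compose the answer. The documents are invented.

```python
# Minimal sketch: retrieve the most relevant plant document for a
# free-text question (TF-IDF stand-in for an LLM-backed search).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [  # stand-ins for SOPs, P&ID notes, shift logs, etc.
    "Compressor K-101 surge prevention procedure and trip settings.",
    "Reactor R-201 startup sequence after catalyst change.",
    "Shift log: flaring event traced to stuck relief valve on V-305.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

question = "why did the flare go off last night?"
q_vec = vectorizer.transform([question])
scores = cosine_similarity(q_vec, doc_matrix).ravel()

best = scores.argmax()
print(f"best match (score {scores[best]:.2f}): {documents[best]}")
# An LLM would then answer the question grounded in this passage.
```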

CE: What are the main hurdles to deploying digital transformation tools like digital twins and machine learning?

TB: In our experience of having implemented several digital transformation programs across industries, there are several hurdles to deploying and scaling large digital transformation initiatives, especially those involving AI and ML. Some of them are listed below:

  • Data Challenge: Large AI/ML models need a lot of data, in both quantity and variety. Ensuring the data is clean and accessible at the right time, with the correct time stamp, is extremely critical (a small alignment sketch follows this list). Once that is taken care of, the next big challenge is contextualizing the data to draw actionable insights.
  • Latency and Cyber Security: Speed and security of data access, especially in a plant environment, are extremely critical. Because of the intrinsic safety and security needs of critical infrastructure, many communication tools used in a commercial environment are not applicable in an industrial context. Care should therefore be taken to ensure the right, safe technologies are used for digital transformation initiatives.
  • Bias and Causality: The biggest hurdle to deploying AI-based solutions is that prediction models are often considered a “black box,” where it is difficult to ascertain that there was no bias or overfitting in the model. In addition, it is extremely difficult to demystify a prediction to identify causality and make the predictions meaningful and actionable.
  • Scalability: Scaling an industrial digital transformation initiative is always challenging because of the inherent complexity of change management and adoption. In our experience over the last two decades of seeing these engagements, we are convinced that the scalability of a digital transformation initiative is not a technical problem: it is about change management and executive engagement. Success depends on setting key success criteria, setting adoption milestones, driving a change program, and ensuring regular measurement and reporting of progress.
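
To make the time-stamp point in the data-challenge bullet concrete, here is a minimal sketch that aligns slow, intermittent lab samples with fast historian data on their timestamps using pandas; the tags, times and values are invented.

```python
# Minimal sketch: align intermittent lab samples with fast process data
# on their timestamps (pandas merge_asof; all values are invented).
import pandas as pd

process = pd.DataFrame({
    "time": pd.date_range("2024-01-01 00:00", periods=6, freq="10min"),
    "reactor_temp": [350.1, 351.0, 352.3, 351.8, 350.9, 350.2],
})

lab = pd.DataFrame({
    "time": pd.to_datetime(["2024-01-01 00:25", "2024-01-01 00:55"]),
    "purity_pct": [98.7, 99.1],
})

# For each lab sample, attach the latest process reading at or before it.
aligned = pd.merge_asof(lab, process, on="time", direction="backward")
print(aligned)
```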

CE: How are remote monitoring capabilities being used at plant sites?  Where is the most effort being directed in this area?

TB: In our experience, most remote monitoring applications are at unmanned, geographically dispersed, hard-to-reach sites, such as oil-and-gas production sites, transcontinental pipelines, remote regions like Alaska, mining sites and hazardous chemical sites located far from population centers. Most efforts are being directed at:

  • Remote Monitoring of Assets: Remote monitoring centers for reliability and situational awareness use ubiquitous sensing and analytics to ensure that there are no catastrophic failures that could cause extensive damage. The technologies range from conventional wireless instrumentation to robotics, bringing data into the remote centers so that experts can analyze it and give proactive feedback and insights to the local operators. The speed and the ability to process data into actionable insights are the keys to success.
  • Greenhouse Gas Monitoring: In the recent past, a lot of effort has gone into remote monitoring of emissions, driven by ESG mandates. Much of it is directed at detecting CO2 and methane vents and leaks from equipment and pipelines caused by malfunctioning equipment or operator error (a stripped-down alerting loop is sketched after this list). Technologies such as point sensors, sonic and infrared detection, drones and satellites are being used to remotely monitor and report the greenhouse gas emissions of a remote facility.
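
A heavily stripped-down version of such an alerting loop: compare a point-sensor methane reading against a rolling baseline and raise an alert on a large excursion. The window size, alert ratio and readings below are placeholder assumptions.

```python
# Minimal sketch: flag a methane excursion above a rolling baseline
# from a remote point sensor (thresholds are placeholders).
from collections import deque

class LeakMonitor:
    def __init__(self, window=12, ratio=3.0):
        self.readings = deque(maxlen=window)  # recent baseline readings
        self.ratio = ratio    # alert if reading > ratio * baseline

    def update(self, ppm: float) -> bool:
        baseline = sum(self.readings) / len(self.readings) if self.readings else ppm
        alert = (len(self.readings) == self.readings.maxlen
                 and ppm > self.ratio * baseline)
        if not alert:
            self.readings.append(ppm)  # only grow the baseline with normal data
        return alert

monitor = LeakMonitor()
stream = [2.1, 2.0, 2.2, 2.1] * 3 + [9.5]   # steady background, then a spike
for ppm in stream:
    if monitor.update(ppm):
        print(f"ALERT: methane reading {ppm} ppm well above baseline")
```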

CE: To what extent are digital twins having an impact on plant decision-making now?  Is there an impact being felt already, or still in the future?

TB: Digital twins can be data-driven, physics-based or a combination of both, addressing supply chain complexities, process design, control and automation, as well as mechanical integrity. Honeywell has been involved in developing and deploying digital twins for at least two decades.

Those are primarily being used in industries in the following areas:

  • Design for Flexibility: Digital twins of processes and control systems are used in a virtual environment to design and validate large process plants, conducting scenario analysis in steady state and dynamics without having to spend significant time on complex spreadsheets.
  • Accelerated Schedule: Digital twins of physical layouts and processes also help accelerate project schedules by decoupling hardware and software engineering. They allow parallel testing in a safe and secure environment and permit late-stage configuration changes without spending significant time and money on reengineering.
  • Workforce Excellence: Digital twins of processes, plants and equipment allow operators to be trained and developed on a virtual plant without needing the actual equipment. This dramatically fast-tracks competency and shortens the learning curve to a fully qualified operator, since abnormal situations can be addressed in the virtual environment before working in the real plant.
  • Plantwide Optimization: Well-designed, robust digital twins are being deployed by process engineers for plant-wide optimization, using curated “live” data to predict process behavior and adjust operator actions to optimize throughput and energy, ensuring the best performance at all times (a toy optimization sketch follows this list).
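
Reduced to its core, the plant-wide optimization bullet is a constrained optimization in which the digital twin supplies the process model. The toy sketch below minimizes an invented energy-cost model subject to a throughput target; both functions are stand-ins for a real twin.

```python
# Minimal sketch: optimize two operating setpoints against an energy-cost
# objective with a throughput constraint (toy stand-in for a digital twin).
from scipy.optimize import minimize

def energy_cost(x):
    feed_rate, reflux = x
    return 0.5 * feed_rate**2 + 2.0 * reflux**2   # toy twin: energy model

def throughput(x):
    feed_rate, reflux = x
    return 0.9 * feed_rate + 0.2 * reflux          # toy twin: product rate

target = 50.0  # required product rate (arbitrary)
result = minimize(
    energy_cost,
    x0=[40.0, 5.0],
    constraints=[{"type": "ineq", "fun": lambda x: throughput(x) - target}],
    bounds=[(0.0, 100.0), (0.0, 20.0)],
)
print("optimal setpoints:", result.x, "cost:", round(result.fun, 1))
```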

CE: What has been the impact of open process automation standards on digital transformation efforts?

TB: Process control engineers in chemical plants are generally responsible for solutions from a variety of vendors that communicate over varied protocols. In a digital transformation effort, it is important that these varied components can communicate with each other, and with higher-level systems, with minimal effort. Hence, it is necessary to embrace open standards in order to provide a common “language” for the automation components. Open process automation standards have fully embraced the need to standardize and to provide an ecosystem of components that fit into that automation landscape. This is not a new challenge: end users have long challenged the automation industry to develop solutions that make it easy to have a single view into the entire automation landscape. Honeywell answered this challenge in 2003 with Experion® PKS, a system built on open standards. It is very flexible, supporting numerous open protocols, including HART, Foundation Fieldbus, Modbus, Profibus, Profinet, ISA100, OPC, OPC UA and IEC 61850, as well as security standards such as IEC 62443. Experion PKS is designed to support future protocols as well; the ability to evolve over time is what makes it future-proof. Solutions such as Experion PKS HIVE continue to lead the automation industry with an approach that takes well-proven Experion technologies and combines them with simplified engineering that decouples I/O from process controllers and control strategies from fixed controller assignment.

Open process automation standards put a spotlight on the need for digital transformation efforts to embrace open protocols and standards to streamline communication within an end user’s automation landscape. The reality is that Experion PKS has been doing just that for the last 20 years.
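
To ground the open-protocols point, this is roughly what reading a single process value over OPC UA looks like with the open-source python-opcua client. The endpoint URL and node ID are placeholders, and this is generic OPC UA code, not Experion-specific.

```python
# Minimal sketch: read one process value over OPC UA using the
# open-source python-opcua client (endpoint and node ID are placeholders).
from opcua import Client

client = Client("opc.tcp://plant-server:4840")  # hypothetical endpoint
client.connect()
try:
    # Node IDs are server-specific; "ns=2;s=FIC101.PV" is a placeholder.
    node = client.get_node("ns=2;s=FIC101.PV")
    value = node.get_value()
    print("FIC101.PV =", value)
finally:
    client.disconnect()
```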

CE: What other observations have you made about digital transformation in the chemical process industries that you expect to have an impact in 2024?

TB: In 2024 and the rest of the decade, we will see continued development in the following areas:

  • Virtualization and Cloud Computing: As more analytics and AI algorithms are deployed in digital transformation programs, there will clearly be architectural tradeoffs among the risks and opportunities of physical, virtual and cloud computing. Although virtualization of hardware has become the norm, several customers are still not comfortable with cloud computing for certain mission-critical applications. As trust develops and latency issues improve, we expect to see more virtual and cloud adoption in the medium term.
  • Cyber Security: With open systems and cloud infrastructure comes the associated risk of malicious cyber-attacks. Hence, many customers are seeking help in securing their infrastructure by designing, deploying and testing industrial-grade, cyber-secure hardware and software. This trend is likely to accelerate.
  • Analytics on the Edge: More smart edge devices are expected to be deployed with powerful analytics to address some of the latency issues of running mission-critical applications in the cloud (a miniature example follows this list).
  • Generative AI: As generative AI tools such as OpenAI’s catch on, there will be a proliferation of generative-AI-based applications in the industrial world, as discussed above.
  • Robotics: In the chemical industry, robotic applications are expected to expand beyond the current uses of inspection and monitoring. Several use cases for operations and maintenance activities are being tested with robots.
  • Industrial Metaverse: As digital twins of specific parts of the plant, or of specific applications, become more industrialized and widely deployed, there is an evolving need for a metaverse of digital twins that can autonomously orchestrate the many workflows that are currently prone to human error. It is a convergence of individual technologies that, used in combination, can create an immersive three-dimensional virtual or virtual/physical industrial environment. This will unlock a large part of the value that is still unexplored.
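
As a miniature of the edge-analytics item above, the sketch below flags anomalous sensor readings with a rolling z-score, the kind of lightweight check that can run on an edge device with no cloud round-trip. The window size and threshold are arbitrary choices.

```python
# Minimal sketch: lightweight rolling z-score anomaly check suitable for
# an edge device (window size and threshold are arbitrary choices).
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=3.0):
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) == recent.maxlen:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield value          # anomalous reading
        recent.append(value)

# Steady periodic readings, then one spike that should be flagged.
readings = [20.0 + 0.1 * (i % 5) for i in range(100)] + [35.0]
for anomaly in detect_anomalies(readings):
    print("anomalous reading:", anomaly)
```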