Sanitize, Analyze and Optimize Bioprocessing Operations

By Joy LePree

Better designs and improved monitoring and control technologies lead to higher efficiencies

While bioprocessing can be challenging because of its sanitation requirements and exacting processes, new technologies are allowing bioprocessors to more easily sanitize, analyze and optimize their operations. The payoff is greater flexibility, tighter control, better insight and higher efficiency, which in turn lead to more predictable results and higher-quality end products.

Kevin Knopp, CEO and co-founder of 908 Devices (Boston, Mass.; www.908devices.com), says the need for optimization is increasing due to a growing biologics pipeline. “It is really an incredible time for bioprocessors as the worldwide biologics pipeline is very large right now. We estimate that there are about 4,000 assets in development today across all sorts of new modalities. About half of that pipeline is now in gene and cell therapies, and that creates a lot of complexities as they continue with development of classical monoclonal antibodies (mAbs) and add customized gene and cell therapies, which have more complex modalities.”

“This creates multiple challenges,” Knopp continues. “One is scaling up the process development laboratory to accommodate the rising number of assets in the pipeline, as well as handling the more complex variants and modalities in the pipeline. For this reason, bioprocessors are looking for tools and analytical gear to help them analyze and optimize the process to handle the increased workload with efficiency in order to develop a process that can accelerate a proper candidate with the necessary critical quality attributes and get it into clinical trial.”

At the same time, there are additional challenges. “Hygienic design is still very demanding,” adds Martin Mayer, business development for digital services at Zeta GmbH (Spatenhof, Austria; www.zeta.com). “Also, more customers are approaching us with requests for multi-purpose plants in which they want to run five or six different products. In some cases, they don’t yet know what one or more of those products may be, so they need extremely high flexibility.”

“Hygiene and flexibility can be obtained by combining smart design of the plant’s hardware and vessels with a smart approach to optimizing the plant,” Mayer continues. “For suppliers of bioprocessing equipment, know-how in process and equipment design, plant design, and instrumentation and automation becomes the key to delivering optimized multi-purpose plants.”


Simplifying sanitation

Since bioprocesses deal with living cells, sanitation is of utmost importance. “Cross-contamination can not only ruin the current batch, but processors could also risk losing production approval,” explains Markus Kuehberger, head of business line chem/pharma at GEA Westfalia Separator (Düsseldorf, Germany; www.gea.com). “For this reason, before beginning a batch, you must be absolutely certain that not a single cell from a former batch is still inside your system. This is highly challenging if you look at how a centrifugal separator is designed.”

Inside the separator bowl is a disk stack with inlets, outlets and many channels, so considerable expertise is required to ensure that every corner, channel and part is absolutely clean. And, of course, it helps if the separator is properly designed. “Designing an easy-to-clean centrifuge requires know-how to design the separator’s geometry, as well as the piping system around it, without dead spaces,” says Kuehberger. “It requires know-how when it comes to the treatment of stainless-steel surfaces to make them as smooth as possible for better cleaning effects. Finally, it requires know-how about how to run CIP [clean in place] and SIP [steam in place] cycles in the most effective way” (Figure 1).


Figure 1. Designing an easy-to-clean centrifuge requires know-how in the separator’s geometry, in the surrounding piping (which must be free of dead spaces) and in the treatment of stainless-steel surfaces to make them as smooth as possible. Shown here are the internals of GEA’s centrifugal separator

Kuehberger says that centrifuge design is just one side of the coin; it is the package unit that carries out the proper CIP and SIP procedures and ensures the centrifuge is clean after processing. “We bring both sides together, mount everything on a skid that is, as a unit, completely pre-tested and pre-qualified, including all necessary documentation and validation. Our customers benefit from fewer interfaces, less trouble and less time needed for validation and installation.”

GEA’s aseptic separator skids offer efficient mAb separation that minimizes cell shearing, while the mechanical concepts and precisely adjustable ejection system ensure optimal product purity and high yield for mAb producers. The combination of the hydrohermetic inlet and bowl overflow has also been designed to achieve better CIP results. To avoid product contamination from seal abrasion, only one double mechanical seal is used; it is made of high-quality silicon carbide and is not in contact with the product.


Monitoring and control

Bioprocessing also requires a lot of control over many variables, which in turn necessitates top-notch analytical instruments. “The minute you put everything together, you have to control fermentation times and feedstock distribution and many other variables that, if not exact, can create low-quality product with changes in efficacy or immune response. So it’s important to do analysis so you know what is happening with the process and to make sure the product you produce is the same every time you produce it,” explains Ken Cook, EU pharma/biopharma expert support group team leader with Thermo Fisher Scientific (Waltham, Mass.; www.thermofisher.com).

In addition to the general need for analysis, processors increasingly want to analyze more than one fermentation at the same time using the same equipment, so the faster an analysis can be done, the more fermentations can be monitored simultaneously. “Speed of analysis is always a good thing and the more you can monitor, the better off you are,” says Cook. Thermo Fisher offers the Thermo Scientific Vanquish Duo UHPLC System to support two high-throughput workflows (Dual LC and Tandem LC or LC-MS) by combining two flow paths in one integrated UHPLC solution (Figure 2). These workflows improve productivity by saving time, reducing cost per sample, increasing capacity without added bench space and using resources more efficiently. The company’s Orbitrap Exploris 240 mass spectrometer can be coupled with the Vanquish Duo UHPLC system to further increase sample throughput and provide the additional information afforded by high-resolution mass spectrometry (HRMS). Accurate intact mass, glycan profiles and information on specific post-translational modifications can be obtained in a single analytical run by using HRMS as an additional detector.

Figure 2. Thermo Fisher offers the Vanquish Duo UHPLC System to support two high-throughput workflows (Dual LC and Tandem LC or LC-MS) by combining two flow paths in one integrated UHPLC solution

However, once the sterile sampling and analysis are in place, data handling is needed to provide feedback so that the fermenter can be properly controlled. That means a lot of software is involved, and it all has to work together, notes Cook. Thermo Scientific’s Chromeleon Chromatography Data System (CDS) software can help streamline these laboratory workflows. It delivers compliance tools, networking capabilities, instrument control, automation, data processing and more. The enterprise solution is designed for tracking, accountability and quality assurance (QA) and quality control (QC).
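To make the data-handling idea concrete, the sketch below outlines, in Python, the general shape of an analysis-to-fermenter feedback loop of the kind Cook describes. It is only a minimal illustration under assumed conditions: the function names (read_latest_result, write_setpoint), the glucose target and the proportional gain are hypothetical placeholders, not part of Chromeleon CDS or any vendor’s actual interface.

```python
# A minimal, generic sketch of an analysis-to-control feedback loop, assuming a
# hypothetical at-line glucose measurement. The function names and tuning values
# are placeholders, not part of Chromeleon CDS or any vendor's actual interface.

import time

GLUCOSE_TARGET_G_L = 2.0   # hypothetical target residual glucose, g/L
GAIN_L_H_PER_G_L = 0.05    # hypothetical proportional gain for the feed pump

def read_latest_result() -> float:
    """Placeholder: fetch the newest at-line glucose reading from the data system."""
    raise NotImplementedError

def write_setpoint(feed_rate_l_h: float) -> None:
    """Placeholder: push a new feed-rate setpoint to the bioreactor controller."""
    raise NotImplementedError

def control_loop(base_feed_l_h: float, interval_s: int = 600) -> None:
    # Simple proportional correction: feed more when glucose falls below target,
    # less when it runs high. Real systems add limits, alarms and audit trails.
    while True:
        error = GLUCOSE_TARGET_G_L - read_latest_result()
        write_setpoint(max(0.0, base_feed_l_h + GAIN_L_H_PER_G_L * error))
        time.sleep(interval_s)
```

In a production setting, each of these steps would sit behind validated, audited interfaces; the point here is only how measurement, decision and actuation have to act together.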

908 Devices’ Knopp agrees that analysis is a critical part of the bioprocess. “In addition to the growing biologics pipeline, there are initiatives from the FDA [U.S. Food and Drug Admin.] regarding quality by design where they want you to understand the importance of monitoring and controlling each and every process parameter and the critical quality attributes of each biologic,” he says. “This puts a lot of pressure on the processor and analytical technologies and creates a major drive toward process integration, automation, control and the ability to use these analytical technologies inline.”

“Historically, there has been a hindrance in taking these instruments out to the process because of the large size and complexity of mass spectrometers and the expertise required to run them and interpret the data,” says Knopp. “However, we are taking an approach that is more analogous to the computer industry in order to bring it down from centralized mainframes to desktops, laptops and mobile devices by making simple and easy-to-use desktop and hand-held mass spectrometer-based devices that are more accessible.”

The company’s REBEL desktop device sits beside the bioreactor and is capable of providing a quantitative list of amino acids, dipeptides, water-soluble vitamins and amines within seven minutes (Figure 3). This allows users to combine that information with other standard biochemical markers and make decisions much faster, shaving weeks off process development cycles.

Figure 3. 908 Devices’ REBEL desktop device sits beside the bioreactor and is capable of providing a quantitative list of amino acids, dipeptides, water-soluble vitamins and amines within seven minutes. This allows users to combine that information with other standard biochemical markers and make decisions much faster, shaving weeks off process development cycles

Knopp says bioprocessors are showing enthusiasm at the prospect of taking a laboratory-grade answer from a mass spectrometer, moving it out to their workplace and receiving a comprehensive panel of analyses that historically would have required sending samples to a third-party laboratory and waiting out a long turnaround for results. “They can analyze the information by looking at the output on the screen, and we’ve made some recent integrations that allow them to take that data into other software programs they are using to perform data analysis, enabling them to look at trends and correlations within their process,” he says. That information can then be used to provide insight that helps improve the efficiency and quality of both the process and the end product, and it allows processors to adjust their process as needed in near-real time.
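As a rough illustration of the trend-and-correlation work Knopp describes, the short Python sketch below assumes panel results have been exported to a CSV file; the file name and column names (glutamine_mM, titer_g_L and so on) are hypothetical examples, not the actual REBEL export format.

```python
# A rough sketch of trend and correlation analysis on exported panel data.
# The file name and column names below are hypothetical examples, not the
# actual REBEL export format.

import pandas as pd

# Each row: one bioreactor sample; columns: run, culture day, nutrient levels, titer
df = pd.read_csv("rebel_panel_export.csv")

# Trend: median glutamine level per day of culture across all runs
print(df.groupby("culture_day")["glutamine_mM"].median())

# Correlation: which measured nutrients track most strongly with titer?
nutrients = ["glutamine_mM", "glutamate_mM", "asparagine_mM", "choline_uM"]
corr = df[nutrients + ["titer_g_L"]].corr()["titer_g_L"].drop("titer_g_L")
print(corr.sort_values(ascending=False))
```

Even this simple pass can flag which nutrients are worth tighter monitoring or feeding adjustments in the next round of process-development runs.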


Digitalization and modeling

As bioprocesses become more automated and integrated with various instruments and software programs, they are accumulating more data that should be used to further drive optimization of the process, says Zeta’s Mayer. “Bioprocessing is no different from other chemical processes in that processors need to support all the data in the automation system, get it into a historian and turn the data into actionable information,” he says. “This is where digitalization and modeling will begin to play a very important role and why demand for this is growing.”

Zeta is currently working on a project that uses digitalization and modeling for predictive control of bioreactors, says Mayer. “In theory, this is simple: You mimic the process by applying a mathematical model that traces the entire process from the early stages to the quality at the end. If you have this model, you can use it for optimizing the process and exploring different ideas you’d like to try to improve efficiency or quality. Simply put, you get a map of the process and exploit it for some optimum you’d like to achieve,” he explains.

“We wanted to have the ability to control bioreactors based on a model,” he says. The company recently began a project with two large pharmaceutical companies in order to explore the potential business and commercial benefits behind the bioreactor-modeling concept. By the end of 2021, Zeta expects to have a working prototype that would allow the pharmaceutical companies to experiment with options, such as: “I want to reduce my fermentation time by 8%. What parameters can be changed to meet that goal without sacrificing quality?” or “How can I combine unit operations and optimize operations such as fermentation, screening and downstream processes to run one after the other to get an overall optimum instead of a local optimum?”
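The sketch below illustrates, in highly simplified form, the kind of model-based “what if” question described above: a toy Monod-kinetics batch fermentation model is used to ask how changing one parameter (here, the inoculum density) shortens the time to a target titer. The kinetic constants and targets are illustrative assumptions only, not Zeta’s model or data from any real process.

```python
# A toy model-based "what if" exploration, assuming simple Monod batch kinetics.
# All constants are illustrative; a real application would use a validated
# process model rather than these made-up values.

from scipy.integrate import solve_ivp

MU_MAX, KS, YXS, YPX = 0.25, 0.5, 0.45, 0.3  # 1/h, g/L, gX/gS, gP/gX (illustrative)

def batch_model(t, y):
    x, s, p = y                              # biomass, substrate, product (g/L)
    mu = MU_MAX * s / (KS + s)               # Monod growth rate
    return [mu * x, -mu * x / YXS, YPX * mu * x]

def hours_to_titer(x0, s0=20.0, target_p=2.0):
    """Simulated hours needed to reach the target product titer."""
    hit = lambda t, y: y[2] - target_p
    hit.terminal, hit.direction = True, 1
    sol = solve_ivp(batch_model, (0, 200), [x0, s0, 0.0], events=hit, max_step=0.5)
    return sol.t_events[0][0] if sol.t_events[0].size else float("inf")

base = hours_to_titer(x0=0.1)
for x0 in (0.1, 0.15, 0.2, 0.3):             # candidate inoculum densities, g/L
    t = hours_to_titer(x0)
    print(f"inoculum {x0:4.2f} g/L -> {t:5.1f} h ({100 * (base - t) / base:+.1f}% vs. base)")
```

In practice, a validated digital model of the actual process would replace these toy kinetics, but the pattern of posing a target and scanning parameters against the model is the same.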

There are many different ideas on how this technology may be applied to improve operations and optimize the process, but processors have to start early enough in the process-development phase to create an accurate model and take it through all the validation steps, says Mayer. “This is one of the reasons this technology has not yet been applied in the bioprocessing industry, so we are working on this project as a proof of concept to show end users that model-based control does provide value.”


Overcoming future challenges

As the upstream part of the process continues to evolve, the downstream side faces its own set of challenges, says Patrick Farquet, head of renewables and bio-based applications with Sulzer Chemtech (Winterthur, Switzerland; www.sulzer.com). “The challenges on the downstream side are coming from the upstream part of the process,” he says. “They create dilute streams that contain a lot of water and a mix of different products. This mix of smaller impurities makes it difficult to concentrate and purify the one product they want to have. Furthering the challenge is the range of natural feedstocks, which often have some seasonality or process differences that generate varying impurities over time. Often, in the lab, biotech companies will use one set of purification techniques that, from a cost perspective, are nearly impossible to replicate on an industrial scale. While it seems feasible to make 100 grams, making 10 tons continuously is often a challenge.”

While the laboratory may use liquid chromatography or difficult solvent-crystallization methods to achieve their results, these methods often prove impossible to scale up, so providers such as Sulzer must find ways to use standard techniques, such as extraction, to make the downstream process profitable and scalable. Finding the technique that provides the desired outcome in a way that is profitable requires pilot testing and tweaking of existing technologies. “If you look at the chemical industry, the most widely used separation technology is distillation,” Farquet says. “We still use it, and in most bioprocessing applications it works for concentration stages, but it is unlikely you will get to the necessary final purity levels just by using distillation techniques.” Often, technology providers rely on solvent extraction to recover ingredients or biochemicals from highly diluted feeds (Figure 4), along with proven crystallization techniques to reach the desired purity. In other cases, they may employ techniques like adsorption to remove organic impurities that can’t be separated via traditional technologies. “All sorts of techniques have to be examined, tested and possibly combined in order to find the method that works best to get the product that is needed at a reasonable cost,” he says.

Figure 4. Technology providers often rely on solvent extraction to recover ingredients or biochemicals from highly diluted feeds, along with proven crystallization techniques to reach the desired purity. Shown here is a large liquid-liquid extraction column
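For a sense of the scoping calculations behind choosing an extraction scheme for such dilute streams, the sketch below uses the classical Kremser relationship to estimate how many theoretical countercurrent stages are needed for a given solvent (partition coefficient) and solvent-to-feed ratio. The feed concentration, recovery target and partition coefficients are illustrative assumptions, not Sulzer data.

```python
# A quick scoping calculation for countercurrent liquid-liquid extraction of a
# dilute aqueous product, using the classical Kremser relationship. The feed
# concentration, recovery target and partition coefficients are illustrative
# assumptions only.

import math

def kremser_stages(x_in, x_out, partition_k, solvent_to_feed):
    """Theoretical stages to go from x_in to x_out with pure entering solvent."""
    E = partition_k * solvent_to_feed        # extraction factor
    if abs(E - 1.0) < 1e-9:
        return x_in / x_out - 1.0            # limiting case, E = 1
    return math.log((x_in / x_out) * (1.0 - 1.0 / E) + 1.0 / E) / math.log(E)

x_in, x_out = 15.0, 0.15                     # g/L in feed, g/L left in raffinate (99% recovery)
for k in (2.0, 4.0, 8.0):                    # candidate solvents (partition coefficients)
    for sf in (0.5, 1.0):                    # solvent-to-feed ratio (vol/vol)
        n = kremser_stages(x_in, x_out, k, sf)
        print(f"K = {k:3.1f}, S/F = {sf:3.1f}: about {n:5.1f} theoretical stages")
```

Even this back-of-the-envelope version shows why solvent selection matters so much for dilute broths: when the extraction factor approaches one, the required stage count climbs steeply, while a solvent with a high partition coefficient brings the column down to a handful of stages.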

Pramod Kumbhar, CTO with Praj Industries (Pune, India; www.praj.net), agrees that scaling up can be difficult, especially as processors move toward continuous processing versus batch. “With a mindshare towards continuous manufacturing in the biotech space, high-capacity plants that are planned to provide economies of scale create two challenges: handling large volumes calls for very high capital expenses and increases the risk involved. Gradual scale-up of output without very high capital expenses, and with fewer risks or problems from lower volume handling, calls for deeper engagement at the process-development stage,” says Kumbhar. “This also means we need to focus on all the disciplines of the plant, including process, infrastructure and automation. And for the connected world, there is a growing need to make the system future-ready. This calls for a different thought process going forward, one that involves plant automation, integration, data management and predictability,” he says.

“Centralized process-automation solutions will have to be integrated with enterprise software to manage inventory, process data, proactive maintenance and real-time records. This will form the foundation for artificial intelligence in process plants and will be the eventual future of bioprocessing,” says Kumbhar.