Cognizant Benelux Blog

This article was originally published in Dutch by IT Daily.

Everyone is talking about digital twins today, but what exactly does the technology mean? "The term is used enthusiastically in different industry sectors," says Christophe Stas, Associate Director Life Sciences Manufacturing at Cognizant. "But everyone understands it differently." Time, then, for a definition.

Although the term "process digital twin" is understood in different ways, we aim to clarify its meaning in a life sciences environment, highlight its real value for an organization, and identify the critical factors for a successful implementation.

Process digital twins in life sciences are virtual models connected to their real-life counterparts, replicating the behavior of biological processes such as cell culture growth in a bioreactor. These twins leverage real-time data architectures to feed models with sensor data, predict process evolution, and suggest optimal control actions to maximize production yield. In its most advanced implementation, a digital twin can directly control the physical assets in full automation.

Connecting to the real world

Dr. Elisa Canzani, Data Science Lead at Cognizant, clarifies, "A digital twin is a model that relies on sensors to provide (near) real-time feedback or updates about its physical counterpart. Many organizations confuse a digital model with a digital twin. We see the digital twin as an end-to-end connected reality that leverages data flows to offer a dynamic representation of physical processes in real time."

This distinction is important to us. A digital twin has the most impact when it is connected to the physical manufacturing process. Digital twins, according to Cognizant, must be connected to physical sensors and process controllers so that actual production data feeds the virtual models. In return, the digital twin can predict and optimize production parameters via an open or closed feedback loop, improving the efficiency of the manufacturing process.

From process to platform to model and back

Canzani clarifies what such a system should look like in its basic form: "Batch data comes in real time from sensors along the production process and is transferred in a reliable, cybersecure way to an analytics platform at the edge or in the cloud. Digital twin models consume and process this sensor data to run simulations and optimization scenarios that suggest optimal controls. Then, if the model identifies an optimization opportunity in the real-time data, the corresponding control action is sent back through the data platform to be implemented in the production process." In an advanced implementation, the digital twin can thus drive production itself in full automation, but as an intermediate step it can also provide optimization suggestions to human operators.
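To make that loop concrete, here is a minimal sketch in Python. Every name in it (read_sensors, TwinModel, send_setpoint) is a hypothetical stand-in rather than Cognizant's implementation; a real deployment would receive data through a secured edge or cloud streaming platform and write setpoints back through process controllers.

```python
# A minimal sketch of the sense-predict-actuate loop described above.
# All names are hypothetical stand-ins, not Cognizant's implementation.
import time
from dataclasses import dataclass


@dataclass
class SensorReading:
    timestamp: float
    temperature_c: float
    ph: float
    dissolved_o2: float


class TwinModel:
    """Stand-in for a digital twin model that predicts process
    evolution and proposes an optimal control action."""

    def predict_and_optimize(self, reading: SensorReading) -> dict:
        # A real model would simulate the process forward and compare
        # optimization scenarios; here we just nudge toward a target.
        target_temp_c = 37.0
        return {"temperature_setpoint_c": target_temp_c}


def read_sensors() -> SensorReading:
    # Placeholder for real-time batch data from plant sensors.
    return SensorReading(time.time(), 36.4, 7.1, 0.4)


def send_setpoint(action: dict) -> None:
    # Placeholder for the return path through the data platform:
    # a controller write-back (closed loop) or an operator hint (open loop).
    print(f"Proposed control action: {action}")


twin = TwinModel()
for _ in range(3):
    reading = read_sensors()                     # data in
    action = twin.predict_and_optimize(reading)  # simulate and optimize
    send_setpoint(action)                        # control action out
    time.sleep(1)
```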

"A twin does not necessarily have to be a 3D replica of the physical asset," Canzani emphasizes. She continues, "Rather, the solution must replicate the process dynamics in order to optimize the process. 3D models can be a comprehensive representation that contributes to usability, but the core of a digital twin consists of machine learning, data-driven, and mechanistic models."
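To illustrate what a mechanistic model can look like in this context, the sketch below simulates textbook Monod growth kinetics for a cell culture in a batch bioreactor. It is a generic, illustrative model with arbitrary parameter values, not one of Cognizant's models.

```python
# Illustrative mechanistic model (not Cognizant's): Monod growth
# kinetics for cell culture in a batch bioreactor. All parameter
# values are arbitrary and for demonstration only.
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX = 0.4  # 1/h, maximum specific growth rate (assumed)
K_S = 0.5     # g/L, half-saturation constant (assumed)
Y_XS = 0.5    # g biomass per g substrate, yield coefficient (assumed)


def batch_bioreactor(t, y):
    """dX/dt = mu(S) * X ;  dS/dt = -mu(S) * X / Y_XS"""
    X, S = y                        # biomass and substrate, g/L
    mu = MU_MAX * S / (K_S + S)     # Monod specific growth rate
    return [mu * X, -mu * X / Y_XS]


# Simulate 24 hours of a batch: 0.1 g/L biomass, 10 g/L substrate.
sol = solve_ivp(batch_bioreactor, (0.0, 24.0), [0.1, 10.0],
                t_eval=np.linspace(0.0, 24.0, 25))
print(f"Final biomass: {sol.y[0, -1]:.2f} g/L")
```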

Cognizant is betting heavily on digital twins running on real-time streaming platforms, focusing initially on large organizations in the life sciences domain, such as pharmaceutical companies, which can leverage process digital twins to optimize the complex manufacturing of drug products.

A model is never discarded

There are many challenges associated with the successful implementation of a digital twin; the creation of the model itself is not the biggest one. Christophe Stas states, "Customers have often already built models, or at least a basic design. Such models must then be further developed or re-implemented to run on real-time connectivity. Communication must be automatic, fast enough, and of course bi-directional."
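The automatic, fast, bi-directional communication Stas mentions can be sketched as two message channels: one carrying sensor data to the twin and one carrying setpoints back. The standard-library toy below is only a stand-in; real plants would use industrial protocols such as OPC UA or MQTT over a secured network.

```python
# Toy sketch of a bi-directional link, assuming in-process queues as
# stand-ins for the plant network; not a production protocol.
import queue
import threading
import time

plant_to_twin = queue.Queue()  # sensor data flowing to the model
twin_to_plant = queue.Queue()  # setpoints flowing back to the plant


def plant(stop: threading.Event) -> None:
    """Stand-in for the production line: emits readings, applies setpoints."""
    temperature_c = 36.0
    while not stop.is_set():
        plant_to_twin.put({"temperature_c": temperature_c})
        try:
            setpoint = twin_to_plant.get(timeout=0.1)
            temperature_c = setpoint["temperature_c"]  # apply control
        except queue.Empty:
            pass


def twin(stop: threading.Event) -> None:
    """Stand-in for the digital twin: reads data, answers with setpoints."""
    while not stop.is_set():
        try:
            reading = plant_to_twin.get(timeout=0.1)
        except queue.Empty:
            continue
        if reading["temperature_c"] < 37.0:  # trivial "optimization"
            twin_to_plant.put({"temperature_c": 37.0})


stop = threading.Event()
threads = [threading.Thread(target=fn, args=(stop,)) for fn in (plant, twin)]
for t in threads:
    t.start()
time.sleep(0.5)  # let a few round trips happen
stop.set()
for t in threads:
    t.join()
print("Bi-directional exchange completed.")
```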

Canzani has another caveat: "Sometimes organizations have already developed excellent models, but have done so in a closed environment, such as third-party software. A closed ecosystem can create problems, because it becomes difficult to provide real-time connectivity with other platforms and automation systems."

Regardless, organizations that already have a digital model but not yet an actual digital twin have a foundation to start from. "Developing such models is never wasted time," Stas emphasizes.

Connecting systems and people

Much of the complexity comes from connecting the different teams that must work together to achieve an end-to-end data flow for the digital twin. Usually, research and development (R&D) scientists develop the bioprocess models. The digital twin also involves an IT component, with a tech team managing the data processing pipeline and deploying the model in edge or cloud environments (an approach referred to as TwinOps). Additionally, an automation team ensures the model can consume data from different OT systems, and shop-floor operators install and calibrate the sensor devices that collect the data.

All of those business units are populated by different experts with different priorities and expectations. A successful digital twin must connect not only the systems but, to some extent, the people as well.

"People must first communicate to ensure alignment and a common understanding; then you can connect the technology," Canzani says. "A twin is an end-to-end system comprising many different technologies. Usually, organizations are strong in one of three key areas: manufacturing, IT, or analytics. The real value is added when you bring the three together."

Reusable and scalable

"Ideally, you don't develop a digital twin as a one-off project," says Stas. "The solution should be scalable and adaptable to different processes. You build the twin for one production line, and later, when you introduce a new product, you can reuse the framework and optimize production from the get-go with the twin."

Cognizant developed TwinOps with that in mind: a platform consisting of building blocks for data processing and analysis, modelling, training and deployment pipelines, validation, and monitoring. Those building blocks can be reused across digital twins, providing a sustainable and scalable link between data and models. "Besides being a framework with best development practices for digital twins, TwinOps is a Python-based package of customizable components and building blocks for standardized, compliant, and repeatable workflows," Canzani clarifies.
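The article does not spell out the TwinOps API, so the sketch below is only a hypothetical illustration of the building-block idea: small, reusable steps for data processing, modelling, and validation composed into one standardized, repeatable workflow.

```python
# Hypothetical illustration of composable building blocks; this is
# NOT the actual TwinOps API, whose details are not public here.
from typing import Callable

Step = Callable[[dict], dict]


def pipeline(*steps: Step) -> Step:
    """Compose building blocks into one standardized workflow."""
    def run(batch: dict) -> dict:
        for step in steps:
            batch = step(batch)
        return batch
    return run


def clean(batch: dict) -> dict:
    """Data-processing block: drop missing sensor values."""
    batch["signal"] = [x for x in batch["signal"] if x is not None]
    return batch


def fit_model(batch: dict) -> dict:
    """Modelling block: a trivially simple 'model' (the mean)."""
    batch["model"] = sum(batch["signal"]) / len(batch["signal"])
    return batch


def validate(batch: dict) -> dict:
    """Validation/monitoring block: plausibility check before release."""
    assert batch["model"] > 0, "model failed plausibility check"
    return batch


# The same blocks can be recombined for a new product or production line.
workflow = pipeline(clean, fit_model, validate)
print(workflow({"signal": [1.0, None, 2.0, 3.0]}))
```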

Broad knowledge

Furthermore, according to Stas, Cognizant has a holistic understanding of the challenges across all domains involved in implementing a digital twin. "We know where the challenges and bottlenecks are located," he says. "We can make a good assessment of whether the investment is worth the money, and of what makes a realistic business case."

Stas continues, "You need vendor-agnostic knowledge for a proper assessment of the relevant technologies, a deep understanding of regulations and standards, and of course cybersecurity."

Worth millions

An investment in an actual digital twin pays for itself in the long run. "A digital twin can improve process yield by ten percent. Within life sciences manufacturing, in some cases, such a yield improvement corresponds to saving up to 400 million euros per year," Canzani notes.

Still, don't start big right away. "The beginning of every trajectory is an assessment of the maturity of the organization," Canzani explains. "During the assessment, we look for a use case that is already advanced enough, meaning there is data available and a good understanding of the underlying processes to model with the twin."

"Then we build a digital twin as a proof of concept," Canzani continues. "We start at a small scale, but we build the twin with an architecture that supports real-time usage and scalability by design. The workflows must be automated and repeatable as per TwinOps best practices." Such a small rollout will immediately add value and show that the investment is worth the money. She continues, "Based on such a first quick-win use case, we can build stakeholder confidence by clearly demonstrating feasibility and potential. We can then scale up the project, while also rolling the twin out to other use cases."



Cognizant Benelux