Digital twins: When and why to use one

by Mickael Brossard, Mithun Kamat, Tomás Lajous, Kayvaun Rowshankish, and Cenk Tunasar

This blog post is the second in a three-part series on digital twins. The first post explained how generative AI and digital twins make a powerful pairing. The third post in the series will explore an in-depth case study on how generative AI and digital twins could be used to enhance customer experience.

When most people consider digital twins, they think of the visualization and simulation tools used to develop innovative products. Offshore oil platforms, racing sailboats, luxury automobile engines, hotels, and train stations are just some of the things that have been designed and refined using advanced digital simulations.

These uses represent a fraction of what digital twins can accomplish. It is helpful to think of digital twins less as a tool for designers, engineers, and manufacturers and more as a laboratory in which nearly any organization can optimize its most precious resource—information—to continually push the boundaries of what it can accomplish. By using data to mirror real-world situations, digital twins can be deployed to create, fine-tune, or entirely reimagine nearly any complex process or system, including supply chains, public transit systems, and assembly lines.

For example, a global retailer recently set out to rethink its supply chain with an eye toward cutting costs, optimizing service, and boosting sustainability. It was a complex problem that involved optimizing an array of key levers, such as inventory positioning, product flow, supply planning, and carbon emissions. Drawing on the organization’s vast quantities of real-time data, a team created a digital twin of its global supply chain operations, a sprawling system of manufacturing facilities, freight and cargo operations, third-party contractors, and distribution centers. The digital replica allowed the retailer to test more than 50 scenarios a day, examining potential outcomes for various large and small choices along the supply chain, all without any real-life disruptions. Along the way, an optimization engine embedded within the digital twin surfaced informed recommendations for users. Ultimately, the company made a series of optimized decisions that sparked a 7 percent reduction in carbon emissions and a 5 percent improvement in on-time customer orders.
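To make the idea of scenario testing concrete, here is a deliberately toy sketch of what "running scenarios against a twin" means in principle. Everything below is hypothetical: the decision levers (inventory buffer, shipping mode), the cost and emissions figures, and the scoring logic are invented for illustration, and a real supply chain twin would simulate far richer dynamics across many more variables.

```python
import itertools

# Hypothetical, highly simplified stand-in for a supply chain digital twin:
# each scenario is a combination of decisions, and the "twin" scores it on
# cost and carbon emissions. All numbers here are invented for illustration.

def simulate(inventory_buffer_days, ship_mode):
    """Toy model: returns (total_cost, emissions) for one scenario."""
    holding_cost = inventory_buffer_days * 120          # $ per day of buffer stock
    transport_cost, emissions = {"air": (900, 50.0), "sea": (200, 8.0)}[ship_mode]
    # Slower sea freight risks stockouts unless buffers are large enough
    stockout_penalty = max(0, 900 - 200 * inventory_buffer_days) if ship_mode == "sea" else 0
    return holding_cost + transport_cost + stockout_penalty, emissions

# Sweep the scenario space and pick the cheapest configuration
scenarios = itertools.product(range(1, 8), ["air", "sea"])
best = min(scenarios, key=lambda s: simulate(*s)[0])
cost, emissions = simulate(*best)
print(best, cost, emissions)
```

Even in this toy version, the pattern mirrors the retailer's approach: enumerate decisions, score each one in the model rather than in the real world, and let an optimization step recommend the best trade-off.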

Digital twins offer immense value, but only under the right circumstances

Clearly, digital twins have come a long way since the 1960s, when NASA engineers tested simulations of Apollo spacecraft using early digital models. It’s also clear that a growing number of organizations are intrigued by the technology and are looking at ways to adopt it. Yet it’s important to understand that digital twins are not a one-size-fits-all solution for business problems. Digital twins are as complex and nuanced as the systems and objects they seek to emulate and can deliver value only in particular circumstances. How, then, can a business determine when and where to implement a digital twin? There are several key characteristics to look for:

  • High-stakes areas with high costs and real revenue on the line. The problem to be solved must be relevant enough to warrant significant investment. Building digital twins requires considerable effort and resources, and businesses should be fairly certain they can derive value from the effort.
  • A complex or dynamic environment. Some products and systems are so complicated that inefficiencies and their root causes are difficult to identify. In the supply chain, for example, a change to one element can lead to second- or third-order effects that manifest over time. This is the kind of situation that could benefit from a digital twin. Simpler environments, characterized by more direct and apparent chains of cause and effect, can likely be optimized manually through more standard forms of investigation and analysis.
  • The availability of high-quality data. Digital twins require a substantial amount of high-quality data to accurately represent their real-world counterparts and simulate the underlying product, system, or process. The data set must be robust and reliable across all the problem’s parameters for the issue to be solved; data that is incomplete or otherwise flawed can undermine, or even doom, a twin’s effectiveness.
  • The environment being simulated is recurring. Digital twins are built to be used and reused, to repeatedly simulate and optimize multivariable problems. If the objective is to solve for a one-time optimization of a single variable, an optimization model would work better than a digital twin.

If these requirements are met, a digital twin can deliver considerable impact. For example, a leading semiconductor manufacturer was repeatedly losing bids because it relied on slow-moving, inefficient design and production processes. This was a high-stakes, complex problem with no immediately apparent root cause—precisely the kind of problem that digital twins are best suited to address. Using historical data, the business built a digital twin that leveraged AI and machine learning to rapidly run multiple simulations of potential designs. This process generated insights that decreased time to market and increased first-time-right designs by up to 25 percent. Engineering capacity rose by 20 percent: a deep learning surrogate model replaced manual physics simulations, cutting run times from hours to seconds.
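The surrogate-model idea mentioned above can be sketched in miniature. The premise: run a slow, high-fidelity simulator a limited number of times offline, fit a cheap approximation to its outputs, and then answer later design queries in microseconds instead of hours. The `expensive_physics` function, the quadratic feature, and all data below are hypothetical stand-ins; real surrogates are typically deep learning models trained on thousands of solver runs.

```python
# Illustrative surrogate-model sketch: fit a cheap approximation to a slow,
# high-fidelity simulator so later queries are nearly instantaneous.
# The physics model and data here are hypothetical.

def expensive_physics(x):
    # Stand-in for a solver that might take hours per run in practice
    return 3.0 * x * x + 2.0

# 1) Sample the expensive model offline (the costly step, done once)
xs = [0.5 * i for i in range(10)]
ys = [expensive_physics(x) for x in xs]

# 2) Fit a surrogate y ≈ a*x^2 + b by least squares (closed-form 2x2 system)
n = len(xs)
f = [x * x for x in xs]                       # single feature: x squared
sf, sy = sum(f), sum(ys)
sff = sum(v * v for v in f)
sfy = sum(v * y for v, y in zip(f, ys))
a = (n * sfy - sf * sy) / (n * sff - sf * sf)
b = (sy - a * sf) / n

def surrogate(x):
    # 3) Cheap replacement for expensive_physics, usable inside a twin's
    #    optimization loop at negligible cost per evaluation
    return a * x * x + b

print(round(a, 3), round(b, 3), round(surrogate(1.7), 3))
```

The design choice is the same one the manufacturer exploited: spend compute once to learn the simulator's behavior, then trade a small amount of accuracy for orders-of-magnitude faster evaluation inside the twin.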

Even when it meets the foundational prerequisites (including a viable business case, high-quality data, and ongoing operational demand), a digital twin is far from a sure thing. Digital twins can fail to deliver for a number of reasons. One common pitfall is the incorrect configuration of technical components, which can compromise the accuracy and reliability of the digital twin’s simulations and insights. Additionally, the long-term viability of the business case—and thus the digital twin—may be limited as market conditions, technological advancements, or organizational priorities shift over time. Finally, a lack of sustained maintenance and updates to the digital twin can lead to its gradual obsolescence. These challenges underscore the importance of holistic planning, ongoing evaluation, and agile adaptation strategies to ensure the enduring success of digital-twin initiatives.

Taking the next step

With a clear understanding of business objectives and meticulous diligence in implementation, a digital twin can unlock new realms of possibilities. The emergence of generative AI, with its ability to analyze vast quantities of structured and unstructured data and generate insights via user-friendly interfaces, promises to amplify that value even further (see the first post in this series, “Digital twins and generative AI: A powerful pairing”). If your company has a complex and thorny problem that is holding back performance, the technology is well worth your consideration.

Mickael Brossard is a partner in McKinsey’s Paris office; Mithun Kamat is a partner in the Dallas office; Tomás Lajous and Kayvaun Rowshankish are senior partners in the New York office; and Cenk Tunasar is a partner in the Boston office.