More companies are creating digital twins to keep critical systems and processes working as designed. What does IT need to do to prepare?

Mary E. Shacklett, President of Transworld Data

November 3, 2022

5 Min Read
Simulation of car manufacturing by robots, a digital twin
Alexander Tolstykh via Alamy Stock

A digital twin is a digital model of a physical thing or process. Its purpose is to help organizations detect bottlenecks or problems in a process or physical asset before those issues occur, so that proactive, preventive action can be taken.

For example, a digital twin of a physical factory can be constructed to improve manufacturing processes or to analyze the impact of altering an equipment maintenance cycle. Or a digital twin can be used in a smart city to analyze traffic logjams and improve traffic patterns.

For enterprises, using digital twins can prevent the need for costly repairs and avoid adverse impacts on customer service. Using digital twins for analysis and evaluation can also reduce risk.

All of these are reasons why enterprises are increasingly seeking to adopt digital twin technology.

Creating a Digital Twin

Creating a digital twin requires information and data from everyone who uses and creates the system. This is because the reality of any individual thing or process involves a 360-degree view of information and data, so any digital twin that attempts to emulate this reality must have that same view.

If you are modeling the manufacturing process for a certain product, you will need CAD drawings and information for the product from your engineering system, parts information from your manufacturing and purchasing systems, assembly and fabrication information from your manufacturing systems, business process flow information from your ERP or MRP system, cost information from your purchasing, cost accounting, and financial systems, facility information from your building information system, IT flow information from your IT systems, personnel information from your HR systems, and possibly more.
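
As a rough illustration, the consolidation this implies might look something like the sketch below, which merges hypothetical extracts from several source systems into one twin record per part. The file names, fields, and keys are assumptions for illustration only, not a prescription for any particular platform.

```python
# Illustrative sketch: merging extracts from several enterprise systems into
# one digital twin record per part. File names, fields, and keys are hypothetical.
import json


def load_extract(path):
    """Load a JSON extract exported from a source system (CAD, ERP, purchasing, etc.)."""
    with open(path) as f:
        return {row["part_number"]: row for row in json.load(f)}


def build_twin_records():
    sources = {
        "engineering": load_extract("cad_extract.json"),          # CAD drawings and specs
        "purchasing": load_extract("purchasing_extract.json"),    # parts and cost data
        "manufacturing": load_extract("mrp_extract.json"),        # assembly and routing steps
        "facilities": load_extract("building_info_extract.json"), # facility constraints
    }

    twin = {}
    for system, records in sources.items():
        for part_number, attributes in records.items():
            twin.setdefault(part_number, {})[system] = attributes

    # Verification step: flag parts missing data from any source system,
    # since gaps here translate directly into blind spots in the twin.
    incomplete = {part: sorted(set(sources) - set(found))
                  for part, found in twin.items() if len(found) < len(sources)}
    return twin, incomplete
```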

Accumulating and verifying all this information, and then modeling all of the data so it fits and works within the context of a digital twin, is a prodigious task. On the IT side, this task will require active efforts from data analysts, data administrators, applications specialists, business analysts, and network/system personnel.

If your organization is new to digital twin technology, you can secure the assistance of a digital twin consultancy or a digital twin-building software platform. However, none of these outside resources understands your organization or its dynamics. You still need your own people to do the work.

Testing the Digital Twin

Once a digital twin is constructed, it must be rigorously tested against the physical reality of the device or process it is designed to emulate.

If the digital twin emulates a start-to-finish manufacturing process, it must be run in pilot and in parallel with the actual operations of that process until every process step and result is accurately captured in the digital twin and the twin's outputs consistently match those of the physical process.

If the digital twin is constructed to emulate a city's road and highway grid with the intention of monitoring traffic flows and pinpointing traffic jams, it must be able to acquire and process real-time traffic events, map them onto the road network, identify traffic jam points, assess whether a jam is temporary (caused by road construction or an accident), and recognize whether it stems from something structural, such as traffic route design. This digital model must be run against the physical reality of actual road and highway conditions until all the digital twin stakeholders are satisfied that the emulation accurately reflects the physical reality of the city's roads and highways.
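
To make the traffic example a little more concrete, a very simplified version of that event-processing logic might look like the sketch below. The event fields, speed threshold, and cause categories are assumptions chosen for illustration, not how any specific traffic twin works.

```python
# Simplified sketch of the traffic-twin logic described above.
# Event fields, thresholds, and cause labels are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class TrafficEvent:
    road_segment: str
    avg_speed_mph: float
    incident: Optional[str] = None  # e.g., "accident", "construction", or None


def classify_segment(event: TrafficEvent, free_flow_mph: float = 45.0) -> str:
    """Map a real-time event onto a road segment and label the jam, if any."""
    if event.avg_speed_mph >= 0.6 * free_flow_mph:
        return "free_flow"
    if event.incident in ("accident", "construction"):
        return "temporary_jam"   # transient cause
    return "structural_jam"      # likely a route-design problem


def update_twin(events: list[TrafficEvent]) -> dict[str, str]:
    """Return the twin's current view of every reported road segment."""
    return {e.road_segment: classify_segment(e) for e in events}
```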

The process of aligning the results of a digital twin with the physical reality of a thing or process is iterative. Often, the digital twin team of IT and business users uncovers critical pieces of data or information that are missing and adds them. In some cases, the data modeling must be revised to ensure that the connections between various data points are accurate.

In analytics disciplines, the gold standard is 95% accuracy between what analytics yield and what a subject matter expert would conclude. This should be the accuracy target of the digital twin.
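
In practice, that target can be checked with a simple agreement measure: run the twin against a set of test scenarios and compare its outcomes to the conclusions a subject matter expert reached on the same scenarios, as in the sketch below. The data structures and sample values are assumed for illustration.

```python
# Sketch: measuring twin accuracy against subject-matter-expert conclusions.
# Inputs are parallel lists of outcomes for the same test scenarios.

ACCURACY_TARGET = 0.95  # the 95% gold standard discussed above


def agreement_rate(twin_results: list[str], sme_results: list[str]) -> float:
    """Fraction of test scenarios where the twin and the expert agree."""
    if len(twin_results) != len(sme_results) or not twin_results:
        raise ValueError("Result lists must be the same non-zero length.")
    matches = sum(t == s for t, s in zip(twin_results, sme_results))
    return matches / len(twin_results)


if __name__ == "__main__":
    rate = agreement_rate(
        ["jam", "free_flow", "jam", "free_flow"],
        ["jam", "free_flow", "jam", "jam"],
    )
    print(f"Agreement: {rate:.0%} (target {ACCURACY_TARGET:.0%})")
    print("Pass" if rate >= ACCURACY_TARGET else "Keep iterating")
```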

To arrive at this point, the IT QA group must be heavily involved in testing and retesting, as should end users and any area of IT that is required to debug or adjust the digital twin model.

Maintaining the Digital Twin

Once you achieve a digital twin that accurately emulates the process or physical thing you are modeling, you have to ensure continuing accuracy in production.

A business process can change. It can be as subtle as a minor tweak that a manufacturing supervisor makes -- and forgets to tell anyone about.

Data can also drift. What if your company moves to a new purchasing system and new APIs or data transformations are needed to feed the digital twin?
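
Handling that kind of change is often a matter of inserting a small translation layer between the new system's output and the schema the twin already expects, along the lines of the sketch below. The field names on both sides are hypothetical.

```python
# Sketch: adapting records from a hypothetical new purchasing system to the
# field names the digital twin already expects. All field names are illustrative.

FIELD_MAP = {
    "vendorId": "supplier_id",
    "unitCost": "cost_per_unit",
    "sku": "part_number",
}


def adapt_purchasing_record(new_system_record: dict) -> dict:
    """Rename and validate fields so the twin's data model stays unchanged."""
    missing = [src for src in FIELD_MAP if src not in new_system_record]
    if missing:
        # Surface drift early rather than letting incomplete data reach the twin.
        raise KeyError(f"New purchasing feed is missing fields: {missing}")

    return {twin_field: new_system_record[src_field]
            for src_field, twin_field in FIELD_MAP.items()}
```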

Finally, there is day-to-day maintenance. Like other software, digital twins will have bugs. When a processing or data error occurs, IT must be on hand to fix the problem.

Final Remarks

The future of enterprise IT will increasingly move toward virtual reality and visual modeling informed by real-time data.

Digital twins are an integral part of this future.

But to develop, refine, and support digital twins, IT must also reform its development, testing and maintenance strategies.

Digital twins that are actively used in production must be accurate. Like an online transaction system, they must be kept up and running. If something goes wrong, IT must immediately move in and fix it. If this isn't done, an erroneous decision could cause catastrophic damage to the company.

It’s not too early for CIOs and other IT leaders to think through the digital twin life cycle, and how IT will support it.


About the Author(s)

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.
