Myths, reality and getting a clear definition of this thing we call the digital twin
One of the great challenges surrounding the digital twin is accurately defining the thing. You have to know what you're talking about before you can talk about it productively.
The digital twin is a digital model related to the IoT that includes context information to optimize business performance by employing virtual replicas.
Confused yet? You’re not alone.
Clearing up that confusion was at the core of Enrique Herrera's presentation—"Digital Twins: Myth vs. Reality"—during last week's OSIsoft Chicago Regional Seminar focused on industrial analytics. Countering all of that confusing jargon, Herrera, OSIsoft's manufacturing industry principal, described the digital twin as "a digital ball of goodness."
Much easier to understand.
The speaker detailed the different iterations of what we call a digital twin. Some want a digital twin of the product they are manufacturing—the product their customers are using. Some use a digital twin of a process to understand more accurately how they are operating. Some mean the digital twin of the equipment inside the factory. Then there are subcomponents of that approach: they may have a digital model of a machine, but they also have a digital model of a pump within that machine. A focused digital twin inside the larger digital twin.
At the core, digital twins are merely models, no matter what form they take, no matter what asset or process or concept they replicate, said Herrera, who told the audience: "All models are wrong, but some are useful. Whether they are physics-based models or analytics models or asset-centric models, they all help you understand how to roll up data."
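To make that nesting and roll-up concrete, here is a minimal sketch in Python of an asset-centric twin hierarchy. Everything in it—the DigitalTwin class, the asset names, the readings—is hypothetical illustration, not OSIsoft's data model: a machine twin contains a pump twin, and the pump's readings roll up into the machine's view of itself.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """A minimal asset-centric twin: a named asset with sensor readings and nested child twins."""
    name: str
    readings: dict[str, float] = field(default_factory=dict)   # latest sensor values
    children: list["DigitalTwin"] = field(default_factory=list)

    def roll_up(self) -> dict[str, float]:
        """Aggregate this twin's readings with those of every twin nested inside it.

        Child values are namespaced by asset name so the pump's signals
        stay distinct when they roll up into the machine's view.
        """
        rolled = dict(self.readings)
        for child in self.children:
            for key, value in child.roll_up().items():
                rolled[f"{child.name}.{key}"] = value
        return rolled

# A focused twin (a pump) nested inside the larger twin (the machine).
pump = DigitalTwin("pump_01", readings={"flow_rate_lpm": 112.4, "vibration_mm_s": 2.1})
machine = DigitalTwin("press_line_3", readings={"throughput_uph": 840.0}, children=[pump])

print(machine.roll_up())
# {'throughput_uph': 840.0, 'pump_01.flow_rate_lpm': 112.4, 'pump_01.vibration_mm_s': 2.1}
```

Namespacing child readings by asset name is one simple way to keep the focused twin's signals distinct when they roll up into the larger twin.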
Key considerations when working with digital twins include:
- The economy of the data—What is the cost of data collection versus the cost of asset downtime?
- Collaboration—Where do you start, and who within your organization should spearhead the effort?
- Data governance—There is always the fear of data spiraling out of control; with that in mind, how do you govern the data, structure it, and maintain its integrity?
- Scalability—Do you have to scale down to the data sources or scale up to some massive computing environment?
In conclusion, Herrera stressed the need to simplify the application of the digital twin, even as he gave that approach a complex-sounding label: operationalizing the analytics.
“If you can get a mathematical representation of scenarios,” he said, “then you get benefits from the digital twin into an operational environment.”
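As a hedged illustration of what that might look like in practice—a sketch with a made-up linear model and invented numbers, not anything Herrera presented—the snippet below embeds a simple mathematical representation of expected pump behavior in an operational check that flags live readings that drift from the model:

```python
def expected_flow(rpm: float) -> float:
    """Hypothetical model: expected pump flow (L/min) as a linear function of speed."""
    return 0.45 * rpm + 12.0

def check_asset(rpm: float, measured_flow: float, tolerance: float = 0.10) -> bool:
    """Operational check: is the live reading within `tolerance` of the model's prediction?"""
    predicted = expected_flow(rpm)
    deviation = abs(measured_flow - predicted) / predicted
    return deviation <= tolerance

# A streamed reading from the plant floor: 220 rpm, but only 84 L/min measured.
# The model predicts 111 L/min, a ~24% shortfall, so the check flags it.
if not check_asset(rpm=220.0, measured_flow=84.0):
    print("Investigate: measurement deviates from the model's expected behavior")
```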