This article is the final part of a series that calls on subject matter experts to look ahead to digital transformation and manufacturing trends in 2024.
Also next month, watch for Smart Industry's annual Crystal Ball Report, a downloadable e-handbook available to members in which a larger group of experts forecasts "smart" manufacturing milestones in the new year.
Operational twins will be among the most interesting and crucial manufacturing digitization projects in 2024, judging by the interest and requests we have been receiving from clients. Our research shows that investment in digitizing factories and warehouses is increasing as companies strive to become more operationally resilient.
What is an operational twin? It’s the notion of a digital twin applied to the production process. If you now think of flashy, dynamic, and interactive 3D replicas of plants, stop right there.
See also: The Crystal Ball Report 2024: A preview podcast
Operational twins integrate data from disparate IT and OT systems: BOM data, product routings, maintenance and quality data, and energy and emissions information. From that data, the twin generates a view of operations that employees can interrogate with questions such as, "What is the impact of raw material spec conformity on the yield of the production lines?"
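As a rough sketch of what answering that question can look like once the relevant data sits in one contextualized model (the systems, tables, and column names below are invented for illustration):

```python
import pandas as pd

# Hypothetical, simplified extracts from a quality system and an MES;
# table and column names are illustrative, not from any specific vendor.
material_lots = pd.DataFrame({
    "lot_id": ["A1", "A2", "A3", "A4"],
    "spec_conformant": [True, True, False, False],
})
production_runs = pd.DataFrame({
    "run_id": [101, 102, 103, 104],
    "lot_id": ["A1", "A2", "A3", "A4"],
    "line": ["L1", "L1", "L2", "L2"],
    "yield_pct": [97.2, 96.8, 91.5, 90.9],
})

# The twin's job is to make this join trivial: material context from the
# quality system is already linked to yield data from the MES.
runs = production_runs.merge(material_lots, on="lot_id")
print(runs.groupby("spec_conformant")["yield_pct"].mean())
```

The two-line analysis is trivial; the twin's contribution is that the quality and production data are already linked, rather than living in separate systems and spreadsheets.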
How does an operational twin provide a clearer picture of operations?
An operational twin makes information from all the highly specific optimization and decision-making IT and OT applications accessible in dashboards, reports, and low-code/no-code applications.
It contextualizes that information so it becomes directly usable and can be enriched into shared knowledge. This differs from traditional architectures, where data is collected, collated, and communicated vertically through established, static integrations.
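A minimal sketch of that contextualization idea, assuming invented tag names and a toy asset hierarchy: raw historian or PLC tags are mapped once into a shared asset model, and every dashboard, report, or app reads through that model instead of through its own static integration.

```python
# Raw OT tag names are mapped once into a shared asset model, so downstream
# consumers ask for "line 1 filler motor temperature" rather than a cryptic
# historian tag. Tags and the hierarchy are invented for illustration.
ASSET_MODEL = {
    "plant_a/line_1/filler/motor_temp_c": "PLC17.DB4.REAL12",
    "plant_a/line_1/filler/throughput_uph": "HIST.TAG.FL1_THRPT",
}

def read_signal(asset_path: str, historian: dict) -> float:
    """Resolve a semantic asset path to its raw tag and read the value."""
    raw_tag = ASSET_MODEL[asset_path]
    return historian[raw_tag]

# Fake historian snapshot standing in for a real OT data source.
snapshot = {"PLC17.DB4.REAL12": 71.4, "HIST.TAG.FL1_THRPT": 1180.0}
print(read_signal("plant_a/line_1/filler/motor_temp_c", snapshot))
```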
What does an operational twin change from a managerial perspective?
It addresses one of digital manufacturing's most pressing challenges: complexity. Across industries, most manufacturing and IT managers now understand two critical benefits an operational twin brings:
- Rather than replacing the existing manufacturing operations management (MOM) stack, the twin complements it to handle increasingly complex data-based use cases and applications.
- The twin provides a unified semantic model of operations that isolates engineers, data scientists, and developers from the often intractable complexity of the underlying manufacturing systems.
This is reflected in increasingly insightful requests from clients, many of which share the same theme: "Our existing MOM and manufacturing architecture won't allow us to reach our digital manufacturing goals; we need a future-ready architecture." The ultimate ambition behind these requests is often reaching autonomous operations, something never even mentioned three years ago.
What is the business case for operational twins?
Trying to justify the investment based on a particular use case is a losing proposition, as any particular use case can most likely be developed without investing in an operational twin.
See also: Powering cognitive digital threads with Gen-AI to accelerate innovation
The twin's value is in its ability to support many disparate use cases without creating the data and application silos that can progressively "choke" manufacturing architectures. Instead, each use case adds to the completeness of the model, creating a virtuous cycle of shorter time-to-value.
But haven't things been working well enough so far? It's true that smart engineers have generated billions in value over the last two decades, solving thousands of use cases with Excel workbooks. These workbooks, roughly sketched in code after this list, are still the most widely used tool to:
- Extract data from a variety of IT/OT systems.
- Contextualize this data in some implicit model.
- Develop analytics to support decisions.
- Implement those decisions by manually keying new set points into an operational system.
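A rough script equivalent of that workbook pattern, with invented systems, tags, and a made-up decision rule, might look like this:

```python
import statistics

# 1. Extract data from a variety of IT/OT systems (here, canned values).
oven_temps_c = [182.1, 183.4, 181.7, 184.0, 182.8]   # from a historian
scrap_rate_pct = 2.7                                  # from the MES

# 2. Contextualize it in an implicit model that lives only in this file:
#    "scrap above 2.5% means the oven is running too hot."
SCRAP_LIMIT_PCT = 2.5

# 3. Develop the analytic that supports the decision.
avg_temp = statistics.mean(oven_temps_c)

# 4. "Implement" the decision: in practice an engineer keys the new set
#    point into the control system by hand.
if scrap_rate_pct > SCRAP_LIMIT_PCT:
    new_setpoint = avg_temp - 1.0
    print(f"Lower oven set point to {new_setpoint:.1f} °C")
```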
But there are serious problems with such informally created operational intelligence nuggets and the granular process expertise embedded in these thousands of workbooks.
For starters, these workbooks are tied to one plant's specific systems, making the organization dependent on them and the approach nonscalable by design. Furthermore, the workbooks are highly use-case-specific and not "interoperable" with any other use case in the plant (e.g., how does maintenance affect quality or performance?). And many of these nuggets are lost when the manager or engineer who built them leaves or retires.
Most importantly, though, these workbooks are impossible to integrate into the kind of "closed loop" logic that can contribute to the future autonomy of machines, lines, and production plants.
Why do operational twins make organizations future-ready?
A true operational twin is the next generation of data management platform, one that engineers can use to deliver the myriad use cases that together add a couple of points of overall equipment effectiveness (OEE) per year.
The twin also addresses the fundamental issue just described. One of its key features is that it decouples the functional model of the machine, line, or plant from the specifics of each, making use cases easily scalable across plants. The functional model (also known as an industrial knowledge graph) unifies all "local" models into a semantic model of operations that managers and engineers can evolve and extend without having to become IT experts.
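A minimal sketch of that decoupling, with invented signal and tag names: the functional model is defined once, each plant supplies only a binding from that model to its local tags, and a use case written against the model then runs unchanged at any plant.

```python
# One functional model of a packaging line, bound per plant to whatever
# local tags happen to exist. All names are invented for illustration.
FUNCTIONAL_MODEL = ["infeed_rate", "reject_count", "energy_kwh"]

PLANT_BINDINGS = {
    "plant_a": {"infeed_rate": "A.PLC3.RATE", "reject_count": "A.VIS.REJ",
                "energy_kwh": "A.PM.KWH"},
    "plant_b": {"infeed_rate": "B.SCADA.IN_RATE", "reject_count": "B.QC.REJECTS",
                "energy_kwh": "B.EMS.ENERGY"},
}

def line_snapshot(plant: str, read_tag) -> dict:
    """Return the same functional view of a line regardless of plant-specific tags."""
    return {signal: read_tag(PLANT_BINDINGS[plant][signal]) for signal in FUNCTIONAL_MODEL}

# Fake tag values standing in for a live data source; any use case written
# against FUNCTIONAL_MODEL works the same way at plant_a or plant_b.
fake_values = {"A.PLC3.RATE": 620.0, "A.VIS.REJ": 4, "A.PM.KWH": 37.2}
print(line_snapshot("plant_a", fake_values.get))
```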
See also: With Industrial Copilot, Siemens and Schaeffler help make Gen-AI ‘industrial grade’
With the rapid development of generative AI, or Gen-AI, the knowledge graph will allow the organization to capture the tacit knowledge that walks out the door when plant associates retire, and to get incoming talent up to speed.
Operational twins will likely be the 1% of a manufacturing head's yearly investment budget that allows their engineers to get the best out of the other 99%. The twins will require more governance and rigor than hundreds of ad hoc workbooks, but that's the whole point: high-quality, contextualized data should be the foundation of any digital strategy, and even more so in manufacturing.