Edge computing best practices for connecting disparate systems
By: John Fryer, senior director—industry solutions, Stratus Technologies
Companies wanting to create a true digital transformation must determine the best ways
to interconnect their intelligent assets and growing numbers of IoT devices. This is easier said than done. They often encounter challenges as they merge IT and OT systems using varying forms of networking communications technology (CT).
While IT networks come in various shapes and sizes, they share consistent elements. Such is not the case when a network must reach into the industrial edge, where OT systems live and work. This industrial world can include anything from an auto-assembly plant to a brewery, and typically consists of both legacy systems and cutting-edge devices. Office and commercial operations also commonly have field edges supporting building control and other mechanical systems.
Trying to integrate these edge elements with the core IT systems is challenging because of the equipment involved. Motors, conveyor belts, packaging lines, compressors, pumps, distillation towers—these things are controlled and monitored using systems often unfamiliar to IT personnel. Without understanding the nuances of OT networks, interconnection often results in a sub-optimal system. Yes, there’s connectivity, but network availability may be reduced, data storage can be inadequate and cybersecurity vulnerabilities might emerge. Eventually, these problems can bleed into the core networks, causing slowdowns and congestion.
Such situations stem from pursuing the wrong approach to OT-asset management. All the specialized hardware and software supporting these functions can’t be strung together using a semiautomated provisioning/management approach. Success instead depends on an automated software-defined infrastructure overlay to handle management and orchestration of hardware and software assets.
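To make the idea concrete, the rough sketch below shows the kind of desired-state reconciliation such an overlay performs: declared assets and applications are compared against what is actually running, and the gap becomes a list of provisioning actions. The asset names and fields are hypothetical, and a real overlay would drive actual provisioning APIs rather than print actions.

```python
# Minimal sketch of desired-state reconciliation, the core idea behind a
# software-defined management overlay. Asset names and fields are hypothetical.

desired_state = {
    "line-3-gateway": {"os": "linux", "apps": ["historian", "opcua-bridge"]},
    "packaging-hmi": {"os": "linux", "apps": ["scada-client"]},
}

observed_state = {
    "line-3-gateway": {"os": "linux", "apps": ["historian"]},
}

def reconcile(desired, observed):
    """Return the provisioning actions needed to reach the desired state."""
    actions = []
    for asset, spec in desired.items():
        current = observed.get(asset)
        if current is None:
            actions.append(("provision", asset, spec))
            continue
        for app in spec["apps"]:
            if app not in current["apps"]:
                actions.append(("deploy", asset, app))
    return actions

for action in reconcile(desired_state, observed_state):
    print(action)  # e.g., ('deploy', 'line-3-gateway', 'opcua-bridge')
```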
Using best practices
IT architects approaching the OT world to establish this infrastructure (especially the edge-computing elements) should follow three critical design best practices:
First, stick with a use-case-driven approach. While conventional IT-network designs are shifting toward mixed-workload models, edge-computing architecture is still driven by concepts as basic as where a device sits in a facility and what hardware form factor it takes. Placing a hardware device in a hot, dusty area of a plant or building can limit its survivability. This may mean asking whether a device should be embedded or standalone based on its environment, and what duties it must perform. Is it interfacing directly with IT-, OT- or CT-related data? Does it communicate via wireless, “standard” Ethernet or an industrial variant, USB or another interface? The answers to these questions will inform the selection of the compute platform and operating system.
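One way to make these questions concrete is to capture them as a simple profile and derive a hardware recommendation from it. The sketch below is purely illustrative; the profile fields, thresholds and form factors are assumptions rather than a product specification.

```python
from dataclasses import dataclass

@dataclass
class EdgeUseCase:
    """Hypothetical profile capturing the questions that drive platform choice."""
    location_temp_c: int   # worst-case ambient temperature at the install site
    dusty: bool            # exposed to particulates?
    data_domains: set      # e.g., {"IT", "OT", "CT"}
    interfaces: set        # e.g., {"industrial-ethernet", "wifi", "usb"}

def recommend_form_factor(uc: EdgeUseCase) -> str:
    # Harsh environments push toward sealed, fanless, embedded hardware.
    if uc.location_temp_c > 45 or uc.dusty:
        return "fanless embedded industrial PC"
    # Mixed data domains and many interfaces suggest a standalone edge server.
    if len(uc.data_domains) > 1 or len(uc.interfaces) > 2:
        return "standalone edge server"
    return "DIN-rail gateway"

brewery_line = EdgeUseCase(55, True, {"OT"}, {"industrial-ethernet", "usb"})
print(recommend_form_factor(brewery_line))  # fanless embedded industrial PC
```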
Second, consider the applications hosted on the edge platform. Working hand-in-hand with the concept of use-case centricity, it is important to consider which types of applications (IT, OT or CT) will be hosted in a particular part of the infrastructure and how they will be deployed. Virtualization, through virtual machines and containers, can bring core-like functionality to the edge while supporting concurrency and sharing. Additionally, the infrastructure must support the types of communication capabilities—wired, Wi-Fi, cellular or industrial wireless—needed to provide connectivity between the edge and the core.
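As a loose illustration of that hosting decision, the sketch below assigns each application to a virtual machine or a container based on a few assumed attributes; the applications, attributes and rules are examples, not a definitive policy.

```python
# Illustrative rules for choosing a hosting model on an edge platform.
# Application attributes are assumptions made for the example.

applications = [
    {"name": "legacy-scada", "domain": "OT", "needs_full_os": True},
    {"name": "mqtt-broker", "domain": "CT", "needs_full_os": False},
    {"name": "analytics-api", "domain": "IT", "needs_full_os": False},
]

def hosting_model(app: dict) -> str:
    # Legacy software that expects a dedicated OS image fits a virtual machine.
    if app["needs_full_os"]:
        return "virtual machine"
    # Lightweight, restart-tolerant services fit containers and share resources well.
    return "container"

for app in applications:
    print(f'{app["name"]}: {hosting_model(app)} ({app["domain"]} workload)')
```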
Third, designers must meet service-level objectives. Designing infrastructure that delivers exactly the right mix of computing and data-persistence resources from startup is difficult at best. Nonetheless, it is critical to support the use-case-specific applications identified earlier. Success here depends on building elasticity and scalability into the infrastructure so computing resources and applications can be added or removed as necessary. This calls for a mix of core/cloud, fog and edge-only application deployments, which may need to be adjusted as operational experience is gained.
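The sketch below shows one way an assumed latency target and data-volume figure might drive the choice among edge-only, fog and core/cloud deployment; the thresholds are placeholders meant to be tuned as operational experience accumulates.

```python
def placement_tier(latency_slo_ms: float, daily_data_gb: float) -> str:
    """Pick a deployment tier using assumed, illustrative thresholds."""
    if latency_slo_ms < 10:
        return "edge-only"   # tight control loops stay next to the machinery
    if latency_slo_ms < 100 or daily_data_gb > 500:
        return "fog"         # aggregate nearby rather than hauling bulk data to the core
    return "core/cloud"      # relaxed latency and modest volume can go upstream

print(placement_tier(5, 20))      # edge-only
print(placement_tier(50, 800))    # fog
print(placement_tier(2000, 10))   # core/cloud
```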
Navigate the data
Initial forays into OT can leave IT architects swimming in data from thousands of sensors and endpoints. If the practices suggested so far have been implemented, management and analysis become much easier tasks. A hybrid data-management framework incorporating a data-lake platform for combining structured and unstructured data can avoid a flood. It can support effective management of raw sensor and device data, with initial analysis directing it to the appropriate part of the core.
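As a simplified picture of that triage, the sketch below batches routine readings for the data lake and forwards out-of-range readings for immediate analysis in the core; the sensor names, alarm limits and routing are hypothetical.

```python
# Hypothetical edge-side triage: routine readings are batched for the data lake,
# out-of-range readings are flagged for immediate analysis in the core.

ALARM_LIMITS = {"pump-7-vibration": 4.0, "kettle-2-temp": 102.0}  # assumed limits

readings = [
    {"sensor": "pump-7-vibration", "value": 1.2},
    {"sensor": "kettle-2-temp", "value": 104.5},
    {"sensor": "pump-7-vibration", "value": 5.1},
]

lake_batch, hot_path = [], []
for r in readings:
    if r["value"] > ALARM_LIMITS.get(r["sensor"], float("inf")):
        hot_path.append(r)    # urgent: route to core analytics right away
    else:
        lake_batch.append(r)  # routine: persist in the data lake for later analysis

print("to data lake:", lake_batch)
print("to core analytics:", hot_path)
```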
Multi-pronged security
Extending connectivity to the edge can introduce a host of security issues, since typical industrial equipment was designed with little protection in mind. Industrial equipment also has a long lifespan, which compounds the problem. Anyone designing edge systems must understand the need for a multi-pronged approach to remediate these deficiencies, using security appliances where needed, combined with secure applications and operating systems. Network activity must be monitored continuously for any kind of physical, network, application or data breach.
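As one small piece of such monitoring, the sketch below compares observed connection counts against an assumed baseline and raises alerts on large deviations or unknown devices; the figures here are invented, and a production system would pair checks like this with physical, application and data-level safeguards.

```python
# Simplified anomaly check on network activity. Baseline figures and the
# deviation threshold are invented for illustration.

baseline_connections = {"plc-12": 4, "hmi-3": 6, "gateway-a": 20}

def check_activity(observed: dict, threshold: float = 3.0) -> list:
    """Return alerts for unknown devices or connection counts far above baseline."""
    alerts = []
    for device, count in observed.items():
        expected = baseline_connections.get(device)
        if expected is None:
            alerts.append(f"{device}: unknown device on the network")
        elif count > expected * threshold:
            alerts.append(f"{device}: {count} connections (baseline {expected})")
    return alerts

print(check_activity({"plc-12": 4, "hmi-3": 25, "rogue-laptop": 2}))
```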
Of course, any solution for communicating with and securing the edge requires more than bolting on accessories. Benefitting from the data and information at the edge calls for a systematic approach following proven best practices to deploy edge platforms and manage the resulting data flows.