By Bob Sperber
Analysts predict the number of semiconductors shipped for use in IoT applications will grow from some 30 billion chips today to more than 70 billion annually within the next five years. Meanwhile, enormous cloud datacenter campuses continue to proliferate worldwide, and are now measured on an energy-consumption scale once reserved for the power plants that feed them.
On the edge, in the cloud and embedded within the pervasive networks that weave these digital worlds together, the designer of industrial automation and information management solutions today has at his or her disposal computing power that was unimaginable only a few years ago.
Increasingly, it’s not a question of computing resource availability, but rather where it’s best to solve which types of problems. Oftentimes the edge wins out for reasons of performance and determinism. Other times, the cloud carries the day because new applications can be spun up more quickly and readily address problems that are broader in scope than a single unit or facility.
Indeed, an informal tour of industry leaders indicates that neither the edge nor the cloud seldom takes all. Rather, an integrated, hybrid architecture that combines edge and cloud execution promises the best of both worlds.
Chipmaker, heal thyself
Take Intel, for example. The semiconductor manufacturer is both creator and user of the technology that makes the digital industrial age possible, and it recently completed its first test of a scalable, edge-to-cloud predictive maintenance solution for use in its own fabs.
The project sought to automate the labor-intensive process of monitoring the fan filter units (FFUs) that purify air for manufacturing. To optimize vibration data collection from each FFU and allow quick event triggering, Intel leveraged its GE Digital factory automation platform and its own Intel IoT gateways to create an edge solution. Locating data processing on-site made more sense in terms of processing power, cost and time.
“The fact that we analyze at the edge will optimize the traffic that crosses the network. If we were to send all data to the cloud, the solution would be far too expensive,” says Chet Hullum, Intel’s general manager, Industrial Internet of Things. Yet there remains a role for the cloud: collecting summary-level data for long-term trend analysis. The project has been a success, based on results including:
- A more than 97% increase in FFU uptime, due to early parts ordering upon detection of potential failures, and rapid replacement.
- A threefold reduction in unscheduled downtime compared with manual, labor-intensive FFU monitoring, which in turn boosted productivity.
- An estimated reduction in cloud traffic of nearly 94%.
Hullum calls the foray into edge computing an “ideal example” of leveraging an IoT gateway at the “point of ingestion” and optimizing system scalability. He reports that Intel is now expanding the FFU solution to more production lines. The company is also extending the edge-based predictive maintenance architecture “to two additional use cases this year,” including one project to predict pump failures before they occur.
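The point-of-ingestion pattern Hullum describes can be sketched in a few lines. The alarm threshold, window size and record formats below are illustrative assumptions only (the actual Intel/GE Digital implementation is not public): the gateway raises vibration events locally and forwards just a compact summary record to the cloud for long-term trending.

```python
import statistics
from collections import deque

# Hypothetical values for illustration; real FFU thresholds, sampling rates
# and gateway record formats are not public.
VIBRATION_ALARM_G = 0.35   # raise an edge event above this RMS level (assumed)
WINDOW_SIZE = 600          # samples kept in the rolling edge buffer (assumed)

class EdgeVibrationMonitor:
    """Runs on the gateway: raises events locally, ships only summaries."""

    def __init__(self):
        self.window = deque(maxlen=WINDOW_SIZE)

    def ingest(self, sample_g):
        """Handle one vibration sample at the point of ingestion.

        Returns an event record immediately when the alarm threshold is
        crossed; otherwise the sample simply joins the rolling window
        used for periodic summaries.
        """
        self.window.append(sample_g)
        if sample_g > VIBRATION_ALARM_G:
            return {"type": "ffu_vibration_alarm", "value_g": sample_g}
        return None

    def summary(self):
        """Summary-level record sent to the cloud for long-term trending."""
        return {
            "type": "ffu_vibration_summary",
            "mean_g": round(statistics.fmean(self.window), 3),
            "max_g": max(self.window),
            "samples": len(self.window),
        }

if __name__ == "__main__":
    monitor = EdgeVibrationMonitor()
    for sample in (0.05, 0.07, 0.06, 0.41, 0.08):   # simulated readings
        event = monitor.ingest(sample)
        if event:
            print("edge alarm:", event)     # acted on locally, right away
    print("to cloud:", monitor.summary())   # one small record, not raw samples
```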
This sort of solution squares neatly with the Hewlett Packard Enterprise (HPE) view of an applications architecture that is “edge-centric, cloud-enabled and data-driven,” said CEO Antonio Neri, in a keynote address to attendees of the recent ABB Customer World 2019 event in Houston.
HPE has long contended that the new digital architecture for industry will be a hybrid, edge-centric one due in part to compliance (data ownership/privacy), latency and bandwidth issues. “75% of all data is created at the edge, where we live and work,” Neri said. Meanwhile, only 6% of that data is put to use, and sending it to the cloud can make matters worse, Neri added, likening the cloud to The Eagles’ Hotel California: “Once it’s in, it can be really hard to check your data out.”
“We believe the better solution is an edge-to-cloud architecture, where you only move data to the cloud as needed,” Neri said. “It’s all about managing that data effectively, and extracting outcomes faster.”
New revenue in new services
One case in point is the development by industrial equipment manufacturers of new cloud services enabled by local, edge-based analytics for assets in the field. Caterpillar, for one, recently built on cloud services from OSIsoft to offer its own Asset Intelligence platform to analyze fuel consumption, equipment health and other critical operations. These cloud-based services, which work in conjunction with analytics performed locally, helped one operator of large marine vessels save $450,000 in fuel per ship annually by optimizing hull-cleaning maintenance to reduce drag. The service also helped a cruise line save $1.5 million per ship in reduced fuel consumption.
Elsewhere, Gardner Denver, global provider of industrial equipment, employed Software AG’s Cumulocity IoT platform to launch a subscription-based condition monitoring service for its IoT-enabled compressors. Users can remotely monitor operational parameters in near real-time, and receive notice of fault conditions. These instances of remote monitoring, maintenance and management of manufactured assets show how edge analytics can be parlayed into cloud services that bring new value to stakeholders.
Industry players also are looking for the killer use case to unlock new analytics-driven services. One new arena that’s got Steve Carlini excited is electrical energy storage and grid management. As the traditional, centralized energy grid continues shifting to distributed generation and microgrids, lithium-ion battery banks will emerge as a key resource for buffering supply and demand, says the Schneider Electric vice president of innovation and data center. “With the right amount of information and the right analytics, you can start discharging these batteries to cut electric costs or supplement power when it’s needed.”
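The decision logic Carlini hints at can be sketched simply. The tariff threshold, reserve rule and one-hour horizon below are illustrative assumptions, not Schneider Electric’s algorithm: discharge only when grid power is expensive and the battery has energy to spare above a backup reserve.

```python
# Minimal peak-shaving sketch: decide when to discharge a battery bank to
# offset expensive grid power. All numbers and the decision rule are
# illustrative assumptions.

PEAK_PRICE_PER_KWH = 0.28     # hypothetical tariff above which discharge pays off
MIN_RESERVE_FRACTION = 0.20   # keep 20% of capacity in reserve for backup power

def discharge_kw(site_load_kw, grid_price_per_kwh, state_of_charge, capacity_kwh):
    """Return how much battery power (kW) to discharge over the next hour."""
    usable_kwh = max(0.0, (state_of_charge - MIN_RESERVE_FRACTION) * capacity_kwh)
    if grid_price_per_kwh < PEAK_PRICE_PER_KWH or usable_kwh == 0.0:
        return 0.0   # power is cheap or the battery is at reserve: do nothing
    # Cover as much of the site load as one hour of usable energy allows.
    return min(site_load_kw, usable_kwh)

if __name__ == "__main__":
    # 400 kW site load during a $0.32/kWh peak, battery at 75% of 1,000 kWh.
    print(discharge_kw(400, 0.32, 0.75, 1000))   # -> 400.0 kW from the battery
```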
Edge analysis + cloud perspective
Industry leaders also are seeking to push analytics to the next level: from predictive to prescriptive. Currently, 50% of industrial firms have an industrial analytics program in place or in pilot, while another 48% plan to launch one within the next three years, according to LNS Research. Firms are, however, finding it difficult to break into the prescriptive realm because “it takes far more information than is available at the edge,” says Dan Miklovic, LNS Research fellow. There’s new value to be found in systems that can tell operators, “run this bearing at X speed to meet the production schedule, then take it offline for service,” Miklovic says. Now, edge-to-cloud systems are targeting such solutions.
NRG Energy, which supplies electricity to more than 38 million U.S. households, credits prescriptive analytics with increasing turbine efficiency to save an estimated $5 million a year, with zero impact on planned outage schedules. These results are based on the company’s implementation of GE Digital’s Predix platform to bring essential turbine data from the edge to the cloud. There, analytics blend real-time production data, external lifing models for turbine components, and periodic pricing and weather reports “to let plant operators know the most profitable way to operate the turbine,” explains Amy Aragones, senior director of product management, GE Digital.
This hybrid edge/cloud solution analyzes how and when to run turbines safely beyond baseload conditions during periods of peak market prices, and then guides users on how and when to under-fire the turbines to recoup the wear incurred during the peak-profitability hustle. This reportedly preserves both expected turbine service life and planned maintenance schedules.
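The prescriptive trade-off can be sketched in simplified form. The linear wear model, prices and parameter names below are invented for illustration and are not GE’s lifing models: over-fire only when the extra peak-price margin exceeds the added wear cost, then schedule under-fired hours to recoup that wear.

```python
# Simplified sketch of an over-fire/under-fire recommendation. All figures
# and the linear wear model are illustrative assumptions.

def peak_fire_recommendation(peak_price, base_price, extra_mwh,
                             wear_cost_per_mwh, recoup_hours_per_mwh):
    """Recommend whether to run beyond baseload during a price peak."""
    extra_margin = (peak_price - base_price) * extra_mwh
    wear_cost = wear_cost_per_mwh * extra_mwh
    if extra_margin <= wear_cost:
        return {"over_fire": False, "reason": "peak margin does not cover added wear"}
    return {
        "over_fire": True,
        "net_gain_usd": extra_margin - wear_cost,
        # under-fired operation later restores the planned maintenance interval
        "recoup_hours": recoup_hours_per_mwh * extra_mwh,
    }

if __name__ == "__main__":
    print(peak_fire_recommendation(peak_price=180, base_price=40, extra_mwh=50,
                                   wear_cost_per_mwh=45, recoup_hours_per_mwh=0.5))
```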
For leading photovoltaic manufacturer First Solar, most analytics are performed in the cloud. Beyond the on-premises capabilities of its Rockwell Automation control and information management infrastructure, data from virtually all machines, PLCs, robots and other IoT-enabled devices are sent to the cloud for deeper and more broadly based analysis. In a single month, First Solar’s four plants send five billion manufacturing database records to the cloud, each containing approximately 100 data points on equipment performance.
Cloud-based services make sense because “we’re not looking at a single piece of equipment at a single location, we’re looking to compare all similar equipment at every one of our locations,” says Allen Blackmore, IT domain architect for global enterprise technology, First Solar. He envisions analytics on an enterprise-wide data lake of unstructured data spanning manufacturing, sales, finance and supply chain functions (essentially the entire business), a scope that necessarily transcends the limits of an edge-only approach.
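A toy example of the cross-site comparison Blackmore describes, with hypothetical plant names, tool IDs and yield figures: once every site’s records land in one place, the same class of equipment can be ranked fleet-wide, something a single plant’s edge system cannot see on its own.

```python
# Hypothetical fleet-wide comparison; table and column names are invented.
import pandas as pd

records = pd.DataFrame({
    "plant":     ["Ohio", "Ohio", "Malaysia", "Malaysia", "Vietnam", "Vietnam"],
    "tool_id":   ["laminator-01", "laminator-02"] * 3,
    "yield_pct": [97.1, 96.4, 98.0, 97.7, 95.9, 96.8],
})

# Rank the same class of equipment across every site to surface outliers.
fleet_view = (records.groupby("plant")["yield_pct"]
                     .agg(["mean", "min"])
                     .sort_values("mean", ascending=False))
print(fleet_view)
```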
Modernizing made profitable
Bringing IoT capability to legacy assets “is one of the biggest issues that enterprises face,” says Ricardo Buranello, vice president of global factory solutions at Telit, an IoT infrastructure provider. But it’s worth the effort, he says, citing results achieved at a customer that manufactures automotive axles. By connecting and analyzing data from 1,000 formerly isolated CNC machines, the company increased productivity by the equivalent of an additional 50 assemblies per day.
“We’re seeing an enormous increase in the tag values companies would like to collect from their legacy machines, like 30-year-old lathes,” says Dave Cronberger, infrastructure architect with Cisco Systems, adding that manufacturers are showing “a strong belief that they’re going to gain new insights from their controls and I/O blocks, and learn new things that they don’t know currently.”
Mitsubishi Electric Automation has alternately put edge and cloud strategies to work on a range of manufacturing challenges. When the use case called for real-time analytics to predict and improve electroplating quality or to control defects in injection molding, the company focused on local execution: “It would have cost us a fortune to put all our production data into the cloud; we’d run out of space in 10 minutes,” says Timothy Lomax, strategic alliance manager.
Elsewhere, Mitsubishi has used Oracle’s cloud and business applications to increase the accuracy of pick-and-place robots, improving the visual detection and rejection of off-spec product. The project, now underway, uses the cloud for robot data, images, data trending and artificial intelligence analyses. Eric Prevost, vice president and global head of emerging technologies for Industry 4.0 at Oracle, reports working “in coordination with edge-device providers for many projects” at Mitsubishi and elsewhere.
Research from Wikibon predicts the evolution of a coordinated edge/cloud network model spanning sensor to supply chain. The research firm modeled a small wind farm 200 miles from the cloud data center, with IoT-connected security cameras, security sensors, sensors on the wind turbines, and access sensors at all employee physical access points. The result: when an edge network handled 95% of the data traffic for video and sensors, total cost was “reduced from about $81,000 to $29,000 over three years,” about one-third the cost of the cloud-only approach.
Given the caveat that every organization has unique conditions, the mainstream belief that “you can’t do everything in the cloud” rings true, says Jason Andersen, vice president of business line management for Stratus Technologies, maker of high-availability edge-computing solutions. “Today, from what we understand, the breakeven point seems to be around 30%,” Andersen says. “Processing 30% of your data in the cloud will cost about the same as doing it all at the edge.”
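A toy cost model in the spirit of the Wikibon scenario and Andersen’s rule of thumb: every fixed and per-terabyte cost below is invented for illustration, and the only point is that total cost scales with the fraction of data sent to the cloud.

```python
# Hypothetical three-year cost comparison; all costs are placeholders.

def three_year_cost(total_tb, cloud_fraction, edge_fixed=25_000,
                    cloud_cost_per_tb=70.0, edge_cost_per_tb=10.0):
    """Rough three-year cost when `cloud_fraction` of the data goes to the cloud.

    A fixed charge covers edge hardware and operations (data collection is
    needed either way); per-terabyte costs cover edge processing versus cloud
    transport, ingestion and storage.
    """
    cloud_tb = total_tb * cloud_fraction
    edge_tb = total_tb - cloud_tb
    return edge_fixed + edge_tb * edge_cost_per_tb + cloud_tb * cloud_cost_per_tb

if __name__ == "__main__":
    volume_tb = 1_000   # assumed three-year data volume
    for fraction in (1.0, 0.30, 0.05):   # all-cloud, 30% to cloud, edge-heavy
        cost = three_year_cost(volume_tb, fraction)
        print(f"{int(fraction * 100):>3}% to cloud: ${cost:,.0f}")
```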