With the increased integration of AI and machine-learning technologies into industrial
environments (such as smart factories), IT managers seek the best strategies for enabling all the benefits—tangible as well as potential—that will affect factory efficiency, now and into the future.
New capabilities like predictive maintenance are made possible by real-time insights generated from the data captured, processed and analyzed across all the connected machines, critical systems and other sensor-laden devices in the factory. Predictive analytics is the most common use case for machine learning in IoT: in an IDC survey, 83% of respondents were developing predictive-analytics capabilities.
Putting this data to work requires IT managers to store, move, manage and properly utilize it. But what's the best strategy to capture both the content (what it is and where it's from) and the context (why it's important and how it can be used) of all the data coming in from connected devices at the edge?
The answer is tricky.
Take the factory environment as an example. IT managers, equipped with the context they have for all the edge devices in the factory, can filter, dissect, organize and command this huge amount of data to determine the best ways to integrate and optimize AI technologies for the factory. In a smart factory, however, data is not funneled directly to the cloud, because valuable factory-floor context can be lost in that transition. Latency, security and data integrity are also considerations for accuracy and safety.
Instead, a new hybrid model is emerging where data is collected from all devices at the edge and stored on a local gateway equipped with necessary AI and machine-learning capabilities to perform the analysis, inference and other critical tasks in real-time, with lower latency and without some of the same security risks.
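As a rough illustration of this gateway pattern, the sketch below (all names and thresholds are hypothetical, not a specific product's API) analyzes each sensor reading locally in real time and queues only noteworthy events for a later cloud upload, rather than streaming everything off the floor:

```python
from dataclasses import dataclass, field

@dataclass
class EdgeGateway:
    """Hypothetical local gateway: inspects readings as they arrive
    and keeps only noteworthy events for a later cloud upload."""
    temp_limit_c: float = 80.0          # assumed alert threshold
    upload_queue: list = field(default_factory=list)

    def ingest(self, device_id: str, temp_c: float) -> bool:
        """Return True if the reading needs immediate local action."""
        alert = temp_c >= self.temp_limit_c
        if alert:
            # Act locally with low latency; no cloud round-trip needed.
            self.upload_queue.append({"device": device_id, "temp_c": temp_c})
        return alert

gw = EdgeGateway()
gw.ingest("press-01", 72.5)   # normal reading, nothing queued
gw.ingest("press-01", 85.0)   # over threshold: handled at the edge, queued
```

The point of the design is that the decision (is this reading actionable?) happens on the gateway, and only the filtered result waits for the cloud.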
Data kept on the gateway is sent to the cloud during off-peak hours, when timeliness is no longer a concern. Over time, more machine-learning algorithms run in the cloud to produce patterns and insights that can be used for predictions and future operations.
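The off-peak upload step can be sketched minimally as follows; the window boundaries and queue format here are assumptions for illustration:

```python
from datetime import time

OFF_PEAK_START = time(1, 0)   # assumed window: 01:00 to 05:00 local
OFF_PEAK_END = time(5, 0)

def in_off_peak(now: time) -> bool:
    """True when the current local time falls inside the off-peak window."""
    return OFF_PEAK_START <= now < OFF_PEAK_END

def flush_to_cloud(queue: list, now: time) -> list:
    """Drain the gateway's queue only during off-peak hours.
    Returns the batch that would be uploaded (empty otherwise)."""
    if not in_off_peak(now):
        return []
    batch, queue[:] = list(queue), []   # copy out, then clear in place
    return batch

queue = [{"device": "press-01", "temp_c": 85.0}]
flush_to_cloud(queue, time(14, 30))  # mid-afternoon: nothing uploaded
flush_to_cloud(queue, time(2, 15))   # off-peak: batch drained
```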
The variety of sensors has multiplied in recent years, enabling granular monitoring of the entire factory. IDC predicts the number of deployed sensors will exceed 80 billion worldwide by 2025. Thanks to leapfrogging advances in technologies from chipsets to applications, the data generated by these sensors is put to great use. From a strategic business perspective, factories can enhance their operations and output through data analysis both at the edge and in the cloud. Data acted upon at the edge can prevent mishaps in the production line from a machine error; data sent to the cloud, or the “core,” enables long-term analysis of variance factors to render savings or heighten efficiency within the factory.
Machine-performance deviations can be detected, if not predicted, enabling factory managers to fix issues proactively rather than reactively replacing faltering units at great cost.
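One common way to detect such deviations is to flag readings that drift far from a machine's recent baseline. A minimal sketch using a rolling mean and standard deviation follows; the window size and z-score threshold are illustrative assumptions, not a production recipe:

```python
import statistics
from collections import deque

def make_deviation_detector(window: int = 20, z_threshold: float = 3.0):
    """Flag a reading as deviant when it sits more than z_threshold
    standard deviations from the machine's rolling baseline."""
    history = deque(maxlen=window)

    def check(value: float) -> bool:
        if len(history) >= 5:  # need a few samples for a stable baseline
            mean = statistics.fmean(history)
            stdev = statistics.stdev(history)
            if stdev > 0 and abs(value - mean) / stdev > z_threshold:
                return True    # deviation detected: schedule maintenance
        history.append(value)  # normal readings extend the baseline
        return False

    return check

check = make_deviation_detector()
for v in [10.0, 10.2, 9.9, 10.1, 10.0, 10.3]:
    check(v)            # builds the baseline, all normal
check(25.0)             # far outside the baseline: flagged
```

Flagged readings are deliberately kept out of the baseline so a faulty spike does not mask subsequent faults.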
This mix of real-time analysis and long-term archival monitoring frees up time for factory workers to focus instead on productivity, quality and innovations. This hybrid approach reaps significant benefits for factories, employees, their customers and the community.
Real-time analysis helps prevent mishaps and provides context to factory operations for pertinent decisions on the floor, while archival analysis helps factory owners make strategic decisions that enhance the layout, machinery, cost savings and overall output efficiency of the business over time. Industry players stay abreast of ever-changing technologies to construct optimal floor environments. And as quality and output improve, the working environment becomes more positive, productive and conducive to innovation.
Oded Sagee is senior director of embedded and integrated solutions for Western Digital.