Machines are talking to us. Are you listening?
Saar Yoskovitz, CEO of predictive-diagnostics company Augury, is focusing his efforts on the ultrasonic and vibration communiqués broadcast by machines via sensors that are becoming more commonplace (and cheaper) by the day. We chatted about what’s coming down the pike in the world of enabling machines to diagnose their own problems. Take a look…
Smart Industry: How is the digital transformation affecting predictive machine diagnostics?
Saar: We are constantly seeing Internet-age technologies trickle into this very traditional market. The current state of vibration-analysis equipment is analogous to the personal-computer market in the 1980s—when computers cost thousands of dollars, were hard to use, and were reserved for the elite few who required these advanced tools. Today we see mobile-computing technology coming into the market, so that anyone with an iPhone suddenly becomes a vibration analyst with years of experience.
On the continuous-monitoring front, MEMS (micro-electromechanical systems) sensors are getting to a quality (sampling noise and resolution) where they can replace traditional piezoelectric sensors. Unlike piezoelectric sensors, MEMS sensors obey Moore's law—every year they will drop in price and become more efficient. This is a huge shift in the market and, over the next few years, we will undoubtedly see these sensors embedded into equipment, which will shift power dynamics to the OEMs.
Smart Industry: How do algorithms enable us to more closely listen to machines?
Saar: Given a decent amount of data, computers can be as good as humans, and even surpass human performance in specific, tedious tasks. Computers have no bias, don't get tired and don't lose focus. Across the whole industry we see computers taking a bigger role as decision-supporting tools. They will not replace the experts; rather, they will help them do a better job.
We combine a multitude of sensor inputs—including vibration, ultrasound, electromagnetic, temperature and current—to gain insights on the mechanical behavior of the machine. We track that behavior over time and build a specific model for every machine we monitor. We also have the luxury of aggregating data from a large number of similar machines. This enables us to compare one machine to the tens of thousands of machines we've seen around the world. Compare that to a reliability officer who has worked in one facility for his entire career; he has probably seen 100 machines, 1,000 at most. Our algorithms have been exposed to more types of failures and edge cases and therefore have the upper hand.
By using all of these data sources, we are able not only to tell you that something is wrong, but also to give you actionable insights. What specific malfunction do I have? What is the risk if I don't fix it? Can I postpone the repair until next winter? How will this affect my energy bills?
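Augury hasn't published its algorithms, so the Python sketch below is purely illustrative of the idea Saar describes: build a baseline from each machine's own history, compare new readings against both that baseline and a fleet-wide distribution, and emit an actionable recommendation rather than a raw alarm. All class names, fields and thresholds here are hypothetical.

```python
# Illustrative only: a toy per-machine baseline plus a fleet-wide comparison.
# Names and thresholds are invented for this sketch, not Augury's API.
from dataclasses import dataclass, field
from statistics import mean, stdev


@dataclass
class MachineBaseline:
    """Statistical baseline built from one machine's own history."""
    readings: list = field(default_factory=list)  # e.g. overall vibration RMS per sample

    def add(self, value: float) -> None:
        self.readings.append(value)

    def z_score(self, value: float) -> float:
        if len(self.readings) < 2:
            return 0.0  # not enough history yet
        return (value - mean(self.readings)) / (stdev(self.readings) or 1.0)


def diagnose(machine_id: str,
             new_reading: float,
             baseline: MachineBaseline,
             fleet_readings: list) -> dict:
    """Compare a new reading against the machine's own history and the fleet."""
    own_z = baseline.z_score(new_reading)
    fleet_z = (new_reading - mean(fleet_readings)) / (stdev(fleet_readings) or 1.0)

    # Actionable output: not just "something is wrong", but how the reading
    # compares to this machine's past and to many similar machines elsewhere.
    return {
        "machine": machine_id,
        "deviation_from_own_history": round(own_z, 2),
        "deviation_from_fleet": round(fleet_z, 2),
        "action": "schedule inspection" if own_z > 3 and fleet_z > 2 else "keep monitoring",
    }
```

In practice such a model would combine vibration spectra, ultrasound, temperature and current rather than a single value, but the two comparisons—against the machine's own history and against the fleet—are the core of the argument above.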
Smart Industry: What is on the horizon—developments or new applications—for big data in 2017?
Saar: I'd probably go with “edge computing” or “fog computing” (which is the new buzzword). Sensors create a lot of data. Our continuous-monitoring solution samples 5 GB of data per month—for a facility this could easily get to 10 TB per year. Transmitting all of this data over a cellular connection is a non-starter. Therefore, we must run local algorithms on the sensor level to intelligently control what we send to the cloud and what we don't. This has the added benefit of being able to keep diagnosing machines even when connectivity is down.
On the flip side, it requires “big-data companies,” which traditionally have worked in high-level, distributed computing environments, to design efficient algorithms that run on battery-powered equipment. This is a real challenge that requires a different skill set in the engineering team.
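As a rough illustration of the edge-side gating Saar describes—running a cheap local check on the sensor and sending only interesting windows to the cloud—here is a minimal Python sketch. The energy trigger, window format and thresholds are assumptions made for this example and say nothing about how Augury's sensors actually decide what to upload.

```python
# Illustrative only: decide on the device which vibration windows are worth
# sending to the cloud, so the radio stays quiet most of the time.
# Window sizes and thresholds are hypothetical.
import math


def rms(window: list) -> float:
    """Root-mean-square energy of one window of raw samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))


def should_upload(window: list, baseline_rms: float, ratio: float = 1.5) -> bool:
    """Upload only windows whose energy deviates noticeably from the baseline."""
    current = rms(window)
    return current > baseline_rms * ratio or current < baseline_rms / ratio


def process_stream(windows, baseline_rms: float):
    """Yield what the cloud should see: raw waveforms when anomalous, summaries otherwise."""
    for window in windows:
        if should_upload(window, baseline_rms):
            yield ("raw", window)           # anomalous: send the full waveform
        else:
            yield ("summary", rms(window))  # normal: send a few bytes instead
```

Because the gating logic lives on the device, diagnosis can continue when connectivity drops; only the uploads queue up until the link returns.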