Industry 4.0 enthusiast and thought leader Jeff Winter teamed up in July with Smart Industry's Scott Achelpohl for “the sequel” to our debut in June of SI's monthly manufacturing technology series, (R)Evolutionizing Manufacturing.
While the June premiere dealt with the ABCs of digital transformation, this episode—which is now a bonus installment of Great Question: A Manufacturing Podcast—covered an absolutely foundational concept that every company trying to transform must understand: It’s all about the data.
You’ve got to have solid and accessible data for any transformation project, including AI, which Episode 3 of “(R)Ev” will cover this Monday, Aug. 26.
Below is an excerpt from this podcast:
Scott Achelpohl: In the spirit of summer blockbusters, this episode is “the great sequel” to our June premiere … like “The Empire Strikes Back” or “The Godfather: Part II” … but hopefully not “Jaws 2.”
Jeff and I got to thinking about the foundations of a company’s digital transformation, and we concluded that much of it has to start with data: whether a manufacturer has a handle on all the data that feeds every single one of its processes—hour in, hour out, day in, day out, shift after shift of production run time.
Jeff Winter: That's right. We started talking about quite a few different topics for future episodes, and we concluded that at the end of the day—it’s all about data.
Podcast: How DIY automation can help small, medium-size businesses
All the cool new technologies and transformational initiatives you hear about revolve around properly capturing and extracting value from data.
It's the secret sauce that helps manufacturers predict problems before they happen, streamline processes, and make smarter decisions faster. It's not just about keeping up with the competition; it's about staying ahead and setting new standards.
Without data, you're missing out on all the efficiencies and innovations that make modern manufacturing so exciting.
Scott Achelpohl: I got to thinking about the movie “Apollo 13”—yes, the Tom Hanks blockbuster, he of “Houston, we have a problem.” Great movie, right!?
If you recall from that film (and the real mission in 1970), the Apollo astronauts are stranded, headed on a perilous slingshot course around the moon, and the team on the ground at NASA has to figure out how the three astronauts can conserve power and generate more for the crippled Odyssey command module to keep the heroes alive for several long days in space before splashdown. There’s a line from that hit movie that stands out: “Power is everything.” Well, in our case, think about it, we can draw the analogy with data. Data is everything.
See also: Smart factories: A roadmap to optimization, not overhaul
Lots of manufacturing companies have told us they couldn’t build a solid foundation for ANY project related to digital transformation without getting a handle on their unbounded and oftentimes disorganized and disjointed data.
Before embarking on, say, setting up predictive maintenance in a plant, they often need to figure out data first: where it’s stored, how it’s organized, where it comes from, how it’s been utilized, and how they want to organize and use it going forward—all in pursuit of the goal of using sensors and predictive maintenance to avoid downtime and keep the line running.
In this way, data could be thought of as a foundational concept in this era of Industry 4.0, like what we talked about in Episode 1 last month. So that’s the direction we’ll take today, with the sequel, Episode 2, our summer blockbuster.
Jeff, you are the stats and trends guy; tell us about some of the most interesting stats you have found as they relate to data.
Jeff Winter: Absolutely! I love these stats—and some of them took me a long time to find.
I always like to start off by painting the picture of how much data is created.
Consider this: According to IoT Analytics, we’re surrounded by a staggering 16.7 billion IoT devices—that's roughly two devices for every person on this planet. But why does this matter? Because it's these devices that are driving an unprecedented revolution in data generation.
Podcast: AI best practices—Lessons learned at Girtz and Ford
Let’s put this into perspective: Eric Schmidt, former CEO of Google, famously said in 2010 that the amount of data generated since the dawn of civilization up until 2003 is estimated to be 5 exabytes. Sounds impressive, right? But fast forward to 2024, and according to Statista, we're staring at a mind-blowing 147 zettabytes of data generated this year.
To grasp the magnitude of that stat: it means we will create roughly 29,400 times more data THIS YEAR alone than humanity did in all the years up to 2003. That is a crazy explosion of data!
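The back-of-the-envelope math behind that multiple is easy to sanity-check (using decimal units, where 1 zettabyte = 1,000 exabytes):

```python
# Eric Schmidt's 2010 estimate: ~5 EB generated from the dawn of
# civilization through 2003.
exabytes_through_2003 = 5

# Statista's projection for 2024: ~147 ZB generated in a single year.
zettabytes_2024 = 147

# Convert zettabytes to exabytes (1 ZB = 1,000 EB) and take the ratio.
ratio = zettabytes_2024 * 1000 / exabytes_through_2003
print(round(ratio))  # 29400
```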
You might be wondering: Who’s producing and collecting all this data? It's not the banks. It's not the health-care sector. It isn't even retail. It's manufacturing! According to McKinsey Global Institute, manufacturing is collecting nearly double the data volume of the next highest industry.
Yet, here's where reality sinks in. Of the 2,000 petabytes of data collected by industry each year, the Industrial IoT Consortium estimates that 99% slips through our fingers, discarded like yesterday’s newspaper.
Cybersecurity Ventures predicts that the total amount of data stored in the cloud—which includes public clouds operated by vendors and social media companies like Facebook, government-owned clouds that are accessible to citizens and businesses, private clouds and cloud storage providers—will reach 100 zettabytes by 2025, or 50% of the world’s data at that time, up from about 25% stored in the cloud in 2015.
Scott Achelpohl: Predictive maintenance is one example of a digital transformation project that might hinge on good-quality data, coming from sensors. Other much larger ones like automation, AI, and better visibility into factory OT also would be candidates for creating “clean” data and keeping it that way.
See also: IT/OT convergence: The making of a modern plant
At SI, we’ve touched on this in earlier programs (shameless plug for you!) like an episode in March of our Great Question: A Manufacturing Podcast. Our guests for that episode, Craig Resnick and Marcel van Helten, coincidentally drew an “Apollo 13”-like comparison—that data is power.
During that podcast, Craig said: “Data is what makes the world go round.”
So, Jeff, I want to draw on your expertise with this, what are some other examples in digital transformation where intact data is absolutely critical for success?
Jeff Winter: I laugh because the answer is simple. The answer is … all of them. There is no shortage of digital transformation use cases out there, but all revolve around one key element: being digital—and that means data.
If you look at IoT Analytics in their 2022 Industry 4.0 Adoption report, they tracked the top 15 smart manufacturing use cases and 16 smart product use cases. All 31 revolve around collecting, transferring, or using data.
So, what is most important? The answer, obviously, is good data. It’s important because it helps us make informed decisions, identify trends, and solve problems effectively. Without good data, you're not making informed decisions; you're just guessing.
See also: How to choose security for your OT operations
Most data companies out there will tell you there are seven main characteristics of data quality:
- Completeness: Measures how much of the data is present and usable.
- Uniqueness: Ensures no duplicate data exists.
- Validity: Checks if data matches the required format and business rules.
- Timeliness: Ensures data is available within the expected timeframe.
- Accuracy: Confirms data correctness based on a designated "source of truth."
- Consistency: Compares data across different datasets to ensure similar trends and reliable relationships.
- Auditability: Data is accessible, and changes are traceable.

Bad data costs us dearly. According to Gartner research, “the average financial impact of poor data quality on organizations is $12.9 million per year.” And that was back in 2021. IBM also found that, in the U.S. alone, businesses lose $3.1 trillion annually due to poor data quality.
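As a rough illustration, some of those characteristics can be scored with simple checks. Here is a minimal Python sketch over hypothetical machine-log records; the field names and ID format are illustrative assumptions, not from any real system:

```python
import re

# Hypothetical machine-log records (field names are illustrative only).
records = [
    {"machine_id": "M-001", "timestamp": "2024-07-01T08:00:00", "temp_c": 71.2},
    {"machine_id": "M-001", "timestamp": "2024-07-01T08:00:00", "temp_c": 71.2},  # duplicate row
    {"machine_id": "M-002", "timestamp": "2024-07-01T08:05:00", "temp_c": None},  # missing value
    {"machine_id": "bad id", "timestamp": "2024-07-01T08:10:00", "temp_c": 68.9},  # malformed ID
]

def completeness(rows):
    """Fraction of rows with no missing (None) values."""
    return sum(all(v is not None for v in r.values()) for r in rows) / len(rows)

def uniqueness(rows):
    """Fraction of rows that are not exact duplicates of an earlier row."""
    seen = set()
    unique = 0
    for r in rows:
        key = tuple(sorted(r.items(), key=lambda kv: kv[0]))
        if key not in seen:
            seen.add(key)
            unique += 1
    return unique / len(rows)

def validity(rows, pattern=r"^M-\d{3}$"):
    """Fraction of rows whose machine_id matches the assumed format rule."""
    return sum(bool(re.match(pattern, r["machine_id"])) for r in rows) / len(rows)

print(f"completeness: {completeness(records):.2f}")  # 0.75
print(f"uniqueness:   {uniqueness(records):.2f}")    # 0.75
print(f"validity:     {validity(records):.2f}")      # 0.75
```

In practice a dedicated data-quality tool would cover all seven characteristics, but even crude scores like these show where a plant's data needs attention before a project such as predictive maintenance begins.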
So yeah, it matters.