Recently, I read a couple of long articles on the wonders of Industry 4.0, the latest buzzword to hit a world where buzzwords make fortunes. If you’re a consultant, getting in front of a good buzzword can yield a high-paying career as an industry guru.
So what is Industry 4.0? The way it was (vaguely) defined in the articles I read, it’s the use of Big Data to lower costs and increase quality. In short, it’s what I used to know as root cause analysis with some statistical process control thrown in.
One example was putting RFID trackers onto engine crates to see where they were damaged in transit. It turned out a single transport machine was damaging the engines; by fixing the machine (or training its operator), the company could save gobs of cash that would otherwise have gone into sturdier crates.
In short, they did exactly what any self-respecting quality professional would have done in 1990. We had data back then, too, and we weren’t afraid to use it.
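The crate story is, at bottom, a group-and-count exercise: tag each damage event with the handling step where it occurred, tally by step, and look for the outlier. A minimal sketch of that kind of analysis, with entirely invented data and step names:

```python
# Hypothetical crate-damage records: (crate_id, handling_step, damaged).
# All data here is invented for illustration.
from collections import Counter

events = [
    ("C1", "forklift-A", False), ("C1", "conveyor-3", False),
    ("C2", "forklift-B", True),  ("C2", "conveyor-3", False),
    ("C3", "forklift-B", True),  ("C3", "forklift-A", False),
    ("C4", "forklift-B", True),  ("C4", "conveyor-3", False),
]

# Count damage events per handling step and flag the worst offender.
damage_by_step = Counter(step for _, step, damaged in events if damaged)
suspect, count = damage_by_step.most_common(1)[0]
print(f"{suspect}: {count} damage events")  # forklift-B: 3 damage events
```

Nothing here requires Big Data, a data lake, or a consultant; a spreadsheet pivot table in 1990 would have found the same forklift.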
Sometimes, it seems like consultants and pundits follow the rule I was given for creating a dissertation: take something old and combine it with something new and trendy. (I was told to use a moderating variable, too, but that’s not really needed here. My own, rather irrelevant-to-industry dissertation is summarized here. I wish I’d thought of using visibility and credibility instead, since that’s a nicer relationship and more useful.)
In this case, we take the new Big Data concept that is so hot these days, and so often not quite understood by executives who still know they need to do something with it, and link it to root cause analysis. Again, though, I’ve never known a quality pro to ignore data, or refuse to get more data. It’s what they do, and what they have always done. Sure, more data is available now; but it’s the same old idea.
As for Quality 2.2 — that’s just a guess, based on generations of work in the field of quality. If you have a better version number, please suggest it here! Continuous improvement hasn’t really been relabelled lately, has it?