During my discussions with IT and industry executives on the internet of things, analytics and artificial intelligence (digital transformation for short), I asked one what silver-bullet application he expects these technologies to yield. He immediately answered: “Video.”
I was especially fascinated by the example he gave of a developer’s recent trade show demo. He — I’ll call him Mr. Smart — said it depicted a factory with cameras installed at each of roughly 2,000 employees’ work stations. The goal, he said, was “to figure out if folks were deviating from the agreed processes.”
It all sounds very straightforward: industrial firms, like businesses of all types, have forever sought to standardize and automate their work processes to reduce labor, boost productivity, cut costs, improve profits, and so on. What’s new is better measurement of the human side of those processes.
Organizations routinely monitor and measure machines and other inanimate factory assets using sensors, and video serves the same purpose, albeit for human assets. Per Mr. Smart, the goal of AI-assisted video analytics is “to effectively use a camera to sensorize a person.”
It’s a logical extension of what Henry Ford and Ray “McDonald’s” Kroc did to observe and improve upon the status quo in automaking and burger-slinging. There is, however, something of a qualitative departure as research into AI seeks to get inside people’s heads to discern how they feel. Using artificial emotional intelligence, smart machines (software bots) have already been used to identify anger and frustration in doctor-patient relationships and customer service chatbots. With systems analyzing audio as well as facial expressions, movements, even posture, one has to ask whether tomorrow’s employees will have a right to be paranoid.
The enabling IT underpinnings
My tech guru Mr. Smart isn’t a video expert but a leader in AI and a key proponent of developments at the crossroads of IoT, AI and analytics — enabling technologies that he believes finally provide the bandwidth for the kind of video applications shown in the trade show demo.
In particular, edge computing enables real-time analysis of data (such as video feeds) at the source (the factory in this discussion). In contrast, remote cloud services are better suited to larger datasets and deeper analysis (among many other things). In a recent SmartIndustry story, Intel’s Chet Hullum described how his team used real-time edge analytics to improve semiconductor manufacturing because sending everything to the cloud would have been “much too expensive.” (See graphic below.)
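The edge-versus-cloud tradeoff can be sketched in a few lines of code: analyze every frame locally and forward only the exceptions upstream. This is a hypothetical toy, not Intel’s actual pipeline; the frame format, scoring rule and threshold are all invented for illustration.

```python
# Illustrative edge-filtering sketch (hypothetical, for explanation only):
# score each incoming "frame" at the edge and upload only anomalies,
# trading cheap local compute for expensive upstream bandwidth.

def score_frame(frame):
    """Toy deviation score: mean absolute difference from an expected value."""
    expected = 0.5  # made-up baseline for "normal" pixel intensity
    return sum(abs(px - expected) for px in frame) / len(frame)

def edge_filter(frames, threshold=0.2):
    """Analyze every frame locally; return only the anomalous ones for upload."""
    uploads = []
    for i, frame in enumerate(frames):
        score = score_frame(frame)
        if score > threshold:  # deviation from the expected process
            uploads.append((i, round(score, 3)))
    return uploads

# Ten "normal" frames and one outlier: only the outlier is sent upstream.
normal = [[0.5, 0.5, 0.5]] * 10
outlier = [[0.9, 0.1, 0.9]]
sent = edge_filter(normal + outlier)
print(sent)  # only frame index 10 exceeds the threshold
```

The point of the sketch is the shape of the architecture, not the scoring: the vast majority of raw video never leaves the factory floor, which is what keeps the cloud bill down.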
With IT advances and applications proliferating from edge to cloud, the eye in the sky really isn’t the limit anymore.
FURTHER DISCUSSION: It’s important to consider the changing relationship between human beings and technology, and the social impact of the changes. As I covered in “The Robots Are Winning,” credible sources say automation has eliminated more jobs than it’s created and destroyed the U.S. middle class. Do you dis/agree? Do you have sources to help me revisit the topic in greater depth? Contact me.