Edge computing & the eye in the sky

(Part One of three in my 2019 digital transformation travels.)

As the price of computing drops, chip speeds accelerate, and AI takes hold, it’s becoming more practical for companies to deploy video and analytics systems that see what you’re doing while you’re doing it. The profound rise in affordable computing power now makes it possible to apply real-time analytics toward the goal of reducing labor expenses by helping you, or telling you, how to be more productive. This slice of the factory of tomorrow isn’t fiction; it’s taking root today.

After decades of working with industry leaders, sometimes visionaries, I’m agnostic on technology’s role with regard to personal privacy. No matter your feelings about it, it’s here. I have personal concerns about invasive technology, but I’m reluctant to judge it in workplace applications, because employees’ on-the-job time belongs to the employer, contractually speaking. It’s just an extension of how Henry Ford and Ray “McDonald’s” Kroc observed their people and processes to automate car making and burger flipping. It just swaps the clipboards and spreadsheets of yore for AI and Internet of Things video to efficiently capture, analyze and optimize every move employees make.

Early manufacturing applications of machine vision used artificial neural networks and fuzzy logic tied to programmable controllers and computers. In time such software became common in controls that keep industrial processes tuned, mitigate the risk of nuclear meltdowns, and reject off-spec car parts and burnt potato chips from high-speed manufacturing lines in the name of quality assurance.
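To make that concrete, here’s a minimal sketch of the kind of vision-based reject check such systems perform, using OpenCV. The file name, threshold and reject fraction are hypothetical placeholders, not values from any system described here:

```python
import cv2

# Illustrative tuning values; real lines calibrate these per product and lighting.
BURNT_PIXEL_MAX = 60     # grayscale intensity at or below this counts as "burnt"
REJECT_FRACTION = 0.15   # reject if more than 15% of pixels look burnt

def is_off_spec(frame_gray):
    """Flag a grayscale product image whose dark-pixel fraction exceeds the limit."""
    dark = cv2.inRange(frame_gray, 0, BURNT_PIXEL_MAX)
    return cv2.countNonZero(dark) / dark.size > REJECT_FRACTION

# Example: inspect one captured image of a potato chip (hypothetical file)
frame = cv2.imread("chip.png", cv2.IMREAD_GRAYSCALE)
if frame is not None and is_off_spec(frame):
    print("Reject: too many burnt pixels")
```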

Now, AI-aided analytics crunching massive flows of real-time video data might just commoditize workers the same way earlier automation led to the widespread replacement of low-skilled jobs with robots.

One tech firm catering to global tech and industry giants recently unveiled a factory-of-the-future demo using individual cameras watching and analyzing thousands of workstations “to figure out if folks were deviating from the agreed processes,” Matt Vasey, a top Microsoft AI luminary and chairman of the OpenFog Consortium, told me. He said the strategy wasn’t just about “sensorizing” factory machinery so much as “effectively using a camera to sensorize a person.”
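Vasey didn’t detail the implementation, but conceptually such a system boils a camera feed down to a sequence of recognized work steps and compares it against the agreed process. Here’s a toy sketch of that comparison, assuming a hypothetical upstream vision model has already labeled each observed action:

```python
from difflib import SequenceMatcher

# Hypothetical agreed process for one workstation
AGREED_PROCESS = ["pick_part", "align", "fasten", "inspect", "place"]

def process_deviation(observed_steps):
    """Return a 0..1 score for how far observed work strays from the agreed process."""
    similarity = SequenceMatcher(None, AGREED_PROCESS, observed_steps).ratio()
    return 1.0 - similarity

# Example: a worker skips the inspection step
observed = ["pick_part", "align", "fasten", "place"]
if process_deviation(observed) > 0.1:
    print("Workstation deviating from agreed process")
```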

Handling the massive amount of data generated is only practical with an edge computing strategy as part of a larger infrastructure, one that analyzes data close to its source and passes incrementally more abstract knowledge up through higher-level systems to the cloud for the greatest analytical depth, with a few foggy steps in between.
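As a rough illustration of what “incrementally more abstract” means, here’s a hedged sketch of that tiering: the edge scores individual frames, the fog layer compresses them into per-station statistics, and only compact summaries reach the cloud. Every name and number below is hypothetical:

```python
from statistics import mean

def edge_score(frame):
    """Edge tier: per-frame inference (stubbed as a precomputed model confidence)."""
    return frame["model_confidence"]  # hypothetical upstream model output

def fog_summarize(scores):
    """Fog tier: compress thousands of frame scores into per-station statistics."""
    return {"mean_score": mean(scores), "samples": len(scores)}

def cloud_report(station_summaries):
    """Cloud tier: long-term, plant-wide analytics on compact summaries only."""
    return mean(s["mean_score"] for s in station_summaries.values())

# Example flow: raw frames stay at the edge; only abstractions move upward
frames = [{"model_confidence": c} for c in (0.92, 0.88, 0.41, 0.95)]
summary = fog_summarize([edge_score(f) for f in frames])
print(cloud_report({"station_7": summary}))
```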

Edge, fog and cloud vaguely defined

For people used to working in or with IT, and probably most lay folk, cloud computing is old hat, and edge computing is becoming somewhat intuitive. The cloud is the Internet’s version of a centralized data center; the edge is a decentralized architecture for data aggregation and analysis that’s located on premises, close to the physical processes of interest.

Fog computing, which resides between edge and cloud, is a bit trickier to define, but nobody seems to mind too much. The definitions continue to evolve, as do the applications, along with growing agreement that the new IT reality will end up being an “everything bagel,” my term for a hybrid edge-to-cloud architecture. Where the computing takes place will be increasingly fluid.

At this point, I need to clarify my clickbait “eye in the sky” headline for this article: The cameras and super-fast computing transactions I cited above will occur in the factory, not in the sky. The fog and cloud (if not sky) will be used for larger, slower and longer-term analyses, though.

You can learn more about this by reading my story on the rise of edge computing.

The emerging tech reality

In addition to learning a lot from Vasey, I also spoke with IT hotshots from Sysco, Intel and other companies running the digital transformation race. I came away accepting the inevitability of increasingly massive applications like the real-time video scenario above, and of a new IT reality that will change the relationship between human beings and technology.

A little logroll on the blogroll: When SmartIndustry ran the article earlier this year, Intel’s IoT group gave it a shout-out on Twitter and LinkedIn, and I duly re-shouted it:

Intel’s IoT social callout for SI edge coverage

In forthcoming blog articles I’ll clue you in on more of the science fiction becoming reality across global organizations.