Edge Computing Heads to CES
The focus at CES is on shiny gadgets and entertainment, but the show is also about connecting anything and everything to a network so that companies can add features to their products, collect data on customer habits and make money on new services. That's why network operators make the rounds at CES, although some do it more publicly than others.
It's also why the topic of edge computing is going to pop up in booth demos and hotel suites throughout Las Vegas next week.
Among the companies with edge computing on the brain is NXP Semiconductors N.V. (Nasdaq: NXPI). Still embroiled in a merger transaction with Qualcomm Inc. (Nasdaq: QCOM) (which is itself the object of a hostile takeover bid by Broadcom), the semiconductor company is nonetheless pushing forward with its own agenda, which includes partnering with cloud providers to bring a little more of a data center environment to the devices consumers use. At CES, NXP says it will demonstrate new edge cloud applications based on development work it's done with Google Cloud Platform, Amazon Web Services Inc. and others. At one end, that means demos with basic IoT sensors that confer with a nearby network access point and process information locally. At the other, it means demos of data-intensive applications, like facial recognition, that require a level of processing that would more typically take place in the cloud.
NXP's role in these scenarios is to provide the chips that go into a local access point and communicate both with end-user products and the centralized cloud services that facilitate application delivery.
"We used to call things embedded computers," comments Sam Fuller, head of system solutions for NXP's digital networking group. "It's not the first time that we've had computers that weren't in the cloud, but what we really are creating here is that ability to create a framework whereby it's very easy to develop software that collaboratively works between these edge devices and cloud devices."
Fuller has a great analogy for explaining how the collaborative process works. In the movie The Matrix, characters could download new skills -- like flying a helicopter or performing kung fu -- and then moments later be proficient in them. The edge computing framework enables something similar. A gateway or access point can download the equivalent of the training guide for a skill from the cloud and then do the actual computation work to execute that skill locally. So, for example, a cloud-based service would deliver the tools and information set needed for a facial recognition program to a local gateway with the NXP technology. Then the gateway would implement the program itself to analyze images streamed to it from a nearby security camera.
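In rough pseudocode terms, the pattern Fuller describes might look like the following sketch. The names here (`CloudSkillStore`, `Gateway`) are purely illustrative, not NXP or cloud-provider APIs; the point is that the model is downloaded once, and all subsequent computation happens on the local device.

```python
# Illustrative sketch of the "download a skill, execute it locally" pattern.
# All class and method names are hypothetical, not a vendor API.

class CloudSkillStore:
    """Stands in for a cloud service that hosts trained 'skills'."""
    def __init__(self):
        # A 'skill' here is just a lookup table; a real one would be a
        # trained model delivered to the gateway.
        self._skills = {
            "face_recognition": {"alice": 111, "bob": 222},
        }

    def download(self, skill_name):
        return self._skills[skill_name]


class Gateway:
    """Local access point: fetches the skill once from the cloud, then
    runs it entirely on local data with no further round-trips."""
    def __init__(self, store):
        self._store = store
        self._models = {}

    def install(self, skill_name):
        self._models[skill_name] = self._store.download(skill_name)

    def recognize(self, image_signature):
        # Inference happens here, on the gateway, not in the cloud.
        model = self._models["face_recognition"]
        for name, signature in model.items():
            if signature == image_signature:
                return name
        return "unknown"


gateway = Gateway(CloudSkillStore())
gateway.install("face_recognition")   # one download from the cloud...
result = gateway.recognize(222)       # ...then all matching is local
```

The security-camera scenario maps directly onto this shape: the cloud delivers the recognition model, and the gateway applies it to locally streamed frames.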
There are numerous and obvious benefits. Local processing lowers latency, minimizes the volume of traffic that has to travel extended distances and allows for more tightly controlled security.
But less talked about is the fact that edge computing could also significantly extend the potential capabilities of consumer and industrial devices. It's similar to the way adding Internet connections to products made it possible to download updates instead of always buying new hardware. With edge computing, devices gain the potential to access new skills that would have been impossible to support previously given their hardware limitations.
"What has really changed is that historically devices like that have been hard-wired. They're really appliances, and they're not platforms. And I think what edge computing does... [is] extend a programming model such that new types of applications can be developed that have a computing component that's local. Might be in your set-top box. Might be in your home gateway. Might be at the base of a cell phone tower. But also could be in a building doing the control of the HVAC," says Fuller.
Fuller adds, "I think the edge to us, what we're seeing is an opportunity to make that a much more flexible compute-type device that you're able to do processing work local to the data whether you have cloud connectivity or not."
Cloud providers, meanwhile, are creating software constructs that allow local devices to easily communicate with their cloud systems.
Amazon Web Services, for example, created Greengrass, which extends the AWS Lambda operating environment to a gateway-type device. That means developers can use the Lambda platform to take advantage of AWS functions even when there isn't persistent connectivity back to the AWS cloud. It brings the AWS ecosystem to new devices with a hybrid model for connectivity and computation. Sometimes Greengrass-enabled devices rely on their connection to the AWS cloud, but at other times they function with only local resources. (See Technicolor Table Lamp Runs on AWS.)
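The hybrid pattern can be sketched as a Lambda-style handler that always acts on local data first and syncs with the cloud only when connectivity allows. This is a simplified illustration of the idea, not the actual Greengrass SDK; the connectivity check and publish call are stand-ins for real IoT plumbing.

```python
# Sketch of the hybrid local/cloud model Greengrass enables.
# cloud_reachable() and publish_to_cloud() are hypothetical stand-ins.

import json

pending = []  # sensor events buffered while the cloud is unreachable


def cloud_reachable():
    # Stand-in for a real connectivity check; assume we're offline here.
    return False


def publish_to_cloud(batch):
    # Stand-in for an MQTT/IoT publish back to the cloud.
    print("uploaded", len(batch), "events")


def handler(event, context=None):
    """Lambda-style entry point, invoked for each local sensor reading."""
    reading = json.loads(event["payload"])
    # Act locally first: low latency, and it works with no connection.
    alert = reading["temp_c"] > 30
    pending.append(reading)
    # Sync with the cloud opportunistically.
    if cloud_reachable():
        publish_to_cloud(pending)
        pending.clear()
    return {"alert": alert, "buffered": len(pending)}


result = handler({"payload": json.dumps({"temp_c": 35})})
```

The device raises the over-temperature alert immediately and buffers the reading; nothing about the local decision waits on the cloud.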
The implications of this edge computing work are breathtaking. From virtual reality to healthcare management to entire smart cities, edge computing is arguably on the verge of revolutionizing whole industries in much the same way the centralized cloud movement did a decade or so ago. (See Operators Must Cloudify at the Edge.)
Equally important, while the momentum behind edge computing has only just begun to accelerate, companies are already creating the platforms that will be the foundation for edge-based application development work for years to come.
NXP's work is one example of this, and the applications it enables are likely to become more and more visible in the near future -- first on the industry conference circuit, and then in real-world deployments everywhere the supporting infrastructure is available.
Related posts:
- Technicolor Brings AWS Home to Gateways
- AT&T Intelligent Edge Blurs Lines Between Cloud, CPE
- Crown Castle Eyes Edge Computing in 2018
- Vapor IO Is Virtualizing the Edge
— Mari Silbey, Senior Editor, Cable/Video, Light Reading