Vapor IO said it will deploy Nvidia's AI Aerial platform in Las Vegas, a move intended to kickstart the market for AI applications running in edge computing data centers.
Vapor IO said it will show off advanced AI capabilities running atop Nvidia's AI Aerial platform at the MWC Vegas 2024 trade show this week. The company's broad goal is to parlay its early work in edge computing into the burgeoning market for speedy AI-powered services.
With AI running at the edge, "you unlock some really incredible capabilities that you didn't have before," said Cole Crawford, the CEO and founder of Vapor IO, in an interview with Light Reading.
"This is the next transformation of the MEC [mobile edge compute] concept," said Soma Velayutham, VP of telecom at Nvidia, in that same interview.
Specifically, Vapor IO said it would offer private 5G services to the city of Las Vegas powered by Nvidia's AI Aerial platform. Vapor IO deployed Nvidia's Grace Hopper chipsets with server vendor Supermicro across some of its edge computing sites in Las Vegas in March. This week the company said it would upgrade those deployments with the AI Aerial platform, which Nvidia has been working to expand.
Crawford, Vapor IO's CEO, said the effort will allow the city of Las Vegas to conduct AI operations without sending data outside the city's borders. For example, he said one of the company's MWC Vegas demonstrations will apply AI technology to the city's traffic video surveillance efforts, allowing city managers to ask an AI system to "show me the white car that hit a red car and drove north."
"This is why accelerated compute matters," Crawford said.
Crawford confirmed the private 5G network in Las Vegas uses 3.5GHz CBRS spectrum, but he wouldn't provide any other details. Vendors including Juniper Networks, Celona and NTT have previously discussed their work on private 5G for Las Vegas.
From the edge to AI
Vapor IO was founded roughly eight years ago and enjoys backing from the likes of Crown Castle. It was one of many companies that bet demand for edge computing services would spark sales of mini data centers spread all over the country. Those kinds of small, unmanned data centers in smaller cities – potentially at the base of cell towers – would be the only way to provide super low-latency services to residents in such locations. Otherwise, their Internet traffic would have to travel all the way to bigger data centers in Denver or Dallas, adding precious milliseconds to services like streaming virtual reality that need to be instantaneous.
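Those "precious milliseconds" are mostly speed-of-light delay over fiber. A back-of-the-envelope sketch, assuming a signal speed of roughly 200,000 km/s in fiber, routes about 1.3 times longer than the straight-line distance, and approximate city-to-city distances:

```python
# Back-of-the-envelope round-trip propagation delay over fiber.
# Assumes ~200,000 km/s signal speed in fiber and routes roughly 1.3x the
# straight-line distance; real paths add routing hops and queuing on top.
FIBER_SPEED_KM_PER_MS = 200.0   # ~2/3 the speed of light in a vacuum
ROUTE_FACTOR = 1.3              # fiber rarely follows the straight line

def round_trip_ms(straight_line_km: float) -> float:
    route_km = straight_line_km * ROUTE_FACTOR
    return 2 * route_km / FIBER_SPEED_KM_PER_MS

# Las Vegas to a regional hub vs. an in-metro edge site (approximate distances).
print(f"To Dallas (~1,750 km):   {round_trip_ms(1750):.1f} ms round trip")
print(f"To Denver (~980 km):     {round_trip_ms(980):.1f} ms round trip")
print(f"In-metro edge (~20 km):  {round_trip_ms(20):.2f} ms round trip")
```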
But demand for latency-sensitive, near-instantaneous connections hasn't yet developed, resulting in edge computing casualties like EdgeMicro, MobiledgeX and Ericsson's Edge Gravity. Indeed, Vapor IO had initially hoped to end 2021 with edge computing data centers in 36 markets. Today it counts commercial operations in seven markets, one of which is Las Vegas.
But thanks to the rise of ChatGPT and other AI services, demand for localized, low-latency computing could be changing.
"Where does latency come in? Part of the [AI] compute to generate a response to a question is in the inference business," explained Patrick Lopez of research firm Core Analysis in a post earlier this year. "While the data set resides in a large compute data center in a centralized cloud, inference is closer to the user, at the edge, where it parses the request and attempts to feed the trained model with unlabeled input to receive a prediction of the answer based on the trained model. The faster the inference is, the more responses the model can provide, which means that low latency is a competitive advantage for a gen AI service."
Others agree. "As the focus of AI shifts from training to inference, edge computing will be required to address the need for reduced latency and enhanced privacy," said Dave McCarthy of research firm IDC in a recent release.
IDC recently estimated that global spending on edge computing will reach $228 billion in 2024, up 14% from 2023.
Nvidia's rise
Driving much of the interest in AI is Nvidia, now one of the world's most valuable companies. Its chipsets power many commercial AI services like ChatGPT.
For its part, Nvidia has shown a growing interest in the mobile industry. For example, the company recently announced a new computer called ARC-1 that includes all the baseband hardware and software a telco needs to run a 5G network. It also doubles as a host for AI applications that could predict network outages or improve service for customers.
According to Velayutham, the Nvidia executive, that kind of technology could allow mobile network operators to both operate their radio access networks with AI and sell AI computing services. After all, mobile networks generally rely on a wide spread of basestations, and each of those locations could potentially host Nvidia hardware to run either 5G workloads or AI applications for enterprise customers.
"Every basestation can be an AI delivery network," Velayutham explained.
That idea also ties into the edge computing network Vapor IO has built in a handful of cities, including Atlanta, Chicago, Dallas and Las Vegas. Vapor IO hopes that mobile network operators might eventually place some of their networking equipment in its edge sites, alongside others looking for localized computing services.