Early AI data center investments target the core, not the edge

Companies including Microsoft and Amazon are investing in high-performance computing infrastructure for AI. So far, they're not targeting the edge, but edge computing may still play a role in the future of AI.

Mike Dano, Editorial Director, 5G & Mobile Strategies

October 2, 2023


Early investments into advanced artificial intelligence (AI) services will be funneled into large data centers and won't involve edge computing, according to industry executives and analysts. But that may change in the future.

"When we move to the large-scale distribution of the inference [AI] model, that's when edge becomes important," Brian Pryor told Light Reading. But that won't happen anytime soon. "That end of the market is not developed enough," he said.

Pryor is a managing director at Houlihan Lokey, an investment banking firm with a focus on data centers. He helped spearhead the firm's early work in the edge computing sector, an offshoot of the cloud computing market that so far has failed to develop the way its early proponents had hoped.

According to Pryor, the current AI boom – which is focused on generative AI services like ChatGPT – will not serve as a catalyst for edge computing. At least, not yet.

"The major investments right now are in the large language learning models," he said. "That has actually very little to do with edge."

But Pryor said it's possible the AI market will spark demand for edge computing services in the future, when those learning models shift to an "inference" model geared toward the speedy delivery of AI capabilities. Applications ranging from connected cars to factory floors may eventually benefit from inference-based AI services – and edge computing facilities may play a major role in delivering them.


"That [inference] piece would be much more latency sensitive," Pryor said, and could therefore require edge infrastructure.

Edge isn't needed yet

Others largely agree with Pryor's outlook.

"AI tasks don't normally have the same latency requirements as many other workloads, enabling hyperscalers to focus AI-oriented investments in their core data centers, which are not located in the usual major economic hubs and metro markets," said John Dinsdale, an analyst with Synergy Research Group, in response to questions from Light Reading. He said such economic hubs include New York City, Silicon Valley, Chicago and Dallas.

But Dinsdale argued that eventually an edge delivery network could become important for big AI players like Microsoft and Amazon.

"The mix of workloads may change, but the overarching requirement to better serve a broad geographic footprint of customers only gets stronger," he explained. "They will continue to invest heavily in local zones, on-premise solutions, CDN-type nodes, colocated points of presence and telco relationships."


Stephen Rose, IBM's GM for the global telco industry, told Light Reading that edge infrastructure isn't currently a focus for most generative AI efforts. IBM is positioning its new watsonx AI service as a way for enterprises – including those in the telecom industry – to scale up their use of AI foundation models and generative AI.

But he, too, concluded that AI infrastructure would eventually make its way to the edge.

Inflated expectations

Edge computing has been the subject of plenty of hype in recent years, mainly due to the belief that demand for near-instantaneous computing would push investments into a dispersed network of smaller edge data centers. Such a network would represent a significant departure from today's Internet architecture, which is supported primarily by dozens of huge data centers located in big cities like Atlanta, Phoenix and Chicago.

The rise of the Internet and cloud computing gave birth to these sprawling facilities, given the need to centralize computing resources for efficiency. Edge computing, by contrast, would speed up Internet functions by positioning computing resources closer to end users.

Or so the theory goes.
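To make that theory concrete, here's a minimal back-of-envelope sketch (an illustration for this article, not something cited by the companies quoted). It assumes signals travel through fiber at roughly 200 km per millisecond, and the distances to the "core" and "edge" sites are hypothetical:

```python
# Back-of-envelope: ideal round-trip propagation delay over fiber.
# Assumes light in fiber covers roughly 200 km per millisecond (a common rule of thumb);
# real-world latency adds routing, queuing and processing on top of this floor.
SPEED_IN_FIBER_KM_PER_MS = 200


def round_trip_ms(distance_km: float) -> float:
    """Return the ideal round-trip propagation time in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS


# Hypothetical distances: a user 1,500 km from a centralized hyperscale data center
# versus 50 km from a nearby edge site.
print(f"Core data center: {round_trip_ms(1500):.1f} ms round trip")  # ~15.0 ms
print(f"Nearby edge site: {round_trip_ms(50):.1f} ms round trip")    # ~0.5 ms
```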

As Light Reading has previously reported, demand for edge computing essentially evaporated during the COVID-19 pandemic as telecom and data center companies reinvested in their core operations amid spikes in Internet traffic. That left most of the early participants in edge computing high and dry.

Houlihan Lokey's Pryor was in the middle of that early edge computing hype and agreed that players in the space were too early. But he said that AI could eventually help reignite the demand for edge computing.

AI will drive demand

Many companies are counting on renewed demand for edge computing.

"We think we have really good assets there that really maybe able to connect into the tower assets in the future through edge computing and some of the demands that are coming down the pike in terms of the 5G networks – applications that require lower latency, that require higher capacity, that could benefit from having compute power closer to the base radios," said American Tower CFO Rod Smith during a recent investor event. American Tower in 2021 purchased data center operator CoreSite as part of a bid to position itself for edge computing.

Smith is among those waiting for AI to drive demand for edge computing.

"We certainly think AI is going to be a big driver of data center demand for data center space. Early on here in the early stages it's driving a demand for hyperscale ... our business isn't hyperscale centric," he explained. "So, we don't think AI is going to actually drive a material level of business for us for a couple of years, but we do think it's coming."

The financial analysts at TD Cowen agree.

"The preference of hyperscalers is to continue [data center] leasing in Tier 1 markets given flexibility, as this capacity can either be leveraged to support an existing Availability Zone (AZ) or to support AI-related use cases," they wrote in a recent note to investors discussing the rise in AI-related data center investments.

The analysts explained that massive tech companies like Microsoft and Amazon are purchasing high-performance computing hardware from companies like Nvidia to run early AI learning models. Those investments are primarily headed into the companies' existing data centers, which are generally located in major US cities, they said.


About the Author(s)

Mike Dano

Editorial Director, 5G & Mobile Strategies, Light Reading

Mike Dano is Light Reading's Editorial Director, 5G & Mobile Strategies. Mike can be reached at [email protected], @mikeddano or on LinkedIn.

Based in Denver, Mike has covered the wireless industry as a journalist for almost two decades, first at RCR Wireless News and then at FierceWireless, and recalls once writing a story about the transition from black-and-white to color screens on cell phones.
