Fastly promises better access to ChatGPT via a new edge API

Fastly said its new AI Accelerator will offer application developers quicker and cheaper access to ChatGPT via an application programming interface built atop the company's edge computing network.

Mike Dano, Editorial Director, 5G & Mobile Strategies

June 13, 2024

Edge computing photo illustration (Source: Kirill Ivanov/Alamy Stock Photo)

Fastly plans to sell an edge computing API that, the company said, will give developers faster and cheaper access to OpenAI's ChatGPT.

Dubbed the "Fastly AI Accelerator," the company said the service is built on its Fastly's Edge Cloud Platform and will use semantic caching for speedier access to OpenAI's ChatGPT AI-powered chatbot, which can provide developer services including code generation and debugging. Fastly said it may expand its specialized API (application programming interface) to support other large language models (LLM) beyond ChatGPT in the future.

"Popular AI applications can process hundreds of thousands of API calls or questions daily. Many of the questions users ask are likely very similar and may have been asked before," the company explained. "Without semantic caching, each call to the LLM requires going back to the provider for the information, potentially increasing costs and latency. However, Fastly AI Accelerator's semantic caching provides a cached response for repeated queries directly from Fastly's high performance edge platform, instead of going back to the AI provider, helping to deliver a better experience by improving performance while reducing costs."
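Fastly hasn't published implementation details, but the general pattern it describes, answering repeated, similar prompts from a cache instead of calling the LLM again, can be sketched in a few lines of Python. Everything below, including the toy embed() function and the similarity threshold, is a hypothetical stand-in for illustration, not Fastly's or OpenAI's API:

```python
import math
from collections import Counter

SIMILARITY_THRESHOLD = 0.9  # hypothetical cutoff; a real service would tune this

def embed(text: str) -> Counter:
    # Toy stand-in for a real text-embedding model: a bag-of-words vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

class SemanticCache:
    """Returns a cached LLM response when a new prompt is 'close enough' to an old one."""

    def __init__(self):
        self._entries = []  # (embedding, response) pairs

    def lookup(self, prompt: str):
        query = embed(prompt)
        for cached_embedding, response in self._entries:
            if cosine(query, cached_embedding) >= SIMILARITY_THRESHOLD:
                return response  # cache hit: no round trip to the AI provider
        return None

    def store(self, prompt: str, response: str):
        self._entries.append((embed(prompt), response))

def answer(prompt: str, cache: SemanticCache, call_llm) -> str:
    cached = cache.lookup(prompt)
    if cached is not None:
        return cached
    response = call_llm(prompt)  # only pay latency and token cost on a cache miss
    cache.store(prompt, response)
    return response
```

On a hit, the response comes straight from the cache (in Fastly's case, from its edge platform), so the request never reaches the LLM. That skipped round trip is where the latency and token-cost savings the company touts would come from.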

One analyst described the offering as a possible game changer: "This announcement plays to Fastly's architectural design strengths & demonstrates its application beyond conventional CDN," wrote analyst Will Townsend of Moor Insights & Strategy on social media. "The potential for its AI Accelerator to improve GenAI performance & mitigate token cost could be a game changer."

For its part, Fastly competes against other big content delivery networks (CDNs) like those from Akamai and Edgio. But the company is also expanding into other areas including managed security, cloud computing, video streaming and edge computing.

"I think customers who are moving to Fastly are looking for sort of next-gen solutions, serverless compute, edge storage, that type of technology that we're delivering," said Fastly CEO Todd Nightingale during his company's most recent quarterly earnings call, according to Seeking Alpha

Fastly's edge network stretches across more than two dozen sites in the US, and dozens more internationally. In its most recent quarterly report, Fastly recorded almost $3 million in revenues from the business unit that houses its edge computing offerings. That's up from around $2 million in the year-ago quarter. The company said it's pinning much of its future hopes on edge computing.

Fastly's core "network services" business unit, meanwhile, comprised the bulk of its revenues in the first quarter, with around $106 million.

The context

Fastly's new AI Accelerator offering is noteworthy for the telecom industry because companies ranging from Lumen Technologies to Verizon to Akamai have hinted at the opportunities around AI and edge computing.

Indeed, Lumen and T-Mobile announced an edge computing pact in 2021. Verizon partnered with cloud computing giant Amazon Web Services in 2019 in order to build an edge computing network spanning roughly two dozen sites. And Akamai recently announced plans to bring its Generalized Edge Compute to 100 cities by the end of the year.

Such efforts sprang from a belief in the 2010s that edge computing demand would drive the construction of smaller, mini data centers in locations all over the country. After all, small, unmanned data centers in smaller cities – potentially at the base of cell towers – would be the only way to provide super low-latency services to residents in such locations. Otherwise, their Internet traffic would have to travel all the way to bigger data centers in Denver or Dallas, adding precious milliseconds to services like streaming virtual reality that need to be instantaneous.

But demand for latency-sensitive, near-instantaneous connections hasn't developed, resulting in edge computing casualties like EdgeMicro, MobiledgeX and Ericsson's Edge Gravity.

Now, though, companies in the industry are eyeing the white-hot hype around AI services like ChatGPT as a way to spark renewed interest in edge computing. Most AI deployments today focus on training AI models in large data centers. But the AI services of tomorrow might shift to an "inference" approach that emphasizes speedy, low-latency connections to AI offerings. That would rely on edge computing networks.

Fastly's new AI Accelerator is also noteworthy considering the company is selling the service via an API. There is a global push in the wireless industry to sell new networking capabilities to enterprise developers via such APIs. The importance of the network API push has been rising as 5G operators struggle to find other ways to generate returns from their massive 5G network investments.

About the Author

Mike Dano

Editorial Director, 5G & Mobile Strategies, Light Reading

Mike Dano is Light Reading's Editorial Director, 5G & Mobile Strategies. Mike can be reached at [email protected], @mikeddano or on LinkedIn.

Based in Denver, Mike has covered the wireless industry as a journalist for almost two decades, first at RCR Wireless News and then at FierceWireless and recalls once writing a story about the transition from black and white to color screens on cell phones.
