Akamai aims for a new competitive edge in cloud computing

Akamai, one of the Internet's major content delivery network players, says it is working to build the most distributed cloud computing platform on the planet.

Phil Harvey, Editor-in-Chief

February 13, 2024

Can Akamai move the edge closer to its cloud computing customers? (Source: Phil Harvey/Alamy Stock Photo)

What's old is cool again, as Akamai announced today that it plans to bring its Generalized Edge Compute (codenamed "Gecko"), with support for virtual machines, to 100 cities by the end of the year.

Don't let the cute-sounding name fool you: Akamai is serious about taking on Amazon and other cloud providers by pushing more computing capability to the network edge – and marrying that with the security and content delivery services it is already known for.

The edge computing market has been just about to explode for decades. But Akamai is already well established in security and content distribution, so this isn't quite a build-it-and-they-will-come scenario. The big idea is to help telcos and enterprises provide lower-latency, more secure connections for applications and data deployed for modern use cases like AI inferencing, multiplayer gaming, spatial computing and a wider diet of media products such as social streaming services.

To a degree, Akamai is building on what cloud pioneer Linode started and pairing that with aggressive infrastructure improvements in areas where the hyperscalers' edge computing offerings may be lacking. That's our best guess, anyway – the company declined to provide investment timelines or capex estimates for Gecko.

Akamai said it has already deployed cloud computing capabilities at the edge in nine cities: Hong Kong; Kuala Lumpur, Malaysia; Querétaro, Mexico; Johannesburg; Bogotá, Colombia; Denver; Houston; Hamburg, Germany; and Marseille, France.

Santiago, Chile, will be the tenth city where Akamai turns up its Gecko cloud-edge capabilities, according to the company, with that installation to be generally available in Q1.

Ye Olde Internet Shoppe

If you're unfamiliar, Akamai has been helping ISPs and content companies move bits around the Internet for decades. Along the way, the Internet infrastructure company has evolved and acquired capabilities, consolidated other content delivery networks, and pushed into adjacent businesses like security and cloud computing.

Akamai went public in 1999 and by 2007 had landed Verizon and several US government agencies as customers of its media distribution and application delivery services.

It became a CDN partner to Netflix in 2010, the same year Akamai cut deals with Brightcove and the NFL. Orange and AT&T announced separate deals with the cloud provider in 2012. KT, Swisscom and Telefónica signed deals mostly related to Akamai's CDN capabilities in 2013.

As Akamai grew, its network was designed to avoid backhauling data as much as possible. In an April 2022 interview with Light Reading, Akamai CTO Robert Blumofe said, "Rather than moving the data to the compute, move the compute to the data – that is a rule of thumb and it's probably the right rule of thumb, 90-something percent of the time."
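A quick back-of-the-envelope illustration of why that rule tends to hold (our numbers, not Akamai's): a 1TB dataset is 8,000 gigabits, which takes more than two hours to push across a 1 Gbit/s link, while the application code that needs to crunch that data typically weighs in at a few hundred megabytes at most – seconds of transfer in the other direction.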

Since 2014, Akamai has acquired at least 14 companies that we know of: Prolexic Technologies, Xerocole, Bloxx, Nominum, Janrain, ChameleonX, KryptCo, Exceda, Asavie Technologies, Guardicore, Inverse, Linode, Neosec and Ondat. Each added new networking, security and cloud capabilities to the already mighty Akamai content and media delivery network. 

Last year, Akamai rolled up more of the CDN market – a sign that its customers' needs have broadened – acquiring content delivery customer contracts and some CDN assets from Lumen Technologies and StackPath.

Akamai said its network today comprises 4,100 points of presence (PoPs) around the globe. 

'Data needs to be everywhere'

In a conversation with Light Reading on Friday, Akamai VP of Product Management Jon Alexander said Gecko sprang from network capability discussions Akamai has been having with ISPs and its top customers. Many of those customers are beginning to look at AI to help with customer recommendations and security – applications where heavy computing needs to happen in close proximity to the task at hand.

"Your application needs to be everywhere, and your data needs to be everywhere. Your users are everywhere, so let's provide the infrastructure that will allow you to do that," Alexander said.

While cloud computing has made the Internet more efficient in some ways by centralizing storage and compute, Alexander said there is a growing need for infrastructure designed to support workloads that demand low latency and heavy computation.
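The physics back him up. To put rough numbers on it (ours, not Akamai's): light travels through optical fiber at roughly 200,000 km/s, which works out to about 1ms of round-trip propagation delay for every 100 km between user and server. A request served from a cloud region 3,000 km away starts with a roughly 30ms round-trip floor before any processing happens; served from a PoP 100 km away, that floor drops to about 1ms. That gap is the opening Gecko-style edge deployments aim to exploit.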

"For 25 years, everything we've built has been designed to run in hundreds, if not thousands of locations," Alexander said. "That means you've got to make some key design choices and the infrastructure we're putting out, as part of this announcement, is designed to enable those types of services."

About the Author(s)

Phil Harvey

Editor-in-Chief, Light Reading

Phil Harvey has been a Light Reading writer and editor for more than 18 years combined. He began his second tour as the site's chief editor in April 2020.

His interest in speed and scale means he often covers optical networking and the foundational technologies powering the modern Internet.

Harvey covered networking, Internet infrastructure and dot-com mania in the late 90s for Silicon Valley magazines like UPSIDE and Red Herring before joining Light Reading (for the first time) in late 2000.

After moving to the Republic of Texas, Harvey spent eight years as a contributing tech writer for D CEO magazine, producing columns about tech advances in everything from supercomputing to cellphone recycling.

Harvey is an avid photographer and camera collector – if you accept that compulsive shopping and "collecting" are the same.

