With COVID-19 forcefully pushing its way into the second half of 2020 – at the very least – it appears that Zoom and other videoconferencing platforms are here to stay for work, school, entertainment and seemingly everything else.
And that obviously raises some concerns for users like me who aren't fond of conversations constantly interrupted by frozen screens, delayed responses and the generally stilted exchanges that appear to be part and parcel of the dreaded "new normal."
That's why I was so intrigued by recent suggestions that edge computing technologies could alleviate the situation.
Edge computing as the savior
"The rise of edge data centers enables more efficiency for videoconferencing technology apps by allowing them to get closer to the end user, reducing latency," writes EdgeMicro, a startup working to build mini-data centers for edge computing services. "Traditional data centers are typically located in large metropolitan areas while micro data centers, like ours, bring service to Tier II and Tier III cities. Locating in places other than large cities provide users in rural areas with the same connectivity options as everyone else."
Others seem to agree.
"There is active discussion on edge computing. With the rapid increase in videoconferencing in the COVID-19 period, latency is becoming a common obstacle for natural conversations," wrote Alex Choi, the SVP of technology and innovation for Deutsche Telekom, Germany's biggest operator, on LinkedIn. Choi suggested that operators like Deutsche Telekom might have a role to play in implementing edge computing to solve the videoconferencing issue.
Curious about the notion of edge computing solving our collective coronavirus videoconferencing woes, I reached out to a number of players in the edge computing space to see whether they agreed.
Not surprisingly, some vendors cheered the idea.
"Home is now the new edge with so much data being both consumed and created locally," explained Phillip Marangella, the CMO of edge computing data center company EdgeConneX, noting that the pandemic is forcing everyone to stay at home. "As a result, this places a big burden on the networks and CDNs [content delivery networks]. To help solve for this surge in traffic and some of the network bottlenecks, you're seeing a growing demand for edge data centers, where more peering, enhanced local access, and smart routing of traffic can occur to help alleviate those service delivery challenges and ensure the user experience is not negatively affected."
Taking the shine off
Others, however, offered a much less rosy response. "If you're in Peoria and I'm in Timbuktu, how would an edge data center make our experience any better when >99% of the path has nothing to do with an 'edge data center?'" wondered Matthew Trifiro, CMO of Vapor IO, another startup building edge data centers.
He explained that the problem has less to do with data centers and more to do with the overall design of the network between videoconferencing participants.
Similarly, analyst Dean Bubley with research and consulting firm Disruptive Analysis said that the notion of edge computing improving videoconferencing is "tenuous."
"There's all sorts of sources of latencies involved in videoconferencing, and it's as much dependent on the architecture of the system ... and choice of codecs as it is about networks," he wrote. "If you're having a video call with someone *in your city* then yes, you want all the traffic to stay locally if possible. But if you're talking to someone in Australia, then the edge computing argument falls down."
If there were a tie-breaker in this discussion, it would probably be Amazon Web Services (AWS), one of the biggest cloud computing providers in the world and a major new player on the edge computing front.
So what does AWS have to say on the topic? After some back and forth on the parameters of the question, an AWS spokesperson said the company declined to comment.
Perhaps that's not necessarily a surprise given a lengthy post on the "edge computing opportunity" penned by Matthew Prince, the CEO and co-founder of web-infrastructure and website-security company Cloudflare. Prince explained that Cloudflare, through its "Workers" product, has been providing edge computing services across more than 200 cities in more than 100 countries for the past three years – and he said that "only a limited set of applications are sensitive to network latency of a few hundred milliseconds."
That's a critical statement considering edge computing proponents widely argue that lower latency – the delay between the moment a request is sent and the moment it reaches a data center that can process it – is a primary benefit of edge computing. After all, edge computing designs position computing resources physically closer to users precisely to reduce that delay.
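The distance argument can be illustrated with back-of-the-envelope arithmetic. The sketch below is a hypothetical illustration, not a measurement of any real network or videoconferencing service: it assumes signals travel through optical fiber at roughly two-thirds the speed of light and ignores the routing, queuing and codec delays that Bubley points to as additional latency sources.

```python
# Back-of-the-envelope propagation delay, assuming signals move through
# optical fiber at roughly two-thirds the speed of light in vacuum
# (~200,000 km/s). Routing, queuing and codec delays are ignored, so
# these figures are best-case floors, not real-world measurements.

FIBER_SPEED_KM_PER_S = 200_000  # approximate signal speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a given fiber distance."""
    return 2 * distance_km * 1000 / FIBER_SPEED_KM_PER_S

# A nearby edge data center (~50 km) vs. an intercontinental path (~15,000 km)
print(round_trip_ms(50))      # 0.5 ms
print(round_trip_ms(15_000))  # 150.0 ms
```

Even in this best case, an intercontinental call carries roughly 150 milliseconds of round-trip delay from physics alone – which is why moving servers to a nearby edge data center can help a local call but, as the skeptics note, does little for a conversation between Peoria and Timbuktu.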
Edge computing drivers and questions
"When we launched Cloudflare Workers, we thought the killer feature was speed," Prince wrote. "However, we've learned by watching developers use Cloudflare Workers that there are a number of attributes to a development platform that are far more important than just speed. Speed is the icing on the cake, but it's not, for most applications, an initial requirement. Focusing only on it is a mistake that will doom edge computing platforms to obscurity."
Instead of speedy, real-time Internet connections – the kind that might improve videoconferencing – Cloudflare's Prince argued that the real driver for edge computing is compliance with local data-storage laws.
"Herein lies the killer feature of edge computing. As governments impose new data sovereignty regulations, having a network that, with a single platform, spans every regulated geography will be critical for companies seeking to keep and process locally to comply with these new laws while remaining efficient," he wrote. "While the regulations are just beginning to emerge, Cloudflare Workers already can run locally in more than 100 countries worldwide. That positions us to help developers meet data sovereignty requirements as they see fit."
All of which suggests that the discussion around edge computing continues to evolve – and that the real drivers and applications of the technology remain unclear.