Edge computing brings high-performance storage, compute and network resources closer to users and devices than ever before. The goals of this approach include lowering the cost of data transport, decreasing latency and increasing locality.
This trend runs directly counter to the historical approach of housing compute and storage in a small number of massive (or hyperscale) data centers, often placed in remote locations to minimize operating costs. With edge computing, the pendulum is swinging from a centralized cloud architecture to a highly distributed one.
While interest in and hype around edge computing are high, detailed knowledge of the edge is remarkably low. At this nascent stage, the most fundamental questions are still being asked: "What exactly is the edge?", "Where is the edge?" and "What use cases will drive edge deployments?"
To answer these fundamental questions, and many more, Heavy Reading launched its inaugural Edge Computing Market Leadership Study focused specifically on how edge computing will affect the future of network connectivity, including capacity demands, data center interconnection (DCI), technical and architectural requirements and deployment models. Conducted by Heavy Reading and co-sponsored by Arista, CoreSite, Fujitsu and Infinera, the study is anchored in a global survey of 91 network operators and enterprises.
In the first of two blogs based on this new research, we focus on questions around market drivers and demand.
Importance of edge
At this early stage, service providers and enterprises already understand the importance of edge computing to their business. In our survey, 80% of respondents rated edge computing as at least important to their business, including 20% who said it is critical and that their business cannot succeed without it. Just 3% of those surveyed believe edge computing is not important at all.
Figure 1: Importance of edge computing to respondents' business (N=91). Source: Heavy Reading
Use cases and business drivers
Interest in edge computing is widespread across telecom operators and enterprises, but the two groups differ on the use cases and applications driving that interest. For telecom operators, the top use cases and drivers are 5G (selected by 67% of telecom respondents); the Internet of Things (IoT; selected by 63%); ultra-reliable, low latency applications (selected by 55%); and high-performance content delivery (selected by 47%).
For enterprises, the top use cases and drivers are artificial intelligence (AI) applications (selected by 61% of enterprise respondents); ultra-reliable, low latency applications (selected by 57%); and IoT (selected by 48%). The figure below breaks out the full results by telecom and enterprise respondents.
Figure 2: Edge computing use cases and business drivers by respondent type (N=60 telecom, 23 enterprise). Source: Heavy Reading
Comparing the two survey groups reveals some significant differences in drivers. First, telecom operators are making big bets on their futures with 5G, and the survey data shows that their edge computing strategies will be tightly coupled with those plans. 5G will also form the basis of operators' plans for ultra-reliable low latency communication (URLLC) and IoT applications, both of which also scored highly. Enterprises surveyed, meanwhile, overwhelmingly link edge computing with AI applications, whereas telecom operators place AI near the bottom of their list of drivers.
In the next blog in the series, we will dig into some of the survey details on specific technology and architecture preferences for edge computing.
For additional information, readers can access the archived version of the recent webinar "Connectivity for the Edge Computing Era," in which Heavy Reading and the study sponsors discuss in detail how edge computing will affect the future of network connectivity.
— Sterling Perrin, Principal Analyst, Heavy Reading
This blog was sponsored by Arista, CoreSite, Fujitsu and Infinera.