Open RAN platforms to support far edge AI inference
A key benefit of implementing open RAN/vRAN on general-purpose processors is that the same platforms can also support AI inference and other applications at the far edge of the network, such as cell site routers (CSRs) and content delivery and hosting. These edge platforms can host virtualized applications closer to the user, offering significant benefits in lower latency and shared infrastructure.
To find out more about which applications service providers plan to support on shared far edge solutions and how they plan to deploy open RAN and vRAN platforms and architectures for 5G networks, Heavy Reading ran an exclusive survey of individuals working for operators with mobile network businesses. The results are presented in an analyst report, Open RAN Platforms and Architectures Operator Survey Report, that can be downloaded for free here.
Benefits of edge cloud integration
The survey presented five edge applications that can share server platforms with virtualized open RAN baseband implementations and asked respondents to indicate the top three they plan to support in their open RAN/far edge solution. AI was the most popular choice at 62%, followed by CSR and content/media delivery/hosting at 54% each, firewall at 50% and IPsec at 43%. Just 1% selected "other."
AI is becoming a critical application in many market areas, including wireless networks. Used within the network itself, AI can significantly improve network performance and end-user quality of experience. AI training is usually hosted in data centers on high-power servers with GPU, FPGA or other acceleration hardware. Cloud native AI inference, however, can easily be moved to far edge server platforms, reducing latency and sharing resources with open RAN baseband functions.
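The latency benefit of moving inference toward the edge can be bounded from fiber propagation delay alone. The sketch below is a back-of-the-envelope illustration; the distances are assumptions for the sake of example, not figures from the survey:

```python
# Back-of-the-envelope round-trip propagation delay for inference
# traffic, comparing a centralized data center with a far edge site.
# Light travels through optical fiber at roughly 200,000 km/s
# (about two-thirds the speed of light in vacuum).

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip fiber propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

central_dc_km = 400.0  # assumed distance to a regional data center
far_edge_km = 5.0      # assumed distance to a far edge cell site

print(f"Central DC: {round_trip_ms(central_dc_km):.2f} ms")  # 4.00 ms
print(f"Far edge:   {round_trip_ms(far_edge_km):.3f} ms")    # 0.050 ms
```

Propagation is only one component of end-to-end latency (queuing, processing and radio scheduling all add delay), but it sets a hard floor that no amount of compute at a distant data center can remove.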
A CSR aggregates mobile data traffic from the RAN and routes it back to the service provider's core network. CSRs implemented as cloud native network functions (CNFs) are available that can share COTS server platforms with open RAN distributed unit, firewall and IPsec functions. Video caching and other content delivery and hosting applications can also benefit from sharing the same edge server platforms, reducing both latency and backhaul traffic.
Far edge operating environments
The use of CNFs allows open RAN and other applications, such as AI inference, to be scaled and shared across the network, including far edge server platforms. Some of these far edge servers may be located in cabinets with controlled temperatures; however, many will be located in cabinets without controlled temperatures or in outdoor locations. For these locations, server platforms need to support extended temperature operation and be designed for demanding environments.
When asked about operating environments (multiple choices were allowed), almost 70% of respondents indicated that they had an outdoor application, and 29% said they had operating environments below –5°C. 57% reported indoor cabinets with controlled temperature, and 57% reported indoor cabinets without. The full results are included in the analyst report.
Heavy Reading's Open RAN Platforms and Architectures Operator Survey Report focuses on why operators are deploying open RAN and which platform architectures, hardware accelerators and software and integration solutions are viewed as most important for these deployments. You can download a PDF copy here.
— Simon Stanley, Analyst-at-large, Heavy Reading
This blog is sponsored by Kontron.