AT&T Embracing Software-Defined Access

With the 'pivot' to XGS-PON comes a much broader look at how access itself can be virtualized for multivendor interoperability and flexibility.

July 3, 2017

AT&T's XGS-PON trial, announced last week, is part of its broader software-defined access strategy, an effort that includes a move to white boxes in the access network and a level of interoperability between network- and premises-based optical gear that isn't possible today.

Access and SDN are coming together because it makes sense to implement software-defined networking at a point of technology "pivot," when existing systems are in flux anyway, says Eddy Barker, assistant vice president of Access Architecture and Design at AT&T, and the move from Gigabit PON to XGS-PON is just such a point.

SDN in the access network has been a work in progress for the vendor community for some time, but AT&T's push forward with XGS-PON trials and later deployments is likely to speed the pace at which all this is done. In her most recent market report, Ovum Ltd. analyst Julie Kunstler noted that PON spending will reach $6.79 billion in 2020 and that next-gen PON deployments will account for more than 50% of PON revenue by 2022.

The drivers for AT&T's XGS-PON deployment are purely economic, Barker adds in an interview with Light Reading. Based on responses to the RFP that AT&T put out earlier this year, it makes better economic sense to use XGS-PON to support a converged access network -- business, residential and mobile backhaul services on one fiber network -- than it does to continue down the GPON route or look to NG-PON2, as Verizon is doing. (See Verizon Readies Landmark NG-PON2 Trial.)

"When we looked at the economics, our goal was to literally be able to deploy XGS at the same economic price per living unit that we could do GPON at, and three years ago, even two years ago, a lot of vendors didn't think that would ever happen," Barker says. "Earlier this year with RFPs and things we have been pressing, we crossed that barrier."

Part of that economic equation is that XGS-PON, with its symmetric 10Gbit/s capability, will allow delivery of higher-speed services to businesses and consumers, he explains. And by building a software-defined access network, AT&T will enable future PON technologies to be implemented without the major upheaval in operations and premises gear that happens today. In fact, Barker says, a fully implemented software-defined access strategy would support copper-based G.fast deployments as well.

Should AT&T move to NG-PON2 -- when the tunable optics required are more cost-effective -- it would do so on the same software-defined access network, he adds. But instead, AT&T "might jump over that and move to the next generation of speed that is being worked on by IEEE and ITU," Barker says.

Key elements of software-defined access that AT&T is already working on:

  • White-box versions of open optical line terminals (OLTs) for XGS-PON, designs that AT&T has submitted to the Open Compute Project.

  • The software required to run XGS-PON, including the ONOS controller software and VOLTHA (Virtual OLT Hardware Abstraction). AT&T is working with ON.Lab on both.

  • An OpenOMCI specification, which will enable interoperability between central office (CO)-based OLTs and premises-based devices. Today, vendors have created their own versions of OMCI (the ONU Management and Control Interface), and these differ enough to prevent interoperability and limit multivendor implementations, which keeps costs higher. (A sketch of the common message format such a spec builds on follows this list.)
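To make the interoperability problem concrete: every OMCI exchange between an OLT and the optical network terminal on the premises uses the same baseline message frame defined in ITU-T G.988; where vendor implementations diverge is in which managed entities and attributes they support and how they interpret them, and that is what an OpenOMCI profile would pin down. The Python sketch below packs such a baseline frame. The field widths follow G.988, but the `OmciMessage` helper and the constants shown are illustrative assumptions, not any vendor's (or AT&T's) actual code.

```python
import struct
from dataclasses import dataclass

# Baseline OMCI frame layout per ITU-T G.988 (48 bytes on the wire):
#   2 bytes   transaction correlation ID
#   1 byte    message type (AR/AK flags + 5-bit action code)
#   1 byte    device identifier (0x0A = baseline message set)
#   2 bytes   managed entity class
#   2 bytes   managed entity instance
#   32 bytes  message contents
#   8 bytes   trailer (historical ATM/AAL5 fields + CRC-32)
BASELINE_DEVICE_ID = 0x0A
CONTENTS_LEN = 32

@dataclass
class OmciMessage:
    tci: int          # transaction correlation ID, chosen by the sender
    msg_type: int     # e.g. 0x49 is commonly seen for a Get request
    me_class: int     # managed entity class, e.g. 256 = ONU-G
    me_instance: int  # managed entity instance ID
    contents: bytes   # action-specific payload, zero-padded to 32 bytes

    def pack(self) -> bytes:
        body = self.contents.ljust(CONTENTS_LEN, b"\x00")[:CONTENTS_LEN]
        header = struct.pack(
            "!HBBHH", self.tci, self.msg_type, BASELINE_DEVICE_ID,
            self.me_class, self.me_instance,
        )
        # Trailer (incl. CRC-32) omitted here; the transport adds it.
        return header + body

# A hypothetical Get on the ONU-G managed entity. The frame layout is
# identical for every vendor; interoperability breaks down in which
# managed entities and attributes each vendor actually implements.
frame = OmciMessage(tci=1, msg_type=0x49, me_class=256,
                    me_instance=0, contents=b"").pack()
```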

"For quite some time, we have been working heavily with ON.lab and part of an offshoot of the CORD network has been the software-defined access piece," Barker says. "We have been focusing on establishing all of the microservices that would support virtualization of access. The first one was passive optical networking and for us, XGS and working with the merchant silicon vendors on that. But that same set of code we are working on with ON.labs is really designed to also support, just like with CORD, other types of access technology. So what we are building for virtual OLTs, we would also like it to support G.fast, and in the future, have it support RAN [radio access network] technologies such as M-CORD."

The goal is to have "a common orchestration system and a common set of local controls and virtualized services" that would let AT&T quickly and at low cost substitute new underlying technology without having to also change out all the piece parts, Barker says. That's not possible today because access network technology is vendor-proprietary, and using one vendor's gear in the CO means using that same vendor's gear at the premises, and likely using that vendor's element management system as well.
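Continuing the hypothetical sketch above, the "common set of local controls" means the orchestration layer selects adapters by technology type rather than by vendor, so substituting the underlying technology becomes a registration step rather than a rip-and-replace. Again, the names here are illustrative assumptions.

```python
# Hypothetical adapter registry: the orchestrator looks devices up by
# technology string, so a new access technology plugs in without any
# change to the orchestration code itself.
ADAPTERS: dict[str, AccessAdapter] = {
    "xgs-pon": XgsPonAdapter(),
    "g.fast": GfastAdapter(),
}

def turn_up_service(technology: str, device_id: str, port: int) -> None:
    adapter = ADAPTERS[technology]          # the common control point
    adapter.activate(device_id)
    adapter.provision_subscriber(device_id, port,
                                 downstream_mbps=1000, upstream_mbps=1000)

# Swapping GPON for XGS-PON (or adding G.fast) changes only the
# registry entry, not the service turn-up logic.
turn_up_service("xgs-pon", "olt-dallas-01", port=7)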

Under those circumstances, "changing vendors or using multiple sources had tremendous implications on the business" because it meant maintaining multiple support systems, he says. "So the goal here is to break that traditional system apart -- disaggregate, focus on a software platform that can be open, commercialized but based on an open source and hoping that we have many options on the hardware side that can be developed by OEMs or white box builds we get through contract manufacturers," Barker says.


AT&T has already been "pretty successful on the hardware piece" and is working on the software components with ON.Lab and the open source community. The biggest hurdle to software-defined access?

Barker cites two things: the pace at which open source groups can be pulled together -- including vendors worried about losing their traditional hardware-based business models -- and the operational challenge of getting to a common orchestration platform for the access realm, which is complex and spans many generations of equipment used to operate and deliver services.

Getting operators, existing vendors and new entrants together in a community and "accelerating them to putting out the different services that are needed to have a mature ecosystem that can be deployed" takes time, especially "in order for it to be robust," he comments. "Some of the operational tooling may not be getting developed in open source as fast as we'd like. And a lot of the OEMs are participating -- but may be apprehensive to contribute too much because they are in a new space and they have to figure out how to re-monetize, they've got intellectual property."

AT&T's choice may be to "seed" some things, as it did by releasing its ECOMP software into open source, through what is now the Open Network Automation Platform (ONAP), Barker says. That can be done by contributing software or software developers to the effort.

— Carol Wilson, Editor-at-Large, Light Reading

