AI and machine learning are essential to scaling the deployment and configuration of the exploding numbers of cellsites needed for 5G, says Mazin Gilbert in the first of a two-part series.
September 7, 2018
Artificial intelligence is playing a major role in the buildout of physical infrastructure for 5G, and its importance in that area will only grow, says the AT&T executive who chairs the governing board of the LF Deep Learning Foundation, the Linux Foundation's new umbrella organization devoted to AI.
Mazin Gilbert is vice president of Advanced Technology and Systems at AT&T Inc. (NYSE: T) and has been very involved in previous AT&T open source efforts at the Linux Foundation, including the Open Network Automation Platform (ONAP). But as he points out, the need to move quickly on the AI front led AT&T to turn to the open source community much earlier in the process than it did with ONAP. (See AT&T VP: Using AI to Hyper-Automate Processes.)
And that's being driven in part by the urgency of scaling out physical infrastructure -- cellsites, including small cells and microcells, as well as white boxes -- to support 5G, Gilbert says.
Figure 1: AT&T's Mazin Gilbert
Today, AT&T has 75,000 macrocells and several thousand microcells, but 5G will push that to 100,000-plus microcells, he explains in an interview.
"Where do you put those? What building? What pole? It takes a year to put one of those out today," he says. "That cannot scale. The question is how do you make this mainstream, reduce the time cycle, and take into account traffic changes?"
The answer, Gilbert says, is using machine learning and artificial intelligence to "redo completely the network planning process." To do that, AT&T is building a virtual world that can determine, on the spot, where a microcell is needed and how it should be configured.
"We can't send an army of people every time we want to build one, it's not possible to scale 5G without it," he says.
Gilbert also agrees with the developing results of the current Light Reading readership poll that network maintenance and monitoring will be another area of immediate AI impact. (See AI's Impact.)
AT&T is currently piloting a project that uses AI and drones to check on the status of assets such as the more than 8 million poles it owns and must maintain.
"We send an army of people and trucks with technicians and they go up and they can check," he says, but that's not only costly and inefficient but potentially dangerous. "I can send a drone with video capabilities and machine learning that can tell me what is wrong and diagnose the problem. And in the future, that drone will have a robot that can fix the problem that doesn't jeopardize someone's safety."
Cybersecurity is another area in which AT&T is already using AI and machine learning, both to detect new patterns in network traffic that indicate bad actors are at work and to predict problems before they become network disruptions or data breaches, he adds.
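Again, AT&T hasn't specified its techniques, but unsupervised anomaly detection over per-flow traffic features is one common pattern for this kind of work. A minimal sketch with scikit-learn's IsolationForest, using invented features:

```python
# Hedged illustration, not AT&T's security stack: fit an anomaly
# detector on normal traffic, then flag flows that look unlike it.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Hypothetical features per flow: bytes sent, packets/sec, distinct ports.
normal_flows = rng.normal(loc=[500, 20, 3], scale=[50, 5, 1], size=(1000, 3))
detector = IsolationForest(contamination=0.01, random_state=7).fit(normal_flows)

# A flow scanning many ports at a high rate should score as anomalous.
new_flows = np.vstack([
    rng.normal([500, 20, 3], [50, 5, 1], (5, 3)),  # normal-looking flows
    [[480, 900, 250]],                             # suspicious flow
])
labels = detector.predict(new_flows)  # 1 = normal, -1 = anomaly
print(labels)
```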
AI and machine learning are not really new, Gilbert explains: many of the underlying technologies and algorithms have been around for years. But deploying them in new ways and at scale has required a new approach, which is one reason AT&T looked to the open source community.
AT&T helped kickstart the AI effort at the Linux Foundation with Acumos, first announced last year and formally launched in March with seed code from AT&T and partner Tech Mahindra Ltd. This was not the first time AT&T took this approach -- its ECOMP software became core seed code for ONAP, which was created by merging ECOMP with OPEN-O -- but that was actually a very different approach, Gilbert says.
"We did this completely the opposite of how we did ECOMP and ONAP," he says. "With those we started the architecture in 2012, started building the software in 2014, started deployment, and then, in 2016, we made the decision to go open source. And we had to figure how what do you open source, which components you keep and which ones you don't and that was a lot of work."
With Acumos, AT&T started with an open source, open API and open platform approach, building it with the goal of ultimately using it within AT&T. Gilbert says AT&T's AI platform, called CMLP (Common Machine Learning Platform), now leverages Acumos elements, including its design and the marketplace of reusable components that Acumos is developing.
"And that platform adds additional capabilities that are AT&T-centric that allows deployment within the AT&T framework," he adds.
AI ultimately touches every part of AT&T's very diverse business, including advertising and entertainment as well as telecom operations. It is already in use in many of those segments, though not yet at the scale that will be enabled by the work of Acumos, now part of the broader Linux Foundation initiative, the LF Deep Learning Foundation.
Stay tuned for the second part of our conversation with Mazin Gilbert, which will explore the ongoing work of the LF Deep Learning Foundation and how it is evolving.
— Carol Wilson, Editor-at-Large, Light Reading