The "fourth platform" uses artificial intelligence to analyze data from the underlying platforms in order to better serve customers. Specifically, Telefónica has launched its voice-activated "cognitive" assistant, Aura, in six markets -- Argentina, Brazil, Chile, Germany, Spain and the UK. Telefónica says that Aura will learn from its interactions with individual customers and ultimately be able to provide tailored recommendations and support based on a user's preferences. (See Telefónica Takes Aura AI Tool Into 6 Markets.)
While at MWC in Barcelona, we met with Juan Manuel Caro Bernat, director of operations and customer experience at Telefónica, to learn how Telefónica is applying big data analytics and machine learning not just in its customer-facing activities but also to improve the operational efficiency of the business.
Back in 2007, Telefónica first started applying algorithms and machine learning to troubleshooting in its network operations centers (NOCs). The team of data scientists from Telefónica R&D that developed various tools for NOCs has published several academic papers and filed multiple patents related to its work. Three years ago, this team moved into Caro Bernat's organization, where it is now helping to solve operational issues across Telefónica.
Building on the work of this small group of data scientists is a team of hundreds that applies business intelligence and data visualization tools to operational and commercial use cases across Telefónica. Caro Bernat's goal is to make these analytical tools available throughout the organization, not just to an elite cadre of data scientists and centralized business analysts; the aim is to enable data-driven operations across the group.
The data analytics process
The first part of the process is creating the data repositories themselves. Telefónica has a separate data lake in each operating business (Spain, Germany, UK, Brazil, etc.) and a centralized, global big data platform for analyses across the group. In total, Telefónica collects data from over 170 sources of information, including contact center calls, field technician reports, bills, energy usage, OSS and network telemetry. Over time, more and more data sources are being added.
Once data has been collected and anonymized, it must be normalized using a standard data model and checked for quality. Poor data from inventory systems has required Telefónica to replace some systems (e.g. transport and access inventory) and to change some operational processes to improve the quality of the data that is entered into these systems. Even if a perfect network inventory is created, it will soon diverge from reality if field technicians fail to report changes, such as a change of port on a device made to resolve a trouble ticket. As Caro Bernat notes, "If you don't have the right data you are lost. We learned that the hard way."
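The normalize-then-validate step described above can be sketched in a few lines. This is a minimal illustration, not Telefónica's actual data model: the field names and quality rules are invented for the example.

```python
# Hypothetical sketch: mapping raw inventory records onto a standard
# data model and flagging quality problems before they enter the lake.
# Field names and rules are illustrative, not Telefónica's real schema.

REQUIRED_FIELDS = ("site_id", "device", "port")

def normalize(record: dict) -> dict:
    """Map a raw record onto the standard model: lower-cased keys,
    stripped string values, canonical upper-case site identifier."""
    clean = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
             for k, v in record.items()}
    clean["site_id"] = clean.get("site_id", "").upper()
    return clean

def quality_issues(record: dict) -> list:
    """Return the missing required fields so bad rows can be routed
    back to the source system rather than polluting downstream analysis."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

raw = {"Site_ID ": "mad-0042", "device": "olt-7", "port": ""}
rec = normalize(raw)
print(rec["site_id"])        # "MAD-0042"
print(quality_issues(rec))   # ["port"] -- reject or send back for repair
```

The point of returning the list of issues, rather than silently dropping bad rows, mirrors the process change Caro Bernat describes: data quality problems are fed back to the owning system so they get fixed at the source.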
Once a reliable data set is available, the focus shifts to analyzing use cases. These normally come from business units looking for solutions to real-world problems; the data scientists do not typically come up with their own use cases in isolation. So far, Telefónica's data analytics team has worked on solutions to around 300 use cases, the benefits of which they track on an ongoing basis. Most of the use cases are operations-related (e.g. infrastructure management, customer experience, customer service delivery, internal plant management). The next-largest category is technology-related (video platform, radio planning, etc.).
Example analytics use cases
Below we summarize some of the use cases that Caro Bernat shared with us in our meeting. Use cases typically enable savings in operational or capital expenditure through the optimization of resources.
Table 1: Telefónica Analytics Use Cases
| Use case | Description |
| --- | --- |
| Basestation profitability | Total cost is based on rental (data from the real estate team), maintenance (data from operations), field technician costs and the necessary level 3 support in the NOC. Traffic and the associated revenue are derived from the commercial team. The profitability of each base station is calculated and the least profitable base stations are assessed to understand what can be changed, e.g. relocating a basestation 50m to a different site with a lower rental cost. |
| Preventive maintenance | Normal practice is to replace components periodically based on the vendor's recommended schedule. By collecting its own history of equipment faults, Telefónica can make more accurate predictions of faults based on the specifics of its cell sites (e.g. ambient temperature and humidity). So far this has led to the reduction of hundreds of site visits in one operating business (country). |
| Battery capex optimization | Most of the batteries (backup power supplies) deployed in the field are never actually used, yet their theft and replacement represent a significant cost. Telefónica has analyzed which sites have historically suffered from low electrical supply reliability and is now focusing the replacement of stolen batteries on the sites where the probability of the batteries being needed is highest. |
| Trouble ticket prioritization | Traditionally, trouble tickets are prioritized based on the estimated number of customers impacted and the length of time the ticket has been open. Now, when a ticket is opened, Telefónica predicts how long it is likely to take to resolve and prioritizes tickets based on this measure rather than the time elapsed so far. |
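The basestation profitability use case above reduces to simple arithmetic once the cost and revenue feeds are joined: sum the four cost components, subtract from attributed revenue, and rank. The sketch below shows the idea; the field names and figures are invented for illustration.

```python
# Hypothetical sketch of the base-station profitability calculation:
# total cost = rental + maintenance + field technician + NOC level 3
# support, compared against revenue attributed by the commercial team.
# All names and numbers here are invented, not Telefónica data.

def profitability(site: dict) -> float:
    cost = (site["rental"] + site["maintenance"]
            + site["field_tech"] + site["l3_support"])
    return site["revenue"] - cost

sites = [
    {"id": "BS-001", "rental": 900, "maintenance": 300,
     "field_tech": 150, "l3_support": 100, "revenue": 2000},
    {"id": "BS-002", "rental": 1800, "maintenance": 400,
     "field_tech": 200, "l3_support": 120, "revenue": 1900},
]

# Rank sites from least to most profitable to pick candidates for
# action, e.g. relocating to a nearby site with a lower rental cost.
ranked = sorted(sites, key=profitability)
print([(s["id"], profitability(s)) for s in ranked])
# [('BS-002', -620), ('BS-001', 550)]
```

In practice the hard part is not the arithmetic but the data plumbing: rental comes from real estate, maintenance from operations, revenue from the commercial team, which is exactly why the shared data lake described earlier matters.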
Incorporating AI into analytics
Although many of the analytics use cases that Telefónica has developed are based on traditional statistical techniques, some of them also incorporate artificial intelligence algorithms. For example, Telefónica is currently exploring how to make UNICA (Telefónica's end-to-end network virtualization project) resources more intelligent by using AI. It is also exploring AI to suggest next-best actions for staff in the Service Operations Center so they can resolve issues more quickly. These techniques are equally relevant in orchestration. In video operations, Telefónica is using AI to detect anomalies and transfer customers onto a different headend before their service is impacted.
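The video-operations idea above amounts to watching a per-headend quality metric and acting when it drifts well outside its recent baseline. A minimal sketch, assuming a simple z-score rule on packet loss (the metric, threshold and figures are assumptions, not Telefónica's implementation):

```python
# Hypothetical sketch: flag an anomaly on the active video headend when
# the latest sample is far outside the recent baseline, so customers can
# be moved to a standby headend before their service is impacted.
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """True if `latest` lies more than z_threshold standard
    deviations from the mean of the recent history."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

# Recent packet-loss percentage samples from the active headend.
baseline = [0.1, 0.2, 0.1, 0.15, 0.2, 0.1, 0.12, 0.18]
print(is_anomalous(baseline, 0.14))  # False -- within the normal range
print(is_anomalous(baseline, 2.5))   # True  -- trigger failover
```

A production system would use richer models than a z-score, but the shape is the same: detect early, then re-route before customers notice.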
Another use case where the analytics team has employed AI is creating a real-time index of customer satisfaction. Traditional KPIs such as dropped calls and throughput do not always correlate well with customer experience (as determined by survey data), especially for complex services such as VoLTE and IPTV. As such, Telefónica has turned to machine learning algorithms that use network KPIs, collected every 15 minutes, to predict a customer's satisfaction level with an accuracy of around 60% (with a target of 75% by year end).
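The essential structure of such a predictor is a classifier trained on pairs of KPI vectors and survey labels. The toy sketch below uses a nearest-centroid rule on two invented KPIs; Telefónica's actual features, labels and model are not public, so everything here is illustrative.

```python
# Hypothetical sketch: predicting "satisfied" vs "dissatisfied" from
# 15-minute network KPI vectors with a nearest-centroid classifier.
# KPI names, values and labels are invented; a real system would use
# many more features, proper scaling and a stronger model.
import math

# (dropped-call rate %, mean throughput Mbit/s) -> survey label
training = [
    ((0.2, 40.0), "satisfied"),
    ((0.3, 35.0), "satisfied"),
    ((2.5, 8.0), "dissatisfied"),
    ((3.0, 5.0), "dissatisfied"),
]

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

centroids = {
    label: centroid([x for x, y in training if y == label])
    for label in ("satisfied", "dissatisfied")
}

def predict(kpis):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda lab: math.dist(kpis, centroids[lab]))

print(predict((0.25, 38.0)))  # satisfied
print(predict((2.8, 6.0)))    # dissatisfied
```

Accuracy in such a system is measured by comparing predictions against held-out survey responses, which is presumably how the roughly 60% figure quoted above is derived.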
A work in progress
Three years ago, when Telefónica first began exploring data analytics for operations, Caro Bernat could not find any suitable commercial solutions on the market. The data analytics tools that he evaluated at the time were generally focused on the marketing and commercial aspects of the telecom business (e.g. identifying upselling prospects). As such, Telefónica had to build its own analytics tools using the R programming language, the big data framework Hadoop, and data visualization software such as Tableau.
More recently, Caro Bernat notes, several startups and established companies have developed analytics solutions for operations in areas such as fault prediction. But rather than invest in new tools, the approach that Telefónica is taking is to open up its anonymized data sets to third parties so that they can propose their own use cases and analyses. Telefónica will then share a portion of any value found with the third party.
Caro Bernat is excited about the potential for further cost savings that can be tapped through Telefónica's own data analytics team, as well as the possibilities to improve the performance of the network and the customer experience. "It is an ongoing learning process," Caro Bernat notes, "but it is already enabling us to resolve issues more quickly and we see techniques such as machine learning as very promising."
Towards extreme network automation
Caro Bernat believes that network automation will become increasingly important as the evolution of radio and network architecture required for massive 5G deployments leads to greater complexity. "The orchestration capabilities need to evolve to handle end-to-end lifecycle management with automatic network configuration," he notes. "This new networking paradigm will require extreme, almost zero-touch automation to be able to cope with large scale 5G deployments between 2021 and 2025. An artificial intelligence layer can support automation by enabling real-time configuration changes at the network and service layers. The ultimate goal is to be able to manage the complexity of future networks while continuing to meet customer needs."
— James Crawshaw, Senior Analyst, Heavy Reading