Colt: Automation's 'Silent Killer' Is Poor Quality Data

Iain Morris
11/2/2017

LONDON -- Automation & the New Carrier Network -- Colt has highlighted the risks of trying to automate systems and processes when dealing with poor quality or inconsistent data that is difficult to interpret without human intervention.

Fahim Sabir, the director of architecture and development for the UK service provider's Colt on Demand business, told attendees at Light Reading's inaugural automation event in London today that data quality has been a massive concern for Colt Technology Services Group Ltd as it has worked on automating parts of its business.

"Data quality is the silent killer… the thing that absolutely hurts," Sabir said during a keynote presentation this morning. "When you are doing a manual task you have to make sure the operational organization documents it in a way in OSS and BSS platforms so that things can be automated. No matter how smart computers get, human beings are smarter. They can take the information and work out what it should actually mean."

Automation for the People
Fahim Sabir, the director of architecture and development for Colt on Demand, says telcos need to convince operational and delivery teams to trust automation.

Sabir's comments draw attention to the dangers of racing to automate complex processes and tasks. In most cases, he says, operators need to work on changing those processes entirely before they start to think about automating systems.

"Automation is not that difficult with the right foundation," he said. "Unless you establish that it becomes hard."

Like other telcos trying to reduce manual effort, Colt sees automation as a way of speeding up its time to market and making resources available for service development instead of fault resolution. As part of Novitas, its overarching network transformation project, it has already built an intelligence engine that takes advantage of so-called robotic process automation (RPA) to reduce the costs it incurs as a result of manual effort.

James Crawshaw, a senior analyst at the Heavy Reading market research business, describes RPA as one of three key aspects of intelligent automation. "It's software that can automate tasks like a macro does in Excel," he said during today's automation event. "Then you have machine learning, which can spot patterns and provide insights, and natural language processing, whereby computers understand human language."
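A toy sketch can make Crawshaw's distinction concrete. The example below is illustrative only -- the data, rules and thresholds are invented -- but it shows the three aspects in miniature: a scripted RPA-style rule, a crude pattern-spotting check standing in for machine learning, and a keyword match standing in for natural language processing:

```python
# Toy illustration of the three aspects of intelligent automation.
# All data, rules and thresholds here are invented for the example.
from statistics import mean, stdev

# 1. RPA: scripted, rule-based handling of a repetitive task (like an Excel macro).
ticket = {"type": "port_down", "site": "LON-01"}
if ticket["type"] == "port_down":
    action = f"run diagnostics on {ticket['site']} and reseat the port"

# 2. Machine learning (in spirit): spotting an outlier pattern in a metric series.
latencies_ms = [4.1, 4.3, 4.0, 4.2, 9.8, 4.1]
mu, sigma = mean(latencies_ms), stdev(latencies_ms)
anomalies = [x for x in latencies_ms if abs(x - mu) > 2 * sigma]

# 3. Natural language processing (crudely): mapping free text to an intent.
note = "Customer reports intermittent packet loss since Tuesday"
intent = "performance_issue" if "packet loss" in note.lower() else "other"

print(action, anomalies, intent, sep="\n")
```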

While it remains early days for machine learning and natural language processing, Colt has also taken steps in the artificial intelligence area through a research project called Sentio. That initiative could help it to predict when network elements are about to expire, and allow it to reduce costs through more efficient network planning, according to Sabir. (See Colt Preps AI-Enabled Network Management.)
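Colt has not detailed how Sentio works, but the general shape of such a prediction is straightforward to sketch. The example below is hypothetical -- synthetic data, invented features and a generic scikit-learn model rather than anything Colt has described -- and simply scores how likely a network element is to fail soon based on basic telemetry:

```python
# Hypothetical sketch of the kind of prediction a project like Sentio might make.
# Features, data and model are invented; Colt's actual approach is not public.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training data: columns are [age_years, error_rate, temperature_c].
X = rng.normal(loc=[5.0, 0.01, 40.0], scale=[3.0, 0.01, 5.0], size=(200, 3))
# Label an element "likely to fail" when it is old and error-prone, or running hot.
y = ((X[:, 0] > 7) & (X[:, 1] > 0.015) | (X[:, 2] > 48)).astype(int)

model = LogisticRegression().fit(X, y)

# Score a live element and flag it for proactive replacement if the risk is high.
element = np.array([[9.0, 0.03, 50.0]])
risk = model.predict_proba(element)[0, 1]
print(f"failure risk: {risk:.0%}", "-> schedule replacement" if risk > 0.5 else "")
```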


Colt has already been able to realize some of the benefits of automation through process changes, but Sabir says the operator has inevitably encountered some resistance among operations staff affected by the overhaul.

"Operational teams have been working for decades in a manual way and now we are saying you have to trust this engine to do the job you were doing manually," he explains. "We have had a big fight to convince the operations organization that the engine works and they can trust it."

Colt is not alone in arguing that automation could free up staff resources for development purposes, but concern has grown that automation could lead to widespread job losses in the next few years. (See 'Brutal' Automation & the Looming Workforce Cull and The Revolution Will Be Automated.)

"The more [operators] can automate the less people they will need," said Crawshaw. "But the staff that keep jobs will do more interesting stuff."

While industry executives often talk about the need to retrain existing telco staff as networks become more software-based, Sabir reckons that in-house programmers already have the requisite skills -- in many cases -- but spend most of their time on fixing faults rather than service development.

Asked about the impact of automation on jobs within Colt's operations organization, Sabir said: "For employees in provisioning activities and soft provisioning, absolutely the idea is to reduce costs in that area and that is an accepted reality."

However, Colt's operations workforce mainly comprises field technicians whose job is to install equipment, and Sabir does not expect automation to have much impact on these individuals.

— Iain Morris, News Editor, Light Reading

Comments
Carol Wilson
11/3/2017 | 12:37:37 AM
Re: Trexit
Agreed, it is an ongoing process to keep databases accurate. 
brooks7
11/2/2017 | 6:32:40 PM
Re: Trexit
It is not A CLEANUP. It is an ongoing cleanup process. You can expect bad data to creep in all the time. The longer it goes without being reviewed and reconciled, the more bad data there is.
seven
Carol Wilson
11/2/2017 | 6:06:54 PM
Re: Trexit
I would say Colt is definitely working on cleaning up data but I don't think Big Data/Analytics helps with that process - I could be misinformed but it seems the old garbage in/garbage out rules apply. 
mendyk
11/2/2017 | 4:07:12 PM
Re: Trexit
I don't know -- it sounds like Colt is serious about automation, and that it understands that "legacy data" needs to be cleaned up. But shouldn't that cleanup be happening anyway? And wouldn't BDA help in that process? Maybe the conflation is more in the story than in what was said. 
Carol Wilson
11/2/2017 | 3:52:19 PM
Re: Trexit
I don't think he's conflating anything. I think the data to which he is referring is the network and customer data housed in legacy databases - things like which lines support which customers and which ports support which services.

Every operator I know has said the same thing - that getting to on-demand services and automating service provisioning required them to go back and methodically clean up their data. 

I could be wrong, but I think he's talking about what Larissa Herda was talking about so long ago in this story:

 http://www.lightreading.com/spit-(service-provider-it)/customer-experience-management-(cem)/doing-the-dirty-work-pays-off/d/d-id/708358

If you automate without cleaning up your data, you just speed up the pace at which errors occur. 
mendyk
11/2/2017 | 2:40:33 PM
Re: Trexit
Got it. Mr. Sabir seemed to conflate two things -- data quality and the potential for autonomous systems (as in, artificial intelligence and machine learning). The whole point of "big data analytics" is that processors can comb through massive amounts of information and separate the good from the bad. That's only one step toward true autonomous systems, but an important one. If his argument is that, today, humans are better able to handle decision-making than computers, he's absolutely correct simply because AI is still in fairly early-stage development. A decade from now, his argument will hold a lot less water.
Phil_Britt
11/2/2017 | 2:00:58 PM
Re: Trexit
No.

But the initial algorithms, etc., going into the computer in the first place have to be good.

Case in point: In one of those survey sweepstakes in which you got a trinket for filling out the form, I entered my name as John Jacob Jingleheimer Schmidt. A year later, I received junk mail for John Jacob Jingleheimer Schmidt (but at least his name was my name too :).
mendyk
11/2/2017 | 1:37:06 PM
Re: Trexit
Are you saying that people are better at discriminating between good data and bad data than computers are likely to be?
Phil_Britt
11/2/2017 | 1:34:57 PM
Re: Trexit
But the old adage of garbage in, garbage out has never been truer. If processes or data input is bad, computers only allow us to do wrong things (or come up with wrong answers) much faster than before.
mendyk
11/2/2017 | 1:32:05 PM
Trexit
"No matter how smart computers get, human beings are smarter." But these smarter human beings are now working feverishly to create computers that some day will be infinitely smarter than those human beings could ever hope to be.  