How to Prevent Your Data Lake From Turning Into a Swamp

James Crawshaw

In their book Building the Network of the Future, Mazin Gilbert and Mark Austin of AT&T describe the big data framework that the operator has adopted to process the 118 petabytes of data that pass through its networks each day (as of 2016).

The operator not only tracks the payload of data that traverses its networks but also captures and stores, for later analysis, myriad data from user devices, radio access infrastructure, core network elements (such as XDRs), Internet cloud infrastructure (for example, CDN logs) as well as the application data itself (such as website logs). Much of this data is stored in a Hadoop-based system running on common, off-the-shelf hardware. Feeding the Hadoop distributed file system is a data ingestion engine based on open source tools such as Kafka, Flume and Sqoop. Sitting on top of Hadoop (figuratively) are modules for analytics (Spark), batch processing (MapReduce), search (Solr) and NoSQL storage (e.g., MongoDB, Cassandra).
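To make the batch-processing layer concrete, here is a toy MapReduce-style job in plain Python that counts HTTP status codes in CDN log lines. The log format and field layout are hypothetical; in a real deployment a job like this would run as a distributed MapReduce or Spark job over HDFS, not on a single machine.

```python
# Toy illustration of the map/reduce pattern used by Hadoop's batch layer.
# Log line format here is an assumption: "<timestamp> <url> <status_code>".
from collections import defaultdict

def map_phase(log_lines):
    """Emit (status_code, 1) pairs from each log line."""
    for line in log_lines:
        _, _, status = line.split()
        yield status, 1

def reduce_phase(pairs):
    """Sum the counts per key, as a MapReduce reducer would."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

logs = [
    "2016-01-01T00:00:00 /video/1 200",
    "2016-01-01T00:00:01 /video/2 404",
    "2016-01-01T00:00:02 /video/1 200",
]
status_counts = reduce_phase(map_phase(logs))
# status_counts == {"200": 2, "404": 1}
```

The point of the pattern is that the map and reduce phases are independently parallelizable, which is what lets Hadoop spread jobs like this across commodity hardware.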

While all this open source technology looks fantastically fun, operators should step back for a second and ask themselves whether filling huge data lakes with streaming telemetry about network paths, traffic flows and performance will provide a valuable resource for analytics or simply rack up a rather large bill for storage infrastructure (albeit commodity hardware). After all, the key point of the exercise is to unearth valuable insights from the data that enable them to improve the business, such as faster root cause analysis, reduced mean time to repair or earlier detection of security threats. Might they be better off applying a coarser filter to the data they collect, focusing on the metrics that are likely to have a material impact on performance? Judgement calls about which data is worth keeping require networking expertise that may be lacking in the IT development team tasked with building the data analytics platforms.
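A coarser filter of the kind suggested above could be as simple as a whitelist applied before ingestion. The sketch below is a minimal illustration of the idea; the metric names are hypothetical, and in practice the whitelist would encode exactly the networking judgement calls the paragraph describes.

```python
# Minimal sketch of a coarse pre-ingestion filter: keep only the metrics
# judged material to performance, dropping the rest before they hit storage.
# Metric names below are assumptions for illustration, not a real schema.
MATERIAL_METRICS = {"packet_loss_pct", "latency_ms", "cpu_util_pct"}

def coarse_filter(samples):
    """Retain only whitelisted metrics from a stream of telemetry samples."""
    return [s for s in samples if s["metric"] in MATERIAL_METRICS]

samples = [
    {"metric": "latency_ms", "value": 42.0},
    {"metric": "fan_speed_rpm", "value": 5200},  # judged immaterial; dropped
    {"metric": "packet_loss_pct", "value": 0.3},
]
kept = coarse_filter(samples)
# kept contains only the latency and packet-loss samples
```

The hard part, of course, is not the code but deciding what belongs in the whitelist, which is precisely where networking expertise comes in.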

How will service providers enable automated and efficient network operations to support NFV & SDN? Find the answers at Light Reading's Software-Defined Operations & the Autonomous Network event in London, November 7-8. Take advantage of this opportunity to learn from and network with industry experts – communications service providers get in free!

As this article notes: "The best strategy for data lakes is to only collect data that is useful now. Data loses its value over time and if you can’t find what you’re looking for in the mess that is the data swamp, it's pointless to keep adding to it. Projects should only go after sources that can provide useful solutions to clearly defined business problems."

To find out more about data collection best practices and what to do with the data once you have decided to store it (standard correlations, sophisticated machine learning algorithms, etc.), join us at Software-Defined Operations & the Autonomous Network event in London, November 7-8 for the panel Zero Touch Analytics – Delivering Insights In Real Time.

Operators want analytics tools to provide them with tangible insights: findings that are actionable, concrete and palpable. At the same time, they want these systems to be highly automated, employ artificial intelligence and be zero-touch. So palpable and zero-touch at the same time -- quite a challenge. I'll be discussing this, and more, with speakers from Atrinet, Netcracker and Telefonica.

— James Crawshaw, Senior Analyst, Heavy Reading
