December 21, 2017
We have all seen our networks slip out of our direct control as they extend beyond the traditional walled perimeter. Today, we not only have our own private data center on-premises, we have applications in various public clouds, we use outside SaaS services, and we move information on and off mobile devices. The result is a large amount of distributed data that needs to be monitored.
Our networks also continue to surge in size and complexity, extending far and wide, including third-party devices that are often not managed with the same vigilance we have in our traditional walled network. This all adds up to a massive attack surface.
Consistent monitoring is necessary to prevent intrusions and breaches, especially in the cloud. We hear a lot about having full visibility into our data and applications, but what does that mean? All too often we settle for whatever visibility options are offered, allowing our data monitoring -- and by association, our entire business -- to cruise on autopilot. We also rely on automation to eliminate "human error" in configuration, but those automation capabilities often do not deliver full visibility. Automation makes deployment faster, but network architects and administrators need to be fully aware of what those automated functions are and are not doing. Securing a network requires knowing where data is stored -- including in the cloud -- and knowing when it is in motion.
Even modern aircraft pilots need to learn how to control the aircraft without the aid of an autopilot. They trust but verify. Much like the pilot who can take over the plane’s autopilot in an emergency, your business needs to be aware of what is on the network and what the network is doing. This requires consistent visibility across physical networks, as well as public and private cloud environments.
Too much defense, not enough offense
Global forecasts predict data consumption will grow 2.5x between 2015 and 2020, to 25 gigabytes per capita per month in 2020, up from 10 gigabytes per capita in 2015. Analysts also predict that by 2020, there will be more than 20 billion connected devices. That translates into a great deal of data coming from -- and going to -- many more ingress/egress points, which makes the network more complex and difficult to secure. The new network has extended its reach so far beyond the perimeter that the actual edge of the network is no longer in anyone's individual control.
This network complexity translates into a complicated mind map for the security architect and CSO. The area they have to be concerned with -- and the number of tools they need to manage for that area -- is enormous. To address potential threats inside and outside the perimeter, they need to train team members on a wide range of security monitoring and data compliance tools, which opens the door to errors in how the network itself is architected. More and more connections need to access monitoring tools, and most of those connections originate in cloud environments, where access to scalable monitoring resources is not a given.
Visibility into data leakage
Monitoring for data leakage is one of the biggest security objectives, but leakage is not always simple to detect. What appears to be normal, legitimate traffic movement in one part of the network might easily be redirected to another part, and ultimately to a cybercriminal's destination. This is exactly what happened to Sony in 2014: the company failed to recognize it was being hacked because of how the data exfiltration was being routed.
Monitoring data flows requires the ability to identify and track flows by application type, as well as to monitor and track threats. That demands continuous, real-time analysis of data feeds, with application and threat intelligence applied at the data level. In other words, data from all sources needs to be monitored: visibility into anomalous user activity and into sensitive data cannot be isolated to just parts of the network.
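As a rough illustration of the idea -- tracking flows by application type and flagging anomalous egress volume -- here is a minimal sketch in Python. The `FlowRecord` fields, the per-application byte thresholds, and the alert format are all hypothetical choices for this example, not any vendor's API; a production system would do this continuously on live flow telemetry rather than in-process.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FlowRecord:
    src: str        # source address
    dst: str        # destination address
    app: str        # application type, e.g. "dns", "https" (hypothetical labels)
    bytes_out: int  # egress bytes observed for this flow

class FlowMonitor:
    """Accumulates per-(application, destination) egress volume and flags
    totals that exceed a baseline threshold -- a toy stand-in for the
    real-time, application-aware analysis described above."""

    def __init__(self, thresholds):
        self.thresholds = thresholds     # app -> max bytes per window (assumed baselines)
        self.totals = defaultdict(int)   # (app, dst) -> cumulative bytes seen

    def observe(self, flow):
        key = (flow.app, flow.dst)
        self.totals[key] += flow.bytes_out
        limit = self.thresholds.get(flow.app)
        if limit is not None and self.totals[key] > limit:
            return f"ALERT: {flow.app} traffic to {flow.dst} exceeded {limit} bytes"
        return None

# Example: two modest DNS flows to the same destination that together
# cross the baseline -- the kind of low-and-slow exfiltration pattern
# that per-flow inspection alone would miss.
monitor = FlowMonitor(thresholds={"dns": 50_000})
alerts = []
for f in [FlowRecord("10.0.0.5", "203.0.113.9", "dns", 30_000),
          FlowRecord("10.0.0.5", "203.0.113.9", "dns", 40_000)]:
    alert = monitor.observe(f)
    if alert:
        alerts.append(alert)
# Only the second flow trips the alert, because the cumulative total
# (70,000 bytes) passes the 50,000-byte baseline.
```

The point of the sketch is that detection hinges on aggregating across flows and sources; any single flow here looks legitimate on its own.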
Monitoring in the public cloud
It is not all doom and gloom. The good news is that the new network has sparked a shift in the way enterprises handle monitoring and visibility. This shift becomes increasingly important as companies transition to the public cloud.
If you had asked IT professionals last year what their biggest headache was with the cloud, they would have said migration. Now, their top concerns are data privacy and compliance, securing the network, and achieving full data visibility. In other words, their concerns have shifted from migration to operation. What's more, over 93 percent of IT professionals worldwide are concerned about maintaining data and application security in public and private cloud environments. While initial public cloud monitoring options were limited, those limits are expanding as cloud providers like AWS introduce competency programs through which third-party visibility solutions are available. Just make sure the visibility solution you pick auto-scales without needing constant reconfiguration.
The new network is complex, and automation will help you manage it. While there are myriad ways to architect and manage your network, your future security will rely on how complete your visibility is, coupled with how easy it is to manage. You need a complete picture of your network -- today and into the future.
— Jeff Harris, CMO, Ixia