
4 Tips for Maintaining Control in the Cloud

Jeff Harris
10/9/2017

It is no secret that the cloud is fundamentally changing the way businesses operate. Critical applications that were traditionally kept on-premises, such as CRM and ERP, now run on public infrastructure that can be scaled out and back in on demand as needs ebb and flow. The cloud enables great efficiencies and growth -- two essential factors for ongoing business success.

But what the cloud adds in cost efficiency and scalability, it lacks in touch control. Most enterprises build hyperconverged cloud environments that span multiple cloud-provider platforms and their own data center infrastructure. This is smart: it provides the nimbleness these enterprises need to meet different service levels within the organization. The flip side is inconsistent policy enforcement, as each provider has a different internal architecture and service-level agreement (SLA).

Visibility into the various cloud environments would provide a certain level of control, but providers -- all of which manage multi-tenant infrastructures -- limit access to packet data to protect their customers' proprietary information. And even if enterprises could get the level of physical access needed to enforce policies, using it would be extremely complicated: the cloud's distributed architecture pools resources, and managing each provider individually while piecing together some sort of central control would be hugely time-consuming.

As enterprises' hyperconverged cloud environments grow, holistic performance monitoring, testing and security take on more importance. Enterprises need this whether they use Salesforce or NetSuite, or run workloads on AWS, Google Cloud or Azure. That breadth and depth of reach is what makes touch control over hyperconverged cloud infrastructure possible.

Here are four tips for regaining touch control over your hyperconverged cloud environments:

1. Access monitoring data in the cloud
Gaining safe access to your cloud packet data is the first step, and that means having the ability to tap data in public or private cloud environments. This cannot be achieved with basic software agents that spit out everything. It requires intelligent sensors or virtual taps capable of extracting only the data you need to monitor and secure -- and they need to scale automatically, without operator configuration, reconfiguration or intervention.
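
To make the idea concrete, here is a minimal virtual-tap sketch in Python using the Scapy library (assuming it is installed and the process has capture privileges). The BPF filter and the collector endpoint are illustrative assumptions, not any vendor's product:

# Capture only the traffic of interest and forward lightweight metadata,
# rather than spitting out every packet.
from scapy.all import sniff, IP
import json, socket

COLLECTOR = ("monitoring.example.internal", 5140)  # hypothetical endpoint
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def forward(pkt):
    # Extract just the metadata needed for monitoring; drop payloads.
    if IP in pkt:
        record = {"src": pkt[IP].src, "dst": pkt[IP].dst,
                  "proto": pkt[IP].proto, "len": len(pkt)}
        sock.sendto(json.dumps(record).encode(), COLLECTOR)

# The BPF filter limits capture to HTTPS instead of everything on the wire.
sniff(filter="tcp port 443", prn=forward, store=False)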

2. Feed data to monitoring and testing systems
Data accessed from the cloud then needs to run through a highly intelligent network packet broker, or a virtual machine (VM) acting as a packet broker, where it can be deduplicated and routed to the various monitoring, compliance, analytics and security appliances. Alternatively, there should be enough flexibility for sensors or virtual machines to send filtered data directly to cloud-based or cloud-compatible security, monitoring and analytics tools.
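
As a rough illustration of the two jobs a packet broker performs -- deduplication and rule-based distribution -- here is a toy sketch in plain Python; the tool names and port-based rules are hypothetical:

import hashlib

seen = set()
TOOLS = {"security": [], "analytics": [], "compliance": []}  # stand-in sinks

def broker(packet_bytes, dest_port):
    # Deduplicate: the same packet tapped at multiple points is sent once.
    digest = hashlib.sha256(packet_bytes).hexdigest()
    if digest in seen:
        return
    seen.add(digest)
    # Route: simple rules decide which tools receive a copy.
    if dest_port in (80, 443):
        TOOLS["security"].append(packet_bytes)
    TOOLS["analytics"].append(packet_bytes)

broker(b"example-payload", 443)  # lands in both security and analytics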

3. Use intelligence to parse data appropriately
This is what allows data to be distributed to appliances and systems logically. For example, most enterprises run all network traffic through a firewall, but a university that gets 30% of its incoming traffic from students streaming video may decide it is safe enough to exempt Netflix data from certain security policies. Intelligence at the network packet level, in both the data center and the cloud, makes this a choice rather than a constraint.
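
That university example might look something like the following in a rule engine. This is a hedged sketch: the domain list, flow fields and tool-chain names are assumptions for illustration only:

# Decide which inspection chain a flow traverses, based on its destination.
EXEMPT_DOMAINS = ("netflix.com", "nflxvideo.net")  # streaming-video traffic

def inspection_chain(flow):
    domain = flow.get("sni", "")
    if domain.endswith(EXEMPT_DOMAINS):
        return ["performance_monitor"]            # skip deep inspection
    return ["firewall", "ids", "performance_monitor"]

print(inspection_chain({"sni": "occ-0-1.1.nflxvideo.net"}))  # monitor only
print(inspection_chain({"sni": "portal.example.edu"}))       # full chain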

4. Test like you are already in the cloud
Most cloud providers offer some level of enterprise configuration, performance monitoring and basic performance testing, but are they enough? Enterprises can spin up home-grown tests and attempt to emulate traffic prior to launch, but how insightful is that? The more accurate and realistic the test environment and design, the more accurately enterprises can predict what will happen once real customer data hits their hyperconverged cloud network. With customer expectations of instant availability, flawless performance and ironclad security rising every year, enterprises need to test, test, test and test again to ensure consistent, quality user experiences. This is even more critical for applications running on infrastructure over which enterprises have limited control.
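
Even a home-grown test benefits from measuring what users actually feel. Here is a bare-bones load-emulation sketch using only the Python standard library; the target URL and load figures are assumptions, and a realistic pre-launch test would model production traffic mixes and geographies:

import time, urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "https://app.example.internal/health"  # hypothetical endpoint
REQUESTS, WORKERS = 200, 20

def timed_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET, timeout=5) as resp:
        resp.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    latencies = sorted(pool.map(timed_request, range(REQUESTS)))

# Report the tail, not just the average -- the tail is what users feel.
print(f"p50: {latencies[len(latencies) // 2]:.3f}s")
print(f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")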

No question, the cloud provides flexibility, cost-efficiency and scalability, but enterprises still need control over performance and security to ensure a consistent, quality user experience. Better visibility into hyperconverged cloud environments, driven by smart data access and intelligence at the network packet level, can deliver control over both, so enterprises can better meet rising user expectations.

— Jeff Harris, CMO, Ixia Solutions Group, a Keysight Business
