
4 Tips for Maintaining Control in the Cloud

Jeff Harris
10/9/2017

It is no secret that the cloud is fundamentally changing the way businesses operate. Critical applications that were traditionally run on-premises, such as CRM and ERP, now run on public infrastructure that can be scaled out and back in on demand as needs ebb and flow. The cloud enables great efficiencies and growth -- two essential factors for ongoing business success.

But what the cloud adds in cost efficiency and scalability, it takes away in touch control. Most enterprises build hyperconverged cloud environments that span multiple cloud-provider platforms as well as their own data center infrastructure. This is smart: it provides the nimbleness to meet different service levels within the organization. The flip side is inconsistent policy enforcement, as each provider has different internal architectures and service-level agreements (SLAs).

Visibility into the various cloud environments would provide a certain level of control, but providers -- all of which manage multi-tenant infrastructures -- limit access to packet data to protect their customers' proprietary information. Even if enterprises could get the physical access they needed to enforce policies, doing so would be extremely complicated: the cloud has a distributed architecture in which resources are pooled, and managing each provider individually while piecing together some form of central control would be time-consuming.

As enterprises' hyperconverged cloud environments grow, holistic performance monitoring, testing and security take on more importance. Enterprises need this whether they use SaaS applications such as Salesforce or NetSuite, or run workloads on AWS, Google Cloud or Azure. That breadth and depth of reach is necessary to regain touch control over hyperconverged cloud infrastructures.

Here are four tips for regaining touch control over your hyperconverged cloud environments:

1. Access monitoring data in the cloud
Gaining safe access to your cloud packet data is the first step, and that means having the ability to tap data in public or private cloud environments. This cannot be achieved with basic software agents that spit out everything. It requires intelligent sensors or virtual taps capable of extracting only the data you need to monitor and secure -- and they need to scale automatically, without operator configuration, reconfiguration or intervention.
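The core idea behind such a virtual tap -- mirror only the traffic the monitoring policy selects, rather than everything -- can be sketched as below. This is an illustrative toy, not any vendor's product; the `Packet` fields, the monitored ports and the policy itself are all assumptions for the example.

```python
# Hypothetical sketch of a virtual tap's filtering logic: instead of
# forwarding every packet, it mirrors only flows matching the monitoring
# policy, keeping the volume of data sent out of the cloud small.
from dataclasses import dataclass

@dataclass
class Packet:
    src: str
    dst: str
    dst_port: int
    payload_len: int

# Illustrative policy: only tap traffic to the services we monitor
# (HTTPS and an assumed database port).
MONITORED_PORTS = {443, 1433}

def tap_filter(packet: Packet) -> bool:
    """Return True if the packet should be mirrored to monitoring tools."""
    return packet.dst_port in MONITORED_PORTS

def mirror(packets):
    """Keep only the packets the policy selects; drop the rest at the source."""
    return [p for p in packets if tap_filter(p)]
```

Filtering at the tap, before data leaves the provider's network, is what keeps the approach workable at cloud scale.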

2. Feed data to monitoring and testing systems
Data accessed from the cloud then needs to run through a highly intelligent network packet broker, or a virtual machine (VM) acting as a packet broker, where it can be deduped and routed to the various monitoring, compliance, analytics and security appliances. Alternatively, there should be enough flexibility for sensors or virtual machines to send filtered data directly to cloud-based or cloud-compatible security, monitoring and analytics tools.
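The dedupe-and-route role described above can be sketched in a few lines. Again, this is a toy model, not any broker's actual implementation; the tool names and their selection predicates are assumptions made for the example.

```python
# Sketch of a packet broker's two jobs: drop duplicates (the same packet
# often arrives from multiple tap points), then fan each unique packet
# out to every tool whose predicate selects it.
import hashlib

def packet_key(raw: bytes) -> str:
    """Identity of a packet for dedup purposes: a hash of its bytes."""
    return hashlib.sha256(raw).hexdigest()

def broker(packets, tools):
    """Dedupe `packets`, then route each unique one to subscribed tools.
    `tools` maps a tool name to a predicate choosing its traffic."""
    seen = set()
    routed = {name: [] for name in tools}
    for raw in packets:
        key = packet_key(raw)
        if key in seen:          # duplicate seen from another tap point
            continue
        seen.add(key)
        for name, wants in tools.items():
            if wants(raw):
                routed[name].append(raw)
    return routed
```

Deduplication matters because every duplicate a tool receives inflates its load and can skew its analytics.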

3. Use intelligence to parse data appropriately
This is what allows data to be distributed to appliances and systems logically. For example, most enterprises run all network traffic through a firewall, but a university that gets 30% of its incoming traffic from students streaming video may decide it is safe enough to exempt Netflix data from certain security policies. Intelligence at the network packet level, both in the data center and in the cloud, makes this a choice.
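The university example might look like the following as a routing rule. This is a minimal sketch under assumed conventions: the exempt-domain list, the use of a flow's SNI field for classification, and the path names are all hypothetical.

```python
# Sketch of policy-based traffic parsing: flows from an assumed allow-list
# of streaming services bypass deep inspection, everything else takes the
# full firewall path.
STREAMING_EXEMPT = ("netflix.com",)    # hypothetical allow-list

def next_hop(flow: dict) -> str:
    """Decide the processing path for a flow based on the policy."""
    if flow.get("sni", "").endswith(STREAMING_EXEMPT):
        return "fast-path"             # exempt from certain security policies
    return "firewall"                  # default: full inspection
```

The point is that the decision is made per flow, in the packet path, rather than being a blanket rule for the whole network.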

4. Test like you are already in the cloud
Most cloud providers offer some level of enterprise configuration, performance monitoring and basic performance testing, but are they enough? Enterprises can spin up home-grown tests and attempt to emulate traffic prior to launch, but how insightful is that? The more accurate and realistic the test environment and design, the more accurately enterprises can predict what will happen once real customer data is on their hyperconverged cloud network. With customer expectations of instant availability, flawless performance and ironclad security rising every year, enterprises need to test, test, test and test again to ensure consistent, quality user experiences. This is even more critical for applications running on infrastructures over which enterprises have limited control.
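At its simplest, "testing like you are already in the cloud" means driving the service with emulated traffic and measuring the latencies users would actually feel. The sketch below shows that skeleton only; the handler, request model and percentile choices are assumptions, and a realistic test harness would emulate real traffic mixes rather than a toy workload.

```python
# Toy load-emulation harness: drive a service handler with synthetic
# requests and report the latency percentiles users would experience.
import random

def emulate_load(handler, n_requests=1000, seed=0):
    """Call `handler` once per emulated request and summarize latencies.
    `handler` takes a request parameter and returns a latency value."""
    rng = random.Random(seed)          # seeded for repeatable test runs
    latencies = sorted(handler(rng.random()) for _ in range(n_requests))
    return {
        "p50": latencies[int(0.50 * n_requests)],
        "p95": latencies[int(0.95 * n_requests)],
        "max": latencies[-1],
    }
```

Reporting percentiles rather than averages is the design choice that matters here: the p95 and worst-case numbers are what determine whether users perceive the service as fast.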

No question, the cloud provides flexibility, cost efficiency and scalability, but enterprises still need control over performance and security to ensure a consistent, quality user experience. Better visibility into hyperconverged cloud environments -- driven by smart data access and intelligence at the network packet level -- can deliver that control, so enterprises can better meet user expectations.

— Jeff Harris, CMO, Ixia Solutions Group, a Keysight Business
