Content owners are launching new products that lean heavily on large volumes of data to deliver the best possible experience. Of course, this data usage brings heightened responsibility. After all, there have been enough high-profile hacks lately to highlight the liability, embarrassment and privacy exposure that are at stake when things go wrong.
That is what makes this such an interesting time for CIOs. They are juggling management of precious subscriber data and a relatively new environment -- the cloud. While it may be tempting to try leveraging existing on-premises operational processes and security policies to meet new needs, cracks are bound to emerge in this scenario.
There is a balance to be struck between relying on trusted processes and security policies and achieving operational agility, especially when subscriber data is at stake. Both objectives can be achieved with a security strategy that re-examines the fundamentals: user access, data security and network security.
Take a multi-tiered approach to user roles and access
Data center access security has typically evolved over extended periods, with multiple parts of an organization establishing individual access silos. Cloud deployments present the opportunity to take a more centralized approach, with a single security group defining the roles, responsibilities and access controls for each group and its individual members.
IBB Consulting recommends a multi-tiered approach that separates individual user identities from groups, with authority applied at the group level. This is seemingly obvious but critical to the foundation of a good access policy. Always use a strong password policy with password rotation, and require multifactor authentication for "root" account users with billing control. Finally, keys allowing access to cloud resources should be created for a limited set of users, with "user roles" assigning the authority to use them. This ensures that teams will not share passwords just to get something done. All user access must start with minimal privileges that are expanded only as needed.
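As a sketch of the layered model above, the following shows group-level authority, an MFA requirement for billing-control accounts, and minimal default privileges. The group names, permission strings and MFA rule are hypothetical illustrations, not any specific cloud provider's IAM API:

```python
from dataclasses import dataclass, field

# Illustrative permission sets: group names and privileges are hypothetical.
GROUP_PERMISSIONS = {
    "billing-admins": {"billing:read", "billing:write"},  # "root"-level access
    "operations":     {"compute:read", "compute:restart"},
    "analysts":       {"data:read"},
}

# Multifactor authentication is mandatory for accounts with billing control.
MFA_REQUIRED = {"billing-admins"}

@dataclass
class User:
    name: str
    groups: set = field(default_factory=set)  # authority comes from groups
    mfa_enabled: bool = False

def permissions(user: User) -> set:
    """Union of group-level permissions; users start with no direct grants."""
    perms = set()
    for group in user.groups:
        if group in MFA_REQUIRED and not user.mfa_enabled:
            continue  # deny elevated access until MFA is actually enabled
        perms |= GROUP_PERMISSIONS.get(group, set())
    return perms
```

Because a new `User` carries no groups by default, every account begins with zero privileges and gains authority only through explicit group membership, mirroring the minimal-privileges rule above.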
Implement data security policy
Any data security approach should include policy for different types of data, from simple "dummy" and anonymous user behavioral data all the way up to Payment Card Industry (PCI) and Personally Identifiable Information (PII) data. Collaboration with legal departments is vital, especially so that engineering can explain the nuances between different approaches and head off a "lowest common denominator" approach where all data is treated as the highest risk. A tiered policy ensures that heavier processes and stricter, but often more cumbersome, security controls are applied only when truly required. For the best agility, organizations cannot take a one-size-fits-all approach.
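The tiered policy described above can be sketched as a simple classification-to-controls lookup. The tier names and control sets below are illustrative assumptions, not a standard; the one deliberate choice is that unknown data defaults to the strictest tier rather than the loosest:

```python
# Hypothetical classification tiers mapped to required controls, from
# anonymous behavioral data up to PCI/PII. Names and controls are illustrative.
CONTROLS_BY_CLASS = {
    "anonymous": {"encrypt_in_transit"},
    "internal":  {"encrypt_in_transit", "access_logging"},
    "pii":       {"encrypt_in_transit", "encrypt_at_rest",
                  "access_logging", "audit"},
    "pci":       {"encrypt_in_transit", "encrypt_at_rest",
                  "access_logging", "audit", "tokenization"},
}

def required_controls(classification: str) -> set:
    """Look up the controls a data class needs, instead of treating
    everything as highest-risk."""
    try:
        return CONTROLS_BY_CLASS[classification]
    except KeyError:
        # Unclassified data falls back to the strictest tier, not the loosest.
        return CONTROLS_BY_CLASS["pci"]
```

Keeping the mapping explicit gives legal and engineering a shared artifact to review, so the "lowest common denominator" burden is confined to the data that genuinely warrants it.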
Another consideration is the burden or cost of applying extra security measures. For example, while encrypting data at rest often protects against only extreme and rare security compromises, the feature can typically be enabled transparently and with minimal performance impact. The same is typically true for data in transit; good turnkey options are available from most cloud providers. In this case, however, transport attack vectors are many, so encryption and strong authentication are nearly always recommended.
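As a minimal illustration of how turnkey transport protection can be, the sketch below configures a strict TLS client context with Python's standard `ssl` module. The specific settings chosen are an assumption about what "encryption and strong authentication" should mean; cloud provider SDKs generally apply comparable defaults:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that encrypts data in transit and
    strongly authenticates the server."""
    ctx = ssl.create_default_context()            # verifies certificates by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    ctx.check_hostname = True                     # bind the cert to the hostname
    return ctx
```

A few lines like these are the entire cost of securing the transport path, which is why the recommendation above is nearly unconditional.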
Use access control lists to improve network security
Most cloud service providers offer a mechanism to design a private cloud and connect it via VPN to your own data center. This provides a high level of security for data and applications, though it introduces a weak point where the internal and cloud networks are connected. This level of security is typically adopted by organizations moving highly sensitive data to the cloud for solution development and insights. The downside is that this approach can also stifle development and operational agility, because access is constrained to on-site only or via a remote desktop solution.
A lighter option that maintains strong network security is to leverage access control lists (ACLs) that allow only certain IP addresses to access resources in the cloud. For example, ACLs can be applied to a cloud machine so that only the IT group, from a certain range of IPs, can access it. This approach enhances business agility while introducing no new risk to on-premises data centers, because it does not allow the cloud to access on-premises resources and data.
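The ACL pattern above can be sketched with Python's standard `ipaddress` module. The CIDR ranges below are hypothetical placeholders drawn from documentation address space, not real office networks:

```python
import ipaddress

# Illustrative ACL: the ranges below are hypothetical IT-group subnets.
ALLOWED_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),  # IT group's corporate egress range
    ipaddress.ip_network("203.0.113.8/29"),   # VPN concentrator addresses
]

def is_allowed(client_ip: str) -> bool:
    """Permit access only when the source address falls in an allowed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in ALLOWED_RANGES)
```

In practice these rules live in the cloud provider's security groups or network ACLs rather than application code, but the logic is the same: deny by default, then allow specific source ranges.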
Putting it all together
Ultimately, all cloud components should be tested independently for security vulnerabilities, and a single organizational group should be responsible for performing security audits and enforcing policy. Well-designed cloud architectures will also continuously measure the health of all resources, providing a mechanism to take advantage of the elasticity the cloud offers. This also enables cost control and alerting to make sure the savings available from high resource utilization are truly realized.
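A hypothetical threshold-based version of such an elasticity mechanism might look like the following; the utilization bounds and the CPU-only health signal are simplifying assumptions:

```python
def scaling_decision(cpu_utilization: float,
                     low: float = 0.30, high: float = 0.75) -> str:
    """Decide whether to add or remove capacity based on measured health.
    Keeping utilization inside the band realizes the cost savings of
    running resources hot without starving the workload."""
    if cpu_utilization > high:
        return "scale_out"  # add capacity before performance degrades
    if cpu_utilization < low:
        return "scale_in"   # shed idle capacity to control cost
    return "hold"
```

Real autoscaling policies add cooldown periods and multiple signals, but even this sketch shows how continuous health measurement feeds both elasticity and cost alerting.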
Rethinking processes and policies when moving to the cloud can ensure strong security while delivering the cloud's agility benefits. Luckily, big results do not require big investments. Strategies do, however, need to be focused and deliberate when adapting to this new world. After all, subscribers demand it -- whether they know it or not.
— Prateek Duble, Consulting Manager, and Miles Johnson, Principal, IBB Consulting Group