Container technology is becoming increasingly important to major telecom service providers as they move to become more application-focused and to capture the full benefits of virtualization by shifting more of the overhead into the cloud.
The growing importance of this technology is why AT&T Inc. (NYSE: T) and Verizon Communications Inc. (NYSE: VZ) are among the 14 companies joining the Open Container Initiative, the Linux Foundation and Docker Inc. project launched earlier this year, and why AT&T is also part of the Cloud Native Computing Foundation, which is tackling container ecosystem challenges.
Containers -- and Docker is the de facto container format -- essentially create a package around a piece of software that includes everything it needs to run: its code, runtime, system tools and libraries. The software can then run the same way in different operating environments. That fits neatly into telecom operators' movement from an infrastructure focus toward an application focus in response to customer demand, says Doug Nassaur, a lead principal technical architect with AT&T. This becomes a fundamental step in the evolution to the New IP, and toward a more scalable, flexible service platform.
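In Docker, that "everything it needs to run" packaging is typically expressed in a Dockerfile. The sketch below is purely illustrative -- the base image, file names and service are hypothetical, not drawn from the article:

```dockerfile
# Hypothetical example: packaging a small Python service as a container image.
FROM python:3.11-slim              # base image supplies the runtime and system libraries
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt  # the application's library dependencies
COPY app.py .                      # the application code itself
CMD ["python", "app.py"]           # same entry point wherever the container runs
```

Built once with `docker build`, the resulting image runs the same way on a developer laptop, a data-center server or a cloud VM -- which is the portability the operators are after.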
Rather than planning their IT spend based on the number of servers or network capacity, AT&T's enterprise customers want to plan based on business functions, and to know what they will need to support those functions going forward.
"They want to know how much money it will cost to do 2,000 credit checks per month, for example," says Nassaur. "First, they are looking for an unprecedented visibility and level of control; secondly they are looking for elasticity or scalability not so much at the server level but scalability that aligns with their business functions."
Container technology enables the telecom operators to support those applications, and virtual network functions, in software packages that can run where they need to -- which is why it's important to have an open standard, says Larry Rau, a director on the IT Technology team at Verizon Labs.
"It allows us a very good packaging scheme for how I take applications and package and deploy them onto various hardware resources," he says.
Trimming the overhead
One important advantage of containers is that they don't have the overhead requirements that standard virtualization usually does, according to both men, so in some ways containers will let telecom operators more fully capture the efficiencies they expected from virtualization.
The initial move to virtual machines involved the addition of a hypervisor, which enabled a physical server to be carved into smaller virtual servers, each with its own guest operating system, all running on top of the host operating system.
"One could argue, and people are questioning right now, that we may have increased cost and complexity in going to virtualization in the way we did," says Nassaur. "I would support that notion, which is why containers are coming at such an important time and why containers are so critical to AT&T and all the consumers and partners that we wish to do business with. That container, simply put, addresses that overhead issue. It takes it out of the equation. It gets us back to why we wanted to do this in the first place."
With containers, he says, there is no need for a hypervisor or a full guest OS for each virtual instance; that intelligence is pushed up into the cloud control layer. Nassaur calls it replacing a bunch of little mini-brains with one gigantic brain in the cloud.
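One way to see the overhead difference Nassaur describes is that a container has no hypervisor or guest OS of its own -- a process inside it runs directly on the host's kernel. A small sketch, assuming a host with Docker installed (the `alpine` image is just a convenient minimal example, not something named in the article):

```shell
# A container is an isolated process tree on the host kernel --
# no guest operating system boots underneath it.
docker run --rm alpine uname -r   # reports the host's kernel version
uname -r                          # same kernel, seen from the host itself

# Startup cost reflects the missing "mini-brain": nothing boots,
# so the container starts in a fraction of the time a VM would take.
time docker run --rm alpine true
```

Because these commands require a running Docker daemon, the output is environment-dependent; the point is that both `uname -r` invocations report the same kernel, where a VM would report its own guest kernel.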