Telecom operators are looking to containers to help create an app-focused architecture and eliminate heavy virtualization overhead.

Containers a Critical Piece of Telecom's Future

August 12, 2015

Container technology is becoming increasingly important to major telecom service providers as they move to be more applications-focused and to maximize the benefits of virtualization by moving more overhead into the cloud.

The growing importance of this technology is why AT&T Inc. (NYSE: T) and Verizon Communications Inc. (NYSE: VZ) are among the 14 companies joining the Open Container Initiative, the project launched earlier this year by the Linux Foundation and Docker Inc., and why AT&T is also part of the Cloud Native Computing Foundation, which is tackling container ecosystem challenges.

Containers -- and Docker is the de facto standard -- essentially wrap a piece of software in a package that includes everything it needs to run: its code, runtime, system tools and libraries. The software can then run the same way in different operating environments. That fits neatly into the telecom operators' shift from an infrastructure focus toward an application focus in response to customer demand, says Doug Nassaur, a lead principal technical architect with AT&T. This becomes a fundamental step in the evolution to the New IP and a more scalable, flexible service platform.
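The "everything it needs to run" idea can be made concrete with a small sketch. The manifest below is purely illustrative -- the field names are hypothetical, not the real Docker or OCI image schema -- but it captures why a packaged function behaves identically wherever it lands:

```python
import json

# Hypothetical sketch of what a container "package" declares: the code's
# entry point, its language runtime, and its bundled libraries. Field
# names here are illustrative, not the actual image-spec schema.
credit_check_image = {
    "name": "credit-check",                    # the business function packaged
    "entrypoint": ["python", "app.py"],        # how the code starts
    "runtime": "python:3.11",                  # runtime ships inside the image
    "libraries": ["requests", "sqlalchemy"],   # dependencies travel with it
    "env": {"REGION": "us-east"},
}

# Because the package is self-contained, the same serialized manifest
# deploys unchanged to any host that can run containers.
manifest = json.dumps(credit_check_image, sort_keys=True)
print("credit-check" in manifest)
```

The point of the sketch is that nothing about the host leaks into the package, which is what lets operators treat deployment targets as interchangeable.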

Instead of planning their IT spend based on the number of servers or network capacity, AT&T's enterprise customers want to plan based on business functions, and to know what they will need to support those functions going forward.

"They want to know how much money it will cost to do 2,000 credit checks per month, for example," says Nassaur. "First, they are looking for an unprecedented visibility and level of control; secondly they are looking for elasticity or scalability not so much at the server level but scalability that aligns with their business functions."

Container technology enables the telecom operators to support those applications, and virtual network functions, in software packages that can run where they need to -- which is why it's important to have an open standard, says Larry Rau, a director on the IT Technology team at Verizon Labs.

"It allows us a very good packaging scheme for how I take applications and package and deploy them onto various hardware resources," he says.

Trimming the overhead
One important advantage of containers is that they don't have the overhead requirements that standard virtualization usually does, according to both men, so in some ways containers will let telecom operators more fully capture the efficiencies they expected from virtualization.

The initial move to virtual machines involved adding a hypervisor, which let a physical server be carved into smaller virtual servers, each running its own guest operating system on top of the host.

"One could argue, and people are questioning right now, that we may have increased cost and complexity in going to virtualization in the way we did," says Nassaur. "I would support that notion, which is why containers are coming at such an important time and why containers are so critical to AT&T and all the consumers and partners that we wish to do business with. That container, simply put, addresses that overhead issue. It takes it out of the equation. It gets us back to why we wanted to do this in the first place."

With containers, he says, there is no need for a hypervisor and a guest OS for each virtual instance; that intelligence is pushed up into the cloud control layer. Nassaur calls it replacing a bunch of little mini-brains with one gigantic brain in the cloud.
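The overhead argument Nassaur is making can be shown with back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not measurements: each VM is charged an assumed guest-OS footprint, while each container pays only a small shared-kernel tax.

```python
# Illustrative overhead comparison (assumed numbers, not benchmarks):
# every VM carries its own guest OS image in memory, while containers
# share the host kernel and incur only a small per-instance runtime cost.
GUEST_OS_MB = 512      # assumed RAM claimed by one guest OS per VM
CONTAINER_TAX_MB = 5   # assumed per-container runtime overhead
instances = 100        # virtual instances on the same fleet

vm_overhead = instances * GUEST_OS_MB            # 100 mini-brains
container_overhead = instances * CONTAINER_TAX_MB  # one brain in the cloud

print(f"VM overhead: {vm_overhead} MB")           # 51200 MB
print(f"Container overhead: {container_overhead} MB")  # 500 MB
```

Even with generous assumptions, the fixed per-instance guest OS dominates at scale, which is the overhead containers "take out of the equation."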


For AT&T, that movement to a cloud approach is a significant step that fuels the company's plans to manage geo-diverse locations, multiple cloud partners and a variety of connections.

"For us as a service provider, this is ginormous," Nassaur says. "It takes all the operating expense of all of those mini-brains out and allows us to manage it like we're used to and like we are good at, in managing telco infrastructure, to a dial-tone level, a service level agreement approach -- we are really good at that. So this is evolving to look more like our power alley, more like managing a telco, managing dial tone and unified services than an IT problem."

Verizon has been working with Docker on containers for some time, says Rau, dating back to the software's earliest versions. What is changing now -- as containers evolve and, through the OCI, become an open resource -- is the ability of container technology to support the robust scalability of a New IP network.

"When we look at OCI, our interest is in taking what Docker has started and is now contributing to open source and make sure the technology and innovation evolves at a very quick rate, and the industry doesn't go in different directions," he says. "If you use standards and the community involvement as a way to drive the industry to agree on certain aspects of the technology, it allows you to innovate quicker."

Container technology also enables what Rau calls "a micro-service" architecture approach to developing apps.

"You take your application and break it into multiple concurrent parts that can be distributed across different systems in a distributed computing process," he says. "Containers are a natural way to package and deploy those micro-services. If I have lots of micro-services spread around my hardware, I make much more efficient use of those hardware resources. My utilization of hardware goes up."

Why open matters
The significance of the Open Container Initiative is that it is creating a standard container definition and a corresponding reference runtime, which industry players will need in order to move containers between different clouds -- allowing applications and containers to become the portability layer going forward, notes Jim Zemlin, executive director of the Linux Foundation, which is creating the OCI with Docker.

"People will be able to write apps to this format and run them on a variety of public or private cloud infrastructures," he notes. "The scope of [OCI] is specifically narrowed to the idea of being able to have portable containers across public-private clouds so that anyone who implements this specification would achieve portability across all of those different infrastructures."

That guarantee of portability is important to customers, AT&T's Nassaur notes.

"Consumers, corporations, the government, even the development community, are not going to buy into what AT&T is doing in a large way unless they know their investment is going to be protected, no matter where the technology industry tends to veer, right or left," he says. "Fundamentally we need a container format that guarantees that so that is why we are participating in open container initiative."

— Carol Wilson, Editor-at-Large, Light Reading
