A common theme among enterprises speaking at DockerCon last week was that while Docker containers have proven their usefulness, they're also still a work in progress.
Speaker sessions were slanted toward success stories, of course, but they also provided some lessons based on fresh wounds.
Intuit Inc. (Nasdaq: INTU), for instance, learned that applications have to behave differently in a container environment. The company started using Docker in earnest in mid-2016, beginning with one project that had careful boundaries -- no container networking allowed, for instance.
That created other problems. Lacking container networking, each Docker instance had to keep contacting a router, and it turned out Intuit's applications did this way too often. That caused baffling connection resets.
"We had to go to work on our applications," Lahpoor said. "The applications had to get used to more volatile environments."
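Getting an application "used to" a more volatile environment usually means treating connection resets as routine rather than fatal. As a minimal sketch -- not Intuit's actual code, and with illustrative names and timings throughout -- a retry loop with exponential backoff might look like:

```python
import random
import time

def call_with_retries(request_fn, max_attempts=5, base_delay=0.1):
    """Retry a network call on ConnectionError, backing off exponentially.

    request_fn is any zero-argument callable that performs the request;
    every name and parameter here is illustrative, not from Intuit.
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the failure
            # Exponential backoff with a little jitter so that many
            # containers retrying at once don't synchronize their traffic.
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)

# Demo: a flaky call that resets twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("connection reset")
    return "ok"

print(call_with_retries(flaky))  # prints "ok" after two retried resets
```

The jitter matters in a container fleet: without it, instances that failed together retry together, recreating the very traffic spike that caused the resets.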
Intuit also set up containers to automatically update their operating systems -- which, it turned out, could make DNS go haywire inside a container. That created a "zombie container" that wouldn't respond to any requests.
Intuit's case has an interesting twist: tax season. Internal applications start drawing lots of traffic in November and December, and it sounds like that's when a lot of the company's problems came to light. Christmastime was particularly unhappy, Lahpoor said.
"If you have a project that's already behind schedule, that is not a good candidate to Dockerize," he said. "This is only four years old. You can't expect it to run like VMware."
Visa had its own lessons. The company's core systems have run with zero downtime for two decades, Chief Systems Architect Sasi Kannappan said, so high availability was a major concern. That meant constantly pinging containers to check their status, and Visa had to write some of its own code to handle that. Docker 1.12, released in 2016, helped by including a health-check function for containers.
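With Docker 1.12's health-check support, the engine can do that polling itself instead of requiring custom code. A minimal sketch, assuming a service that exposes an HTTP endpoint on port 8080 -- the endpoint, port, and timings here are all illustrative, not Visa's configuration:

```dockerfile
# Illustrative Dockerfile fragment -- endpoint, port, and intervals are assumptions.
HEALTHCHECK --interval=30s --timeout=3s --retries=3 \
  CMD curl -f http://localhost:8080/healthz || exit 1
```

The container's status then appears as healthy or unhealthy in `docker ps`, and `docker inspect --format '{{.State.Health.Status}}' <container>` reports it programmatically -- which is also one way to catch the kind of unresponsive "zombie container" Intuit described.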
Given all these troubles, why do any of this?
Oddly, the biggest champion for Docker was one of the oldest and most staid enterprises to present: Automatic Data Processing Inc. (ADP), which offers software for payroll and other human resources functions. ADP spoke at last year's DockerCon, and Jim Ford, chief architect, came back this year for an update.
Most enterprises talk about "developer productivity" as a reason to use containers. ADP was able to confirm that, noting that developers have more power to see what's happening with their code.
"It helped us get developers into more of what I would call a flow state," he said. "They kind of stay in that coding mindset while they work through [a problem] and get that immediate feedback."
Containers also make it faster to move applications from test into production. "We were tired of this complex deployment cycle that would take the better part of a Sunday and roll into Monday," said Shawn Bower, cloud architect at Cornell University.
Security is an interesting side effect of containers. They're small and meant to be ephemeral, so they don't linger as tempting targets.
"Get to where your stuff is short-lived, because then the bad actor needs to get in and re-compromise you," Ford advised the audience. "If you're churning images every month, you've reduced the compromise window to 30 days."
At the same time, Ford has a security concern: Are the images available on Docker Hub secure? Docker certifies the containers on the hub, which anybody can download and use, but that's not enough assurance for an $11 billion-a-year company like ADP. "I'm not risking $11 billion on Docker telling me it's safe," he said.
— Craig Matsumoto, Editor-in-Chief, Light Reading