Fast forward to today, and Kirksey credits that experience and what she learned with qualifying her for her current job at OPNFV -- and a process that is very different.
One overarching difference is that the open source process builds on code, not specs drawn up in advance of implementation, and that shortens the process considerably.
"It also helps circumvent some of the political aspects of the fighting," Kirksey comments. "Because you can very easily argue this way is better than that way and to have long arguments to back it up. But once you've got cold hard code, that can put the kibosh on a purely political argument."
That doesn't mean conceptual discussions go away entirely, she adds, as some are still necessary, "but this process reduces those to the necessary ones instead of the grandstanding ones."
The collaborative process itself changes somewhat -- at least that has been Kirksey's experience -- because the group now has to tackle tough technical problems together in a different way. "When you are beginning to write code together, that actual activity and process encourages more collaboration," she says. "Because you are rolling up your sleeves together to solve the problem and that naturally just leads to more collegial happenstance than if you have [everyone around] the table doing something more like contract negotiations."
There is also respect given within the group to the arguments of anyone willing to go and create code to back up what they are saying. "To call it 'put up or shut up' sounds unnecessarily harsh," Kirksey says. "It's more like, if you are passionate about something enough to go and create code, that is going to be considered by the community a lot more than throwing sticks from the sidelines. There is a certain respect for people who put the work in and that helps resolve issues."
Agile principles key
Two other key differences were apparent in OPNFV's process of developing its first release, Arno, which came out earlier this summer. First, open source uses agile principles, as opposed to a more traditional waterfall approach, and so it can actually seem more chaotic, especially at first.
"There is less of that top-down, upfront decision-making, so it can feel a bit more chaotic," Kirksey says. "You can perhaps ship a spec out the door on an arbitrary date but if your builds aren't working, you aren't going to ship a release. Working through the bugs to get the release out can feel more chaotic. But that's a good thing because it allows us to not pick winners."
And that's the second thing that's different about open source -- not picking winners and losers in the standards process.
"I'll use an example from OPNFV: When we were first talking about Arno, lots of people were saying we should use installer X," she says. "In a traditional standards world, we would have had some period of time where we would have argued the merits of installer X versus Y versus Z for months until we reached the point where we said, 'Okay we are going with installer Y and all of our decisions would have flowed from that.' "
Instead, OPNFV established requirements for any installer that is going to be part of the Arno release -- its first -- and at the point of release, two installers were ready and met those requirements. Kirksey expects more will be coming, but OPNFV doesn't try to arbitrate or pick a winner.
OPNFV was six weeks late with that first release, missing the regular cadence of a release every six months, mostly due to the sheer scale of its initial task, Kirksey says. In telecom terms, that wasn't a significant lag, and it was still a much faster and more streamlined process than what would have happened if each service provider worked through these issues separately with its vendors.
"We were integrating all the components together in a way that no one had done before, in an automated repeatable way, and we were finding bugs, configuration issues, people having different assumptions, solutions that worked on one type of hardware and not another when we wanted it to be hardware-agnostic," she says. "Those issues all took a while to solve."
Because the community is working together on the code, it is also easier to catch bugs or other issues, Kirksey notes. "There's a saying: given enough eyeballs, every bug is shallow. That is where the speed comes from." And the collective work of the community replaces the many hours that would otherwise be spent in individual labs, making this a more efficient process even if it seems more chaotic at times.
Release cadence critical
The shift to a software-driven universe also tilts the scale toward the open source approach, which uses iterations and a cadence of releases to constantly update and improve, rather than relying on purpose-built hardware with substantially less flexibility.
"When you are doing something like a protocol and you know people are going to go off and implement it, and that protocol is going to be in all the hard-to-upgrade legacy hardware on the planet, there is a great burden to make sure you get it exactly right," Kirksey notes. "With open source, where you are updating it at a roughly six-month cadence, you have a bit more leeway to be experimental and then fix problems, if they arise."
Since NFV is specifically designed not to be monolithic, but to be flexible and offer the freedom to experiment with new things and turn services up and down as needed, the iterative process enables that faster pace of change.
And the cadence of the release schedule also imposes a discipline on the process that keeps the focus on timely problem solving. "That's where the bigger change -- the move to agile processes versus waterfall -- really has an impact," Kirksey notes. "Instead of feeling like we have to have everything perfect, we design our processes around being able to iterate and have that freedom."
— Carol Wilson, Editor-at-Large, Light Reading