NFV Specs/Open Source

An Intimate View: Standards vs. Open Source

Fast forward to today, and Kirksey credits that experience and what she learned with qualifying her for her current job at OPNFV -- and a process that is very different.

One overarching difference is that the open source process builds on code, not specs drawn up in advance of implementation, and that shortens the process considerably.

"It also helps circumvent some of the political aspects of the fighting," Kirksey comments. "Because you can very easily argue this way is better than that way and to have long arguments to back it up. But once you've got cold hard code, that can put the kibosh on a purely political argument."

That doesn't mean conceptual discussions go away entirely, she adds, as some are still necessary, "but this process reduces those to the necessary ones instead of the grandstanding ones."

The collaborative process itself changes somewhat -- at least that has been Kirksey's experience -- because tough technical problems are now things that the group has to work together on in a different way. "When you are beginning to write code together, that actual activity and process encourages more collaboration," she says. "Because you are rolling up your sleeves together to solve the problem and that naturally just leads to more collegial happenstance than if you have [everyone around] the table doing something more like contract negotiations."

There is also respect given within the group to the arguments of anyone willing to go and create code to back up what they are saying. "To call it 'put up or shut up' sounds unnecessarily harsh," Kirksey says. "It's more like, if you are passionate about something enough to go and create code, that is going to be considered by the community, a lot more than throwing sticks from the sidelines. There is a certain respect for people who put the work in and that helps resolve issues."

Agile principles key
Two other key differences were apparent in OPNFV's process of developing its first release, Arno, which came out earlier this summer. First, open source uses agile principles rather than a more traditional waterfall approach, so it can actually seem more chaotic, especially at first.

"There is less of that top-down, upfront decision-making, so it can feel a bit more chaotic," Kirksey says. "You can perhaps ship a spec out the door on an arbitrary date but if your builds aren't working, you aren't going to ship a release. Working through the bugs to get the release out can feel more chaotic. But that's a good thing because it allows us to not pick winners."

And that's the second thing that's different about open source -- not picking winners and losers in the standards process.

"I'll use an example from OPNFV: When we were first talking about Arno, lots of people were saying we should use installer X," she says. "In a traditional standards world, we would have had some period of time where we would have argued the merits of installer X versus Y versus Z for months until we reached the point where we said, 'Okay we are going with installer Y and all of our decisions would have flowed from that.' "

Instead, OPNFV established requirements for any installer that is going to be part of the Arno release -- its first -- and at the point of release, two installers were ready and met those requirements. Kirksey expects more will be coming, but OPNFV doesn't try to arbitrate or pick a winner.
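That requirements-driven approach can be sketched in code: rather than mandating a single installer, the project defines a contract any installer must satisfy, and accepts every one that passes. The names below (`Installer`, `deploy`, `verify`, `qualifies_for_release`) are hypothetical illustrations of the idea, not actual OPNFV APIs.

```python
from abc import ABC, abstractmethod

class Installer(ABC):
    """Hypothetical contract an Arno-style installer must meet."""

    @abstractmethod
    def deploy(self, target: str) -> bool:
        """Stand up the stack on the given target environment."""

    @abstractmethod
    def verify(self) -> bool:
        """Run the release's acceptance checks."""

class FooInstaller(Installer):
    """One of possibly many community installers."""

    def deploy(self, target: str) -> bool:
        print(f"deploying to {target}")
        return True

    def verify(self) -> bool:
        return True

def qualifies_for_release(installer: Installer, target: str = "lab") -> bool:
    # Any installer that deploys and verifies cleanly is in --
    # the community doesn't arbitrate or pick a single winner.
    return installer.deploy(target) and installer.verify()
```

Under this sketch, two, three or ten installers can ship with a release, as long as each meets the same requirements.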

OPNFV was six weeks late with that first release, missing the regular cadence of a release every six months, mostly due to the sheer scale of its initial task, Kirksey says. In telecom terms, that wasn't a significant lag, and it was still a much faster and more streamlined process than what would have happened if each service provider worked through these issues separately with its vendors.

"We were integrating all the components together in a way that no one had done before, in an automated repeatable way, and we were finding bugs, configuration issues, people having different assumptions, solutions that worked on one type of hardware and not another when we wanted it to be hardware-agnostic," she says. "Those issues all took a while to solve."

Because the community is working together on the code, it is also easier to catch bugs or other issues, Kirksey notes. "There's a saying: given enough eyeballs, every bug is shallow. That is where the speed comes from." And the collective work of the community replaces many more man-hours that would have been spent in individual labs, making this a more efficient process even if it seems more chaotic at times.

Release cadence critical
The shift to a software-driven universe also tilts the scale toward the open source approach, which uses iterations and a cadence of releases to constantly update and improve, rather than relying on purpose-built hardware with substantially less flexibility.

"When you are doing something like a protocol and you know people are going to go off and implement it, and that protocol is going to be in all the hard-to-upgrade legacy hardware on the planet, there is a great burden to make sure you get it exactly right," Kirksey notes. "With open source, where you are updating it at a roughly six-month cadence, you have a bit more leeway to be experimental and then fix problems, if they arise."

Since NFV is specifically designed not to be monolithic, but to be flexible and offer the freedom to experiment with new things and turn services up and down as needed, the iteration process enables the faster pace of change that suits it.

And the cadence of the release schedule also imposes a discipline on the process that keeps the focus on timely problem solving. "That's where the bigger change -- the move to agile processes versus waterfall -- really has an impact," Kirksey notes. "Instead of feeling like we have to have everything perfect, we design our processes around being able to iterate and have that freedom."

— Carol Wilson, Editor-at-Large, Light Reading

mhhf1ve 8/6/2015 | 7:04:01 PM
Open source is no silver bullet...
I don't think too many people think Open Source is a magical answer for everything... but it can help with getting some things developed faster by reducing re-invention of the wheel here and there.

There will never be a single solution that fits every use case.
brooks7 7/23/2015 | 8:39:37 PM
Re: Understanding industry requirements and Open Source solutions
No - every product in every carrier will have different open source.

Go look at IT - when something new comes up it spawns dozens of Open Source (OS) Projects.  5 - 10 years later there are only 2 or 3 (like say Debian versus Red Hat versus Ubuntu + there are others).  The rest are abandoned.

The way that OS works in IT is that they all use 2 things:

- Standards and;

- APIs

Without standards like HTML5 or DNS or XML or SQL, you can't develop effective APIs and standard interfaces.

So - OS does not eliminate standards - it makes them infinitely more important.  

It is faster to build products based on Open Source.  I know - I have done it.  In fact the product could be defined as NFV (as we virtualized a network function and spawned new ones when we wanted to) and SDN (although we built our network on an overlay IP VPN).

What it DOESN'T do is streamline the process of getting standards adopted and interoperability assured.  And then remember you have to test every new version of every OS package that you include.  The big chunky stuff that you would want - say a Web Server - nobody has to write from scratch.  But there is a huge challenge on systems integration and testing.

So, in my view OS is irrelevant to standardization timelines.  Standardization, Integration and Testing are very important.  How many NFV OS projects are out there?  How many people will ignore them and just do what is already done in IT (as this is a solved problem in the IT domain)?  What do you do when there is an interoperability challenge between OS projects?

So - my view - with the experience of the last 15 years of the Enterprise IT world is that OS does not make standardization faster.  What it does do is eliminate the need for SW engineers to write major code elements - except when they are writing for competing OS projects.


PS - By the way, you guys need a responsive site for Mobile or you are going to get clobbered in the SEO rankings.
cnwedit 7/23/2015 | 4:36:18 PM
Re: Understanding industry requirements and Open Source solutions
Okay, I admit I'm not entirely following what you are saying there - do you mean each operator winds up with a different "open source" version of NFV and SDN? If so, that's not at all what I'm saying.

I'm talking about what exists today and how you get from what exists today to a more flexible, scalable infrastructure that is software-driven, and uses mostly - not purely, but primarily - commercial off-the-shelf hardware instead of purpose-built telecom gear and builds intelligence into the software layers that can be more easily updated and changed.

And yes, I know there's a lot more to it than that. But the open source approach that Heather describes is a faster way of getting the industry to agree on what it needs to agree on, and differentiate on the rest as it chooses.
brooks7 7/23/2015 | 4:32:11 PM
Re: Understanding industry requirements and Open Source solutions
I disagree with that set of comments Carol.

You would have to make an assumption that either you are running with a "standard" API that EVERYBODY uses (like say HTML5 - which is a standard) or that ALL products that you use have the exact same version (unedited) of the exact same Open Source Product for that category.  Otherwise you don't have the API (interface) standardization that allows two disparate, independently developed systems to work together.  Assume every product in the network uses a different NFV and SDN set of Open Source.  You end up having to have standards to make them work together.
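The point about standard interfaces letting independently built systems interoperate can be sketched in a few lines. The wire format and field names below are purely hypothetical stand-ins for a real spec; the idea is that both sides code against the agreed format, not against each other's codebase.

```python
import json

# A minimal "standard": both systems agree to exchange link state
# as JSON with exactly these fields. Names are illustrative only.
STANDARD_FIELDS = {"node", "port", "admin_up"}

def encode_link_state(node: str, port: int, admin_up: bool) -> str:
    """Vendor A's controller emits the agreed wire format."""
    return json.dumps({"node": node, "port": port, "admin_up": admin_up})

def decode_link_state(payload: str) -> dict:
    """Vendor B's independently written system can consume it,
    because both sides implemented the same spec."""
    msg = json.loads(payload)
    if set(msg) != STANDARD_FIELDS:
        raise ValueError(f"non-conformant message: {sorted(msg)}")
    return msg
```

Swap in two unrelated open source stacks on either end and the exchange still works, as long as each conforms; drop the shared standard and every pairing needs bespoke glue.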


cnwedit 7/23/2015 | 2:49:33 PM
Re: Understanding industry requirements and Open Source solutions
Interesting comment, Robin. The diversity of what's already deployed is definitely what makes technology transitions like this one so challenging, and standards work as well. It's one of the things that open source and the agile methods of developing software seem well-suited to address.
Robin Mersh 7/23/2015 | 2:11:00 PM
Understanding industry requirements and Open Source solutions
I really appreciate Heather (and Light Reading) highlighting the issues around standards development and Open Source development. I have known Heather for quite some time and I really value her insight.

It is certainly true that we are moving to more programmable, adaptable and software defined networks. Developing solutions with more agility is paramount. The old paradigm of standards, where the 'perfect' solution trumped the speed of delivery, is receding.

It is also true that there is still a lot of diversity in deployed networks. This is due to a number of factors: earlier technology decisions, the competitive landscape and the regulatory environment.

Developing effective solutions needs a clear understanding of requirements, and that does take discussion and agreement.  

I think we should welcome the disruption to standards from software driven solutions and openly collaborate.