
An Intimate View: Standards vs. Open Source

So now we're down to brass tacks: a comparison of what the Broadband Forum did to create TR-069 and what OPNFV is doing with NFV infrastructure.

For a telecom standard, TR-069 actually came together fairly quickly, most likely because it addressed a specific and universal pain point for broadband ISPs -- it's still widely used today, by the way. The group that assembled within the forum to address the issue -- which started out as the auto-config working group, says Kirksey -- spent 18 months arguing over how to accomplish what everyone agreed needed to be done. But after lengthy discussions, including some meetings that ended in the bar, where beer helped fuel the collaborative spirit, the group had -- on paper -- the specifications for how TR-069 could work to enable remote diagnostics and management of CPE.
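For readers unfamiliar with TR-069 (formally, the CPE WAN Management Protocol), the core idea is that an operator's auto-configuration server reads and writes named parameters on a subscriber's device over SOAP/HTTP. Below is a minimal, illustrative Python sketch of building a GetParameterValues request; the parameter path is a standard TR-069 data-model name, but the helper function is ours, and real sessions are CPE-initiated with an Inform exchange and SOAP headers omitted here.

```python
# Illustrative sketch only: builds the body of a CWMP (TR-069)
# GetParameterValues RPC, the call an ACS uses to read settings off a CPE.
# Real sessions are CPE-initiated (Inform/InformResponse) and carry SOAP
# headers omitted here; the helper name is ours, not part of the spec.

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
CWMP_NS = "urn:dslforum-org:cwmp-1-0"

def build_get_parameter_values(paths):
    """Return a minimal SOAP envelope asking the CPE for the given parameters."""
    names = "\n        ".join(f"<string>{p}</string>" for p in paths)
    return f"""<soap:Envelope xmlns:soap="{SOAP_NS}" xmlns:cwmp="{CWMP_NS}">
  <soap:Body>
    <cwmp:GetParameterValues>
      <ParameterNames>
        {names}
      </ParameterNames>
    </cwmp:GetParameterValues>
  </soap:Body>
</soap:Envelope>"""

if __name__ == "__main__":
    # A standard TR-069 data-model path: the device's software version.
    print(build_get_parameter_values(
        ["InternetGatewayDevice.DeviceInfo.SoftwareVersion"]))
```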

But that's when things got interesting, Kirksey notes -- and by interesting, she means a bit contentious. The paper spec was handed to the engineers at each of the many companies involved at the time. Two things happened next: Either the engineers found the spec incomplete, or people at different companies interpreted it differently and built differing solutions from the same document.

So the first 18 months of hard work were followed by another 18 months of painful work: implementations of the spec were underway, but they weren't interoperable -- and interoperability was a baseline requirement of the broadband operators.

"We had to do a lot in the weeds, a deep dive, painful technical work to get to that next appropriate level," Kirksey says. "People who had invested engineering resources in doing things one way had to go home and tell their engineers 'You have to redo it.' "

Plugfests to the rescue
What ultimately produced the spec that works well today was a series of plugfests at the University of New Hampshire InterOperability Laboratory (IOL). The results -- kept confidential at the time so vendors would continue to share information -- were fed back into the Broadband Forum working group until "every single point of dissent or lack of interoperability" was addressed, says Kirksey, who was co-chair of the group at the time.

She knows this because, along with 2Wire Cofounder Jeff Bernstein, Kirksey created a spreadsheet that tracked every bug, every interop-related issue and every plugfest result. "We made sure each one was addressed and each bug and interop-related issue was fixed," she recalls.
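A hypothetical sketch of that ledger, as code rather than a spreadsheet: one row per bug or interop issue, with nothing dropped until it is resolved. All field names are invented for illustration, not the actual spreadsheet's columns.

```python
# Hypothetical sketch of a plugfest issue ledger: log everything,
# close everything, and ship only when the open list is empty.
from dataclasses import dataclass, field

@dataclass
class InteropIssue:
    issue_id: int
    reported_at: str          # e.g. "UNH-IOL plugfest" (illustrative)
    description: str
    resolved: bool = False

@dataclass
class InteropTracker:
    issues: list = field(default_factory=list)

    def log(self, issue_id: int, reported_at: str, description: str) -> None:
        self.issues.append(InteropIssue(issue_id, reported_at, description))

    def resolve(self, issue_id: int) -> None:
        for issue in self.issues:
            if issue.issue_id == issue_id:
                issue.resolved = True

    def open_issues(self) -> list:
        # The working group's exit criterion: this list must be empty.
        return [i for i in self.issues if not i.resolved]

tracker = InteropTracker()
tracker.log(1, "UNH-IOL plugfest", "Vendors disagree on parameter encoding")
tracker.resolve(1)
assert tracker.open_issues() == []
```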

Relationships became a key part of the process: Figuring out who was intent on getting work done and who was a political grandstander, and knowing when to defer to a participant because of their specific expertise, were both essential, she says. Kirksey credits TR-069's staying power to the group's determination to work through even the "big ticket items" to deliver true interoperability.

Robin Mersh 7/23/2015 | 2:11:00 PM
Understanding industry requirements and Open Source solutions
I really appreciate Heather (and Light Reading) highlighting the issues around standards development and Open Source development. I have known Heather for quite some time and I really value her insight.

It is certainly true that we are moving to more programmable, adaptable and software defined networks. Developing solutions with more agility is paramount. The old paradigm of standards, where the 'perfect' solution trumped the speed of delivery, is receding.

It is also true that there is still a lot of diversity in deployed networks. This is due to a number of factors: earlier technology decisions, the competitive landscape and the regulatory environment.

Developing effective solutions needs a clear understanding of requirements, and that does take discussion and agreement.  

I think we should welcome the disruption to standards from software driven solutions and openly collaborate.
cnwedit 7/23/2015 | 2:49:33 PM
Re: Understanding industry requirements and Open Source solutions
Interesting comment, Robin. The diversity of what's already deployed is definitely what makes technology transitions like this one so challenging - and it makes standards work challenging as well. It's one of the things that open source and the agile methods of developing software seem well-suited to address.
brooks7 7/23/2015 | 4:32:11 PM
Re: Understanding industry requirements and Open Source solutions
I disagree with that set of comments, Carol.

You would have to assume either that you are running with a "standard" API that EVERYBODY uses (like, say, HTML5 - which is a standard) or that ALL the products you use have the exact same version (unedited) of the exact same Open Source product for that category. Otherwise you don't have the API (interface) standardization that allows two disparate systems, developed independently, to work together. Assume every product in the network uses a different NFV and SDN set of Open Source. You end up having to have standards to make them work together.
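A minimal sketch of that point in Python: two independently developed components interoperate only because both code to one agreed interface. Every name below is a hypothetical stand-in, not a real vendor API or standard.

```python
# Minimal sketch: interoperability via one shared, agreed interface.
from typing import Protocol

class FlowRuleAPI(Protocol):
    """The 'standard': every controller must expose exactly this call."""
    def install_flow(self, match: str, action: str) -> bool: ...

class VendorAController:
    def install_flow(self, match: str, action: str) -> bool:
        print(f"Vendor A installs {match} -> {action}")
        return True

class VendorBController:
    def install_flow(self, match: str, action: str) -> bool:
        print(f"Vendor B installs {match} -> {action}")
        return True

def provision(controller: FlowRuleAPI) -> None:
    # Orchestration written once, against the agreed interface only.
    controller.install_flow(match="dst=10.0.0.0/8", action="forward:port2")

provision(VendorAController())  # both work interchangeably...
provision(VendorBController())  # ...because they share one API
```

Swap the shared FlowRuleAPI for two slightly different interfaces and provision() breaks for one vendor - which is exactly the interop gap standards exist to close.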

seven

 
cnwedit 7/23/2015 | 4:36:18 PM
Re: Understanding industry requirements and Open Source solutions
Okay, I admit I'm not entirely following what you are saying there - do you mean each operator winds up with a different "open source" version of NFV and SDN? If so, that's not at all what I'm saying.

I'm talking about what exists today and how you get from there to a more flexible, scalable infrastructure that is software-driven, uses mostly - not purely, but primarily - commercial off-the-shelf hardware instead of purpose-built telecom gear, and builds intelligence into software layers that can be more easily updated and changed.

And yes, I know there's a lot more to it than that. But the open source approach that Heather describes is a faster way of getting the industry to agree on what it needs to agree on, and differentiate on the rest as it chooses.
brooks7 7/23/2015 | 8:39:37 PM
Re: Understanding industry requirements and Open Source solutions
No - every product in every carrier will have different open source.

Go look at IT - when something new comes up it spawns dozens of Open Source (OS) projects. Five to 10 years later there are only 2 or 3 left (like, say, Debian versus Red Hat versus Ubuntu - and there are others). The rest are abandoned.

The way that OS works in IT is that they all use two things:

- Standards, and
- APIs

Without standards like HTML5 or DNS or XML or SQL, you can't develop effective APIs and standard interfaces.

So - OS does not eliminate standards - it makes them infinitely more important.  

It is faster to build products based on Open Source.  I know - I have done it.  In fact the product could be defined as NFV (as we virtualized a network function and spawned new ones when we wanted to) and SDN (although we built our network on an overlay IP VPN).

What it DOESN'T do is streamline the process of getting standards adopted and interoperability assured. And then remember you have to test every new version of every OS package that you include. The big chunky stuff that you would want - say a Web Server - nobody has to write from scratch. But there is a huge challenge in systems integration and testing.
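A small sketch of what that testing burden looks like in practice: auditing that every bundled open source package matches the exact version you qualified. The package names and pinned versions below are invented for illustration.

```python
# Sketch: audit installed packages against the versions you qualified.
from importlib import metadata

QUALIFIED_VERSIONS = {
    "requests": "2.31.0",   # hypothetical pinned, tested versions
    "jinja2": "3.1.2",
}

def audit_packages(pins):
    """Return (name, qualified, installed) tuples for every mismatch."""
    mismatches = []
    for name, wanted in pins.items():
        try:
            found = metadata.version(name)
        except metadata.PackageNotFoundError:
            mismatches.append((name, wanted, "missing"))
            continue
        if found != wanted:
            mismatches.append((name, wanted, found))
    return mismatches

if __name__ == "__main__":
    for name, wanted, found in audit_packages(QUALIFIED_VERSIONS):
        print(f"{name}: qualified {wanted}, installed {found}")
```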

So, in my view OS is irrelevant to standardization timelines.  Standardization, Integration and Testing are very important.  How many NFV OS projects are out there?  How many people will ignore them and just do what is already done in IT (as this is a solved problem in the IT domain)?  What do you do when there is an interoperability challenge between OS projects?

So my view - based on the experience of the last 15 years of the enterprise IT world - is that OS does not make standardization faster. What it does do is eliminate the need for SW engineers to write major code elements - except when they are writing for competing OS projects.

seven

PS - By the way, you guys need a responsive site for Mobile or you are going to get clobbered in the SEO rankings.
mhhf1ve 8/6/2015 | 7:04:01 PM
Open source is no silver bullet...
I don't think too many people think Open Source is a magical answer for everything... but it can help with getting some things developed faster by reducing re-invention of the wheel here and there.

There will never be a single solution that fits every use case.