
An Intimate View: Standards vs. Open Source

How different is open source from the traditional telecom standards process?

One person with intimate knowledge of those key differences is Heather Kirksey, director of NFV for the Open Platform for NFV Project Inc. (OPNFV), the Linux Foundation-backed open source effort. As someone directly involved in developing a recent and enduring telecom standard, TR-69, Kirksey has seen firsthand how both processes work and knows why open source is faster: it relies on a different kind of cooperation.

From her perspective, open source development speeds things up because it sidesteps company politics in decision-making and is particularly well-suited to a telecom world in which functionality is moving out of hardware and into software.

Read more about NFV strategies and the role of open source in our NFV section here on Light Reading.

To compare Kirksey's two experiences -- one in standards, one in open source -- a short history lesson is in order. TR-69 was developed shortly after the turn of the century and continually updated through the 2000s to solve a critical problem for broadband ISPs: the need to manage and monitor in-home devices, including modems and gateways. This is a capability we now take for granted, but in the early days of DSL and cable modems, broadband ISPs were plagued by the inability to quickly discover what was causing a customer problem. Often the broadband line itself was fine, but there was an issue with the modem, its connection to a PC, or the computer itself -- and that couldn't be detected remotely.
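To make concrete what that standard enables, here is a minimal, illustrative sketch in Python of the kind of remote query TR-69 (formally, the CPE WAN Management Protocol) allows: an ISP's auto-configuration server asks a home gateway for a couple of values from its published data model instead of rolling a truck. The parameter paths follow the standard InternetGatewayDevice model, but the helper function and parameter selection here are our own, and the SOAP headers, session handling and authentication that a real server and device exchange are omitted.

CWMP_NS = "urn:dslforum-org:cwmp-1-0"

def get_parameter_values_request(parameter_names):
    """Build a simplified SOAP body for a CWMP GetParameterValues RPC."""
    names = "\n".join(f"        <string>{n}</string>" for n in parameter_names)
    return f"""<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:cwmp="{CWMP_NS}">
  <soap:Body>
    <cwmp:GetParameterValues>
      <ParameterNames>
{names}
      </ParameterNames>
    </cwmp:GetParameterValues>
  </soap:Body>
</soap:Envelope>"""

if __name__ == "__main__":
    # A help-desk-style health check: firmware version and WAN IP of the gateway.
    print(get_parameter_values_request([
        "InternetGatewayDevice.DeviceInfo.SoftwareVersion",
        "InternetGatewayDevice.WANDevice.1.WANConnectionDevice.1."
        "WANIPConnection.1.ExternalIPAddress",
    ]))

A shared data model and RPC set of this sort is what let one management server handle gateways from many vendors -- the same interoperability question the comments below debate for NFV.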

OPNFV Director Heather Kirksey

The result was a massive customer service headache that initially slowed broadband adoption and drove up deployment costs significantly. With each customer added came time-consuming, costly issues around CPE installation, maintenance and upgrades, leading to thousands of hours of customer service calls, wasted truck rolls, and many angry and frustrated customers.

This process was also killing broadband profits by driving costs up dramatically.

"The process was so manual and had to be so customized to all the different firmware versions and makes and models of devices -- it was an exponentially difficult problem," recalls Kirksey, who initially managed the CPE partner program at Broadjump, which was acquired by Motive, which was later acquired by Alcatel-Lucent (NYSE: ALU). (See Motive, BroadJump to Merge and AlcaLu Gets Motivated.)

Bleeding edge action
As someone managing CPE partnerships, Kirksey found herself at the bleeding edge of solving the issue. But she also discovered that her company's service provider customers -- once-familiar names such as BellSouth and SBC Communications -- weren't interested in a single company's solution to their problems; they were pushing all their vendors toward the Broadband Forum -- which might still have been the DSL Forum at that time -- to work out an industry solution.

If that sounds familiar, it's because it is. OPNFV, too, is an organization driven by network operators -- operators who need help with management and orchestration of virtualized network functions and want to push the NFV specs forward using open source.

mhhf1ve 8/6/2015 | 7:04:01 PM
Open source is no silver bullet...
I don't think too many people think Open Source is a magical answer for everything... but it can help with getting some things developed faster by reducing re-invention of the wheel here and there.

There will never be a single solution that fits every use case.
brooks7 7/23/2015 | 8:39:37 PM
Re: Understanding industry requirements and Open Source solutions
No - every product in every carrier will have different open source.

Go look at IT - when something new comes up, it spawns dozens of Open Source (OS) projects. Five to ten years later there are only 2 or 3 left (like, say, Debian versus Red Hat versus Ubuntu - plus there are others). The rest are abandoned.

The way that OS works in IT is that they all use 2 things:

- Standards, and

- APIs

Without standards like HTML5 or DNS or XML or SQL, you can't develop effective APIs and standard interfaces.

So - OS does not eliminate standards - it makes them infinitely more important.  

It is faster to build products based on Open Source.  I know - I have done it.  In fact the product could be defined as NFV (as we virtualized a network function and spawned new ones when we wanted to) and SDN (although we built our network on an overlay IP VPN).

What it DOESN'T do is streamline the process of getting standards adopted and interoperability assured.  And then remember you have to test every new version of every OS package that you include.  The big chunky stuff that you would want - say a Web Server - nobody has to write from scratch.  But there is a huge challenge on systems integration and testing.

So, in my view OS is irrelevant to standardization timelines.  Standardization, Integration and Testing are very important.  How many NFV OS projects are out there?  How many people will ignore them and just do what is already done in IT (as this is a solved problem in the IT domain)?  What do you do when there is an interoperability challenge between OS projects?

So - my view, with the experience of the last 15 years of the enterprise IT world, is that OS does not make standardization faster. What it does do is eliminate the need for SW engineers to write major code elements - except when they are writing for competing OS projects.

seven

PS - By the way, you guys need a responsive site for Mobile or you are going to get clobbered in the SEO rankings.
cnwedit 7/23/2015 | 4:36:18 PM
Re: Understanding industry requirements and Open Source solutions
Okay, I admit I'm not entirely following what you are saying there - do you mean each operator winds up with a different "open source" version of NFV and SDN? If so, that's not at all what I'm saying.

I'm talking about what exists today and how you get from there to a more flexible, scalable infrastructure that is software-driven, uses mostly - not purely, but primarily - commercial off-the-shelf hardware instead of purpose-built telecom gear, and builds intelligence into software layers that can be more easily updated and changed.

And yes, I know there's a lot more to it than that. But the open source approach that Heather describes is a faster way of getting the industry to agree on what it needs to agree on, and differentiate on the rest as it chooses.
brooks7 7/23/2015 | 4:32:11 PM
Re: Understanding industry requirements and Open Source solutions
I disagree with that set of comments, Carol.

You would have to make an assumption that either you are running with a "standard" API that EVERYBODY uses (like, say, HTML5 - which is a standard) or that ALL products that you use have the exact same version (unedited) of the exact same open source product for that category. Otherwise you don't have the API (interface) standardization that allows two disparate systems, developed independently, to work together. Assume every product in the network uses a different set of NFV and SDN open source. You end up having to have standards to make them work together.

seven

cnwedit 7/23/2015 | 2:49:33 PM
Re: Understanding industry requirements and Open Source solutions
Interesting comment, Robin. The diversity of what's already deployed is definitely what makes technology transitions like this one so challenging, and standards work as well. It's one of the things that open source and the agile methods of developing software seem well-suited to address.
Robin Mersh 7/23/2015 | 2:11:00 PM
Understanding industry requirements and Open Source solutions
I really appreciate Heather (and Light Reading) highlighting the issues around standards development and Open Source development. I have known Heather for quite some time and I really value her insight.

It is certainly true that we are moving to more programmable, adaptable and software-defined networks. Developing solutions with more agility is paramount. The old paradigm of standards, where the 'perfect' solution trumped speed of delivery, is receding.

It is also true that there is still a lot of diversity in deployed networks. This is due to a number of factors: earlier technology decisions, the competitive landscape and the regulatory environment.

Developing effective solutions needs a clear understanding of requirements, and that does take discussion and agreement.  

I think we should welcome the disruption to standards from software driven solutions and openly collaborate.