Analyst Nolle: Open Source More Likely to Solve Telecom's Problems

Carol Wilson
1/3/2018

A veteran industry strategy consultant says the traditional standards development process isn't going to play much of a role going forward because it's too slow, and he points to the developing 5G standards as an example.

Tom Nolle, president of CIMI Corp., and someone I frequently quote because I think he's smart, says open source processes represent a better approach to reaching consensus on standards than the traditional processes, but he quickly adds that there are problems there, too.

"I think 5G is demonstrating, and NFV/SDN have already demonstrated, that we are going to have to figure out a different way to do this or we've lost any hope of relevancy," Nolle comments in an interview, following his 2018 predictions blog. "So compared with standards, open source is great."


Because of anti-trust laws in the US and Europe, an open consensus-based process is required for developing telecom specs, but in standards bodies, those processes have for too long been dominated by vendors, whose concern for near-term products and profits can skew their participation toward obstructionism, Nolle notes. There are countless examples over the past few decades of standards processes bogging down as vendors pressed their own advantage and their own versions -- think IMS, AIN and even ISDN, for those with long memories.

"The advantage open source has is, in the main, vendors are not tremendously interested in adopting open source because they don't make money on it," he continues. "So vendors have been inclined to be less obstructionist in open source areas than they have been in standards bodies, and that reduction in obstruction from vendors is one of the factors that allows open source to move forward."

If you sense the "but" coming...

Nolle believes an underlying presumption of the open source process is that it is addressing software-driven functionality. The success of any given open source project, in his view, will rest on having the right software architecture. "In the main, the people involved in these processes are not software architects, however, so we don't necessarily start open source projects with the right approach," he adds.


Want to know more about NFV and open source strategies? Check out our dedicated NFV content channel here on Light Reading.


One project that is off on the right foot in that regard, in Nolle's opinion, is the Open Network Automation Platform (ONAP) -- and yes, that is a reversal of some earlier comments he made on this project. (See ONAP Takes Flak as Telcos Prep for Release 1.)

Nolle now describes ONAP as "the only hope remaining for successful software-driven networking in telecom," adding that this lofty position is based on four years of work internally at AT&T Inc. (NYSE: T).

"The only reason ONAP is any good is because AT&T did it internally and then made it open source," he says. "If they had ceded the concept of ONAP to open source four years ago when the process started, I think they would still be screwing around with scope and other discussions. We'd be nowhere close to where it is today."

What it all comes down to is that the industry needs consensus, and in today's world, that's easier to achieve in open source than in traditional SDOs -- assuming they get the software architecture right to begin with, Nolle says. In fact, he adds, open source approaches may be the only way to tackle the scope of the issues the industry faces.

"If you look at NFV, for example, the real problem isn't NFV anymore, it is service lifecycle automation -- and that envelops cloud hosting, data center operations, Big Data, FCAPS, OSS-BSS -- there are so many pieces that probably no vendor would even dream of an implementation that broad," he notes. "But the business case of software-defined next-gen networking demands that kind of breadth or you are not going to be able to deploy it."

Open source represents a way of developing such a "huge strategy" because "you don't have ten different vendors all duplicating efforts trying to get things done, you have a cooperative activity. If you make the assumption that they get the software architectures right, then the reason ONAP and concepts like ONAP succeed is that they are too big for anybody to do any other way," Nolle concludes.

It's an interesting perspective on open source at a time when the industry is still broadly debating its value and whether it truly speeds innovation and consensus. That's a topic that will certainly rise to the fore in the coming year, so stay tuned.

— Carol Wilson, Editor-at-Large, Light Reading

lightreceding,
User Rank: Light Sabre
1/16/2018 | 5:20:51 PM
Opensource in the Datacenter
I remember going to the Open Compute Project event that is sponsored by Facebook a couple of years ago and the keynote speaker stated that there would be no more software licenses, and no more cardboard boxes. Facebook seems to have enough clout to make it happen with servers and switches. Amazon now designs their own switch as well and it's a Cisco and Juniper killer.
brooks7,
User Rank: Light Sabre
1/4/2018 | 11:17:18 AM
Re: Speed of Open Source
Open Source is a development and ecosystem play.

Development:  Most (the vast majority) of OS projects are components of a larger solution.  What I have seen in many cases here is the desire for end products to be open source or, better stated, free.  That makes no sense, and software vendors won't do it.

Ecosystem:  This is the equivalent of ONAP.  AT&T creates an entire solution for itself.  It adds value to the solution by releasing it to the public so that others can add code, find bugs, and ask for features.  AT&T gets incremental improvements on what it has already done.

In the IT domain, the component players make money through consulting.  The products are pretty good but if you drive them to the edges of their performance, then you will need their help to make this all work.  

Also, SOME standards bodies have been driven by carriers.  I would say the ITU, FSAN, and the later phases of the DSL Forum were driven by Service Providers.  The IEEE and IETF were vendor driven.  Japanese standards are somewhere in the middle...they are cooperative.

To Sterling's speed point: it is speed of development where OS shines.  Don't write what you can download.  I'm not sure that OS is any faster to deploy (in fact, I am sure that it is not) than commercial software.  That is because OS quality is a challenge.  These projects aren't realistically going to test to the same quality level as commercial software.  So, that means you should be using stable versions of existing OS projects.  The issue then comes up of what you do in adopting newer versions of OS.  Theoretically, you should be able to avoid going back and reverifying functionality.  The reality is that you have to.  On top of that, the APIs in most OS programs change between major revisions, sometimes becoming incompatible.

In the end, it is not the cost of the tool...but how you use it. :)

seven

 

Edit:  To answer Carol, actually it is how OS projects get started.  The broader community joins in over time if and only if the OS project ends up being popular.  The original code contributors tend to be 1 to a handful of people.

 
Carol Wilson,
User Rank: Blogger
1/4/2018 | 11:13:42 AM
Re: Speed of Open Source
More the latter -- I think his point about AT&T and the four years of internal development of what is now ONAP is about getting the software architecture right.  AT&T went about it one way. Tom is saying that doing it faster would require a small group of smart software architects -- or one very smart guy -- and that this isn't how open source has been done in the past. 

 

 
Sterling Perrin,
User Rank: Light Sabre
1/4/2018 | 10:38:27 AM
Re: Speed of Open Source
Carol.

<I think you've hit at the heart of the issue, although I don't know that I agree with your assessment of Tom's argument.>

Which part of my assessment do you question? That Tom's argument puts speed as critical to open source value? Or that the four years at AT&T part erodes the speed value?

Sterling
Carol Wilson,
User Rank: Blogger
1/4/2018 | 10:30:22 AM
Re: Speed of Open Source
Sterling, 

I think you've hit at the heart of the issue, although I don't know that I agree with your assessment of Tom's argument. 

I have heard many network operators say that open source is a faster way to address issues and arrive at consensus because it brings to the table those parties interested enough to develop and share code, and focuses the community on solving problems at that level. Bugs are identified and addressed faster and working groups tackle specific challenges and work to resolution faster. 

That said, operators aren't deploying open source on their own, they are looking to their vendors to provide distributions and ongoing support. And the question is, does that extra step mean open source moves at the same pace as standards development?

I personally don't think so because you still have the advantages mentioned above of getting to consensus faster and working out issues faster, but that's my impression as a journalist. These are questions I'm going to be asking a lot in the coming year. 

What I thought was interesting about Tom's insight was that he sees the value of the open source process but also its significant weakness -- the software architecture piece. We are also seeing issues around how vendors derive value from this whole exercise, and that's a well-acknowledged challenge within the open source community but one it clearly hasn't solved.

 
Sterling Perrin,
User Rank: Light Sabre
1/4/2018 | 10:16:44 AM
Speed of Open Source
Originally, people thought open source's value was in being cheaper, but that value was diminished by the facts that software and hardware need to be supported and vendors need to make money in order to exist. 

In its place, the speed of open source standard development emerged as the most compelling value proposition - particularly when compared to traditional standards development (such as ITU, IETF, etc.). I have presented on this open source value proposition at several conferences over the past year. And speed seems to be at the heart of Tom Nolle's argument as well.

But very recently I have been hearing network operators say that open source is not faster than traditional standards development. If open source is not faster, then this is a big problem for open source in telecom - it starts to lack a real value proposition!

Even while defending open source, Tom's assessment that ONAP is good because the code spent years in development at AT&T also erodes the open source speed argument.

If it can't be faster, I think open source will have little value.

Sterling
vances,
User Rank: Lightning
1/3/2018 | 10:47:21 PM
New Models for Partner Engagement
Open source has proven to be the only acceptable solution for foundational technologies.

Vendors introduce proprietary solutions at the top of the stack, but as we build more and more layers above, those dependencies will be met by open source replacements. Some vendors will hang on to the lion's share of that business, but open source replacements are necessary for the health, safety and stability of the ecosystem. The state of the operating systems market tells the story quite well: there are very few companies selling proprietary OSes left today.

As communications service providers become software-driven network businesses, they need to adapt their procurement rules of engagement and partnering strategies. Open source provides many advantages and opportunities on which CSPs may thrive, but you can't make a direct comparison to a vendor selling runtime use licenses. Open source solutions remove the capex and focus on opex. Unlike hardware, software is dynamic and changes over time, or at least it should in a cloud-native environment. The CTO office and procurement should focus on the cost of operations. Open source frees up capex to fund opex and allows greater choice in future partnering, along with the ability to bring development and support in-house.

You can't ignore open source, as its successes are abundantly obvious; most of the software in use today is open source. That trend is only going to increase.

 
rocket101,
User Rank: Lightning
1/3/2018 | 9:22:11 PM
Re: Some Analyst trying to justify pros of OpenSource.
You get what you pay for. Period. Nothing is free. Analyst Tom, please look at the state of OpenStack. 
rocket101,
User Rank: Lightning
1/3/2018 | 7:45:40 PM
Some Analyst trying to justify pros of OpenSource.
>>Tom Nolle, president of CIMI Corp. , and someone I frequently quote because I think he's >>smart, says open source processes represent a better approach to consensus on >>standards than the traditional processes, but quickly adds there are problems there, too.

 

Quickly adds there are problems too. Got it? Does he even know what he is talking about?

Open source is better, but buyer beware!!!! LOL