lightreceding 1/16/2018 | 5:20:51 PM
Open Source in the Datacenter I remember going to the Open Compute Project event sponsored by Facebook a couple of years ago, and the keynote speaker stated that there would be no more software licenses and no more cardboard boxes. Facebook seems to have enough clout to make that happen with servers and switches. Amazon now designs its own switch as well, and it's a Cisco and Juniper killer.
brooks7 1/4/2018 | 11:17:18 AM
Re: Speed of Open Source Open Source is a development and ecosystem play.

Development:  Most (the VAST majority) of OS projects are components of a larger solution.  What I have seen in many cases here is the desire for end products to be Open Source or, better stated, free.  That makes no sense, and software vendors won't do it.

Ecosystem:  This is the equivalent of ONAP.  AT&T creates an entire solution for itself.  It adds value to the solution by releasing it to the public so that others can add code, find bugs, and ask for features.  AT&T gets incremental improvements on what it has already done.

In the IT domain, the component players make money through consulting.  The products are pretty good but if you drive them to the edges of their performance, then you will need their help to make this all work.  

Also, SOME standards bodies have been driven by carriers.  I would say the ITU, FSAN, and the later phases of the DSL Forum were driven by Service Providers.  The IEEE and IETF were vendor driven.  Japanese standards are somewhere in the middle...they are cooperative.

To Sterling's speed point: it is speed of development where OS shines.  Don't write what you can download.  I am not sure that OS is any faster to deploy than commercial software (in fact, I am sure that it is not).  That is because OS quality is a challenge; OS projects aren't realistically going to test to the same quality level as commercial software.  So that means you should be using stable versions of existing OS projects.  The issue then comes up of what you do when adopting newer versions of OS.  Theoretically, you shouldn't have to go back and reverify functionality.  The reality is that you do.  On top of that, the APIs in most OS programs change between major revisions, sometimes becoming incompatible.

In the end, it is not the cost of the tool...but how you use it. :)



Edit:  To answer Carol, that is actually how OS projects get started.  The broader community joins in over time if and only if the OS project ends up being popular.  The original code contributors tend to be one to a handful of people.

Carol Wilson 1/4/2018 | 11:13:42 AM
Re: Speed of Open Source More the latter -- I think his point about AT&T and the four years of internal development of what is now ONAP is about getting the software architecture right.  AT&T went about it one way; Tom is saying that doing it faster would require a small group of smart software architects -- or one very smart guy -- and that this isn't how open source has been done in the past.


Sterling Perrin 1/4/2018 | 10:38:27 AM
Re: Speed of Open Source Carol,

<I think you've hit at the heart of the issue, although I don't know that I agree with your assessment of Tom's argument.>

Which part of my assessment do you question? That Tom's argument puts speed as critical to open source value? Or that the four years at AT&T part erodes the speed value?

Carol Wilson 1/4/2018 | 10:30:22 AM
Re: Speed of Open Source Sterling, 

I think you've hit at the heart of the issue, although I don't know that I agree with your assessment of Tom's argument. 

I have heard many network operators say that open source is a faster way to address issues and arrive at consensus because it brings to the table those parties interested enough to develop and share code, and focuses the community on solving problems at that level. Bugs are identified and addressed faster and working groups tackle specific challenges and work to resolution faster. 

That said, operators aren't deploying open source on their own, they are looking to their vendors to provide distributions and ongoing support. And the question is, does that extra step mean open source moves at the same pace as standards development?

I personally don't think so because you still have the advantages mentioned above of getting to consensus faster and working out issues faster, but that's my impression as a journalist. These are questions I'm going to be asking a lot in the coming year. 

What I thought was interesting about Tom's insight was that he sees the value of the open source process but also its significant weakness - the software architecture piece. We are also seeing issues around how vendors derive value from this whole exercise, and that's a well-acknowledged challenge within the open source community but one it clearly hasn't solved.

Sterling Perrin 1/4/2018 | 10:16:44 AM
Speed of Open Source Originally, people thought open source's value was in being cheaper, but that value was diminished by the fact that software and hardware need to be supported and vendors need to make money in order to exist.

In its place, the speed of open source standard development emerged as the most compelling value proposition - particularly when compared to traditional standards development (such as ITU, IETF, etc.). I have presented on this open source value proposition at several conferences over the past year. And speed seems to be at the heart of Tom Nolle's argument as well.

But very recently I have been hearing network operators say that open source is not faster than traditional standards development. If open source is not faster, then this is a big problem for open source in telecom - it starts to lack a real value proposition!

Even while defending open source, Tom's assessment that ONAP is good because the code spent years under development at AT&T also erodes the open source speed argument.

If it can't be faster, I think open source will have little value.

vances 1/3/2018 | 10:47:21 PM
New Models for Partner Engagement Open source has proven to be the only acceptable solution for foundational technologies.

Vendors introduce proprietary solutions at the top of the stack, but as we build more and more layers above them, those dependencies will be met by open source replacements. Some vendors will hang on to the lion's share of that business, but open source replacements are necessary for the health, safety and stability of the ecosystem. The state of the operating systems market tells the story quite well: there are very few companies selling proprietary OSes left today.

As communications service providers become software-driven network businesses, they need to adapt their procurement rules of engagement and partnering strategies. Open source provides many advantages and opportunities on which CSPs may thrive, but you can't make a direct comparison to a vendor selling runtime use licenses. Open source solutions remove the CAPEX and focus on OPEX. Unlike hardware, software is dynamic and changes over time -- or at least it should in a cloud-native environment. The CTO office and procurement should focus on the cost of operations. Open source frees up CAPEX to fund OPEX, allows greater choice in future partnering decisions, and provides the ability to bring development and support in-house.

You can't ignore open source, as its successes are abundantly obvious; most of the software in use today is open source. That trend is only going to increase.

rocket101 1/3/2018 | 9:22:11 PM
Re: Some Analyst trying to justify pros of OpenSource. You get what you pay for. Period. Nothing is free. Analyst Tom, please look at the state of OpenStack. 
rocket101 1/3/2018 | 7:45:40 PM
Some Analyst trying to justify pros of OpenSource. >>Tom Nolle, president of CIMI Corp. , and someone I frequently quote because I think he's >>smart, says open source processes represent a better approach to consensus on >>standards than the traditional processes, but quickly adds there are problems there, too.


Quickly adds there are problems, too. Got it? Does he even know what he is talking about?

Open source is better, but buyer beware!!!! LOL