NFV Specs/Open Source

Virtualization Confidence Takes Hit in Latest HR Survey

The number of telcos saying they are extremely confident about meeting targets for the rollout of "high-priority" virtualized functions has fallen over the last year, according to Heavy Reading's latest Future of Virtualization index survey results.

The results of the index, which canvasses opinion in the telco sector once every six months as part of the Virtuapedia research project, could reflect some despondency about the deployment of virtualization technology, as companies wrestle with interoperability and standardization challenges. (See Virtuapedia Community Hits 10,000.)

Just 5% of respondents said they were "extremely confident" their company would meet its timetable for the testing and deployment of virtualized functions in "high-priority areas" -- down from 22% a year earlier.

In November last year, about 42% of survey participants expected that by now they would have identified all of the functions they intend to virtualize by 2020. Yet only 10.7% of respondents have actually done so, the latest research shows.

"By any standards, that is a very big miss," said Patrick Donegan, chief analyst with Heavy Reading, during a presentation at Light Reading's Executive Summit in Rome this week.

It is not all doom and gloom, however. When it comes to progress on virtualization planning, there has been a notable uptick since the last survey six months ago.

At that time, just 55% of respondents said they had completed most of their planning for high-priority areas -- the same as in November 2015. But the figure had risen sharply to 70% in the recent survey.

"When you contrast the results in terms of identifying functions with the strong planning figures, I don't know that I would call any of it disappointment," said Donegan. "It seems more like focus."

More than 70% of telcos also expect that capital expenditure on virtualization will increase next year, compared with 2016, although nearly 40% reckon spending will increase by less than 10%.

Want to know more about the companies, people and organizations driving developments in the virtualization sector? Check out Virtuapedia, the most comprehensive online resource covering the virtualization industry.

While a small number of operators are pioneering the rollout of software and virtualization technologies, many are waiting for technologies to mature -- and for evidence of the actual benefits -- before making any commitments.

"We must fix the problems with virtualization, making it easier to deploy and guaranteeing that products from different vendors will work together," said Steve Saunders, the founder and CEO of Light Reading, during a morning presentation at the Rome Executive Summit.

The New IP Agency, a not-for-profit group established by Light Reading, was launched earlier this year to address the interoperability challenges surrounding virtualization technology, and has already secured the backing of some of the world's biggest operators and equipment vendors. (See NIA Tests Reveal OpenStack Version Challenges.)

Having already carried out a number of interoperability tests -- working in partnership with the European Advanced Networking Test Center AG (EANTC) -- the NIA this week announced plans to begin certifying virtualization labs as well as technologies.

The move is aimed at speeding up the development of technologies and their subsequent rollout in live production networks.

Just 11% of survey respondents now claim to have 20% of their high-priority virtualized functions in live networks, but 45% expect to reach that milestone by the end of 2017.

— Iain Morris, News Editor, Light Reading


brooks7 12/19/2016 | 12:11:10 PM
Re: Where is the real hurdle?

I agree with your commentary.  One can't expect vendors to spend millions developing full-on products and then make them available for free.

The other part of this is that the web giants understand that there are lots of challenges with any sort of architecture and set of choices that they make.  The only way to make progress is to take things as they are and stitch them together.  The web giants understand that they have to develop their own products and that they source components from vendors and Open Source.  As you point out this is anathema to the major service providers.

What this would look like from an SP standpoint is that they would pick a number of Open Source products, assign a group of engineers to patch them together, and deploy.  They would not have a standards body worry about things.  The more successful a program was, the more likely its lower-level components would be chosen for the next project.

If you are going to try to do this with 3rd-party products, then the right thing to do is to encapsulate them so that they are not integrated with the service provider infrastructure. That way any interfaces are plug-standard (TCP ports, for example) and can be provisioned over REST APIs using something like JSON.  The NOCs would have separate monitoring for those individual elements, and they would have to be self-monitoring and self-reporting for errors, potentially with variable monitoring in the wrappers.  Thus the software team at the service provider becomes responsible for the NOC integration.
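To make the wrapper idea concrete, here is a minimal sketch of what such an encapsulation could look like: a self-reporting health endpoint served over a plain TCP port, with status encoded as JSON. This uses only the Python standard library; the component name `demo-vnf` and the `/health` path are hypothetical choices for illustration, not anything prescribed by the comment above.

```python
import json
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_health_report(component, status, errors=None):
    """Self-reported health payload that a wrapped third-party element exposes."""
    return {
        "component": component,
        "status": status,          # e.g. "ok" or "degraded"
        "errors": errors or [],
        "reported_at": datetime.now(timezone.utc).isoformat(),
    }

class HealthHandler(BaseHTTPRequestHandler):
    """Answers GET /health with JSON so a NOC can poll the wrapper on a plain TCP port."""
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps(build_health_report("demo-vnf", "ok")).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# To expose the endpoint, bind the wrapper to a well-known port:
#   HTTPServer(("", 8080), HealthHandler).serve_forever()
```

The design point is that the NOC integration lives entirely on the service provider side: the wrapped element only has to answer a standard HTTP GET with a JSON body, so the third-party product underneath can be swapped without touching the provider's infrastructure.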


To your other point, I actually disagree.  I think the right thing for them is to become not more complex but more simple.  Admit that they are in the capex deployment and exploitation business and figure out how to make networks cost less over time.  I think the whole "service providers should become web giants" idea is going to fail, and fail miserably.


patricknmoore 12/19/2016 | 11:00:16 AM
Re: Where is the real hurdle? I certainly agree with the comment about the number of software engineers. Service providers typically take the approach of just buying it, or choosing an open source alternative whose support they can outsource...and they NEVER improve it, they just use it. This isn't 100% true in all cases, but the cases where it isn't true are the exception rather than the rule.

However, ask yourself about the differences in the companies you mentioned that have successfully transitioned to where everyone should be headed and the typical service provider. Indeed, the above point on software engineers is part of it, but it is part of a three pronged issue:
  1. Staffing software engineers, as mentioned by you and agreed to by me, is a major part. Service providers lean on vendors (vendors that oversell, and sometimes outright lie, about their capabilities) too much to have been able to achieve the velocity that the Google/Facebook/Amazons of the world have.
  2. Legacy networks are another piece. What is the difference between a Verizon or AT&T and a Google or Facebook? 80-90%+ of the networks of the new breed of company exist in the data center. It is pretty safe to say the opposite is true for the service provider world. With the service providers I have worked with, the confusion about operationalization has not been with the segments in the data center. Google and others have shown the service providers how to do that piece. It is with how to extend those concepts beyond the data center and out to the customer premises, which they have limited control over (but which contain networks of their own that the service provider MUST be compatible with), by traversing their legacy networks. Oh, and add to that the cost of continuing to try and maintain legacy services through increased automation in those spaces. Legacy is not going away anytime soon.
  3. Organizational vision is, potentially, the largest hurdle. It plays into the staffing area. Service providers are going to have to go through a paradigm shift where they evolve themselves into the new world as well...not just their network and tools. I see them struggling with this more than anything. There is a battle going on in at least some (I can't speak for all because I haven't personally been involved with all, of course) of the service providers between network engineering, IT Ops, and their App Dev groups. In many cases upper-level leadership is leaving it to the organizations to just figure it out, and that is allowing turf wars to build. Until service providers decide that there is NOT a clear line anymore between the network, IT infrastructure, and software groups in the future world, they are going to continue to spin their wheels just like they are now.

So, it isn't just as simple as having more people...even more of the right people. They MUST evolve. They MUST find the balance between the old and the new networks. They MUST change their mindset.

All of these are easier in companies with a decade's, maybe two decades', worth of history building their DNA, versus ones that have almost a hundred years' worth...in some cases. I think it needs those companies to form a new organization that has THIS as its DNA, and to let it grow over time. I am not sure any of them are willing to do that, however.
brooks7 12/19/2016 | 12:04:09 AM
Re: Where is the real hurdle? So patricknmoore, ask yourself this question:

Facebook, Google and Amazon have completely migrated to virtualized networks.  Why are they able to complete their work before traditional service providers even start?


PS - the answer can be found in the number of software engineers that they employ.
patricknmoore 12/18/2016 | 11:27:11 AM
Where is the real hurdle? It is certainly true that there is a ton of work to be done in the following areas before NFV becomes real:
  • VNFs - there need to be more real VNFs, versus just making a software based copy of the physical devices people talk about. Most vendors haven't figured this out yet, it seems.
  • VNF licensing - this model has to change as well. These are new things, not just another network device, and they should be licensed differently.
  • Interoperability - there are a plethora of tools out there, and it is too hard to try and make them work together...if they even can be made to work together.

Those are all mentioned in the article and survey. In my experience there is a bigger hurdle than those, however, and that is the management layer above all of this.

The MANO space, to use the ETSI terms, is fragmented and full of immature solutions right now. I see:
  • Vendors that are creating proprietary solutions that force you to go with THEIR VNFs, THEIR version of OpenStack (or other VIM), THEIR Orchestrator, THEIR VNF Manager...
  • The open source alternatives have multiplied like rabbits. There are many of them, especially adding in the open sourcing of ECOMP by AT&T (which is a tool and a framework melded into one thing) and it is confusing people...and none of these are mature enough yet either

The result of those two things is an NFV space where, even after a service provider finds the right use case, tests that the virtualized components work, and is ready to go to trial with real customers, it is absolutely confused about how to operationalize all of this.

Every service provider I have spoken with talks about open source, but is being led down a different road by its vendors...and doesn't seem to see it. They are being led back to the same world they are in now.

Vendors preach open source, then supply tools that are not. They have individual employees that KNOW how things should be, and are brilliant people. However, the business decisions made above those people lead to a "protect your base" mentality. I understand this, because they are trying to make money. They aren't doing this under public funding that is meant to advance the industry.

So, what does all of this mean, in my opinion? 

It is going to take time. Service providers have to push their vendors on interoperability. Vendors have to listen. As much focus has to be put on operationalization as anything...maybe MORE.

Even IF the network side of it is solid, if the management of it is not as solid, or better, then NFV will never realize its potential and will be seen as a failed science experiment.
lstark 12/8/2016 | 3:44:09 PM
NFV Interoperability I would assume that the concerns about interoperability expressed at Vision 2020 in Rome were for the data layer, but after all, Light Reading is about optical networks, so it makes me wonder if the attendees were thinking of the effects of disaggregation in the DWDM network. Can anybody help me with this?
mendyk 12/8/2016 | 10:06:49 AM
Re: Spending glass half-empty or half-full? One of the more interesting things about this project is that it confirms what we've known for more than a decade now -- the hype/disillusion/reality progression is a fact of life in this industry. Marketers and "experts" start the progression, business types watch it fall into the trough of despair, and the people who actually do the work pull things up into reality.
Carol Wilson 12/8/2016 | 5:07:59 AM
Spending glass half-empty or half-full? When this report was presented at our Rome 2020 Vision Summit, the audience response was interesting. Some folks were buoyed by the news that spending will go up among more than half of the operators, even though the survey also showed two-thirds would spend 10% more or less in the coming year. 

Others saw that as discouraging - I guess it's a matter of perspective. 

Without a doubt, NFV is getting real and facing real roadblocks. 