
Virtualization Won't Kill Hardware Test Tools

Danny Dicks
Heavy Lifting Analyst Notes
8/18/2016

Think of network functions virtualization (NFV) and you're likely to think of monolithic hardware boxes being replaced with software running on virtual machines (VMs) in multiple locations, bringing down costs, increasing flexibility and agility and speeding time to market for new services. Among the functions ripe for virtualization is test and measurement (T&M).

It's certainly true that NFV is shaking up this market: it is significantly blurring the lines between service assurance and T&M, replacing old testing processes with a DevOps approach and drawing new players into the market -- for instance, with open source traffic simulators.

You might think that the days would be numbered for the dedicated hardware test device, operated by an engineer in a van driving to a specific location to run a specific test according to a set routine -- but this is far from the case. It's true that virtual active and passive probes and agents are replacing hardware-based probes at aggregation points because they can be deployed much more widely and flexibly, turned up or down as required for network monitoring and can run specific tests on demand. However, these software tools cannot do everything that hardware test devices can do.

One of the consequences of NFV is that more capability is being moved to the edge of networks -- for instance, to support virtualized customer premises equipment (vCPE) services for enterprises and cloud radio access network (RAN) in mobile networks. This is resulting in higher-capacity links between edge locations, which must be tested using tools that can emulate very high throughputs and can test synchronization down to microseconds. It's not possible to do this with virtualized software test tools. This means there is a continuing -- even growing -- need for "engineer plus hardware tool" testing as NFV gathers pace.

But that doesn't mean it's business as usual for the vendors of this specialist equipment. The test tools are changing and fitting into more automated processes, deskilling the engineer and leveraging connectivity between the tool and a central lab, from where tests can be automatically configured and to where results can be instantly uploaded for analysis. Of course, test configuration and data analysis could be performed using virtualized equipment -- offline, but in near real-time. So while the hardware test tool is far from dead, it's certainly not immune from the impact of virtualization.

The latest Heavy Reading report, "Test & Measurement for NFV," examines the approaches being taken in testing, measuring and monitoring a virtualized network and the ways vendors' offerings are changing. The focus is on traditional network T&M equipment vendors, including those that offer network-oriented service assurance solutions. The report examines the blurring lines between test/measurement and service assurance and the impact of this on vendors and their service provider customers. It profiles providers of T&M probes and other tools, a testing services consultancy with an NFV testing framework and a high-level service assurance solution provider that is competing with assurance platforms built by probe vendors.

— Danny Dicks, Contributing Analyst, Heavy Reading

brooks7
User Rank: Light Sabre
8/19/2016 | 2:07:00 PM
Re: Well as usual I strongly disagree
So, since Virtualization dominates the IT space and has existed for 10+ years... how is the Enterprise market for HW test pieces? Go to the Barracuda website and see the whole wide range of vCPE that already exists. And that is my point... here is just another example of the Virtualization thing being treated as new. It's not.

Virtualization is an established market and HW-based test is dead. It will take some time to die, and there will be a couple of places that won't change. But people need to go look at Virtualization in action.

seven

 
aazizk
User Rank: Light Beer
8/19/2016 | 10:37:59 AM
Your assumption is generalized
I partly agree with your assumption: yes, on the aggregated backhaul of the vCPEs the requirements for T&M will become more and more stringent, because the operator will be providing multitudes of services with different service attributes. But what about the user side? All the vCPEs should be able to auto-configure and auto-test themselves, which will reduce hardware-based testing at customer premises to almost zero. Please correct me if I am wrong.
DannyDicks
User Rank: Blogger
8/19/2016 | 3:36:33 AM
Re: Well as usual I strongly disagree
Some of what you say makes sense. But even the most bullish virtual test platform vendors acknowledge the continuing need for high-performance hardware test equipment, and the limitations of virtualized tools.
brooks7
User Rank: Light Sabre
8/18/2016 | 1:53:23 PM
Well as usual I strongly disagree
 

The basic problem with a virtual network is that the testing probes have to be instantiated as part of the element being created.  You can't have a separate test environment.  In fact, it should be considered normal for test functions to be rolled up as part of the development with the applications registering with the NOC and reporting back all the stuff that it needs to.

This basically kills HW test boxes. Where do you plug them in if you don't have a plug?

seven

 