Content Switch Test

Content switches are one of those new-fangled product categories that promise big things but have yet to take off in a big way.

The big promises include enabling network operators to keep up with the dizzying pace of developments in Internet services, while cutting costs and improving security. They also include enabling carriers to make money by offering different grades of service at different prices.

So, if content switches are all that, you may ask, why hasn't their use taken off?

One reason is that content switches are still relatively new and unproven, which is precisely why Light Reading and its storage networking sister site, Byte and Switch, undertook to put them through the world’s first independent test of this type of product.

We wanted to help everybody move to the next stage – get a reading on whether content switches are ready for prime time – and to do this, we've put some of them through one heck of a hammering.

Light Reading commissioned the European Advanced Networking Test Center AG (EANTC) to devise and conduct a series of performance tests on content switches, using testing equipment and software from Ixia (Nasdaq: XXIA).

Sixteen vendors were invited to participate; three of them – Extreme, NetScaler, and WinCom – took up the challenge. Their boxes were pushed to the limit in seven tests that evaluated: performance and scalability when handling Web and TCP traffic; resilience to distributed denial of service (DDoS) attacks; and ability to handle secure sockets layer (SSL) connections. In addition, prices and feature sets were compared.
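For readers new to the category, here is a minimal, hypothetical sketch (in Python) of the core Layer 7 decision a content switch makes: inspect an incoming HTTP request and steer it to a back-end server pool, then spread connections across that pool. The pool names, addresses, and routing rules below are invented for illustration and don't describe any of the tested boxes.

# Hypothetical sketch of Layer 7 content switching: pick a server pool
# based on the request URL, then balance connections round-robin.
# Pool names, rules, and addresses are invented for illustration only.
from itertools import cycle

POOLS = {
    "images": cycle(["10.0.1.1", "10.0.1.2"]),
    "dynamic": cycle(["10.0.2.1", "10.0.2.2", "10.0.2.3"]),
    "default": cycle(["10.0.3.1"]),
}

def choose_server(request_path):
    """Route a request to a pool based on its URL path, then round-robin."""
    if request_path.startswith("/images/"):
        pool = POOLS["images"]
    elif request_path.endswith((".php", ".cgi")):
        pool = POOLS["dynamic"]
    else:
        pool = POOLS["default"]
    return next(pool)

if __name__ == "__main__":
    for path in ("/images/logo.gif", "/cart/checkout.php", "/index.html"):
        print(path, "->", choose_server(path))

A real content switch has to make this kind of per-request decision at wire speed, for huge numbers of simultaneous sessions – which is exactly what the Layer 7 and TCP tests described below are designed to stress.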

After much number crunching and careful interpretation of the results – bearing in mind that content switches can be used for a variety of applications – we had a clear winner: NetScaler.

NetScaler’s RS9800 achieved an overall weighted score of 4.3 out of 5. It got the best average mark in each group of performance tests – and, on top of that, it’s half the price of the Extreme and WinCom boxes.

Extreme and WinCom also did well, tying for second place with scores of 3.5 out of 5 – demonstrating that all three switches in this test are ready for serious use. All three vendors deserve praise for stepping up, showing faith in their products.

This report is chockablock with detailed explanations of how we went about the tests, the results we obtained, and our interpretation of them. Here’s a hyperlinked summary of the contents to get you started:

Results in a Nutshell
  • Overall weighted results
  • Explanation of scoring system
  • How different applications might influence weightings and results


Methodology
  • What content switches are, where they're deployed, and what they do
  • What test equipment was used
  • How the test equipment was used
  • Standard testbed configuration


Participants and Products
  • Details of products tested
  • Vendors that were invited but didn't show up


Layer 7 HTTP Test Group
  • Scene setter for the following three pages of tests
  • Summary of scores for this group of tests


HTTP Session Rate and Capacity
  • WinCom best on session rate
  • NetScaler best on capacity


HTTP Latency
  • NetScaler gets top score
  • Extreme not far behind


Layer 7 Content Switching
  • Also called load balancing
  • WinCom does well
  • NetScaler is next


TCP Connection Rate and Capacity
  • WinCom wins on rates...
  • But gets penalized on capacity
  • NetScaler pips Extreme for first place


Distributed Denial of Service Resilience
  • Different types of attack identified
  • All vendors do fairly well


Secure Sockets Layer Performance
  • NetScaler scores
  • WinCom is next
  • Extreme isn't a runner


Price and Features
  • NetScaler wins on price
  • NetScaler and WinCom get top marks for rich feature sets
  • Monster table comparing features


Glossary

About the Authors: Bernd Klusmann is project manager, and Carsten Rossenhövel is managing director, research and manufacturer testing, at the European Advanced Networking Test Center AG (EANTC). Klusmann may be contacted at [email protected] and Rossenhövel at [email protected].

slickmitzy 12/5/2012 | 12:32:55 AM
re: Content Switch Test

What I don't get is why some very strong players in the content switching market are not present in this test.

How come the Nortel Alteon is not included, or the Cisco content switching module for the 6500 switches, or the Radware CSD?
optical Mike 12/5/2012 | 12:32:54 AM
re: Content Switch Test I can think of only two reasons: either they didn't see any sales/marketing benefit from the test exposure, or they may have a bug or some operating flaw and not wish it exposed.
slickmitzy 12/5/2012 | 12:32:08 AM
re: Content Switch Test
Sorry, but I strongly disagree with the second comment.
I have extensive experience with Alteon switches of various models in SLB applications, FWLB applications, and transparent cache redirection.
I also have a little experience with the Cisco CSM module for the Catalyst 6500 switch.
They both operate well in a complex production network.
Light Reading presents it as if the players in this test are the best switches, when that is far from the truth.
If the vendors didn't want to be part of the test, then Light Reading should either buy their products and test them anyway, or at least state in the test summary that some major players are not included.

Amos

emont 12/5/2012 | 12:23:29 AM
re: Content Switch Test I agree with your opinion about this kind of benchmarking... (it's marketing).
Literary 12/4/2012 | 11:29:13 PM
re: Content Switch Test The pricing comparison covers only the top end. Is there any data available on the low end from the different vendors?

I got a pricing quotation from one of these vendors. Their low-end price is the same as the high-end price listed in your evaluation. I wonder whether this is possible or whether prices have changed since this test was done.