
Report: P2P Filtering Needs Work

Light Reading
News Analysis
3/27/2008

Today's deep packet inspection (DPI) products may not be adequate for large-scale detection and filtering of peer-to-peer (P2P) traffic, according to a new report by Internet Evolution.

The report, entitled "Peer-to-Peer Filters: Ready for Internet Prime Time?" sought to test P2P filtering products by emulating today's carrier networks and measuring their performance.

In a lab test commissioned by Internet Evolution and SNEP (the "Syndicat National de l'Édition Phonographique," an industry organization that represents the interest of the French music industry), the European Advanced Networking Test Center AG (EANTC) was called on to determine how well carrier-grade systems would perform under network conditions.

More than two dozen vendors were invited to participate in the test, including established players -- such as Allot Ltd. (Nasdaq: ALLT), Cisco Systems Inc. (Nasdaq: CSCO), Ellacoya Networks Inc., F5 Networks Inc. (Nasdaq: FFIV), Huawei Technologies Co. Ltd., Narus Inc., Packeteer Inc. (Nasdaq: PKTR), and Sandvine Inc. -- and a number of lesser-known startups.

However, only five vendors agreed to take part, and even then only on the condition that they could withdraw at any time. By the time the test was completed, three of those participants had decided to keep their results to themselves. That meant only two vendors -- Ellacoya and ipoque GmbH -- were willing to make their results public, out of an initial list of 28.

Carsten Rossenhovel, managing director of EANTC, says it was unusual for so many vendors to decline such an invitation. However, he believes many ultimately opted out because their products were not up to snuff. "Over time we found out this was because many of the products were still in the early stages of development," he says.

The tested products fell victim to two major types of failure. The first was in accurately identifying and filtering the many different peer-to-peer protocols that people use.

The devices were about 90 percent accurate in identifying sessions for BitTorrent, the most popular P2P file-sharing protocol. However, they were less accurate with other protocols, allowing up to 30 percent of that traffic to pass. In total, 13 different P2P protocols were tested, but only the most widely used were reliably detected.
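That 90 percent figure reflects how recognizable the unencrypted BitTorrent handshake is: it begins with the byte 0x13 followed by the literal string "BitTorrent protocol". Below is a minimal Python sketch of the kind of Layer-7 signature match a DPI device performs on a flow's first payload; the flow-reassembly machinery is omitted and the sample payloads are illustrative, not drawn from the report.

```python
# Layer-7 signature check for the plaintext BitTorrent handshake.
# A real DPI engine runs many such signatures over reassembled flows;
# this sketch checks only the first payload of a connection.

BT_SIGNATURE = b"\x13BitTorrent protocol"

def looks_like_bittorrent(first_payload: bytes) -> bool:
    """True if the payload starts with the BitTorrent handshake signature."""
    return first_payload.startswith(BT_SIGNATURE)

# Illustrative sample payloads:
handshake = BT_SIGNATURE + bytes(8) + bytes(20) + bytes(20)  # reserved, info_hash, peer_id
http_get = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n"
```

Matching like this helps explain both the high BitTorrent accuracy and the misses elsewhere: every protocol, and every protocol revision, needs its own signature.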

The second failure was much more important to content owners, as the devices tested were only able to identify traffic by protocol, and not by content -- which isn't helpful for content owners looking to protect their copyrights.

"Service providers want to reduce P2P for network traffic. But media companies are not interested in reducing traffic; they just want to reduce copyrighted content" shared by P2P users, Rossenhovel says.

All of which points to needed improvement from vendors. In the meantime, service providers probably won't stop buying DPI filtering products, even if they're not perfect.

As Rossenhovel says, "If you are a service provider and have a desperate need for a device to control peer-to-peer traffic, you're likely to use a certain solution as long as nothing better is on the market."

The full report, along with detailed results, is available here.

— Ryan Lawler, Reporter, Light Reading

jepovic
12/5/2012 | 3:44:47 PM
re: Report: P2P Filtering Needs Work
Modern BitTorrent clients have built-in encryption between endpoints. This hides the BitTorrent protocol and effectively prevents any filtering at the protocol level, rendering the whole idea of P2P filtering useless. Of course, the encryption does not prevent identification by the content owners' spies, but that's beside the point in this case.
douaibei
12/5/2012 | 3:44:47 PM
re: Report: P2P Filtering Needs Work
P2P is an entertainment platform. I agree it provides convenient cover for IPR piracy, but at the same time P2P enables fast Internet file and content sharing. Without P2P, the Internet would be less attractive, as you would need to pay for everything: software, content.


The carriers and the IPR owners should think about how to reduce prices or even offer free service, making money instead from online ads, real-time interactive services, and differentiated access services.


For example: you offer the same movie at different resolutions, like 256kbps, 512kbps, and 1024kbps. You don't charge for 256kbps, which has very bad quality, but you charge 2p for 512kbps and 10p for 1024kbps.

At the same time, the carrier can offer differentiated access service, say 2p for 1024kbps traffic while the 256kbps service is free, etc.


There are a lot of mechanisms that providers and content owners can use to make money. Think about how much money Google makes. If Disney offered free movies with the same kind of ad service, I believe they could make more money than with the present business model, as the more subscribers you have, the more money you make.

BTW: P2P is very difficult to detect, as the patterns can change very quickly. Hardware chipsets will be easily defeated, as they cannot keep upgrading their P2P pattern base without compromising performance. This is the reason why not many vendors wanted to disgrace themselves in the test; DPI is a nice mirage, but the real performance remains to be seen.


Personally, I think blade-based service systems and traffic-shaping systems should be the carrier's priority, rather than differentiating too many services.
Ryan Lawler
12/5/2012 | 3:44:47 PM
re: Report: P2P Filtering Needs Work
Modern BitTorrent clients have built-in encryption between endpoints. This hides the BitTorrent protocol and effectively prevents any filtering at the protocol level, rendering the whole idea of P2P filtering useless

The test took into account both encrypted and unencrypted P2P traffic. The results of the encryption test are on page 8 of the report.

But to summarize -- encrypted P2P streams are not as undetectable as you might think. Clearly the products aren't perfect, but from the report and from demos I've seen of DPI products, even encrypted streams are recognized.
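One reason encrypted streams remain recognizable, even without a plaintext signature, is statistical: encrypted payloads look nearly random, so their byte entropy sits close to the 8 bits-per-byte maximum, unlike most plaintext protocols. Here is a hedged sketch of that heuristic; the thresholds and sample data are illustrative and not taken from the report.

```python
import math
import os
from collections import Counter

def byte_entropy(payload: bytes) -> float:
    """Shannon entropy of a payload, in bits per byte (0.0 to 8.0)."""
    if not payload:
        return 0.0
    total = len(payload)
    counts = Counter(payload)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Encrypted or compressed data approaches 8 bits/byte; plaintext sits far lower.
random_like = os.urandom(4096)              # stand-in for an encrypted P2P stream
plaintext = b"GET / HTTP/1.1\r\n" * 256     # repetitive plaintext traffic
```

High entropy alone cannot distinguish encrypted BitTorrent from, say, HTTPS, which is presumably why the tested products combine it with behavioural cues such as connection patterns.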
acohn
12/5/2012 | 3:44:45 PM
re: Report: P2P Filtering Needs Work
Haven't looked at the report, but I'd be curious to know how you detect the content of something that is encrypted by its nature.

However, with that said, it's an arms race that P2P blockers will not win.
jepovic
12/5/2012 | 3:44:40 PM
re: Report: P2P Filtering Needs Work
Ditto that, how is it done? Port numbers seem pretty dangerous, considering how many ports can be used.

An interesting test would be to see how much non-P2P traffic gets hurt by these techniques. After all, if filtering non-P2P traffic is not an issue, all the traffic can be rate-limited...
wisenet
12/5/2012 | 3:44:34 PM
re: Report: P2P Filtering Needs Work
Not sure I follow the logic of the commenters. Do you understand how all virus detection signatures work? If not, does that mean virus detection products don't work?

I would also question the conclusion of this article. As far as I know, there's no standard benchmark for such a test, which in my opinion is a good enough reason for established companies to refuse participation.

Sorry for being so negative here, but I would expect Light Reading to deliver more professional reports.

WN
cross
12/5/2012 | 3:43:36 PM
re: Report: P2P Filtering Needs Work
Jepovic,

You are right - P2P filtering on the basis of TCP ports is not an option.

To quote the report: "We chose unknown (undisclosed and varying) TCP ports. Network operators consider Layer 4 filtering by well-known TCP port numbers inferior; indeed, these port numbers can be modified just too easily, to circumvent detection."

Clearly, thorough P2P detection implies the use of signature-based and/or behaviour-based mechanisms. These are the technologies that differentiate P2P filtering devices from normal firewall activity.

The report also describes that background traffic was not affected during the P2P regulation tests - which answers the question about false positives.

Best regards, Carsten Rossenhoevel (EANTC)
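To illustrate the behavioural side mentioned above: one classic heuristic flags hosts that open connections to unusually many distinct remote endpoints within a time window, since a peer in a swarm contacts dozens of peers while a typical client contacts few. A minimal sketch follows; the threshold and sample flows are purely illustrative and nothing here comes from the report itself.

```python
from collections import defaultdict

PEER_THRESHOLD = 50  # illustrative cutoff, not a value from the report

def flag_p2p_hosts(flows, threshold=PEER_THRESHOLD):
    """flows: iterable of (src_ip, dst_ip, dst_port) tuples seen in a window.
    Returns the source IPs talking to more than `threshold` distinct
    endpoints -- a rough behavioural indicator of P2P swarm membership."""
    endpoints = defaultdict(set)
    for src, dst, port in flows:
        endpoints[src].add((dst, port))
    return {src for src, seen in endpoints.items() if len(seen) > threshold}

# One host contacting 60 peers on varying high ports; one ordinary client.
flows = [("10.0.0.1", f"198.51.100.{i}", 40000 + i) for i in range(60)]
flows += [("10.0.0.2", "203.0.113.5", 443)] * 3
```

A heuristic like this works regardless of encryption, though it risks false positives on legitimately connection-heavy hosts, which is presumably why products pair it with signatures.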
mccoyc
12/5/2012 | 3:43:35 PM
re: Report: P2P Filtering Needs Work
In the grand scheme of things, I think DPI and filtering will lead to an escalating arms race that nobody can win. Since the solutions I've seen so far interfere with TCP sessions by sending resets, I see P2P using customized stacks to send data. If TCP is interfered with, P2P will use UDP. If ports are used for filtering, simply randomize ports. Moreover, I see P2P spreading the load across multiple TCP sessions. We could very well see a customized P2P transport protocol that is neither TCP nor UDP! Such a protocol could use multiple pairs of ports for communication, which would leave NetFlow and the like absolutely worthless! Good luck guys...
fiber_r_us
12/5/2012 | 3:43:34 PM
re: Report: P2P Filtering Needs Work
For a provider to consider everything he doesn't recognize as "degraded" would be a *very bad idea*. There are *countless* apps the provider will never know anything about (and shouldn't), and there are new ones every day. The tail is very long indeed.

And, as you say, many of the enterprises already encrypt (either with SSL, or IPsec, or some proprietary mechanism).

The best a provider-based filtering scheme could do is try to identify traffic patterns (i.e. lots of sessions to certain sites). Even with that, assuming you could identify it was the dreaded P2P app, how do you know it wasn't legitimate content being distributed by legitimate players? What if Microsoft decides to use a P2P app to distribute those increasingly large updates?

It's a losing battle. The provider should learn how to leverage the technology, not fight it.