
FCC Mulling New Internet Rules

The Federal Communications Commission (FCC), already conducting a probe into how Comcast Corp. (Nasdaq: CMCSA, CMCSK) manages peer-to-peer (P2P) traffic, appears poised to take action to ensure that cable operators and telcos can’t discriminate against applications that use their high-speed networks. (See FCC Eyes Comcast's P2P Policies.)

That much was clear Monday at a public en banc hearing held at the Ames Courtroom at Harvard Law School in Cambridge, Mass., which was also broadcast over the Internet.

The FCC held the hearing as it considers rules to define what constitutes “reasonable” network management, and to address claims that operators are violating elements of the agency’s Internet policy statement. That statement holds that consumers are entitled to access lawful Internet content and to connect their choice of legal devices that don’t harm the network.

“These are very significant issues. We do not take these allegations lightly,” FCC chairman Kevin Martin said during his statements to a packed house.

He stressed that broadband service providers must set and apply their policies and practices in an “open and transparent way.”

U.S. Rep. Ed Markey (D-Mass.), who introduced a network neutrality bill earlier this month, argued that sufficient competition does not exist to make high-speed Internet access affordable.

“Competition should be our preferred policy,” he said. Markey further suggested that an even more open platform would jumpstart that competition, creating lower prices, higher speeds, and enhanced service quality.

But he delivered the most quotable line of the morning with respect to the management of P2P applications. The “intercession into a user’s access to the Internet should not result… in the transformation of BitTorrent into ‘BitTrickle.’” He called that a “problematic result [whether] purposeful or purely circumstantial.”

Whether operators’ choices on the matter are right or wrong, how they make those choices must be transparent and must not “shackle the promise of the Internet,” suggested FCC Commissioner Michael Copps.

Today, those decisions, while lawful, are being made in a “black box that the American people had precious little opportunity to peek into,” he said.

“We need to establish an effective Internet Bill of Rights that can secure Internet freedom for generations to come,” added FCC Commissioner Jonathan Adelstein.

Comcast has held that its practices are fair and are covered by the FCC’s policy statement. (See Comcast Defends P2P Management.) Comcast EVP David Cohen went into the lion’s den this morning to replay the MSO’s defense, noting that “reasonable network management practices are essential to broadband deployment.” He also reiterated that the MSO does not block Websites, applications, or Web protocols, including P2P services.

“Don’t let the rhetoric of some of the critics scare you. There’s nothing wrong with network management,” he said, noting that Comcast believes it has selected the “least intrusive method” that allows the greatest number of customers to avoid service degradation.

He also called on the FCC to maintain its light regulatory approach to the Internet.

Marvin Ammori, general counsel for Free Press, one of the organizations that have lodged complaints against Comcast, likened the MSO’s bandwidth management position to a “smoke screen,” suggesting that it “masks anti-competitive notions.”

P2P player Vuze has also filed a complaint against the MSO. On Monday, Vuze CEO Gilles BianRosa said his company has been playing a “cat and mouse” game with Comcast to keep its application in the fast lane. While he agreed that operators should be able to use “reasonable measures” to manage networks, he added, “We are against network management with no boundaries. It threatens the openness and freedom of the Internet."

Timothy Wu, a professor of law at Columbia Law School, suggested that any network management definition should ensure that operators, as a “simple rule,” be barred from blocking lawful applications.

Likewise, he is not a fan of deep packet inspection technologies. With DPI, “you have the technology of censorship being built into the system,” he warned.

He said it’s important that the U.S. remain the role model “for what an open Internet looks like. What happens here will be followed everywhere.”

— Jeff Baumgartner, Site Editor, Cable Digital News

OldPOTS 12/5/2012 | 3:46:51 PM
re: FCC Mulling New Internet Rules “Timothy Wu… suggested that any network management definition should ensure that operators, as a ‘simple rule,’ be barred from blocking lawful applications.
Likewise, he is not a fan of deep packet inspection technologies. With DPI, ‘you have the technology of censorship being built into the system,’ he warned.”

After reading many posts on the most efficient method to transport information, I think Professor Wu makes important points.

But aren't we really talking about fair traffic management, and not about running the cheapest network a service provider can devise? In a best-effort network, an "economically well managed" network means reducing bandwidth to a point just above where customer churn becomes too great to sustain the business model.

ATM created an elaborate QoS scheme to do traffic management. It was followed by best-effort IP, which created CoS, renamed by marketing as IP QoS. Then Ethernet, needing to distinguish traffic levels, came up with the 'q' bits. Each step provided less granular traffic distinctions. In best-effort networks, as bandwidth becomes scarce under heavy congestion, mainly from video and large file transfers, tension builds between the users of the network.
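
To make that shrinking granularity concrete, here is a rough Python sketch (the class names and mappings are illustrative assumptions, not any standard's normative table) of how one traffic class maps onto IP's 6-bit DSCP field and then onto Ethernet's 3-bit priority field:

    # Illustrative only: the same traffic class loses granularity as it is
    # re-marked from the IP DSCP field (6 bits, 64 codepoints) down to the
    # Ethernet 802.1 priority field (3 bits, 8 values). Class names and the
    # exact mappings below are assumptions for the example.

    DSCP = {                    # a few well-known DiffServ codepoints
        "voice":         46,    # EF
        "video":         34,    # AF41
        "critical-data": 26,    # AF31
        "bulk-data":     10,    # AF11
        "best-effort":    0,    # default
    }

    PCP = {                     # the same classes squeezed into 3 priority bits
        "voice":         5,
        "video":         4,
        "critical-data": 3,
        "bulk-data":     1,
        "best-effort":   0,
    }

    def mark(traffic_class: str) -> tuple[int, int]:
        """Return the (DSCP, Ethernet priority) pair for a traffic class."""
        return DSCP[traffic_class], PCP[traffic_class]

    for cls in DSCP:
        dscp, pcp = mark(cls)
        print(f"{cls:14s} DSCP {dscp:2d} -> Ethernet priority {pcp}")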

Are the four levels of differentiation in Ethernet enough to distinguish customer traffic? One could add more by adding DPI, with the risk of censorship by the network operator. Or an operator could use one of the two methods in IP that provide roughly ten to sixteen levels of service. But this is still done as a best effort by the network operator, meaning the differentiated traffic doesn't avoid delays or blocking under congestion in the cheapest network; it all just gets treated according to the rules. And again, the rules are hidden in the network, where the customer may not know which rule his traffic is being handled under. Isn't this the real problem? To picture how "treated according to the rules" plays out under congestion, see the toy scheduler below.
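
Here is a toy strict-priority scheduler (the classes, queue depths, and drop behavior are assumptions for illustration, not any operator's real configuration): marked traffic still waits or is dropped when the link is congested, and the sender is never told which rule applied.

    from collections import deque

    # Toy strict-priority scheduler: three queues, highest priority served
    # first, packets dropped silently when a queue is full.
    QUEUES = {2: deque(maxlen=100),   # premium
              1: deque(maxlen=100),   # assured
              0: deque(maxlen=100)}   # best effort

    def enqueue(priority: int, packet: bytes) -> bool:
        q = QUEUES[priority]
        if len(q) == q.maxlen:
            return False              # dropped; the sender never learns why
        q.append(packet)
        return True

    def dequeue() -> bytes | None:
        for priority in sorted(QUEUES, reverse=True):   # strict priority
            if QUEUES[priority]:
                return QUEUES[priority].popleft()
        return None                   # link idle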

By contrast, the complex but granular ATM approach statistically reserved resources (CAC) for each transmission (SPVCs) to ensure that the customer got the service he requested; if the service was not available, the customer was notified and could negotiate another acceptable service level.
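
A bare-bones sketch of that admission-control idea (hypothetical capacities and interface, not ATM signaling itself): the network either reserves the requested bandwidth or refuses explicitly, so the customer knows up front.

    class AdmissionControl:
        """Toy connection admission control, loosely in the spirit of ATM CAC:
        reserve bandwidth per request or refuse explicitly. The capacities and
        interface are hypothetical."""

        def __init__(self, link_capacity_mbps: float):
            self.capacity = link_capacity_mbps
            self.reserved = 0.0

        def request(self, mbps: float) -> bool:
            if self.reserved + mbps > self.capacity:
                return False          # rejected: the customer is told immediately
            self.reserved += mbps
            return True               # accepted: resources are set aside

        def release(self, mbps: float) -> None:
            self.reserved = max(0.0, self.reserved - mbps)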

This way the customer knows when the network is not designed to meet his requirements and must choose a lower service level. I am not advocating ATM itself as the solution, with all its complexity, only its approach to traffic management, which provides ways to define the customer's requirements, including the extent to which he is willing to share some or all of the bandwidth. If there are not enough resources to deliver the service, the customer also learns that quickly, and the service rejection statistics can be passed on to the network operator.

This approach would still allow customers to choose best effort if they wished, but since most customers have some critical transmissions (even email offers multiple priority choices), they could subscribe to the level of service desired for those transmissions. Because resources are reserved in advance, the customer would either get what he paid for or be notified that the service could not be delivered. The choice of service could include both dedicated bandwidth and shared (best-effort) bandwidth, negotiated and chosen by the customer.
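
Continuing the hypothetical AdmissionControl sketch above, the customer-side negotiation could look like this: ask for the preferred rate, step down if refused, and settle for shared best effort as a last resort (rates and policy are illustrative).

    def negotiate(cac: AdmissionControl, preferred_mbps: float) -> str:
        """Step down through lower rates until the network admits the flow,
        or fall back to best effort."""
        rate = preferred_mbps
        while rate >= 1.0:
            if cac.request(rate):
                return f"reserved {rate} Mbit/s"
            rate /= 2                 # offer a lower service level
        return "falling back to shared best effort"

    link = AdmissionControl(link_capacity_mbps=100.0)
    print(negotiate(link, preferred_mbps=20.0))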

Most service providers will soon come around to letting customers select a service level (today, that means an access bit rate). In the beginning this choice could be made per customer, but they will soon move to the next step as people want different levels per transmission.

Food for thought.

OP

PS - VBN networking is in your future!
NormalPerson 12/5/2012 | 3:46:49 PM
re: FCC Mulling New Internet Rules Have a look at some of the policy management solutions for subscriber and network resource-based admission control (TISPAN RACS, ITU-T RACF). A number of solutions are available that provide some of the capabilities you describe. It's debatable whether carriers would apply this in the best-effort domain, but it does allow granular per-user, per-session QoS control for premium applications like IPTV.
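
As a purely illustrative sketch of that kind of per-session decision (the data model and logic below are assumptions, not the TISPAN or ITU-T interfaces), admission would combine the subscriber's profile with the resources left on the path:

    from dataclasses import dataclass

    @dataclass
    class SubscriberProfile:
        entitled_services: set[str]   # e.g. {"iptv", "voip"}
        max_premium_mbps: float

    def admit_session(profile: SubscriberProfile, service: str,
                      requested_mbps: float, available_mbps: float) -> bool:
        """Toy per-session admission decision combining subscriber policy
        and resource availability; illustrative only."""
        if service not in profile.entitled_services:
            return False                      # policy says no
        if requested_mbps > profile.max_premium_mbps:
            return False                      # exceeds the subscribed tier
        return requested_mbps <= available_mbps   # enough headroom on the path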
materialgirl 12/5/2012 | 3:46:40 PM
re: FCC Mulling New Internet Rules It seems that this level of control would add costs to the network. Is there a break-even point, at which the cost of the added management is less than the value added?
rjmcmahon 12/5/2012 | 3:46:39 PM
re: FCC Mulling New Internet Rules Is there a break-even point, at which the cost of the added management is less than the value added?

Most people I've talked to suggest that adding complex traffic management like DPI isn't worth the expense if the goal is to transport bits as efficiently as possible. More importantly, there is a fundamental problem when the incentive is to invest in managing scarcity rather than in more supply or capacity. It's fairly obvious this behavior represents market and regulatory failure. Markey's rhetoric, while it sounds nice, misses the mark on coming up with real solutions, and real solutions won't be based on ideological babble.