
AT&T: Don't Burden Us With New Regs

SAN DIEGO -- CTIA Wireless IT & Entertainment -- In his keynote Wednesday morning at the CTIA Wireless IT & Entertainment 2009 show, AT&T Inc. (NYSE: T) Mobility CEO Ralph de la Vega followed the Federal Communications Commission (FCC) chairman with a stand against more regulation in the wireless industry.

The wireless boss started his speech with a 10-point presentation about how and why the U.S. wireless industry is the most competitive in the world. He saved the red meat for his closing statement, however:

"Before we begin fixing what isn't broken, we need to be thoughtful about the consequences. We believe the marketplace today is vibrant, and there is no need to burden the mobile Internet with onerous new regulations," de la Vega told the crowd.

De la Vega's comments followed an opening keynote by new FCC chairman Julius Genachowski, in which the chairman laid out what he called his four-step plan to promote the takeup of 4G networks in the U.S. (See FCC at CTIA: 'Spectrum Is Oxygen'.)

The plan involves:

  • Unleashing spectrum for 4G mobile broadband and beyond
  • Removing obstacles to deploying robust and ubiquitous 4G networks, which would include removing tower siting obstacles [this drew a cheer from the crowd] and promptly issuing spectrum licenses
  • Fair rules of the road for an open Internet
  • Empowering consumers by supporting a vibrant, transparent, and open marketplace
"We don’t want heavy-handed and invasive regulation," Genachowski had reassured the industry-packed crowd earlier. The FCC is currently mulling over the thorny issues of competition and innovation in the American cellular industry. The government has also said that it will meet and rule on net neutrality for wired and wireless networks soon.

AT&T's de la Vega was sure to point out that his company also "supports the open Internet." AT&T yesterday lifted its restriction on iPhone users running VoIP applications over its 3G network.

— Dan Jones, Site Editor, Unstrung

lrmobile_kumaramitabh 12/5/2012 | 3:53:39 PM
re: AT&T: Don't Burden Us With New Regs
It is interesting to read that we are facing a spectrum shortage, and that one of the reasons for it is the 8 million iPhones and other smartphones in the market that have started to soak up bandwidth and spectrum. Framed that way, the problem is simple, and it is easy to believe that we are running out of spectrum. It is harder to see that we may also be running out of innovative technologies that could do more with the spectrum we have.
The battle cry of spectrum running out was raised in the 1990s, when analog cellular systems using 30 kHz per channel had soaked up all the spectrum with not even a million users. Once GSM and CDMA came in, little more was heard about spectrum until recently, when the data services the carriers had long sought after actually took hold. So are we once again on the doorstep of another technology that will deliver us from our spectrum woes?
Yes, we are, and even if we cannot yet rack our brains enough to say exactly what it is, it is not far off. The real problem is that the large operators always grow organically and have little inclination to delve into innovative technologies, and with the exponential growth we have seen so far, a spectrum shortage is the natural result.
A similar situation was foreseen in the early days of the Internet, when streaming was done without the benefit of content delivery networks (content mirrors at the network edge). It was sincerely believed that even a million users streaming from the same source would lead to the collapse of the Internet. Of course, nothing of the sort happened, because CDN technology arrived naturally.
P2P networks and technologies like BitTorrent create similarly efficient network infrastructures, though it is another matter that we have not yet developed sufficient expertise to let them operate securely. But it is now known that content need not be carried around recklessly if an appropriate architecture is available. This requires intelligent network infrastructures beyond what we have today, and with largely monopolistic control of the markets, few are willing to gamble on such infrastructures.
Wi-Fi is another technology that rose to prominence very quickly. It is eco-friendly in the sense that Wi-Fi “hotspots” are relatively small, so the same footprint of roughly 100 MHz of frequency is reused across the nation. With proper backhaul, whether fiber or WiMAX, it could enable data services on a much larger scale than is possible today. The FCC has been well aware of this and has a roaming arrangement in place for the wireless ISPs.
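A crude sketch of that spatial-reuse arithmetic, in Python, with entirely invented figures for cell counts and spectral efficiency (none of these numbers come from the article or any real deployment):

# Toy spatial-reuse arithmetic: one wide-area cell vs. many small hotspots,
# all reusing the same 100 MHz band. Every number is an invented assumption
# for illustration only.

BAND_MHZ = 100.0     # the shared band
BITS_PER_HZ = 1.0    # assumed average spectral efficiency (bit/s/Hz)

def aggregate_capacity_mbps(cells_covering_area):
    """Total capacity over an area when each cell reuses the full band."""
    return cells_covering_area * BAND_MHZ * BITS_PER_HZ

if __name__ == "__main__":
    for cells in (1, 10, 1000):  # one macro cell vs. progressively denser hotspots
        print(f"{cells:5d} cells over the same area -> "
              f"{aggregate_capacity_mbps(cells):9,.0f} Mbit/s aggregate")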
But the issue is that the mobile operators see Wi-Fi networks as “external” and go to great lengths to keep customers on the mobile networks for data. Even UMA (Unlicensed Mobile Access, also known as the Generic Access Network), which allows interworking between Wi-Fi and 3G networks, routes the data through 3G switches. Those switches themselves remain legacy circuit-switched architectures, with embedded gateways and signalling converters. An IP core is the objective, but it is not here yet. 3G does provide for multicast structures, i.e. MBMS, where a video stream can be multicast to thousands of users instead of thousands of individual streams being sent out from massive servers, each using up spectral resources. But there are virtually no implementations of MBMS so far.
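To put the multicast point in rough numbers, a minimal back-of-the-envelope sketch follows, with made-up figures for cell count, viewers and stream bitrate; it simply contrasts unicasting one popular stream to every viewer with sending it once per cell, MBMS-style.

# Toy comparison of unicast vs. MBMS-style multicast delivery of one
# popular video stream. Every figure below is an illustrative assumption,
# not a measurement from any real network.

STREAM_KBPS = 300        # assumed bitrate of the video stream
VIEWERS_PER_CELL = 200   # assumed simultaneous viewers in one cell
CELLS = 1000             # assumed number of cells carrying the stream

def unicast_load_kbps(cells, viewers_per_cell, kbps):
    """Every viewer receives a private copy over the air."""
    return cells * viewers_per_cell * kbps

def multicast_load_kbps(cells, kbps):
    """Each cell transmits the stream once; all viewers in it listen."""
    return cells * kbps

if __name__ == "__main__":
    uni = unicast_load_kbps(CELLS, VIEWERS_PER_CELL, STREAM_KBPS)
    multi = multicast_load_kbps(CELLS, STREAM_KBPS)
    print(f"Unicast air-interface load:   {uni / 1e6:.1f} Gbit/s")
    print(f"Multicast air-interface load: {multi / 1e6:.1f} Gbit/s")
    print(f"Reduction factor: {uni / multi:.0f}x (the assumed viewers per cell)")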
The ITU's next-generation network (NGN) initiative is based on an IP core, but with the ITU's parentage lying with fixed-line operators, it is still oriented towards those networks. Few have thought of mobile wireless networks with an integrated IP core using IPv6, which could use the spectrum optimally with techniques such as Continuous Packet Connectivity and P2P-style network architectures.
Today we are lacking on both fronts: the radio networks themselves, which are relatively inefficient, and the network architectures right up to the application level, which have outlived their useful lives.
We are not trying to say “this is it”, i.e. that mobile P2P is the answer, or that femtocells will enable extensive reuse. But we certainly have exhausted the technology that is driving the present networks. It is creating a Frankenstein's monster of spectrum requirements, and if the same technologies continue to be used, any amount of spectrum will eventually run up against a wall.
In fact, this state of affairs has arisen while the real use of video streaming is still very limited, due to the restrictions placed by the carriers, and the use of data while roaming overseas, or even within the country, is minuscule. The primary reason is that we are just at the beginning of the uptrend in video, gaming, navigation and multimedia services. That trend should take us to well over a hundred times the media use of today, if it is not restricted by inefficient architectures. To be fair, new architectures have been proposed by the 3GPP in the form of LTE, with speeds of 100 Mbit/s. But perhaps speed is not all that will be required. The granularity of data use will be of primary importance, where a wireless-enabled device may send just a few bytes a day without ever making a call or a connection. WiMAX architectures are good for low data granularity, but we will need to see how far those networks can go against a formidable competitor.
There are numerous other examples of technologies that took capacity far beyond the projected growth within the same physical infrastructure, such as optical fiber (transmission capacity today is never a limitation, even on the heaviest routes). Spot-beam Ka-band satellites are another example, but we will stop here, as such examples are far too many. All we can say is that new technology is the solution to the kind of impasse we now see in the “spectrum shortage”.
While we cannot predict the technology, we can begin to predict the trends that will constitute the elements of a future wireless network able to deliver much more within the resources we have today. For example, such a network will need to depend extensively on broadcast-based delivery, where a large base of users can be served with common content instead of millions of individual data streams. This may take the form of FLO, ATSC-M/H, MBMS or other technologies; which one exactly is not important, but one element will lie in mobile broadcasting. Secondly, it will need to deal with data streams of different types more intelligently: for VoIP the packets are small but the required periodicity is high and the latency low, while for dormant wireless devices (such as washing machines) latency hardly matters but the granularity required may be just a byte. The new architectures will also need to manage content more intelligently, as content now forms the bulk of what is transmitted on the networks; it may not be precisely P2P, but something close. And we will need architectures that let wireless devices talk peer-to-peer, because such short-range use of frequencies is ecologically more efficient: it does not impact a whole building or a city.
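A minimal sketch of what treating these stream types differently could look like, with purely illustrative class names and figures that are not drawn from any standard:

from dataclasses import dataclass

@dataclass
class TrafficClass:
    """Illustrative delivery requirements for one kind of data stream."""
    name: str
    payload_bytes: int     # bytes per transmission
    max_latency_ms: float  # how quickly each packet must arrive
    interval_s: float      # seconds between transmissions

# Made-up figures: VoIP wants tiny packets every 20 ms with tight latency;
# a dormant appliance sends a handful of bytes once a day and can wait hours.
CLASSES = [
    TrafficClass("voip",           payload_bytes=160, max_latency_ms=150.0, interval_s=0.02),
    TrafficClass("dormant_device", payload_bytes=8,   max_latency_ms=3.6e6, interval_s=86400.0),
]

def daily_bytes(tc):
    """Total bytes per day this class puts on the air interface."""
    return tc.payload_bytes * (86400.0 / tc.interval_s)

if __name__ == "__main__":
    for tc in CLASSES:
        print(f"{tc.name:15s} latency budget {tc.max_latency_ms:>9.0f} ms, "
              f"{daily_bytes(tc):>12,.0f} bytes/day over the air")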
The FCC will need to play a key role in such initiatives. Industry bodies such as the CTIA are important, but they represent the collective wishes of the larger players. It is the FCC that mooted net neutrality (which has not been to the liking of the established players and has yet to take off), and it is the FCC that can give innovative players the room to come up with something disruptive.
So are we going to see a Malthusian disaster of ever-growing demand and exhausted supplies? In a holistic model, yes. But the doomsday such models predict never arrives, as history is witness.
http://wimax-home.com