Big Bad Backhaul Blast

4:30 PM -- One of the key challenges that essentially every wireless carrier will face over the next few years is enhancing the backhaul capacity of its infrastructure. By backhaul, I mean in this case the interconnection of cells to the rest of the network's infrastructure. To this point, most cellular networks, for example, didn't need much backhaul -- a T1 or two could do it. Voice consumes almost nothing, and data services have seen relatively little use. Most data services (most notably, non-multimedia Web access and especially email) have good tolerance for latency, so, while more capacity is always better, there hasn't been much incentive for carriers to add it.

That's going to change dramatically over the next few years. As the voice market saturates (any minute now), data will consume a greater percentage of backhaul capacity. And data rates are rising -- megabit and even multi-megabit subscriber access is going to become common. And while carriers will always be reluctant to commit to any given level of service in terms of throughput and/or latency (or anything else, I'll bet), competitive pressures are going to replace "Can you hear me now?" with "Did you get that huge file I sent?" "Why, yes I did!"
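To see why a T1 or two won't cut it much longer, here's a rough back-of-the-envelope sketch. The T1 rate (1.544 Mbit/s) is standard; the subscriber counts, peak rates, and oversubscription ratio below are purely illustrative assumptions, not carrier data.

```python
import math

T1_MBPS = 1.544  # capacity of a single T1 line, in Mbit/s


def t1s_needed(subscribers, peak_mbps_each, oversubscription=20):
    """Estimate T1 lines needed at one cell site, assuming only
    1-in-`oversubscription` subscribers are active at peak rate."""
    demand_mbps = subscribers * peak_mbps_each / oversubscription
    return math.ceil(demand_mbps / T1_MBPS)


# Illustrative: a site with 200 data subscribers at 2 Mbit/s peak,
# 20:1 oversubscription -> 20 Mbit/s of demand -> 13 T1s
print(t1s_needed(200, 2.0))  # 13
```

Even with aggressive oversubscription, multi-megabit access rates push a single site well past what a couple of T1s can carry -- which is exactly the pressure driving interest in higher-capacity backhaul.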

I recently did a briefing with BridgeWave Communications, a company that specializes in gigabit point-to-point 60GHz (unlicensed) and 80GHz (licensed) links. What's really amazing is that these are getting cheap enough that carriers (and enterprises looking for P2P connectivity as an alternative to installing fiber) should have no problem working this new capacity into their business plans. Oh, sure, they'll think of some reason to put it off -- but I would suggest sooner rather than later is going to be essential to keeping a competitive edge.

— Craig Mathias is Principal Analyst at the Farpoint Group, an advisory firm specializing in wireless communications and mobile computing. Special to Unstrung
