The Big Bandwidth Question

How much bandwidth will we need in future? And how much will we get?

Those are questions that access network strategists and analysts have been asking themselves for years, and the answer to date has been pretty consistent: a lot more than we have today.

This truth was captured as long ago as 1998 in Nielsen's Law of Internet Bandwidth, which states that "a high-end user's connection speed grows by 50% per annum." Tracked graphically (from 1983 to 2014), this shows high-end connection speeds rising from 300 bit/s to 100 Mbit/s. If we agree that this really constitutes a law, we can predict that a typical high-end connection speed will reach 1 Gbit/s in 2019-2020, and 10 Gbit/s in 2025-2026.
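If we take the law at face value, the arithmetic is easy to check. Here's a quick sketch in Python -- my own back-of-envelope figures, compounding from the 2014 data point, rather than Nielsen's own projections:

```python
# A back-of-envelope check of Nielsen's Law (my arithmetic, not Nielsen's):
# compound 50% annual growth from the roughly 100 Mbit/s high-end figure of 2014.

def projected_speed_mbps(year, base_mbps=100.0, base_year=2014, growth=1.5):
    """High-end connection speed in Mbit/s implied by 50%-per-year growth."""
    return base_mbps * growth ** (year - base_year)

for year in range(2015, 2027):
    print(year, f"{projected_speed_mbps(year):,.0f} Mbit/s")
# Crosses 1,000 Mbit/s (1 Gbit/s) between 2019 and 2020, and
# 10,000 Mbit/s (10 Gbit/s) between 2025 and 2026.
```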

This, of course, raises the questions of what “high-end” really means, how representative such users are, whether the connection actually runs at the speed the network provider claims, and whether higher connection speeds are driven by users or by operators. Nielsen measures high-end connection speed rather than need, which complicates the question of what the average user actually wants.

To help answer at least some of those questions, we can turn to the ever-fascinating Speedtest.net, which provides a good guide to the trend in actual connection speeds: its Net Index shows that the average global downstream connection speed as of September 2014 was 29.86 Mbit/s. In September 2009, the Speedtest global average was 6.4 Mbit/s, implying a five-year CAGR of about 36% -- a fair bit lower than Nielsen's 50% for high-end users. So if we extrapolate the global Net Index growth, we don't reach 1 Gbit/s until around 2026, not Nielsen's 2020.
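The arithmetic behind that extrapolation, for anyone who wants to check it, runs as follows (again a rough sketch, using only the two Net Index data points quoted above):

```python
# Rough extrapolation from the two Speedtest Net Index data points quoted
# above (6.4 Mbit/s in September 2009, 29.86 Mbit/s in September 2014).
# Illustrative only; real-world growth is unlikely to stay this smooth.
import math

v2009, v2014 = 6.4, 29.86           # Mbit/s
cagr = (v2014 / v2009) ** (1 / 5)   # five-year compound annual growth rate
print(f"CAGR: {cagr - 1:.1%}")      # -> 36.1%, well below Nielsen's 50%

years_to_gigabit = math.log(1000 / v2014) / math.log(cagr)
print(f"1 Gbit/s: {years_to_gigabit:.1f} years after Sept 2014")  # -> ~11.4, i.e. early 2026
```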

It’s impossible to know how accurately Speedtest's results reflect reality; most likely they too overstate the average, since people who run the test are probably more interested in line speed than the average user is, and hence tend to have faster connections. When UK regulator Ofcom measured average broadband line speed in the UK in November 2013, it found 17.8 Mbit/s, against Speedtest's contemporaneous 23.7 Mbit/s -- putting Speedtest about a year ahead of the average. It's possible, too, that the growth rate is distorted by changes in the self-selecting user base: perhaps Speedtest users were more “high end” in 2009 than in 2013, dragging down the measured growth rate.
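That "year ahead" figure is another quick check worth showing (my arithmetic, assuming the ~36% annual growth rate derived above):

```python
# How far ahead of Ofcom's measured UK average was Speedtest in Nov 2013,
# assuming the ~36% annual growth derived above? Illustrative arithmetic only.
import math

ofcom_mbps, speedtest_mbps = 17.8, 23.7
annual_growth = 1.36
lead_years = math.log(speedtest_mbps / ofcom_mbps) / math.log(annual_growth)
print(f"Speedtest lead: {lead_years:.2f} years")  # -> 0.93, about a year
```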

Still, it's an intriguing finding, and it suggests that Nielsen's Law doesn't necessarily help us plan real-world networks; nor can we be sure that even the 36% annual rise of the past five years recorded by Speedtest's users will continue. In conversation with the CTO of a major access equipment vendor, I was interested to hear him argue that there is no iron law that says access bandwidth will continue to rise inexorably as far out as we can see. At some point, for most people, enough really would be enough, and that day, he implied, is not far off.

Some support for that still-contrarian view is already evident in the PC market, where people no longer seem to be changing out their PCs to get more memory or a bigger hard disk: computers, it appears, are already powerful enough for almost anything we can throw at them, and the market focus has shifted to smaller-format devices. In the bandwidth guessing-game, meanwhile, 3DTV (now quietly forgotten), 4K TV, 8K TV, holographic video, telemedicine, VR games, surveillance and who knows what else are all cited in defence of the view that we'll always need more. But as 3DTV showed (and consumer telepresence before it), there are no sure-fire winners here.

So where does all this leave those trying to plot out a future for wireline networks? With the troubling thought, perhaps, that it's not possible to predict with any confidence how much bandwidth people will "need" -- or actually have -- in the longer term. Not the least of our problems in prediction is the chicken and egg nature of the argument: are suppliers pulling users to higher speeds, or are users forcing suppliers to upgrade? And is that changing?

For some telcos, these uncertainties simply mean that accurate prediction is strictly a short-term game. The director of technology at one big telco told me that his company models traffic in detail over the next 12 months; beyond that, he said, it is no more than guesswork. Others, meanwhile, invest heavily in FTTH and offer gigabit connections in the confident belief that consumers will never feel they have enough.

Looking ahead to that gigabit world, we ought at the least to take note of the PC market's stagnation, and recognize that there is some upper bound to the amount of information we can handle -- defined, if nothing else, by our own limited sensory apparatus.

But we should also be wary of joining Thomas Watson and Ken Olsen in the worst-prediction league. If we have learned anything in the Internet era, it is that new apps will continue to emerge, and that some of them -- we can very confidently predict -- will take us entirely by surprise.

— Graham Finnie, Chief Analyst, Heavy Reading

DHagar 11/18/2014 | 10:09:39 PM
Re: A chicken and egg question Susan, good assumptions.  More is a good target, and as pointed out in this well-written blog, we probably don't know what we don't know yet.

I fully agree that the demand now is on the user side; not for capacity's sake, but to support the ever-evolving array of devices, connections, etc., that we are developing.

It will reach capacity at some point, but who knows what that point is?
Susan Fourtané 11/7/2014 | 8:13:32 AM
A chicken and egg question All this predicting about the future of networks will require serious crystal balls to tell how much bandwidth people will need. The simple answer could be: More than what we have now.

As for the chicken and egg question of who is pushing the higher speeds, suppliers or users, I would say that suppliers were first the pushers, and now the pushers are the users, who always want more of whatever they get.

-Susan
Duh! 10/10/2014 | 2:50:23 PM
It's the S-curve, stupid. All growth is logistic. The reductio ad absurdum of exponential growth in bandwidth demand is an end state where every nanojoule of energy in the universe carries a symbol of information. At some point, there has to be an asymptote.

The question is not if, but rather when.  I think that time is approaching, and know I'm not alone.  

Media is approaching the limits of human perception. How subjectively better is 4K than HD? As much as HD was better than SD? As much as SD was better than analog? How much subjectively better than 4K will 8K be? Even if average viewers can perceive an improvement, is it enough for them to pay for?

Also, there is a tug-of-war between computing and communications. The video coding folks haven't retired yet. HD video was initially encoded in MPEG2 at 20 Mbit/s. Netflix now reportedly transports HD video (HEVC?) at 5-7 Mbit/s, and anticipates that 4K will take between 10 and 15 Mbit/s. I'm not close enough to the video folks to know whether they are pushing fundamental limits. But until that point, we have to assume that the evolution of lower-rate algorithms for higher resolutions will reduce bandwidth demand.

Perhaps there will be another mass-market application which will consume more bandwidth than video.  We don't have a line-of-sight to one.  And how much are we willing to bet that it will emerge?

 