Service providers will re-deploy fiber infrastructure only as a last resort. And why should they, when coherent technology has already been proven to enable 40G and 100G speeds over "bad fiber" that was barely capable, if at all, of carrying traditional 10G traffic?
Which commercially available 40G or 100G system has better PMD tolerance than 10G systems?
Even with the current advances in modulation technology, as far as I know, PMD is still the main obstacle to deploying 40G or 100G systems over the installed fiber base.
And beyond 100G, I suppose we will reach the limits of those fibers that were deployed in the '90s.
I understand the cost of re-deploying, but I wonder if we aren't reaching the point where the whole infrastructure will have to be renewed.
I agree with Bo on this one: the ability to work over existing infrastructure has been the #1 requirement for transport so far. Assuming that for the next rate this requirement will suddenly drop far down the list is a very big assumption. From a supplier R&D standpoint, it means betting that operators will deploy new fiber while competing suppliers work hard to make the existing infrastructure work.
On Stevery's comment: I don't see why the "true 100G" distinction is so important outside of academic discussion. Are you saying that 100G using sophisticated modulation is somehow less worthy than a 100 Gbaud rate with on/off keying? If that's your argument, then why is advanced modulation acceptable in wireless networking but not in optics?
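To make that distinction concrete, here is a back-of-the-envelope sketch of why the baud rate and the bit rate diverge with coherent formats. The figures are illustrative assumptions (DP-QPSK at 4 bits per symbol across two polarizations, FEC overhead ignored), not any vendor's spec:

```python
def symbol_rate_gbaud(bit_rate_gbps, bits_per_symbol):
    """Symbol rate needed to carry a given line rate with a given modulation format."""
    return bit_rate_gbps / bits_per_symbol

# On/off keying carries 1 bit per symbol, so a "true 100G" OOK signal
# would have to run at 100 Gbaud.
ook = symbol_rate_gbaud(100, 1)

# Coherent DP-QPSK carries 2 bits/symbol on each of 2 polarizations,
# i.e. 4 bits per symbol, so the same 100G payload needs only 25 Gbaud.
dp_qpsk = symbol_rate_gbaud(100, 4)

print(ook, dp_qpsk)
```

The lower symbol rate is exactly why the coherent signal tolerates far more PMD and chromatic dispersion than a serial OOK signal at the same bit rate, which is the practical answer to the "true 100G" objection.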
Hi Sterling, I do agree that investing in new transmission media, in this case fiber, would be very expensive. I understand that in an ideal world we would be able to use the existing fiber infrastructure to carry the traffic.
I am just wondering whether it will be feasible to develop technology that enables more than 192 wavelengths, each carrying 100G, on the existing fibers. It could happen (maybe by using another band?).
But consider that 19,200 Gbps (19.2 Tbps) systems will take time to be widely deployed, and that it will take a while longer before demand beyond that becomes the bottleneck. Call it a good five years, or more.
By that time, maybe a refresh of the current infrastructure will be needed, at least in the most important parts of the backbone. Or maybe not. I don't know. :)
I know that, theoretically, it is possible to transmit up to 100 Tbps over a single fiber, and 19.2 Tbps is far from that, but how do we get there?
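A rough sketch of where 19.2 Tbps sits relative to the available spectrum. The band width and grid spacings below are my own illustrative assumptions (roughly 4.4 THz of usable C band), not measured figures:

```python
# Illustrative DWDM capacity arithmetic; all constants are assumptions.
C_BAND_GHZ = 4400          # usable C-band width, roughly 4.4 THz
CHANNEL_RATE_GBPS = 100    # per-wavelength rate discussed above

def channels(band_ghz, spacing_ghz):
    """How many DWDM channels fit in a band at a given grid spacing."""
    return band_ghz // spacing_ghz

# Classic 50 GHz grid: about 88 channels, i.e. ~8.8 Tbps at 100G each.
print(channels(C_BAND_GHZ, 50))

# A 25 GHz grid (or opening up the L band) is one route toward the
# 192 channels x 100G = 19.2 Tbps figure mentioned above.
print(channels(C_BAND_GHZ, 25))

# Spectral efficiency needed for 100 Tbps in the C band alone,
# in bits/s/Hz -- far beyond current formats, which is why proposals
# reach for extra bands or space-division multiplexing.
print(100_000 / C_BAND_GHZ)
```

Under these assumptions the C band alone would need roughly 23 bits/s/Hz to hit 100 Tbps, which suggests the gap gets closed by adding spectrum or fibers, not by modulation alone.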
Now that the major work on 100Gbit/s technology is over, the time has come to ask: What's next?
Real 100G.
It's more workable now: there's been time to replace the experts who left the field after serial 40G yielded little profit.