Physicists Find Fiber's Limit

Scientists from Bell Labs have calculated the theoretical limits on the carrying capacity of glass optical fiber and concluded that there's still a long way to go before optical systems reach those limits (see Bell Labs Calculates Limits). The results were published yesterday in the journal Nature.

Partha Mitra, lead author of the paper, says his work will point the way for future research by showing which approaches are likely to come up against fundamental physical limits and which aren't. It could also aid engineers whose task it is to model the enormously complicated properties of DWDM (dense wavelength-division multiplexing) systems.

Unlike system vendors, physicists measure the information-carrying capacity of a fiber in bit/s per Hertz of spectral bandwidth (bit/s/Hz). To find the capacity in Gbit/s or Tbit/s, this number has to be multiplied by the available bandwidth of a system (that's bandwidth in the physics sense, in Hz, rather than its common telecom usage, which essentially means capacity).

Mitra and his co-author Jason Stark calculated that the theoretical limit imposed by the physical properties of optical fiber on a communication system is about 3 bit/s/Hz. This corresponds to a maximum payload of 150 Tbit/s on a single fiber, assuming that the fiber can carry signals across the wavelength range 1260 to 1620 nanometers.
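
As a rough sanity check of those numbers (a back-of-the-envelope sketch, not taken from the paper), a few lines of Python reproduce the arithmetic: the 1260 to 1620 nm window spans roughly 53 THz of optical bandwidth, and 3 bit/s/Hz across that bandwidth gives a figure on the order of the quoted 150 Tbit/s.

# Back-of-the-envelope check of the 150 Tbit/s figure (illustrative only).
C_LIGHT = 3.0e8           # speed of light in vacuum, m/s
lambda_short = 1260e-9    # short-wavelength edge of the window, m
lambda_long = 1620e-9     # long-wavelength edge of the window, m

# Optical bandwidth in Hz is the difference between the two edge frequencies.
bandwidth_hz = C_LIGHT / lambda_short - C_LIGHT / lambda_long   # roughly 53 THz

spectral_efficiency = 3.0                  # bit/s/Hz, the calculated limit
capacity_bps = spectral_efficiency * bandwidth_hz

print(f"Bandwidth: {bandwidth_hz / 1e12:.0f} THz")     # ~53 THz
print(f"Capacity:  {capacity_bps / 1e12:.0f} Tbit/s")  # ~159 Tbit/s, in line with the quoted ~150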

Mitra points out that all existing optical systems are limited to 1 bit/s/Hz. That's because they encode data using a simple on-off keying technique, which represents bits by the presence or absence of light. "We've shown that the theoretical limits are substantially greater than this," he says. "What this means is that by changing the modulation scheme, it's possible to get more data into a fiber than was thought possible."
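
To illustrate the point (a textbook simplification, not a figure from the paper): under ideal Nyquist signalling, an M-level modulation format carries log2(M) bits per symbol, so its best-case spectral efficiency in bit/s/Hz scales with the number of distinguishable signal levels.

import math

# Ideal spectral efficiency (bit/s/Hz) for M-level modulation at the Nyquist
# symbol rate, ignoring coding overhead, filtering penalties and noise.
def ideal_spectral_efficiency(levels: int) -> float:
    return math.log2(levels)

for name, levels in [("on-off keying", 2), ("QPSK", 4), ("16-QAM", 16)]:
    print(f"{name}: {ideal_spectral_efficiency(levels):.0f} bit/s/Hz")
# on-off keying: 1 bit/s/Hz, QPSK: 2 bit/s/Hz, 16-QAM: 4 bit/s/Hz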

The downside? While the work at Bell Labs suggests that fiber has plenty of room to grow, new technologies -- more complicated modulation schemes and coherent detectors, which measure both power and phase of the incoming signal -- will be needed to make the most of it.

Few would argue with Bell Labs' rather basic conclusion -- that fiber has more capacity than is currently being used. It's a no-brainer. What's new is that the researchers have been able to quantify how much surplus capacity there is, something that can't be deduced from existing communications theory.

The classical formula for calculating capacity, known as Shannon theory, predicts that capacity will increase indefinitely as the power of the optical signal goes up. That's because the signal keeps getting stronger relative to the noise, which is fixed.

In real life, however, strange "non-linear" phenomena come into play and start creating more noise at high optical power. Mitra calls it the cocktail-party effect. "If everyone's talking at once, then you have to raise your voice in order to be heard, and if everyone raises their voice, then you can't hear anything." Much the same thing can occur among the many channels that DWDM systems pack into the same fiber.

The origin of non-linear effects is the fact that, rather unexpectedly, the speed of light inside a silica fiber does depend on its intensity, or instantaneous power. (Remember, the speed of light is only constant in a vacuum.) This is most likely to be observed in DWDM systems, where lots of channels of data are packed into the same fiber, creating very high total optical powers.
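
The shape of the argument can be seen in a toy calculation (an illustration only, not the Mitra-Stark model; the assumption that nonlinear noise grows as the cube of the launch power is a common rule of thumb, not a figure from the paper). In the classical Shannon formula the spectral efficiency climbs forever as signal power rises; once the noise itself grows with power, the capacity peaks and then falls.

import math

def linear_capacity(power, noise=1.0):
    # Classical Shannon spectral efficiency for a fixed-noise (linear) channel.
    return math.log2(1.0 + power / noise)

def toy_nonlinear_capacity(power, noise=1.0, nl_coeff=1e-3):
    # Toy model: nonlinear interference adds noise that grows as the cube of
    # the signal power, so the effective SNR (and the capacity) peaks and falls.
    effective_noise = noise + nl_coeff * power ** 3
    return math.log2(1.0 + power / effective_noise)

for p in (1, 5, 10, 50, 100, 1000):
    print(f"power {p:>5}: linear {linear_capacity(p):5.2f} bit/s/Hz, "
          f"with nonlinear noise {toy_nonlinear_capacity(p):5.2f} bit/s/Hz")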

"People knew that non-linearities were doing something, but they couldn't quantify it precisely," says Mitra.

Mitra and Stark were able to include non-linearities in the calculations for the first time. Why hadn't this been done before? Simply because it required some creative mathematical thinking to reduce the equations to ones that could be solved analytically.

— Pauline Rigby, Senior Editor, Light Reading
http://www.lightreading.com
ownstock 12/4/2012 | 8:09:26 PM
re: Physicists Find Fiber's Limit To reiterate: if you set up the wrong constraints, you will get a wrong answer...because of that, the claim of deriving a fundamental limit is just plain WRONG!

The real issue is that there are efficient brute-force (aka easy and well known) ways to minimize the nonlinearities in fiber that they did not even consider...and to achieve well over 3 bps/Hz...so their paper is not worth the pulp it was printed on!

Sorry to say this is a typical ivory tower, tunnel vision paper...hyped to the max...designed to derive a clever (but very wrong) answer...

Enough time wasted on this....

-Own
ownstock 12/4/2012 | 8:09:25 PM
re: Physicists Find Fiber's Limit Sanddune:

First, there is a BIG difference between writing for a peer reviewed journal, and a trade publication...they are two very different things.

Something like an IEEE or OSA technical journal is the first, and LR or Time is the latter (for example).

JMHO, but Nature is NOT the first place anyone of caliber in communications would think to publish. OTOH it would be an ideal place to publish a supposedly "fundamental limit" in fiber optics.

Precisely because they knew by going there it would not go through a tough and thorough peer review, and/or the editors would tolerate their combination of hype-title and "let's pretend it is this way" problem constraints. Sophistry.

My effort is not intended to confuse, but rather to expose the effort on the part of others to confuse / spin / hype / etc.

I have given (more than) enough information for those with technical expertise in the field of communication theory and/or fiber optics to understand...

-Own
sanddune 12/4/2012 | 8:09:25 PM
re: Physicists Find Fiber's Limit ownstock,

From my limited knowledge of comm systems, what you say is true for a "memoryless, Gaussian channel" such as a copper local loop. The efficiency of modulation is fundamentally correlated with the channel's noise characteristics, which in the case of fiber are very different from those of copper.
ppm 12/4/2012 | 8:09:24 PM
re: Physicists Find Fiber's Limit There are plenty of IEEE papers with 30 pages of unnecessary theorems and lemmas followed by a non-result. While Nature is not a traditional communication theory journal, it is as valid a forum as any to report a new way of computing information capacities in the presence of nonlinearities. Information theory was to some extent born whole; in any case, there have been few fundamentally new results after Shannon's original publications.

I have failed to see a single valid technical point that you have raised. I sympathise with your suspicion of the ivory tower, but I am afraid that the IEEE and other journals are just as full of unnecessarily narrow publications, whether from academics or not. Apart from Turbo and LDPC codes (the latter being already a rediscovery), there has been little over the last couple of decades that constitutes a basic advance in understanding.

Let me point out something simple. You brought up 10 bit/s/Hz as a number. At 20 dB SNR, which you also mention, the Shannon formula for a linear channel only gives you 7 bits or so (log2(1 + 100) is about 6.7). There are inconsistencies in your thinking even for a linear channel.

The limits in question are quite real. As for specifying parameter values, that is clearly inescapable; this is not the speed of light in vacuum or the gravitational constant that is being determined. However, an estimate for realistic parameter values is valuable enough. The same issues arise in predicting limits for semiconductor memory chips, for example. The world is imperfect: one has to start somewhere.

Best,
ppm.
ubwdm 12/4/2012 | 8:09:23 PM
re: Physicists Find Fiber's Limit Well, ownstock is completely wrong. His reaction is typical, though, of many DWDM beginners I have met at startups in recent years.

To achieve 3 bps/Hz for a single 50 GHz channel is not difficult. Even 10 bps/Hz is possible in the lab over a few spools of fiber. But try it for 100+ channels over a few thousand km of transmission, and you will start to appreciate the laws of physics.

This reminds me that when I did my dissertation many years ago, I analysed the capacity of holographic memory with similar mathematical methods. The established theory at the time predicted that a 1 cm x 1 cm x 1 cm LiNbO3 crystal could hold the Library of Congress, but reality fell short by a factor of millions. The cause: nonlinearity.
ppm 12/4/2012 | 8:09:20 PM
re: Physicists Find Fiber's Limit That's interesting - I've thought a little bit about that problem ... I presume you are referring to the nonlinearity having to do with the refractive index change saturating after writing a certain number of holograms. I have a vague recollection that the maximum delta(n)'s are of the order of a fraction of a percent.

Best
ppm.
Petabit 12/4/2012 | 8:09:17 PM
re: Physicists Find Fiber's Limit Ownstock,

Talking of peer-reviewed papers, you might want to look one up. About a year ago, Desurvire published a nice paper in an IEEE journal. It talked about the ultimate limit of an optically amplified system, comparing and contrasting lumped and distributed amplification.

So just looking at the noise from amplifiers in long-haul systems, the limits he calculated were 3 bit/s/Hz for distributed-amplifier systems and 6 bit/s/Hz for lumped systems, which comes very close to the Bell Labs results without including non-linearities.

All of which is a very long way from the 0.1 bit/s/Hz we are using today.

P.
vaporware 12/4/2012 | 8:09:11 PM
re: Physicists Find Fiber's Limit 20 years ago John Pierce published some papers on the photon-counting channel and derived the quantum limit. I think, if my memory serves me, it's about 0.8 photons per bit. The 3 bit/s/Hz figure seems odd, unless it is based on a certain modulation scheme. We'll know the real answer if someone builds a good optical phase-locked loop...
eewhiz 12/4/2012 | 8:09:09 PM
re: Physicists Find Fiber's Limit After reading all the posts (but not the Nature article itself), it seems that today's advanced DWDM-based systems are operating at best at 0.4 b/s/Hz (calculated from a 10 Gb/s OOK-modulated signal operating on a 25 GHz channel spacing, using optimal matched filtering for baseband filtering). This is still a far cry from the 3 b/s/Hz mentioned in the Nature article. But going to higher throughputs will require something extraordinary...

Let's say we start with 16-ary multi-level PAM instead of OOK (this would be difficult with current DWDM lasers). That gives 4 bits/symbol, which would increase the throughput to 1.6 b/s/Hz, but this type of signalling is poor because all the quantization levels are in the 'real' amplitude plane; you could not transmit this signal very far before AWGN and any optical ASE started interfering with the decision thresholds. We need to use the phase plane to spread out the quantizing radii and get a better Eb/No at the receiver; this is how the V.22 on up to V.90 modems were able to push ever-higher bit rates through 4 kHz of copper bandwidth. The same holds true for HFC cable systems sending 256-QAM in each 6 MHz channel slot.

The problem is that no one (that I know of) manufactures an equivalent light-based IQ modulator (one that would operate at 193.1 THz, for example).

Even if you could build something like this, the non-linear effects (PMD, chromatic and time dispersion) would start to interfere with demodulation, so you would then need to adaptively equalize these non-linearities with technology that so far does not exist...not to mention we'll probably need FEC over all of this!! This is very hard to do at 100 Mb/s, let alone at 10 Gb/s...only time and technology will tell.

At this point I'll settle for 0.4b/s/Hz...
Cheers
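
For anyone who wants to check the arithmetic in the post above: the spectral efficiency of a DWDM channel is simply the line rate divided by the channel spacing. In the sketch below, the 40 Gbit/s figure for 16-level signalling is an assumed illustration (4 bits per symbol at the same 10 Gbaud symbol rate), not a number from the post or the paper.

def spectral_efficiency(bit_rate_bps, channel_spacing_hz):
    # Spectral efficiency in bit/s/Hz: line rate divided by channel spacing.
    return bit_rate_bps / channel_spacing_hz

print(spectral_efficiency(10e9, 25e9))  # 10 Gb/s OOK in a 25 GHz slot -> 0.4 bit/s/Hz
print(spectral_efficiency(40e9, 25e9))  # 4 bits/symbol at 10 Gbaud    -> 1.6 bit/s/Hz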
realguy 12/4/2012 | 8:08:57 PM
re: Physicists Find Fiber's Limit After carefully reading all the postings, I have to say ownstock has a valid point. Some years ago a very respected physicist got a Nobel Prize for the theory of superconductivity. The theory predicted that it is impossible to have superconductivity at room temperature. Then, a few years ago, it was discovered otherwise. The same Nobel laureate now claims it is indeed very natural to have superconductivity at room temperature; in fact, he insists his original theory predicted it if one had just dug a little deeper. The optical gyroscope was initially believed to be impossible because it violated the theory of general relativity. After it worked, they claimed it actually works in support of general relativity.

The assumption of the published paper was that nonlinearity is a fact of life in optical fiber. Wrong! What if, through some clever scheme (digital or analog), nonlinearities could be suppressed? Then again, if that happens, the physicists will say they knew it all along.