Carriers Stress Test Their Fiber

The installed fiber plant in Europe's major carrier networks is capable of transmitting data at more than 1 terabit per second (1,000 Gbit/s), more than enough to cope with the expected capacity demands created by the widespread introduction of video services.

That's one of the conclusions of a recent test carried out by France Telecom SA (NYSE: FTE), Deutsche Telekom AG (NYSE: DT), and Alcatel SA (NYSE: ALA; Paris: CGEP:PA), which showed that an existing 430-kilometer single-mode fiber link in the French carrier's network is capable of carrying eight 160-Gbit/s channels (see Alcatel Tests Fiber With FT, DT).

That compares with the 10 Gbit/s per wavelength achieved by today's installed commercial DWDM systems, says Alcatel's CTO Niel Ransom.

"Increasing the bit rate per wavelength channel leads to potentially better equipment integration, higher capacity and lower network costs," said France Telecom's R&D director Pascal Viginier in a prepared statement. The carrier "needs to increase its network capacity at the lowest costs" as it introduces "high-bit-rate services combining voice, data and video," added Viginier (see French Say Oui to DSL TV).

And transmitting fewer wavelengths saves money, as it requires less regeneration, says Alcatel's Ransom. So while the same capacity can be achieved with multiple 10 Gbit/s or 40 Gbit/s wavelengths, that's a more expensive option.

The test results are good news for any operator with an extensive fiber network, as it's going to be expensive enough migrating to next-generation network systems without having to replace existing installed fiber. For example, BT Group plc (NYSE: BT; London: BTA) is set to spend £10 billion ($19.2 billion) on its 21st Century Network, or 21CN, project (see BT Moves Ahead With Mega Project).

It's worth pointing out that BT and Siemens Communications Group have already conducted a similar trial of 160-Gbit/s transmission technology, over BT's existing fiber infrastructure. In the Light Reading article covering the BT/Siemens demo, earlier trials of 160-Gbit/s transmission systems by Mitsubishi Electric Corp. (Tokyo: 6503), Bell Labs, and Germany's Heinrich Hertz Institute (HHI) were also cited (see Siemens Claims 160-Gbit/s Milestone and Mitsubishi Looks to 160-Gbit/s Future). Alcatel learned a lot from the test it conducted with FT and DT, according to Ransom. He says his team had to "put together all sorts of tricks to get this to work over a standard G.652 fiber," and they found that the key issue that needs to be addressed when trying to send so much data over legacy fiber is polarization mode dispersion (PMD).

PMD is one of a bunch of phenomena that cause light pulses to spread out as they travel along fiber, so that they eventually overlap and become indistinguishable from each other. This means errors get introduced when translating the pulses back to electronic signals. A brief tutorial on the topic is given in a Light Reading Beginner's Guide: Chromatic Dispersion and Polarization Mode Dispersion (PMD).
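As a back-of-the-envelope illustration (the PMD coefficient below is an assumed figure for older installed fiber, not a number from the trial), the mean differential group delay caused by PMD grows with the square root of link length, and can be compared against the bit period at each line rate:

```python
import math

def mean_dgd_ps(pmd_coeff_ps_per_sqrt_km: float, length_km: float) -> float:
    """Mean differential group delay (ps): PMD coefficient times sqrt(length)."""
    return pmd_coeff_ps_per_sqrt_km * math.sqrt(length_km)

def bit_period_ps(bitrate_gbps: float) -> float:
    """Bit period in picoseconds for a given line rate in Gbit/s."""
    return 1000.0 / bitrate_gbps

# Assumed coefficient of 0.5 ps/sqrt(km) for legacy fiber, over the 430 km link.
dgd = mean_dgd_ps(0.5, 430.0)   # roughly 10.4 ps
for rate in (10, 40, 160):
    print(f"{rate:>3} Gbit/s: bit period {bit_period_ps(rate):.2f} ps, "
          f"mean DGD is {dgd / bit_period_ps(rate):.0%} of a bit")
```

Under these assumptions the delay spread is about a tenth of a bit at 10 Gbit/s, but more than a whole bit at 160 Gbit/s, which is why adaptive compensation becomes essential at the higher rate.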

"We found that PMD is not constant, that it's affected by temperature and by physical stresses on the fiber, for example if a train runs overground where the fiber is installed. So we built adaptive PMD devices that constantly monitored and adjusted the optical components to cope with the variations," says Ransom.

So when can we expect to see 160-Gbit/s systems available commercially? Not any time soon, says the Alcatel man, as the electrical components for such equipment don't exist yet. For this test, Alcatel interleaved four short-pulse optical TDM 42.7-Gbit/s signals to create a single 170-Gbit/s signal (160 Gbit/s plus an overhead for bit error detection and correction).
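The multiplexing arithmetic can be sketched in software (a toy bit-level model; real optical TDM interleaves short optical pulses in time, not array elements):

```python
def otdm_interleave(tributaries):
    """Time-interleave equal-length bit streams into one serial stream.

    Slot i of the output carries bit i // n of tributary i % n, i.e.
    one bit from each tributary per multiplexing frame.
    """
    n = len(tributaries)
    length = len(tributaries[0])
    assert all(len(t) == length for t in tributaries)
    return [tributaries[i % n][i // n] for i in range(n * length)]

# Four tributaries in, one stream at four times the tributary rate out.
tribs = [[1, 0, 1], [0, 0, 1], [1, 1, 0], [0, 1, 1]]
muxed = otdm_interleave(tribs)
print(muxed)  # [1, 0, 1, 0, 0, 0, 1, 1, 1, 1, 0, 1]
```

Four tributaries at 42.7 Gbit/s give 4 x 42.7 = 170.8 Gbit/s: a 160-Gbit/s payload plus roughly 7 percent of overhead for error detection and correction, matching the figures in the article.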

But while that's fine for experiments, it's not something that's economically feasible in commercial networks, says Ransom, and at the moment there's no demand for 160-Gbit/s systems anyway. "But the test has shown us what's needed to transmit single 160-Gbit/s wavelengths, and the carriers now know they won't need new fiber when they need to deploy 160-Gbit/s systems."

Ransom says that, at present, there is a small number of sophisticated end users, such as universities and R&D networks, that want to generate 40-Gbit/s Ethernet signals, "but in six or seven years' time they'll be looking to do 160 Gbit/s."

— Ray Le Maistre, International News Editor, Light Reading

ssfiberoptics 12/5/2012 | 3:31:03 AM
re: Carriers Stress Test Their Fiber I was the marketing guy at Lucent in the fiber business from '90-'99 and thought I'd give some data points to the discussion. The fix that lowered PMD dramatically was a process change in 1991 -- it was patented (and still is), so Alcatel and other suppliers had to come up with workarounds. Corning and the Japanese (sounds like a rock group) have a different process and started out with much better PMD. After the process improvement, Lucent's was the best.
As to share, NZDF fibers make up about half the backbone networks in the US, with Lucent being 60% and GLW the rest (of the NZDF). Of the standard fibers, since Lucent was sole-sourced to AT&T, who has the biggest network, I'd say 90% would be Lucent. There is one network provider who made the mistake of using dispersion-shifted fiber (DSF), MCI, and they also have the bad-PMD fiber made by Nortel. So they have a double whammy -- high PMD and only able to use coarse DWDM. Hope this helps.
sigint 12/5/2012 | 12:58:09 AM
re: Carriers Stress Test Their Fiber "We found that PMD is not constant, that it's affected by temperature and by physical stresses on the fiber, for example if a train runs overground where the fiber is installed. So we built adaptive PMD devices that constantly monitored and adjusted the optical components to cope with the variations," says Ransom."

I expect that temperature changes would be gradual and it could be feasible to change adaptive parameters to cope with such changes.

But a high speed train that zooms past? How quickly can these parameters be changed so that no data-corruption happens?

If someone from ALA is listening in, please comment!

dwdm2 12/5/2012 | 12:58:08 AM
re: Carriers Stress Test Their Fiber "But a high speed train that zooms past? How quickly can these parameters be changed so that no data-corruption happens?"

Though I'm not from ALA, let me try with my limited understanding. Most of the chip-based dispersion compensation techniques that I know of don't care about the origin of the distortion; they only track the distortion itself (or at least that should be the case). So when a Gbit/s signal is being compensated, the contribution from vibration or other external causes is actually a much more slowly varying factor compared to the speed of compensation. I'm interested to learn more on it as well...
stephenpcooke 12/5/2012 | 12:58:05 AM
re: Carriers Stress Test Their Fiber PMD is one of those 'nasties' that grab system bandwidth as you get smaller and smaller bit periods (ie: higher & higher bitrate). The effect is that different launch polarizations travel at slightly different velocities (similar to chromatic dispersion). Now you say that the launch polarization doesn't usually change, this is generally true. However, it is the polarization relative to the optimal transmission polarization of each segment of fiber, and by 'segment' I mean each individual centimeter of it.

Older SMF fiber was bad for PMD because of poor quality control on the circular diameter of the cladding, it was actually very slightly elliptical as opposed to entirely circular in nature. This is why older fiber can be a real problem for newer, high bitrate systems.

PMD compensation has to be something that runs continuously, and adapts continuously, to the changing conditions of the transmission fiber in real time. By 'changing conditions' I mean rail traffic, temperature, fiber twisting, pressure changes of any sort, etc.

I came up with a possible 'solution' many years ago. What I suggested was a parallel wavelength (ie: a close DWDM-grid wavelength travelling down the same path as the signal to be compensated) with a slightly smaller bit period than the compensated signal. This signal would be monitored at the receive end of the link and the bit period measured (as PMD results in Intersymbol Interference - ISI - what the receiver sees is an effective shortening of the bit period). The bit period length would be passed back to the head end of the link via the SONET/SDH overhead in the link-to-be-compensated. Both the parallel wavelength and the real signal would be transmitted through a polarization rotation device at the head end. The 'launch' polarization (ie: the polarization that hits the actual link fibre on its way out of the CO) of the parallel stream would be continuously, but slowly, rotated and a state machine would read the returned bit period values. The state machine would then control the 'launch' polarization of the real signal in almost real time on a granularity of 125 microseconds (SONET frame interval).
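A toy sketch of that feedback loop (the link response model and the optimal angle are invented purely for illustration; a real link's optimum is unknown and drifting):

```python
import math

NOMINAL_PERIOD_PS = 6.25   # bit period of a 160-Gbit/s signal
BEST_ANGLE_DEG = 70.0      # assumed optimal launch polarization for this link

def probe_bit_period(angle_deg: float) -> float:
    """Hypothetical link response: the probe wavelength's received bit
    period vs. launch polarization. ISI shortens the effective bit period
    the further the launch is from the (unknown) optimal angle."""
    off = math.radians(angle_deg - BEST_ANGLE_DEG)
    return NOMINAL_PERIOD_PS * (0.6 + 0.4 * math.cos(off) ** 2)

def track_launch_polarization(step_deg: float = 1.0) -> float:
    """Slowly rotate the probe's launch polarization through 180 degrees,
    read back the measured bit period (via the SONET/SDH overhead, one
    step per 125-us frame), and steer toward the angle that maximized it."""
    best_angle, best_period = 0.0, 0.0
    angle = 0.0
    while angle < 180.0:
        period = probe_bit_period(angle)
        if period > best_period:
            best_angle, best_period = angle, period
        angle += step_deg
    return best_angle   # launch polarization to apply to the real signal

print(track_launch_polarization())  # settles on the assumed optimum, 70.0
```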

Please understand that I have no idea how PMD compensation is currently done in the real world, as I have been out of it for a while. Also understand that 'PMD' is a broad term for non-circular cross-section fiber, temperature changes, etc. and is the cumulative result of all of these possible effects. As far as I know it is very difficult to isolate the individual causes of the cumulative impairment. It is also possible that there really is no optimum launch polarization if the cumulative impairments obliterate the bit period completely and this is definitely a possibility as bit periods shorten. The key to remember is that PMD is a link phenomenon and varies all over the map on some links and doesn't change at all on others.

I hope this helps.

[email protected]
dwdm2 12/5/2012 | 12:58:04 AM
re: Carriers Stress Test Their Fiber "As far as I know it is very difficult to isolate the individual causes of the cumulative impairment."

My understanding is that PMD compensators don't care about the origin of the distortion; they only track the distortion itself and compensate (or at least that should be the case).

Potential approaches to mitigating PMD fall into three major categories:

- PMD compensators
- Forward error correction (FEC)
- Novel signal modulation formats

Some claim that the best approach to PMD would be a compensator that precisely and specifically corrects for the PMD effects present in a fiber link. Because PMD effects are different and uncorrelated for each independent channel, each channel in a WDM or TDM system would require its own PMD compensator. To visualize what a PMD compensator needs to do, imagine a "race" along a long course between two runners of fairly equivalent capabilities, with the runners representing the orthogonal polarization states of a single wavelength. While the goal in a running race is to determine which competitor can cross the finish line first, the goal of PMD compensation is exactly the opposite: to manipulate the course during the journey so that both "runners" (i.e., polarization states) finish at precisely the same time. In that scenario, controlling a fast-moving runner by tracking a slow-moving one may not be feasible.
dwdm2 12/5/2012 | 12:57:53 AM
re: Carriers Stress Test Their Fiber "The real question is, in the long run,which fiber supports the most economical transmission scheme."

The concern is about the dark fiber that is already in the ground. Fancier fiber types will be more important in areas where new installation is being considered. Meanwhile, one needs to work with the bird in hand.
ssfiberoptics 12/5/2012 | 12:57:53 AM
re: Carriers Stress Test Their Fiber Very interesting posts on the test. It's of interest to me that Alcatel fiber made with MCVD was the worst "old" fiber, made before PMD was determined to be a problem back in the early '90s. FT gave them a very hard time. So it shouldn't surprise anyone that Alcatel now confirms that there won't be a problem with 652 fiber. The real question is, in the long run, which fiber supports the most economical transmission scheme. From the evidence I've seen, the 655/656 specs are the best medium for handling both PMD and chromatic dispersion -- the latter being the more serious above 10 Gbit/s (and not mentioned in the article).
ssfiberoptics 12/5/2012 | 12:57:51 AM
re: Carriers Stress Test Their Fiber In France, though, the issue would be: could a new competitor like TEF come in and be competitive on a cost basis with FT? They (TEF) have plenty of money. It's kind of like how neither T nor MCI can be competitive on a cost basis with Level 3 at 10-Gbit/s DWDM, because they have too much 652 or, worse, DSF fiber (MCI) in their networks. FT is telling ALA that what it sold them won't keep them competitive at 10 Gbit/s and up -- and ALA is saying, no, you're OK. Operationally, you're correct, but that's why trains lost out to trucks, to use your analogy. It's really a strategic issue in the long run, not an operational one.
stephenpcooke 12/5/2012 | 12:57:47 AM
re: Carriers Stress Test Their Fiber Hi,

For those who care I'll try to provide a relatively short but complete tutorial on the subject of Polarization Mode Dispersion (PMD). I will make some basic assumptions as to the knowledge level of the readers on this topic and go from there.

Basic Laser Understanding
Lasers in telecom emit light in a single polarization state. In fact, to be standards compliant any transmitting laser must have an optical isolator on its output prior to anything else so that the impact of reflections can be minimized. The way an isolator works is that the emitted light passes through a polarized filter (ie: only letting a single polarization orientation through. This is set up at laser fabrication so that correct alignment with the emitted light is obtained.). The next, and last component of the isolator is a 1/8 polarization rotation device (usually implemented via a magnetically controlled polarization rotation medium surrounded by a permanent magnet). This results in a polarization rotation of 45 degrees. Any reflections come back, hit this medium and are rotated another 45 degrees for a cumulative 90 degree rotation before hitting the initial polarization filter. As you know there will be no transmission through the polarization filter at a 90 degree orientation. This achieves the standards requirement of being able to withstand reflections of -8.5dB with an optical power path penalty of less than 1dB.
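The 45-degree-plus-45-degree argument can be checked with simple Jones calculus (a sketch assuming an ideal horizontal polarizer and an ideal Faraday rotator):

```python
import numpy as np

def rotator(theta_deg: float) -> np.ndarray:
    """Jones matrix of a polarization rotation by theta degrees."""
    t = np.radians(theta_deg)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

# Ideal horizontal polarizer: passes the x component, blocks y.
POLARIZER = np.array([[1.0, 0.0],
                      [0.0, 0.0]])

# Forward path: laser light (x-polarized) passes the polarizer, then
# gets a 45-degree Faraday rotation on its way into the fiber.
light_out = rotator(45) @ POLARIZER @ np.array([1.0, 0.0])

# A reflection sends the light back through the Faraday rotator, which is
# non-reciprocal: it adds ANOTHER +45 degrees, for 90 degrees total, so
# the returning light is orthogonal to the polarizer and is blocked.
reflected = POLARIZER @ rotator(45) @ light_out

print(np.linalg.norm(light_out))   # full power transmitted forward
print(np.linalg.norm(reflected))   # ~0: the reflection is extinguished
```

The key physical assumption is the non-reciprocity of the Faraday medium: an ordinary (reciprocal) rotator would undo its own rotation on the return trip and provide no isolation at all.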

Polarization Mode Dispersion (PMD)
PMD is a general term that relates to any cumulative impairment caused by uneven propagation of various polarization states on a fiber. As mentioned above, there is only one polarization state that is launched into the fiber from the transmitter. However, due to any uneven application of pressure on the fiber or any manufacturing inconsistency that affects the propagation velocity of one orientation relative to another, PMD is created.

It is theoretically possible for a single polarization state to propagate, unaltered, through any length of fiber. This however, is not observable in any reasonable fiber communications system.

As the launch polarization travels through the fiber, which is under various kinds of stress (eg: trains, trucks, temperature, etc.) the propagation characteristics of the fiber itself may change. The result is that the launch polarization is resolved, relative to any impairment directionality (orthogonal or otherwise). If the impairment affects velocity of one resolved component differently than another, PMD is created.

Please note: resolved components along the course of the fiber may appear and disappear (ie: they may be resolved back into the main stream in the same way that they were split out in the beginning) depending on the relative conditions present on the link at the time of passage.

I personally believe that 'compensation' is a bad term for PMD. I think that the better approach would be 'optimization'. There will almost always be one launch polarization that will work better than another. The trick is to find it for the specific link at the specific time. This is why a dynamic launch polarization seems to make the most sense. In my earlier post on this thread I proposed such a scheme.

I have no idea how PMD 'Compensators' work so someone else will have to add that tidbit. It is also important to understand that there seems to be some dependence on wavelength but I haven't seen anything that provides any ranges of compensation applicability; this in no way implies that they do not exist.

Good Luck,

[email protected]
dwdm2 12/5/2012 | 12:57:47 AM
re: Carriers Stress Test Their Fiber "IN france though, the issue would be, could a new competitor.like TEF come in a be competitve on a cost basis with FT."

Well, cost competitiveness is a universal aspect of any business except probably defense. Sounds like one can substitute the RBOCs for FT in the United States.