I’m beginning to wonder whether the integrated optical circuit is ever going to happen, and if it does, whether it will matter. Of the many component companies I’ve seen this year that were exciting enough to recommend, none were of the integrated optics variety. Most, however, were pushing what I would call adaptive components, meaning they had as their premise a way to interact in real time within an optical network and perform a critical function – amplification, dispersion mitigation – “intelligently” and dynamically.

A few years ago a number of optical component companies attacked head-on the obvious challenge that optical components have always faced: integration. In electronics, the integrated circuit revolutionized the industry, if not the world, by allowing chip designers to constantly add more functionality and processing power to a silicon chip. The optical component industry, however, could not evolve at the same pace and had long been fighting a battle against itself, that is, against its basic material: glass.

Glass, or silica, is far from silicon. It doesn’t conduct electricity, it doesn’t couple easily to optical fibers or other silica structures, and it does not accommodate miniaturization on the scale of silicon-based ICs. Vendors such as Bookham Technology PLC (Nasdaq: BKHM; London: BHM), Kymata Ltd., and Lightwave Microsystems Corp. have all been working to create the optical IC, but their path to success has been challenged on many fronts. The result is that most of these companies are now selling arrayed waveguide gratings (AWGs) that serve a single function, putting true integration off while they work out the kinks.

David Polifko, formerly of Ciena Corp. (Nasdaq: CIEN) and now Principal at Jafco Ventures, sees it this way: “Integrated optics have failed to deliver, since it is substantially easier to optimize the individual components and then integrate them into a module, subsystem, or system. Most of these optimized components are made from different materials and processes, rendering them unable to be integrated on a single substrate, let alone be spliced or glued together.”

The point here is rather straightforward but important. There simply isn’t a material equivalent to silicon that can be used to create optical circuits. Combinations and hybrids are often proposed, such as silica-on-silicon or polymer-on-silica, but these come with tradeoffs, which typically offset the value of integration. You may get a smaller footprint, but at the cost of higher loss or decreased performance.

“That's what makes the optics industry so interesting,” says Polifko. “It is not just a simple semiconductor process, but in addition you have bulk optics, waveguides, fiber based devices, crystals, etc. – lots of room for innovation and exciting technologies.” As he sees the problem, each of these materials is progressing at a different rate, and what may look one day like the perfect foundation for an optical IC may be inappropriate the next when different functions are required.

Many component vendors claimed they would be able to manufacture optical ICs as easily as semiconductors, yielding huge quantities of optical ICs and driving down costs; but that has also been elusive, though certainly not impossible. Even so, the new components coming out of AWG companies are most often wavelength mux/demux chips, in some cases including integrated variable optical attenuation.

The reality in the optical components market, therefore, is quite different than the one envisioned a few years ago. Instead of the components industry making an evolutionary leap into integrated circuits (often dubbed “planar lightwave circuits”), many companies have instead been focusing on moving up the value chain.

The typical component supplier, such as Corning Inc. (NYSE: GLW) or JDS Uniphase Inc. (Nasdaq: JDSU; Toronto: JDU), has been moving from individual discrete components to modules and subsystems. Optical amps that were once made from scratch are now packaged as a module with an integrated controller and automatic gain function. This has led to cheaper EDFAs (erbium-doped fiber amplifiers), but most of these are simple "set and let be" devices that only respond to command changes or, in some cases, to power-level changes. Optical switching subsystems or wavelength mux/demux subsystems can also be included in this category, though none can be considered particularly "intelligent" or adaptive.

So, just as the optical systems market is moving from static to dynamic, dumb to intelligent, so will the components industry. These are the components to watch and a trend worth following. This column isn’t meant to entirely discount the value of optical ICs, but to point out that these components are in many ways technologies in search of an application; as those applications change, particular ICs may find themselves stuck down blind alleys or dead ends. Intelligent or adaptive components, however, meet the near-term demand for intelligent subsystems that handle amplification, switching, and dispersion compensation, while laying the groundwork for a completely adaptive optical network layer.

Until now, such smart subsystems haven't been necessary, since the distances and topologies of optical networks have been fixed, and the signals are 3R'd (reamplified, reshaped, retimed) at each junction. Only with the addition of optical crossconnect systems and optical add/drop multiplexers (where the total optical channel path length can change if routing paths between two points are changed) does the need to "adapt" arise.

With higher data rates such as 10 and 40 Gbit/s, the signal impairments due to fiber nonlinearities and irregularities become even more significant. Mitigation of these impairments is now required not only on a per-wavelength basis but also in real time, as impairments such as PMD (polarization mode dispersion) or insertion loss can change as the fiber heats or is bent slightly.
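To get a feel for the scale of the problem, here is a back-of-the-envelope sketch in Python. It uses the commonly cited rule of thumb that tolerable mean differential group delay (DGD) is about 10 percent of the bit period, plus an assumed PMD coefficient of 0.5 ps/√km typical of older installed fiber; both figures are illustrative assumptions, not from the column.

```python
def pmd_limited_reach_km(bit_rate_gbps, pmd_coeff_ps_per_sqrt_km,
                         tolerance_fraction=0.1):
    """Rule-of-thumb reach limit set by first-order PMD.

    Mean DGD grows as sqrt(length), so the reach limit is
    (tolerable DGD / PMD coefficient) squared.
    """
    bit_period_ps = 1000.0 / bit_rate_gbps          # e.g. 100 ps at 10 Gb/s
    max_dgd_ps = tolerance_fraction * bit_period_ps  # ~10% of the bit period
    return (max_dgd_ps / pmd_coeff_ps_per_sqrt_km) ** 2

# Older fiber with an assumed PMD coefficient of 0.5 ps/sqrt(km):
print(round(pmd_limited_reach_km(10, 0.5)))  # 400 km at 10 Gb/s
print(round(pmd_limited_reach_km(40, 0.5)))  # 25 km at 40 Gb/s
```

Quadrupling the bit rate cuts the PMD-limited reach by a factor of 16, which is why static, factory-set compensation stops being adequate and adaptive PMD mitigation becomes attractive at 40 Gbit/s.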

Already, at least one systems designer, Sycamore Networks Inc. (Nasdaq: SCMR), is building in technology that allows in-field adaptive changes to the environment on a sub-system level, as evidenced by their recent announcement of enhancements to their SN10000 transport product line. Most other systems vendors are following suit, adding tunable components, often as a fundamental part of their system architectures. Atoga Systems, for one, is using tunable lasers not as a simple sparing solution but as a fundamental part of how they allocate bandwidth in metro DWDM-based networks. In a way, Atoga’s solution is based on the concept of “adaptive networking,” in which changing traffic demands or service changes require rapid network reconfiguration at the wavelength layer.

Cinta Corp. also uses tunable lasers as part of an optical transport switching architecture. The tunable laser acts as one dimension of switching, and a low-cost 1xK optical switch provides the second dimension. Other systems vendors, such as Movaz Networks Inc. and Polaris Networks, are leveraging developments in optical layer signaling to make their systems more “adaptive.” Here, an out-of-band optical control plane dynamically controls the interaction between edge gear and a metro core switching system to manage connectivity and bandwidth within a metro.
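The "two dimensions" of switching described above multiply rather than add: a transmitter tunable across W wavelengths feeding a 1xK switch can address W times K distinct destinations. A minimal sketch (the channel and port counts below are hypothetical illustrations, not Cinta's actual figures):

```python
def addressable_destinations(num_wavelengths: int, switch_ports: int) -> int:
    """Each (wavelength, output port) pair is a distinct destination."""
    return num_wavelengths * switch_ports

# A laser tunable across 40 DWDM channels behind a 1x8 switch:
print(addressable_destinations(40, 8))  # 320 destinations from one transmitter
```

This multiplicative scaling is what makes a modest tunable laser plus a low-cost small switch competitive with a much larger single-dimension crossconnect.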

So down at the component level, which are the companies enabling this transformation? Onetta has introduced the concept of the intelligent optical amplifier; Yafo Networks, the concept of the dynamic PMD mitigation subsystem. Others still in stealth mode are talking about subsystems that provide intelligent dispersion compensation, amplification and gain flattening, optical layer monitoring, protection switching, and optical add/drop multiplexing. There is much more to come, and it appears these are the components systems vendors are clamoring for.

Having adaptive optical subsystems can ensure that as the network path or fiber changes, the transmitters and receivers can adaptively compensate for the distortions incurred. As we evolve slowly but surely to a more dynamic optical layer, the key enablers will be at the component layer. The next wave of optical innovation is happening here. So forget about integrating: Adapt!

— Scott Clavenna, Director of Research, Light Reading
COMMENTS
wimchatta 12/4/2012 | 7:59:41 PM
re: Adapt! Scott,

Didn't you say you were off on vacation, since nothing interesting was going to happen this summer, just the beginning of a phase of optimization?

Here's the link...

So what in your view has changed that calls for a new goal for the industry?

-wim :)
[email protected] 12/4/2012 | 7:59:40 PM
re: Adapt! Just curious, but this seems like your new fad, just like the last six or seven you have covered. Since this isn't something you seem to have much background in, technologically or coverage-wise, where are you getting the information to write stuff like this?
smoking_craters 12/4/2012 | 7:59:37 PM
re: Adapt! Great article!

I would take issue with your belief, however, that vendors like Bookham and Lightwave Microsystems are not yet making optical integrated circuits.

While they are primitive, the combination of an AWG with VOA's on a single substrate has to be called an SSI optical IC. If you combine that with switches on the same substrate you'd have to call that an MSI optical IC. After all, it took decades to go from the transistor to the SSI IC to the MSI IC to the LSI IC and eventually to VLSI and now ULSI IC's.

You're right about the size, though. In the past, silicon IC evolution focused on smaller and smaller features and higher and higher gate counts. Optical waveguide size constrains the "smaller and smaller" part of optical IC evolution. Optical components chew up a lot of space on a wafer.

I agree with you that we won't see anything like Silicon evolution in optical components. Still, when you pack 40 channels of AWG, VOA's and switches onto a single substrate, you've got a small, inexpensive OADM chip that can easily handle a half a terabit of data in one integrated part. Of course, the lasers and receivers are located off-chip, but still, a half a terabit ain't small potatoes, Scott!
Scott Clavenna 12/4/2012 | 7:59:35 PM
re: Adapt! Thanks for the question, but the column really is supposed to take on a flavor-of-the-month character, in a way, since it's an opportunity to share what I'm seeing in the industry as I meet new companies and talk with carriers, VCs, end users - anyone who contacts me. I meet one or two startups a day, even in this market, so I find myself in an interesting position to see trends as they emerge, and then talk with credible sources about what those trends might mean, or learn if they are in fact just "fads."

This one, adaptive optics and subsystems, has been coming on for a while and I've met with lots of very talented engineers and entrepreneurs in this market that are validating this idea. Talking with VCs, like Dave Polifko, who has been studying this trend closely, helped me put together the idea for the column. Not much more to it than that, really. When you talk to as many people as I do, trends just yield themselves up out of the noise, and writing a column becomes a task of clarifying and embellishing the ideas that have come in without any real form. Essentially, that's the process of writing, in a nutshell, and it's what I like best about my work. I'd be more apt to call the result my interests and observations, rather than "my new fad."


Scott Clavenna
LightBeating 12/4/2012 | 7:59:33 PM
re: Adapt! I have been in the optics field for more than 20 years now, and I guess that gives me a different perspective than that of the engineer who learned of the existence of optical components only 2-3 years ago when it became fashionable.

Those who have been there for such a long time will remember that optical IC's were already a hot topic in the early 1980's! Back then we were working on Lithium Niobate, or ion-exchange waveguides, and then all sorts of semiconductors. We all believed in the all-optical computer, and a simple function like optical bistability was to have a bright future.

Of course none of that happened. The technology worked in the lab, but was always plagued by different drawbacks: thermal effects, high loss, etc. etc.

What I've learned in all those years is that a technology that really works will make its way rapidly into the commercial world. For example, Lithium Niobate modulators were a simple, elegant device, a proof that integrated optics could work, and in a matter of a few years they were available commercially. Nobody even mentions them any more, even though they are still an integral part of many high-bit-rate systems.

Among other technologies that really worked well is the optical fiber itself, a marvel of modern materials science. A couple of breakthroughs in the fabrication process were enough to bring the loss below the dB/km range. The process was easily scalable. I never cease to wonder at the purity of that glass, which is probably the most transparent material in the universe. Yet it went from the lab to field-installed systems in less than ten years.

Semiconductor lasers, which benefited from all the research on transistors, also rapidly became reliable, powerful devices. They in turn allowed another powerful technology, the Erbium-doped fiber amplifier, to become a commercial reality in less than five years!

Fiber Bragg gratings are another powerful technology that rapidly made it to the market place. In a matter of 4-5 years, all the issues of fabrication, reliability, quality were mostly settled.

But many promising technologies failed miserably. Integrated optical circuits are among them, as even AWG's are still difficult to fabricate in volume with a reasonable yield. But remember also: fluoride-doped fibers, which in principle had a loss ten times lower than silica fibers; coherent detection, which was to increase sensitivity to the ultimate limit; ion-exchange waveguides (Corning lost a lot of money on those...); nonlinear optical switching; dispersion compensation using four-wave mixing; polymer waveguides and fiber (whatever happened to POF's!?).

Still, some technologies that one thought had been superseded by better ones have been resuscitated: a beautiful example is Raman amplification. While the kids may think this is a new invention, it dates from the early '80s and the pioneering work of Roger Stolen and others (hail Bell Labs!). In those days argon lasers were used as pumps, which made it inconceivable to use them in the field. Yet the advent of powerful semiconductor lasers (again!) suddenly made them an elegant alternative to EDFA's (thanks, Steve Grubb, for bringing this one back to life!). Now Raman amplification may very well be the only way to cover the whole transmission bandwidth of fibers (together with semiconductor amplifiers, who knows!).

So IOC's (integrated optical circuits) still seem too difficult to make, even after more than 20 years of research. Should we give up? Probably not, but let's put them on the back burner for a while, and keep an eye on technological breakthroughs that may suddenly change the picture. That is why "fundamental" research is still important and should be well funded.

A final note: for so many years we have seen the potential bandwidth of optical fiber as nearly infinite. Yet recent developments in DWDM technology have found ways to exploit it almost entirely in a matter of only a few years: the bandwidth carried by a single fiber in commercially available systems has increased nearly a thousandfold in only about 5 years, from 2.5 Gb/s to more than 1 Tb/s. Nobody should expect this trend to continue indefinitely. Actually, we only have about another order of magnitude to gain, to around 15 Tb/s (give or take a few), and that will be it; there simply won't be any more room. While these four orders of magnitude will have been a relatively easy gain, further factors of only 2 may be daunting challenges. So space division multiplexing may well become the new trend!
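The orders-of-magnitude arithmetic in the note above can be checked directly (a quick sketch; the ~15 Tb/s ceiling is the commenter's estimate, not an established limit):

```python
import math

# All figures in Gb/s so the ratios are dimensionless.
gain_so_far = 1000 / 2.5        # 2.5 Gb/s -> 1 Tb/s per fiber
gain_to_ceiling = 15000 / 2.5   # 2.5 Gb/s -> ~15 Tb/s assumed ceiling

print(round(math.log10(gain_so_far), 1))     # 2.6 orders of magnitude so far
print(round(math.log10(gain_to_ceiling), 1)) # 3.8, i.e. roughly four in total
```

So the gain to date is a factor of about 400, and reaching the assumed ceiling would indeed amount to roughly four orders of magnitude overall, leaving only about one more to go.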

Light2001 12/4/2012 | 7:59:33 PM
re: Adapt! Scott,

Great article, and I share your view about the state of affairs of "integrated optical ICs". It's been over-hyped and under-delivered. For those who believe otherwise: try harder, and show us!

I'd like to add one comment about the trend towards intelligent optical networking: one approach is to use adaptive technology to optimize system performance in response to dynamic network topology changes, such as dynamic DGFF, DCM, OLA, PMDC, etc. You've discussed this category at length.

The other trend, which I believe is more important, is to use more intelligent dynamic optical technology to enable system features, enhance optical networking functionality, and increase capacity. Tunable lasers, tunable filters, reconfigurable OADMs, etc. belong to this category.

You mentioned Atoga and Cinta as the two system-vendor examples, but did not elaborate on this distinction clearly.

redface 12/4/2012 | 7:59:31 PM
re: Adapt! Hi Lightbeating:

Thanks for the posting. It is excellent.

The list of "failed technology" should include "optical interconnect" and "SEED device"! These two things have received so much attention and made some people famous, yet they never took off.

I would like to discuss two points raised in your message.

"But many promising technologies failed miserably. Integrated optical circuits are among them, as even AWG's are still difficult to fabricate in volume with a reasonable yield."

I probably wouldn't put the AWG in the category of "failed technology" just yet. To me, the AWG is an elegant solution to the mux/demux problem and a cornerstone of DWDM technology. Just think: what other demux technology scales as well as the AWG? I don't know what kind of yield you are talking about, since you probably have better knowledge of it, but AWGs are being sold in high volume at about $200 per channel now, and I think that will drop even more in the future. People are working on better manufacturing processes, such as sputter deposition, to improve AWG yield, which may bear fruit in the not-so-distant future. So the AWG is here to stay.

"A final note: for so many years we have seen the potential bandwidth of optical fiber as nearly infinite. Yet recent developments in DWDM technology have found ways to exploit it almost entirely in a matter of only a few years... Nobody should expect this trend to continue indefinitely. Actually, we only have about another order of magnitude to gain, to around 15 Tb/s (give or take a few), and that will be it; there simply won't be any more room. While these four orders of magnitude will have been a relatively easy gain, further factors of only 2 may be daunting challenges. So space division multiplexing may well become the new trend!"

I agree with your assessment. However, we might not need that much fiber capacity (15 Tb/s) anyway. I think fiber optics' ultimate mission is to meet the communication needs of the human race forever. The maximum an individual consumer household needs is on the order of about 20 Mb/s, enough for HDTV transmission with a lot of compression. The aggregate bandwidth requirement of the world thus comes to about 25,000 Tb/s. Assuming a realistic fiber capacity of 2.5 Tb/s, we need about 10,000 fibers to carry all the real-time HDTV traffic of the world. That does not seem too difficult, since these 10,000 fibers are distributed all over the world and people are already laying fiber cables with hundreds of strands inside. I guess the other problem is the famous "last mile" bottleneck, which can be solved as well. So the task of wiring the world with fiber optics is progressing nicely, and we have another ten years to go to finish the job!

- R
[email protected] 12/4/2012 | 7:59:31 PM
re: Adapt! Another country heard from:

BW2088 AUG 07,2001 4:57 PACIFIC 07:57 EASTERN

( BW)(VA-CIR) New CIR White Paper Provides Insights Into the Evolution of the Integrated Optics Market

Business Editors/High Tech Writers

CHARLOTTESVILLE, Va.--(BUSINESS WIRE)--Aug. 7, 2001--Communications Industry Researchers, Inc.(CIR), a leading optical industry analyst firm based in Charlottesville, VA, has released a new White Paper that examines the trends and issues surrounding the integration of optical components. The paper is available for immediate download from its Web site, www.cir-inc.com .

Why the Fuss?

For years the microelectronics industry has enjoyed tremendous success through the ability to integrate many functions onto a single miniaturized chip. Companies such as Intel, IBM, Microsoft, Dell, Texas Instruments, Oracle and a host of other semiconductor, computing and software companies were founded and grew into industry giants and are, in many ways, the backbone of major segments of the world's economies. Massive wealth has been generated through making things "smaller, faster and cheaper."

The optical integration story presents some interesting parallels, according to CIR's new White Paper. Manufacturers of optical components have begun to bring together manufacturing processes and technologies that enable the creation of "hybrid" devices, joining the functionality of several "discrete" components in one miniaturized module. CIR sees the future of optical integration moving towards a monolithic process where optical "chips" will take the place of larger components in the way that electronic chips replaced printed circuit boards crammed with transistors.

Why Integrate?

CIR's new paper states that since service providers are looking for ways to substantially reduce infrastructure costs to improve profitability, manufacturers are being pressured to slash equipment costs while meeting the same performance requirements. The components makers are therefore being squeezed to provide ever-cheaper parts for their customers. Several new companies have received well over $100 million to bring to market the next generation of products that will enable the cost reductions necessary to fuel the next wave of growth within the optics industry.

Not so Fast

However, according to this new paper, there is a great deal of work to be done in order for optical integration to become more than a science lab experiment:

--Pricing levels are not attractive enough for systems manufacturers to buy integrated components just yet. CIR blames inadequate production capabilities for this as well as the model of passing on initial R&D costs in the first batches of components, similar to the practices within the pharmaceutical industry.

--Technological issues related to materials used in components construction are not resolved and the best approaches to manufacturing are still unclear.

--The first generation of integrated optical products simply does not match up well with the existing systems-level requirements of major equipment companies such as Nortel, Ciena and Lucent. It will be at least another 18 months before the next generation of hardware is deployed with significant amounts of integrated optical components designed in.

Companies and Technologies to Watch?

CIR believes that established firms such as Alcatel, Agere and JDS Uniphase have the potential to lead the market forward, with interesting start-ups such as Lightwave Microsystems, Genoa, Zenastra and Gemfire showing great promise. CIR does question Corning's apparent decision to lag the market, however. Silica on Silicon is seen as a powerful enabler of optical integration and is happening now, but Indium Phosphide is another strong bet for the future. Products such as low-cost transceivers and transmitters, monitors and amplifiers aimed at the metro space are viewed as legitimate opportunities. However, passive WDM products, optical switches and backplane products are seen as further behind.

The paper is available via CIR's Web site, www.cir-inc.com, and is drawn from CIR's soon-to-be-released report, "The Market for Integrated Optical Products: 2001-2005."
lightmaster 12/4/2012 | 7:59:29 PM
re: Adapt! Wow,

A response to a criticism that doesn't strike back, get defensive, or avoid the question, but tries to answer in a sincere manner. Scott, could you please spread a little of this civility to the rest of the site?
The Carmack 12/4/2012 | 7:59:29 PM
re: Adapt! "To an individual consumer household, the maximum he needs is on the order of about 20 Mb/s which is needed for HDTV transmission with a lot of compression."

You're assuming that HDTV is the be-all end-all communication needs of a human. Have you heard of Jaron Lanier's telepresence experiments?


These applications will require massive amounts of bandwidth, much more than HDTV. Who knows what other applications will come up besides those, and beyond.

Whatever bandwidth you think we will need in the future, you can be most sure about one thing: you're underestimating it.

"640k is enough for everyone"
- Bill Gates