Describe the Evolution of Lightwave Systems

Lightwave systems represent a natural extension of microwave communication systems inasmuch as information is transmitted over an electromagnetic carrier in both types of systems. The major difference from a conceptual standpoint is that, whereas the carrier frequency is typically ~1 GHz for microwave systems, it increases by five orders of magnitude and is typically ~100 THz in the case of lightwave systems. This increase in carrier frequency translates into a corresponding increase in the system capacity. Indeed, whereas microwave systems rarely operate above 0.2 Gb/s, commercial lightwave systems can operate at bit rates exceeding 1 Tb/s. Although the optical carrier is transmitted in free space for some applications related to satellites and space research, terrestrial lightwave systems often employ optical fibers for information transmission. Such fiber-optic communication systems have been deployed worldwide since 1980 and constitute the backbone behind the Internet. One can even claim that the lightwave technology, together with advances in microelectronics, was responsible for the advent of the "information age" by the end of the twentieth century. The objective of this book is to describe the physics and engineering behind various kinds of lightwave systems. The purpose of this introductory chapter is to present the basic concepts together with the background material. Section 1.1 provides a historical perspective on the development of lightwave communication systems. Section 1.2 focuses on the building blocks of such a system and describes briefly the three components known as optical transmitters, fibers, and receivers. Section 1.3 covers concepts such as analog and digital signals and the techniques used to convert between the two. Channel multiplexing in the time and frequency domains is described in Section 1.4, where we also discuss the technique of code-division multiplexing.

1.1 Evolution of Lightwave Systems

Microwave communication systems were commercialized during the decade of the 1940s, and carrier frequencies of up to 4 GHz were used by 1947 for a commercial system operating between New York and Boston [1]. During the next 25 years or so, microwave as well as coaxial systems evolved considerably. Although such systems were able to operate at bit rates of up to 200 Mb/s or so, they were approaching the fundamental limits of the technology behind them. It was realized in the 1950s that an increase of several orders of magnitude in the system capacity should be possible if optical waves were used in place of microwaves as the carrier of information. However, neither a coherent optical source nor a suitable transmission medium was available during the 1950s. The invention of the laser solved the first problem [2]. Attention was then focused on finding ways of transmitting laser light over long distances. In contrast with microwaves, optical beams suffer from many problems when they are transmitted through the atmosphere. Many ideas were advanced during the 1960s to solve these problems [3], the most noteworthy being the idea of light confinement using a sequence of gas lenses [4].

In a parallel but unrelated development, optical glass fibers were developed during the 1950s, mainly from the standpoint of medical applications [5]-[9]. It was suggested in 1966 that optical fibers might be the best choice for transporting optical signals in lightwave systems [10], as they are capable of guiding light in a manner similar to the guiding of electrons in copper wires. The main problem was their high losses, since fibers available during the 1960s had losses in excess of 1,000 dB/km.

A breakthrough occurred in 1970 when fiber losses were reduced to below 20 dB/km in the wavelength region near 1 μm using a novel fabrication technique [11]. At about the same time, GaAs semiconductor lasers, operating continuously at room temperature, were demonstrated [12]. The simultaneous availability of compact optical sources and low-loss optical fibers led to a worldwide effort for developing fiber-optic communication systems during the 1970s [13]. After a successful Chicago field trial in 1977, terrestrial lightwave systems became available commercially beginning in 1980 [14]-[16]. Figure 1.1 shows the increase in the capacity of lightwave systems realized after 1980 through several generations of development. As seen there, the commercial deployment of lightwave systems followed the research and development phase closely. The progress has indeed been rapid, as is evident from an increase in the system capacity by a factor of 100,000 over a period of less than 25 years. The saturation of the capacity after 2000 is partly due to the economic slowdown experienced by the lightwave industry (known popularly as the bursting of the telecom bubble).


Figure 1.1: Increase in the capacity of lightwave systems realized after 1980. Commercial systems (circles) follow research demonstrations (squares) with a few-year lag. The change in the slope after 1992 is due to the advent of WDM technology.


Figure 1.2: Increase in the BL product from 1975 to 2000 through four generations of lightwave systems. Different symbols are used for successive generations. (After Ref. [17]; ©2000 IEEE.)

The distance over which a lightwave system can transmit data without introducing errors is also important while judging the system performance. Since the signal is degraded during transmission, most lightwave systems require periodic regeneration of the optical signal through devices known as "repeaters." A commonly used figure of merit for any communication system is the bit rate-distance product, BL, where B is the bit rate and L is the repeater spacing. The research phase of lightwave systems started around 1975. The first-generation systems operated in the near infrared at a wavelength close to 800 nm and used GaAs semiconductor lasers as an optical source. They were able to work at a bit rate of 45 Mb/s and allowed repeater spacings of up to 10 km. The 10-km value may seem too small from a modern perspective, but it was 10 times larger than the 1-km spacing prevalent at that time in coaxial systems.
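To make the figure of merit concrete, the BL products implied by the numbers quoted in this section can be computed with a short script (a minimal sketch; the helper function name is ours, not from the text):

```python
def bl_product(bit_rate_gbps: float, spacing_km: float) -> float:
    """Bit rate-distance product BL in (Gb/s)-km,
    where B is the bit rate and L is the repeater spacing."""
    return bit_rate_gbps * spacing_km

# First-generation systems: 45 Mb/s with repeater spacings of up to 10 km.
first_gen = bl_product(0.045, 10)   # 0.45 (Gb/s)-km

# Second-generation systems (by 1987): 1.7 Gb/s with ~50-km spacing.
second_gen = bl_product(1.7, 50)    # 85 (Gb/s)-km

print(first_gen, second_gen)
```

The roughly 200-fold jump between these two values illustrates why the BL product, rather than the bit rate alone, is the standard yardstick for comparing generations.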

The enormous progress realized over the 25-year period extending from 1975 to 2000 can be grouped into four distinct generations. Figure 1.2 shows the increase in the BL product over this time period as quantified through various laboratory experiments [17]. The straight line corresponds to a doubling of the BL product every year. In every generation, BL increases initially but then saturates as the technology matures. Each new generation brings a fundamental change that helps to improve the system performance further.

It was clear during the 1970s that the repeater spacing could be increased considerably by operating the lightwave system in the wavelength region near 1.3 μm, where fiber losses were below 0.5 dB/km. Furthermore, optical fibers exhibit minimum dispersion in this wavelength region. This realization led to a worldwide effort for the development of semiconductor lasers and detectors operating near 1.3 μm. The second generation of fiber-optic communication systems became available in the early 1980s, but the bit rate of early systems was limited to below 100 Mb/s because of dispersion in multimode fibers [18]. This limitation was overcome by the use of single-mode fibers. A laboratory experiment in 1981 demonstrated transmission at 2 Gb/s over 44 km of single-mode fiber [19]. The introduction of commercial systems soon followed. By 1987, second-generation lightwave systems, operating at bit rates of up to 1.7 Gb/s with a repeater spacing of about 50 km, were commercially available.

The repeater spacing of the second-generation lightwave systems was limited by fiber losses at the operating wavelength of 1.3 μm (typically 0.5 dB/km). Losses of silica fibers become minimum near 1.55 μm. Indeed, a 0.2-dB/km loss was realized in 1979 in this spectral region [20]. However, the introduction of third-generation lightwave systems operating at 1.55 μm was considerably delayed by a relatively large dispersion of standard optical fibers in the wavelength region near 1.55 μm. The dispersion problem can be overcome either by using dispersion-shifted fibers designed to have minimum dispersion near 1.55 μm or by limiting the laser spectrum to a single longitudinal mode. Both approaches were followed during the 1980s. By 1985, laboratory experiments indicated the possibility of transmitting information at bit rates of up to 4 Gb/s over distances in excess of 100 km [21]. Third-generation lightwave systems operating at 2.5 Gb/s became available commercially in 1990. Such systems are capable of operating at a bit rate of up to 10 Gb/s [22]. The best performance is achieved using dispersion-shifted fibers in combination with distributed-feedback (DFB) semiconductor lasers.

A drawback of third-generation 1.55-μm systems was that the optical signal had to be regenerated periodically using electronic repeaters after 60 to 70 km of transmission because of fiber losses. Repeater spacing could be increased by 10 to 20 km using homodyne or heterodyne detection schemes because their use requires less power at the receiver. Such coherent lightwave systems were studied during the 1980s and their potential benefits were demonstrated in many system experiments [23]. However, commercial introduction of such systems was postponed with the advent of fiber amplifiers in 1989.

The fourth generation of lightwave systems makes use of optical amplification for increasing the repeater spacing and of wavelength-division multiplexing (WDM) for increasing the bit rate. As evident from the different slopes in Figure 1.1 before and after 1992, the advent of the WDM technique started a revolution that resulted in a doubling of the system capacity every 6 months or so and led to lightwave systems operating at a bit rate of 10 Tb/s by 2001. In most WDM systems, fiber losses are compensated periodically using erbium-doped fiber amplifiers spaced 60 to 80 km apart. Such amplifiers were developed after 1985 and became available commercially by 1990. A 1991 experiment showed the possibility of data transmission over 21,000 km at 2.5 Gb/s, and over 14,300 km at 5 Gb/s, using a recirculating-loop configuration [24]. This performance indicated that an amplifier-based, all-optical, submarine transmission system was feasible for intercontinental communication. By 1996, not only had transmission over 11,300 km at a bit rate of 5 Gb/s been demonstrated by using actual submarine cables [25], but commercial transatlantic and transpacific cable systems also became available. Since then, a large number of submarine lightwave systems have been deployed worldwide.

Figure 1.3: International network of submarine fiber-optic cables in 2004. (Source: TeleGeography Research Group, PriMetrica, Inc. ©2004.)

Figure 1.3 shows the international network of submarine fiber-optic cables as it existed in 2004. The 27,000-km fiber-optic link around the globe (known as FLAG) became operational in 1998, linking many Asian and European countries [26]. Another major lightwave system, known as Africa One, was operational by 2000; it circles the African continent and covers a total transmission distance of about 35,000 km [27]. Several WDM systems were deployed across the Atlantic and Pacific oceans from 1998 to 2001 in response to the Internet-induced increase in data traffic; they have increased the total capacity by orders of magnitude [28]. One can indeed say that the fourth generation of lightwave systems led to an information revolution that was fuelled by the advent of the Internet.

At the dawn of the twenty-first century, the emphasis of lightwave systems was on increasing the system capacity by transmitting more and more channels through the WDM technique. With increasing WDM signal bandwidth, it was often not possible to amplify all channels using a single amplifier. As a result, new kinds of amplification schemes were explored for covering the spectral region extending from 1.45 to 1.62 μm. This approach led in 2000 to a 3.28-Tb/s experiment in which 82 channels, each operating at 40 Gb/s, were transmitted over 3,000 km, resulting in a BL product of almost 10,000 (Tb/s)-km. Within a year, the system capacity could be increased to nearly 11 Tb/s (273 WDM channels, each operating at 40 Gb/s), but the transmission distance was limited to 117 km [29]. By 2003, in a record experiment, 373 channels, each operating at 10 Gb/s, were transmitted over 11,000 km, resulting in a BL product of more than 41,000 (Tb/s)-km [30]. On the commercial side, terrestrial systems with a capacity of 1.6 Tb/s were available by the end of 2000. Given that the first-generation systems had a bit rate of only 45 Mb/s in 1980, it is remarkable that the capacity of lightwave systems jumped by a factor of more than 30,000 over a period of only 20 years.

The pace slowed down considerably during the economic turndown in the lightwave industry that began in 2000 and was not completely over in 2004. Although commercial deployment of new lightwave systems virtually halted during this period, the research phase has continued worldwide and is moving toward the fifth generation of lightwave systems. This new generation is concerned with extending the wavelength range over which a WDM system can operate simultaneously. The conventional wavelength window, known as the C band, covers the wavelength range of 1.53 to 1.57 μm. It is being extended on both the long- and short-wavelength sides, resulting in the L and S bands, respectively. The traditional erbium-based fiber amplifiers are unable to work over such a wide spectral region. For this reason, the Raman amplification technique, well known from the earlier research performed in the 1980s [31], has been readopted for lightwave systems as it can work in all three wavelength bands using suitable pump lasers [32]-[35]. A new kind of fiber, known as the dry fiber, has been developed with the property that fiber losses are small over the entire wavelength region extending from 1.30 to 1.65 μm [36]. Research is also continuing in several other directions to realize optical fibers with suitable loss and dispersion characteristics. Most noteworthy are photonic-crystal fibers, whose dispersion can be changed drastically using an array of holes within the cladding layer [37]-[41]. Moreover, if the central core itself is in the form of a hole, light can be transmitted through air while guided by the photonic-crystal structure of the cladding [42]-[46]. Such fibers have the potential of transmitting optical signals with virtually no losses and little nonlinear distortion!

The fifth-generation systems also attempt to enhance the spectral efficiency by adopting new modulation formats, while increasing the bit rate of each WDM channel. Starting in 1996, many experiments used channels operating at 40 Gb/s [47]-[54], and by 2003 such 40-Gb/s lightwave systems had reached the commercial stage. At the same time, the research phase has moved toward WDM systems with 160 Gb/s per channel [55]-[58]. Such systems require an extremely careful management of fiber dispersion. Novel techniques capable of compensating chromatic and polarization-mode dispersions in a dynamic fashion are being developed to meet such challenges. An interesting approach is based on the concept of optical solitons—pulses that preserve their shape during propagation in a lossless fiber by counteracting the effect of dispersion through the fiber nonlinearity. Although the basic idea was proposed [59] as early as 1973, it was only in 1988 that a laboratory experiment demonstrated the feasibility of data transmission over 4,000 km by compensating fiber losses through Raman amplification [31]. Since then, many system experiments have demonstrated the eventual potential of soliton communication systems [60]. Starting in 1996, the WDM technique was also used for solitons in combination with dispersion-management and Raman amplification schemes [61]-[64]. Many new modulation formats are being proposed for advancing the state of the art. Even though the lightwave communication technology is barely 25 years old, it has progressed rapidly and has reached a certain stage of maturity. Many books were published during the 1990s on topics related to optical communications and WDM networks, and this trend is continuing in the twenty-first century [65]-[80].

Figure 1.4: A generic optical communication system.

1.2 Components of a Lightwave System

As mentioned earlier, lightwave systems differ from microwave systems only in the frequency range of the carrier wave used to carry the information. Both types of systems can be divided into three major parts. Figure 1.4 shows a generic optical communication system consisting of an optical transmitter, a communication channel, and an optical receiver. Lightwave systems can be classified into two broad categories depending on the nature of the communication channel. The optical signal propagates unguided in air or vacuum for some applications [81]. However, in the case of guided lightwave systems, the optical beam emitted by the transmitter remains spatially confined inside an optical fiber. This text focuses exclusively on such fiber-optic communication systems.

1.2.1 Optical Transmitters

The role of optical transmitters is to convert an electrical signal into optical form and to launch the resulting optical signal into the optical fiber acting as a communication channel. Figure 1.5 shows the block diagram of an optical transmitter. It consists of an optical source, a modulator, and electronic circuits used to power and operate the two devices. Semiconductor lasers or light-emitting diodes are used as optical sources because of their compact nature and compatibility with optical fibers. The source emits light in the form of a continuous wave at a fixed wavelength, say, λ0. The carrier frequency ν0 is related to this wavelength as ν0 = c/λ0, where c is the speed of light in vacuum.

In modern lightwave systems, ν0 is chosen from a set of frequencies standardized by the International Telecommunication Union (ITU). It is common to divide the spectral region near 1.55 μm into two bands known as the conventional or C band

Figure 1.5: Block diagram of an optical transmitter.

and the long-wavelength or L band. The C band covers carrier frequencies from 191 to 196 THz (in steps of 50 GHz) and spans roughly the wavelength range of 1.53 to 1.57 μm. In contrast, the L band occupies the range 1.57 to 1.61 μm and covers carrier frequencies from 186 to 191 THz, again in steps of 50 GHz. The short-wavelength or S band, covering the wavelength region from 1.48 to 1.53 μm, may be used for future lightwave systems as the demand for capacity grows. It is important to realize that the source wavelength needs to be set precisely for a given choice of the carrier frequency. For example, a channel operating at 193 THz requires an optical source emitting light at a wavelength of 1.5533288 μm if we use the precise value c = 299,792,458 m/s for the speed of light in vacuum.
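The conversion between an ITU carrier frequency and the required source wavelength, ν0 = c/λ0, can be checked with a few lines of code (a minimal sketch; the function name is ours, and the grid values are those quoted above):

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def wavelength_um(freq_thz: float) -> float:
    """Source wavelength in micrometers for a carrier frequency in THz."""
    return C / (freq_thz * 1e12) * 1e6

# A 193-THz channel calls for a source emitting near 1.5533288 um.
print(round(wavelength_um(193.0), 7))  # 1.5533288

# C-band ITU grid: carrier frequencies from 191 to 196 THz
# in steps of 50 GHz (0.05 THz), i.e., 101 channels.
grid_thz = [191.0 + 0.05 * n for n in range(101)]
print(grid_thz[0], grid_thz[-1])
```

The seven-digit precision of the wavelength underlines the point made in the text: a 50-GHz channel spacing corresponds to a wavelength difference of only about 0.4 nm near 1.55 μm, so the source wavelength must be controlled very tightly.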

Before the source light can be launched into the communication channel, the information that needs to be transmitted should be imposed on it. This step is accomplished by an optical modulator in Figure 1.5. The modulator uses the data in the form of an electrical signal to modulate the optical carrier. Although an external modulator is often needed at high bit rates, it can be dispensed with at low bit rates using a technique known as direct modulation. In this technique, the electrical signal representing information is applied directly to the driving circuit of the semiconductor optical source, resulting in the modulated source output. Such a scheme simplifies the transmitter design and is generally more cost-effective. In both cases, the modulated light is coupled into a short piece of fiber (called a pigtail) with a connector attached to its other end. Chapter 2 provides more details on how the optical signal is generated within an optical transmitter.

An important design parameter is the average optical power launched into the communication channel. Clearly, it should be as large as possible to enhance the signal-to-noise ratio (SNR) at the receiver end. However, the onset of various nonlinear effects limits how much power can be launched at the transmitter end. The launched power is often expressed in "dBm" units with 1 mW acting as the reference level. The general definition is (see Appendix A)

power (in dBm) = 10 log10(power/1 mW).

Thus, 1 mW is 0 dBm, but 1 μW corresponds to -30 dBm. The launched power is rather low (less than -10 dBm) for light-emitting diodes, but semiconductor lasers can launch power levels exceeding 5 dBm.
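The dBm conversion can be sketched in a few lines (assuming only the standard definition above; the function names are ours):

```python
import math

def mw_to_dbm(power_mw: float) -> float:
    """Convert power in milliwatts to dBm (1 mW reference level)."""
    return 10.0 * math.log10(power_mw / 1.0)

def dbm_to_mw(power_dbm: float) -> float:
    """Convert power in dBm back to milliwatts."""
    return 10.0 ** (power_dbm / 10.0)

print(mw_to_dbm(1.0))    # 1 mW is 0 dBm
print(mw_to_dbm(0.001))  # 1 uW corresponds to -30 dBm
print(dbm_to_mw(5.0))    # a 5-dBm laser launches about 3.16 mW
```

Because the scale is logarithmic, every 10-dB step is a factor of 10 in power, which is why the 15-dB gap between a -10-dBm light-emitting diode and a 5-dBm laser corresponds to a roughly 30-fold difference in launched power.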

Although light-emitting diodes are useful for some low-end applications related to local-area networking and computer-data transfer, most lightwave systems employ semiconductor lasers as optical sources. The bit rate of optical transmitters is often limited by electronics rather than by the semiconductor laser itself. With proper design, optical transmitters can be made to operate at a bit rate of up to 40 Gb/s.

1.2.2 Communication Channel

The role of a communication channel is to transport the optical signal from transmitter to receiver with as little loss in quality as possible. Most terrestrial lightwave systems employ optical fibers as the communication channel because they can transmit light with losses as small as 0.2 dB/km when the carrier frequency lies in the spectral region
