Impact of DCF Properties on System Design
Book chapter
Date: 2007
Publisher and place of publication: Springer
Book: Fiber-Based Dispersion Compensation (pp. 425-496), Springer
ISBN: 0387403477
Abstract:
Dispersion-compensating fiber (DCF) has become an important building block of today’s high-capacity optical transport networks. To appreciate the importance of DCF for high-capacity lightwave systems, we briefly review the historic evolution of optical fiber transmission. The first single-mode optical fibers to be fabricated with low loss were step-index silica fibers, now referred to as standard single-mode fibers (SSMFs) and specified in International Telecommunication Union (ITU) standard G.652. In the late 1970s and early 1980s, transmission over such fibers was performed in the spectral window around 1300 nm, where this fiber’s chromatic dispersion (hereafter just referred to as “dispersion”) is lowest, and permitted loss-limited transmission over about 100 km at about 1 Gb/s using Fabry-Perot lasers. However, the minimum intrinsic loss of SSMFs at 1300 nm is still 0.4 dB/km, twice the minimum value of 0.2 dB/km found in the wavelength band around 1550 nm. Even at such low loss, and with the use of both distributed-feedback (DFB) lasers and highly sensitive coherent detection, transmission distances were attenuation-limited to about 200 km at 1-Gb/s data rates. With the development of erbium-doped fiber amplifiers (EDFAs) operating in the 1550-nm window, the limitation from fiber loss was circumvented, and transmission over distances of thousands of kilometers became possible. However, since the dispersion of SSMFs in the 1550-nm window amounts to around 17 ps/(nm km), it was now the accumulated dispersion that limited transmission distances to a few hundred kilometers at 2.5 Gb/s. To overcome this dispersion limit, new optical fibers with the dispersion zero shifted from the 1300-nm to the 1550-nm region were designed and fabricated (ITU standard G.653). Using such dispersion-shifted fibers (DSFs), the dispersion limit was pushed out to a few thousand kilometers without the need for dispersion-compensating fibers (DCFs).
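The dispersion-limited reach quoted above can be checked with a common back-of-the-envelope estimate, L ≈ c / (2 B² |D| λ²), where B is the bit rate, D the fiber dispersion, and λ the carrier wavelength. This criterion and the parameter values below are illustrative assumptions, not figures from the chapter:

```python
# Rough dispersion-limited reach for an uncompensated NRZ link:
#   L ~ c / (2 * B^2 * |D| * lambda^2)
# The factor-of-2 criterion is one common rule of thumb; exact limits
# depend on modulation format and allowed power penalty.

C = 3.0e8  # speed of light, m/s


def dispersion_limited_km(bit_rate_gbps, d_ps_nm_km=17.0, wavelength_nm=1550.0):
    """Estimated dispersion-limited distance (km) over fiber with dispersion D."""
    b = bit_rate_gbps * 1e9                      # bit rate, 1/s
    d = d_ps_nm_km * 1e-6                        # ps/(nm km) -> s/m^2
    lam = wavelength_nm * 1e-9                   # nm -> m
    return C / (2.0 * b ** 2 * d * lam ** 2) / 1e3  # m -> km


# SSMF (D ~ 17 ps/(nm km)): a few hundred km at 2.5 Gb/s,
# and 16x less at 10 Gb/s, since reach scales as 1/B^2.
print(round(dispersion_limited_km(2.5)))
print(round(dispersion_limited_km(10.0)))
```

The 1/B² scaling is why the step from 2.5 to 10 Gb/s made dispersion compensation unavoidable over SSMF.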
However, with the advent of wavelength-division multiplexing (WDM), it was discovered that Kerr fiber nonlinearities in fibers having low dispersion values [0.2 ps/(nm km)], such as DSFs, lead to signal distortions from four-wave mixing (FWM) and from nonlinear mixing of signal and noise, which strongly limit transmission distance. As a result, non-zero dispersion-shifted fibers (NZDFs) were developed to provide a dispersion value [2 ps/(nm km)] in the 1550-nm window sufficient to prevent FWM (ITU standard G.655). The presence of NZDFs and SSMFs in optical networks, along with the increase in signal speed and the growth in the bandwidth of optical amplifiers, all contributed to making dispersion compensation necessary in current and future optical networks. The need for dispersion compensation in the low-loss amplification window was identified as early as 1980 as a means to extend the dispersion-limited transmission distance. Among the early technologies used to demonstrate dispersion compensation were chirped fiber Bragg gratings, all-pass filters, and microstrip delay equalizers in combination with coherent detection. The first use of negative-dispersion optical fibers as dispersion compensators in system experiments was demonstrated in 1993, when eight wavelength-division multiplexed (WDM) channels operating at 10 Gb/s were transmitted over 280 km of NZDF. Many experimental demonstrations followed, in terrestrial applications as well as in submarine systems. The capacity of WDM transport then increased dramatically as dispersion-compensated transmission lines became efficient. The beginning of the deployment of 10-Gb/s-based WDM communication systems near the end of the 1990s fostered the incorporation of dispersion compensation in various optical transmission systems.
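A negative-dispersion fiber compensates a span by contributing accumulated dispersion (D × L, in ps/nm) of the opposite sign. A minimal sketch of how a DCF module is sized, using typical textbook fiber values (assumed, not taken from the chapter):

```python
# Sizing a DCF module so that the span's net accumulated dispersion is ~0.
# Fiber parameters are typical illustrative values: SSMF at +17 ps/(nm km),
# DCF at -100 ps/(nm km).


def dcf_length_km(span_km, d_line=17.0, d_dcf=-100.0):
    """Length of DCF (km) that cancels the accumulated dispersion of a span.

    d_line, d_dcf: dispersion in ps/(nm km); d_dcf must be negative.
    """
    return -d_line * span_km / d_dcf


def residual_dispersion(span_km, dcf_km, d_line=17.0, d_dcf=-100.0):
    """Net accumulated dispersion (ps/nm) of the span plus its DCF."""
    return d_line * span_km + d_dcf * dcf_km


# An 80-km SSMF span accumulates 17 * 80 = 1360 ps/nm,
# which 13.6 km of the assumed DCF brings back to zero.
length = dcf_length_km(80.0)
print(length, residual_dispersion(80.0, length))
```

In practice, designers often leave a small nonzero residual per span rather than compensating exactly, to keep nonlinear penalties down; the arithmetic above is only the zero-residual case.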
By the beginning of the third millennium, deployment of 10-Gb/s technologies had become widespread in backbone terrestrial networks, making dispersion compensation omnipresent in the fabric of the worldwide fiber-optic communication infrastructure. The types of backbone terrestrial networks requiring dispersion compensation include ultra-long-haul (ULH, > 3000 km), long-haul (LH, 1000-3000 km), and regional (300-1000 km). A few years after its appearance in backbone networks, dispersion compensation also started to appear in regional and metropolitan optical networks (< 300 km) as they adopted 10-Gb/s technologies. The increased demand for transport capacity, first in backbone and later in metropolitan networks, and the cost reduction of 10-Gb/s transponders are what allowed the large-scale deployment of dispersion compensation to take place. Nowadays, fiber-optic communication systems are, de facto, designed with dispersion compensation built in, so as to accommodate transport at any bit rate from 2.5 Gb/s to 40 Gb/s.
Information provided by the researcher via SIGEVA.

Keywords:
dispersion compensation, optical communication