Due to increasing traffic demands, telecommunication operators have to upgrade the transmission capacity of their networks. Since the success of WDM in optical-fiber-based networks, component and system manufacturers as well as operators have been facing the question of whether it is better to increase the number of WDM channels while remaining at a low channel bit rate, or to raise the channel line rate itself. The current situation is that 10-Gbit/s-based systems are already installed for client traffic and are running properly.
A comparison of the technical advantages and business cases for a wide range of transmission scenarios shows that the 10-Gbit/s solution is preferable to 2.5-Gbit/s-based systems.
In the meantime, 40-GHz electronics has made substantial progress, so that the step towards the next hierarchy, the 40-Gbit/s channel rate, now seems feasible. Nevertheless, many system manufacturers are still postponing the market introduction of 40 Gbit/s, on the one hand because capacity demand has stagnated this year, and on the other hand because the business case does not yet seem competitive.
One of the reasons for this is the impact of various physical limitations on fast optical fiber transmission. While the step from 2.5 to 10 Gbit/s did not raise serious technological problems for medium distances, the situation is completely different for ultra-long-haul systems and especially for all 40-Gbit/s-based systems. Although nonlinear effects can still be managed adequately, phenomena such as polarization-mode dispersion (PMD), chromatic dispersion mismatch, and the gain tilt of optical amplifiers play an important role. Chromatic dispersion and polarization effects may vary with time, so that either passive or adaptive compensation schemes may be needed in order to achieve sufficiently long transmission distances.
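As a rough illustration of why these impairments become critical at 40 Gbit/s (this sketch is not part of the paper's own analysis), recall the standard scaling laws: the chromatic-dispersion-limited reach scales as 1/B² with the bit rate B, while the tolerable differential group delay from PMD scales as 1/B (commonly taken as about 10% of the bit period). The parameter values below (D = 17 ps/(nm·km), λ = 1550 nm, NRZ format) are typical textbook assumptions for standard single-mode fiber, not figures from this paper.

```python
# Illustrative back-of-the-envelope scaling, assuming standard SMF
# parameters (D = 17 ps/(nm km), lambda = 1550 nm) and NRZ modulation.
C = 299_792_458.0  # speed of light, m/s


def dispersion_limited_length_km(bitrate_gbps, D_ps_nm_km=17.0, wavelength_nm=1550.0):
    """Approximate dispersion-limited reach, L ~ c / (D * lambda^2 * B^2).

    Order-one prefactors are omitted; only the 1/B^2 scaling matters here.
    """
    B = bitrate_gbps * 1e9        # bit/s
    D = D_ps_nm_km * 1e-6         # ps/(nm km)  ->  s/m^2
    lam = wavelength_nm * 1e-9    # m
    return C / (D * lam**2 * B**2) / 1e3  # metres -> km


def pmd_tolerance_ps(bitrate_gbps):
    """Tolerable mean DGD, taken as ~10% of the bit period (a common rule of thumb)."""
    bit_period_ps = 1e12 / (bitrate_gbps * 1e9)
    return 0.1 * bit_period_ps


# Going from 10 to 40 Gbit/s cuts the dispersion-limited reach by a
# factor of 16 (roughly 70 km down to a few km on uncompensated SMF)
# and the PMD tolerance by a factor of 4 (10 ps down to 2.5 ps).
print(dispersion_limited_length_km(10), dispersion_limited_length_km(40))
print(pmd_tolerance_ps(10), pmd_tolerance_ps(40))
```

These numbers make the qualitative point above concrete: impairments that were marginal at 10 Gbit/s become dominant design constraints at 40 Gbit/s, which is why passive or adaptive compensation becomes unavoidable.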
This paper deals with different current solutions for overcoming the limitations imposed by fibers and components, namely the use of special modulation formats, the use of pa