With the advent of digital signal processing (DSP) in optical transmitters and receivers, the ratio of pre- to post-dispersion compensation can be finely tuned to best mitigate the nonlinear penalties caused by the Kerr effect. A portion of the nonlinear penalty in optical communication channels has been attributed to the increase in peak-to-average power ratio (PAPR) inherent in highly dispersed signals. The standard approach to minimizing these impairments applies 50% pre-dispersion compensation and 50% post-dispersion compensation, thereby decreasing the average PAPR along the length of the cable compared with either 100% pre- or 100% post-dispersion compensation. In this paper we demonstrate that considering only the net accumulated dispersion and applying a 50/50 pre/post split is not necessarily the best way to minimize PAPR and the resulting Kerr nonlinearities. Instead, we consider the cumulative dispersion along the entire length of the cable and, taking this additional information into account, derive an analytic formula for minimizing PAPR. Agreement with simulation and experimental measurements is
presented using a commercially available 100 Gb/s dual-polarization binary phase-shift keying (DP-BPSK) coherent modem with transmitter- and receiver-side DSP. Measurements are provided from two different 5000 km dispersion-managed submarine test-beds, as well as a 3800 km terrestrial test-bed with a mixture of SMF-28 and TWRS optical fiber. In dispersion-managed systems, the proposed method is shown to deviate significantly from the conventional 50/50 approach described above, and to agree more closely with simulation results and data collected from the laboratory test-beds.
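To make the distinction concrete, the sketch below contrasts the conventional 50/50 rule, which uses only the net accumulated dispersion, with a split chosen from the cumulative dispersion profile along the link. The paper's analytic formula is not reproduced here; as a stand-in proxy for PAPR minimization we pick the pre-compensation that minimizes the mean squared accumulated dispersion seen along the cable (a simple least-squares criterion), and the dispersion map values are hypothetical.

```python
# Hypothetical dispersion map for a dispersion-managed link: per-span
# residual dispersion in ps/nm (under-compensated spans plus one large
# compensating span). These numbers are illustrative only.
span_dispersion = [30.0, 30.0, 30.0, 30.0, -80.0]

# Cumulative dispersion after each span.
cumulative = []
total = 0.0
for d in span_dispersion:
    total += d
    cumulative.append(total)

net = cumulative[-1]

# Conventional rule: pre-compensate half of the net accumulated dispersion.
pre_conventional = -0.5 * net

# Profile-aware rule (least-squares proxy, not the paper's formula):
# minimize the average of (pre + C_k)^2 over spans k, which gives
# pre = -mean(C_k).
pre_profile_aware = -sum(cumulative) / len(cumulative)

def mean_sq_dispersion(pre):
    """Mean squared accumulated dispersion along the link for a given pre-comp."""
    return sum((pre + c) ** 2 for c in cumulative) / len(cumulative)

print(f"net dispersion:        {net:+.1f} ps/nm")
print(f"conventional pre:      {pre_conventional:+.1f} ps/nm")
print(f"profile-aware pre:     {pre_profile_aware:+.1f} ps/nm")
print(f"mean-square (conv):    {mean_sq_dispersion(pre_conventional):.1f}")
print(f"mean-square (profile): {mean_sq_dispersion(pre_profile_aware):.1f}")
```

For this map the two rules give noticeably different pre-compensation values, because the cumulative dispersion profile is asymmetric along the link even though the net dispersion is small: the profile-aware split tracks where the signal is actually highly dispersed, which is the intuition behind using the full dispersion map rather than its endpoint.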