To estimate the depth distribution of the absorption coefficient in a turbid medium, we developed a new nonlinear inversion technique that resolves an important shortcoming of conventional linear inversion techniques: their nonlinear error. First, the turbid medium is divided into imaginary layers of arbitrary thickness, and the spatial pathlength distribution (SPD) of each layer is obtained by Monte Carlo simulation as a function of source-detector distance. An integral operation using the SPDs then yields the absorption coefficient of each layer, i.e., the depth distribution of absorption. This inversion rests on the assumption that light attenuation is linear with respect to the small pathlength of a photon; when the variance of the pathlengths of many photons within each layer is taken into account, however, this assumption produces a nonlinear error. We developed a technique that solves this problem in three steps. First, initial values of the absorption coefficient of each layer are obtained with a conventional linear inverse matrix, assuming that the pathlength variance of many photons does not exist. Second, improved absorption coefficients are obtained from these initial values and the pathlength variance using the same matrix, which corrects the nonlinear error. Third, this process is repeated, so that the improved absorption coefficients approach the true values. The effectiveness of the proposed technique was confirmed by Monte Carlo simulation, and the effect of measurement noise was also analyzed in the simulation.
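The iterative correction described above can be sketched numerically. This is a minimal illustration, not the paper's exact formulation: it assumes a second-order attenuation model in which the measured attenuation for each source-detector distance is the mean-pathlength term minus half the pathlength-variance term, and all matrices and coefficients below are invented for the demonstration (in the paper they would come from the Monte Carlo SPDs).

```python
import numpy as np

# Hypothetical mean partial pathlengths L[i, j]: mean pathlength in layer j
# for source-detector distance i (illustrative values, not simulated data).
L = np.array([[1.0, 0.2, 0.05],
              [1.2, 0.6, 0.15],
              [1.3, 0.9, 0.40]])
# Hypothetical pathlength variances Var(L[i, j]) for the same layers/distances.
V = np.array([[0.5, 0.1, 0.02],
              [0.6, 0.3, 0.08],
              [0.7, 0.5, 0.20]])
mu_true = np.array([0.05, 0.10, 0.20])  # absorption coefficient of each layer

# Assumed second-order model: attenuation = linear (mean-pathlength) term
# minus a variance term; the variance term is the source of nonlinear error.
def attenuation(mu):
    return L @ mu - 0.5 * V @ mu**2

dOD = attenuation(mu_true)  # "measured" attenuation at each distance

# Step 1: conventional linear inversion, ignoring the pathlength variance.
mu = np.linalg.solve(L, dOD)

# Steps 2-3: correct the nonlinear error with the variance term and the
# current estimate, reusing the same inverse matrix, and repeat.
for _ in range(20):
    mu = np.linalg.solve(L, dOD + 0.5 * V @ mu**2)
```

Under this model the linear estimate is biased by the neglected variance term, while the fixed-point iteration converges toward the true layer absorption coefficients.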