An accuracy analysis is made of currently used near-infrared spectroscopy (NIRS) algorithms, based on the Beer-Lambert law, for quantification of chromophore concentration changes in the neonatal brain. In particular, the influence of the regression method, the chromophores considered, and the wavelength-dependent extinction coefficients on the accuracy of the computed concentration changes of Hb, HbO2, and Cytaa3 is investigated. The total least squares method, which also accounts for errors in the specific extinction coefficients, is able to double the accuracy compared to least squares when appropriate weights, derived from the error variances, are added and enough wavelength measurements (> 100) are available. Furthermore, the influence of the number of wavelengths used, as well as the importance of the choice of the wavelength subset, is studied. It is shown that the accuracy increases significantly up to 10 wavelengths. Additionally, optimizing the condition number of the extinction coefficient matrix allows the selection of small wavelength subsets that clearly outperform presently used sets in accuracy for all computed concentration changes. Finally, the influence of additional chromophores, such as H2O, is shown: H2O may only be included in the model when enough wavelengths (> 100) are used for the NIRS measurement or when the chosen subset minimizes the condition number of the corresponding extinction coefficient matrix.
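The wavelength-subset selection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the extinction coefficients are randomly generated placeholders (real tabulated values for Hb, HbO2, and Cytaa3 would be used in practice), and the subset size and wavelength grid are assumed for demonstration. The sketch searches all subsets of a given size for the extinction coefficient sub-matrix with the smallest condition number, then recovers concentration changes from attenuation changes via the linear Beer-Lambert model dA = E dc.

```python
import itertools
import numpy as np

# Placeholder extinction coefficients for Hb, HbO2, Cytaa3 at candidate
# wavelengths (rows: wavelengths, columns: chromophores). Illustrative
# random values, not measured spectra.
rng = np.random.default_rng(0)
wavelengths = np.arange(700, 1000, 10)  # candidate wavelengths [nm]
E = rng.uniform(0.1, 2.0, size=(len(wavelengths), 3))

def best_subset(E, k=4):
    """Exhaustively pick the k-wavelength subset whose extinction
    coefficient sub-matrix has the smallest condition number."""
    best_cond, best_idx = np.inf, None
    for idx in itertools.combinations(range(E.shape[0]), k):
        c = np.linalg.cond(E[list(idx), :])
        if c < best_cond:
            best_cond, best_idx = c, idx
    return best_cond, list(best_idx)

cond, idx = best_subset(E)
print("selected wavelengths [nm]:", wavelengths[idx])
print("condition number:", cond)

# Recover concentration changes dc from attenuation changes dA at the
# selected wavelengths via ordinary least squares (noise-free example).
dc_true = np.array([1.0, -0.5, 0.1])  # assumed Hb, HbO2, Cytaa3 changes
dA = E[idx, :] @ dc_true
dc, *_ = np.linalg.lstsq(E[idx, :], dA, rcond=None)
```

A well-conditioned sub-matrix keeps the recovered dc stable against measurement noise in dA, which is why condition-number-optimized subsets outperform ad hoc wavelength choices; the exhaustive search is only practical for small candidate sets.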