It is well known that Principal Components Analysis (PCA) is optimal in the sense of mean square error (MSE). However, estimation based on MSE is sensitive to noise and outliers, and is therefore not robust. To obtain a robust estimate, the absolute error criterion (L1 norm) could be used, but it is not differentiable at the origin; the minimax criterion (L∞ norm) could also be applied, but only for batch learning. In this paper, a cost function is proposed for robust estimation of the principal components of random variables. This cost function is rooted in the M-estimators of robust statistics. It has the form φ(t) = [1 − exp(−βt²)] / [1 + exp(−βt²)]. It is easy to verify that this function is even, nonnegative, and differentiable at every t. Hence, it overcomes the drawback of the discontinuity of the M-estimators. We derived an on-line adaptation rule for both the weights and the slope β of the cost function in a linear network. With the adjustable β, the relative position between the near-linear region and the saturation region can be adapted to the given random data. Simulation results showed that the representation error is much smoother with the new cost function than with the original MSE-based PCA.
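As a minimal sketch of the proposed cost function only (the function name and the numerical checks below are illustrative, not from the paper), the stated form φ(t) = [1 − exp(−βt²)] / [1 + exp(−βt²)] can be written and its claimed properties checked directly:

```python
import math

def phi(t, beta=1.0):
    """Robust cost phi(t) = (1 - exp(-beta*t^2)) / (1 + exp(-beta*t^2)).

    Even, nonnegative, differentiable everywhere, and saturating toward 1
    as |t| grows; beta controls where saturation sets in.
    """
    e = math.exp(-beta * t * t)
    return (1.0 - e) / (1.0 + e)

# Evenness: phi(-t) == phi(t)
print(phi(2.0), phi(-2.0))

# Zero at the origin, saturating toward 1 for large |t|
print(phi(0.0), phi(10.0))

# Larger beta pushes the saturation region closer to the origin
print(phi(1.0, beta=0.5), phi(1.0, beta=5.0))
```

For small |t|, a Taylor expansion gives exp(−βt²) ≈ 1 − βt², so φ(t) ≈ βt²/2: near the origin the cost behaves like a scaled squared error, while large errors are bounded by the saturation at 1, which is the mechanism behind the robustness to outliers.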