Nonnegative matrix factorization (NMF) is a recently developed method for dimensionality reduction, feature extraction, and data mining. Currently, no NMF algorithm combines satisfactory efficiency with sufficient ease of use. To improve the applicability of NMF, we propose a new monotonic fixed-point algorithm, FastNMF, which performs least-squares error-based nonnegative factorization by exploiting the basic properties of parabola functions. Each update operation in FastNMF analytically solves its corresponding minimization subproblem in a single step, which no existing algorithm achieves; FastNMF therefore attains much higher efficiency, as validated by a set of experimental results. Owing to its simple design philosophy, FastNMF also remains among the easiest NMF algorithms to use and to understand. In addition, theoretical analysis and experimental results show that, in terms of approximation accuracy, FastNMF tends to converge to better solutions than the popular multiplicative update-based algorithms.
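The parabola-based idea can be illustrated with a hedged sketch: with all other entries fixed, the least-squares error is a parabola in each row of a factor, so the nonnegativity-constrained minimizer is simply the parabola's vertex clipped at zero. The code below is a hypothetical HALS-style reconstruction of such an analytic update (function names `update_factor` and `fastnmf_like` are illustrative, not the authors' exact formulation).

```python
import numpy as np

def update_factor(V, W, H):
    """One sweep of analytic row-wise updates for H in ||V - W H||_F^2.

    With the other rows fixed, the error is a separable parabola in the
    entries of row H[k, :]; the constrained minimum is the vertex of each
    parabola projected onto the nonnegative orthant. (Hypothetical
    HALS-style sketch, not necessarily the FastNMF update itself.)
    """
    WtV = W.T @ V          # precompute W^T V
    WtW = W.T @ W          # precompute W^T W (Gram matrix)
    for k in range(H.shape[0]):
        denom = WtW[k, k]
        if denom == 0.0:
            continue       # degenerate column: leave row unchanged
        # Vertex of the parabola in H[k, :], then clip at zero.
        numer = WtV[k] - WtW[k] @ H + denom * H[k]
        H[k] = np.maximum(0.0, numer / denom)
    return H

def fastnmf_like(V, r, iters=200, seed=0):
    """Alternately apply the analytic update to H and (by symmetry) to W."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        H = update_factor(V, W, H)
        # ||V - W H|| = ||V^T - H^T W^T||, so the same routine updates W.
        W = update_factor(V.T, H.T, W.T).T
    return W, H
```

Because each row update solves its subproblem exactly, every sweep can only decrease the least-squares error, which is the monotonicity property the abstract refers to.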