Image registration is widely used in the clinic, for example during radiotherapy and image-guided surgery, as well as for general image analysis. Currently, this process is often very slow, yet for intra-operative procedures speed is crucial. Intensity-based image registration requires solving a nonlinear optimization problem, usually by (stochastic) gradient descent. This procedure relies on a proper setting of a parameter that controls the optimization step size. This parameter is difficult to choose manually, however, since it depends on the input data, the optimization metric, and the transformation model. Previously, the Adaptive Stochastic Gradient Descent (ASGD) method was proposed to choose the step size automatically, but it comes at a high computational cost. In this paper, we propose a new, computationally efficient method that automatically determines the step size by considering the observed distribution of the voxel displacements between iterations. A relation between the step size and the expectation and variance of the observed distribution is then derived. Experiments have been performed on 3D lung CT data (19 patients) using a nonrigid B-spline transformation model. For all tested dissimilarity metrics (mean squared distance, normalized correlation, mutual information, normalized mutual information), we obtained accuracy similar to that of ASGD. Whereas the estimation time of ASGD increases progressively with the number of parameters, the estimation time of the proposed method is substantially reduced to a nearly constant time, from 40 seconds to no more than 1 second when the number of parameters is 10^5.
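The idea of deriving a step size from the expectation and variance of the observed voxel-displacement distribution can be illustrated with a minimal sketch. Here the function name, the mean-plus-two-standard-deviations bound, and the target displacement `delta` are illustrative assumptions for exposition, not the exact relation derived in the paper:

```python
import numpy as np

def estimate_step_size(displacement_magnitudes, delta=1.0):
    """Illustrative step-size rule (hypothetical): given voxel displacement
    magnitudes observed for a trial unit step, scale the step so that a
    typical displacement stays near `delta` voxels."""
    mean = displacement_magnitudes.mean()  # expectation of the observed distribution
    std = displacement_magnitudes.std()    # spread of the observed distribution
    # Bound the typical displacement by mean + 2*std and choose the step
    # size that maps this bound onto the target delta (assumed rule).
    return delta / (mean + 2.0 * std)

# Simulated displacement magnitudes standing in for measured voxel motion.
rng = np.random.default_rng(0)
d = np.abs(rng.normal(2.0, 0.5, size=10_000))
gamma = estimate_step_size(d, delta=1.0)
```

Because only the first two moments of the displacement distribution are needed, such an estimate can be computed from a small voxel sample at near-constant cost, independent of the number of transform parameters.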