A grid-based Bayesian array (GBA) tracker for robust visual tracking was recently developed, introducing a novel method of deterministic sample generation and sample weighting for position estimation. In particular, a target motion model is constructed that predicts the target position in the next frame from the estimates in previous frames. Samples are generated by gridding within an ellipsoid centered at the prediction. For localization, radial edge detection is applied to each sample to determine whether it lies inside the target boundary. Sample weights are then assigned according to the number of edge points detected around the sample and its distance from the predicted position, and the position estimate is computed as the weighted sum of the sample set. In this paper, we enhance the capacity of the GBA tracker to handle targets with erratic motion in video by introducing adaptation in the motion model and iterative position estimation. The improved tracking performance over the original GBA tracker is demonstrated by tracking a single leukocyte in vivo and a ground vehicle observed in UAV video, both undergoing abrupt changes in motion. The experimental results show that the enhanced GBA tracker outperforms the original, successfully tracking more than 10% more of the total frames and increasing the number of video sequences with all frames tracked by more than 20%.
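The sampling and weighting steps described above can be sketched as follows. This is a minimal 2-D illustration only: the grid step, ellipse radii, and the Gaussian distance-weighting term are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def grid_samples(center, radii, step):
    """Generate deterministic grid samples inside an ellipse centered at
    the predicted target position (2-D slice of the paper's ellipsoid
    gridding; `step` is an assumed grid spacing)."""
    cx, cy = center
    rx, ry = radii
    xs = np.arange(cx - rx, cx + rx + step, step)
    ys = np.arange(cy - ry, cy + ry + step, step)
    # Keep only grid points that fall inside the ellipse.
    return np.array([(x, y) for x in xs for y in ys
                     if ((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2 <= 1.0])

def estimate_position(samples, edge_counts, center, sigma=5.0):
    """Weight each sample by its detected edge-point count and by its
    proximity to the predicted center (here a Gaussian falloff -- an
    illustrative choice), then return the weighted mean as the
    position estimate."""
    d2 = np.sum((samples - np.asarray(center)) ** 2, axis=1)
    w = np.asarray(edge_counts, dtype=float) * np.exp(-d2 / (2 * sigma ** 2))
    w /= w.sum()
    return w @ samples  # weighted sum of the sample set
```

With uniform edge counts and a grid symmetric about the prediction, the estimate reduces to the predicted center itself; in practice the edge counts come from the radial edge detection step and pull the estimate toward the target boundary's interior.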