Spatially varying motion blur in video arises from relative motion between the camera and the scene. Estimating accurate optical flow in the presence of such blur has received little attention so far. We extend the classical warping-based variational optical flow method to address this problem. First, we modify the data term to match the identified nonuniform motion blur between the input images, using a fast blur detection and deblurring technique. In particular, we propose a downsample-interpolation scheme that improves the efficiency of blur detection, reducing its running time by 75% or more. Second, we improve the edge-preserving regularization term at blurred motion boundaries to reduce the boundary errors caused by blur. The proposed method is evaluated on both synthetic and real sequences and yields improved overall performance over the state of the art in handling motion blur.