An algorithm is presented for real-time view interpolation of dynamic events across time and space. Two temporal and two spatial flow fields are estimated from four images captured by two cameras at two different times. A hybrid gradient- and correlation-based motion estimation method is used to compute optical flow fields with high density and accuracy. Based on these flow fields, texture coordinates of small textured squares are computed, and a new image is composed at an arbitrary viewpoint and time. Real-time processing is achieved through vectorized implementation of computationally demanding functions and through visualization using OpenGL and standard graphics hardware. The spatio-temporal view interpolation algorithm is applicable to non-rigid events, does not use explicit 3D models, and requires no user input.
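The core composition step can be illustrated with a minimal sketch. The sketch below is a simplification and not the paper's implementation: it interpolates between just two images along a single precomputed flow field using NumPy with nearest-neighbour sampling, whereas the actual system combines four images via two temporal and two spatial flow fields and renders textured squares with OpenGL. The function names and the form of the flow array are assumptions for illustration.

```python
import numpy as np

def warp_image(image, flow, alpha):
    """Warp `image` along `alpha` times the flow field using
    nearest-neighbour sampling (a simplification of the paper's
    textured-square rendering). `flow` has shape (h, w, 2)."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Sample source coordinates displaced by a fraction of the flow.
    src_x = np.clip(np.round(xs - alpha * flow[..., 0]), 0, w - 1).astype(int)
    src_y = np.clip(np.round(ys - alpha * flow[..., 1]), 0, h - 1).astype(int)
    return image[src_y, src_x]

def interpolate_view(img_a, img_b, flow_ab, t):
    """Compose an intermediate image at parameter t in [0, 1],
    where flow_ab maps pixels of img_a to img_b."""
    warped_a = warp_image(img_a, flow_ab, t)         # advance img_a by t
    warped_b = warp_image(img_b, -flow_ab, 1.0 - t)  # pull img_b back by 1 - t
    return (1.0 - t) * warped_a + t * warped_b
```

With t sweeping from 0 to 1 this cross-dissolves along the motion field; in the full algorithm the same weighting idea is applied over both the temporal and spatial flow fields, so the viewpoint and the time instant can be varied independently.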