Using animation quality metric to improve efficiency of global illumination computation for dynamic environments (30 May 2002)
Abstract
In this paper, we consider applications of perception-based video quality metrics to improve the performance of global lighting computations for dynamic environments. For this purpose we extend the Visible Differences Predictor (VDP) developed by Daly to handle computer animations. We incorporate into the VDP the spatio-velocity CSF model developed by Kelly. The CSF model requires data on the velocity of moving patterns across the image plane. We use the 3D image warping technique to compensate for camera motion, and we conservatively assume that the motion of animated objects (usually strong attractors of visual attention) is fully compensated by smooth pursuit eye motion. Our global illumination solution is based on stochastic photon tracing and takes advantage of the temporal coherence of the lighting distribution by processing photons in both the spatial and temporal domains. The VDP is used to keep the noise inherent in stochastic methods below the sensitivity level of the human observer. As a result, a perceptually consistent quality is obtained across all animation frames.
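The two perceptual ingredients named above, a smooth-pursuit model that converts image-plane velocity into retinal velocity, and Kelly's spatio-velocity CSF, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the pursuit-gain form and the coefficient values (c0, c1, c2, the pursuit gain, and the drift/saturation velocities) are assumptions drawn from Daly's commonly cited calibration of Kelly's model, and may differ from the exact parameterization used in the paper.

```python
import math

def retinal_velocity(v_image, pursuit_gain=0.82, v_drift=0.15, v_sat=80.0):
    """Retinal velocity (deg/s) after smooth-pursuit compensation.

    Assumed eye-movement model: the eye tracks a fixed fraction of the
    image-plane velocity, offset by natural drift and capped at a
    saturation velocity. All parameter values are illustrative.
    """
    v_eye = min(pursuit_gain * v_image + v_drift, v_sat)
    return abs(v_image - v_eye)

def spatio_velocity_csf(rho, v, c0=1.14, c1=0.67, c2=1.7):
    """Kelly's spatio-velocity CSF (Daly's calibrated form, assumed here).

    rho -- spatial frequency in cycles/degree
    v   -- retinal velocity in degrees/second
    """
    v = max(v, 0.15)  # natural eye drift keeps retinal velocity nonzero
    k = 6.1 + 7.3 * abs(math.log10(c2 * v / 3.0)) ** 3
    rho_max = 45.9 / (c2 * v + 2.0)  # peak-sensitivity frequency shifts with v
    return (k * c0 * c2 * v
            * (c1 * 2.0 * math.pi * rho) ** 2
            * math.exp(-c1 * 4.0 * math.pi * rho / rho_max))
```

In this sketch, sensitivity falls off sharply above `rho_max`, which itself drops as retinal velocity grows; this is what lets a metric tolerate more stochastic noise in fast-moving, untracked image regions.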
© (2002) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Karol Myszkowski, Takehiro Tawara, Hans-Peter Seidel, "Using animation quality metric to improve efficiency of global illumination computation for dynamic environments", Proc. SPIE 4662, Human Vision and Electronic Imaging VII (30 May 2002); https://doi.org/10.1117/12.469514
PROCEEDINGS
10 PAGES