A modern digital camera is not just a single sensor capturing light. It is an ensemble of sensors that capture independent contextual information about the photo-capture event, which is stored as metadata in the image. In this paper, we demonstrate how optical metadata (data related to the camera's optics) can be retrieved, interpreted, and used together with content information to organize and index digital photos.
Our model is based on the physics of vision and the operation of a camera. We evaluate our algorithm on images from personal photo albums. Our results show that optical metadata improves annotation performance and reduces the search space for retrieval.
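As an illustration of the kind of interpretation optical metadata admits, the sketch below derives two physically meaningful quantities from EXIF-style camera values using standard optics formulas: the hyperfocal distance and the horizontal angle of view. The function and parameter names are ours for illustration, not taken from the paper.

```python
import math

def hyperfocal_mm(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f.
    When focused at H, everything from H/2 to infinity is acceptably sharp.
    coc_mm is the circle of confusion (0.03 mm is a common full-frame value)."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal angle of view in degrees: 2 * atan(w / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

# A 50 mm lens at f/2.8 on a full-frame (36 mm wide) sensor:
print(round(hyperfocal_mm(50, 2.8) / 1000, 1))  # hyperfocal distance in metres -> 29.8
print(round(horizontal_fov_deg(50), 1))          # horizontal field of view -> 39.6 degrees
```

Quantities like these, computed from metadata alone, constrain what the photo can plausibly contain (e.g. a wide field of view and large depth of field suggest a landscape rather than a close-up portrait), which is how optical metadata can shrink the retrieval search space.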