Quantized embeddings: an efficient and universal nearest neighbor method for cloud-based image retrieval
26 September 2013
Abstract
We propose a rate-efficient, feature-agnostic approach for encoding image features for cloud-based nearest neighbor search. We extract quantized random projections of the image features under consideration, transmit these to the cloud server, and perform matching in the space of the quantized projections. The advantage of this approach is that, once the underlying feature extraction algorithm is chosen for maximum discriminability and retrieval performance (e.g., SIFT or eigen-features), the random projections guarantee a rate-efficient representation and fast server-based matching with negligible loss in accuracy. Using the Johnson-Lindenstrauss lemma, we show that pairwise distances between the underlying feature vectors are preserved in the corresponding quantized embeddings. We report experimental results of image retrieval on two image databases with different feature spaces: one using SIFT features and one using face features extracted with a variant of the Viola-Jones face detection algorithm. For both feature spaces, quantized embeddings enable accurate image retrieval with improved bit-rate efficiency and faster matching compared with the underlying feature spaces.
© 2013 Society of Photo-Optical Instrumentation Engineers (SPIE).
Shantanu Rane, Petros Boufounos, and Anthony Vetro, "Quantized embeddings: an efficient and universal nearest neighbor method for cloud-based image retrieval," Proc. SPIE 8856, Applications of Digital Image Processing XXXVI, 885609 (26 September 2013); https://doi.org/10.1117/12.2022286