It has been shown that Sensor Pattern Noise (SPN) can serve as an imaging device fingerprint for source camera identification. Reference SPN estimation is a crucial procedure within this application. Most previous works build the reference SPN by averaging the SPNs extracted from 50 images of blue sky. However, this method can be problematic. Firstly, in practice we may face source camera identification in the absence of the imaging cameras and reference SPNs, which means only natural images with scene details are available for reference SPN estimation rather than blue sky images. This is challenging because the reference SPN can be severely contaminated by image content. Secondly, the number of available reference images is sometimes too small for existing methods to estimate a reliable reference SPN. In fact, existing methods give little consideration to the number of available reference images, as they were designed for datasets with abundant images for reference SPN estimation. To address these problems, a novel reference SPN estimator is proposed in this work. Experimental results show that the proposed method outperforms methods based on the averaged reference SPN, especially when few reference images are available.
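The averaging baseline that this abstract contrasts against can be sketched as follows. This is a minimal toy illustration with synthetic data, not the paper's estimator: the simple box-filter residual stands in for a proper wavelet-based denoiser, and the cameras, image sizes, and noise levels are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 64

def residual(img):
    # Crude noise residual: image minus a 3x3 box-filtered copy.
    # (Real SPN extraction uses wavelet denoising; this is a stand-in.)
    pad = np.pad(img, 1, mode="edge")
    smooth = sum(pad[i:i + H, j:j + W] for i in range(3) for j in range(3)) / 9.0
    return img - smooth

def ncc(a, b):
    # Normalized cross-correlation between two residuals.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two synthetic "cameras", each with a fixed additive pattern noise.
spn_a = rng.normal(0, 1, (H, W))
spn_b = rng.normal(0, 1, (H, W))

def shoot(spn, n):
    # n "photos": random scene content plus the camera's pattern noise.
    return [rng.normal(0, 5, (H, W)) + spn for _ in range(n)]

# Baseline reference SPN: average the residuals of 50 images per camera.
ref_a = np.mean([residual(im) for im in shoot(spn_a, 50)], axis=0)
ref_b = np.mean([residual(im) for im in shoot(spn_b, 50)], axis=0)

# A query image from camera A should correlate more strongly with ref_a.
query = residual(shoot(spn_a, 1)[0])
print(ncc(query, ref_a) > ncc(query, ref_b))  # True
```

With only a handful of reference images instead of 50, the averaged reference retains far more scene-content contamination, which is precisely the regime the proposed estimator targets.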
When an individual carries an object, such as a briefcase, conventional gait recognition algorithms based on the average silhouette/Gait Energy Image (GEI) do not always perform well, because the carried object may be mistakenly regarded as part of the human body. To solve this problem, instead of directly applying the GEI to represent the gait information, we propose a novel dynamic feature template for classification. Based on this extracted dynamic information and some static feature templates (i.e., the head part and the trunk part), we perform gait recognition on the large USF (University of South Florida) database by adopting a static/dynamic fusion strategy. For the experiments involving the carrying-condition covariate, significant improvements are achieved when compared with other
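The conventional GEI representation that this work departs from can be sketched briefly. This is a minimal illustration assuming pre-aligned, size-normalized binary silhouettes; the toy frames below are invented for the example and are not from the USF database.

```python
import numpy as np

def gait_energy_image(silhouettes):
    """GEI: pixel-wise average of aligned binary silhouettes over a gait cycle.

    `silhouettes` has shape (T, H, W) with values in {0, 1}; alignment and
    size normalization are assumed to have been done upstream.
    """
    sils = np.asarray(silhouettes, dtype=float)
    return sils.mean(axis=0)

# Toy example: a 2-frame "cycle" of 4x4 silhouettes whose bottom row moves.
frames = np.array([
    [[0, 1, 1, 0],
     [0, 1, 1, 0],
     [0, 1, 1, 0],
     [0, 1, 0, 0]],
    [[0, 1, 1, 0],
     [0, 1, 1, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 0]],
])
gei = gait_energy_image(frames)
# Static body pixels stay at 1.0; dynamic (leg) pixels fall between 0 and 1.
print(gei[0, 1], gei[3, 1])  # 1.0 0.5
```

The averaging makes static regions (torso, head, and any carried object held steadily) equally prominent, which is why a steadily carried briefcase gets absorbed into the template; separating dynamic from static information, as proposed here, mitigates that.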