The advancement of face recognition algorithms is strongly tied to the availability of face databases that exhibit varying factors reflecting real-life scenarios. The GUCLF face database is the first of its kind and has the potential to strongly influence the advancement of face recognition technology. In this paper, we introduce and describe our new face sample database collected using a Lytro light field camera. The database consists of 200 reference samples and 303 probe samples collected from 25 subjects. The reference samples are collected under controlled conditions using a Canon EOS 550D DSLR camera, while the probe samples are captured using both a conventional digital camera (Sony DSC-S750) and a Lytro light field camera. The probe samples are captured in three different scenarios: indoor, corridor, and outdoor, to cover a broad range of real-life conditions. In addition to the database description, this paper elaborates on possible uses of the collected database and proposes a testing protocol. Further, we present quantitative results from baseline experiments using Kernel Discriminant Analysis (KDA).
We present in this paper a sample quality control approach for the case of using a mobile phone camera as a fingerprint sensor for fingerprint recognition. Our approach directly estimates the orientation of maximum ridge frequency from the amplitude-frequency features of the Fast Fourier Transform (FFT) and takes the difference of the frequency features in two perpendicular orientations as a distinguishing feature for ridge-like patterns. A decision criterion combining the frequency components' energy and the ridge orientation features then determines whether an image block should be classified as a high-quality fingerprint area. The number of such high-quality blocks thus indicates the quality of the whole fingerprint sample. Experiments show the approach's effectiveness in distinguishing high-quality blocks from low-quality blocks and background areas. A mapping of the quality metric to the sample utility derived from the NIST minutiae extractor "mindtct" is also given to verify the approach's quality prediction effectiveness.
Keywords: Fingerprint, quality assessment, mobile phone camera
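The block classification step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the block size, the thresholds `energy_thresh` and `aniso_thresh`, and the exact way the perpendicular-orientation energy is sampled are all assumptions chosen for the example.

```python
import numpy as np

def block_ridge_quality(block, energy_thresh=4.0, aniso_thresh=2.0):
    """Classify a square image block as ridge-like (high quality) or not,
    using the amplitude spectrum of its 2D FFT.

    Hypothetical sketch: thresholds and sampling details are illustrative
    assumptions, not the paper's actual parameters.
    """
    n = block.shape[0]
    # Amplitude spectrum with DC at the center; remove the mean first.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(block - block.mean())))
    c = n // 2
    spec[c, c] = 0.0  # suppress any residual DC component
    # Dominant frequency peak gives the maximum ridge frequency orientation.
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    peak_energy = spec[py, px]
    theta = np.arctan2(py - c, px - c)
    r = np.hypot(py - c, px - c)
    # Sample the spectrum at the same radius in the perpendicular orientation.
    perp = theta + np.pi / 2
    qy = int(round(c + r * np.sin(perp))) % n
    qx = int(round(c + r * np.cos(perp))) % n
    perp_energy = spec[qy, qx] + 1e-9
    # Ridge-like blocks have a strong, strongly anisotropic frequency peak.
    anisotropy = peak_energy / perp_energy
    return bool(peak_energy > energy_thresh * spec.mean()
                and anisotropy > aniso_thresh)
```

Counting the blocks for which this function returns `True` then yields the whole-sample quality indicator.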
In this paper, the image encryption process is combined with reversible data hiding, where the data to be hidden are modulated by different secret keys selected for encryption. To extract the hidden data from the cipher-text, the different tentative decryption results are tested against a typical random distribution in both the spatial and frequency domains, and the goodness-of-fit degrees are compared to extract one hidden bit. The encryption-based data hiding process is inherently reversible.
Experiments demonstrate the proposed scheme's effectiveness on natural and textural images, in both gray-level and color formats.
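The bit-extraction idea above can be illustrated with a goodness-of-fit comparison. The sketch below is an assumption-laden stand-in: it uses a simple chi-square statistic of the pixel histogram against a uniform distribution (spatial domain only), and the `decrypt_with_key*` callables are hypothetical placeholders for the scheme's key-dependent decryption.

```python
import numpy as np

def uniformity_score(img):
    """Chi-square statistic of the 8-bit pixel histogram against a uniform
    distribution. Random-looking cipher images score low; natural images,
    whose histograms are far from uniform, score high."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    expected = img.size / 256.0
    return float(np.sum((hist - expected) ** 2 / expected))

def extract_hidden_bit(decrypt_with_key0, decrypt_with_key1, cipher):
    """Tentatively decrypt with both candidate keys; the result deviating
    most from a random (uniform) distribution is taken as the correct
    plaintext, which reveals one hidden bit.

    decrypt_with_key0/1 are hypothetical callables, not part of any
    specific cipher API."""
    s0 = uniformity_score(decrypt_with_key0(cipher))
    s1 = uniformity_score(decrypt_with_key1(cipher))
    return 0 if s0 > s1 else 1
```

The actual scheme additionally tests the frequency domain; this sketch shows only the spatial-domain comparison.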
A robust fingerprint minutiae hash generation algorithm is proposed in this paper to extract a secure binary hash bit string from each fingerprint minutia and its vicinity. First, the minutiae points are ordered and each minutia vicinity is geometrically aligned with respect to rotation and translation; second, the ordered and aligned points are diversified by offsetting their coordinates and angles in a random way; finally, an ordered binary minutia hash bit string is extracted by quantizing the coordinates and angle values of the points in the diversified minutia vicinity. The hashes generated from all minutiae vicinities in the original template form a protected template, which can represent the original minutiae template for identity verification. Experiments show desirable comparison performance (average Equal Error Rate of 0.0233 using the first two samples of each finger in FVC2002DB2_A) for the proposed algorithm. The proposed biometric reference also requires less template storage capacity than its unprotected counterpart. A security analysis of the proposed algorithm is given as well.
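The diversify-then-quantize step can be sketched for one vicinity as follows. This is a hedged illustration, not the paper's algorithm: the bit widths (`xy_bits`, `ang_bits`), the coordinate range `grid`, and the use of uniform random offsets are all assumptions made for the example; the input points are assumed already ordered and aligned.

```python
import numpy as np

def minutia_vicinity_hash(points, rng, xy_bits=4, ang_bits=3, grid=256.0):
    """Sketch of an ordered binary hash for one aligned minutia vicinity.

    points: sequence of (x, y, angle) tuples, already ordered and aligned
    to the central minutia. rng supplies the secret diversification
    offsets. Bit widths and grid size are illustrative assumptions."""
    pts = np.asarray(points, dtype=float).copy()
    # Diversify: secret random offsets on coordinates (mod grid) and
    # angles (mod 2*pi) make the hash key-dependent and renewable.
    pts[:, :2] = (pts[:, :2] + rng.uniform(0, grid, pts[:, :2].shape)) % grid
    pts[:, 2] = (pts[:, 2] + rng.uniform(0, 2 * np.pi, len(pts))) % (2 * np.pi)
    bits = []
    for x, y, a in pts:
        # Quantize each value into a fixed number of bits.
        qx = int(x / grid * (1 << xy_bits)) & ((1 << xy_bits) - 1)
        qy = int(y / grid * (1 << xy_bits)) & ((1 << xy_bits) - 1)
        qa = int(a / (2 * np.pi) * (1 << ang_bits)) & ((1 << ang_bits) - 1)
        bits.append(f"{qx:0{xy_bits}b}{qy:0{xy_bits}b}{qa:0{ang_bits}b}")
    return "".join(bits)
```

With the same secret offsets (same seed), the same vicinity always yields the same bit string, while different offsets yield a different protected template from the same finger.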
We investigate in this paper several methods to improve the performance of the bit-shifting-based reversible image watermarking algorithm in the integer DCT domain. In view of the large distortion caused by modifying high-amplitude coefficients in the integer DCT domain, several coefficient selection methods are proposed to give the coefficient modification process some adaptability to the amplitude status of different 8-by-8 DCT coefficient blocks. The proposed adaptive modification methods include global coefficient-group distortion sorting, zero-tree DCT prediction, and a low-frequency-based coefficient prediction method for block classification. All of these methods aim to optimize the bit-shifting-based coefficient modification process and thereby improve the watermarking performance in terms of the capacity/distortion ratio. The methods are compared with respect to capacity/distortion performance, performance stability, performance scalability, algorithm complexity, and security. Compared to our earlier integer-DCT-based scheme and other recently proposed reversible image watermarking algorithms, some of the proposed methods exhibit much improved performance; among them, the low-frequency-based coefficient prediction method is the most efficient at predicting the coefficient amplitude status, leading to distinctly improved watermarking performance in most aspects. Detailed experimental results and performance analysis are given for all the proposed algorithms and several other reversible watermarking algorithms.
We propose a reversible watermark embedding algorithm for georeferenced 2D vector data, which provides a promising solution for GIS (geographic information system) data hiding and authentication applications that require high fidelity or bit-by-bit exactness with respect to the original point coordinates.
The proposed scheme uses an 8-point integer DCT to exploit the high correlation among neighboring coordinates in the same polygon. Two kinds of DCT coefficient distributions occur: in the typical case the energy is concentrated in the low-frequency range, while in the other case the highest-frequency coefficient holds the maximum DCT coefficient. In the first step of our scheme, a distinction between these two cases is made by comparing the DCT coefficient of the highest frequency with the coefficients of the lower-frequency range. Information is embedded only in the typical case. This is accomplished with a bit-shift procedure in which the DCT coefficients of certain frequencies are shifted by one or two bits, and the watermark information is embedded in the resulting gaps. These frequencies lie in a range not used for the discrimination task of the first step. Depending on a key, there is an alternating sequence of shifts by one or two bits.
We present a high-capacity reversible watermarking scheme using a companding technique over the integer DCT coefficients of image blocks. The scheme takes advantage of the Laplacian-like distribution of integer DCT coefficients, which keeps low the distortion between the watermarked image and the original caused by the bit-shift operations of the companding technique in the embedding process.

In our scheme, we choose AC coefficients in the integer DCT domain for the bit-shift operation, so the capacity and the quality of the watermarked image can be adjusted by selecting different numbers of coefficients at different frequencies. To prevent overflows and underflows in the spatial domain caused by modification of the DCT coefficients, we design a block discrimination structure that finds suitable blocks which can be used for embedding without overflow or underflow problems. We also use this block discrimination structure to embed, as overhead, the location information of all blocks suitable for embedding. With this scheme, watermark bits can be embedded in the vacated LSBs of coefficient blocks and retrieved correctly during extraction, while the original image can be restored perfectly.
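The core bit-shift step shared by these reversible schemes can be shown on a single integer AC coefficient. This is a minimal sketch of that one step only; the block discrimination, overflow prevention, and coefficient selection described above are omitted, and operating on the magnitude to handle signed coefficients is an illustrative choice.

```python
def embed_bit(coeff, bit):
    """Bit-shift embedding in one integer AC coefficient: shift the
    magnitude left by one and place the watermark bit in the vacated LSB.
    The coefficient's sign is preserved."""
    sign = -1 if coeff < 0 else 1
    return sign * ((abs(coeff) << 1) | bit)

def extract_bit(coeff):
    """Recover the watermark bit from the LSB and restore the original
    coefficient exactly by shifting the magnitude back."""
    sign = -1 if coeff < 0 else 1
    mag = abs(coeff)
    return mag & 1, sign * (mag >> 1)
```

Because the shift exactly doubles the magnitude before the bit is inserted, extraction inverts embedding bit-for-bit, which is what makes the scheme reversible.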