8 May 2012
Exploring point-cloud features from partial body views for gender classification
In this paper we extend a previous exploration of histogram features extracted from 3D point cloud images of human
subjects for gender discrimination. Feature extraction used a collection of concentric cylinders to define volumes for
counting 3D points. The histogram features are characterized by a rotational axis and a selected set of volumes derived
from the concentric cylinders. The point cloud images are drawn from the CAESAR anthropometric database provided
by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International. This database
contains approximately 4400 high resolution LIDAR whole body scans of carefully posed human subjects. Success from
our previous investigation was based on extracting features from full body coverage, which required the integration of multiple camera images. With full body coverage, the central vertical body axis and orientation are readily obtainable; however, this is not the case with a single-camera view providing less than half body coverage. Assuming
that the subjects are upright, we need to determine or estimate the position of the vertical axis and the orientation of the
body about this axis relative to the camera. In past experiments the vertical axis was located through the center of mass
of torso points projected onto the ground plane, and the body orientation was derived using principal component analysis.
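A minimal sketch of that axis and orientation estimate, assuming upright subjects with z as the vertical direction; the function name, argument layout, and the 180-degree orientation ambiguity handling are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def estimate_axis_and_orientation(torso_points):
    """torso_points: (N, 3) array with z as the vertical direction."""
    # Project the torso points onto the ground plane; the centroid
    # locates the vertical cylindrical axis.
    xy = torso_points[:, :2]
    axis_xy = xy.mean(axis=0)
    # Principal component analysis of the projected points: the leading
    # eigenvector of the 2x2 covariance gives the body orientation about
    # the axis (with an inherent 180-degree ambiguity).
    cov = np.cov((xy - axis_xy).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    major = eigvecs[:, np.argmax(eigvals)]
    theta = np.arctan2(major[1], major[0])
    return axis_xy, theta
```

With full body coverage both quantities are well conditioned; the partial-view case discussed below is precisely where the projected centroid and the principal direction become unreliable.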
In a natural extension of our previous work to partial body views, the absence of rotational invariance about the
cylindrical axis greatly increases the difficulty of gender classification. Even the problem of estimating the axis is no
longer simple. We describe some simple feasibility experiments that use partial image histograms. Here, the cylindrical
axis is assumed to be known. We also discuss experiments with full body images that explore the sensitivity of
classification accuracy relative to displacements of the cylindrical axis. Our initial results provide the basis for further
investigation of more complex partial body viewing problems and new methods for estimating the two position coordinates for the axis location and the unknown body orientation angle.
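The concentric-cylinder point counting and the axis-displacement sensitivity question can be sketched as follows; the shell radii, height-bin count, and synthetic point cloud are illustrative assumptions, not the parameters used in the experiments:

```python
import numpy as np

def cylinder_histogram(points, axis_xy, radii, n_height_bins, z_range):
    """Count 3D points in volumes formed by concentric cylindrical shells
    about a vertical axis, crossed with horizontal height bins."""
    # Radial distance of each point from the assumed vertical axis.
    r = np.hypot(points[:, 0] - axis_xy[0], points[:, 1] - axis_xy[1])
    shell = np.searchsorted(radii, r)  # shell index 0 .. len(radii)
    # Height bin along the axis, clipped to the valid range.
    z0, z1 = z_range
    h = np.clip(((points[:, 2] - z0) / (z1 - z0) * n_height_bins).astype(int),
                0, n_height_bins - 1)
    hist = np.zeros((len(radii) + 1, n_height_bins), dtype=int)
    np.add.at(hist, (shell, h), 1)  # accumulate counts per volume
    return hist

# Sensitivity check: displace the assumed axis and compare histograms.
rng = np.random.default_rng(0)
body = rng.normal(size=(5000, 3)) * [0.15, 0.15, 0.5]  # synthetic cloud
radii = [0.1, 0.2, 0.3]
base = cylinder_histogram(body, np.zeros(2), radii, 10, (-1.0, 1.0))
for dx in (0.01, 0.05, 0.10):
    shifted = cylinder_histogram(body, np.array([dx, 0.0]), radii, 10,
                                 (-1.0, 1.0))
    print(f"axis offset {dx:.2f} -> L1 histogram change "
          f"{np.abs(shifted - base).sum()}")
```

Larger axis offsets reassign more points to neighboring shells, so the histogram distance grows with the displacement; this is the kind of degradation the full-body sensitivity experiments probe.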
Aaron Fouts, Ryan McCoppin, Mateen Rizki, Louis Tamburino, Olga Mendoza-Schrock, "Exploring point-cloud features from partial body views for gender classification," Proc. SPIE 8402, Evolutionary and Bio-Inspired Computation: Theory and Applications VI, 84020L (8 May 2012); https://doi.org/10.1117/12.921880