Facial expressions and hand gestures convey human emotion, and in particular signs of fatigue. In computer vision research, recognizing the position of a hand placed over the face is a challenging problem because the skin colors of the hand and the face are difficult to distinguish. In this paper, we present a method for classifying six hand-over-face positions that can identify signs of fatigue in visual display terminal (VDT) workers. We apply a deep learning method and compare it with methods based on face and skin color detection, edge detection, and feature extraction algorithms. GoogLeNet is trained on a data set created in a simulated VDT worker environment. The data set contains 1,440 images of participants from several countries (Egypt, Japan, Bangladesh, Mongolia, and Rwanda) to cover a wide range of skin tones, and is divided into six hand-over-face categories: hand on the forehead, eyes, nose, mouth, and the right and left sides of the face. The experiments were implemented in MATLAB. The system achieved an average recognition rate of 99.3% across all hand-over-face gestures.
In this paper, we propose a method for monitoring plant growth using camera images. The method observes the condition of plants raised in a greenhouse. A plate known as HORIBA is prepared to trap harmful insects. An image of the HORIBA plate is captured by the camera at a resolution of 1280×960 and used for processing. In the first step, the regions of harmful insects (flies) are extracted from the HORIBA image using color information. In the next step, template matching is performed to examine the shape correlation at four different angles; with four templates, 16 matching results are obtained. The logical sum of these results is used for the final estimation. Experimental results are also presented in this paper.
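The matching step above can be sketched in Python with NumPy. This is an illustrative reconstruction, not the authors' code: it assumes square grayscale templates, scores windows with normalized cross-correlation, and combines the thresholded response maps of all templates and all four rotation angles with a logical OR, mirroring the "logical sum" of the 16 results.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return 0.0 if denom == 0.0 else float((p * t).sum() / denom)

def detect_insects(image, templates, threshold=0.95):
    """Slide each (square) template over the image at four rotation
    angles (0, 90, 180, 270 degrees) and OR together the thresholded
    response maps -- a stand-in for the paper's logical sum of the
    16 matching results from four templates."""
    h, w = image.shape
    th, tw = templates[0].shape
    hits = np.zeros((h - th + 1, w - tw + 1), dtype=bool)
    for tpl in templates:
        for k in range(4):                      # four rotation angles
            rot = np.rot90(tpl, k)
            for y in range(hits.shape[0]):
                for x in range(hits.shape[1]):
                    if ncc(image[y:y + th, x:x + tw], rot) >= threshold:
                        hits[y, x] = True
    return hits
```

In practice the sliding correlation would be vectorized (or replaced by a library routine such as OpenCV's matchTemplate); the nested loops here are kept for clarity.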
By carrying out marketing research, the managers of large department stores and small convenience stores obtain information such as the male-to-female ratio and age distribution of their visitors, and use it to improve their management plans. However, this work is carried out manually and becomes a heavy burden for small stores. In this paper, the authors propose a method of gender discrimination that extracts differences in facial expression change from color facial images. In the field of image processing, there are many methods for automatic identification of individuals using moving or still facial images. However, it is very difficult to discriminate gender under the influence of hairstyle, clothing, and so on. Therefore, we propose a method that is not affected by individual characteristics such as the size and position of facial parts, by paying attention to expression change. The method requires two facial images of each person: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation information in the HSV color system together with emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part caused by the expression change. In the last step, the feature values of the input data are compared with those in a database, and the gender is discriminated. Experiments were performed on laughing and smiling expressions, and good results were obtained for gender discrimination.
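The feature-extraction and comparison steps above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the facial-part measurements, the database entries, and the nearest-neighbour comparison rule are all assumed for the example; only the rate-of-change normalization follows the description in the abstract.

```python
import numpy as np

def change_rate_features(neutral, expressive):
    """Rate of change of each facial-part measurement (e.g. eye or
    mouth width/height) between the expressionless image and the
    image with an expression. Dividing by the neutral value removes
    the influence of individual face size and part position."""
    neutral = np.asarray(neutral, dtype=float)
    expressive = np.asarray(expressive, dtype=float)
    return (expressive - neutral) / neutral

def classify_gender(features, database):
    """Nearest-neighbour comparison of the change-rate vector against
    a labelled database -- an assumed stand-in for the paper's
    comparison step."""
    best_label, best_dist = None, np.inf
    for label, ref in database:
        d = np.linalg.norm(features - np.asarray(ref, dtype=float))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

For example, a mouth that widens from 10 to 13 units while an eye narrows from 20 to 18 yields the change-rate vector (0.3, -0.1), which is then matched against the stored vectors.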