An approach to gaze estimation for wearable devices is proposed and its effectiveness is demonstrated. The approach builds a stereo vision system from a half-transparent, half-reflecting mirror and multiple cameras, and its calibration procedure is simple and practical. By using the half-transparent, half-reflecting mirror together with cameras mounted above the eyes, the whole system can also run online with little intervention from the user. Because a stereo vision system is employed, the spatial coordinates of feature points in the region around the eyes can be recovered; gaze is therefore estimated directly from the 3D coordinates of feature points extracted from the pupil and the eye corner. The calibration procedure is general, as no complex model of the human eye is used, and it demonstrates how to calibrate the angle between the optical axis and the directly measured gaze. Gaze estimation experiments show that the approach is effective for studying visual attention, and that it is useful and practical for people operating in unstructured scenarios.
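The core computation described above can be illustrated with a minimal sketch: triangulate the pupil-center and eye-corner feature points from two calibrated views, then take the direction between the recovered 3D points as a gaze proxy. This is not the paper's exact method; the projection matrices, point names, and the corner-to-pupil direction are assumptions for illustration only.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one feature point seen by two
    calibrated cameras.

    P1, P2 : 3x4 camera projection matrices (assumed known from calibration).
    uv1, uv2 : pixel coordinates (u, v) of the same point in each image.
    Returns the 3D point in world coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X; stack them and solve A X = 0 by SVD.
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]               # null-space vector = homogeneous solution
    return X[:3] / X[3]      # dehomogenize

def gaze_direction(pupil_3d, corner_3d):
    """Unit vector from the eye corner toward the pupil center
    (an illustrative stand-in for the estimated gaze direction)."""
    d = pupil_3d - corner_3d
    return d / np.linalg.norm(d)

# Hypothetical calibrated rig: identical intrinsics, 6 cm baseline.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic pupil and eye-corner points (meters, in front of the cameras).
pupil = np.array([0.010, -0.005, 0.050])
corner = np.array([0.025, -0.004, 0.055])

pupil_3d = triangulate(P1, P2, project(P1, pupil), project(P2, pupil))
corner_3d = triangulate(P1, P2, project(P1, corner), project(P2, corner))
gaze = gaze_direction(pupil_3d, corner_3d)
```

With exact (noise-free) pixel coordinates the triangulation recovers the synthetic points, so the sketch only conveys the geometry; real measurements would require the paper's calibration step to relate this measured direction to the optical axis.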