Visual fatigue measurement model in stereoscopy based on Bayesian network
28 August 2013
Optical Engineering, 52(8), 083110 (2013). doi:10.1117/1.OE.52.8.083110
Abstract
A stereoscopic visual fatigue measurement model based on Bayesian networks (BNs) is presented. Our approach focuses on the interdependencies between factors, such as contextual and environmental conditions, and the phenomena of visual fatigue in stereoscopy. Specifically, implementing a BN over multiple features provides a systematic way to predict and evaluate visual fatigue. Compared with other measurement models, the present BN-based scheme is more comprehensive. The test validation also indicates that the proposed model can serve as a reliable method for inferring visual fatigue in stereoscopy.
Yuan, Kim, and Cho: Visual fatigue measurement model in stereoscopy based on Bayesian network

1.

Introduction

Recently, with various stereoscopy technologies commercialized, more three-dimensional (3-D) applications have been accepted as an element of modern life. Three-dimensional televisions (3-DTVs) and 3-D movie theaters are also becoming popular. However, the development of 3-D technology faces some critical barriers, specifically stereoscopic visual fatigue. Visual fatigue caused by the conflict between accommodation and convergence is unavoidable in most stereoscopic applications. As described in Refs. 1 and 2, although viewers are able to perceive a smooth 3-D watching experience after resolving the visual conflicts, a series of fatigue symptoms (such as eyestrain and headaches) can be incurred, usually after about 20 min of observation on a 3-D display. In order to ensure the safety of 3-D applications, it is essential to measure visual fatigue for stereoscopic images. Thus, many studies have investigated the visual fatigue of stereoscopy.3–6

Figure 1(a) describes the main measurement schemes in existing 3-D visual fatigue research: the mean opinion score (MOS)-based scheme and the contact and contactless physiological feature-based schemes [such as electroencephalogram (EEG), electrocardiograph (ECG), and eye movement (EM) detection]. As noted by Kim and Cho,7 the MOS measures subjective 3-D visual fatigue using questionnaires that correlate highly with subjective 3-D visual fatigue, for example, the question "How much visual fatigue do you feel?" with the answers "comfortable, a little uncomfortable, uncomfortable." The contactless physiological feature (CLPF)-based scheme, as shown in Kim et al.8 and Chae et al.,9 designs a visual fatigue measurement model using the eyes' response curve and blink frequency; based on the eye-tracking results, the level of visual fatigue in stereoscopy is determined. The contact physiological feature (CPF)-based scheme, as described by Gomarus et al.10 and Fang et al.,11 measures visual fatigue from records of electrical activity; the level of stereoscopic visual fatigue is determined from the bio-signals measured on the human body.

Fig. 1

(a) Describes the main existing measurement schemes in recent research of three-dimensional (3-D) visual fatigue and (b) describes our proposed measurement model based on Bayesian networks (BNs).


However, both subjective and objective measurements have their own advantages and defects. Unfortunately, most studies ignore the influence of extraneous state variables (e.g., the human body and the testing environment). For this reason, the same test method applied to different subjects may yield measurements with significant deviations. Therefore, we develop a measurement model based on a strong correlation structure (the BN structure), as depicted in Fig. 1(b), that can reliably recognize stereoscopic visual fatigue.

Figure 1(b) shows our proposed measurement model on a BN structure. The feature nodes compose the BN tree. The results of each node are fused with the BN inference algorithm, and the final fusion result can then be inferred according to the probability values of the different variable states. To the best of our knowledge, this is the first adaptation of a probabilistic framework on a BN structure for inferring a 3-D viewer's state of visual fatigue. As opposed to the previous works described in Refs. 4, 5, 8, 10, and 12, our proposed model does not employ a single physiological feature as a decision factor, but deals with the probability values of different variables' states, derived from the interdependencies between observation and contextual features.

The organization of this article is as follows. After a brief introduction in Sec. 1, Sec. 2 introduces the background and related work for this study. Section 3 describes the BN-based 3-D visual fatigue measurement framework. Section 4 presents the experimental results. Finally, Sec. 5 summarizes the article.

2.

Background and Related Work

2.1.

Visual Fatigue Description in Stereoscopy

Binocular vision is produced when two separate images, corresponding to the left and right eyes and slightly different from each other, are merged in the viewer's brain to build a common impression.13 Hodges and McAllister14 describe the method of right and left perspective views in 3-D display. Based on binocular parallax, a 3-D screen can be implemented; the result relies on the format of the image presented and the viewing format. Figure 2 illustrates a viewer experiencing a stereoscopic sensation when the appropriate view is presented to each eye on a 3-D screen. Also, by improving depth perception, stereoscopy adds realism. Although stereoscopic imagery can be presented on 3-D displays, it violates the relationships of natural viewing in the real world. In Fig. 2, when the viewer observes a real object or an image on a two-dimensional (2-D) device, the eyes accommodate (focus) and converge to a specific point: the accommodation distance matches the convergence distance. Conversely, when a viewer observes a stereoscopic image on a 3-D display, the focus remains on the plane of the screen, while the eyes converge on image points located at a different distance. This breakdown of the relationship between accommodation and convergence causes visual discomfort.

Fig. 2

Comparison between stereogram viewing and natural viewing.


For 3-D comfort evaluation, Choi et al.15 identify some factors that capture the spatiotemporal characteristics of disparity; the prediction of visual comfort is determined by fusing these factors. Figure 3 illustrates the types of disparity during stereoscopic viewing. Two disparities are indicated on the coordinate plane: positive (uncrossed) and negative (crossed) disparities, shown by the blue and red zones.13 In Fig. 3, the horizontal gray line at the display position represents the zero-disparity plane. The zero-disparity plane is the converged domain of stereoscopic imaging, and the zero-disparity area is commonly referred to as the comfortable zone of stereoscopic imaging.16,17 Depending on the stereoscopic disparity, different 3-D imaging positions can be implemented, such as in front of or behind the screen. Stereoscopic disparity refers to the difference in image location of one object viewed by the left and right eyes. When a 3-D camera captures a stereoscopic image, each lens converges separately on the main object and generates stereoscopic disparity. The main object can be seen as a single image, but the background would be seen as double images with disparity.

Fig. 3

Relationship between (a) positive disparity and (b) negative disparity; (c) is natural scene.


In Fig. 3(a), the positive disparity in the stereoscopic image corresponds to the uncrossed lines. In Fig. 3(b), the negative disparity corresponds to the crossed lines. Negative disparity exhibits crosstalk that occurs between the accommodations of the two eyes. In addition, negative disparity shows a larger disparity and object size than positive disparity, since the image in negative disparity appears closer than in positive disparity. This phenomenon is related to the geometry of binocular viewing. Therefore, negative disparity can incur more visual fatigue than positive disparity.7,17 Yilmaz and Gudukbay18 point out that crosstalk (or the ghosting effect) is the faded image seen by the untargeted eye. This effect is undesirable because it may cause visual fatigue and other problems. Gudukbay and Yilmaz19 indicate that a more comfortable stereo view can be achieved by reducing crosstalk (or the ghosting effect).

2.2.

Visual Fatigue Measurement Model Description

Body fatigue can be easily tracked from observable physiological features.20,21 This scheme is considered the relatively objective method for visual measurement. Physiological features may be classified into contactless and contact features. Contactless features include EMs, head movement, etc., which can be easily detected by a real-time monitor. Contact features include brain activity, heart rate variability, etc., which can be detected by EEG, ECG, and other bio-sensor systems.

The CLPF-based scheme focuses on inferring fatigue from the contactless features. Ji et al.22 demonstrate that a human in fatigue exhibits certain visual cues in long-duration visual experiments. Horng et al.23 present a fatigue measurement algorithm that depends on eye tracking and dynamic matching. Kim et al.24 construct a neural network-based scheme for fatigue recognition by detecting the movements of the mouth and eyes, respectively.

The CPF-based scheme focuses on inferring fatigue from the contact features. For example, the EEG can represent abundant information on human cognitive states, according to detection in the major EEG bands (δ, θ, α, and β). Lal et al.25 present a fatigue recognition algorithm based on different levels of EEG bands. Also, Jung et al.26 and Wilson and Bracewell27 propose methods to estimate and predict the fatigue level based on EEG power spectrum estimation and a fuzzy neural network model. According to the main EEG activities (δ, θ, α, and β) of 52 subjects (36 males and 16 females) during fatigue measurement, Budi et al.21 found that the δ and θ activities are stable over time, but there is a slight decrease in α activity and a significant decrease in β activity. For the other important CPF, the ECG signal, fatigue recognition in Refs. 28 and 29 refers to heart activity in the low frequency (LF), very low frequency (VLF), and high frequency (HF) bands, and the LF/HF ratio.

Previous physiological feature-based schemes focus only on a single specific aspect. That may lead to inaccurate results because fatigue is not directly observable and can only be inferred from the available information. There are several reasons for inaccuracies with the schemes mentioned above: (1) Contextual factors: fatigue recognition contains much subjectivity that does not always reflect objective reality. (2) Environmental factors: for example, when a human is placed in an unfamiliar environment,30 facial expressions (such as eye and mouth movements) may be interpreted inaccurately, especially for introverted persons. Therefore, fusing as many features as possible from uncertain events is a better way to make an accurate inference.31 Further, Picard et al.32 pointed out that fusing contextual features, physiological features, and human performance is necessary to make fatigue measurement more reliable.

By considering the evidence and beliefs of contextual information and physiological features from measurement, Ji et al.22 construct a BN-based algorithm to infer and predict human fatigue, enhancing the reliability of fatigue detection. Yang et al.20 develop a BN-based fatigue recognition model for systems that evolve over time. However, the visual fatigue networks in Refs. 20, 22, and 33–36 mostly apply to driving, visual display terminal monitoring, and the marine industry. To the best of our knowledge, no prior work addresses stereoscopic visual fatigue with a probabilistic framework or BN. Therefore, considering the states and beliefs of contextual information and physiological features, a novel probabilistic framework-based (BN-based) measurement model for stereoscopic visual fatigue is proposed in this article.

2.3.

Bayesian Networks Method Description

Hubbard37 describes uncertainty as the lack of certainty: a state of limited knowledge in which it is difficult to infer precisely the existing state or a future outcome. Decision making is generally recognized by engineers as an indispensable part of the whole engineering design process. As in most fatigue recognition problems, stereoscopic visual fatigue measurement also involves a number of uncertain factors. Because uncertainty has a significant impact on judgment, engineers try to manage it via compound methods and intelligent systems. The most reliable tool for modeling uncertainty is probability theory.35

One of the most prevalent and effective graphical models for managing uncertainty is the BN.38 A BN, also called a belief network or directed acyclic graphical model, is a probabilistic graphical model that represents the conditional dependencies of a set of random variables with a directed acyclic graph (DAG). A DAG is a directed graph with no directed cycles: it consists of vertices and directed edges, each edge connecting one vertex to another such that no cyclic route can appear. Figure 4 shows an implementation of a DAG in our application.
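The acyclicity requirement described above is easy to check programmatically. The following sketch encodes a skeleton resembling the DAG of Fig. 4 (node names are our shorthand for the nodes discussed later, not the paper's exact labels) and verifies that it contains no directed cycles via Kahn's topological-sort algorithm:

```python
from collections import deque

# Hypothetical adjacency list mirroring the shape of the DAG in Fig. 4:
# contextual nodes point to the fatigue node, which points to observations.
edges = {
    "SQ": ["Fatigue"], "CR": ["Fatigue"], "EE": ["Fatigue"],
    "BD": ["Fatigue"], "DQ": ["Fatigue"],
    "Fatigue": ["EEG", "EM"], "EEG": [], "EM": [],
}

def is_dag(edges):
    """Kahn's algorithm: a directed graph is acyclic iff every node can be
    removed in topological order (in-degree reaches zero for all nodes)."""
    indeg = {v: 0 for v in edges}
    for targets in edges.values():
        for t in targets:
            indeg[t] += 1
    queue = deque(v for v, d in indeg.items() if d == 0)
    seen = 0
    while queue:
        v = queue.popleft()
        seen += 1
        for t in edges[v]:
            indeg[t] -= 1
            if indeg[t] == 0:
                queue.append(t)
    return seen == len(edges)

print(is_dag(edges))  # True: no directed cycles, so this is a valid BN skeleton
```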

Fig. 4

The detailed BN structure used to measure visual fatigue in stereoscopy.


The basic concept in the Bayesian treatment of certainties in causal networks is conditional probability. Whenever a probability P(A) of an event A is given, it is conditioned on other known factors. Therefore, according to the feature vector mentioned above and conditional probability, the probability of estimated fatigue is obtained through the Bayes theorem, as in Refs. 20 and 39:

(1)

P(Z=z\,|\,E) = \frac{P(Z=z\,|\,e_c)\,P(e_o\,|\,Z=z)}{\sum_{j=1}^{2} P(Z=z_j\,|\,e_c)\,P(e_o\,|\,Z=z_j)}

  • Z represents the fatigue node, and z represents the fatigue state value.

  • E represents the evidences {ec,eo}, ec represents the contextual evidences and eo represents the observations.

  • P(Z=z|E) represents the posterior probability of Z given E, and hence it is the new estimation for the probability that the hypothesis Z is true, taking evidence E into account.

  • P(eo|Z=z) represents the conditional probability of observable evidence eo, if the hypothesis Z turns out to be true.

  • P(Z=z|ec) represents the prior probability of hypothesis before providing contextual evidences.

  • \sum_{j=1}^{2} P(Z=z_j\,|\,e_c)\,P(e_o\,|\,Z=z_j) represents the marginal probability, which is the prior probability under all possible fatigue hypotheses.
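Eq. (1) with its two fatigue states reduces to a few lines of arithmetic. The following sketch computes the posterior with illustrative inputs (the numbers are not taken from the paper's tables):

```python
def fatigue_posterior(prior_given_context, likelihood_obs):
    """Eq. (1): posterior over the two fatigue states z1 (fatigue), z2 (no fatigue).
    prior_given_context[j] = P(Z=z_j | e_c); likelihood_obs[j] = P(e_o | Z=z_j)."""
    joint = [p * l for p, l in zip(prior_given_context, likelihood_obs)]
    marginal = sum(joint)                     # denominator of Eq. (1)
    return [j / marginal for j in joint]

# Illustrative numbers only: a modest prior on fatigue, strong observed evidence.
post = fatigue_posterior([0.3, 0.7], [0.9, 0.1])
print(post)  # posterior mass shifts strongly toward the fatigue state
```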

3.

BN-Based Visual Fatigue Measurement Implementation

To set up a fatigue recognition model based on the discrete BN, the first step is to specify the nodes of the discrete BN. In other words, we need to specify the contextual, contactless and contact physiological variables that are used to construct the discrete BN. The second step is to determine the values that are used to represent the discrete variables. The third step is to configure the states of the variables, to calculate the conditional probability, and to evaluate the visual fatigue in stereoscopy. In the following, these steps are described.

3.1.

Specifying the Nodes of the Discrete Bayesian Networks

As shown in Fig. 4, there are many contextual and physiological features related to fatigue. Among these features, some contribute more to fatigue while others contribute less. For the sake of simplicity, but without loss of generality, we select only those contextual and physiological features that have an immediate relation to fatigue measurement. These features are described in step 1. For the contextual, hidden, and observable variables selected in Fig. 4, a fuzzy method is used to determine the discrete values for each variable based on a set of heuristic knowledge rules.40

3.1.1.

Stereoscopic contextual features node

Binocular disparity (BD) node. Lambooij et al.41 noted that the human eye experiences a conflict between accommodation and vergence that strongly affects visual fatigue in stereoscopy. Ohzawa et al.13 classified disparity into positive and negative disparity. Kim and Cho7 suggested a simplified relative visual fatigue metric that considers the accommodation and vergence factors, which can be calculated from the disparities in stereoscopy. We are motivated by Ohzawa et al.13 and Kim and Cho.7 As exhibited in Fig. 5, several sets of different stereoscopic instances were provided to evaluate visual fatigue. Sample images in both the negative-disparity and the positive-disparity zones were shown in the experiment for 3-D fatigue measurement.

Fig. 5

(a) Test images to evaluate visual fatigue and (b) the graph of mean opinion score result with various Avg. converged objects disparity and comfort zone in stereoscopy in Ref. 7. Note: The valuation is based on the five grades (1 to 5); 1: very comfortable, 2: comfortable, 3: a little uncomfortable, 4: uncomfortable, 5: very uncomfortable.


Display quality (DQ) node. As Michel et al.12 described, with 3-DTV and 3-D cinema at the extremes of the screen-size spectrum, the comfort zone issues for stereoscopy differ when the same content is presented on each. Resolution and luminance are also key elements of a display; for example, an unsuitable resolution or luminance also causes visual discomfort. However, among these features, the screen size has the most immediate relation to the DQ issues that concern us, as mentioned in Refs. 12 and 42. Therefore, the display size is taken as the main contextual feature corresponding to the DQ node.

3.1.2.

Nonstereoscopic contextual features (NSCF) node

Sleeping quality (SQ) node. SQ is immediately associated with fatigue.29 Therefore, we take the SQ as a nonstereoscopic node in the BN DAG (Fig. 4). Gomarus et al.10 noted that SQ is related to such quantities as the duration of sleep, difficulty in falling asleep at night, the sleeping environment, and so on. Among them, the sleeping time and the sleeping satisfaction were taken as the key contributors to SQ, since a certain minimum sleep time is necessary for everyone, and whether the SQ is satisfactory depends on the human's subjective judgment.

Circadian rhythm (CR) node. CR is also a cardinal factor in fatigue measurement. Lal and Craig43 identified that the CR plays an important role in the study of fatigue recognition. There are two sleep peaks each day: one appears after midnight, and the other appears approximately after lunchtime. Humans are easily fatigued during these peak periods.

Experiment environment (EE) node. EE is the last factor selected by the proposed method. Light, noise, temperature, and other EE factors have a strong relation to fatigue measurement, especially the influence of light on the viewer watching the screen. Therefore, we take the EE as a nonstereoscopic node in the BN graph.

3.1.3.

Observation state node

EEG node. In the frequency domain, the EEG mainly includes the δ band (0.5 to 4 Hz) corresponding to sleep activity, the θ band (4 to 7 Hz) related to drowsiness, the α band (8 to 13 Hz) corresponding to relaxation and creativity, and the β band (13 to 25 Hz) corresponding to activity and alertness. Budi et al.21 note that the β band has a strong relation to visual fatigue. In the EEG tracings, the power of the β frequencies increases as the watching duration increases, and the increase is much stronger under 3-D than under 2-D conditions, as shown in Fig. 6(a). Li et al.44 identified that 3-D content affects the power of brain waves in the β frequency band; the β power was stronger when viewing 3-D content. Subjective results also showed stronger visual fatigue in the 3-D condition than in the 2-D condition. Therefore, we take the magnitude of the EEG spectrum in the β band as an observable variable node in the BN diagram.

Fig. 6

Physiological response: 3-D and 2-D compared, 3-D viewed first. (x-axis is time, y-axis is magnitude).


EM node. EM-based visual fatigue measurement is related to such quantities as eye gaze, eye blinks, and eyelid closure. These manifestations are described in Ref. 45 for fatigue detection. Zhu and Lan22 pointed out that EM provides a reliable and valid determination of fatigue. Ref. 46 defines the percentage of eyelid closure over the pupil in a given time (PERCLOS): a viewer is likely in a state of fatigue if the eyes are at least 80% closed during a period of 1 min. Thus, the proportion of eye-closed time is taken in this article as one of the observable variables corresponding to a node of the BN diagram.
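The PERCLOS measure described above can be sketched as a simple windowed ratio. This is an illustrative implementation only; the sampling rate, window length, and 80% threshold follow the description in the text, while the input format is our own assumption:

```python
def perclos(closure_samples, threshold=0.8):
    """Percentage of samples in the window where the eyelid is at least
    `threshold` (80%) closed.
    closure_samples: per-sample eyelid closure degree in [0, 1] over,
    e.g., a 1-min window."""
    closed = sum(1 for c in closure_samples if c >= threshold)
    return 100.0 * closed / len(closure_samples)

# Toy window: 60 samples, 51 of them at least 80% closed.
samples = [0.9] * 51 + [0.2] * 9
print(perclos(samples))  # 85.0
```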

3.2.

Determining Discrete Variables in Each Node

The construction of a BN involves two tasks: one is the determination of the nodes, and the other is the determination of the parent discrete variables and their states for each node. In the previous step, the related nodes were determined. In step 2, described below, we specify the discrete variables and their states, which indicate the likelihood that a particular feature contributes to fatigue.

Visual fatigue node: Z=[Z1,Z2] in which Z1 and Z2 represent the fatigue and no-fatigue states, respectively.

Contextual features node: X=[X1,X2,X3] represents the nonstereoscopic factor node state, in which X1, X2, and X3 represent the sleep quality, CR and EE, respectively. Here, X1=[X11,X12] in which X11 and X12 represent the sleep parameters, including the sleep time and sleep satisfaction. Y=[Y1,Y2] represents the stereoscopic factor node, in which Y1 and Y2 represent the binocular disparity and DQ, respectively.

Observation features node: O=[O1,O2] represents the observation features node, in which O1 represents the CLPF (e.g., EM), and O2 represents the CPF (e.g., EEG).

As shown in Fig. 4, zk, xij, yij, and oij denote the specific values taken by Z=[Z1,Z2], X=[X1,X2,X3], Y=[Y1,Y2], and O=[O1,O2], respectively. In Fig. 4 the variables, together with the directed edges, form the DAG. P(xij) represents the probability of the sleep quality node states {x11=good, x12=bad}, the CR node states {x21=active, x22=drowsy}, and the EE node states {x31=comfortable, x32=uncomfortable}; P(yij) represents the probability of the binocular disparity node states {y1j=disparity zone} and the DQ node states {y21=small, y22=large, y23=ex-large}; P(oij) represents the probability of the contact physiological node states (EEG node) {o21=decrease, o22=no-change, o23=increase} and the contactless physiological node states (EM node) {o11=large, o12=medium, o13=small}.
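The discrete state spaces defined above can be collected in a small data structure. The following sketch is illustrative only; the state labels follow the notation in this section, and we assume the 17 disparity-zone states run from −80 to 80 in steps of 10, matching the disparity values in Table 1 and the upper index of 17 in Eq. (2):

```python
from math import prod

# Hypothetical transcription of the node state spaces defined above.
states = {
    "Z":  ["fatigue", "no_fatigue"],                        # Z = [Z1, Z2]
    "X1": ["good", "bad"],                                  # sleep quality
    "X2": ["active", "drowsy"],                             # circadian rhythm
    "X3": ["comfortable", "uncomfortable"],                 # experiment environment
    "Y1": [f"disparity_{d}" for d in range(-80, 90, 10)],   # 17 disparity zones
    "Y2": ["small", "large", "ex_large"],                   # display quality
    "O1": ["large", "medium", "small"],                     # eye movement (PERCLOS)
    "O2": ["decrease", "no_change", "increase"],            # EEG beta-band power
}

# The BN's joint state space is the product of each node's state count.
print(prod(len(s) for s in states.values()))  # 7344
```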

3.3.

Calculating Bayesian Networks

Assume that the evidences from the contextual nodes are represented as eX,Y={eXYij}, and the evidences from the observable nodes are represented as eO={eOij}, where eXYij represents the evidence of the i'th contextual node with the j'th state value (xij and yij), and eOij represents the evidence of the i'th observable node with the j'th state value (oij). Let e={eXY,eO} denote the evidences from the contextual factor and observable feature nodes, respectively. In Eqs. (2) and (3), P(Z=zk|eXY) is the prior probability of visual fatigue Z, inferred before the parents' contextual evidence was available, and P(eO|Z=zk) is the conditional probability of the observable evidence eO, given that the parent visual fatigue Z turns out to be true.

Then the conditional probability of Z given the occurrence of the eXY node can be written as in Ref. 39

(2)

P(Z=z_k\,|\,e_{X,Y}) \propto P(Z=z_k\,|\,e_{X_{i,j}})\,P(Z=z_k\,|\,e_{Y_{i,j}}) = \Bigg[\sum_{i=1}^{2}\sum_{j=1}^{2}\sum_{l=1}^{2} P(Z=z_k\,|\,x_{1i},x_{2j},x_{3l})\,P(x_{1i})\,P(x_{2j})\,P(x_{3l})\Bigg] \times \Bigg[\sum_{i=1}^{17}\sum_{j=1}^{3} P(Z=z_k\,|\,y_{1i},y_{2j})\,P(y_{1i})\,P(y_{2j})\Bigg], \quad k=1,2

The conditional probability of eO given the occurrence of node Z can be written as in Ref. 39

(3)

P(e_O\,|\,Z=z_k) \propto P(e_{O_{1,j}}\,|\,Z=z_k)\,P(e_{O_{2,j}}\,|\,Z=z_k) = \Bigg[\sum_{m=1}^{3} P(e_{O_{1,j}}\,|\,o_{1m})\,P(o_{1m}\,|\,Z=z_k)\Bigg] \times \Bigg[\sum_{n=1}^{3} P(e_{O_{2,j}}\,|\,o_{2n})\,P(o_{2n}\,|\,Z=z_k)\Bigg], \quad k=1,2 \text{ and } j=1,2,3.

According to the BN theorem,4 the conditional probability of node Z given the occurring evidence can be obtained by combining Eqs. (2) and (3); it can be written as in Ref. 39:

(4)

P(Z=z_k\,|\,e) = \frac{P(Z=z_k\,|\,e_{X,Y})\,P(e_O\,|\,Z=z_k)}{\sum_{i=1}^{2} P(Z=z_i\,|\,e_{X,Y})\,P(e_O\,|\,Z=z_i)}, \quad k=1,2,
where \sum_{i=1}^{2} P(Z=z_i\,|\,e_{X,Y})\,P(e_O\,|\,Z=z_i) is the marginal probability, which is the prior probability under all possible hypotheses of visual fatigue Z.
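As a concrete sketch of the inner triple sum of Eq. (2), the following snippet marginalizes over the three nonstereoscopic parents, using the conditional probabilities of Table 2 and the first-group priors reported in Sec. 4 (P(x11)=0.87, P(x21)=0.85, P(x31)=0.80). The dictionary layout and node labels are our own illustrative choices:

```python
# Table 2's CPT: (CR, SQ, EE) -> (P(normal), P(fatigue)).
cpt = {
    ("active", "good", "comfortable"):   (0.95, 0.05),
    ("active", "good", "uncomfortable"): (0.85, 0.15),
    ("active", "bad",  "comfortable"):   (0.73, 0.27),
    ("active", "bad",  "uncomfortable"): (0.49, 0.51),
    ("drowsy", "good", "comfortable"):   (0.23, 0.77),
    ("drowsy", "good", "uncomfortable"): (0.12, 0.88),
    ("drowsy", "bad",  "comfortable"):   (0.11, 0.89),
    ("drowsy", "bad",  "uncomfortable"): (0.02, 0.98),
}

# First-group priors from Sec. 4 (good SQ/CR/EE states).
p_cr = {"active": 0.85, "drowsy": 0.15}
p_sq = {"good": 0.87, "bad": 0.13}
p_ee = {"comfortable": 0.80, "uncomfortable": 0.20}

def nonstereo_prior():
    """Inner triple sum of Eq. (2): weight each CPT row by the product of
    its parents' prior probabilities and accumulate per fatigue state."""
    prior = [0.0, 0.0]  # [P(Z=normal | e), P(Z=fatigue | e)]
    for cr, pc in p_cr.items():
        for sq, ps in p_sq.items():
            for ee, pe in p_ee.items():
                w = pc * ps * pe
                prior[0] += cpt[(cr, sq, ee)][0] * w
                prior[1] += cpt[(cr, sq, ee)][1] * w
    return prior

prior = nonstereo_prior()
print(prior)  # heavily weighted toward "normal" under good contextual states
```

The same pattern extends to the stereoscopic branch of Eq. (2) and the observation likelihoods of Eq. (3), after which Eq. (4) normalizes the product of the two terms.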

4.

Simulation Results and Discussion

In this work, in order to acquire the conditional probability information for each node, we employ methods from several previous studies. For example, the conditional probability information for the BD and DQ nodes is obtained from Refs. 7 and 12. The conditional probability information for the CR, SQ, and EE nodes is obtained from Refs. 20, 22, 29, 32, and 47–50. The conditional probability information for the EEG and EM nodes is obtained from Refs. 5, 8, and 20. However, some probabilities cannot be directly obtained from these studies, so we adopted similar acquisition methods based on our own experiments. For instance, the binocular disparity comfort judgment is mainly based on personal satisfaction, owing to the differences in visual sensing between persons; here, the subjective component (as in the MOS) is relatively high. In order to obtain this data set, we adopted a statistical analysis scheme based on Ref. 7. Through these efforts, all probabilities in the BN model were acquired, as shown below. Table 1 describes the conditional probability of visual fatigue given the BD node states, the main factor of visual fatigue in stereoscopy. Table 2 describes the conditional probability of visual fatigue given the states of CR, SQ, and EE. Table 3 describes the conditional probabilities for the EEG and EM nodes, respectively, given the occurrence of visual fatigue.

Table 1

Conditional probability for fatigue node with BD.

BD negative   Fatigue node        BD positive   Fatigue node
              Normal   Fatigue                  Normal   Fatigue
−80           0.05     0.95       0             0.98     0.02
−70           0.11     0.89       10            0.95     0.05
−60           0.38     0.62       20            0.94     0.06
−50           0.57     0.43       30            0.91     0.09
−40           0.69     0.31       40            0.91     0.09
−30           0.81     0.19       50            0.89     0.11
−20           0.87     0.13       60            0.86     0.14
−10           0.93     0.07       70            0.82     0.18
0             0.98     0.02       80            0.75     0.25

Table 2

Conditional probability for fatigue node with CR, SQ, and EE.

CR node   SQ node   EE node         Fatigue node
                                    Normal   Fatigue
Active    Good      Comfortable     0.95     0.05
Active    Good      Uncomfortable   0.85     0.15
Active    Bad       Comfortable     0.73     0.27
Active    Bad       Uncomfortable   0.49     0.51
Drowsy    Good      Comfortable     0.23     0.77
Drowsy    Good      Uncomfortable   0.12     0.88
Drowsy    Bad       Comfortable     0.11     0.89
Drowsy    Bad       Uncomfortable   0.02     0.98

Table 3

Conditional probabilities for EM and EEG given fatigue.

Fatigue node   EEG node                          EM node
               Decrease   No-change   Increase   Large   Medium   Small
Fatigue        0.90       0.08        0.02       0.94    0.05     0.01
Normal         0.02       0.08        0.90       0.01    0.05     0.94
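With hard evidence on both observation nodes, the Eq. (3) likelihood reduces to a product of two Table 3 entries, since EEG and EM are conditionally independent given Z in the DAG. This sketch looks up Table 3 directly (the dictionary names are our own):

```python
# Table 3 as conditional probability tables: P(observed state | fatigue state).
eeg_cpt = {  # beta-band power change given Z
    "fatigue": {"decrease": 0.90, "no_change": 0.08, "increase": 0.02},
    "normal":  {"decrease": 0.02, "no_change": 0.08, "increase": 0.90},
}
em_cpt = {   # eye-closure proportion given Z
    "fatigue": {"large": 0.94, "medium": 0.05, "small": 0.01},
    "normal":  {"large": 0.01, "medium": 0.05, "small": 0.94},
}

def obs_likelihood(eeg_state, em_state):
    """P(e_O | Z = z_k) for hard evidence: the product of the two CPT entries
    for each fatigue state."""
    return {z: eeg_cpt[z][eeg_state] * em_cpt[z][em_state]
            for z in ("fatigue", "normal")}

lik = obs_likelihood("decrease", "large")
print(lik)  # a decreased beta band plus large eye closure strongly favor fatigue
```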

With the help of the System Neuroscience Laboratory at Sungkyunkwan University, we obtained the EEG and EM data sets. Here, we used an EM tracking system (EyeLink II) measuring at a 500-Hz temporal resolution. Twenty students from Sungkyunkwan University volunteered to participate in the experiments. Each participant was asked to watch the test 3-D images at different disparities on a 3-DTV, and no break or rest was permitted during the 25-min experiment. Owing to display limitations (our research focuses only on the 3-D HDTV application), we could not include a variety of DQ conditions. The EEG and EM signals of each participant were collected at a rate of 1 sample/min. The results were then processed based on their statistical properties to form the evidence data sets needed to infer the viewer's fatigue. For example, according to the statistical properties of the contactless physiological data from the participants, if the PERCLOS value of EM is equal to 85, then P(eO1,1)=0.89, P(eO1,2)=0.42, and P(eO1,3)=0.18; for the contact physiological data, if the EEG signal indicates a large decrease of β rhythms, then P(eO2,1)=0.90, P(eO2,2)=0.20, and P(eO2,3)=0.10.

In order to obtain the probabilities for CR, SQ, and EE, we adopted a statistical-analysis-based questionnaire that mainly concerned the CR, SQ, and EE states. The questionnaires were distributed among the twenty students before the simulation experiment. There are two groups of probabilities for CR and SQ. For the first group, we required the 20 students, none of whom had any kind of sleep disorder, to maintain a relatively good SQ state before the test day, so the probabilities for SQ were P(x11)=0.87 and P(x12)=0.13. We asked the volunteers to participate in the simulation test from 8:30 to 11:30 AM, so the probabilities for CR were P(x21)=0.85 and P(x22)=0.15. For the second group, some of the volunteers were deprived of a good sleep during the previous night (e.g., sleep time was less than 6 h), and we asked them to participate in the simulation test from 1:00 to 2:30 PM the next day. The probabilities for SQ and CR were then P(x11)=0.37, P(x12)=0.63, P(x21)=0.25, and P(x22)=0.75. In our experiment, the EE was relatively good, and the probabilities for EE were P(x31)=0.80 and P(x32)=0.20.

Partial test images are shown in Fig. 5(a). We adopted pairwise comparisons of different parallaxes in stereoscopy for a fair evaluation. Figure 5(b) draws the MOS result from the total results over various averages of the converged objective disparity. We obtained a relatively accurate visual fatigue reference from the validated MOS evaluation in Ref. 7. The MOS is a common evaluation method for stereoscopic visual fatigue; therefore, we fit a curve to these results as a reference database for our simulation. From Fig. 5(b), we can observe that the comfortable zone lies between disparities of 30 and 70.

In Fig. 7(a), the measurement results are calculated with various converged objective disparities, based on the SQ, CR, and EE probabilities P(x11)=0.87, P(x12)=0.13, P(x21)=0.85, P(x22)=0.15, P(x31)=0.80, and P(x32)=0.20. In Fig. 7(b), the results are based on the different SQ and CR probabilities P(x11)=0.37, P(x12)=0.63, P(x21)=0.25, and P(x22)=0.75. From Fig. 7(b), we can observe that when the SQ and CR factors are in a worse state, the inferred viewer fatigue deviates substantially. To understand the results intuitively, we can also obtain a validation from the mean absolute error (MAE): here, MAE7(a)=0.0848 and MAE7(b)=0.2782. Thus, the measurement of visual fatigue in stereoscopy is influenced by other (nonstereoscopic) factors. If we ignore the nonstereoscopic contextual features, the measurement of visual fatigue in stereoscopy is unreliable, as shown by the MAE of 0.2782 in Fig. 7(b) versus 0.0848 in Fig. 7(a).
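The MAE comparison above is a simple average of absolute deviations between the BN estimates and the MOS-fitted reference curve. A minimal sketch (the input values below are illustrative only; the paper's curves are not tabulated here):

```python
def mean_absolute_error(predicted, reference):
    """MAE between a predicted fatigue curve and a reference curve,
    as used to compare Fig. 7(a) and 7(b) against the MOS fit."""
    assert len(predicted) == len(reference)
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(predicted)

# Illustrative values only.
mae = mean_absolute_error([0.10, 0.30, 0.55], [0.12, 0.28, 0.50])
print(mae)
```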

Fig. 7

(a) Visual fatigue measurement results in stereoscopy based on the BN model with good sleeping quality (SQ) and circadian rhythm (CR) states; and (b) with relatively bad SQ and CR states.


5.

Conclusion

We proposed a BN-based measurement model for stereoscopic visual fatigue estimation. Two important conclusions can be drawn from this study: (1) Multiple features, including the stereoscopic contextual, nonstereoscopic contextual, contact physiological, and contactless physiological features, were used to infer the viewer's fatigue, providing wide coverage of the feature categories. Covering more fatigue-related nodes in the BN helps to infer fatigue more reliably and accurately; notably, most previous studies have ignored the influence of condition variables such as CR, SQ, and EE. (2) The contactless and contact physiological features are two important observation features for fatigue recognition. The test validation indicates that, based on the EM and EEG model, visual fatigue in stereoscopy can be accurately measured. It would be of significant interest to extend the current measurement model to handle more practical situations on various 3-D devices. We are also interested in how to reduce the influence of subjective factors in determining the probabilities.

Acknowledgments

This work was supported by the Ministry of Trade, Industry and Energy (MOTIE) Foundation under the World-Class 300 Project: development of an automated manufacturing robot system technology integrating a 6-degree-of-freedom (DOF) robot mechanism and the S/W platform for assembling mobile information technology (IT) products (10043213).

References

1. 

D. M. Hoffman et al., "Vergence-accommodation conflicts hinder visual performance and cause visual fatigue," J. Vision 8(3), 1–30 (2008). http://dx.doi.org/10.1167/8.3.33

2. 

K. Ukai and P. Howarth, "Visual fatigue caused by viewing stereoscopic motion images: background, theories, and observations," Displays 29(2), 106–116 (2008). http://dx.doi.org/10.1016/j.displa.2007.09.004

3. 

S. Yano et al., "A study of visual fatigue and visual comfort for 3D HDTV/HDTV images," Displays, pp. 191–201, Elsevier, Amsterdam (2002).

4. 

H. C. O. Li et al., "Method of measuring subjective 3D visual fatigue: a five-factor model," in Digital Holography 2008, Optical Society of America, Washington, DC (2008).

5. 

J. H. Yu, B. H. Lee, and D. H. Kim, "EOG based eye movement measure of visual fatigue caused by 2D and 3D displays," in IEEE-EMBS Int. Conf. Biomedical and Health Informatics (BHI 2012), pp. 305–308, IEEE (2012).

6. 

J. S. Choi et al., "Visual fatigue modeling and analysis for stereoscopic video," Opt. Eng. 51(1), 017206 (2012). http://dx.doi.org/10.1117/1.OE.51.1.017206

7. 

J. G. Kim and J. D. Cho, "Simplified relative model to measure visual fatigue in a stereoscopy," IEICE Trans. Fundamentals of Electronics, Communications and Computer Sciences E94-A, 2830–2831 (2011).

8. 

D. Y. Kim et al., "Stereoscopic visual fatigue measurement based on fusional response curve and eye-blinks," in 17th Int. Conf. Digital Signal Processing (DSP), pp. 1–6, IEEE (2011).

9. 

H. B. Chae et al., "Three-dimensional display system using a variable parallax barrier and eye tracking," Opt. Eng. 50(8), 087401 (2011). http://dx.doi.org/10.1117/1.3607962

10. 

K. Gomarus et al., "The effects of memory load and stimulus relevance on the EEG during a visual selective memory search task: an ERP and ERD/ERS study," Clin. Neurophysiol. 117(4), 871–884 (2006). http://dx.doi.org/10.1016/j.clinph.2005.12.008

11. 

G. Fang et al., "NeuroGlasses: a neural sensing healthcare system for 3D vision technology," IEEE Trans. Inf. Technol. Biomed. 6(2), 1–7 (2011).

12. 

B. Michel, "Production issues with 3D content targeting cinema, TV, and mobile devices," European Digital Cinema Forum, http://www.edcf.net/3d.html (2009).

13. 

Ohzawa, G. DeAngelis, and R. Freeman, "Stereoscopic depth discrimination in the visual cortex: neurons ideally suited as disparity detectors," Science 249(4972), 1037–1041 (1990). http://dx.doi.org/10.1126/science.2396096

14. 

L. F. Hodges and D. F. McAllister, "Stereo and alternating-pair techniques for display of computer-generated images," IEEE Comput. Graphics Appl. 5(9), 38–45 (1985). http://dx.doi.org/10.1109/MCG.1985.276523

15. 

J. H. Choi et al., "Visual comfort measurement for 2D/3D converted stereo video sequence," in 3DTV-Conference: The True Vision Capture, Transmission and Display of 3D Video (3DTV-CON), pp. 1–4, IEEE (2012).

16. 

J. G. Kim and J. D. Cho, "Optimizing a virtual re-convergence system to reduce visual fatigue in stereoscopic camera," IEICE Trans. Inf. Syst. E95-D(5), 1238–1247 (2012). http://dx.doi.org/10.1587/transinf.E95.D.1238

17. 

J. G. Kim et al., "A real-time virtual re-convergence hardware platform," J. Semicond. Technol. Sci. 12(2), 127–138 (2012).

18. 

T. Yilmaz and U. Gudukbay, "Stereoscopic urban visualization based on graphics processor unit," Opt. Eng. 47(9), 097005 (2008). http://dx.doi.org/10.1117/1.2978948

19. 

U. Gudukbay and T. Yilmaz, "Stereoscopic view-dependent visualization of terrain height fields," IEEE Trans. Visualization Comput. Graphics 8(4), 330–345 (2002). http://dx.doi.org/10.1109/TVCG.2002.1044519

20. 

G. S. Yang, Y. Z. Lin, and P. Bhattacharya, "A driver fatigue recognition model based on information fusion and dynamic Bayesian network," Inf. Sci. 180(10), 1942–1954 (2010). http://dx.doi.org/10.1016/j.ins.2010.01.011

21. 

T. J. Budi et al., "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Syst. Appl. 36(2), 2352–2359 (2009). http://dx.doi.org/10.1016/j.eswa.2007.12.043

22. 

Q. Ji, Z. Zhu, and P. Lan, "Real-time nonintrusive monitoring and prediction of driver fatigue," IEEE Trans. Veh. Technol. 53(4), 1052–1068 (2004). http://dx.doi.org/10.1109/TVT.2004.830974

23. 

W. Horng et al., "Driver fatigue detection based on the eye tracking and dynamic template matching," in Proc. IEEE Int. Conf. Networking, Sensing and Control, Taiwan, Vol. 1, pp. 7–12, IEEE (2004).

24. 

D. Kim, Z. Bie, and K. Park, "Fuzzy neural network-based approach for personal facial expression recognition with novel feature selection method," in Proc. 12th IEEE Int. Conf. Fuzzy Systems, Vol. 2, pp. 908–913, IEEE, St. Louis, Missouri (2003).

25. 

K. L. S. Lal et al., "Development of an algorithm for an EEG-based driver fatigue countermeasure," J. Safety Res. 34(3), 321–328 (2003). http://dx.doi.org/10.1016/S0022-4375(03)00027-6

26. 

T. P. Jung et al., "Estimating alertness from the EEG power spectrum," IEEE Trans. Biomed. Eng. 44(1), 60–69 (1997). http://dx.doi.org/10.1109/10.553713

27. 

B. J. Wilson and T. D. Bracewell, "Alertness monitor using neural networks from EEG analysis," in Proc. 2000 IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing X, Vol. 2, pp. 814–820, IEEE, Sydney, Australia (2000).

28. 

T. H. Linh, M. Stodolski, and S. Osowski, "On-line heart beat recognition using Hermite polynomials and neuro-fuzzy network," IEEE Trans. Instrum. Meas. 52(4), 1224–1231 (2003). http://dx.doi.org/10.1109/TIM.2003.816841

29. 

O. G. Tal and S. David, "Driver fatigue among military truck drivers," Transp. Res. Part F 3(4), 195–209 (2000). http://dx.doi.org/10.1016/S1369-8478(01)00004-3

30. 

C. Conati, "Probabilistic assessment of user's emotions in educational games," Appl. Artif. Intell. 16(7–8), 555–575 (2002). http://dx.doi.org/10.1080/08839510290030390

31. 

H. Chen and P. Meer, "Robust fusion of uncertain information," IEEE Trans. Syst. Man Cybernet. Part B 35(3), 578–586 (2005). http://dx.doi.org/10.1109/TSMCB.2005.846659

32. 

R. W. Picard, E. Vyzas, and J. A. Healey, "Toward machine emotional intelligence: analysis of affective physiological state," IEEE Trans. Pattern Anal. Mach. Intell. 23(10), 1175–1191 (2001). http://dx.doi.org/10.1109/34.954607

33. 

P. Vysoky, "Changes in car driver dynamics caused by fatigue," Neural Network World 14(1), 109–117 (2004).

34. 

Q. Ji, P. Lan, and C. Looney, "A probabilistic framework for modeling and real-time monitoring human fatigue," IEEE Trans. Syst. Man Cybernet. Part A 36(5), 862–875 (2006). http://dx.doi.org/10.1109/TSMCA.2005.855922

35. 

N. Vagias, "A Bayesian network application for the prediction of human fatigue in the maritime industry," National Technical University of Athens, Athens, Greece (2010).

36. 

Nikolaos et al., "Human fatigue: evaluation with the usage of Bayesian networks," in Computational Intelligence Systems in Industrial Engineering, pp. 651–676, Springer (2012).

37. 

D. W. Hubbard, How to Measure Anything, 2nd ed., Tantor Audio, New Jersey (2010).

38. 

J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, 2nd ed., Morgan Kaufmann, San Francisco (1988).

39. 

R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, 2nd ed., Wiley, New York (2001).

40. 

J. C. Principe, S. K. Gala, and T. G. Chang, "Sleep staging automaton based on the theory of evidence," IEEE Trans. Biomed. Eng. 36(5), 503–509 (1989). http://dx.doi.org/10.1109/10.24251

41. 

M. Lambooij, W. IJsselsteijn, and I. Heynderickx, "Visual discomfort and visual fatigue of stereoscopic displays: a review," J. Imaging Sci. Technol. 53(3), 030201 (2009). http://dx.doi.org/10.2352/J.ImagingSci.Technol.2009.53.3.030201

42. 

C. W. Tyler, "Spatial limitations of human stereoscopic vision," in 21st Annual Technical Symposium, pp. 36–42, International Society for Optics and Photonics (1977). http://dx.doi.org/10.1117/12.955731

43. 

K. L. S. Lal and A. Craig, "A critical review of the psychophysiology of driver fatigue," Biol. Psychol. 55(3), 173–194 (2001). http://dx.doi.org/10.1016/S0301-0511(00)00085-5

44. 

H. C. O. Li et al., "Measurement of 3D visual fatigue using event-related potential (ERP): 3D oddball paradigm," in 3DTV Conference: The True Vision Capture, Transmission and Display of 3D Video, pp. 213–216, IEEE (2008).

45. 

Y. Lin, W. J. Zhang, and L. G. Watson, "Using eye movement parameters for evaluating human–machine interface frameworks under normal control operation and fault detection situations," Int. J. Human Computer Stud. 59(6), 837–873 (2003). http://dx.doi.org/10.1016/S1071-5819(03)00122-8

46. 

W. W. Wierwille et al., "Research on vehicle-based driver status/performance monitoring: development, validation, and refinement of algorithms for detection of driver drowsiness," National Highway Traffic Safety Administration Final Report DOT HS 808 247 (1994).

47. 

X. Li and Q. Ji, "Active affective state detection and user assistance with dynamic Bayesian networks," IEEE Trans. Syst. Man Cybernet. Part A 35(1), 93–105 (2005). http://dx.doi.org/10.1109/TSMCA.2004.838454

48. 

J. A. Healey, "Wearable and automotive systems for affective recognition from physiology," doctoral thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA (2000).

49. 

T. Pierre and B. Jacques, "Monotony of road environment and driver fatigue: a simulator study," Accid. Anal. Prev. 35, 381–391 (2003). http://dx.doi.org/10.1016/S0001-4575(02)00014-3

50. 

Y. Zhang and Q. Ji, "Active and dynamic information fusion for facial expression understanding from image sequences," IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 699–714 (2005). http://dx.doi.org/10.1109/TPAMI.2005.93

Biography

OE_52_8_083110_d001.png

Zhongyun Yuan received his BS and MS degrees in electronic and electrical engineering from North University of China in 2005 and 2008, respectively. He is currently pursuing a PhD degree in the Department of Electrical and Computer Engineering, Sungkyunkwan University (SKKU), Suwon, Korea. His interests include measurement, 3-D vision, data compression, data acquisition, and compressive sampling.

OE_52_8_083110_d002.png

Jong Hak Kim received a BS degree in radio communication engineering from Kyunghee University, Suwon, Korea, in 2009, an MS degree from the Department of Electrical and Computer Engineering, Sungkyunkwan University, in 2012, and is currently studying for a PhD degree at Sungkyunkwan University. He is interested in efficient low-power and real-time processing systems for mobile equipment and currently studies image processing algorithms and hardware implementation for stereo systems.

OE_52_8_083110_d003.png

Jun Dong Cho received a BS degree in electronic engineering from Sungkyunkwan University, Seoul, Korea, in 1980, and MS and PhD degrees in computer science from Polytechnic University, Brooklyn, NY, in 1989, and Northwestern University, Evanston, IL, in 1993, respectively. He was a senior CAD engineer at Samsung Electronics Co., Ltd. He is now a professor in the Department of Electronic Engineering, Sungkyunkwan University, Korea. He was a visiting scientist at the IBM T.J. Watson Research Center from 2000 to 2001 and has been an IEEE senior member since April 1996. His research interests are in the areas of VLSI/SoC CAD and low-power design for multimedia and communication.
