## 1. Introduction

In recent years, with the rapid development of solid-state lighting technology, white light-emitting diodes (LEDs) have been widely used in lighting, display, and the transmission and reception of data.^{1,2} Compared with incandescent and fluorescent lights, white LEDs offer long life expectancy, high energy efficiency, and low cost, and they can be modulated at a speed high enough to be undetectable to the human eye. To date, a considerable amount of white LED–based research has been carried out, and it falls broadly into two categories: visible light communication (VLC) and visible light positioning (VLP).

For VLC, the intrinsic features of white LEDs make them suitable for high speed communication. First, VLC is based on the lighting function of white LEDs. To ensure sufficient light intensity, illumination levels of 400 to 1000 lux^{3} are often required; therefore, the signal-to-noise ratio is high enough for VLC. Second, the radiation spectrum of white LEDs spans from 400 to 800 THz; thus a high channel capacity is achievable according to the Shannon formula. At present, most research on high speed VLC is confined to the indoor environment and focuses mainly on improving the modulation bandwidth of LEDs,^{4} developing improved modulation technology,^{5} and designing multiplexing schemes.^{6} The highest data rate reported so far was achieved by a wavelength division multiplexing (WDM) VLC system,^{7} in which carrierless amplitude and phase modulation and adaptive equalization were jointly used to reach 4.5 Gbps in the laboratory.

VLP^{8–10} can be divided, according to the optical reception device used at the receiver, into photodiode (PD)-based VLP and image sensor (IS)-based VLP. Since the PD is susceptible to the direction of the light beam, the PD-based VLP system may fail if the PD is flipped over or moved out of the range covered by the LED; thus it has limited mobility and is only suitable for slow motion or quasistatic conditions. In addition, PDs cannot be utilized outdoors under direct solar radiation. A PD can only detect the optical power of incoming light, and because direct solar radiation is usually strong, the PD, with its limited response range, is saturated by the intense optical power. Therefore, most of the proposed PD-based VLP systems are designed for indoor positioning.^{11–17}

For the IS-based VLP,^{9,18–20} an IS is used as the optical reception device. An IS can detect not only the intensity but also the angle of arrival (AOA) of incoming light. An IS consists of many pixels, so different light sources can be spatially separated using their imaging points on the IS. Here, the light sources include various LED sources (such as indoor LED dome lights, outdoor LED traffic lights, and the LED brake lights or headlights of a vehicle) and noise sources (such as the Sun and other ambient lights). By differentiating the imaging positions of light sources, LED sources can be recognized among multiple noise sources with a simple feature matching algorithm. Consequently, the IS-based VLP is available not only for indoor but also for outdoor environments. Furthermore, combined with image processing and digital signal processing technology, the IS-based VLP can be utilized for driving safety applications such as collision warning and avoidance, lane change assistance, pedestrian detection, and adaptive cruise control.

To date, published papers on the IS-based VLP have mostly focused on applied research, such as indoor navigation systems,^{21} outdoor intelligent traffic systems,^{22–25} and various location-based services.^{26} These studies have shown that accurate localization is achievable; however, little has been published about the analytical performance bounds on positioning accuracy from the viewpoint of statistical optimization. Determining the achievable positioning accuracy will allow the optimization of the parameters governing IS-based VLP systems.

The contributions of this paper are as follows. First, we analyze and derive the maximum likelihood estimate (MLE) and the corresponding Cramér–Rao lower bound (CRLB) for a typical outdoor IS-based VLP system, assuming a white Gaussian model for the system noise. Second, we analyze the effect of system parameters on the CRLB. When a camera IS is used as the receiver, several types of noise are generated by the IS. When shot noise is dominant, the system noise variance is influenced by many factors, such as the total received optical power, the pixel size, the focal length, and the frame rate of the camera receiver. Because the derived CRLB is proportional to the system noise variance, we analyze in detail the system noise and the parameters affecting its variance.

The rest of this paper is organized as follows. In Sec. 2, an outdoor IS-based VLP system model is introduced, where the transmitter is the LED array of traffic light, and the camera receiver is assumed to be mounted on the dashboard of a vehicle. In Sec. 3, the MLE of the vehicle position is derived under the condition that the observation values of the LEDs’ imaging points are affected by white Gaussian noise. The performance analysis is completed in Sec. 4, where the CRLB is deduced and the parameters affecting CRLB are analyzed in detail. In Sec. 5, simulation results are given for a typical outdoor scenario. Conclusions are made in Sec. 6.

Notations: The operators ${\{\cdot\}}^{T}$, $E[\cdot]$, and $\mathrm{var}(\cdot)$ denote the transpose of a matrix, the expectation, and the variance of a random variable or matrix, respectively.

## 2. System Model

For the outdoor IS-based VLP system, as shown in Fig. 1, the transmitter may be the LED array of the traffic light in a city crossing, and the receiver may be a camera IS mounted on the dashboard of a vehicle. The signal from the LED and the image of the LED on the camera receiver are jointly used to determine the location of the receiver, which is assumed to be the vehicle position.

## Table 1

System parameters.

Parameter | Value
---|---
Height of traffic light | ${Z}_{i}=6.21$ (m)
Height of camera receiver | ${Z}_{s}=1.0$ (m)
Length of traffic arm | 1.0 (m)
Width of lane | 3.5 (m)
Width of vehicle | 1.8 (m)

In Fig. 1, there are three coordinate systems: the three-dimensional (3-D) world coordinate system, the 3-D camera coordinate system, and the two-dimensional (2-D) image plane coordinate system. Any LED ${P}_{i}$ ($i=1,2,\dots ,N$) from the LED array transmitter is imaged into an imaging point ${p}_{i}$ ($i=1,2,\dots ,N$) in the image plane through the center of the lens. It is assumed that LED ${P}_{i}$ is located at ${P}_{i}={({X}_{i},{Y}_{i},{Z}_{i})}^{T}$ in the 3-D world coordinate system and that this location is known *a priori*. The imaging point of LED ${P}_{i}$ is ${p}_{i}={({x}_{i},{y}_{i})}^{T}$ in the 2-D image plane coordinate system, which can be measured via image processing and signal processing technologies. However, the measurement value of the imaging point is often influenced by noise. When shot noise is the dominant noise source, the system noise can be viewed as white Gaussian noise.^{27,28} Hence, our goal is to estimate the location of the camera receiver under white Gaussian noise, obtain the MLE, and finally derive the CRLB.
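As a concrete illustration of the imaging model, the following sketch projects one LED world coordinate onto the image plane under the simplifying assumption that the camera axes are aligned with the world axes. The heights follow Table 1; the lateral coordinates and focal length are illustrative assumptions, not values from the paper.

```python
# Illustrative projection of one LED world coordinate onto the image
# plane, per the system model (camera axes aligned with world axes; the
# lateral coordinates below are assumptions, heights are from Table 1).
f = 35e-3                        # focal length (m), a typical value
X_i, Y_i, Z_i = 0.5, 31.0, 6.21  # hypothetical LED world coordinate (m)
X_s, Y_s, Z_s = 2.02, 30.8, 1.0  # hypothetical camera (vehicle) position (m)

h = Z_i - Z_s                    # distance along the optical axis
x_i = (-f / h) * (X_i - X_s)     # noiseless imaging point
y_i = (-f / h) * (Y_i - Y_s)
print(x_i, y_i)                  # image-plane coordinates (m)
```

The noiseless imaging point is simply a scaled, sign-flipped copy of the lateral offset between the LED and the camera, which is what makes the later estimator a matter of averaging and rescaling.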

For any LED ${P}_{i}$ from the LED array transmitter, it satisfies

## (1)

$${P}_{ci}={\mathit{R}}^{T}({P}_{i}-{O}_{s}),$$

where ${P}_{ci}={({X}_{ci},{Y}_{ci},{Z}_{ci})}^{T}$, $i=1,2,\dots ,N$ is the coordinate of LED ${P}_{i}$ in the camera coordinate system. The camera coordinate system has its origin at the center of the camera receiver; its ${Z}_{c}$ axis is perpendicular to the 2-D image plane and is usually called the optical axis. $\mathit{R}$ is the rotation matrix of the camera receiver from the camera coordinate system to the world coordinate system, which is a $3\times 3$ orthogonal matrix. ${O}_{s}={({X}_{s},{Y}_{s},{Z}_{s})}^{T}$ is the world coordinate of the center of the camera receiver, which is taken to be the vehicle position since the camera receiver is fixed on the vehicle, presumably on its dashboard.

The rotating process of the camera receiver from the camera coordinate system to the world coordinate system is shown in Fig. 2. The rotation angles $\varphi $ and $\omega $ can be directly read out from the inclination sensor attached to the camera receiver; however, the azimuth angle $\kappa $ has to be calculated:

## (2)

$$\mathit{R}=\left(\begin{array}{ccc}{r}_{11}& {r}_{12}& {r}_{13}\\ {r}_{21}& {r}_{22}& {r}_{23}\\ {r}_{31}& {r}_{32}& {r}_{33}\end{array}\right)\phantom{\rule{0ex}{0ex}}=\left(\begin{array}{ccc}\mathrm{cos}\text{\hspace{0.17em}}\varphi & 0& -\mathrm{sin}\text{\hspace{0.17em}}\varphi \\ 0& 1& 0\\ \mathrm{sin}\text{\hspace{0.17em}}\varphi & 0& \mathrm{cos}\text{\hspace{0.17em}}\varphi \end{array}\right)\left(\begin{array}{ccc}1& 0& 0\\ 0& \mathrm{cos}\text{\hspace{0.17em}}\omega & -\mathrm{sin}\text{\hspace{0.17em}}\omega \\ 0& \mathrm{sin}\text{\hspace{0.17em}}\omega & \mathrm{cos}\text{\hspace{0.17em}}\omega \end{array}\right)\left(\begin{array}{ccc}\mathrm{cos}\text{\hspace{0.17em}}\kappa & -\mathrm{sin}\text{\hspace{0.17em}}\kappa & 0\\ \mathrm{sin}\text{\hspace{0.17em}}\kappa & \mathrm{cos}\text{\hspace{0.17em}}\kappa & 0\\ 0& 0& 1\end{array}\right)\mathrm{.}$$

Since the height ${Z}_{i}$ of the traffic light and the height ${Z}_{s}$ of the camera receiver are known *a priori*, the distance (between the traffic light and camera receiver) along the direction of the optical axis is $h={Z}_{i}-{Z}_{s}$.
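The composition of $\mathit{R}$ in Eq. (2) can be sketched numerically as follows; the three angles are arbitrary illustrative values, and the orthogonality check confirms that the product is a valid rotation matrix.

```python
import math

# Sketch of Eq. (2): compose R from illustrative rotation angles phi,
# omega, and azimuth kappa, then verify orthogonality (R R^T = I).
def rot_matrix(phi, omega, kappa):
    R_phi = [[math.cos(phi), 0.0, -math.sin(phi)],
             [0.0, 1.0, 0.0],
             [math.sin(phi), 0.0, math.cos(phi)]]
    R_omega = [[1.0, 0.0, 0.0],
               [0.0, math.cos(omega), -math.sin(omega)],
               [0.0, math.sin(omega), math.cos(omega)]]
    R_kappa = [[math.cos(kappa), -math.sin(kappa), 0.0],
               [math.sin(kappa), math.cos(kappa), 0.0],
               [0.0, 0.0, 1.0]]

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(matmul(R_phi, R_omega), R_kappa)

# arbitrary illustrative angles (rad)
R = rot_matrix(math.radians(5), math.radians(2), math.radians(10))
RRt = [[sum(R[i][k] * R[j][k] for k in range(3)) for j in range(3)]
       for i in range(3)]   # should equal the 3x3 identity
```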

In the 3-D camera coordinate system, the relationship between ${P}_{ci}={({X}_{ci},{Y}_{ci},{Z}_{ci})}^{T}$, $i=1,2,\dots ,N$ and ${p}_{i}={({x}_{i},{y}_{i})}^{T}$, $i=1,2,\dots ,N$ can be described, with the focal length of the lens being $f$, as

## (3)

$${x}_{i}=-f\frac{{X}_{ci}}{{Z}_{ci}},\phantom{\rule{2em}{0ex}}{y}_{i}=-f\frac{{Y}_{ci}}{{Z}_{ci}}.$$

Rearranging Eqs. (1) and (3), we get the mathematical relationship between the LEDs ${({X}_{i},{Y}_{i})}_{i=1}^{N}$ and the measurement values ${({\tilde{x}}_{i},{\tilde{y}}_{i})}_{i=1}^{N}$ of their imaging points, which can be written as

## (4)

$$\left(\begin{array}{c}{\tilde{x}}_{i}\\ {\tilde{y}}_{i}\end{array}\right)=(-\frac{f}{h})\left(\begin{array}{c}{X}_{i}-{X}_{s}\\ {Y}_{i}-{Y}_{s}\end{array}\right)+\left(\begin{array}{c}{n}_{xi}\\ {n}_{yi}\end{array}\right),$$

where ${n}_{xi}$ and ${n}_{yi}$ are the measurement noise terms, modeled as independent zero-mean white Gaussian noise with variance ${\sigma}^{2}$.

Our goal is to estimate the parameter vector $\mathit{r}={({X}_{s},{Y}_{s})}^{T}$ of the vehicle position, derive its MLE, and finally obtain the CRLB for white Gaussian noise.

## 3. Maximum Likelihood Estimation

Based on the measurement values ${\{{\tilde{x}}_{i}\}}_{i=1}^{N}$ and ${\{{\tilde{y}}_{i}\}}_{i=1}^{N}$ and the LED coordinates ${\{{X}_{i}\}}_{i=1}^{N}$ and ${\{{Y}_{i}\}}_{i=1}^{N}$, the log-likelihood function of the parameter vector $\mathit{r}={({X}_{s},{Y}_{s})}^{T}$ of the vehicle position is given as

## (5)

$$\mathrm{ln}({X}_{s},{Y}_{s},{\sigma}^{2})=-\frac{N}{2}\mathrm{ln}(2\pi {\sigma}^{2})-\frac{1}{2{\sigma}^{2}}\sum _{i=1}^{N}\left[{\left(\frac{f}{h}({X}_{i}-{X}_{s})+{\tilde{x}}_{i}\right)}^{2}+{\left(\frac{f}{h}({Y}_{i}-{Y}_{s})+{\tilde{y}}_{i}\right)}^{2}\right].$$

Differentiating the log-likelihood function with respect to ${X}_{s}$ gives

## (6)

$$\frac{\partial \mathrm{ln}({X}_{s},{Y}_{s})}{\partial {X}_{s}}=\frac{f}{{\sigma}^{2}h}\sum _{i=1}^{N}\left[\frac{f}{h}({X}_{i}-{X}_{s})+{\tilde{x}}_{i}\right].$$

Setting Eq. (6) equal to zero and solving for ${X}_{s}$ yields

## (7)

$${\widehat{X}}_{s}=\frac{1}{N}\sum _{i=1}^{N}{X}_{i}+\frac{h}{Nf}\sum _{i=1}^{N}{\tilde{x}}_{i}.$$

Similarly, differentiating the log-likelihood function with respect to ${Y}_{s}$ gives

## (8)

$$\frac{\partial \mathrm{ln}({X}_{s},{Y}_{s})}{\partial {Y}_{s}}=\frac{f}{{\sigma}^{2}h}\sum _{i=1}^{N}\left[\frac{f}{h}({Y}_{i}-{Y}_{s})+{\tilde{y}}_{i}\right].$$

Setting Eq. (8) equal to zero and solving for ${Y}_{s}$ yields

## (9)

$${\widehat{Y}}_{s}=\frac{1}{N}\sum _{i=1}^{N}{Y}_{i}+\frac{h}{Nf}\sum _{i=1}^{N}{\tilde{y}}_{i}.$$

Define $\overline{X}=\frac{1}{N}\sum _{i=1}^{N}{X}_{i}$, $\overline{Y}=\frac{1}{N}\sum _{i=1}^{N}{Y}_{i}$, $\overline{x}=\frac{1}{N}\sum _{i=1}^{N}{\tilde{x}}_{i}$, and $\overline{y}=\frac{1}{N}\sum _{i=1}^{N}{\tilde{y}}_{i}$.

We can express the MLE of the vehicle position as

## (10)

$$\{\begin{array}{l}{\widehat{X}}_{s}=\overline{X}+\frac{h}{f}\overline{x}\\ {\widehat{Y}}_{s}=\overline{Y}+\frac{h}{f}\overline{y}\end{array}.$$

Consequently, the MLE of the vehicle position can be obtained from the means of the measurement values ${\{{\tilde{x}}_{i}\}}_{i=1}^{N}$ and ${\{{\tilde{y}}_{i}\}}_{i=1}^{N}$ and the means of the LED coordinates ${\{{X}_{i}\}}_{i=1}^{N}$ and ${\{{Y}_{i}\}}_{i=1}^{N}$.

Figure 3 shows the estimation values of ${X}_{s}$ and ${Y}_{s}$ when $\sigma $ is ${10}^{-3}$. The estimation values fluctuate around the true value (${X}_{s}=2.02\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{m}$ and ${Y}_{s}=30.8\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{m}$) because each run uses an independent noise realization.
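A minimal simulation of this estimator can be sketched as follows, assuming an illustrative 4-LED array geometry and the noise level $\sigma ={10}^{-3}$ of Fig. 3; the LED positions, focal length, and axial distance are assumptions for the sketch, not values specified by the paper.

```python
import random

# Minimal simulation of the MLE of Eq. (10) under the measurement model
# of Eq. (4). LED grid, focal length, and distance are assumptions for
# the sketch; sigma = 1e-3 matches the setting of Fig. 3.
random.seed(1)
f, h = 35e-3, 5.21           # focal length and axial distance (m), assumed
X_s, Y_s = 2.02, 30.8        # true vehicle position (m), as in Fig. 3
sigma = 1e-3

# hypothetical 2x2 LED array centred at (0.5, 31.0) with 15-mm spacing
leds = [(0.5 + dx, 31.0 + dy)
        for dx in (-0.0075, 0.0075) for dy in (-0.0075, 0.0075)]
meas = [((-f / h) * (X - X_s) + random.gauss(0, sigma),
         (-f / h) * (Y - Y_s) + random.gauss(0, sigma)) for X, Y in leds]

N = len(leds)
X_hat = sum(X for X, _ in leds) / N + (h / (N * f)) * sum(x for x, _ in meas)
Y_hat = sum(Y for _, Y in leds) / N + (h / (N * f)) * sum(y for _, y in meas)
print(X_hat, Y_hat)   # fluctuates around the true position on each fresh seed
```

Note how the image-plane noise is amplified by the factor $h/f$ when mapped back to world coordinates, which foreshadows the $h^{2}/f^{2}$ dependence of the CRLB.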

## 4. Performance Analysis

The CRLB gives a lower bound on the variance attainable by any unbiased estimator, so the performance of an estimation method can be assessed by comparing it with the CRLB. The regularity condition of the CRLB^{29} holds for the given estimation problem since Eqs. (6) and (8) are finite and their expected values are 0.

The CRLB of the vector parameter $\mathit{r}={({X}_{s},{Y}_{s})}^{T}$ can be obtained through three steps. First, from Eqs. (6) and (8) we get the second-order derivatives of the log-likelihood function with respect to ${X}_{s}$ and ${Y}_{s}$, respectively. Second, taking the negative expectations of the second-order derivatives yields

## (11)

$$\{\begin{array}{l}-E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {X}_{s}^{2}}\right]=\frac{N{f}^{2}}{{h}^{2}{\sigma}^{2}}\\ -E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {Y}_{s}^{2}}\right]=\frac{N{f}^{2}}{{h}^{2}{\sigma}^{2}}\\ -E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {X}_{s}\partial {Y}_{s}}\right]=0\end{array}.$$

The $2\times 2$ Fisher information matrix $\mathbf{I}(\mathit{r})$ is written as

## (12)

$$\mathbf{I}(\mathit{r})=\left(\begin{array}{cc}-E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {X}_{s}^{2}}\right]& -E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {X}_{s}\partial {Y}_{s}}\right]\\ -E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {Y}_{s}\partial {X}_{s}}\right]& -E\left[\frac{{\partial}^{2}\mathrm{ln}({X}_{s},{Y}_{s})}{\partial {Y}_{s}^{2}}\right]\end{array}\right).$$

Third, inverting the Fisher information matrix gives

## (13)

$${\mathbf{I}}^{-1}(\mathit{r})=\left(\begin{array}{cc}\frac{{h}^{2}{\sigma}^{2}}{N{f}^{2}}& 0\\ 0& \frac{{h}^{2}{\sigma}^{2}}{N{f}^{2}}\end{array}\right).$$

Consequently, the CRLB of the vehicle position for white Gaussian noise is given as

## (14)

$$\{\begin{array}{l}\mathrm{var}({\widehat{X}}_{s})\ge \frac{{h}^{2}{\sigma}^{2}}{N{f}^{2}}\\ \mathrm{var}({\widehat{Y}}_{s})\ge \frac{{h}^{2}{\sigma}^{2}}{N{f}^{2}}\end{array}.$$

It can be seen that the mean square positioning error (MSPE) is proportional to ${\sigma}^{2}$ and comes arbitrarily close to the CRLB.^{30}

From Eq. (14), we know that the CRLB is proportional to the noise variance ${\sigma}^{2}$, with the number of LEDs used $N$, the focal length of the camera receiver $f$, and the distance $h$ being known. However, when a camera IS is used as the receiver of an outdoor IS-based VLP system, several types of noise are generated by the IS. When shot noise takes the dominant role, the system noise variance is influenced by many factors, such as the total received optical power, the pixel size, the focal length, and the frame rate of the camera receiver.
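As a numerical sketch of the bound in Eq. (14): the focal length below matches Table 3, while the image-plane noise level $\sigma $ is an assumed constant rather than a value derived from the noise model.

```python
# Numerical sketch of the CRLB of Eq. (14). The focal length is taken
# from Table 3; the image-plane noise level sigma is an assumed value.
def crlb_std(h, sigma, N, f):
    """Lower bound on the positioning standard deviation (m)."""
    return h * sigma / (N ** 0.5 * f)

f = 35e-3       # focal length (m), Table 3
sigma = 2e-4    # assumed noise standard deviation on the image plane
for h in (15, 30, 60):
    print(f"h = {h} m: bound = {crlb_std(h, sigma, 4, f):.3f} m")
```

Doubling the distance doubles the standard-deviation bound (quadruples the variance bound), which is the trend seen later in the numerical results.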

In the following, we will emphatically analyze the system noise in the IS-based VLP system and the parameters affecting system noise variance.

### 4.1. System Noise

There are two basic types of noise generated by an IS: pattern noise (PN) and random noise (RN). PN is distributed in a fixed spatial pattern that does not vary from frame to frame and can be directly observed by the human eye. The effect of PN on image quality is far greater than that of RN, but PN can be effectively suppressed or eliminated through correlated double sampling or flat-field correction. Hence, the effect of PN is not considered in this paper.

The quantized values of RN vary from frame to frame, and RN obeys a statistical distribution. One typical RN is shot noise, generated by the random variation of photoinduced charge carriers with incoming light in the semiconductor of the camera receiver. When the number of photoinduced charge carriers is large enough, shot noise follows a Gaussian distribution and can be treated as white noise. In the IS-based VLP system, shot noise mainly consists of three parts: quantum noise generated at the observation point of the image corresponding to each LED; quantum noise coming from the interference of other LEDs; and ambient light noise from fluorescent or incandescent lights, the Sun, and so on. Since the IS can spatially separate sources, the imaging points of discrete LEDs on the camera IS receiver can be resolved; that is, the noise from the interference of other LEDs is so small that it can be classified into ambient light noise. Hence, when shot noise is dominant, the system noise variance can be expressed as

## (15)

$${\sigma}^{2}={\sigma}_{\mathrm{shot}}^{2}=2q\rho ({P}_{r}+{P}_{n}{A}_{\mathrm{total}})\cdot {I}_{2}{R}_{\mathrm{b}},$$

where $q$ is the electronic charge, $\rho $ is the responsivity of the IS, ${P}_{r}$ is the total received optical power, ${P}_{n}$ is the ambient light noise power per unit area, ${A}_{\mathrm{total}}$ is the effective area for detecting, ${I}_{2}$ is the noise bandwidth factor, and ${R}_{\mathrm{b}}$ is the rate setting the noise bandwidth, which is determined by the frame rate of the camera receiver.

### 4.2. Parameters Affecting System Noise Variance

In this paper, the outdoor IS-based VLP system uses the channel scenario shown in Fig. 5(a), where the LED array transmitter stands on horizontal ground and the camera receiver is fixed on the dashboard of a vehicle, with the center of the LED array transmitter on the optical axis of the camera receiver.

#### 4.2.1. Total received optical power

If $N$ LEDs are used to locate an IS receiver, and each LED transmits a constant optical power ${P}_{t}$ over a line-of-sight (LOS) channel, the total received optical power of the IS is ${P}_{r}=\sum _{i=1}^{N}{H}_{i}(0){P}_{t}$. A lateral view of the transmitter–receiver channel is shown in Fig. 5(b). For the $i$’th channel, $i=1,2,\dots ,N$, ${H}_{i}(0)=\frac{(m+1){\mathrm{cos}}^{m}({\varphi}_{i})\mathrm{cos}({\phi}_{i})A}{2\pi {D}_{i}^{2}}$ is the channel DC gain, where $m$ is the order of Lambertian emission (generally $m=1$), ${\varphi}_{i}$ is the angle of irradiance, ${\phi}_{i}$ is the angle of incidence with $0\le {\phi}_{i}\le {\phi}_{C}$, ${\phi}_{C}$ is the field of view (FOV) of the IS receiver, $A$ is the effective area for detecting, and ${D}_{i}$ is the propagation distance from each LED transmitter to the camera receiver. With the communication distance between the LED transmitter and the camera receiver being $h$, if ${\varphi}_{i}={\phi}_{i}$, then $\mathrm{cos}({\varphi}_{i})=h/{D}_{i}$, and the total received optical power can be expressed as

## (16)

$${P}_{r}=\frac{A{P}_{t}}{{h}^{2}}C(\phi ),$$

where $C(\phi )=\sum _{i=1}^{N}[(m+1){\mathrm{cos}}^{m+3}({\phi}_{i})/2\pi ]$, which is related to the incidence angles. It is assumed that all incidence angles of the LOS links are within the FOV of the receiver.

#### 4.2.2. Effective area for detecting

It is necessary to calculate the image size corresponding to one LED when a camera IS is used as the receiver. The imaging process of a single LED through the lens onto the camera IS is shown in Fig. 6. According to Newton’s formula, if the diameters of the LED and the corresponding image are $L$ and $l$, respectively, for a focal length of $f$ and a distance of $h$ between the LED and the lens, then the relationship between these parameters satisfies $l=fL/h$. The distance at which the image of the LED falls into exactly one pixel is referred to as the critical distance ${d}_{c}$. If $h\ge {d}_{c}$, the image of the LED falls into only one pixel; then the effective area for detecting is $A={w}^{2}$, where $w$ is the width of a pixel. If $h<{d}_{c}$, the image of the LED falls into several pixels; then $A={l}^{2}={(fL/h)}^{2}$.
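The quantities of Secs. 4.2.1 and 4.2.2 can be tied together in a short sketch. The values of $L$, ${P}_{t}$, $f$, and $w$ follow Tables 2 and 3; the responsivity, ambient noise level, noise bandwidth factor, on-axis incidence, and the use of a single-pixel area in place of ${A}_{\mathrm{total}}$ are assumptions made for illustration only.

```python
import math

# Sketch of Secs. 4.2.1-4.2.2: effective detecting area, LOS channel gain,
# total received optical power, and the shot-noise variance of Eq. (15).
# Values marked "assumed" are illustrative, not from the paper.
q = 1.602e-19        # electronic charge (C)
rho = 0.54           # responsivity of the IS (A/W), assumed
I2 = 0.562           # noise bandwidth factor, assumed
Rb = 1000.0          # bandwidth-setting rate tied to the frame rate, assumed
Pn = 5.8e-6          # ambient light noise power per unit area, assumed
f, w = 35e-3, 10e-6  # focal length and pixel width (Table 3)
L, Pt, m = 6e-3, 100e-3, 1   # LED diameter, LED power (Table 2), Lambertian order
h = 30.0             # communication distance (m)

# Effective detecting area: one pixel beyond the critical distance d_c.
d_c = f * L / w                            # here d_c = 21 m, so at h = 30 m ...
A = w**2 if h >= d_c else (f * L / h)**2   # ... the image fills one pixel

# LOS channel DC gain for one LED with irradiance angle equal to the
# incidence angle phi_i, as in the lateral-view geometry of Fig. 5(b).
def H0(phi_i, D):
    return (m + 1) * math.cos(phi_i)**(m + 1) * A / (2 * math.pi * D**2)

Pr = 4 * H0(0.0, h) * Pt                   # four LEDs, all roughly on-axis

sigma2 = 2 * q * rho * (Pr + Pn * A) * I2 * Rb   # Eq. (15), A used for A_total
```

Because the pixel area shrinks relative to the LED image as distance grows, both the received power and the noise variance depend on whether $h$ exceeds the critical distance.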

## 5. Numerical Results

Simulation experiments are performed in the channel scenario shown in Fig. 5(a), where the LED array transmitter is placed on horizontal ground and the camera receiver is fixed on the dashboard of a vehicle, with the center of the LED array transmitter on the optical axis of the camera receiver. The communication distance between the LED array transmitter and the camera receiver is varied from 15 to 60 m in 5-m steps under static conditions. White LEDs are used for the LED array transmitter, and a Photron IDP-Express R2000 is used as the camera IS receiver. The parameters are listed in Tables 2 and 3, respectively.

## Table 2

Parameters for the LED array transmitter.

Parameter | Value
---|---
Number of LEDs | $32\times 32$
Spacing of LEDs | 15 (mm)
Size of LED array | $465\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{mm}\times 465\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{mm}$
Diameter of an LED | $L=6$ (mm)
Transmitted optical power of an LED | ${P}_{t}=100$ (mW)

## Table 3

Parameters for the camera image sensor receiver.

Parameter | Value
---|---
Focal length | $f=35$ (mm)
Pixel size | $w=10$ ($\mu \mathrm{m}$)
Frame rate | ${R}_{s}=1000$ (fps)
Resolution | $1024\text{\hspace{0.17em}\hspace{0.17em}}\text{pixel}\times 1024\text{\hspace{0.17em}\hspace{0.17em}}\text{pixel}$

In the following, we present simulation results for the CRLB of the positioning system described in the previous sections over a range of parameters: the communication distance, the pixel size, and the focal length and frame rate of the camera receiver.

First, we study the influence of the communication distance on the CRLB. Figure 7 shows the CRLB versus the communication distance between the LED transmitter and the camera receiver, from 15 to 60 m with a step size of 15 m under static conditions. The positioning accuracy decreases with increasing communication distance. When the communication distance is 60 m, the CRLB of the vehicle position is about 0.35 m; when the distance is shortened to 15 m, it is less than 0.05 m.

Second, we study the influence of pixel width on the CRLB. Figure 8 plots the CRLB versus the number of LEDs for pixel widths from 25 to $10\text{\hspace{0.17em}\hspace{0.17em}}\mu \mathrm{m}$. The positioning error decreases as the number of LEDs increases, and the CRLB drops with decreasing pixel width. When four LEDs are used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the vehicle position is less than 0.1 m.

Next, we study the impact of focal length on the CRLB. In Fig. 9, the CRLB is plotted as a function of the number of LEDs for focal lengths from 20 to 35 mm. The CRLB falls with increasing focal length. This figure again shows that low values of CRLB are achievable for typical camera IS parameters: for four LEDs used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the vehicle position is less than 0.1 m.

Finally, we investigate how the CRLB behaves as the frame rate of the camera receiver varies. In Fig. 10, the CRLB is plotted versus the number of LEDs for various frame rates. For a given number of LEDs, the CRLB drops as the frame rate is reduced. For four LEDs used in the outdoor IS-based VLP system at a communication distance of 30 m between the LED array transmitter and the camera receiver, the CRLB of the camera receiver position at a frame rate of 1000 fps is about 0.5 m; this falls to only about 0.05 m when the frame rate is decreased to 30 fps. Therefore, the positioning accuracy improves as the frame rate is reduced. However, a lower frame rate (which equals the sampling rate of the camera IS) directly limits the achievable data rate. This is why high speed cameras are usually utilized for VLC, while medium and low speed cameras are used for VLP.
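The trends of Figs. 7–10 follow directly from combining Eq. (14) with Eq. (15). The sketch below assumes an illustrative baseline noise level at 1000 fps and scales ${\sigma}^{2}$ linearly with the frame rate; the baseline value itself is not taken from the paper.

```python
# Sketch of the parameter trends of Figs. 7-10: the standard-deviation
# bound h*sigma/(sqrt(N)*f) from Eq. (14), with sigma^2 scaled linearly
# with the bandwidth-setting rate R_b (the frame rate), as suggested by
# Eq. (15). The baseline sigma is an illustrative assumption.
def bound(h, N, f, sigma):
    return h * sigma / (N ** 0.5 * f)

f, N = 35e-3, 4
sigma_1000 = 2e-4                           # assumed noise std at 1000 fps
sigma_30 = sigma_1000 * (30 / 1000) ** 0.5  # sigma^2 proportional to R_b

b15 = bound(15, N, f, sigma_1000)           # short distance, fast camera
b60 = bound(60, N, f, sigma_1000)           # long distance: 4x the bound
b30_slow = bound(30, N, f, sigma_30)        # 30 m at 30 fps: much tighter
```

This makes the VLC/VLP trade-off concrete: lowering the frame rate tightens the positioning bound but proportionally reduces the rate at which data can be sampled.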

## 6. Conclusion

For a typical outdoor scenario, theoretical limits on the accuracy of locating an in-vehicle camera receiver are calculated by deriving the CRLB. Under the condition that the observation values of the LED imaging points are affected by white Gaussian noise, the MLE of the vehicle position is first calculated, and then the CRLB is derived. For typical parameters of a white LED array and an in-vehicle camera IS, simulation results show that accurate location estimation is achievable, with the positioning error usually on the order of centimeters for a communication distance of 30 m between the LED array transmitter and the camera receiver. For a constant communication distance, the positioning accuracy depends on the number of LEDs used, the focal length of the lens, and the pixel size and frame rate of the camera receiver. The derived CRLB provides a theoretical basis for the statistical optimization of outdoor IS-based VLP systems.

## Acknowledgments

This work was supported by the Natural Science Foundation of China under Grant Nos. 61261017, 61362006, and 61371107, the Natural Science Foundation of Guangxi under Grant Nos. 2014GXNSFAA118387 and 2013GXNSFAA019334, the Key Laboratory Foundation of Guangxi Broadband Wireless Communication and Signal Processing under Grant No. GXKL061501, the Guangxi Colleges and Universities Key Laboratory Foundation of Intelligent Processing of Computer Images and Graphics under Grant No. GIIP201407, and the High-Level Innovation Team of New Technology on Wireless Communication in Guangxi Higher Education Institutions.

## References

## Biography

**Xiang Zhao** received her BS degree in information engineering and her MS degree in communication and information systems from Guilin University of Electronic Technology, Guilin, China, in 2001 and 2006, respectively. She is currently working toward her PhD in communication and information systems at Xidian University, Xi'an, China. Her current research interests are visible light communication and visible light positioning.

**Jiming Lin** received his MSc degree from the University of Electronic Science and Technology of China in 1995 and his PhD from Nanjing University in 2002. He then held a two-year postdoctoral fellowship at the State Key Laboratory for Novel Software Technology at Nanjing University. Since 2004, he has been a professor at Guilin University of Electronic Technology. His research interests include synchronization and localization in WSNs, UWB communication, and visible light communication and positioning.