Indoor Positioning System using LED Lights and a Dual Image Sensor
  • CC BY-NC (Non-Commercial)
ABSTRACT
KEYWORDS
Optical wireless, Visible light communication, White-light LED, Indoor positioning
    I. INTRODUCTION

    Visible light communication (VLC) is regarded as one of the most promising alternatives to radio wave communication [1, 2]. Since VLC can be a solution for the broadband home network, the IEEE 802.15 Working Group for Wireless Personal Area Networks developed a standard for VLC [3]. For efficient illumination and high-speed transmission, several modulation schemes and LED lamp arrangements have been studied [4-7]. In recent years, the utilization of VLC for indoor localization has received attention from many researchers. The advantages of utilizing VLC for an indoor positioning system are that the existing lighting infrastructure can be reused and that the achievable accuracy does not depend on the transmission bandwidth.

    Sertthin et al. [8] developed an indoor positioning system based on visible light communication identification (VLID) and a 6-axis sensor. Their switching estimated receiver position (SwERP) scheme yields high accuracy even with a wide field-of-view (FOV); it improves the position accuracy by compensating for the estimated distance error, which varies in proportion to the receiver's tilt angle. An indoor positioning and communication platform using fluorescent light communication and a nine-channel receiver has also been proposed [9]. Location estimation in this system requires at least three fluorescent lamps to be detected by the photo sensor, which limits its applicability to indoor positioning. Indoor positioning schemes based on VLC and image sensors have also been researched [10-13]. High-accuracy indoor positioning using visible LED lights and an image sensor has been proposed [10]. Each LED in a lighting fixture sends a differential three-dimensional space coordinate, and an image sensor receives the signals. This scheme uses a collinearity condition to relate the three-dimensional space coordinates of the LEDs to the two-dimensional coordinates on the image sensor. Indoor positioning of vehicles using an active optical infrastructure has been proposed [11]. That positioning system consists of a network of beacons which transmit position signals via infrared channels to multiple mobile receivers. Each mobile receiver detects the contents, transmission direction, and angle of arrival of the position signal using a receiver composed of a photo detector and an image sensor device. Indoor positioning using lighting LEDs and two image sensors has been proposed [12]. In this scheme, four LEDs transmit their three-dimensional coordinate information, and a mobile receiver with two image sensors demodulates the information and calculates the indoor position from the geometrical relations of the LED images formed on the image sensors. A highly precise indoor positioning algorithm using lighting LEDs, an image sensor, and VLC has been proposed [13]. In this scheme, three LEDs transmit their three-dimensional coordinate information, which is received and demodulated by a single image sensor at an unknown position.

    In this paper, we present a novel indoor positioning system using lighting LEDs and a dual image sensor. The lighting LEDs transmit their three-dimensional position information using VLC technology, and a mobile receiver with a dual image sensor performs the indoor positioning. The proposed indoor positioning algorithm requires position information from only two LEDs and provides the three-dimensional position and azimuth angle of the mobile receiver. The rest of this paper is organized as follows. Section II presents an indoor position estimation algorithm using two LEDs and two image sensors; the algorithm provides the three-dimensional position and the yaw angle of the mobile receiver. Section III shows simulation results for the proposed indoor positioning algorithm and for the conventional method with vector estimation; in particular, the mean absolute positioning error and the azimuth angle error are considered. Section IV shows experimental results for the developed indoor positioning system. Finally, conclusions are drawn in Section V.

    II. INDOOR POSITION ESTIMATION

    We consider the indoor position estimation model shown in Fig. 1. Each LED broadcasts its position information using VLC technology, and a mobile receiver with a dual image sensor is located at the bottom level. The variables are defined as follows: PA and PB are the three-dimensional position vectors of the LEDs; PR is the center of the mobile receiver, which is the midpoint between the two image sensors; u is the horizontal axis and v is the vertical axis on the image sensor; (ua1, va1) and (ua2, va2) are the projected local positions of LED PA on image sensor 1 and image sensor 2, respectively; (ub1, vb1) and (ub2, vb2) are the projected local positions of LED PB on image sensor 1 and image sensor 2, respectively; L is the distance between the centers of the two lenses; and f is the focal length of the lenses.
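
    To keep the geometry concrete, the following is a minimal Python sketch of the quantities defined above. The class and field names (LedLamp, ReceiverGeometry, Projection) are illustrative choices of ours and do not appear in the paper; they simply collect the symbols PA, PB, L, f, and (u, v) in one place.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class LedLamp:
        """An LED that broadcasts its own world position via VLC."""
        position: Tuple[float, float, float]   # P_A or P_B in world coordinates (m)

    @dataclass
    class ReceiverGeometry:
        """Fixed parameters of the dual-image-sensor mobile receiver."""
        baseline: float       # L, distance between the two lens centers (m)
        focal_length: float   # f, focal length of both lenses (m)

    @dataclass
    class Projection:
        """Projected local position of one LED on one image sensor."""
        u: float   # horizontal image-plane coordinate
        v: float   # vertical image-plane coordinate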

    For indoor position estimation, the projected position vectors of the LEDs on the dual image sensor are first identified using VLC technology. The proposed indoor position estimation scheme then proceeds as follows. First, the rotation angle αr of the mobile receiver about the z-axis is calculated using the projected position vectors of two selected LEDs. Second, the directional distance vector PR - PA between the position vector PA of an LED and the position vector PR of the mobile receiver is estimated. Finally, the indoor position of the mobile receiver is estimated using the three-dimensional position vector PA and the estimated directional distance vector PR - PA.

       2.1. Rotation Angle Estimation

    We assume that identification of the projected LED points on the dual image sensor has already been performed. The projected two-dimensional position vectors of two LEDs on an image sensor are shown in Fig. 2. If more than two projected LED points appear on an image sensor, any two points can be selected for the rotation angle estimation. Let LED A and LED B be the selected LEDs; their projected points are (ua, va) and (ub, vb), respectively. The vector dAB from the projected point of LED A to that of LED B and its rotation angle θ in the receiver coordinate system are calculated as follows:

    $$\vec{d}_{AB} = \big(u_b - u_a,\;\, v_b - v_a,\;\, 0\big)^T \qquad\qquad (1)$$
    $$\theta = \mathrm{atan2}\big(v_b - v_a,\; u_b - u_a\big) \qquad\qquad (2)$$

    where atan2(y, x) computes tan⁻¹(y / x) but uses the signs of both x and y to determine the quadrant in which the resulting angle lies. Because the mobile receiver knows the global position vectors PA = (xA, yA, zA)T and PB = (xB, yB, zB)T of LED A and LED B from the VLC broadcast, the rotation angle ϕ of the vector PB - PA in the global coordinate system is calculated as follows:

    $$\phi = \mathrm{atan2}\big(y_B - y_A,\; x_B - x_A\big) \qquad\qquad (3)$$

    Figure 3 shows LED mapping and the global azimuth angle on an image sensor. The rotation angle of the receiver coordinate system from the world coordinate system is obtained as

    $$\alpha_r = \phi - \theta \qquad\qquad (4)$$

    The rotation matrix R(αr), which rotates a vector by αr about the z-axis, transforms dAB from the receiver coordinate system into the world coordinate system, and the cross product of the two parallel vectors R(αr)dAB and PB - PA can be expressed as

    $$\big[R(\alpha_r)\,\vec{d}_{AB}\big] \times \big(P_B - P_A\big) = \vec{0} \qquad\qquad (5)$$

    From (5), the trigonometric equation is expressed as

    $$a\,\sin\alpha_r - b\,\cos\alpha_r = 0 \qquad\qquad (6)$$

    with

    $$a = (u_b - u_a)(x_B - x_A) + (v_b - v_a)(y_B - y_A), \qquad b = (u_b - u_a)(y_B - y_A) - (v_b - v_a)(x_B - x_A)$$

    Because the z-axis of the receiver coordinate system is parallel to the z-axis of the global coordinate system, we can estimate the rotation angle of the mobile receiver as

    $$\hat{\alpha}_r = \mathrm{atan2}\big(b,\; a\big) \qquad\qquad (7)$$
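
    As a concrete illustration of Eqs. (1)-(7), the following Python sketch (the function name estimate_rotation_angle is ours, not from the paper) computes the receiver rotation angle from the two projected points and the two broadcast LED positions. The closed form of Eq. (7) is used because atan2 resolves the quadrant automatically.

    import math

    def estimate_rotation_angle(ua, va, ub, vb, PA, PB):
        """Rotation angle of the receiver about the z-axis, following Eqs. (1)-(7).

        (ua, va), (ub, vb): projected points of LED A and LED B on one image sensor.
        PA, PB: broadcast world positions (x, y, z) of LED A and LED B.
        """
        du, dv = ub - ua, vb - va              # Eq. (1): vector between projected points
        dx, dy = PB[0] - PA[0], PB[1] - PA[1]  # horizontal components of P_B - P_A
        theta = math.atan2(dv, du)             # Eq. (2): angle in the receiver frame
        phi = math.atan2(dy, dx)               # Eq. (3): angle in the world frame
        # Eq. (4) states alpha_r = phi - theta; Eqs. (5)-(7) give the same angle
        # through the cross-product condition, with atan2 handling the quadrant.
        a = du * dx + dv * dy
        b = du * dy - dv * dx
        return math.atan2(b, a)                # Eq. (7)

    For example, estimate_rotation_angle(0, 0, 1, 0, (0, 0, 2.6), (0, 1, 2.6)) returns π/2 (about 1.571), i.e. the receiver frame is rotated 90 degrees with respect to the world frame.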

       2.2. Indoor Position Estimation

    We assume that LED A is used for the position estimation of the mobile receiver. Let (x, y, z) be the unknown position vector of LED A in the receiver coordinate system, let (u1, v1) be the projected position of the LED on the sensor plane of sensor 1, and let (u2, v2) be the projected position of the LED on the sensor plane of sensor 2. Then the collinearity equations are given as follows:

    $$\frac{x + L/2}{u_1} = \frac{y}{v_1} = \frac{z - f}{-f} \qquad\qquad (8)$$
    $$\frac{x - L/2}{u_2} = \frac{y}{v_2} = \frac{z - f}{-f} \qquad\qquad (9)$$

    where (-L/2, 0, f) is the center of the lens of sensor 1 and (L/2, 0, f) is the center of the lens of sensor 2. The two lines defined by these collinearity equations intersect at (x, y, z), the position of the LED. Hence, the x-axis distance of LED A in the receiver coordinate system is given as follows:

    $$x = \frac{L\,(u_1 + u_2)}{2\,(u_1 - u_2)} \qquad\qquad (10)$$

    Equations (8) and (9) give y-axis and z-axis distances for LED A in the receiver coordinate system as follows:

    $$y = \frac{v_1}{u_1}\left(x + \frac{L}{2}\right), \qquad z = f - \frac{f}{u_1}\left(x + \frac{L}{2}\right) \qquad\qquad (11)$$
    $$y = \frac{v_2}{u_2}\left(x - \frac{L}{2}\right), \qquad z = f - \frac{f}{u_2}\left(x - \frac{L}{2}\right) \qquad\qquad (12)$$

    The y-axis and z-axis values are taken from whichever of Equations (11) and (12) has a non-zero denominator. Next, by premultiplying the position vector (x, y, z)T by the rotation matrix R(αr), we obtain the position vector PR of the mobile receiver in the world coordinate system as

    $$P_R = P_A - R(\hat{\alpha}_r)\,\big(x,\; y,\; z\big)^T \qquad\qquad (13)$$
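
    A compact Python sketch of the position computation in Eqs. (8)-(13) is given below; the function name estimate_receiver_position and the argument layout are ours, and the sign conventions follow the reconstruction of Eqs. (8)-(12) above.

    import math

    def estimate_receiver_position(u1, v1, u2, v2, PA, alpha_r, L, f):
        """Receiver world position P_R from one LED, following Eqs. (8)-(13).

        (u1, v1), (u2, v2): projections of LED A on image sensors 1 and 2.
        PA: broadcast world position (x, y, z) of LED A.
        alpha_r: estimated rotation angle about the z-axis, Eq. (7).
        L: distance between the two lens centers; f: focal length.
        """
        # Eq. (10): x-axis distance of LED A in the receiver coordinate system.
        # (u1 - u2) is the stereo disparity; it is non-zero for an LED at finite distance.
        x = L * (u1 + u2) / (2.0 * (u1 - u2))
        # Eqs. (11)/(12): use whichever sensor gives a non-zero denominator.
        if u1 != 0.0:
            y = v1 * (x + L / 2.0) / u1
            z = f - f * (x + L / 2.0) / u1
        else:
            y = v2 * (x - L / 2.0) / u2
            z = f - f * (x - L / 2.0) / u2
        # Eq. (13): rotate (x, y, z) into world coordinates and subtract from P_A.
        xw = math.cos(alpha_r) * x - math.sin(alpha_r) * y
        yw = math.sin(alpha_r) * x + math.cos(alpha_r) * y
        return (PA[0] - xw, PA[1] - yw, PA[2] - z)

    As in any stereo arrangement, the depth-related terms are the most sensitive to errors in the disparity u1 - u2, which is consistent with the larger z-axis errors reported in Sections III and IV.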

    III. SIMULATION

    The simulation of the proposed indoor positioning algorithm was performed using MATLAB. In our simulation, a 1 × 1 m² LED array is considered as the LED lighting and VLC transmitter. Simulation parameters such as the image sensor dimensions, the field of view (FOV) of the image sensor, the distance between the centers of the two lenses, and the positioning area of the mobile receiver are listed in Table 1. The positioning error is calculated as the Euclidean norm of the error vector:

    $$e = \left\| P_R - \hat{P}_R \right\|$$

    where $\hat{P}_R$ is the estimated position vector of the mobile receiver.

    [TABLE 1.] Simulation parameters

    Figure 4 shows the mean absolute positioning error of the proposed positioning algorithm and of the conventional positioning algorithm with vector estimation [12]. The number of pixels per line of the image sensor is increased from 600 to 3000 in steps of 200 pixels. For the conventional positioning algorithm with vector estimation, the mean absolute positioning errors for 1000, 1600, 2200, and 3000 pixels per line are 31.02 cm, 21.83 cm, 7.84 cm, and 2.13 cm, respectively. The positioning error decreases in a zigzag fashion as the number of pixels per line is increased, due to the quantization error at the image sensors [12]. With the proposed algorithm, the mean absolute positioning errors for 1000, 1600, 2200, and 3000 pixels per line are 2.49 cm, 1.41 cm, 1.43 cm, and 0.46 cm, respectively. Because of the quantization error, the proposed positioning algorithm also shows a zigzag dependence of the positioning error on the number of pixels per line. However, the positioning error of the proposed algorithm is greatly reduced in comparison with the conventional algorithm with vector estimation, which shows that the proposed algorithm is superior to the conventional method for indoor position estimation.
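
    The quantization effect discussed above can be mimicked with a short sketch (Python; the pixel-pitch model and helper names are our illustrative assumptions, not the paper's MATLAB code): projected coordinates are rounded to the nearest pixel center before being fed to the estimators, and the positioning error is the Euclidean norm defined in Section III.

    import math

    def quantize(coord_mm, sensor_size_mm, pixels_per_line):
        """Round an image-plane coordinate to the nearest pixel center."""
        pitch = sensor_size_mm / pixels_per_line   # pixel pitch in mm/pixel
        return round(coord_mm / pitch) * pitch

    def positioning_error(P_true, P_est):
        """Euclidean norm of the error vector between true and estimated positions."""
        return math.sqrt(sum((t - e) ** 2 for t, e in zip(P_true, P_est)))

    Averaging positioning_error over a grid of receiver positions, once per pixel count, yields a mean absolute positioning error curve of the kind plotted in Fig. 4.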

    Figure 5 shows the mean absolute positioning errors along the x, y, and z axes when the proposed positioning algorithm is applied. In this case, the azimuth angle of the mobile receiver is assumed to be zero. Due to quantization error, all of the positioning errors show a zigzag dependence on the number of pixels per line. When the number of pixels per line is 3000, the mean absolute positioning errors are 0.09 cm, 0.08 cm, and 0.44 cm along the x, y, and z axes, respectively. Hence, the errors along the x and y axes are much lower than the error along the z axis.

    Figure 6 shows the mean absolute positioning error of the proposed positioning algorithm versus the azimuth angle, which is increased from -180 degrees to 175 degrees in steps of 5 degrees. The positioning error shows a random distribution over the azimuth angle, which means that the positioning error has no dependence on the azimuth angle. We also observe that the mean absolute positioning error and its range of variation are reduced as the number of pixels per line is increased.

    IV. EXPERIMENT

    We developed an indoor positioning system and demonstrated the proposed indoor positioning algorithm. Figure 7 shows the model room used for the indoor positioning experiment. The height of the model room is 2.6 m. Two LED lamps are mounted on the ceiling, separated by 1 m. Each LED lamp is assembled from 5 × 5 white LEDs, and the diameter of each LED is 5 mm. Each LED lamp is modulated for lighting and VLC, and the VLC transmitter of each LED lamp performs on-off keying (OOK) modulation to broadcast its position information. A mobile receiver is located at the bottom of the model room, and 7 × 7 points are selected for the indoor positioning experiment. The mobile receiver, developed for VLC reception, is composed of two Logitech 310 webcams. The dimensions of each image sensor are 3.2 × 2.4 mm², and the distance between the centers of the two lenses is 10 cm. In this experiment, the two LED lamps broadcast their position information using VLC technology, and identification of the projected LED lamps on the dual image sensor is performed. Some experimental parameters are listed in Table 2.
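
    The paper does not give the VLC frame format used by the lamps. Purely as an illustration of OOK broadcasting of position data, the Python sketch below packs a lamp's coordinates into a bit sequence with a hypothetical header and field width; none of these choices are taken from the paper.

    def ook_position_frame(x_cm, y_cm, z_cm, header="10101010"):
        """Illustrative OOK frame: header followed by three 16-bit coordinates in cm.

        The header pattern, field width, and units are assumptions made for this
        sketch; the paper only states that each lamp broadcasts its position
        using on-off keying (OOK).
        """
        fields = [int(x_cm) & 0xFFFF, int(y_cm) & 0xFFFF, int(z_cm) & 0xFFFF]
        bits = header + "".join(format(value, "016b") for value in fields)
        return [int(b) for b in bits]   # 1 = LED on, 0 = LED off for one bit period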

    [TABLE 2.] Experimental parameters

    Figure 8 shows the mean absolute positioning error of the developed indoor positioning system at five resolutions: 160 × 120, 320 × 240, 640 × 480, 800 × 600, and 1280 × 960. The simulation errors along the x, y, and z axes at 160 × 120 resolution are 1.45 cm, 1.03 cm, and 22.3 cm, respectively, while the experimental errors at the same resolution are 5.4 cm, 6.05 cm, and 1.33 m. When the resolution is 1280 × 960, the simulation errors along the x, y, and z axes are 0.07 cm, 0.02 cm, and 0.32 cm, respectively, and the experimental errors are 2.26 cm, 5.51 cm, and 1.46 m. The experimental errors are much higher than the simulation results. This is because the position information is broadcast not by each LED but by each LED lamp; errors in finding the centers of the LED lamp images, measurement errors, and quantization errors all increase the experimental indoor positioning error. In particular, the z-axis error dominates the x-axis and y-axis errors, because the z-axis position estimate depends on the estimated x-axis and y-axis values.

    V. CONCLUSION

    In this paper, we proposed an indoor position estimation algorithm using lighting LEDs and a dual image sensor. Each lighting LED broadcasts its position information using VLC technology, and a mobile receiver with a dual image sensor estimates its three-dimensional position and azimuth angle from the broadcast positions of the two LEDs. Using MATLAB simulations, we showed that the proposed indoor position estimation method gives more accurate positioning results than the conventional method with vector estimation.

    Next, we developed an indoor positioning system and demonstrated its performance. The experimental results show that the mean absolute positioning error along the z-axis dominates the x-axis and y-axis errors. Future work will focus on reducing the z-axis positioning error of the proposed indoor positioning system.

REFERENCES
  • 1. Elgala H., Mesleh R., Haas H. (2011) "Indoor optical wireless communication: potential and state-of-the-art," [IEEE Commun. Mag.] Vol.49 P.56-62
  • 2. Komine T., Nakagawa M. (2004) "Fundamental analysis for visible-light communication system using LED lightings," [IEEE Trans. Consum. Electron.] Vol.50 P.100-107
  • 3. (2011) IEEE Standard P802.15.7, Short-Range Wireless Optical Communication Using Visible Light
  • 4. Grubor J., Randel S., Langer K.-D., Walewski J. W. (2008) "Broadband information broadcasting using LED-based interior lighting," [IEEE J. Lightwave Technol.] Vol.26 P.3883-3892
  • 5. Moon H.-D., Jung S.-Y. (2012) "Multi-coded variable PPM for high data rate visible light communications," [J. Opt. Soc. Korea] Vol.16 P.107-114
  • 6. Choi S.-I. (2012) "New type of white-light LED lighting for illumination and optical wireless communication under obstacles," [J. Opt. Soc. Korea] Vol.16 P.203-209
  • 7. Hong-gang H., Dan-dan Z., Shuai T. (2014) "Analysis of the LED lamp arrangement for uniformity of illumination in indoor VLC system," [J. Opt. Soc. Korea] Vol.18 P.663-671
  • 8. Sertthin C., Ohtsuki T., Nakagawa M. (2010) "6-axis sensor assisted low complexity high accuracy-visible light communication based indoor positioning system," [IEICE Trans. Commun.] Vol.E93-B P.2879-2891
  • 9. Liu X., Makino H., Mase K. (2010) "Improved indoor location estimation using fluorescent light communication system with a nine-channel receiver," [IEICE Trans. Commun.] Vol.E93-B P.2936-2944
  • 10. Yoshino M., Haruyama S., Nakagawa M. (2008) "High-accuracy positioning system using visible LED lights and image sensor," [Proc. IEEE Radio and Wireless Symposium] P.439-442
  • 11. Heissmeyer S., Overmeyer L., Muller A. I. (2012) "Indoor positioning of vehicles using an active optical infrastructure," [Proc. Int. Conf. on Indoor Positioning and Indoor Navigation] P.1-8
  • 12. Rahman M. S., Kim K.-D. (2013) "Indoor location estimation using visible light communication and image sensors," [International Journal of Smart Home] Vol.7 P.99-114
  • 13. Hossen M. S., Park Y., Kim K.-D. (2015) "Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication," [Opt. Eng.] Vol.54 P.1-11
Images / Tables
  • [FIG. 1.] Indoor position estimation model.
  • [FIG. 2.] Projected LEDs on an image sensor.
  • [FIG. 3.] LED mapping and the global azimuth angle.
  • [TABLE 1.] Simulation parameters.
  • [FIG. 4.] Mean absolute positioning error of the conventional and proposed algorithms.
  • [FIG. 5.] Mean absolute positioning error along the three different axes.
  • [FIG. 6.] Mean absolute positioning error along the rotation angle.
  • [FIG. 7.] Experimental model for indoor positioning.
  • [TABLE 2.] Experimental parameters.
  • [FIG. 8.] Mean absolute positioning error of the developed indoor positioning system.