Depth Resolution Analysis of Axially Distributed Stereo Camera Systems under Fixed Constrained Resources
  • License: CC BY-NC (non-commercial)
ABSTRACT
KEYWORDS
3D imaging, Axially distributed stereo sensing, Depth resolution, Elemental images
    I. INTRODUCTION

    Three-dimensional (3D) imaging techniques have been considered an important issue in computer vision, target tracking, object recognition, and so on [1-5]. Various methods for capturing and visualizing 3D objects in space have been studied [6-14], including integral imaging, synthetic aperture integral imaging (SAII), and axially distributed image sensing (ADS). Among them, an extended version of ADS, called axially distributed stereo sensing (ADSS), was reported, which is implemented using a stereo camera [14]. In this method, a stereo camera is translated along its optical axis to obtain multiple image pairs, and the computational reconstruction is implemented by the uniform superposition of the resized elemental image pairs. This solves the problem that, in the conventional ADS system, the collection of 3D information is not uniform across the sensor.

    Recently, resolution analysis methods for various 3D imaging systems under equally constrained resources have been reported [15-18]. In 2012, the N-ocular imaging system was first analyzed using a two-point-source resolution criterion. Here, the lateral and longitudinal resolutions of N-ocular imaging systems were analyzed in terms of several factors, such as the number of sensors, pixel size, imaging optics, relative sensor configuration, and so on [15]. In 2013, we proposed a resolution analysis of the ADS method based on the two-point-source resolution criterion and presented the analysis results for the system parameters [18].

    In this paper, we propose a new framework for the performance evaluation of ADSS systems under equally constrained resources. The constrained parameters are a fixed total number of pixels, a fixed moving distance, a fixed pixel size, and so on. For the resolution analysis of the proposed ADSS framework, we use the two-point-source resolution criterion [15], and we evaluate the depth resolution through Monte Carlo simulations.

    II. REVIEW OF ADSS

    In general, the ADSS method is composed of a pickup part and a digital reconstruction part [14]. Figure 1 shows the overall structure of the ADSS system. In the pickup part of the ADSS, as shown in Fig. 1, we record an elemental image pair using a stereo camera whose two cameras are separated by a baseline b. Then, we capture multiple elemental image pairs by moving the stereo camera along its optical axis.

    With the recorded multiple elemental image pairs, we can generate 3D sliced images in the digital reconstruction part of the ADSS, which is shown in Fig. 2. The digital reconstruction of 3D objects is the inverse of the ADSS pickup process. It can be implemented on the basis of an inverse mapping procedure through a virtual pinhole model, as shown in Fig. 2.

    Let us assume that the first pinhole is located at z=0 and that the k-th pinhole is located at z=(k-1)Δz in Fig. 2. The k-th elemental image pair consists of a left image and a right image. Then, Rk(x, y, zr) is the inversely mapped image of the k-th elemental image pair through the k-th pinhole at the reconstruction image plane zr, and it becomes

    where Mk = [zr - (k-1)Δz]/g is the magnification factor and g is the gap between the virtual pinhole and the elemental image plane. Equation (1) represents the uniform superposition of the two resized elemental images. The final reconstructed 3D image at distance zr is obtained by summing all of the resized elemental images. That is,
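    Since Eqs. (1) and (2) are not reproduced above, a minimal LaTeX sketch consistent with the surrounding description (uniform superposition of the two resized elemental images of each pair, then summation over all K axial positions) is given below. The notation E_k^L and E_k^R for the left and right elemental images, the 1/2 and 1/K normalizations, and the omitted baseline-dependent shift are assumptions; the exact forms follow the original paper [14].

```latex
% Hedged sketch of Eqs. (1) and (2); notation and normalization are assumptions.
\begin{equation}
  R_k(x,y,z_r) = \frac{1}{2}\left[
      E_k^{L}\!\left(\frac{x}{M_k},\frac{y}{M_k}\right)
    + E_k^{R}\!\left(\frac{x}{M_k},\frac{y}{M_k}\right)\right],
  \qquad M_k = \frac{z_r-(k-1)\Delta z}{g}
\end{equation}

\begin{equation}
  R(x,y,z_r) = \frac{1}{K}\sum_{k=1}^{K} R_k(x,y,z_r)
\end{equation}
```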

    In fact, the conventional ADS system cannot capture the correct 3D information near the optical axis. However, the ADSS system can overcome this problem by generating 3D images near the optical axis.

    III. RESOLUTION ANALYSIS FOR ADSS SYSTEM

    Figure 3 shows the general framework of the ADSS system with a stereo camera located at K different positions. In this framework, we want to use equally constrained resources regardless of the number of stereo camera positions. To do so, let us assume that the total number of pixels (S), the pixel size (c), and the moving range (D) are fixed. Thus, if a total of S pixels is available across the image sensors, the number of pixels allocated to each stereo camera becomes S/K. Also, we assume that the imaging lenses are identical, with the same focal length (f) and diameter (A).

    In the ADSS framework shown in Fig. 3, the stereo camera can be placed at K different positions, and the moving range is limited to D. When K=1, a conventional stereo system is constructed, in which two cameras each with a high resolution of S/2 pixels are used. When K>>1, an ADSS system is constructed, in which a multiple-camera structure is implemented with each camera having a low resolution of S/(2K) pixels.
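    As a concrete illustration of this fixed-resource split (the numeric values below are assumptions chosen only for demonstration, not the paper's settings), a short Python sketch:

```python
# Illustrative resource split under fixed constraints; S and D are assumed values.
S = 2_000_000        # total pixel budget shared by all cameras (fixed)
D = 100.0            # total moving range of the stereo camera in mm (fixed)

for K in (1, 2, 5, 10):
    pixels_per_camera = S // (2 * K)          # each of the 2K cameras gets S/(2K) pixels
    dz = D / (K - 1) if K > 1 else 0.0        # uniform axial spacing within the fixed range D
    positions = [k * dz for k in range(K)]    # axial positions of the K stereo-camera stops
    print(f"K={K}: {pixels_per_camera} px/camera, positions={positions}")
```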

    In this paper, we analyze the resolution performance of the ADSS framework shown in Fig. 4. To do so, the resolution analysis method based on the two-point-source resolution criterion is utilized [15], and we modify this analysis method for the stereo camera case.

    For simplicity, we use one-dimensional notation. We assume that there are two closely spaced point sources in space, as shown in Fig. 4. We first find the mapping pixel index for the first point source, located at (x1, z1), as shown in Fig. 4. The point spread function (PSF) of the imaging lens for the first point source is recorded by the stereo camera (left and right cameras) located at the i-th position. We can then calculate the center position of the PSF of the imaging lens for the first point source in the i-th stereo camera. This becomes

    where f is the focal length of the imaging lens and the other two quantities are the positions of the left and right cameras at the i-th axial position. For the pixels onto which the point source is recorded in the left and right cameras, the corresponding pixel indexes are calculated by

    where ⌈·⌉ is the rounding operator.
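    Because the mapping equations themselves are not reproduced above, the following Python sketch illustrates the kind of pinhole-projection-plus-rounding computation described in this step; the function name, argument names, and sign conventions are assumptions, not the paper's notation.

```python
import math

def pixel_index(x_src, z_src, x_cam, z_cam, f, c):
    """Map a point source to a sensor pixel index with a pinhole model.

    x_src, z_src : lateral / axial position of the point source
    x_cam, z_cam : lateral / axial position of one camera (left or right)
    f            : focal length of the imaging lens
    c            : pixel size
    All names and sign conventions are illustrative assumptions.
    """
    u = f * (x_src - x_cam) / (z_src - z_cam)   # center position of the PSF on the sensor
    return math.ceil(u / c)                     # quantize with the rounding operator
```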

    Next, we calculate the unresolvable pixel area for the mapping pixels. The unresolvable pixel area is the region within which the two point sources cannot be separated, because they fall on a single pixel. When the second point source is located close to the first point source, the two point sources can be resolved if at least one sensor registers the second point source on a pixel adjacent to the pixel that registered the first PSF. However, if the second PSF falls on the same pixel that recorded the PSF of the imaging lens for the first point source, the two sources cannot be resolved. Based on this unresolvable area, we can calculate the depth resolution.

    Now let us consider the second point source, located at (x1, z2). Then, the possible mapping pixel ranges of the second point source for the left and right cameras are given by

    where δ = 1.22λf/A is the diffraction-limited spot size of the imaging lens.
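    As a sense of scale for the diffraction term (the parameter values below are illustrative assumptions, not the simulation conditions of Table 1):

```latex
% Worked evaluation of the diffraction-limited spot size (assumed values)
\delta = \frac{1.22\,\lambda f}{A}
       = \frac{1.22 \times (0.5\times 10^{-6}\,\mathrm{m}) \times (50\times 10^{-3}\,\mathrm{m})}
              {10\times 10^{-3}\,\mathrm{m}}
       \approx 3.05\ \mu\mathrm{m}
```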

    Then, we find the unresolvable pixel areas for the left and right cameras. The unresolvable pixel area of the second point source for each mapping pixel is calculated by back-projecting rays onto the plane of the two point sources, as shown in Fig. 5. The areas for the left and right cameras are given by the following equations, respectively.

    Finally, to calculate the depth resolution, we find the common area of the unresolvable pixel areas calculated from all of the stereo cameras. The depth resolution can be considered as the common intersection of all of the unresolvable pixel ranges. Thus, the depth resolution of the ADSS system becomes
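    Since the final expression is not reproduced above, the following short Python sketch shows the intersection step in computational form; it assumes the per-camera unresolvable ranges have already been obtained by back-projection onto the plane of the two point sources, and the function name is illustrative.

```python
def depth_resolution(unresolvable_ranges):
    """Common intersection of the per-camera unresolvable ranges along z.

    unresolvable_ranges : list of (z_min, z_max) tuples, one per camera
                          (left and right cameras at every axial position).
    Returns the length of the common unresolvable interval, i.e. the
    depth resolution; 0.0 if the intersection is empty.
    """
    lo = max(z_min for z_min, _ in unresolvable_ranges)
    hi = min(z_max for _, z_max in unresolvable_ranges)
    return max(0.0, hi - lo)
```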

    IV. MONTE CARLO SIMULATIONS AND RESULTS

    For the proposed ADSS framework, Monte Carlo simulations were performed on a computer, and the depth resolution was calculated statistically using the two-point-source resolution analysis method. In our Monte Carlo simulation, the two point sources were located longitudinally, far from the stereo camera, as shown in Fig. 4. The position of the first point source was selected at random, and the second point source was moved in the longitudinal direction. Table 1 shows the conditions of the Monte Carlo simulation for the proposed ADSS system.

    [TABLE 1.] Experimental conditions for the Monte Carlo simulation

    We selected random locations of the two point sources over 4,000 trials for the Monte Carlo simulations, and the depth resolution for each experiment was calculated as the sample mean. We carried out the simulations of the depth resolution of the ADSS frameworks for several parameters. The first experiment is the calculation of the depth resolution according to the number of cameras; the result is shown in Fig. 6. The results are plotted according to the distance (x) between the optical axis and the point source plane. From these results, we observe that the depth resolution improves as x becomes larger. We also calculated the depth resolution when x=0 (the on-axis case). A finite depth resolution exists along the optical axis in the ADSS method because 3D perspective information can be recorded by either the left or the right camera, whereas no depth resolution exists on the optical axis in the conventional ADS system. As shown in Fig. 6, the depth resolution improves as the number of stereo cameras increases, because the use of many cameras is more effective for resolving two point sources in space. This means that using more low-resolution stereo cameras may be more useful for capturing 3D objects, in terms of depth resolution, than using a single high-resolution stereo camera.
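    For concreteness, the sketch below outlines one Monte Carlo trial of the two-point-source criterion described in Section III. The parameter names, default values, pinhole projection, and linear search over the second source position are illustrative assumptions rather than the paper's exact procedure; the diffraction term δ is omitted for brevity.

```python
import math
import random

def one_trial(K, D, b, f, c, x, z_range=(1000.0, 2000.0), dz_step=0.1):
    """One Monte Carlo trial of the two-point-source criterion (illustrative sketch).

    Returns the smallest longitudinal separation at which at least one of the 2K
    cameras registers the second point source on a different pixel than the first.
    Parameter names, defaults, and the linear search are assumptions.
    """
    def pix(x_src, z_src, x_cam, z_cam):
        # Pinhole projection of the point source, then the rounding operator.
        return math.ceil(f * (x_src - x_cam) / (z_src - z_cam) / c)

    z1 = random.uniform(*z_range)                                  # random first point source
    cam_z = [k * (D / (K - 1)) if K > 1 else 0.0 for k in range(K)]
    cams = [(sx, zc) for zc in cam_z for sx in (-b / 2, +b / 2)]   # left/right at each stop

    dz = dz_step
    while all(pix(x, z1, sx, zc) == pix(x, z1 + dz, sx, zc) for sx, zc in cams):
        dz += dz_step
    return dz

# Sample mean over many trials, as in the 4,000-trial simulation described above:
# mean_dz = sum(one_trial(K=5, D=100.0, b=50.0, f=50.0, c=0.005, x=100.0)
#               for _ in range(4000)) / 4000
```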

    Figure 7 shows the depth resolution results according to the number of cameras and the focal length of the imaging lens when x=0 mm and x=100 mm. The depth resolution improves when a large number of cameras and a long focal length are used. The calculated depth resolutions when the camera pixel size, the total number of camera pixels, and the moving range of the stereo cameras are varied are shown in Figs. 8-10. An improved depth resolution is obtained when a small pixel size and a large moving range are used, as shown in Figs. 8 and 9, in which we can see large variations of the depth resolution when N=2. This is because the depth resolution was calculated based on two-pixel information from a single stereo camera. This variation can be reduced by using a wider range of positions for the two point sources. However, the total number of sensor pixels is not related to the depth resolution, as shown in Fig. 10.

    V. CONCLUSION

    In conclusion, we have analyzed the depth resolution of various ADSS frameworks under fixed, constrained resources. To evaluate the system performance of ADSS, we considered system parameters including the number of cameras, the number of pixels, the pixel size, and the focal length. The computational simulations show that the depth resolution of an ADSS system can be improved with a large number of cameras and a large distance between the optical axis and the point source plane. The proposed analysis method may be a promising tool for designing practical ADSS systems.

REFERENCES
  • 1. Okoshi T. 1976, Three-dimensional Imaging Techniques.
  • 2. Ku J.-S., Lee K.-M., Lee S.-U. 2001, "Multi-image matching for a general motion stereo camera model," Pattern Recognition, Vol. 34, pp. 1701-1712.
  • 3. Stern A., Javidi B. 2006, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE, Vol. 94, pp. 591-607.
  • 4. Cho M., Javidi B. 2008, "Three-dimensional tracking of occluded objects using integral imaging," Opt. Lett., Vol. 33, pp. 2737-2739.
  • 5. Yeom S.-W., Woo Y.-H., Baek W.-W. 2011, "Distance extraction by means of photon-counting passive sensing combined with integral imaging," J. Opt. Soc. Korea, Vol. 15, pp. 357-361.
  • 6. Stern A., Javidi B. 2003, "Three-dimensional image sensing and reconstruction with time-division multiplexed computational integral imaging," Appl. Opt., Vol. 42, pp. 7036-7042.
  • 7. Arimoto H., Javidi B. 2001, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett., Vol. 26, pp. 157-159.
  • 8. Lee J.-J., Shin D., Lee B.-G., Yoo H. 2012, "3D optical microscopy method based on synthetic aperture integral imaging," 3D Research, Vol. 3, p. 2.
  • 9. Navarro H., Barreiro J. C., Saavedra G., Martinez-Corral M., Javidi B. 2012, "High-resolution far-field integral-imaging camera by double snapshot," Opt. Express, Vol. 20, pp. 890-895.
  • 10. Yoo H. 2013, "Axially moving a lenslet array for high-resolution 3D images in computational integral imaging," Opt. Express, Vol. 21, pp. 8873-8878.
  • 11. Schulein R., DaneshPanah M., Javidi B. 2009, "3D imaging with axially distributed sensing," Opt. Lett., Vol. 34, pp. 2012-2014.
  • 12. Shin D., Javidi B. 2011, "3D visualization of partially occluded objects using axially distributed sensing," J. Disp. Technol., Vol. 7, pp. 223-225.
  • 13. Hong S.-P., Shin D., Lee B.-G., Kim E.-S. 2012, "Depth extraction of 3D objects using axially distributed image sensing," Opt. Express, Vol. 20, pp. 23044-23052.
  • 14. Shin D., Javidi B. 2012, "Three-dimensional imaging and visualization of partially occluded objects using axially distributed stereo image sensing," Opt. Lett., Vol. 37, pp. 1394-1396.
  • 15. Shin D., Daneshpanah M., Javidi B. 2012, "Generalization of 3D N-ocular imaging systems under fixed resource constraints," Opt. Lett., Vol. 37, pp. 19-21.
  • 16. Shin D., Javidi B. 2012, "Resolution analysis of N-ocular imaging systems with tilted image sensors," J. Display Technol., Vol. 8, pp. 529-533.
  • 17. Cho M., Javidi B. 2012, "Optimization of 3D integral imaging system parameters," J. Display Technol., Vol. 8, pp. 357-360.
  • 18. Cho M., Shin D. 2013, "Resolution analysis of axially distributed image sensing systems under equally constrained resources," J. Opt. Soc. Korea, Vol. 17, pp. 405-409.
FIGURES AND TABLES
  • [ FIG. 1. ]  Pickup process of the ADSS system.
  • [ FIG. 2. ]  Digital reconstruction process of the ADSS system.
  • [ FIG. 3. ]  Framework for ADSS with K stereo cameras.
  • [ FIG. 4. ]  Calculation diagram of the pixel position of the first point source for each camera.
  • [ FIG. 5. ]  Calculation diagram of the unresolvable pixel area.
  • [ TABLE 1. ]  Experimental conditions for the Monte Carlo simulation.
  • [ FIG. 6. ]  Simulation results according to the number of cameras, for various distances between the optical axis and the point source plane.
  • [ FIG. 7. ]  Simulation results according to the focal length of the imaging lens. (a) x=0 mm, (b) x=100 mm.
  • [ FIG. 8. ]  Simulation results according to the pixel size of the camera. (a) x=0 mm, (b) x=100 mm.
  • [ FIG. 9. ]  Simulation results according to the moving range of the stereo camera. (a) x=0 mm, (b) x=100 mm.
  • [ FIG. 10. ]  Simulation results according to the total pixel number of the stereo camera. (a) x=0 mm, (b) x=100 mm.