Precision Evaluation of Three-dimensional Feature Points Measurement by Binocular Vision
  • CC BY-NC (non-commercial)
KEYWORDS
Camera calibration, 3D reconstruction, Binocular vision, Accuracy evaluation, (100.6890) Three-dimensional image processing, (110.0110) Imaging systems, (120.0120) Instrumentation, measurement, and metrology
I. INTRODUCTION

    Visual navigation, robotics, and vision-based measurement are just a few examples of industrial applications that depend on pose estimation and computation of three-dimensional (3D) object locations [1-7]. In a 3D reconstruction process based on an optical technique, system accuracy is determined by how precisely 3D point positions can be computed from 2D features gathered by two calibrated cameras. To achieve high system performance, both camera calibration and reconstruction must be carried out accurately.

    In the past, many researchers have developed algorithms for camera calibration and reconstruction, which remain open topics in computer vision. Currently, several major calibration techniques are available. The most popular camera calibration method is the direct linear transformation (DLT) method originally reported by Abdel-Aziz and Karara [8]. The DLT method uses a set of control points whose object-space coordinates are already known; the control points are normally fixed to a rigid calibration frame. The flexibility of DLT-based calibration often depends on how easy the calibration frame is to handle. The main problem of the DLT method is that the calibration parameters are not mutually independent. An alternative approach was reported by Hatze to ensure the orthogonality of the rotation matrix [9]. The direct nonlinear minimization technique builds the camera parameters directly, minimizing the residual error by iterative computation [10, 11]. The intermediate parameters can be computed by solving linear equations; however, lens distortions cannot be handled by this method [12, 13]. Lenz [14] and Tsai [15, 16] introduce an improved, two-step calibration method that computes the distortion and skew parameters. To make calibration more convenient and to avoid requiring 3D coordinates, Svoboda [17] made the technique more robust. Based on the analysis of several images of a 2D calibration board, usually with a chessboard pattern, Z. Zhang [18, 19] describes an efficient method to improve the accuracy of camera calibration based on Tsai's model. A similar technique is explained by Kato [20], who focuses on retrieving the camera location using fiducial markers, which are placed in the 3D environment as squares of known size with high-contrast patterns in the centre. K. C. Kwon [21] proposes a binocular stereoscopic camera vergence control method that uses disparity information obtained by image processing and estimates the required amount of vergence control.

    For evaluating calibration accuracy, K. Zhang [22] explores a model of the binocular vision system focused on 3D reconstruction and describes an improved genetic algorithm for estimating the camera system parameters. To enhance calibration accuracy, many corners, distributed uniformly on the calibration block, should be treated as feature points. W. Sun [23] presents a study investigating the effects of training data quantity and pixel coordinate noise on camera calibration accuracy. H. H. Cui [24] discusses an improved method for accurate calibration of a measurement system, in which the system accuracy is improved by accounting for the nonlinear measurement error. Independent of the computed lens distortion model or the number of camera parameters, C. Ricolfe-Viala [25] outlines a metric calibration technique that estimates the camera lens distortion separately from the camera calibration process. An accurate phase-height mapping algorithm is proposed by Z. W. Li [26] to improve the performance of a structured light system with digital fringe projection; by training a network, the relationship between the 2D image coordinates and the 3D object coordinates can be obtained. However, these experiments involve fixed system structure parameters and provide neither a synthetic evaluation on separate test data nor flexible binocular system parameters to verify the accuracy of the calibrated results.

    Considering the widely used DLT calibration method, this paper emphasizes the structural factors that influence the accuracy of a binocular measurement system. Baseline distance and measurement distance are evaluated on an experimental binocular system to reveal the influence of each factor. In the experiment, the 3D board calibration method is applied for both factors.

    II. MEASUREMENT ALGORITHM

       2.1. Camera Calibration

    The calibration method requires one set of 3D coordinates and two sets of 2D coordinates of grid-board corners imaged by the two cameras. The intrinsic and extrinsic parameters are then calculated by solving the projection equation

    $$s\,X_{2D} = M\,X_{3D} \qquad (1)$$

    where M is the matrix of camera parameters, i.e., the projective matrix; $X_{3D}$ and $X_{2D}$ are the homogeneous coordinates of corresponding 3D and 2D grid corners, respectively; and s is a non-zero scale factor. The parameter matrix M can be decomposed as

    $$M = K_1 K_2 \qquad (2)$$

    where K1 is the intrinsic parameter matrix and K2 is the extrinsic parameter matrix. 

    $$K_1 = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (3)$$

    where $f_x$ and $f_y$ are the focal lengths along the x and y axes of the image plane, in pixel units, and $u_0$ and $v_0$ are the x and y coordinates of the principal point in the image plane.

    $$K_2 = \begin{bmatrix} R_{3\times3} & T_{3\times1} \end{bmatrix}$$

    where R3×3 denotes the 3×3 rotation matrix and T3×1 the 3×1 translation vector between the camera and world coordinate systems. Although the model includes no distortion or skew parameters, it is simple and widely used for calibration.
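    As a concrete illustration, the following Python sketch estimates the projection matrix M of Eq. (1) by the DLT (stacking two linear constraints per correspondence and taking the smallest singular vector) and then splits M into K1 and K2 as in Eqs. (2) and (3). The function names and the use of scipy.linalg.rq for the factorization are our own choices, not part of the paper's implementation; at least six 3D-2D correspondences are required.

```python
# Minimal DLT calibration sketch for Eq. (1) and the decomposition M = K1*K2.
# Illustrative only; names and library choices are ours, not the paper's.
import numpy as np
from scipy.linalg import rq

def calibrate_dlt(pts3d, pts2d):
    """Estimate the 3x4 projection matrix M from >= 6 correspondences."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The smallest right singular vector of A solves A m = 0 up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def decompose(M):
    """Split M into the intrinsic matrix K1 and the extrinsic pair (R, T)."""
    K1, R = rq(M[:, :3])               # RQ factorization: M[:, :3] = K1 @ R
    S = np.diag(np.sign(np.diag(K1)))  # fix signs so the focal lengths are positive
    K1, R = K1 @ S, S @ R              # S @ S = I, so the product is unchanged
    T = np.linalg.solve(K1, M[:, 3])   # translation from the fourth column of M
    return K1 / K1[2, 2], R, T
```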

       2.2. 3D Reconstruction

    3D reconstruction is usually applied in binocular vision measurement, where the projection matrices of the two cameras are used to calculate the 3D world coordinates of a point viewed by both cameras. Suppose P1 and P2, detected in the two images and matched to each other, are the 2D projections of an arbitrary 3D point P in space. Then

    $$s_1 \begin{bmatrix} u_1 \\ v_1 \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad (4)$$

    $$s_2 \begin{bmatrix} u_2 \\ v_2 \\ 1 \end{bmatrix} = M_2 \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad (5)$$

    where (u1, v1, 1) and (u2, v2, 1) represent the homogeneous coordinates of P1 and P2 in the two images, respectively; (X, Y, Z, 1) is the homogeneous coordinate of the 3D point P; s1 and s2 are two non-zero scalars; and M1 and M2 are the projection matrices of the two cameras, obtained from camera calibration, which denote the two projective mappings from world coordinates to pixel coordinates.

    Eliminating the two scalars s1 and s2 from Eqs. (4) and (5) transforms them into the following linear equations, where $m^k_{ij}$ denotes the (i, j) entry of $M_k$:

    $$\begin{cases} (u_1 m^1_{31} - m^1_{11})X + (u_1 m^1_{32} - m^1_{12})Y + (u_1 m^1_{33} - m^1_{13})Z = m^1_{14} - u_1 m^1_{34} \\ (v_1 m^1_{31} - m^1_{21})X + (v_1 m^1_{32} - m^1_{22})Y + (v_1 m^1_{33} - m^1_{23})Z = m^1_{24} - v_1 m^1_{34} \end{cases} \qquad (6)$$

    $$\begin{cases} (u_2 m^2_{31} - m^2_{11})X + (u_2 m^2_{32} - m^2_{12})Y + (u_2 m^2_{33} - m^2_{13})Z = m^2_{14} - u_2 m^2_{34} \\ (v_2 m^2_{31} - m^2_{21})X + (v_2 m^2_{32} - m^2_{22})Y + (v_2 m^2_{33} - m^2_{23})Z = m^2_{24} - v_2 m^2_{34} \end{cases} \qquad (7)$$

    Since each linear equation represents a plane in space and two such equations together represent the intersection line of two planes, Eqs. (6) and (7) define two lines in 3D space, which intersect at the 3D point P(X, Y, Z, 1).
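    The reconstruction step can be sketched directly from Eqs. (4)-(7): each image point contributes two of the plane equations above, and the stacked homogeneous system is solved by SVD. This is a minimal sketch assuming M1 and M2 come from the calibration step above; the names are illustrative.

```python
# Linear triangulation sketch for Eqs. (6) and (7); illustrative names.
import numpy as np

def reconstruct(M1, M2, p1, p2):
    """Return the 3D point P = (X, Y, Z) from matched pixels p1, p2."""
    (u1, v1), (u2, v2) = p1, p2
    # Two plane equations per camera; the stacked rows encode the two
    # space lines that intersect at P.
    A = np.vstack([u1 * M1[2] - M1[0],
                   v1 * M1[2] - M1[1],
                   u2 * M2[2] - M2[0],
                   v2 * M2[2] - M2[1]])
    _, _, Vt = np.linalg.svd(A)   # least-squares solution of A @ P_h = 0
    P = Vt[-1]
    return P[:3] / P[3]           # dehomogenize (X, Y, Z, 1)
```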

    III. EXPERIMENTAL METHODS AND RESULTS

       3.1. Experimental Process

    The experiment is carried out in our laboratory, which is equipped with a vertical rack raised about 2 m above the ground, as shown in Fig. 1. Two DH-HW3102UC cameras with 8 mm focal-length lenses are placed along the rack in a line perpendicular to the horizontal plane. The image resolution is 2048 × 1536 pixels, with a frame rate of 6 frames/s and a pixel size of 3.2 μm × 3.2 μm. To investigate the influences of the baseline distance and measurement distance illustrated in Fig. 2, twenty-five configurations, combining baseline distances of 600 mm, 800 mm, 1000 mm, 1200 mm, and 1400 mm with measurement distances of 1000 mm, 1500 mm, 2000 mm, 2500 mm, and 3000 mm, are studied on the experimental setup.
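    For bookkeeping, the twenty-five configurations are simply the Cartesian product of the five baseline distances and the five measurement distances; a small driver loop of this form (a sketch, with the acquisition and evaluation steps left as placeholders) covers them all.

```python
# Enumerate the 5 x 5 grid of (baseline, measurement distance) configurations.
from itertools import product

baselines = [600, 800, 1000, 1200, 1400]    # mm
distances = [1000, 1500, 2000, 2500, 3000]  # mm

for L, d in product(baselines, distances):
    # Image capture and accuracy evaluation for each configuration go here.
    print(f"baseline {L} mm, measurement distance {d} mm")
```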

    The calibration data are generated by printing a chessboard pattern with 60 × 60 mm grid squares onto three 500 × 500 mm sheets, each attached to one face of a rigid cube. The binocular images captured by the two cameras are shown in Fig. 3 and Fig. 4. This produces a 2D coordinate data set of 27 points for each camera.

    The 3D world coordinates of these points are measured relative to the intersection point of the three chessboard sheets, i.e., the origin of the world reference system. The 3D coordinate data set covers the calibration cube surface with a 60 mm division. Because a normal ruler is accurate only to 1 mm, the calibration board error can reach a maximum of 0.5 mm, which is approximately 0.83% of the pattern size.
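    To make the layout of the world-coordinate set concrete, the sketch below generates 27 grid corners, nine per cube face at a 60 mm division, measured from the cube corner chosen as the world origin. The 3 × 3-per-face arrangement is our assumption inferred from the point counts above, not a detail stated in the paper.

```python
# Hypothetical layout of the 27 calibration points (9 per orthogonal face).
import numpy as np

step = 60.0                               # grid division in mm
g = np.arange(1, 4) * step                # 60, 120, 180 mm along each face axis

xy = [(x, y, 0.0) for x in g for y in g]  # face lying in the z = 0 plane
yz = [(0.0, y, z) for y in g for z in g]  # face lying in the x = 0 plane
xz = [(x, 0.0, z) for x in g for z in g]  # face lying in the y = 0 plane

world_points = np.array(xy + yz + xz)     # 27 x 3 array of 3D coordinates
assert world_points.shape == (27, 3)
```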

       3.2. Accuracy Evaluation Results

    The calibration result is a set of camera parameters. In most applications, these parameters are used in stereo computation to reconstruct the 3D coordinates of feature points on the measured objects. As the baseline distance, the measurement distance, and the measurement direction are the three essential factors for feature-point reconstruction precision, their influences should be evaluated through accuracy experiments on the stereo vision system.

    In this evaluation, with given camera parameters, stereo computation reconstructs the 3D coordinates of feature points on the calibration board from the 2D camera images. Since the measurement and baseline distances of a binocular system influence the system accuracy, the difference between the reconstructed 3D coordinate of a feature point and its original coordinate in space is defined as the stereo error. The stereo error is related to the camera parameters and is restricted by the system structure. To draw conclusions, we build a binocular system and choose a set of synthetic test points to show the stereo error distribution in 3D space. Furthermore, since different test points have different error values, we also analyze the error distribution and its scope to show the relationship between the stereo error and the positioning parameters of the binocular system. The identical calibration and reconstruction methods mentioned above are adopted to ensure that the assessment results are independent of the computation method; therefore, different measurement results are appraised with the same evaluation procedure.

    The stereo error varies when the camera positions are different. The 3D errors are defined as follows:

    $$E_x = x_{re} - x_{or} \qquad (8)$$
    $$E_y = y_{re} - y_{or} \qquad (9)$$
    $$E_z = z_{re} - z_{or} \qquad (10)$$
    $$E = \sqrt{E_x^2 + E_y^2 + E_z^2} \qquad (11)$$

    where $(x_{or}, y_{or}, z_{or})$ are the original coordinates of a feature point on the calibration board and $(x_{re}, y_{re}, z_{re})$ are the reconstructed coordinates of the feature point from the two camera images; $E_x$, $E_y$, and $E_z$ denote the measurement errors of the binocular system in the x, y, and z directions, respectively; and $E$ represents the comprehensive error.
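    Eqs. (8)-(11) translate directly into a short routine; a minimal sketch, assuming the original and reconstructed points are stored as N × 3 arrays in millimetres:

```python
# Stereo-error metrics of Eqs. (8)-(11).
import numpy as np

def stereo_errors(p_or, p_re):
    """Per-axis errors Ex, Ey, Ez and comprehensive error E (Eqs. (8)-(11))."""
    Ex, Ey, Ez = (p_re - p_or).T        # axis-wise differences, Eqs. (8)-(10)
    E = np.sqrt(Ex**2 + Ey**2 + Ez**2)  # comprehensive error, Eq. (11)
    return Ex, Ey, Ez, E
```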

    Figs. 5(a)-(y) show the level of accuracy obtained for a fixed baseline distance and a varying measurement distance. All the data are collected under similar conditions so that the experimental results can be compared.

    Within the error distributions in Figs. 5(a)-(y), as the measurement distance increases from 1000 mm to 3000 mm, the maximum error range grows from -2 mm to +2 mm up to -4 mm to +6 mm, and the error values become more dispersed. From these results, we conclude that the error value increases with the distance between the test object and the measurement system. However, when the baseline distance varies from 800 mm to 1400 mm at a fixed measurement distance (1000 mm, 1500 mm, 2000 mm, 2500 mm, or 3000 mm; take 3000 mm, for example), the error range narrows from -4 mm to +6 mm to less than -2 mm to +2 mm. Thus, increasing the baseline distance reduces the measurement error of the binocular camera system. Since the baseline distance is confined by the system dimensions and the error distribution concentrates within -2 mm to +2 mm once the baseline distance reaches 1000 mm, the optimal configuration for this system is a baseline distance of 1000 mm and a measurement distance of 1500 mm, which gives the minimum error value and distribution scope. For a given binocular measurement system, after a clear peak in the accuracy benefit from increasing the baseline distance, the system accuracy cannot be improved effectively any further. In addition, we notice that the measurement error in the z direction is normally smaller than in the x and y directions. This means that the principal measurement dimension should be arranged along the baseline direction of the two cameras.

       3.3. Experimental Results Discussion

    A model of the binocular measurement system is constructed in Fig. 6, where o and o′ represent the optical centres of the two cameras [27]. P1(X1, Y1) and P2(X2, Y2) are the projected 2D points of a measured 3D point p(x′, y′, z′); L is the baseline distance of the system; f1 and f2 are the focal lengths of the two cameras; α and β are the measurement angles of p(x′, y′, z′) relative to o and o′ in the o-xz′ plane; α0 and β0 are the angles between the optical axes and the ox axis; γ1 and γ2 are the horizontal viewing angles of P1(X1, Y1) and P2(X2, Y2); and θ1 and θ2 are their vertical viewing angles. The coordinates of p(x′, y′, z′) are obtained from the geometrical relationships as follows:

    $$x' = \frac{L\tan\beta}{\tan\alpha + \tan\beta}, \qquad y' = \frac{z'\tan\theta_1}{\sin\alpha}, \qquad z' = \frac{L\tan\alpha\tan\beta}{\tan\alpha + \tan\beta} \qquad (12)$$

    To evaluate the measurement error caused by baseline-distance variation, the error transfer function with respect to the baseline distance L can be defined as

    $$e(L) = \sqrt{\left(\frac{\partial x'}{\partial L}\right)^{2} + \left(\frac{\partial y'}{\partial L}\right)^{2} + \left(\frac{\partial z'}{\partial L}\right)^{2}} \qquad (13)$$

    Here, two typical situations are considered in this paper: α = β and β = 90°. According to the geometrical relationships in Fig. 6, the error transfer function can be expressed as

    $$e(L)\Big|_{\alpha=\beta} = \frac{\sqrt{L^{2}/4 + y'^{2} + z'^{2}}}{L}, \qquad e(L)\Big|_{\beta=90^{\circ}} = \frac{\sqrt{L^{2} + y'^{2} + z'^{2}}}{L} \qquad (14)$$

    where L is the baseline distance of the system and y′ and z′ are the coordinates of p(x′, y′, z′).

    According to the simulation results in Fig. 7, the value of the error transfer function decreases as the baseline distance L increases, and it is lower when the measurement distance z′ decreases. The simulation results are fully consistent with the experimental data.
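    A Fig. 7-style scan is easy to reproduce numerically. The sketch below evaluates the two closed forms of Eq. (14); since those expressions are our reconstruction of the garbled originals, the code should be read as illustrating the claimed trends (e(L) falls as L grows and rises with z′) under that assumption.

```python
# Error-transfer-function scan over baseline L for several measurement
# distances z'; implements our reconstructed Eq. (14), an assumption.
import numpy as np

def e_alpha_eq_beta(L, y, z):           # case alpha = beta
    return np.sqrt(L**2 / 4 + y**2 + z**2) / L

def e_beta_90(L, y, z):                 # case beta = 90 degrees
    return np.sqrt(L**2 + y**2 + z**2) / L

L = np.linspace(100.0, 1500.0, 8)       # baseline distances in mm (avoid L = 0)
for z in (1000.0, 2000.0, 3000.0):      # measurement distances in mm
    print(f"z' = {z:.0f} mm:", e_alpha_eq_beta(L, 0.0, z).round(2))
```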

    IV. CONCLUSION

    An experimental study on camera calibration and 3D reconstruction with a binocular measurement system is carried out to investigate how factors such as baseline distance, measurement distance, and baseline direction affect the reconstruction accuracy. The most representative method, developed by Abdel-Aziz and Karara, is chosen for the experiments on the 2D data from the cameras, and a typical criterion is applied to evaluate the measurement accuracy on the test sets.

    With the same reference calibration board, we set a series of test distances from 1000 mm to 3000 mm based on the same calibration and reconstruction methods. According to the comprehensive error, a smaller test distance with grid-board calibration yields more stable results than a larger one. Nevertheless, the measurement distance of a binocular system is determined by the detected object, the system's geometrical structure, and the field of view. Therefore, adjusting the measurement distance between the cameras and the detected object is an inefficient way to enhance accuracy when that distance is constrained by the system structure and camera characteristics. When the baseline distance of the binocular system is altered instead, the accuracy is most sensitive in the range of 600 mm to 1000 mm. Thus, to increase the accuracy of the binocular system, increasing the baseline distance matters more than adjusting the measurement distance. However, the experimental results indicate that beyond the accuracy peak, i.e., 1000 mm in this experiment, choosing an even larger baseline distance is unproductive.

    The stereo error evaluation shows that measurement results obtained in different directions have different valid ranges. Among the x, y, and z direction results, the z direction errors are, on average, smaller than those in the other two directions when the z direction coincides with the baseline direction.

    In summary, the methods for enhancing the accuracy of a binocular measurement system depend greatly on known geometric information such as the test object dimensions, the measurement distance, and the baseline distance and direction. Hence, it is crucial to select a reasonable system dimension and structure for binocular measurement in order to improve the precision of 3D feature measurement using binocular vision.

REFERENCES
  • 1. Kwon G. I., Choi Y. H., 2010, "Image-processing based panoramic camera employing single fisheye lens," J. Opt. Soc. Korea, Vol. 14, pp. 245-259.
  • 2. Tay C. J., Quan C., Huang Y. H., Fu Y., 2005, "Digital image correlation for whole field out-of-plane displacement measurement using a single camera," Opt. Comm., Vol. 251, pp. 23-36.
  • 3. Choi H. J., Park J. H., Hong J. S., Lee B., 2004, "An improved stereovision scheme using single camera and a composite lens array," J. Opt. Soc. Korea, Vol. 8, pp. 72-78.
  • 4. Kwon K. C., Choi J. K., Choi Y. S., 2002, "Automatic control of horizontal-moving stereoscopic camera by disparity compensation," J. Opt. Soc. Korea, Vol. 8, pp. 150-155.
  • 5. Beiderman Y., Teicher M., Garcia J., Mico V., Zalevsky Z., 2010, "Optical technique for classification, recognition and identification of obscured objects," Opt. Comm., Vol. 283, pp. 4274-4282.
  • 6. Park Y. C., Park C. G., Kang M. H., Ahn S. J., 2009, "A high-speed digital laser grating projection system for the measurement of 3-dimensional shapes," J. Opt. Soc. Korea, Vol. 13, pp. 251-255.
  • 7. Shin D. H., Kim E. S., 2008, "Computational integral imaging reconstruction of 3D object using a depth conversion technique," J. Opt. Soc. Korea, Vol. 12, pp. 131-135.
  • 8. Abdel-Aziz Y. I., Karara H. M., "Direct linear transformation into object space coordinates in close-range photogrammetry," pp. 1-18.
  • 9. Hatze H., 1988, "High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach," J. Biomech., Vol. 21, pp. 533-538.
  • 10. Gennery D. B., "Stereo-camera calibration," pp. 101-108.
  • 11. Isaguirre A., Pu P., Summers J., "A new development in camera calibration: calibrating a pair of mobile cameras," pp. 74-79.
  • 12. Ganapathy S., "Decomposition of transformation matrices for robot vision," pp. 130-139.
  • 13. Faugeras O. D., Toscani G., "The calibration problem for stereo," pp. 15-20.
  • 14. Lenz R. K., Tsai R. Y., "Techniques for calibration of the scale factor and image center for high accuracy 3D machine vision metrology," pp. 68-75.
  • 15. Tsai R. Y., "An efficient and accurate camera calibration technique for 3D machine vision," pp. 364-374.
  • 16. Tsai R. Y., 1987, "An efficient and accurate camera calibration technique for 3D machine vision," IEEE Trans. Robot., Vol. 3, pp. 364-374.
  • 17. Svoboda T., Martinec D., Pajdla T., 2005, "A convenient multicamera self-calibration for virtual environments," Presence: Teleoperators and Virtual Environments, Vol. 14, pp. 407-422.
  • 18. Zhang Z., "Flexible camera calibration by viewing a plane from unknown orientations."
  • 19. Zhang Z., 2000, "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell., Vol. 22, pp. 1330-1334.
  • 20. Kato H., Billinghurst M., "Marker tracking and HMD calibration for a video-based augmented reality conferencing system," pp. 85-95.
  • 21. Kwon K. C., Lim Y. T., Kim N., Song Y. J., Choi Y. S., 2009, "Vergence control of binocular stereoscopic camera using disparity information," J. Opt. Soc. Korea, Vol. 13, pp. 379-385.
  • 22. Zhang K., Xu B., Tang L., Shi H., 2006, "Modeling of binocular vision system for 3D reconstruction with improved genetic algorithms," Int. J. Adv. Manuf. Technol., Vol. 29, pp. 722-728.
  • 23. Sun W., Cooperstock J. R., 2006, "An empirical evaluation of factors influencing camera calibration accuracy using three publicly available techniques," Mach. Vision Appl., Vol. 17, pp. 51-67.
  • 24. Cui H. H., Liao W. H., Cheng X. S., Dai N., Yuan T. R., 2010, "A three-step system calibration procedure with error compensation for 3D shape measurement," Chinese Opt. Lett., Vol. 8, pp. 33-37.
  • 25. Ricolfe-Viala C., Sanchez-Salmeron A. J., 2010, "Robust metric calibration of non-linear camera lens distortion," Pattern Recognit., Vol. 43, pp. 1688-1699.
  • 26. Li Z. W., Shi Y. S., Wang C. J., Qin D. H., Huang K., 2009, "Complex object 3D measurement based on phase-shifting and a neural network," Opt. Comm., Vol. 282, pp. 2699-2706.
  • 27. Guo Y. B., Yao Y., Di X. G., "Research on structural parameter optimization of binocular vision measuring system for parallel mechanism," pp. 1131-1135.
IMAGES / TABLES
  • [ FIG. 1. ] Experimental environment.
  • [ FIG. 2. ] Baseline distance and measurement distance.
  • [ FIG. 3. ] The image and feature points from the higher camera with 600 mm baseline distance and 1000 mm measurement distance.
  • [ FIG. 4. ] The image and feature points from the lower camera with 600 mm baseline distance and 1000 mm measurement distance.
  • [ FIG. 5. ] Feature-point reconstruction errors Ex, Ey, Ez, and E. The experiment is performed with the following baseline and measurement distances, respectively: (a) 600 mm, 1000 mm; (b) 600 mm, 1500 mm; (c) 600 mm, 2000 mm; (d) 600 mm, 2500 mm; (e) 600 mm, 3000 mm; (f) 800 mm, 1000 mm; (g) 800 mm, 1500 mm; (h) 800 mm, 2000 mm; (i) 800 mm, 2500 mm; (j) 800 mm, 3000 mm; (k) 1000 mm, 1000 mm; (l) 1000 mm, 1500 mm; (m) 1000 mm, 2000 mm; (n) 1000 mm, 2500 mm; (o) 1000 mm, 3000 mm; (p) 1200 mm, 1000 mm; (q) 1200 mm, 1500 mm; (r) 1200 mm, 2000 mm; (s) 1200 mm, 2500 mm; (t) 1200 mm, 3000 mm; (u) 1400 mm, 1000 mm; (v) 1400 mm, 1500 mm; (w) 1400 mm, 2000 mm; (x) 1400 mm, 2500 mm; (y) 1400 mm, 3000 mm.
  • [ FIG. 6. ] Measurement model of a binocular vision system.
  • [ FIG. 7. ] Value of the error transfer function. The baseline distance varies from 0 mm to 1500 mm and the measurement distance from 0 mm to 3000 mm, respectively. (a) α = β. (b) β = 90°.