3D Visualization of Partially Occluded Objects Using Axially Distributed Image Sensing With a Wide-Angle Lens
 Authors: Kim NamWoo, Hong SeokMin, Lee Hoon Jae, Lee ByungGook, Lee JoonJae
 Published in: Current Optics and Photonics, Volume 18, Issue 5, pp. 517-522, 25 Oct 2014

ABSTRACT
In this paper we propose an axially distributed image-sensing method with a wide-angle lens to capture the wide-area scene of 3D objects. A large amount of parallax information can be collected by translating the wide-angle camera along the optical axis. The recorded wide-area elemental images are calibrated by compensating for radial distortion. From these images we generate volumetric slice images using a computational reconstruction algorithm based on ray backprojection. To show the feasibility of the proposed method, we performed optical experiments on the visualization of a partially occluded 3D object.

KEYWORD
3D imaging, Axially distributed sensing, Camera calibration, Wide-angle lens

I. INTRODUCTION
The visualization of partially occluded 3D objects has been considered one of the most challenging problems in the 3D-vision field [1, 2]. To solve this problem, several multi-perspective imaging approaches, including integral imaging and axially distributed image sensing (ADS), have been studied [3-9]. Integral imaging uses a planar pickup grid or a camera array. On the other hand, the ADS method, implemented by translating a camera along its optical axis, was proposed to take digital plane images that can be refocused after the elemental images (EIs) have been captured, enabling 3D visualization of partially occluded objects [10-14]. This method provides a relatively simple architecture for capturing the longitudinal perspective information of a 3D object.
However, the performance of the ADS method depends on how far the object is located from the optical axis. Because less parallax is collected for objects located close to the optical axis, wide-area elemental images are needed to reconstruct better 3D slice images over a large field of view (FOV).
In this paper, in order to capture a wide-area scene of 3D objects, we propose axially distributed image sensing using a wide-angle lens (WAL). With this type of lens we can collect a large amount of parallax information. The wide-area EIs are recorded by translating the wide-angle camera along its optical axis, and are then calibrated to compensate for radial distortion. From the calibrated EIs, we generate volumetric slice images using a computational reconstruction algorithm based on ray backprojection. To verify our idea, we carried out optical experiments to visualize a partially occluded 3D object.
II. SYSTEM CONFIGURATION
In general, a camera calibration process is needed for images captured with a camera mounted with a WAL. Therefore, to visualize correct 3D slice images with our ADS method, we introduce a camera calibration process that generates calibrated elemental images. This calibration process has not been applied to the conventional ADS system before.
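To make the role of this calibration step concrete, the following is a minimal per-point sketch, assuming a standard three-coefficient polynomial radial model and an eight-parameter projective transform; the function names and parameter layout are illustrative, not taken from the paper.

```python
def undistort_point(xd, yd, cx, cy, k1, k2, k3):
    """Move a distorted point (xd, yd) to its undistorted location,
    using a polynomial radial model centered at (cx, cy)."""
    rd2 = (xd - cx) ** 2 + (yd - cy) ** 2        # squared radius r_d^2
    scale = 1.0 + k1 * rd2 + k2 * rd2 ** 2 + k3 * rd2 ** 3
    return cx + (xd - cx) * scale, cy + (yd - cy) * scale

def project_point(xu, yu, m):
    """Apply an 8-parameter projective transform m = [m0, ..., m7],
    the most general mapping that takes lines to lines."""
    w = m[6] * xu + m[7] * yu + 1.0
    return ((m[0] * xu + m[1] * yu + m[2]) / w,
            (m[3] * xu + m[4] * yu + m[5]) / w)
```

With all distortion coefficients set to zero and m = [1, 0, 0, 0, 1, 0, 0, 0], both functions reduce to the identity, which serves as a quick sanity check.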
Figure 1 shows the scheme of the proposed ADS system with a WAL. It is composed of three different subsystems: (1) ADS pickup, (2) a calibration process for elemental images, and (3) a digital reconstruction process.
2.1. ADS Pickup Process
The ADS pickup of 3D objects in the proposed method is shown in Fig. 2. In contrast to the conventional method, the camera mounted with a WAL is translated along the optical axis. Let us define the focal length of the WAL as f. When the 3D objects are located at a distance z_1 from the first camera position, the wide-area EIs are captured by moving the camera along the optical axis. A total of K EIs can be recorded by moving the wide-angle camera K − 1 times, where Δz is the separation between two adjacent camera positions. The k-th EI is recorded at the camera position z_k = z_1 + (k − 1)Δz. Since each EI is captured at a different camera position, each contains the object's image at a different scale.

2.2. Calibration Process
In a typical imaging system, lens distortion can be classified into three types: radial distortion, decentering distortion, and thin-prism distortion [15]. For most lenses, however, the radial component is predominant. We therefore assume that our WAL produces predominantly radial distortion, and ignore the other distortions in a recorded image. The image distortion should thus be corrected by a calibration process before the digital reconstruction used in the proposed method. Our calibration process is composed of two steps.

In the first step, a radial distortion model is considered. Suppose that the center of distortion is at (c_x, c_y) in the recorded image. Let I_d be the distorted image and I_u the undistorted image. To correct the distorted image, the distorted point located at (x_d, y_d) in I_d has to be moved to the undistorted point (x_u, y_u) in I_u. If r_d and r_u are defined respectively as the distance from (c_x, c_y) to (x_d, y_d) and the distance from (c_x, c_y) to (x_u, y_u), the coordinates (x_u, y_u) can be calculated by [16, 17]

x_u = c_x + (x_d − c_x)(1 + k_1 r_d^2 + k_2 r_d^4 + k_3 r_d^6),
y_u = c_y + (y_d − c_y)(1 + k_1 r_d^2 + k_2 r_d^4 + k_3 r_d^6).   (1)

From Eq. (1) we can see that the distortion model has a set of five parameters, Θ_d = [c_x, c_y, k_1, k_2, k_3].

In the second step, the point with coordinates (x_u, y_u) is projected to a new point (x_p, y_p) in the desired image using a projective transformation, the most general transformation that maps lines into lines. The new coordinates (x_p, y_p) are given by [16]

x_p = (m_0 x_u + m_1 y_u + m_2) / (m_6 x_u + m_7 y_u + 1),
y_p = (m_3 x_u + m_4 y_u + m_5) / (m_6 x_u + m_7 y_u + 1).   (2)

Here it is seen that the projection parameters are Θ_p = [m_0, m_1, m_2, m_3, m_4, m_5, m_6, m_7]. Therefore, the parameter sets Θ_d and Θ_p must be found to obtain the corrected images. Before recording the 3D objects, we want to find the two parameter sets
Θ_d and Θ_p for the given system. To do so, a chessboard pattern is used, applying the point-correspondences method [16]. Figure 3 shows the ADS pickup of the chessboard pattern for the calibration process. With the chessboard pattern located at a fixed position, EIs are recorded by moving the wide-angle camera through its total range of motion, as shown in Fig. 3. The recorded EIs clearly exhibit radial image distortion. In this paper, the coordinate mapping of the distorted elemental images is established using the chessboard images, which also allow us to verify that the coordinates have been mapped correctly. The flowchart of the calibration process, based on the recorded chessboard images, is shown in Fig. 4. The first step is to extract corner feature points from the recorded chessboard pattern. Using these feature points, we recover the mapping from the distorted EIs to the undistorted EIs. With the Gauss-Newton method, we find the two parameter sets Θ_d and Θ_p for the radial distortion and the projective transformation [16]. The computed parameters are then used to correct the image distortion in the wide-area EIs of the desired 3D objects. After repeating this computation for each EI of the chessboard pattern, we store the sets of calibration parameters in the computer. Based on the stored parameter sets, the recorded EIs of the 3D objects are corrected.

2.3. Digital Reconstruction Process
The final process of our wide-angle ADS method is digital reconstruction using the calibrated EIs described in Section 2.2. In this process we generate a slice-plane image at a given reconstruction distance. Figure 5 shows the digital reconstruction process, based on an inverse-mapping procedure through a pinhole model [7]. Each wide-angle camera is modeled as a pinhole camera with its calibrated EI located at a distance g from the pinhole. We assume that the reconstruction plane is located at a distance z = L. Each EI is inversely projected through its corresponding pinhole onto the reconstruction plane at L; the i-th inversely projected EI is magnified by M_i = (L − z_i)/g. At the reconstruction plane, all inversely mapped EIs are superimposed upon each other with their different magnification factors. In Fig. 5, E_i is the i-th EI, of size p × q, where p and q are the pixel counts corresponding to the width and height of the EI. I_L, the superposition of all the inversely mapped EIs at the reconstruction plane L, can be calculated by

I_L(x, y) = (1/K) Σ_{i=1}^{K} U_i{E_i}(x, y),   (3)

where U_i is the upsampling operator that magnifies E_i by M_i at the reconstruction plane L, and the size of I_L is M_1 p × M_1 q.

To reduce the computational load imposed by the large magnification factors, Eq. (3) is modified by introducing an operator D_r that downsamples the image by a factor of r. The superimposed image is then given by

I_L(x, y) = (1/K) Σ_{i=1}^{K} D_r{U_i{E_i}}(x, y).   (4)

In Eq. (4),
I_L is the reconstructed plane image obtained by superimposing all the EIs at the reconstruction distance L. To generate 3D volume information, we reconstruct plane images over the desired depth range by repeating the digital reconstruction process for each distance in that range.

III. EXPERIMENTS AND RESULTS
We performed preliminary experiments to demonstrate the proposed ADS system for visualization of a partially occluded object. Figure 6 shows the experimental setup. As shown in Fig. 6, we captured two scenes at the same time. The first scene contains a single object, a chessboard pattern with square size 100 mm × 100 mm. The second scene contains two objects: a tree as the occluder, and ‘DSU’ letter objects with letter size 100 mm × 70 mm, used to demonstrate visualization of a partially occluded object. The chessboard pattern and the ‘DSU’ letters are located 350 mm from the first wide-angle camera position, as shown in Fig. 6. The occluder is located 150 mm in front of the ‘DSU’ object.
We used a 1/4.5-inch CMOS camera with a resolution of 640 × 480 pixels. The WAL has a focal length f = 1.79 mm and a maximum FOV angle of 131°. The wide-angle camera was translated in Δz = 1 mm increments for a total of K = 150 EIs, giving a total displacement of 149 mm. Examples of the recorded EIs are shown in Fig. 7.

After recording the EIs with the wide-angle camera, we applied the calibration process to them: each EI was corrected using its corresponding calibration parameters. The computed parameters Θ_d and Θ_p are listed in Tables 1 and 2 for the radial distortion and the projective transformation, respectively. The calibrated EIs are shown in Fig. 8. From the result in the top left of Fig. 8, it is seen that the projection parameters were computed correctly. Based on these parameters, the EIs of the 3D objects were calibrated; in our calibration process the calibrated images were cropped before the subsequent digital reconstruction.

With the calibrated EIs shown in Fig. 8, we reconstructed slice-plane images of the 3D objects at different reconstruction distances. The 150 calibrated EIs were used in the computational reconstruction algorithm described in Section 2.3. The slice image at the original position of the 3D objects is shown in Fig. 9. For comparison, we include results obtained with the conventional ADS method, without a calibration process. From the experimental results, we can see that the proposed method successfully visualizes a partially occluded object.
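The parameter search of Section 2.2 can be illustrated with a reduced example: a Gauss-Newton fit of a single radial coefficient k_1 from chessboard-corner correspondences. The full method jointly fits all of Θ_d and Θ_p [16]; this is only a sketch of the idea, with illustrative names and synthetic data.

```python
import numpy as np

def fit_k1(distorted, ideal, cx, cy, iters=5):
    """Gauss-Newton estimate of one radial coefficient k1, chosen so
    that (1 + k1 * r_d^2) maps distorted corner points onto their
    ideal (undistorted) locations."""
    d = distorted - np.array([cx, cy])    # distorted corners, centered
    u = ideal - np.array([cx, cy])        # ideal corners, centered
    r2 = (d ** 2).sum(axis=1)             # squared radii r_d^2
    k1 = 0.0
    for _ in range(iters):
        resid = (u - d * (1.0 + k1 * r2)[:, None]).ravel()
        jac = (-d * r2[:, None]).ravel()  # d(resid)/d(k1)
        k1 -= (jac @ resid) / (jac @ jac) # Gauss-Newton update
    return k1
```

Because the residual here is linear in k_1, the iteration converges in a single step; the full joint fit of Θ_d and Θ_p is nonlinear and genuinely needs the iterative update.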
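The digital reconstruction of Section 2.3 can be sketched as a back-project-and-average loop under the pinhole model. This is a minimal sketch: nearest-neighbour resizing stands in for the upsampling operator U_i, the downsampling operator D_r is omitted, and the helper names are illustrative.

```python
import numpy as np

def magnify_nn(img, out_h, out_w):
    """Nearest-neighbour resize (a stand-in for the upsampling step)."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def reconstruct_plane(eis, z, g, L):
    """Superimpose K elemental images on the plane at distance L.
    Each EI is magnified by M_i = (L - z_i) / g, centered on the
    optical axis, and the magnified images are averaged."""
    M = [(L - zi) / g for zi in z]            # per-EI magnification
    p, q = eis[0].shape
    H, W = round(M[0] * p), round(M[0] * q)   # size of the output I_L
    acc = np.zeros((H, W))
    for ei, m in zip(eis, M):
        h, w = round(m * p), round(m * q)
        up = magnify_nn(ei, h, w)
        r0, c0 = (H - h) // 2, (W - w) // 2   # center on the optical axis
        canvas = np.zeros((H, W))
        canvas[r0:r0 + h, c0:c0 + w] = up
        acc += canvas
    return acc / len(eis)
```

Repeating this call over a range of L values yields the stack of slice images that forms the 3D volume.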
IV. CONCLUSION
In conclusion, we have presented a wide-angle ADS system for capturing a wide-area scene of 3D objects. Using a WAL, we can collect a large amount of parallax information from a large scene. A calibration process was introduced to compensate for the image distortion caused by this type of lens. We performed a preliminary experiment on partially occluded 3D objects and demonstrated our idea successfully.

[FIG. 1.] Scheme of the proposed ADS method.

[FIG. 2.] Pickup process for 3D objects by moving a camera with a wide-angle lens in the proposed ADS method.


[FIG. 3.] Optical pickup process to capture the elemental images, using a chessboard pattern for camera calibration.

[FIG. 4.] Flowchart of the calibration process to find the best parameters for radial and projective transformation.


[FIG. 5.] Digital reconstruction process based on ray backprojection in the proposed ADS method.


[FIG. 6.] Experimental setup to capture the elemental images of 3D objects.

[FIG. 7.] Examples of the recorded elemental images with image distortion: (a) chessboard pattern for the calibration process, (b) 3D objects.

[TABLE 1.] Computed parameters Θ_d for radial distortion

[TABLE 2.] Computed parameters Θ_p for projective transformation

[FIG. 8.] Experimental results: (a) the 150th recorded elemental images of the chessboard pattern and the 3D objects, before the calibration process, (b) the 150th calibrated elemental images of the chessboard pattern and the 3D objects, after the calibration process.

[FIG. 9.] 3D slice images reconstructed at the original position of the ‘DSU’ object: (a) conventional method without the calibration process, (b) proposed method with the calibration process.