Image Blurring Estimation and Calibration with a Joint Transform Correlator
  • CC BY-NC (Non-commercial)
ABSTRACT
KEYWORDS
Joint transform correlator, Blurring effect, OTF, Correlation peak
    I. INTRODUCTION

    Image blur due to the focusing error of a camera system (defocus blur) arises mainly from the geometry of image formation and the finite depth of field of practical camera lens systems. A camera lens system with a focusing error tends to defocus objects and blur the acquired images. This kind of image blur, caused by a focusing error or by an imperfect imaging lens system with defocusing aberration, leads to serious image degradation. Many image processing techniques have been developed to restore the original image [1-3]. Most of these techniques estimate the point-spread function (PSF) associated with the image acquisition system and carry out space-invariant deconvolution based on the estimated blurring function for image restoration. In addition to applications in image restoration and sharpening, defocus blur is also an important visual cue for image quality assessment [4, 5] and super-resolution image reconstruction [6, 7]. For images captured with a small depth of field, defocus blur can be used for image segmentation or for the detection of regions of interest [8, 9]. Moreover, depth recovery from a single camera can be achieved by measuring the blur extent of the captured defocused image [10, 11]. In all of the above applications, the identification of defocus blur parameters plays a central role in the underlying techniques, and such techniques can identify the blur parameters for specific application domains. However, the estimation of a blur does not necessarily give the true parameter estimate [12]. As a result, the analysis is usually carried out by comparing the defocused blurred image with the original image.
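    To make the restoration step mentioned above concrete, the following is a minimal sketch, in Python, of the kind of space-invariant deconvolution the cited techniques perform once a PSF has been estimated. The Gaussian PSF model, the noise-to-signal ratio nsr, and the function names are illustrative assumptions of this sketch and are not part of this paper's method.

```python
# Illustrative sketch (not this paper's method): space-invariant Wiener
# deconvolution with an estimated defocus PSF. The Gaussian PSF and the
# noise-to-signal ratio "nsr" are assumptions for the example.
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered 2-D Gaussian PSF, normalized to unit sum."""
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    """Restore a blurred image given a known (estimated) PSF."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + nsr)   # Wiener inverse filter
    return np.real(np.fft.ifft2(F_hat))
```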

    In this paper, we adopt the joint transform correlator (JTC) to carry out the comparison between the defocused blurred image and the original image. The JTC has shown remarkable achievements and is a useful alternative to other optical systems [13-15] for pattern recognition and target tracking applications. The typical advantage of the JTC is that it is a real-time optical system which quantitatively compares images by measuring correlation peaks [16]. Recently, the JTC has also found a unique application in color comparison and color difference measurement [17]. That work presented the close relation between the color difference and the correlation peaks by decomposing the red, green, and blue components of the original images. In this paper, we present a simple technique to estimate the extent of blurring and to restore the original image by using the JTC. The correlation peaks obtained by the JTC serve as blur parameters that quantify the extent of the blurring in detail. The correlation peaks are used to calculate the focusing error, which allows the original image to be restored by calibrating the defocused camera. In the following section, the relation between the correlation peak and the focusing error is described in detail in terms of the OTF (optical transfer function) and the JTC system. Section III presents simulation results and discusses the possibility of restoring the original image and calibrating a camera lens with defocusing aberration, and finally some comments are given in the conclusion section.

    II. BLURRING EXTENTS AND CORRELATION PEAK

    One of the easiest aberrations to deal with mathematically is a simple focusing error of the lens. When a focusing error is present, the center of curvature of the spherical wavefront converging towards the image of an object is formed either to the left or to the right of the image plane. Considering an on-axis point for simplicity, the phase distribution across the exit pupil is of the form

    $$\phi(x, y) = -\frac{\pi}{\lambda f}\left(x^{2} + y^{2}\right) \qquad (1)$$

    where f is the focal length of the lens. Eq. (1) is the general phase function. If there is a focusing error, the phase difference Δϕ(x, y) can be determined by subtracting the ideal phase distribution from the actual phase distribution. Thus, the phase error is given by

    $$\Delta\phi(x, y) = \frac{\pi}{\lambda}\left(\frac{1}{f_i} - \frac{1}{f_a}\right)\left(x^{2} + y^{2}\right) \qquad (2)$$

    where fi is the focal length of the lens that forms the ideal phase distribution, and fa is the focal length of the lens that forms the actual phase distribution with the defocus error. Thus, the path-length error is given by

    $$W(x, y) = \frac{\lambda}{2\pi}\,\Delta\phi(x, y) = \frac{1}{2}\left(\frac{1}{f_i} - \frac{1}{f_a}\right)\left(x^{2} + y^{2}\right) \qquad (3)$$

    which is seen to depend quadratically on the space variables in the exit pupil. Assuming a square aperture of width 2w, the maximum path-length error at the edge of the aperture along the x or y axis, which we denote by Wm, is given by

    $$W_m = \frac{w^{2}}{2}\left(\frac{1}{f_i} - \frac{1}{f_a}\right) \qquad (4)$$

    The number Wm is a convenient indicator of the severity of the focusing error. If we let fa = fi ± Δf, with Δf small, then the path-length error W(x, y) can be expressed as

    $$W(x, y) \approx \pm\frac{\Delta f}{2 f_i^{\,2}}\left(x^{2} + y^{2}\right), \qquad W_m \approx \pm\frac{\Delta f\, w^{2}}{2 f_i^{\,2}} \qquad (5)$$
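    As a quick check of Eq. (5), the following worked numbers use the lens parameters quoted in Section III (focal length fi = 50 mm, aperture width 2w = 50 mm) and assume a wavelength of about 0.633 μm, which is not stated explicitly in the text:

```latex
% Worked check of Eq. (5); lambda = 0.633 um is an assumption.
\Delta f \;\approx\; \frac{2 f_i^{\,2}}{w^{2}}\, W_m
        \;=\; \frac{2\,(50~\mathrm{mm})^{2}}{(25~\mathrm{mm})^{2}}\, W_m
        \;=\; 8\, W_m,
\qquad
W_m = \frac{\lambda}{4} \;\Rightarrow\; \Delta f \approx 2\lambda \approx 1.27~\mu\mathrm{m}
```

    Under this assumption the step size agrees with the 1.27 μm per λ/4 increment reported for Table 1 in Section III.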

    If the path-length error W(x, y) given by Eq. (5) is used to obtain the OTF (optical transfer function), the OTF can be shown to be [18]

    $$H(f_X, f_Y) = \Lambda\!\left(\frac{f_X}{2f_o}\right)\Lambda\!\left(\frac{f_Y}{2f_o}\right)\operatorname{sinc}\!\left[\frac{8W_m}{\lambda}\,\frac{f_X}{2f_o}\left(1-\frac{|f_X|}{2f_o}\right)\right]\operatorname{sinc}\!\left[\frac{8W_m}{\lambda}\,\frac{f_Y}{2f_o}\left(1-\frac{|f_Y|}{2f_o}\right)\right] \qquad (6)$$

    where Λ(·) is the triangle function and 2f_o is the cutoff frequency of the corresponding diffraction-limited system [18].
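    As a rough numerical illustration of Eq. (6), the sketch below evaluates a cross-section of the defocused OTF along the fX axis for several values of Wm/λ, in the spirit of Fig. 2. The normalized frequency variable f = fX/(2f_o) and the sampling grid are assumptions of this sketch.

```python
# Minimal numerical sketch of the defocused OTF of Eq. (6) along the
# f_X axis (f_Y = 0) for several values of Wm/lambda, as in Fig. 2.
import numpy as np

def tri(f):
    """Triangle function: 1 - |f| for |f| <= 1, else 0."""
    return np.clip(1.0 - np.abs(f), 0.0, None)

def otf_defocus_1d(f, wm_over_lambda):
    """Cross-section of the square-aperture defocused OTF, f = f_X / (2 f_o)."""
    return tri(f) * np.sinc(8.0 * wm_over_lambda * f * (1.0 - np.abs(f)))

f = np.linspace(-1.0, 1.0, 401)
for w in (0.0, 0.25, 0.5, 0.75, 1.0):      # Wm/lambda values
    h = otf_defocus_1d(f, w)
    print(f"Wm/lambda = {w:4.2f}, min OTF = {h.min():+.3f}")
```

    For Wm/λ larger than 1/2 the printed minimum becomes negative, reflecting the sign reversal of the OTF discussed with Fig. 2.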

    We want to know the relation between the OTF of Eq. (6), which is determined by the focusing error, and the extent of the image blurring caused by that focusing error. To investigate the effect of the focusing error on the image blurring, we prepare a JTC system. Figure 1 shows the optical structure of the JTC system used to investigate the effect of defocusing on the correlation peak and the PSNR. The optical system of Fig. 1 is composed of two parts: the image-capture camera part and the JTC part. The image-capture camera part consists of a camera lens system which captures the original image displayed on an LCD driven by a PC. The captured image is blurred if the camera lens is defocused. The blurred image is sent back to the PC and combined with the original image to form a joint input image on LCD 1. Let us call the reference image r(x,y) and the sample image s(x,y), and assume that the two inputs are separated by 2xo. Then, the joint input image g(x,y) can be expressed as

    $$g(x, y) = r(x - x_o,\, y) + s(x + x_o,\, y) \qquad (7)$$

    The joint power spectrum (JPS) on the Fourier plane, i.e., the intensity of the resulting interference fringe pattern, can be expressed as

    $$|G(u, v)|^{2} = |R(u, v)|^{2} + |S(u, v)|^{2} + R(u, v)\,S^{*}(u, v)\,e^{-j2x_o u} + R^{*}(u, v)\,S(u, v)\,e^{\,j2x_o u} \qquad (8)$$

    where R(u,v) and S(u,v) denote the Fourier transforms of the reference and sample images.

    Here, * denotes the phase conjugate, and u and v are independent spatial frequency variables scaled by a factor of 2π/λf, where λ is the wavelength of the input collimated light and f is the focal length of the Fourier-transforming lenses L1 and L2. Equation (9) gives the JPS excluding the DC terms and including the non-linearity parameter k.

    $$|G(u, v)|^{2}_{k} = |R(u, v)|^{k}\,|S(u, v)|^{k}\left\{e^{\,j\left[\phi_R(u,v) - \phi_S(u,v) - 2x_o u\right]} + e^{-j\left[\phi_R(u,v) - \phi_S(u,v) - 2x_o u\right]}\right\} \qquad (9)$$

    where φ_R(u,v) and φ_S(u,v) are the Fourier phases of the reference and sample images.

    If the reference image perfectly matches the sample image and there is no phase error between the reference image and the sample image, the output can be expressed as

    $$c(x, y) = \mathcal{F}^{-1}\!\left\{|R(u, v)|^{2k}\left(e^{\,j2x_o u} + e^{-j2x_o u}\right)\right\} = c_k(x + 2x_o,\, y) + c_k(x - 2x_o,\, y) \qquad (10)$$

    where c_k(x, y) is the inverse Fourier transform of |R(u,v)|^{2k}, so that two sharp correlation peaks appear at (±2xo, 0).
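    For readers who want to reproduce the correlation-peak measurement digitally, the following is a minimal numerical sketch of Eqs. (7)-(10). The way the DC terms are removed (subtracting the individually computed power spectra) and the sign-preserving form of the k-th law are simplifying assumptions of this sketch, not a description of the optical system of Fig. 1.

```python
# Digital sketch of the JTC correlation of Eqs. (7)-(10): the reference r
# and sample s are placed side by side, the joint power spectrum is taken,
# the DC terms are suppressed, and the correlation peak is read from the
# inverse transform.
import numpy as np

def jtc_peak(r, s, k=1.0):
    """Off-axis correlation peak of reference r and sample s (same shape)."""
    h, w = r.shape
    joint = np.zeros((h, 2 * w))                   # joint input g(x, y), Eq. (7)
    joint[:, :w] = r
    joint[:, w:] = s
    ref_only = np.zeros_like(joint)
    ref_only[:, :w] = r
    smp_only = np.zeros_like(joint)
    smp_only[:, w:] = s
    jps = np.abs(np.fft.fft2(joint)) ** 2          # joint power spectrum, Eq. (8)
    jps -= np.abs(np.fft.fft2(ref_only)) ** 2      # remove |R|^2 DC term
    jps -= np.abs(np.fft.fft2(smp_only)) ** 2      # remove |S|^2 DC term
    jps = np.sign(jps) * np.abs(jps) ** k          # simplified k-th law, Eq. (9)
    corr = np.abs(np.fft.ifft2(jps))               # correlation plane, Eq. (10)
    return corr.max()
```

    Correlating a reference image against itself and against progressively defocused copies with such a routine would be expected to give monotonically decreasing peak values, which is the behavior reported for Table 2 in the next section.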

    III. RESULTS AND DISCUSSIONS

    Figure 2 shows the computer-simulated OTF of Eq. (6) for various values of Wm/λ. The diffraction-limited OTF is obtained when Wm = 0. Note also that, for values of Wm > λ/2, the sign of the OTF is reversed in some regions of the spatial frequency. Table 1 shows the focusing errors calculated from Eq. (5) for various values of Wm/λ. In this paper, we used a DSLR camera to capture the images for investigating the relation between the OTF of Eq. (6) caused by the focusing error and the image blurring caused by this focusing error. The aperture diameter and the focal length of the lens are both 50 mm. As mentioned in Section II, we assume a square aperture for simplicity in deriving the OTF. Because our purpose is to show the clear relation between the OTF with a focusing error and the image blurring with the same focusing error, the error introduced by this assumption, compared with using a circular aperture, has little effect on our study. Table 1 reveals that the focusing error Δf increases by 1.27 μm for each λ/4 increment of Wm.

    Figure 3 shows 256 × 256 gray images of a portrait captured by defocusing the camera according to Table 1. Figure 3(a) is the reference image taken with the camera focused on the object. Next, we defocused the camera step by step by 1.27 μm, corresponding to λ/4 increments of Wm, which produced the images of Fig. 3(b) to Fig. 3(e). Figure 3(f) is a mismatched image used to check the degree of discrimination between the matched blurred images and a mismatched image when compared with the reference image.

    We used the JTC system to evaluate the proposed technique for calculating the amount of blurring caused by defocusing the camera. Table 2 shows the correlation peaks of the sample images measured by the JTC system. First, let us check the correlation peaks of the images obtained by defocusing the reference image. When the reference image of Fig. 3(a), corresponding to the case without a focusing error (Δf = 0), is correlated with itself, the measured correlation peak is 6.184 × 10^8. When the reference image of Fig. 3(a) is correlated with the defocused images from sample 1 (Fig. 3(b)) to sample 4 (Fig. 3(e)), the measured correlation peaks are 6.172 × 10^8, 6.157 × 10^8, 6.149 × 10^8, and 6.137 × 10^8, respectively. In Table 2, ΔCP denotes the difference in correlation peak between the reference image and the sample image. The values of ΔCP for the four blurred sample images are 0.012, 0.027, 0.035, and 0.047, respectively. On the other hand, when the reference image is correlated with the mismatched image of Fig. 3(f), the measured correlation peak is 3.498 × 10^8 and ΔCP is 2.686, which is very high compared with the matched images even though they are blurred. Figure 4 shows the two curves of ΔCP and Δf, and reveals that the two curves are linear and approximately the same. In Section II, we noted that we want to know the relation between the OTF of Eq. (6) caused by the focusing error and the extent of the image blurring caused by this focusing error, and that we prepared a JTC system to investigate the effect of the focusing error on the image blurring. Thus, considering Table 1, Table 2 and Fig. 4, we can conclude that the extent of the image blurring caused by the focusing error can be estimated by measuring the correlation peaks using the JTC system.

    In addition, we can calibrate blurred images by measuring their correlation peaks and calculating the focusing error Δf from the linear approximation of Fig. 4. Figure 5 shows one example of the calibration of a blurred image. Figure 5(a) is a blurred image obtained by defocusing the reference object randomly, and Fig. 5(b) is the calibrated image obtained by finding the focusing error Δf. The focusing error Δf can be calculated by measuring the correlation peak of the blurred image. The measured correlation peak of the blurred image is 6.127 × 10^8, and the correlation difference ΔCP becomes 0.057. Therefore, the calibrated Δf is 6.03 μm, supposing that the slopes of both the ΔCP and Δf curves are approximately 2.0.
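    As a sketch of this calibration step, the snippet below fits a straight line to the (ΔCP, Δf) pairs quoted above and inverts it for the measured ΔCP of the blurred image. The Δf values are taken as successive 1.27 μm steps as stated in the text, and the least-squares fit stands in for the linear approximation of Fig. 4, so the result is only expected to be close to the 6.03 μm quoted above.

```python
# Calibration sketch: estimate the focusing error from a measured
# correlation-peak difference, using the values quoted in the text.
import numpy as np

dcp = np.array([0.012, 0.027, 0.035, 0.047])  # delta CP of samples 1-4 (x10^8)
df  = np.array([1.27, 2.54, 3.81, 5.08])      # assumed delta f in um (1.27 um steps)

slope, intercept = np.polyfit(dcp, df, 1)     # stands in for the line of Fig. 4

dcp_blurred = 0.057                           # measured for the image of Fig. 5(a)
print(f"estimated delta f ~ {slope * dcp_blurred + intercept:.2f} um")
```

    With these numbers the fit returns a value close to 6 μm, consistent with the 6.03 μm obtained above from the slopes of Fig. 4.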

    [TABLE 1.] Results of the focusing error for various values of Wm/λ

    [TABLE 2.] Correlation peaks of the sample images measured by the JTC system

    Image                  Correlation peak (×10^8)   ΔCP (×10^8)
    Reference, Fig. 3(a)   6.184                      0
    Sample 1, Fig. 3(b)    6.172                      0.012
    Sample 2, Fig. 3(c)    6.157                      0.027
    Sample 3, Fig. 3(d)    6.149                      0.035
    Sample 4, Fig. 3(e)    6.137                      0.047
    Mismatch, Fig. 3(f)    3.498                      2.686

    IV. CONCLUSION

    In this paper, we have presented a new technique for estimating the extent of the blurring caused by defocusing a camera lens, using the JTC. In addition, we have demonstrated the possibility of calibrating blurred images by finding the focusing error calculated from the measured correlation peak. Thus, we can conclude that a camera lens system with a defocusing-error aberration can be calibrated by finding Wm/λ, and thus finally the focusing error Δf, simply by obtaining the correlation peak with the JTC system.

REFERENCES
  • 1. Banham M., Katsaggelos A. 1997 “Digital image restoration,” [IEEE Signal Process. Mag.] Vol.14 P.24-41
  • 2. Kutay M. A., Ozaktas H. M. 1998 “Optimal image restoration with the fractional Fourier transform,” [J. Opt. Soc. Am. A] Vol.15 P.825-833
  • 3. Lin H.-Y., Chou X.-H. 2012 “Defocus blur parameters identification by histogram matching,” [J. Opt. Soc. Am. A] Vol.29 P.1694-1706
  • 4. Wu S., Lin W., Xie S., Lu Z., Ong E. P., Yao S. 2009 “Blind blur assessment for vision-based applications,” [J. Vis. Commun. Image Represent.] Vol.20 P.231-241
  • 5. van Zyl Marais I., Steyn W. H. 2007 “Robust defocus blur identification in the context of blind image quality assessment,” [Signal Process. Image Commun.] Vol.22 P.833-844
  • 6. Rajan D., Chaudhuri S., Joshi M. 2003 “Multi-objective super resolution: Concepts and examples,” [IEEE Signal Process. Mag.] Vol.20 P.49-61
  • 7. Yang J., Schonfeld D. 2010 “Virtual focus and depth estimation from defocused video sequences,” [IEEE Trans. Image Process.] Vol.19 P.668-679
  • 8. Swain C., Chen T. 1995 “Defocus-based image segmentation,” [IEEE International Conference on Acoustics, Speech, and Signal Processing] Vol.4 P.2403-2406
  • 9. Liu Z., Li W., Shen L., Han Z., Zhang Z. 2010 “Automatic segmentation of focused objects from images with low depth of field,” [Pattern Recogn. Lett.] Vol.31 P.572-581
  • 10. Pradeep K., Rajagopalan A. 2007 “Improving shape from focus using defocus cue,” [IEEE Trans. Image Process.] Vol.16 P.1920-1925
  • 11. Chaudhuri S., Rajagopalan A. 1998 Depth from Defocus: A Real Aperture Imaging Approach
  • 12. Lin J., Zhang C., Shi Q. 2004 “Estimating the amount of defocus through a wavelet transform approach,” [Pattern Recogn. Lett.] Vol.25 P.407-411
  • 13. Weaver C. J., Goodman J. W. 1966 “A technique for optically convolving two functions,” [Appl. Opt.] Vol.5 P.1248-1249
  • 14. Gianino P. D., Horner J. L. 1984 “Phase-only matched filtering,” [Appl. Opt.] Vol.23 P.812-816
  • 15. Mu G. G., Wang X. M., Wang Z. Q. 1988 “Amplitude-compensated matched filtering,” [Appl. Opt.] Vol.27 P.3461-3463
  • 16. Alam M. S., Khan J., Bai A. 2004 “Heteroassociative multiple-target tracking by fringe-adjusted joint transform correlation,” [Appl. Opt.] Vol.43 P.358-365
  • 17. Jeong M. H. 2011 “Coded single input channel for color pattern recognition in joint transform correlator,” [Journal of the Optical Society of Korea] Vol.15 P.335-339
  • 18. Goodman J. W. 1996 Introduction to Fourier Optics
IMAGES / TABLES
  • [FIG. 1.] Optical structure of the image blurring estimating and calibrating JTC system.
  • [FIG. 2.] Computer simulated results of the OTF for various values of Wm/λ.
  • [TABLE 1.] Results of the focusing error for various values of Wm/λ
  • [FIG. 3.] 256 × 256 gray images of a portrait captured by the defocused camera.
  • [TABLE 2.] Correlation peaks of the sample images measured by the JTC system
  • [FIG. 4.] Linear approximation and relation between the correlation peak and the focusing error.
  • [FIG. 5.] Example of the calibration of the blurred image.