Recovering the Colors of Objects from Multiple Near-IR Images
  • CC BY-NC (Non-Commercial)
ABSTRACT
KEYWORDS
Recovering colors of objects, Night vision, Near-IR image, Multiple spectral near-IR illuminations, The red edge of the visible band
    I. INTRODUCTION

    The human eye is typically capable of detecting light at wavelengths between 400 and 700 nm, a range known as the visible band. The human eye contains photoreceptors with three kinds of color-sensitive pigments for absorbing energy in this wavelength range, thereby enabling the eyes to see [1]. Without the presence of light, the colors as well as the shapes of objects in the scene being observed would disappear.

    The development of infrared imaging technology has provided us with night vision, the ability to see in conditions of low light or total darkness. Although night vision devices were previously used only by military forces, they have since become more widely available for civilian use. Night vision technologies can be divided into three main categories: low-light imaging, thermal imaging, and near-IR illumination, each of which has its advantages and disadvantages. A popular and relatively inexpensive technique for achieving night vision uses a device that is sensitive to invisible near-IR radiation in conjunction with a near-IR illuminator. Most cameras can perform this technique, because their sensors can detect near-IR wavelengths. However, the resulting scene appears as a monochrome image containing single-color or false-color tones that do not correspond to everyday experience [2, 3]. Therefore, humans perceive such monochrome images as highly unnatural.

    Furthermore, in situations in which it is critical to obtain color information, such as for accurate identification or tracking purposes, a color image is much more advantageous than a monochrome image. Over the past 20 years, much research has been devoted to producing true-color images from near-IR imaging. The main focus of these studies has been the fusion of visible and infrared images, a technique that yields promising results when the corresponding daytime images are available [4-9]. Meanwhile, attempts to produce color images using only infrared light have led to various applications in which a set of three sensors, each sensitive to a different wavelength in the near-IR band, is used. For example, a new night vision camera with a set of three near-IR sensors was recently developed by AIST (the National Institute of Advanced Industrial Science and Technology in Japan) [10]. This camera has been reported to record clear, high-frame-rate color videos even in darkness, and to reproduce colors that are equivalent or similar to the colors of the same objects under visible light. However, although the assessment data for the color accuracy has not yet been released, the examples presented by AIST indicate that an improvement in the color signal processing method is required.

    This paper proposes an algorithm to recover the colors of objects from multiple near-IR images. The key point is to use the information from the reflectance curve in the wavelength range from 700 to 900 nm, which is near the red edge of the visible band. In Section II, the proposed algorithm is described with the aid of a flowchart. The International Commission on Illumination (CIE) color coordinates L*, a*, b* of the colors of objects are recovered from a series of gray images captured under multiple spectral near-IR illuminations. In Section III, the optical properties of the experimental components, such as the spectral power distribution and spatial uniformity of a custom-designed illuminator and the spectral reflectance of the target color objects, are described in detail. In Section IV, the experimental results that were obtained by analyzing the 24 patches of the Color Rendition Chart [11] are presented. The accuracy of the proposed algorithm is evaluated using the average color difference ΔE*ab. Finally, conclusions and possibilities for further studies are given in Section V.

    II. COLOR RECOVERY ALGORITHM

    Figure 1 shows an example containing the reflectance curves for four color patches painted with red, green, yellow, and blue paint, respectively. According to the definitions used in night vision technology [12], the visible band includes wavelengths from 400 to 700 nm, the near-IR region includes wavelengths between 700 and 1100 nm, and wavelengths longer than 1100 nm fall in the shortwave IR band, which is useful for detecting objects in a scene.

    Figure 1 presents several noteworthy features. In the visible band, the reflectance curve has a particular structure that depends on the color. In the near-IR band, however, the reflectance curves of the four colors tend to converge, gradually becoming identical throughout the shortwave IR band. This behavior has been observed for many different materials [13].

    The reflectance curve in the visible band is particularly important, since it uniquely describes the color of the object as perceived by human observers. In 1931, the CIE created the CIE X, Y, Z color space based on the three photoreceptors in the eye [14]. The X, Y, Z coordinates are defined as the tristimulus values in the range of wavelengths from 380 to 780 nm, as follows:

    $$X = k\int_{380}^{780} P(\lambda)\,R_V(\lambda)\,\bar{x}(\lambda)\,d\lambda,\qquad Y = k\int_{380}^{780} P(\lambda)\,R_V(\lambda)\,\bar{y}(\lambda)\,d\lambda,\qquad Z = k\int_{380}^{780} P(\lambda)\,R_V(\lambda)\,\bar{z}(\lambda)\,d\lambda \qquad (1)$$

    where λ is the wavelength measured in nanometers, k is a normalizing constant (conventionally chosen so that Y = 100 for a perfect diffuse white), P(λ) is the spectral power distribution of the light illumination, RV(λ) is the reflectance of the object in the visible band, and x̄(λ), ȳ(λ), z̄(λ) are the color matching functions for the standard observer. These numerical descriptions may be considered the spectral sensitivity curves of three linear light detectors. In Eq. (1), x̄(λ), ȳ(λ), z̄(λ) are approximately zero for wavelengths below 400 nm and above 700 nm.
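    As a minimal MATLAB sketch of Eq. (1), assuming hypothetical column vectors P, Rv, xbar, ybar, and zbar sampled at 1 nm intervals on a common 380-780 nm grid, the integrals reduce to discrete sums:

        % Hypothetical sampled spectra on a common 380-780 nm grid (1 nm steps):
        % P (illuminant SPD), Rv (visible-band reflectance),
        % xbar, ybar, zbar (CIE 1931 color matching functions)
        dl = 1;                              % sampling interval in nm
        k  = 100 / sum(P .* ybar * dl);      % normalize so Y = 100 for a perfect white
        X  = k * sum(P .* Rv .* xbar * dl);
        Y  = k * sum(P .* Rv .* ybar * dl);
        Z  = k * sum(P .* Rv .* zbar * dl);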

    In 1976, the CIE L*, a*, b* color space, which is currently used as a perceptually uniform color space, was derived from the CIE X, Y, Z color space [14]. In the new color space, L* represents the lightness, a* denotes the red/green value, and b* the yellow/blue value. The non-linear transformation from X, Y, Z to L*, a*, b* is as follows:

    $$L^* = 116\,f(Y/Y_n) - 16$$
    $$a^* = 500\,[\,f(X/X_n) - f(Y/Y_n)\,] \qquad (2)$$
    $$b^* = 200\,[\,f(Y/Y_n) - f(Z/Z_n)\,]$$
    with $f(t) = t^{1/3}$ for $t > (6/29)^3$ and $f(t) = t/[3(6/29)^2] + 4/29$ otherwise,

    where Xn, Yn, Zn are the X, Y, Z values of the reference white. L* = 0 yields black and L* = 100 indicates diffuse white. Along the a* axis, positive values indicate the amount of red, while negative values indicate the amount of green. Similarly, along the b* axis, positive and negative values indicate yellow and blue, respectively. For both axes, zero is neutral gray. The advantage of the CIE L*a*b* color space is that the color difference between two colors can be expressed as follows:

    $$\Delta E^*_{ab} = \sqrt{(\Delta L^*)^2 + (\Delta a^*)^2 + (\Delta b^*)^2} \qquad (3)$$
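    A brief MATLAB sketch of Eqs. (2) and (3), assuming X, Y, Z and the reference white Xn, Yn, Zn are scalars on the same scale, and Lab1, Lab2 are two hypothetical 3-element [L*, a*, b*] vectors:

        % Piecewise cube-root function of the CIELAB transform
        f = @(t) (t >  (6/29)^3) .* t.^(1/3) + ...
                 (t <= (6/29)^3) .* (t / (3*(6/29)^2) + 4/29);
        L = 116 * f(Y/Yn) - 16;            % lightness
        a = 500 * (f(X/Xn) - f(Y/Yn));     % red/green value
        b = 200 * (f(Y/Yn) - f(Z/Zn));     % yellow/blue value
        % Color difference of Eq. (3) between two L*,a*,b* triplets
        dE = sqrt(sum((Lab1 - Lab2).^2));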

    The reflectance curve in the near-IR band does not contribute to defining the color in terms of X, Y, Z or L*, a*, b* since the near-IR band is outside of the visible band. However, for an object under near-IR illumination with wavelengths between 700 and 1100 nm, NIR1, NIR2, NIR3 corresponding to X, Y, Z can be represented by the equivalent formula:

    $$NIR_i = k\int_{700}^{1100} P(\lambda)\,R_{NIR}(\lambda)\,nir_i(\lambda)\,d\lambda, \qquad i = 1, 2, 3 \qquad (4)$$

    where P(λ) is the spectral power distribution of a near-IR illumination, RNIR(λ) is the reflectance of the object in the near-IR band, and nir1(λ), nir2(λ), nir3(λ) are the spectral sensitivity curves of the three near-IR sensors, which can be thought of as the equivalents of x̄(λ), ȳ(λ), z̄(λ) in Eq. (1). Alternatively, Eq. (4) can be reformulated to include three spectral near-IR illuminators and a single sensor that responds to wavelengths in the near-IR band:

    $$NIR_i = k\int_{700}^{1100} P_i(\lambda)\,R_{NIR}(\lambda)\,S(\lambda)\,d\lambda, \qquad i = 1, 2, 3 \qquad (5)$$

    where P1(λ), P2(λ), and P3(λ) are the spectral power distributions of the three spectral near-IR illuminators, with the subscript symbolizing the peak wavelength, and S(λ) is the spectral sensitivity curve of the near-IR sensor. Most charge-coupled devices (CCDs) or CMOS devices respond to wavelengths ranging from the visible to the near-IR. NIR1, NIR2, NIR3 in Eq. (5) represent the tristimulus values, which are sequentially captured under the three spectral near-IR illuminations.
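    As a sketch, Eq. (5) can be evaluated numerically in MATLAB, assuming hypothetical column vectors P1, P2, P3, Rnir, and S sampled on a common 700-1100 nm grid:

        % Hypothetical sampled spectra on a common 700-1100 nm grid:
        % P1, P2, P3 (illuminator SPDs), Rnir (near-IR reflectance),
        % S (sensor spectral sensitivity)
        NIR1 = sum(P1 .* Rnir .* S);     % response under the first illuminator
        NIR2 = sum(P2 .* Rnir .* S);     % response under the second illuminator
        NIR3 = sum(P3 .* Rnir .* S);     % response under the third illuminator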

    Before the proposed recovery algorithm is presented, a simple combination commonly used in remote sensing applications is considered [15]. The values NIR1, NIR2, NIR3 are assigned to the R, G, and B primary signals, respectively, for the purpose of reproducing a color image. These values may be combined in many ways, depending on their order or their weighting factors. However, the result of any such combination is a false-color image that does not bear any resemblance to the original colors.
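    For illustration, a minimal MATLAB sketch of this simple combination, assuming img700, img780, and img860 are three hypothetical registered gray images scaled to [0, 1]:

        % Assign the three near-IR channels directly to R, G, and B
        falseColor = cat(3, img700, img780, img860);
        imshow(falseColor);    % a false-color image, not the true object colors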

    The aim of the proposed algorithm is to recover the CIE color coordinates of objects from a series of gray images captured under three spectral near-IR illuminations. This can be achieved using a polynomial regression of the form [16]

    $$\mathbf{V} = \mathrm{CMM}\cdot\mathbf{T} \qquad (6)$$

    where V and T are the vector forms of the CIE color coordinates (X, Y, Z or L*, a*, b*) and the near-IR tristimulus values (NIR1, NIR2, NIR3), respectively. The color mapping matrix CMM represents the numerical relationship between V and T. For a set of target colors with known CIE color coordinates, the CMM is easily computed from the near-IR tristimulus values NIR1, NIR2, NIR3 using MATLAB with the formula

    $$\mathrm{CMM} = \mathbf{V}\cdot\mathbf{T}^{+} \qquad (7)$$
    where T⁺ denotes the Moore-Penrose pseudo-inverse of T.

    The accuracy of the CMM depends on the degree of correlation between V and T for the target colors. However, the shapes of the reflectance curves in the visible band RV(λ) and in the near-IR band RNIR(λ) are quite different, as shown in Fig. 1; therefore, the correlation between V and T would be expected to be low.
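    A minimal MATLAB sketch of Eq. (7), assuming V is a hypothetical 3 × N matrix of known CIE coordinates and T is the corresponding 3 × N matrix of near-IR tristimulus values for N target colors:

        CMM  = V * pinv(T);    % least-squares fit of the mapping V = CMM * T
        Vhat = CMM * T;        % coordinates recovered for the training colors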

    To improve the recovery performance, the proposed algorithm is designed to acquire an individual CMM for each group consisting of highly correlated colors that are selected from a set of target colors. Figure 2 shows the flowchart of the color recovery process from NIR1, NIR2, NIR3 to L*, a*, b*.

    In order to classify the highly correlated colors together, a new set of parameters is derived from the near-IR tristimulus values NIR1, NIR2, NIR3 as follows:

    $$I_S = NIR_1 + NIR_2 + NIR_3,\qquad G_{21} = NIR_2 - NIR_1,\qquad G_{32} = NIR_3 - NIR_2 \qquad (8)$$

    In Eq. (8), IS represents the height of the reflectance curve RNIR(λ), while G21 and G32 represent the gradients of RNIR(λ) between adjacent wavebands. For both G21 and G32, positive values indicate an increase in the reflectance with increasing wavelength, whereas negative values indicate a decreasing reflectance.
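    A one-line MATLAB sketch of these parameters, assuming Eq. (8) as reconstructed above and a hypothetical 3 × N matrix nir holding (NIR1; NIR2; NIR3) for N target colors:

        Is  = sum(nir, 1);             % height of the near-IR reflectance curve
        G21 = nir(2,:) - nir(1,:);     % gradient between the first two wavebands
        G32 = nir(3,:) - nir(2,:);     % gradient between the last two wavebands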

    As can be seen in Fig. 1, some reflectance curves in the near-IR band exhibit steep gradients, while others display gentle gradients. In this study, each set of target colors is first classified into two groups based on the gradient of the reflectance curve, after which the gentle-gradient group is further divided into two groups based on the height of the reflectance curve. The workflow shown in Fig. 2 consists of three paths. If the sum |G21| + |G32| for a target color is greater than or equal to a threshold value k1, the color belongs to group I. Otherwise, if the value of IS is greater than a threshold value k2, the color belongs to group II; if not, it belongs to group III. The threshold values k1 and k2 are chosen appropriately for the constituent target colors. The target colors assigned to each group by this method are highly correlated with each other. The individual color mapping matrices CMM1, CMM2, CMM3 can then be computed for the separate groups using Eq. (7). Accordingly, for colors belonging to the same group, the values of L*, a*, b* can be recovered as follows:

    $$\mathbf{V} = \mathrm{CMM}\cdot\mathbf{P} \qquad (9)$$

    where V and P are the vector forms of L*, a*, b* and IS, G21, G32, respectively, and CMM represents the individual color mapping matrix of the group. The performance of the proposed algorithm is evaluated by determining the average color difference ΔE*ab between the original and recovered values.
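    A minimal MATLAB sketch of the grouping rule of Fig. 2 and the per-group recovery of Eq. (9), under the same assumptions as above, with hypothetical thresholds k1, k2 and group matrices CMM1, CMM2, CMM3 already fitted via Eq. (7):

        N = size(nir, 2);
        group = 3 * ones(1, N);                  % default: group III
        group(abs(G21) + abs(G32) >= k1) = 1;    % steep gradient -> group I
        group(group ~= 1 & Is > k2) = 2;         % gentle gradient, high curve -> group II
        CMMs = {CMM1, CMM2, CMM3};               % one mapping matrix per group
        Lab = zeros(3, N);
        for n = 1:N
            p = [Is(n); G21(n); G32(n)];         % parameter vector of Eq. (8)
            Lab(:, n) = CMMs{group(n)} * p;      % recovered L*, a*, b*, Eq. (9)
        end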

    III. EXPERIMENTAL SETUP

    The experiment is carried out in dark conditions. The experimental room (including the ceiling and the walls) is painted matte black. The experimental configuration is presented in Fig. 3. A screen to which the target color objects are attached is placed 0.7 m from a multiple spectral near-IR illuminator. A monochrome digital camera placed behind the hole in the center of the illuminator faces the screen, as shown in Fig. 3. The camera has an image sensor consisting of a diagonal 1/3 inch CCD array with 1.25 M effective pixels. Its spectral response ranges from 300 to 1000 nm, and its maximum sensitivity is near 580 nm. Using the camera's software, the automatic white balance function is set to 'off' and the gamma value to 1. The 16-bit data of the captured image is saved in TIFF format.

    Details of the illuminator and target color objects are described in the following sections.

       3.1. Multiple Spectral Near-IR Illuminator

    The custom-designed multiple spectral near-IR illuminator consists of six near-IR LED array modules. Figure 4(a) shows a schematic diagram of one of the six modules. In the diagram, Ⓐ, Ⓑ, Ⓒ, and Ⓓ indicate four types of near-IR LEDs with peak wavelengths at 700, 740, 780, and 860 nm, respectively. Each module consists of a uniform arrangement of four LEDs of each type, i.e., a total of 16 LEDs per module. Figure 4(b) shows the assembled multiple spectral near-IR illuminator with the six modules, which are slightly tilted toward the central hole in the form of a sunflower. The luminous intensity of the LEDs in all six modules is controlled by type, using four separate power supplies. For the experiment, the three types of LEDs with peak wavelengths at 700, 780, and 860 nm were selected.

    Figure 5 presents the spectral power distributions of these three types of LEDs, measured using a USB-650 spectrometer. In Fig. 5, P(λ)700, P(λ)780, and P(λ)860 denote the spectral power distributions of the three types of LEDs, whose respective maximum values are normalized to 1.

    The spatial uniformity of the custom-designed illuminator was measured using a uniform gray card (an X-rite ColorChecker Custom White Balance Card, whose square area, excluding the border, measures 27.4 × 17.8 cm). In Fig. 3, the white balance card is attached to the screen. Figure 6 shows the captured images of the white balance card: Figs. 6(a), (b), and (c) present the color-coded images obtained under the illumination of P(λ)700, P(λ)780, and P(λ)860, respectively, showing the uniformity of the entire plane at a glance. The input current of each of the three types of LEDs was independently adjusted to equalize the brightness of their respective images, a process referred to as adjusting the white balance.

    In addition, the pixel positions of each of the images in Fig. 6 correspond to the size of the gray card (27.4 × 17.8 cm). All three color-coded images display concentric rings whose color gradually changes, with a maximum code value of 1 near the center. Toward the edges of the gray card, the code value gradually falls to about 0.7 at the mid-points of the edges along the x-axis and 0.8 along the y-axis. Comparing the three images in Fig. 6, the degree of non-uniformity can be seen to vary slightly among P(λ)700, P(λ)780, and P(λ)860. The effects of the non-uniformity of the illumination were minimized by multiplying each pixel of the captured image by the inverse of the corresponding code value shown in Fig. 6.
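    A minimal MATLAB sketch of this compensation, assuming img is a captured image (double, scaled to [0, 1]) and flat is the hypothetical normalized code-value map of Fig. 6 for the same illuminant:

        corrected = img ./ flat;           % divide out the illumination falloff
        corrected = min(corrected, 1);     % clip any overshoot caused by noise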

       3.2. Target Color Objects

    The ColorChecker is generally used as a color calibration target in visible imaging technology. It includes 24 color patches in a 6 × 4 grid, each slightly under 2 inches square, made of matte paint applied to smooth paper and surrounded by a black border. A photograph of the ColorChecker is shown in Fig. 7, in which the original colors as they appear in visible light are shown. Six of the patches form a uniform gray lightness scale: white (w), neutral 8 (n8), neutral 6.5 (n6.5), neutral 5 (n5), neutral 3.5 (n3.5), and black (bk). Another six patches represent the primary colors: red (r), green (g), blue (b), cyan (c), magenta (m), and yellow (y). Six more are approximations of natural colors, such as human skin tones (ds and ls), blue sky (bs), a typical leaf (f), a blue chicory flower (bf), and bluish green (bg). The remaining six were chosen arbitrarily to represent a gamut: orange (o), purplish blue (pb), moderate red (mr), purple (p), yellow green (yg), and orange yellow (oy). The letters in parentheses are the abbreviation for each tone.

    The spectral reflectance of each patch was measured using a Cary 5000 UV-Vis-NIR spectrophotometer after the 24 patches were cut from the ColorChecker. Figure 8 shows the accumulated reflectance curves for all 24 patches in the wavelength range from 400 to 1100 nm, which includes both the visible and the near-IR bands. As expected from the discussion of Fig. 1, the reflectance data show the inherent structure of each color in the visible band (400~700 nm). In the near-IR band, the reflectance data increase near the red edge of the visible band (700~900 nm) before assuming a uniform appearance. As the red edge of the visible band is approached (Fig. 8), the reflectance of some colors increases rapidly, so that their reflectance curves display a steep gradient, whereas other colors have gentler gradients. Using the measured spectral reflectance data in the visible band, it is possible to calculate the X, Y, Z values for Illuminant D65 using Eq. (1) and to derive the L*, a*, b* values using Eq. (2). The results are presented in Table 1.

    [TABLE 1.] The X, Y, Z values for Illuminant D65 and the derived L*, a*, b* values of the 24 patches

    As shown in the experimental setup, the 24 cut patches were rearranged in a 6 × 4 grid without the black borders for use as the target color objects. The target object was made the same size as the gray card shown in Fig. 6, so that the non-uniformity of the illumination could be compensated.

    IV. RESULTS

    Figure 9 shows a set of monochrome images of the target color objects, obtained experimentally and compensated for non-uniformity. Figures 9(a), (b), and (c) show the results of illumination by P(λ)700, P(λ)780, and P(λ)860, respectively. The input current of each of the three types of LEDs was individually adjusted to achieve white balance. The longer the wavelength of the illumination, the brighter the resulting image, a tendency consistent with the reflectance curves shown in Fig. 8. As a result of the white balance, the brightness of the series of neutral patches in the bottom row of the target color objects is invariant under all three illuminations.

    Figure 9 was used to define the set of tristimulus values NIR1, NIR2, NIR3 for the target color objects. These values correspond to the normalized pixel data of the images in Figs. 9(a), (b), and (c), respectively, because the gamma function of the camera was set to 1. Figure 10 shows examples of composite images created using the simple combination method. Figure 10(a) is formed by assigning the values captured under P(λ)700, P(λ)780, and P(λ)860 to the R, G, and B primary signals, respectively. Figures 10(b) and (c) are similarly formed by assigning these values to G, B, and R, and to B, R, and G, respectively. Figures 10(a), (b), and (c) show monotones of green, blue, and red, respectively; i.e., they do not bear any resemblance to the original colors.

    Figure 11 shows a color image that was recovered from the three near-IR images in Fig. 9 by using the proposed algorithm. The 24 patches were classified into three groups on the basis of the criteria suggested by the proposed algorithm, using MATLAB. Figures 12(a), (b), and (c) illustrate the reflectance curves from 700 to 900 nm of the color patches belonging to groups I, II, and III, respectively. The bluish or greenish patches, all of which have a steep gradient, are included in group I, whereas the reddish or yellowish patches, which have a gentle gradient and a large reflectance sum, are included in group II. The remaining patches, which have a gentle gradient and a small reflectance sum, are included in group III. The individual color mapping matrices CMM1, CMM2, and CMM3 were subsequently computed for these three groups. The noise level was suppressed by applying a polynomial transformation model with 3 × 20 terms [17] to all pixels constituting each patch. Using CMM1, CMM2, and CMM3, the L*, a*, b* values of all 24 patches were recovered with Eq. (9). Figure 11 shows an sRGB color image obtained by transforming the recovered L*, a*, b* values to sRGB using MATLAB [18].
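    The final conversion for display can be sketched with MATLAB's lab2rgb function (Image Processing Toolbox), assuming labImg is a hypothetical M × N × 3 array of the recovered L*, a*, b* values:

        rgbImg = lab2rgb(labImg);    % CIE L*a*b* to sRGB (D65 white point by default)
        imshow(rgbImg);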

    At a glance, the colors in Fig. 11 appear very similar to those of the original patches. However, some colors differ significantly from the originals. The color difference ΔE*ab between the calculated values (Table 1) and the recovered values of each patch is computed using Eq. (3) and listed in Table 2, together with the average ΔE*ab of each group. The average ΔE*ab for the patches in groups I and III is 3.5 and 3.8, respectively. On the other hand, the average ΔE*ab for the patches in group II is 25.9, which is extremely high compared with the other groups. The color difference for each patch is decomposed into the lightness difference ΔL* and the chromaticity shift in the a*b* plane in Figs. 13(a) and (b), respectively. The results for the seven colors in group II (mr, oy, r, y, m, w, and n8) show comparatively large values for both the lightness difference in Fig. 13(a) and the chromaticity shift in Fig. 13(b). Interestingly, for the lightness difference shown in Fig. 13(a), the pair of colors mr and n8 shows similar values of opposite sign, as does the pair oy and r, and the set of three colors y, w, and m shows a similar result. These results were observed because the reflectance curves of these colors in the wavelength range from 700 to 900 nm are almost indistinguishable, as can be seen in Fig. 12(b).

    [TABLE 2.] The color difference between original and recovered colors

    The average color difference ΔE*ab for all 24 patches is 11.1. However, if the group II patches whose reflectance curves are indistinguishable are disregarded, the average color difference ΔE*ab is reduced to 4.2, a value within the acceptability tolerance for complex images on a display [19, 20].

    V. CONCLUSION

    An algorithm to recover the colors of objects from multiple near-IR images is proposed. The CIE color coordinates L*, a*, b* of the objects are recovered from a series of gray images captured under multiple spectral near-IR illuminations using polynomial regression. The feasibility of the proposed algorithm is confirmed experimentally using the 24 color patches of the ColorChecker. The average color difference obtained for all 24 patches is 11.1 in ΔE*ab units. If the seven patches with high values are disregarded, the average color difference ΔE*ab is reduced to 4.2, which is within the acceptability tolerance for complex images on a display.

    In the next study, the feasibility test will be extended to a larger number of color patches using another color rendition chart, for example the 169 matte-paint colors of the ColorChecker DC. The near-IR reflectance behavior of artifacts and natural objects made of various materials should also be investigated in further research. In addition, the spatial uniformity of the illuminator could be enhanced by improving the optical design of components such as the lens and the beam-shaping diffuser [21]. It is expected that multiple spectral illuminations provided by an illuminator with a narrow half-width, such as a laser, used in conjunction with a high-resolution camera, would help to enhance the color image quality.

References
  • 1. Schacter D. L., Gilbert D. T., Wegner D. M. 2009 Psychology
  • 2. Vilaseca M., Pujol J., Arjona M., Verdu F. M. M. 2004 “Color visualization system for near-infrared multispectral images,” [Proc. 2nd CGIV] P.431-436
  • 3. Vilaseca M., de Lasarte J. M., Pujol J., Arjona M., Verdu F. M. M. 2005 “Multispectral system for the reflectance reconstruction and color visualization of natural and manufactured objects in the near-infrared region,” [Proc. AIC Colour 10th Congress] P.499-502
  • 4. Tsagaris V., Anastassopoulos V. 2005 “Fusion of visible and infrared imagery for night color vision,” [Displays] Vol.26 P.191-196
  • 5. Han J., Bhanu B. 2007 “Fusion of color and infrared video for moving human detection,” [J. Elsevier Pattern Recognition] Vol.40 P.1771-1784
  • 6. Qian X., Wang Y., Wang B. 2012 “Effective contrast enhancement method for color night vision,” [J. Elsevier Infrared Phys. & Tech.] Vol.55 P.130-136
  • 7. Chen Z., Wang X., Liang R. 2014 “RGB-NIR multispectral camera,” [J. Opt. Soc. America] Vol.22 P.4985-4994
  • 8. Toet A. 2003 “Natural colour mapping for multiband nightvision imagery,” [J. Info. Fusion] Vol.4 P.155-166
  • 9. Dastjerdi S. R., Ghanaatshoar M., Hattori T. 2013 “Near-infrared subwavelength imaging and focusing analysis of a square lattice photonic crystal made from partitioned cylinders,” [J. Opt. Soc. Korea] Vol.17 P.262-268
  • 10. Nagamune Y. 2014 Korea Patent
  • 11. McCamy C. S., Marcus H., Davidson J. G. 1976 “A color-rendition chart,” [J. Appl. Photographic Engineering] Vol.2 P.95-99
  • 12. Richards A. 2011 Alien Vision
  • 13. Haran T. 2008 “Short-wave infrared diffuse reflectance of textile materials”
  • 14. Berns R. S. 2000 Billmeyer and Saltzman’s Principles of Color Technology
  • 15. Lillesand T., Kiefer R. W., Chipman J. 2007 Remote Sensing and Image Interpretation
  • 16. Kang H. R. 2006 Computational Color Technology
  • 17. Kim Y. J., Luo M. R. 2005 “Characterization of a LCD colour monitor using a digital still camera,” [Proc. AIC Colour 10th Congress] P.295-298
  • 18. Westland S., Ripamonti C., Cheung V. 2012 Computational Colour Science Using MATLAB
  • 19. Kim A., Kim H. S., Park S. O. 2011 “Measuring of the perceptibility and acceptability in various color quality measures,” [J. Opt. Soc. Korea] Vol.15 P.310-317
  • 20. Kim D. H., Kim H. S., Park S. O., Kim Y. J. 2003 “Perceptual quality of still images,” [Proc. 25th Session of the CIE] P.8-14
  • 21. Joo B. Y., Ko J. H. 2013 “Analysis of color uniformity of white LED lens packages for direct-lit LED backlight applications,” [J. Opt. Soc. Korea] Vol.17 P.506-512
Images / Tables
  • [ FIG. 1. ]  Reflectance curves for 4 color patches painted with red, green, yellow and blue paint.
  • [ FIG. 2. ]  Flowchart of the color recovery process from NIR1, NIR2, NIR3 to L*, a*, b*.
  • [ FIG. 3. ]  Experimental configuration.
  • [ FIG. 4. ]  (a) Schematic diagram for one of the 6 modules and (b) the assembled multiple spectral near-IR illuminator with six modules.
  • [ FIG. 5. ]  The spectral power distributions of the three types of LEDs: 700, 780 and 860 nm.
  • [ FIG. 6. ]  The captured images of the white balance card under the three types of LEDs: (a) 700 nm, (b) 780 nm, (c) 860 nm.
  • [ FIG. 7. ]  Photograph of the ColorChecker captured under visible light.
  • [ FIG. 8. ]  The reflectance curves for all 24 patches in the wavelength range from 400 to 1100 nm.
  • [ TABLE 1. ]  The X, Y, Z values for Illuminant D65 and the derived L*, a*, b* values of the 24 patches.
  • [ FIG. 9. ]  A set of monochrome images of the target color objects obtained experimentally under the illumination of (a) P(λ)700, (b) P(λ)780 and (c) P(λ)860.
  • [ FIG. 10. ]  Examples of composite images using the simple combination method; NIR700, NIR780, NIR860 are assigned to (a) (R, G, B), (b) (G, B, R) or (c) (B, R, G).
  • [ FIG. 11. ]  A color image recovered from the three near-IR images using the proposed algorithm.
  • [ TABLE 2. ]  The color difference between original and recovered colors.
  • [ FIG. 12. ]  The reflectance curves of the target color objects belonging to each group: (a) group I, (b) group II and (c) group III.
  • [ FIG. 13. ]  (a) The color difference in lightness ΔL* and (b) the chromaticity shift in the a*b* plane.