A Tangible Floating Display System for Interaction
  • CC BY-NC (Non-commercial)
ABSTRACT

A tangible floating display that can provide different perspective views without special glasses is introduced. The proposed system can display perspective floating images in the space in front of the system with the help of concave mirrors. To avoid requiring special equipment for interaction and to deliver the sense of touch, the proposed system adopts an ultrasound focusing technology. To provide an immersive experience to viewers, the proposed system consists of a tangible floating display system and a multiple-view imaging system based on three lenticular displays placed in front of the users.


KEYWORDS
Floating display, Lenticular display, Ultrasound technology
    I. INTRODUCTION

    Since the successful development of stereoscopic three-dimensional (3D) displays, studies on autostereoscopic 3D displays and holography have been actively conducted [1-3]. The ultimate objective of such research is to provide viewers with a tangible 3D display. Realistically, however, several constraints still limit the implementation of such a display. First, because of the large amount of information to be processed, the required processing speed, and the lack of commercially usable display panels for autostereoscopic 3D display, implementation remains in the research phase or in the early commercial phase. Moreover, some people remain averse to stereoscopic 3D displays; one of the main reasons is the discomfort of wearing special gadgets.

    From a system point of view, there are several requirements for providing realistic and tangible displays. First, it is necessary to display different perspective images according to the viewing direction, and these perspective images should protrude in front of the display panel to make interaction with users easier. Second, the system should detect finger or hand gestures to identify the user's movement, and the detected motion should change the aforementioned perspective images according to the user's intention. Finally, an appropriate sense of touch should be provided to the user's fingers or hands so that he or she can experience immersive images [4].

    To address this need, we propose a tangible floating display for interaction using integrated displays and ultrasound transducers. A schematic representation of our tangible floating display system is shown in Fig. 1. To provide high-resolution directional-view images that multiple users can view without special glasses, we integrated three lenticular lens displays and five floating displays into the proposed system. The proposed system has two attached infrared (IR) cameras to detect the user's motions and is built to interact with display contents through predefined hand gestures. In addition, an ultrasound transducer array is designed to deliver tactile stimulation at a specific position when the user interacts with the directional-view images projected by the floating display.

    II. PROPOSED METHOD

    The proposed method consists of a display system in which a lenticular lens display and a floating display are combined, a hand gesture recognition system for interaction, and a spatial tactile system that provides tactile stimulation. First, various types of 3D displays can be considered for implementing a tangible display. A holographic display, which provides the most natural 3D images, and a volumetric display based on a rotating screen can provide relatively immersive 3D images [5, 6]. However, an existing holographic display requires a massive amount of information to be processed and is limited in providing dynamic images. A volumetric display can be a suitable candidate for a tangible display; however, users cannot interact with its images by touching them, since the images are displayed on a rotating screen. A multi-view display using a parallax barrier or a lenslet array is also limited for implementing immersive images or interaction, since its resolution is relatively low [7, 8]. On the other hand, a floating display projects clear two-dimensional (2D) images into space; it has been applied as a "Pseudo Hologram" in art performance and mass-media advertisement rather than in the 3D display field [9]. Among the various display methods mentioned above, we selected a floating display, which provides clear images and is suitable for interaction in a tangible display system.

    Therefore, we adopted lenticular lens displays and floating displays for the proposed display system, as shown in Fig. 2. The three lenticular lens displays, located in the upper part so that multiple viewers can observe images, are used for the selection of image contents by viewers rather than for user interaction or tangible display. The floating display, one of the core elements of a tangible display, was designed so that viewers first select contents using a lenticular lens display and then touch the image contents directly, as shown in Fig. 2.

    A floating display can be implemented using a concave mirror, which is optically equivalent to a floating lens. The location and the size of the floating image can be derived using simple geometrical optics, as shown in Fig. 3. In the paraxial approximation, the Jones transfer matrix of the floating display along the light propagation path is given by Eq. (1), where x1, x2, x3, and x4 denote the locations of the original image, the lens, the floating image, and the viewer's eye, respectively; θ1, θ2, θ3, and θ4 denote the corresponding angles; and r represents the radius of curvature of the concave mirror. h1, h2, and h3 denote the sizes of the original image, the floating lens, and the floating image, respectively, and h4 denotes the distance from the viewer's eye. From Eq. (1), the location at which the floating image is projected and its size are determined [10]. The horizontal viewing angle covered by a single floating display is 45°, and five floating displays are arranged so that five viewers can simultaneously observe floating images over 225°. The five floating displays can also present content continuously, as if the displays interacted with one another; in the present configuration, however, they cannot provide different perspective views, because a single floating display covers too wide an angle. This could be partially resolved by reducing the viewing angle of each floating display. Each floating display, consisting of a concave mirror and an original image source, is mounted in the lower part of the system so that viewers see only the floating images.
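
    As a rough numerical illustration of this relationship, the following sketch (ours, not part of the original paper) treats the concave mirror as a paraxial focusing element with focal length r/2 and computes the floating-image distance and magnification with standard ray-transfer (ABCD) matrices; the object distance and mirror radius used below are assumed values, not the parameters of the actual system.

        import numpy as np

        def propagate(d):
            # Free-space propagation over a distance d (paraxial ray-transfer matrix).
            return np.array([[1.0, d], [0.0, 1.0]])

        def concave_mirror(r):
            # A concave mirror of radius r acts as a focusing element with f = r / 2.
            return np.array([[1.0, 0.0], [-2.0 / r, 1.0]])

        def floating_image(d_obj, r):
            # Floating-image distance and magnification from the thin-mirror
            # equation 1/d_img + 1/d_obj = 2/r.
            f = r / 2.0
            d_img = 1.0 / (1.0 / f - 1.0 / d_obj)
            return d_img, -d_img / d_obj

        # Assumed example: original image 0.35 m from a mirror with r = 0.48 m.
        d_obj, r = 0.35, 0.48
        d_img, m = floating_image(d_obj, r)
        # The imaging condition corresponds to a vanishing B element of the
        # system matrix: image plane <- mirror <- object plane.
        system = propagate(d_img) @ concave_mirror(r) @ propagate(d_obj)
        print(f"floating image at {d_img:.3f} m, magnification {m:.2f}")
        print(f"imaging condition (B ~ 0): {abs(system[0, 1]) < 1e-9}")

    In the proposed system, this kind of calculation fixes where the original image must be placed below the concave mirror so that the floating image forms at the intended interaction position.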

    We also developed a technique that focuses the stimulation on a specific location using ultrasound transducers, so that tactile stimulation is felt at the location where the aforementioned floating image is projected. First, the focal point where the floating image is projected in space is set, and the distance from this focus to each ultrasound transducer is obtained; the resulting distance difference (d_diff) produces a phase difference (ψ_diff) for each transducer. The compensating phase of each transducer is then calculated from d_diff, the sound velocity v, and the resonance frequency f of the transducer, so that the phase difference (ψ_diff) is removed and all waves arrive at the focus with the same phase [11]. Here, λ = v/f denotes the wavelength of the ultrasound wave, d_xy denotes the distance from the transducer at (x, y, 0) to the focus, and d_min is the minimum of these distances. Each module comprised 5×10 transducers, and eight modules in total were used to increase the tactile pressure.
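
    The sketch below is a minimal illustration of this focusing step (ours, not the authors' implementation): it assumes a flat 5×10 module of transducers at z = 0, a 40 kHz resonance frequency, and a hypothetical focal point, and computes the compensating phase of each transducer from its extra path length to the focus.

        import numpy as np

        SOUND_SPEED = 343.0               # m/s, speed of sound in air (assumed)
        FREQ = 40e3                       # Hz, typical transducer resonance (assumed)
        WAVELENGTH = SOUND_SPEED / FREQ   # lambda = v / f

        def phase_compensation(transducer_xy, focus):
            # Phase (rad) to apply to each transducer at (x, y, 0) so that all
            # emitted waves arrive at the focal point in phase.
            positions = np.column_stack([transducer_xy, np.zeros(len(transducer_xy))])
            d_xy = np.linalg.norm(positions - focus, axis=1)   # distance to focus
            d_min = d_xy.min()                                 # nearest transducer
            d_diff = d_xy - d_min                              # extra path length
            psi_diff = 2.0 * np.pi * d_diff / WAVELENGTH       # resulting phase lag
            return (-psi_diff) % (2.0 * np.pi)                 # compensating phase

        # Hypothetical 5 x 10 module with 10 mm pitch, focusing 20 cm above its center.
        xs, ys = np.meshgrid(np.arange(10) * 0.01, np.arange(5) * 0.01)
        grid = np.column_stack([xs.ravel(), ys.ravel()])
        grid -= grid.mean(axis=0)                              # center the module
        phases = phase_compensation(grid, focus=np.array([0.0, 0.0, 0.20]))
        print(phases.reshape(5, 10).round(2))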

    A vision camera system with IR emission units was used to control the image contents and the tactile stimulation according to the location of the user's fingers or hand gestures. The vision camera was designed to recognize the user's hand gestures in a general time-of-flight (TOF) mode. Four hand gestures (left, right, enter, and backward navigation) were recognized from the hand's motion or angle through the steps shown in Fig. 5. The vision cameras were located at the center of the lenticular display system and underneath the floating display.
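
    As a rough sketch of such a pipeline (our illustration of the segmentation, detection, and classification steps of Fig. 5, not the actual implementation), the code below segments the hand in a TOF depth frame by thresholding and maps the centroid trajectory to one of the four navigation gestures; the depth band, pixel counts, and gesture rules are assumed values.

        import numpy as np

        NEAR_MM, FAR_MM = 300, 900          # assumed depth band (mm) containing the hand

        def hand_centroid(depth_frame):
            # Segment the hand by depth thresholding and return its pixel centroid.
            mask = (depth_frame > NEAR_MM) & (depth_frame < FAR_MM)
            if mask.sum() < 200:            # too few pixels: no hand detected
                return None
            ys, xs = np.nonzero(mask)
            return np.array([xs.mean(), ys.mean()])

        def classify_gesture(centroids, min_shift=40.0):
            # Map the centroid trajectory of one gesture to left / right / enter / backward.
            track = [c for c in centroids if c is not None]
            if len(track) < 2:
                return None
            dx, dy = track[-1] - track[0]   # overall displacement in pixels
            if abs(dx) >= abs(dy) and abs(dx) > min_shift:
                return "right" if dx > 0 else "left"
            if abs(dy) > min_shift:
                return "enter" if dy > 0 else "backward"
            return None

        # Hypothetical usage with a sequence of 320x240 depth frames from the TOF camera.
        frames = [np.full((240, 320), 1500, dtype=np.uint16) for _ in range(5)]
        for i, f in enumerate(frames):      # synthetic hand patch moving to the right
            f[100:140, 60 + 40 * i:100 + 40 * i] = 600
        print(classify_gesture([hand_centroid(f) for f in frames]))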

    III. EXPERIMENTAL RESULTS & DISCUSSION

    To demonstrate the proposed system, we integrated three lenticular lens displays, a vision camera system with IR emission units, five floating displays, and tactile devices based on ultrasound transducers, as shown in Fig. 6. Figure 6(a) shows the complete configuration of the proposed system, and Fig. 6(b) shows the ultrasound transducers embedded in the system. A 24-inch lenticular lens display with nine views was used as the display device, and each unit was controlled by a PC connected to a 5-channel speaker and a TOF camera. For the floating displays, 15-inch high-brightness LCD monitors were used. Their images were reflected by the floating mirror located at the position calculated via Eq. (1) and observed by viewers through the polarization glass located at the center of the entire system. A 37-inch parabolic mirror with a focal length of 9.3 inches was used, with a 35-cm-diameter central area cut out, and five 15-inch LCD monitors were used in all. Each floating display had a 45° field of view; the diameter of the lower system was 90 cm, and that of the polarization glass was 30 cm. For more natural user interactivity, five 10-inch tablet PCs were used, and three types of contents (sea, cars, and musical instruments) were provided by the system. Once a viewer selected one of the contents on a lenticular display, the corresponding image was also displayed through the floating display. Although the images were not connected volumetrically, viewers could enjoy vivid floating images because each image could be magnified, demagnified, and rotated according to the viewer's gestures.
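
    As a quick sanity check of this geometry, the reported 9.3-inch focal length can be plugged into the mirror equation from Section II; the LCD-to-mirror distance below is an assumed value for illustration only, since the actual spacing is not reported.

        # Floating-image distance for the reported focal length (9.3 in = 0.236 m),
        # with an assumed LCD-to-mirror distance of 0.35 m (illustrative only).
        f = 9.3 * 0.0254
        d_obj = 0.35
        d_img = 1.0 / (1.0 / f - 1.0 / d_obj)
        print(f"floating image ~{d_img:.2f} m from the mirror, magnification {d_img / d_obj:.2f}")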

    Figure 7 shows the experimental results for the intended usage scenarios. A hand gesture acquired through the front TOF camera was recognized, and the requested contents were selected accordingly. The selected contents were displayed on the floating display located at the lower end, and tactile stimulation was delivered to the users via the ultrasound transducers installed around the floating display. Five users could simultaneously enjoy different floating images by means of the five floating displays.

    IV. CONCLUSION

    We proposed a tangible floating display for providing a user interaction experience. The proposed display system consists of three lenticular lens displays, five floating displays, a custom-designed ultrasound transducer system, and two TOF cameras with IR units. Each TOF camera detects finger motions or hand gestures, according to which the display contents and the focal point of the tactile stimulation are changed. An integrated system was built to verify the validity of the proposed method, and the experimental results show that it provides different perspective views and tactile stimulation at the calculated positions. We expect this system to find applications in the advertisement, game, e-training, and immersive digital signage industries.

References
  • 1. Lee B. 2013 “Three-dimensional displays, past and present,” [Phys. Today] Vol.66 P.36-41
  • 2. Saveljev V., Kim S.-K. 2013 “Reference functions for synthesis and analysis of multiview and integral images,” [Journal of the Optical Society of Korea] Vol.17 P.148-161
  • 3. Kim Y., Hong K., Yeom J., Hong J., Jung J.-H., Lee Y. W., Park J.-H., Lee B. 2012 “A frontal projection-type three-dimensional display,” [Opt. Express] P.20130-20138
  • 4. Kim Y. 2013 “Tangible 3D display for interaction,” [Proc. The 13th International Meeting on Information Display (IMID 2013)] P.127
  • 5. Tahara T., Ito Y., Lee Y., Xia P., Inoue J., Awatsuji Y., Nishio K., Ura S., Kubota T., Matoba O. 2013 “Multiwavelength parallel phase-shifting digital holography using angular multiplexing,” [Opt. Lett.] Vol.38 P.2789-2791
  • 6. Jones A., McDowall I., Yamada H., Bolas M., Debevec P. 2007 “Rendering for an interactive 360° light field display,” [ACM Transactions on Graphics (TOG)] Vol.26 P.40-49
  • 7. Liao H., Iwahara M., Hata N., Dohi T. 2004 “High-quality integral videography using a multiprojector,” [Opt. Express] Vol.12 P.1067-1076
  • 8. Lanman D., Hirsch M., Kim Y., Raskar R. 2010 “Content-adaptive parallax barriers: Optimizing dual-layer 3D displays using low-rank light field factorization,” [ACM Transactions on Graphics (TOG)] Vol.29 P.163-172
  • 9. Son J.-Y., Lee C.-H., Chernyshov O. O., Lee B.-R., Kim S.-K. 2013 “A floating type holographic display,” [Opt. Express] Vol.21 P.20441-20451
  • 10. Yeh P. 1982 “Extended Jones matrix method,” [J. Opt. Soc. Am.] Vol.72 P.507-513
  • 11. Iwamoto T., Tatezono M., Hoshi T., Shinoda H. 2008 “Airborne ultrasound tactile display,” [ACM SIGGRAPH 2008 New Tech Demos] Vol.1
Images / Tables
  • [ FIG. 1. ]  Schematic diagram of the tangible floating display system.
  • [ FIG. 2. ]  (a) A photo of our proposed system and (b) a system block diagram of our proposed system.
  • [ FIG. 3. ]  (a) Geometrical arrangement considering floating display and (b) image formation by a thin lens for Jones transfer function.
  • [ FIG. 4. ]  Focusing of ultrasound wave by ultrasound transducers: (a) a schematic of ultrasound transducer system, (b) distance estimation between focal point (x, y, z) and each transducer, and (c) phase alignment of ultrasound wave by considering phase difference among transducers.
  • [ FIG. 5. ]  Hand gesture recognition process (segmentation, detection, and template matching).
  • [ FIG. 6. ]  (a) A photo of experimental setup and (b) ten sets of ultrasound transducer module.
  • [ FIG. 7. ]  Experimental results: (a) contents selection by hand gesture (sea, car, and instrument), (b) a selected floating image with tablet PC, (c) a panoramic image (continuous) when sea contents was selected, and (d) a photo of touching instrument (with tactile expression).