Since the successful development of the stereoscopic three-dimensional (3D) display, studies on autostereoscopic 3D displays and holography have been actively conducted [1-3]. The ultimate objective of such research is to provide viewers with a tangible 3D display. Realistically, however, several constraints limit the implementation of such a display. First, because of limits on the amount of information to be processed and on the processing speed of an autostereoscopic 3D display, together with the lack of commercially usable display panels for 3D display, implementations remain in the research phase or in the early commercial phase. Moreover, some people still object to stereoscopic 3D displays; one of the main reasons is the discomfort of wearing special gadgets.
From a systematic point of view, there are several requirements for providing realistic and tangible displays. First, it is necessary to display different perspective images according to the viewing direction; these perspective images should protrude in front of the display panel to make interaction with users easier. Second, the system should detect a finger or hand gesture to identify the user's movement, and the detected motion should change the aforementioned perspective images according to the user's intention. Finally, an appropriate sense of touch should be provided to the user's fingers or hands so that he/she can experience immersive images.
As a complementary effort to address these needs, we propose a tangible floating display for interaction using integrated displays and ultrasound transducers. A schematic representation of our tangible floating display system is shown in Fig. 1. To provide high-resolution directional-view images that multiple users can view without any special glasses, we integrated three lenticular lens displays and five floating displays into the proposed system. The proposed system has two attached infrared (IR) cameras to detect the user's motions and is built to interact with display contents through predefined hand gestures. In addition, an ultrasound transducer array is designed to provide tactile stimulation at a specific position when the user interacts with directional-view images projected in the floating display.
The proposed method consists of a display system in which a lenticular lens display and a floating display are combined, a hand gesture recognition system for interaction, and a spatial tactile system that provides tactile stimulation. First, various types of 3D displays can be considered for implementing a tangible display. A holographic display, which provides the most natural 3D images, and a volumetric display based on a rotating-screen method can provide relatively immersive 3D images [5, 6]. However, an existing holographic display requires a massive amount of information to be processed and is limited in providing dynamic images. A volumetric display can be a suitable candidate for a tangible display; however, users cannot interact with its images by touching them, since the images are displayed on a rotating screen. A multi-view display using a parallax barrier or a lenslet array is also limited in implementing immersive images or interaction, since its resolution is relatively low [7, 8]. On the other hand, a floating display projects clear two-dimensional (2D) images into space; it has been applied as a "pseudo hologram" in art performances and mass-media advertisements rather than in the 3D display field. Among the above-mentioned display methods, we selected a floating display, which provides clear images and is suitable for interaction in a tangible display system.
Therefore, we adopted lenticular lens displays and floating displays for the proposed display system, as shown in Fig. 2. The three lenticular lens displays, located in the upper position so that a number of viewers can observe the images, are used for viewers to select image contents rather than to facilitate user interaction or tangible display. The floating display, one of the core elements of a tangible display, was designed so that viewers first select contents using a lenticular lens display and then touch the image contents directly, as shown in Fig. 2.
A floating display can be implemented using a concave mirror that is optically equivalent to a floating lens. The location and the size of the floating image can be derived using simple geometrical optics, as shown in Fig. 3. In the paraxial approximation, the ray transfer matrix of the floating display along the light propagation path is expressed as the product of the translation matrices for the object and image distances and the matrix of the concave mirror.
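This paraxial relation can be sketched numerically. The following is a minimal illustration (not the paper's code): the function names and the sample distances are our own, and the concave mirror is modeled by the standard ABCD matrix with focal length f = R/2; at the image plane the B element of the system matrix vanishes and the A element equals the lateral magnification.

```python
import numpy as np

def translation(d):
    """Free-space propagation over distance d (paraxial ABCD matrix)."""
    return np.array([[1.0, d], [0.0, 1.0]])

def concave_mirror(f):
    """Concave mirror of focal length f (f = R/2), acting as a floating lens."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def floating_image(f, d_obj):
    """Image distance and lateral magnification for an object at distance
    d_obj in front of a concave mirror of focal length f, from the mirror
    equation 1/d_img = 1/f - 1/d_obj."""
    d_img = 1.0 / (1.0 / f - 1.0 / d_obj)
    return d_img, -d_img / d_obj

def system_matrix(f, d_obj, d_img):
    """ABCD matrix from object plane to image plane:
    translation -> mirror -> translation (right-to-left order)."""
    return translation(d_img) @ concave_mirror(f) @ translation(d_obj)

# Example: object at 1.5f -> real image at 3f, magnification -2.
d_img, m = floating_image(1.0, 1.5)
M = system_matrix(1.0, 1.5, d_img)
print(d_img, m, M[0, 1])   # B element is ~0 at the image plane
```

With the object beyond the focal length, the image is real and floats in front of the mirror, which is what the system exploits.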
[FIG. 4.] Focusing of an ultrasound wave by ultrasound transducers: (a) a schematic of the ultrasound transducer system, (b) distance estimation between the focal point (x, y, z) and each transducer, and (c) phase alignment of the ultrasound wave by considering the phase differences among transducers.
We also developed a technology that focuses stimulation on a specific location using ultrasound transducers, so that tactile stimulation is felt at the location where the aforementioned floating image is projected. First, the focal point where the floating image is projected in space is set, and then the distance difference between the focal point and each transducer is obtained; the driving phase of each transducer is aligned to compensate for the resulting path-length differences, as illustrated in Fig. 4.
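The phase-alignment step in Fig. 4 can be sketched as follows. This is an illustrative calculation, not the paper's implementation: the 40 kHz carrier frequency, the speed of sound, and the function names are assumptions. Each transducer's phase offset compensates its path-length difference relative to the farthest transducer, so that all wavefronts arrive at the focal point in phase.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air (assumed)
FREQ = 40_000.0          # Hz, a typical airborne ultrasound transducer (assumed)

def phase_delays(transducers, focus):
    """Per-transducer phase offsets (radians) so that all waves arrive
    at the focal point in phase.  transducers: list of (x, y, z) positions;
    focus: (x, y, z) focal point."""
    wavelength = SPEED_OF_SOUND / FREQ
    dists = [math.dist(t, focus) for t in transducers]
    d_max = max(dists)
    # Shorter paths are delayed so every wavefront arrives together.
    return [(2 * math.pi * (d_max - d) / wavelength) % (2 * math.pi)
            for d in dists]

# Four transducers at the corners of a 10 cm square, focus 10 cm above center:
corners = [(0.05, 0.05, 0.0), (-0.05, 0.05, 0.0),
           (-0.05, -0.05, 0.0), (0.05, -0.05, 0.0)]
print(phase_delays(corners, (0.0, 0.0, 0.1)))  # symmetric -> all delays zero
```

Moving the focal point therefore only requires recomputing these phases, which is how the stimulation can track the floating image.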
A vision camera system with IR emission units was used to apply image contents or tactile stimulation according to the location of the user's fingers or hand gestures. The vision camera was designed to recognize a user's hand gesture in a general time-of-flight (TOF) mode. Four of the user's hand gestures (left, right, enter, and backward navigation) were recognized according to the hand's motion or angle, following the steps in Fig. 5. The vision camera was located at the center of the lenticular display system and underneath the floating display.
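The four-way navigation can be sketched as a simple classification of the palm trajectory reported by the TOF camera. This is only an illustrative sketch of the idea, not the recognition pipeline of Fig. 5: the axis convention (x lateral, z toward the camera) and the threshold value are assumptions.

```python
def classify_gesture(trajectory):
    """Classify a hand trajectory (list of (x, y, z) palm positions in
    metres from a TOF camera) into left/right/enter/backward navigation.
    Threshold and axes are illustrative assumptions."""
    dx = trajectory[-1][0] - trajectory[0][0]   # net lateral motion
    dz = trajectory[-1][2] - trajectory[0][2]   # net motion along depth axis
    THRESH = 0.10                               # metres, assumed

    if abs(dz) > abs(dx):                       # depth motion dominates
        if dz < -THRESH:
            return "enter"                      # hand pushed toward camera
        if dz > THRESH:
            return "backward"                   # hand pulled away
        return "none"
    if dx > THRESH:
        return "right"
    if dx < -THRESH:
        return "left"
    return "none"

print(classify_gesture([(0.0, 0.0, 0.5), (0.2, 0.0, 0.5)]))  # "right"
```

A real pipeline would first segment the hand from the depth map and smooth the trajectory; the decision rule itself stays this simple.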
To demonstrate the proposed system, we integrated three lenticular lens displays, a vision camera system with IR emission units, floating displays, and tactile devices using ultrasound transducers, as shown in Fig. 6. Figure 6(a) shows the complete configuration of the proposed system, and Fig. 6(b) shows the ultrasound transducers buried in the system. A 24-inch lenticular lens display with 9 views was used as the display device. Each unit was controlled by a PC connected to a 5-channel speaker and a TOF camera. For the floating displays, 15-inch high-brightness LCD monitors were used. Their images were viewed through the polarization glass located at the center of the entire system, as the images were reflected by the floating mirror located at the position calculated via Equation (1). A 37-inch parabolic mirror with a focal length of 9.3 inches was cut to a diameter of 35 cm in its central area, and in all, five 15-inch LCD monitors were used. Each floating display had a 45° field of view; the diameter of the lower system was 90 cm, while that of the polarization glass was 30 cm. For more natural user interactivity, five 10-inch tablet PCs were used, and three types of contents (sea, cars, and musical instruments) were displayed by the system. Once a viewer selected one of the contents on the lenticular displays, the image content was also displayed through the floating display. Although the images were not connected volumetrically, viewers could enjoy vivid floating images, because each image could be magnified, demagnified, and rotated according to the viewer's gestures.
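As a quick consistency check of this geometry, the stated 9.3-inch focal length can be plugged into the mirror equation. The LCD object distance below is an assumed illustrative value (1.5f), not a figure from the paper; it simply shows how the image location and magnification follow from the stated focal length.

```python
# Worked check of the floating-image geometry with the stated focal length.
INCH = 2.54                  # cm per inch
f = 9.3 * INCH               # concave-mirror focal length, ~23.6 cm (stated)
d_obj = 1.5 * f              # assumed LCD distance; beyond f -> real image

d_img = 1.0 / (1.0 / f - 1.0 / d_obj)   # mirror equation
mag = -d_img / d_obj                    # lateral magnification

print(f"image floats at {d_img:.1f} cm, magnification {mag:.2f}")
# An object at 1.5f images to 3f (~70.9 cm) with magnification -2.
```

Placing the LCD closer to f pushes the image farther out and enlarges it, which sets the trade-off between image size and the reach of the viewer's hand.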
Figure 7 shows the experimental results for the intended scenarios. A hand gesture acquired through the front-mounted TOF camera was recognized, and the requisite contents were selected accordingly. The selected contents were displayed in the floating display located at the lower end, and tactile stimulation was delivered to users via the ultrasound transducers installed around the floating display. Five users could simultaneously enjoy different floating images by means of the five floating displays.
We proposed a tangible floating display for providing a user interaction experience. The proposed display system consists of three sets of lenticular lens displays, five sets of floating displays, a designed ultrasound transducer system, and two TOF cameras using IR units. Each TOF camera detects the motion of a finger or a hand gesture, and the display contents and the focusing point of the tactile expression are changed accordingly. An integrated system was presented to verify the validity of the proposed method, and the experimental results show that the proposed system provides different perspective views and tactile expressions at the calculated positions. We expect this system to have a number of applications in the advertisement, game, e-training, and immersive digital signage industries.