
Fore-Screen Display and Manipulation for Virtual World Interaction

IP.com Disclosure Number: IPCOM000104055D
Original Publication Date: 1993-Mar-01
Included in the Prior Art Database: 2005-Mar-18
Document File: 2 page(s) / 95K

Publishing Venue

IBM

Related People

Ling, DT: AUTHOR [+5]

Abstract

Disclosed is a system that produces a computer-generated virtual world embedded in the real world so that real-world objects can be shared with the virtual world. By use of fore-screen stereoscopic display, user position and movement sensing, and computer modeling of the user's environment, the system produces virtual objects that appear to the user to lie in coincidence with real, accessible objects in front of the screen. By sensing the user's hand motion, the system facilitates manipulation of the virtual objects by the user's real hands. The system supports multiple-user interaction.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.

      The computation subsystem (a) in the figure is programmed to
produce a stereoscopic image on the display screen (b) with the
left-view/right-view disparity adjusted so that the operator (c)
perceives that the computer-generated three-dimensional scene (d)
lies in front of the screen.  The operator wears stereoscopic
eyeglasses (e) to direct the left and right views to the appropriate
eyes, a head position sensor (f) to transmit head position
information to the computation subsystem, and hand position and
configuration sensors (g) to transmit to the computation subsystem
information on the positions and orientations of the hands and finger
joints.
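      The fore-screen geometry above can be sketched briefly.  The
following fragment is illustrative only (the function names,
coordinate conventions, and sample distances are assumptions, not
part of the disclosure): for eyes at distance viewer_dist from the
screen, a point nearer than the screen projects with crossed
(negative) parallax, which is the left-view/right-view disparity
adjustment described above.

```python
# Sketch of the fore-screen disparity geometry.  All names and the
# sample distances below are illustrative assumptions.

def screen_parallax(point_dist, viewer_dist, eye_sep):
    """On-screen parallax (right-eye x minus left-eye x) of a point on
    the viewer's centre axis.

    point_dist  -- distance from the eyes to the point
    viewer_dist -- distance from the eyes to the display screen
    eye_sep     -- interocular separation

    A negative (crossed) value means the operator perceives the point
    in front of the screen -- the fore-screen condition.
    """
    return eye_sep * (1.0 - viewer_dist / point_dist)

def stereo_pair_x(point_dist, viewer_dist, eye_sep):
    """On-screen x coordinates (left view, right view) of the point,
    with x = 0 midway between the eyes.  For a fore-screen point the
    left-eye image lies to the right of the right-eye image."""
    half = 0.5 * eye_sep * (viewer_dist / point_dist - 1.0)
    return half, -half

# A point 40 cm from the viewer, screen 60 cm away, 6.5 cm eye
# separation: the parallax is negative, so the point appears in front
# of the screen.
parallax = screen_parallax(40.0, 60.0, 6.5)
```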

      The hand sensor assembly can also include tactile feedback
equipment so that the operator can experience the sensation of touch.
The information transmitted by the hand sensors and received by the
tactile feedback apparatus may be more or less detailed, depending on
the extent to which the computer program must model the hands, which
in turn depends on the application.
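      The variable level of detail can be made explicit in the record
the sensors feed to the computation subsystem.  The sketch below is a
hypothetical illustration (the record layout and names are
assumptions, not from the disclosure): a coarse model carries only
hand position and orientation, while a fine model adds per-joint
angles for applications that must simulate grasping.

```python
# Hypothetical hand-state record; field names and the coarse/fine
# distinction are illustrative assumptions, not from the disclosure.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class HandState:
    position: Tuple[float, float, float]        # hand position in real space
    orientation: Tuple[float, float, float]     # e.g. roll, pitch, yaw
    joint_angles: Optional[List[float]] = None  # per-joint data; omitted
                                                # in a coarse model

    def detail_level(self) -> str:
        """A coarse model suffices for pointing; a fine model is needed
        when the program must simulate the fingers grasping a virtual
        object."""
        return "fine" if self.joint_angles else "coarse"
```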

      The head position information is used by the computation
subsystem to alter the stereoscopic image when the head moves so that
the virtual objects in the scene remain, as perceived by the
operator, in the same positions in real space.  The hand position and
configuration information is used by the computation subsystem to
maintain an internal model of the hands for simulating the
interaction of the hands with virtual objects, for which internal
models are also maintained.  The internal models of the hands are
continuously updated as the real hands move so that they are, in
principle, coincident with the real hands.  Similarly, the computer...