DIVERSE, Phase 2
Adding Haptic Devices to Networked Virtual Environments
Lance Arsenault, John Kelso
Principal Investigators
University Visualization and Animation Group
Virginia Tech
Blacksburg, VA
August 31, 2001
1. Background
In most immersive environments users typically feel a bit like ghosts. They
can freely move through objects without feeling them. Gravity itself is
usually absent, and the user can effortlessly fly through the virtual world.
If the user reaches out to touch something, their hand and body pass through
the object without hindrance.
In many applications this loss of haptic feedback limits the usefulness of
the immersive experience and hinders the user from fully interacting with
the data they are exploring. For example, getting a sense of the viscosity
of a fluid or the density of a material can be greatly enhanced, in a
natural and intuitive manner, by haptic feedback.
There are devices currently available which provide this missing haptic
feedback. Unfortunately, applications using them do not scale: an
application that uses a haptic device on a desktop will not run, unchanged,
with the same haptic device in an immersive environment.
2. Tasks and Objectives
To address these issues, we extended DIVERSE to include support for a
Phantom haptic device. We support this device in both local and networked
configurations; in the networked case the haptic device is not directly
connected to the graphics system. We also amplified the advantages offered
by haptic and other hardware devices by creating a set of graphical
interaction and feedback tools that provide uniform interaction and
feedback across all platforms.
3. Software
We feel that making software incorporating haptic devices available in a
networked, scalable form takes better advantage of what is currently an
expensive hardware resource. The ability to develop a haptic application at
a desktop workstation and then run the same software, with the same haptic
device, in an immersive environment optimizes the use of expensive haptic
and immersive resources.
Our approach to implementing this was:
- Attach a Phantom haptic device to a PC running Windows.
- Install VRPN on the PC to allow the Phantom to be accessed through the
network.
- Install VRPN and DIVERSE on a Linux box.
- Run a program on the Linux box that acts as an interface between VRPN
and DIVERSE (a sketch of such a bridge program appears after this list).
- Install DIVERSE on an SGI that supports an immersive system, such as an
I-Desk.
- Run application software on the SGI using DIVERSE's remote shared memory
to communicate with the Phantom device.
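For concreteness, the following is a minimal sketch of what the Linux
interface program might look like. It assumes VRPN's vrpn_Tracker_Remote
client interface and DIVERSE's dtkSharedMem read()/write() interface; the
device name "Phantom@phantom-pc", the shared memory segment name "phantom",
and the six-float layout are illustrative choices, and orientation, button,
and force handling are omitted.

    // phantom2dtk.cpp -- illustrative VRPN-to-DIVERSE bridge.
    // Reads Phantom position reports from a VRPN server on the Windows PC
    // and copies them into a local DTK shared memory segment; DIVERSE's
    // remote shared memory facility is assumed to mirror that segment to
    // the SGI.

    #include <vrpn_Tracker.h>
    #include <dtk.h>

    static dtkSharedMem *shm = NULL;   // 6 floats: x, y, z, heading, pitch, roll

    // VRPN reports position as doubles and orientation as a quaternion; this
    // sketch forwards only the position and leaves the orientation zeroed.
    static void handle_tracker(void *, const vrpn_TRACKERCB t)
    {
        float xyzhpr[6] = { (float) t.pos[0], (float) t.pos[1], (float) t.pos[2],
                            0.0f, 0.0f, 0.0f };
        shm->write(xyzhpr);            // copy the whole buffer into shared memory
    }

    int main(void)
    {
        // "Phantom@phantom-pc" is a placeholder device@host name for the PC
        // running the VRPN server.
        vrpn_Tracker_Remote tracker("Phantom@phantom-pc");
        shm = new dtkSharedMem(6 * sizeof(float), "phantom");
        tracker.register_change_handler(NULL, handle_tracker);

        for (;;)                       // callbacks fire from inside mainloop()
            tracker.mainloop();

        return 0;
    }

A full bridge would forward the Phantom's buttons and accept force commands
in the same manner, using VRPN's button and force device interfaces.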
We do not use VRPN to send the Phantom data directly to the SGI because we
wanted to take advantage of features of DIVERSE's remote shared memory
facility.
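On the application side, the Phantom data then looks like any other named
DTK shared memory segment. The sketch below, again with illustrative names,
shows how a program on the SGI might poll it; the mirroring of the segment
from the Linux box to the SGI is configured through DIVERSE's remote shared
memory mechanism and is not shown here.

    // Illustrative reader on the SGI: the application reads the Phantom data
    // from the named shared memory segment written by the bridge program;
    // the fact that the writer runs on another machine is hidden by
    // DIVERSE's remote shared memory facility.

    #include <stdio.h>
    #include <unistd.h>
    #include <dtk.h>

    int main(void)
    {
        dtkSharedMem shm(6 * sizeof(float), "phantom");  // same name the bridge uses
        float xyzhpr[6];

        for (;;)
        {
            shm.read(xyzhpr);          // copy out the most recent buffer
            printf("phantom position: %g %g %g\n",
                   xyzhpr[0], xyzhpr[1], xyzhpr[2]);
            usleep(100000);            // poll at roughly 10 Hz for this demo
        }
        return 0;
    }

In a full application the same read would simply happen once per graphics
frame instead of in a standalone polling loop.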
This software, along with instructions on how to install and use it, is
available here.
Dr. David Bevan, a researcher in Virginia Tech's Department of Biochemistry,
is using this software to implement molecular docking simulations. Details
about this program are available here.
DIVERSE, phase 2, is distributed and licensed under the same
open-source conditions as the current DIVERSE package.
4. References
- The DIVERSE home page: www.diverse.vt.edu
- The Phantom haptic device is made by SensAble Technologies: www.sensable.com
- The CAVE, iDesk, and other immersive systems are manufactured by FakeSpace Systems: www.fakespacesystems.com
- The Silicon Graphics home page: www.sgi.com
- Linux is available from many sources. A good starting point is www.linux.org
- The VRPN home page: www.cs.unc.edu/Research/vrpn