ernst kruijff | technology
During my work at [FHG] and [TUG], I performed multiple scientific
experiments and investigations that resulted in the following
devices. Most of the devices are based upon comprehensive user
studies investigating specific human factors and ergonomics,
and have been evaluated through extensive user testing.
Vesp'R
TUG, internal project, 2007
Handheld devices, interaction, outdoor AR
In recent years, outdoor augmented reality (AR) has been moving from backpack-based systems towards lightweight handheld devices as a deployment platform. To overcome the limitations of previously used UMPC-based casings holding all peripherals for outdoor AR, we performed a large ergonomics study and user testing, arriving at a device construction called Vesp'R.
Vesp'R consists of the “BatPack”, an enclosure around the
UMPC holding the peripherals, and two joystick-like handles (the
“wings”) that can be mounted at multiple spots. The devices are
made from extremely lightweight ABS plastic (stereolithography)
covered by a thin layer of velvety rubber, a hygienic and very soft
material to grab. The name Vesp’R is derived from the Latin word for “bat”,
a reference to the form of the devices and to Ware’s “bat” interface.
The ergonomics and user studies provided valuable results regarding the weight, weight balance, and interaction possibilities and issues of using handheld devices in indoor and outdoor applications.
This study was performed in cooperation with Eduardo Veas.
- Veas, E. and E. Kruijff. Vesp'R - design and evaluation of a handheld AR device. In Proceedings of the 7th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR'08). 2008 [PDF].
- Kruijff, E., Veas, E. Vesp'R - Transforming Handheld Augmented Reality. In Proceedings of IEEE ISMAR'07, Japan, 2007.
Eye of Ra
TUG, Liverplanner, 2005
Hybrid interaction, medical scenarios
This study was performed as part
of the Virtual Liver Surgery Planning project and focused
on how 2D and 3D modalities can be integrated into a single
input device. The liver surgery planning system aims
at facilitating efficient visual inspection and correction
of surface models generated by automated segmentation
algorithms based on x-ray computed tomography scans,
needed for planning surgical resections of liver tumors.
In order to interact within this application, a hybrid
VR and Tablet PC interface is used, in which users continuously
switch between 2D and 3D interaction.
The basic goal of the study was to support the control
of both the 3D actions and the GUI on the Tablet PC
using a unified device, and to study the resulting
effects on the user's interaction. For the design of
the new device, called the Eye of Ra, a detailed
analysis was made of the tasks and the associated
hand-device couplings and movements. This resulted in the
notion that the device needed to support both high-speed,
low-accuracy (sweeping) tasks and lower-speed, high-accuracy
tasks. These actions needed to be afforded by the right
device form, which eventually led to mixing
a pen-like device (to control the Tablet PC) with
a flying mouse. The resulting form allows for unobtrusive
switching between power and precision grasps. From clay
models, we arrived at the shape shown in the picture,
which holds a small optical mouse circuit board
connecting to the multiple buttons, and transmission
electronics. The final device was made from carbon and
fiberglass mats layered with epoxy glue, resulting
in a lightweight yet sturdy surface. The device holds
retro-reflective markers for tracking with an ART tracking system.
An extensive user evaluation with 18 subjects (with medical
backgrounds) was performed, testing the effectiveness of the
user interface and the users' attitude towards the
provided hybrid interaction methods. Results were highly
encouraging; details can be found in the publication.
This study was performed in cooperation with Alexander
Bornik and Thomas Pock.
- Bornik, A., Beichel, R., Kruijff, E., Reitinger, B., Schmalstieg, D. A Hybrid User Interface for Manipulation of Volumetric Medical Data. In Proceedings of the Symposium on 3D User Interfaces, IEEE Virtual Reality Conference, 2006.
BioHaptics
FHG, 2005 - 2006
(Pseudo-)haptic feedback using neuromuscular electrical stimulation
This test focused on providing users (pseudo-)haptic
feedback through electrostimulation of the muscles.
Electrostimulation transfers small electric currents
via pads attached to the skin through the cutaneous
tissues of the human body. Hereby, motor nerves (the
alpha motor neurons) are stimulated: the externally
generated electrical charges mimic the neural impulse
messages that travel to and from the brain. Through
stimulation of the right muscle endings, muscles can
be contracted and released. The stimulation of motor
nerves (neuromuscular electrical stimulation, NMES) or
receptors (transcutaneous electrical nerve stimulation,
TENS) is used in the medical and sports areas to train
muscles and to block pain.
The aim of the BioHaptics
study is to investigate to what extent biomechanical
configurations (arm poses) can be changed, moving the
arm in a specific direction as an effect of involuntary
muscle activity. These actions are expected
to resemble externally applied forces on the arm, such
as those provided by an exoskeleton.
A first experiment evaluated the users' attitude towards
using electrostimulation of muscles as feedback. During
gameplay (Quake), users were stimulated with short or
longer electric pulses when hit (around 5-10 Hz, up
to 25 mA, applied to the biceps or brachioradialis), resulting
in shock-like to slightly stiff contractions of the upper
arm and forearm. The "beating" feedback matched
the intended event (being hit) well. Due to the shock-like
characteristics, the test showed the potential of using
electrical "shocks" for warning mechanisms.
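As a minimal sketch of how such event-driven feedback could be parameterized, the snippet below maps a game "hit" event to pulse parameters within the ranges reported above (5-10 Hz, up to 25 mA). The function names, severity scale, and duration values are illustrative assumptions, not the study's actual software.

```python
from dataclasses import dataclass

# Safety bounds taken from the ranges reported in the experiment.
MAX_CURRENT_MA = 25.0
MIN_FREQ_HZ, MAX_FREQ_HZ = 5.0, 10.0

@dataclass
class Pulse:
    freq_hz: float      # stimulation frequency
    current_ma: float   # pulse amplitude
    duration_s: float   # length of the burst

def pulse_for_hit(severity: float) -> Pulse:
    """severity in [0, 1]: 0 = grazing hit, 1 = direct hit (clamped)."""
    s = min(max(severity, 0.0), 1.0)
    return Pulse(
        freq_hz=MIN_FREQ_HZ + s * (MAX_FREQ_HZ - MIN_FREQ_HZ),
        current_ma=s * MAX_CURRENT_MA,
        duration_s=0.1 + s * 0.4,   # short shock up to a longer contraction
    )
```

Clamping the severity keeps every generated pulse inside the tested current and frequency ranges, which matters when the output drives hardware attached to a person.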
Since the current test was not intended for that, a further
test will follow that triggers the exact configuration of
muscle endings to achieve actual "controlled" changes of
the pose of a user's arm.
User feedback was largely positive, even though some
users would not advise using this kind of feedback outside
the gaming area under current conditions. Exact results
are being published.
- Kruijff, E., Schmalstieg,
D., Beckhaus, S. Using Neuromuscular Electrical Stimulation
for Pseudo-Haptic Feedback. ACM Symposium on Virtual
Reality Software and Technology, 2006.
Internal project, topic under
disclosure due to patenting.
Capsa Arcana
FHG, 2004 - 2005 (part of the DHX project)
Hybrid interaction using conventional and unconventional controls
This study focused on the combination
of conventional control techniques (a touch screen and
joystick) and unconventional controls in a hybrid setup.
The study was intended to provide more exciting ways
of interaction in traditional application areas (museums),
to encourage people to interact with the application,
possibly increasing learning effects. In order to facilitate
this coupling of conventional and unconventional techniques,
a console was built, called Capsa
Arcana (“mysterious box”). Within this console,
different kinds of interaction methods can be supported
by choosing from a range of MIDI-based sensors that
can be embedded in a component-based infrastructure. The
study focused on providing different kinds of pushing
actions (force-sensitive tapping, haptic pushing using
flexible surfaces) and using gestures to change numerical
values. In addition, the ergonomics of combining different
kinds of controls, as well as focal attention issues,
were studied; these were needed to identify problems
arising from the combined usage of the console and the stereoscopic
wall (for immersive visualisation of the museum content)
placed behind the console.
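To illustrate the MIDI-based sensor idea, the sketch below maps 7-bit control-change values (0-127), the format MIDI sensors typically report, onto application parameter ranges. The sensor registry, controller numbers, and parameter names are hypothetical examples, not the Capsa Arcana configuration.

```python
from typing import Optional, Tuple

def midi_to_param(cc_value: int, lo: float, hi: float) -> float:
    """Linearly map a 7-bit MIDI control-change value onto [lo, hi]."""
    cc = min(max(cc_value, 0), 127)
    return lo + (cc / 127.0) * (hi - lo)

# Hypothetical component registry: controller number -> (name, value range).
SENSORS = {
    1: ("force_tap", (0.0, 1.0)),        # force-sensitive tapping strength
    2: ("year_dial", (1800.0, 2000.0)),  # gesture-driven numerical value
}

def handle_cc(controller: int, value: int) -> Optional[Tuple[str, float]]:
    """Dispatch a control-change message to the registered sensor, if any."""
    entry = SENSORS.get(controller)
    if entry is None:
        return None
    name, (lo, hi) = entry
    return name, midi_to_param(value, lo, hi)
```

Keeping the mapping in a registry is what makes the setup component-based: swapping a sensor only changes one table entry, not the application code.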
- Kruijff, E., Conrad, S., Palamidese, P., Mazzoleni, P., Hasenbrink, F., Suttrop, M., Kwon, Y-M. Remote Virtual Guidance in Immersive Museum Applications. VSMM2004, Gifu, Japan, 2004.
Tactylus
FHG, 2002 - 2006
Multisensory binding, sensory integration of vibrotactile and audio feedback, sensory substitution, hybrid interaction
This study focuses on exploring multisensory perceptual
processing (binding) of visual, vibrotactile, and audio-based
feedback. Several experiments have shown that through
coupling of sensory modalities, the different modalities
directly affect each other and should no longer be seen as
separate entities, as is done in the "traditional"
multimodality view. The right combination (integration)
might lead to increased performance of feedback mechanisms.
Within our experiment, we test to what extent we can
provide methods that replace (substitute) haptic feedback
by coupling visual, vibrotactile, and auditory feedback.
The feedback should create effective collision and texture
perception; we also study to what extent the different
modalities affect each other.
In order to perform the experiment, a new input device
was developed, called the Tactylus. The Tactylus
is an ergonomic pen-like vibrotactile input device supporting
hybrid interaction in 2D and 3D user interfaces. For
its design, a range of ergonomics studies was performed
to analyse the coupling between hand and device during
6DOF movement. The particularity of the device lies
in the control of the vibration element via audio
signals, allowing a close coupling of sound and vibration
for tactile and force-reflective events. A final evaluation
is currently being performed and will be published shortly.
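Driving the vibration element with an audio signal means a collision event can simply be a short synthesized waveform. Below is a minimal sketch of such a signal: an exponentially decaying sine burst. The frequency, duration, and decay values are assumptions for illustration, not the parameters used in the Tactylus.

```python
import math

def collision_burst(freq_hz=250.0, dur_s=0.05, rate=44100, decay=60.0):
    """Synthesize a short, exponentially decaying sine burst.

    Returns a list of samples in [-1, 1] that could feed an audio-driven
    vibrotactile actuator. All parameter defaults are illustrative.
    """
    n = int(dur_s * rate)
    return [
        math.exp(-decay * t / rate) * math.sin(2 * math.pi * freq_hz * t / rate)
        for t in range(n)
    ]

wave = collision_burst()  # 50 ms burst at 44.1 kHz
```

Because the actuator is just an audio sink, texture feedback can reuse the same path with a different waveform, which is what makes the audio-driven design flexible.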
This study is being performed together with Gerold Wesche,
Gernot Goebbels, Martijn Kunstman (starttosee / xiox),
and Kai Riege.
- Kruijff, E., Wesche, G., Riege, K., Goebbels, G., Kunstman, M., Schmalstieg, D. Tactylus, a Pen-Input Device exploring Audiotactile Sensory Binding. ACM Symposium on Virtual Reality Software and Technology, 2006.
Using audio and air-based shockwaves for pseudo-haptic feedback
This study focused on
the effects of sound and air-based shockwaves to create
haptic-like events. Sound waves can be sensed via transcutaneous
sensing, via bone structures (bone conduction), and via
the cavities of the human body that pick up the sound
frequencies. Sound waves can generate vibrations up
to slight shocks in the human body. On the other hand,
air-based shockwaves ("balls" of air propelled
at a user) can be sensed cutaneously or transcutaneously
(via the skin). To produce the sound shockwaves, we built
four large subwoofer enclosures (holding 43 cm woofers), which
were placed in the corners of a conical display system,
the iCone at Fraunhofer IMK. Two of the subwoofers
produced low-frequency sounds using the subfloor space
under the projection system, in which they were partly
recessed. These subwoofers were slightly delayed so that
the low frequencies could be sensed well in the center
of the display device. In the center, an additional five
Paraseats (tactile transducers producing only low-frequency
sounds) were placed under several floor tiles.
As a result of the experiments, we showed that users
can experience pseudo-haptic sensations in the lower
body and in the torso by using focused low-frequency
sound.
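The delay applied to the subwoofers follows from simple acoustics: delay the nearer source by the extra travel time of the farther one, so both low-frequency wavefronts arrive at the display center together. The sketch below shows the calculation; the distances are made-up examples, not the iCone's actual geometry.

```python
# Speed of sound in air at roughly room temperature.
SPEED_OF_SOUND = 343.0  # m/s

def alignment_delay_s(near_dist_m: float, far_dist_m: float) -> float:
    """Delay (in seconds) to apply to the nearer subwoofer so that its
    wavefront coincides with the farther subwoofer's at the listener."""
    return max(far_dist_m - near_dist_m, 0.0) / SPEED_OF_SOUND

# Example: sources 2 m and 3.5 m from the center -> delay the near one ~4.4 ms.
delay = alignment_delay_s(2.0, 3.5)
```

At low frequencies a few milliseconds of misalignment smears the perceived "shock", which is why the recessed subwoofers were deliberately delayed rather than fired simultaneously.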
Additionally, we performed several experiments using air-propulsion
devices. Unfortunately, the designs we tried did not work
out well. A design using pressurized air would have been needed,
but was abandoned due to technical difficulties and possible
health & safety issues.
This study was performed together with Aeldrik Pander,
with help of Joachim Gossmann.
- Kruijff, E., Pander, A. Experiences of using shockwaves for haptic sensations. In Proceedings of the 3D User Interfaces Workshop, IEEE Virtual Reality Conference, Bonn, Germany, 2005.
OfuturO
Hybrid interfaces for a cooperative workspace
This study produced a so-called "future office" installation,
combining traditional desktop systems with an immersive
projection display. The system was designed to study
cooperative processes of users combining 2D and 3D interaction
methods. Through experimenting with the focal attention
of different users, a form was found in which users
could have both the privacy of a private workplace,
and a shared workspace to cooperate. As a result, the
users sit at ergonomically designed workspaces,
on which they can put their laptops, or make use of a permanently
installed desktop system. In the middle of the display
system, a horizontal projection screen is placed. The
display, which serves as public workspace, can show
both mono and stereo content. Users can display multimodal
content at the public workspace using a distributed
system architecture: users log in via a secured connection
and can place data on the public workspace, or share data
with other users via direct transmission. In order to
interact with the public workspace, studies were performed
in which different kinds of 2D and 3D devices were used
to control the application, in this case an application
which supported brainstorming. The users can make use
of a variety of input devices, including a SpaceMouse
and a touchpad that can recognize a limited set of gestures.
Above the desk, a large plasma screen was placed on
the wall. This screen could be used as an additional public
workspace, mostly to allow AV streaming with remote participants.
This study was performed together with Gernot Goebbels,
Goran Galunic, Thobias Orthey, Lino Sanfilippo, Ana
Ivanovic, and Kai Cheung.
- Goebbels, G., Kruijff, E., Galunic, G., Orthey, T., Ivanovic, A., Sanfilippo, L. OfuturO - A Mixed Reality Desk Environment for Supporting Creativity Methods. In Proceedings of the 6th Symposium on Virtual Reality (SVR 2003), Ribeirao Preto, Brazil, 2003.
CubicMouse
Performance study comparing a task-specific device
(CubicMouse) with several general-purpose 3D input devices
(stylus, glove), development of new interaction techniques
and a new analysis method
In these studies, a continuation of the initial CubicMouse (Froehlich,
Plate) development was made, based on a range of empirical
evaluations and interaction technique developments.
The CubicMouse (commercially sold by Fakespace Systems)
is a 3D input device that mimics a coordinate system,
and is therefore also called a "coordinate-system prop".
It contains three rods that can be moved and rotated to
allow for a constrained way of input. For example, the
device allows six-degree-of-freedom manipulation of
the object coordinate system itself and, additionally,
six-degree-of-freedom manipulation of objects defined
relative to the object coordinate system: the input
device enables, for example, positioning and orienting
a car model while controlling an arbitrarily oriented
slicing plane relative to the car model. We performed
a comprehensive user test to evaluate the overall reaction
to the device and to compare its performance
to traditional stylus- and glove-based interaction.
The test thus showed the difference between devices
designed for general-purpose and for task-specific
interaction. The most interesting
result is that 83% of our subjects preferred CubicMouse-based
interaction over stylus- and glove-based interaction
methods for fine-grain manipulation of virtual objects,
even though task completion times were in some cases significantly
longer. For analysis, a newly developed graphical 3D
trajectory analysis tool was used.
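The relative manipulation described above amounts to composing transforms: the slicing plane is defined in the object's (car model's) coordinate frame, so its world pose is the object's 6DOF transform composed with the plane's local transform. A minimal sketch with homogeneous 4x4 matrices follows; the matrices and values are illustrative, not the CubicMouse driver code.

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Object (car model) moved by the device to (1, 0, 0) in world space.
object_to_world = translation(1.0, 0.0, 0.0)

# Slicing plane positioned by a rod at (0, 0, 2) relative to the object.
plane_in_object = translation(0.0, 0.0, 2.0)

# Composing the transforms: moving the object carries the plane with it.
plane_to_world = object_to_world @ plane_in_object
plane_origin_world = (plane_to_world @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]
# plane_origin_world is (1, 0, 2): the plane follows the car model.
```

The constrained input of the rods fits this scheme naturally: each rod only edits one translational or rotational component of `plane_in_object`, while the prop itself edits `object_to_world`.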
Next to the empirical tests, a variety of experiments
were performed on different interaction techniques, including
manipulation and system control techniques.
These studies were performed in cooperation with Bernd
Froehlich, John Plate, Jakob Beetz and Hartmut Seichter.
Dr. techn. Ernst Kruijff, Institute for Computer Graphics and Vision, Graz University of Technology
email kruijff at icg.tugraz.at | url www.icg.tugraz.at/Members/kruijff