COMPUTER-MEDIATED SHAPE I/O
While shape capture and output literally add a physical dimension
to telecommunication, we find it interesting to go
beyond representations that are symmetric in time and space.
How can we create interfaces that enable remote participants
to go beyond physically being there? The relaxation
of 1:1 capture/output mapping opens up significant potential
for new interaction techniques. Consider, e.g., an audio system
with real-time translation of a speaker’s voice to the listener’s
language. Here, the mediation emphasizes what is relevant
and simplifies the interaction. The introduction of shape
output allows us to apply similar transformation concepts to
the physical domain. We explore shape mediation, such as
transformation of physical form, altered representation, data
filtering and replication, changing motion dynamics and time-domain
manipulations.
Transformation of Physical Form: Bending Body Limits
In our system, users can apply transformations to how their activity
is rendered in the remote environment, for example scaling,
translation, rotation, shearing, stretching and other distortions.
Translation offsets geometry and can extend reach,
with potential ergonomic benefits. Scaling can make a hand
larger or smaller for manipulation of objects of varying sizes
(see Figure 6). A small hand could avoid undesirable collisions
in dense topologies, while an enlarged hand could carry
multiple items. The transformations allow continuous real-time
changes during the interaction, e.g., enabling smooth
changes in size or position while holding an object. Examples
of other transformations include replication or mirroring,
e.g., to approach objects from multiple angles.
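As an illustrative sketch (not a description of our actual implementation),
such transformations can be expressed as ordinary affine operations applied
to the captured geometry before it is rendered remotely. The function and
parameter names below are hypothetical, and we assume the tracked hand is
available as an N x 3 array of points in the shared workspace frame:

import numpy as np

def transform_geometry(points, scale=1.0, offset=(0.0, 0.0, 0.0), angle=0.0):
    # Rotate about the vertical (z) axis, scale uniformly, then translate.
    # points: N x 3 array of captured geometry in the shared workspace frame.
    c, s = np.cos(angle), np.sin(angle)
    rotation = np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])
    return (points @ rotation.T) * scale + np.asarray(offset)

# Enlarge the captured hand and shift it 10 cm forward in the remote
# workspace; the parameters can be updated continuously while an object
# is held, e.g., to smoothly change size or position.
hand_points = np.random.rand(500, 3)   # stand-in for tracked hand geometry
remote_hand = transform_geometry(hand_points, scale=1.5,
                                 offset=(0.0, 0.10, 0.0))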
Altered Representation: Becoming Something Else
With user and object tracking, switching to a different representation
can provide new capabilities. A system that captures
geometry does not need to propagate all of it. It may be useful
to send only a user's hand and not the arm. As we are limited
only by what the shape display can render, we can also morph
into other tools that are optimal for the task, while controlled
by the user. Examples include grippers, bowls, ramps, and
claws — tools with specific properties that facilitate or constrain
the interactions (see Figures 8 and 9). The tools could
also be animated, or they could semi-autonomously use the sensed
geometry on the remote side to influence their behavior. Switching
to a purely graphical representation to avoid collisions is another
example (see Figure 7).
Figure 8. Replacing hands with a hook to reach or ramps to slide objects.
Figure 9. The claw tool open and closed to enclose and move an object.
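As a minimal sketch of such a tool substitution, assume the shape display
is driven by a 2D array of pin heights; the grid resolution, pin travel,
and names below are illustrative. The tracked hand position only selects
where the tool appears, while the rendered geometry is a parametric ramp
rather than the hand itself:

import numpy as np

DISPLAY_RES = (30, 30)     # pin grid resolution (assumed)
MAX_HEIGHT = 0.10          # maximum pin extension in meters (assumed)

def render_ramp(center_xy, length=10, width=6, height=0.06):
    # Render a wedge of linearly increasing pin heights at the tracked
    # hand position, replacing the hand with a ramp for sliding objects.
    frame = np.zeros(DISPLAY_RES)
    cx, cy = center_xy
    for i in range(length):
        for j in range(width):
            x, y = cx + i, cy + j - width // 2
            if 0 <= x < DISPLAY_RES[0] and 0 <= y < DISPLAY_RES[1]:
                frame[x, y] = height * (i + 1) / length
    return np.clip(frame, 0.0, MAX_HEIGHT)

# The user's tracked position drives where the tool appears; its shape
# is defined by the tool, not by the captured hand geometry.
tool_frame = render_ramp(center_xy=(5, 15))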
Filtering: Adding and Removing Motion
Signal processing can be applied to refine propagated motion,
e.g., using smoothing or low-pass/high-pass filters. Such
approaches are in use, e.g., in surgical robotics where hand
tremors can be suppressed. Our system could also prevent
fast motion or access to protected areas to avoid involuntary
movements. In addition to reducing human noise, filtering may
also compensate for system limitations, such as limited sampling
resolution, speed, range and vibrations.
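A minimal sketch of such filtering, assuming each incoming frame is a
2D array of target pin heights (the smoothing factor, per-frame travel
limit, and names are illustrative):

import numpy as np

ALPHA = 0.3        # low-pass smoothing factor (assumed)
MAX_STEP = 0.005   # maximum per-frame pin travel in meters (assumed)

def filter_frame(target, previous, protected_mask=None):
    # Low-pass filter the incoming pin heights to suppress tremor and
    # sensor noise, then clamp per-frame travel to limit fast motion.
    smoothed = ALPHA * target + (1.0 - ALPHA) * previous
    step = np.clip(smoothed - previous, -MAX_STEP, MAX_STEP)
    result = previous + step
    # Keep protected areas flat so remote motion cannot reach them.
    if protected_mask is not None:
        result[protected_mask] = 0.0
    return result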
Motion Dynamics: Warping Time
Non-linear mapping of the propagated motion is interesting
for many interactions. The properties of certain remote artifacts
might, e.g., require slower or faster mapped motion, or
require brief freezing or slow-down to emphasize an effect or
make it legible. Such manipulations of time must, however,
be designed with great care, as they break the temporal link
between the remote locations.
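One possible sketch of such a time-domain manipulation, assuming incoming
shape frames are buffered and replayed at a time-varying speed (the class
and parameter names are illustrative): a speed of 0 freezes the motion,
values below 1 slow it down, and values above 1 let it catch up with the
live capture.

import numpy as np

class TimeWarp:
    # Replay buffered shape frames with a time-varying playback speed.
    def __init__(self, fps=30.0):
        self.frames = []       # buffered height maps from the remote side
        self.fps = fps
        self.playhead = 0.0    # position in the buffer, in frames

    def push(self, frame):
        self.frames.append(frame)

    def next_frame(self, dt, speed=1.0):
        # Advance the playhead by dt seconds, scaled by the current
        # speed, and return a linearly interpolated frame.
        if not self.frames:
            return None
        self.playhead = min(self.playhead + dt * self.fps * speed,
                            len(self.frames) - 1)
        i = int(self.playhead)
        t = self.playhead - i
        if i + 1 < len(self.frames):
            return (1 - t) * self.frames[i] + t * self.frames[i + 1]
        return self.frames[i]

# Example: replay at quarter speed to make a remote gesture legible.
warp = TimeWarp()
warp.push(np.zeros((30, 30)))
warp.push(np.ones((30, 30)) * 0.05)
frame = warp.next_frame(dt=1 / 30.0, speed=0.25)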
PROTOTYPE APPLICATIONS
Telepresence Workspace
When discussing a physical design over distance, it is important
for both parties to have a shared understanding of the
model. We propose to render physical models on shape displays
during remote collaboration meetings. The shape output
is combined with video for viewing the upper body and
face of remote collaborators. By aligning the tabletop shape
display with the vertical screen, the two collaborators perceive
a shared physical workspace, where the remote person
can reach out of the vertical screen to physically point at a
model. We support collaboration through shape rendering in
several ways:
Shared Digital Model. The model is mirrored on the remote
shape displays and provides a shared frame of reference.
Transmitted Physical Model. When a user places a physical
model onto the surface, its shape and texture are transmitted to
the remote site.