Researchers in distributed computer-supported collaborative
work (CSCW) propose improving video-mediated collaboration
by situating it in the physical world through the use of
shared media spaces [30] and projected Augmented Reality
(AR) [33]. However, these interfaces still lack many physical
aspects of collaboration, as remote participants are only visually
present on the screen, which limits their ability to collaborate
through physical objects. To overcome these challenges,
telepresence robots have been proposed to embody
remote participants for social presence and to manipulate the
physical world from a distance [13].
Remote Tangible User Interfaces (TUIs) take a different approach to physical remote collaboration, focusing on synchronized, distributed physical objects [2]. These physical objects, or tokens, are synchronized with remote counterparts to represent shared content rather than to embody collaborators.
One of our goals in introducing Physical Telepresence is to extend the physical embodiment of remote participants, common in telepresence robotics, and to combine it with the physical embodiment of shared content, common in remote TUIs. An example of such a system is shown in Figure 1, where the hands of a remote collaborator, along with a shared digital model, are materialized on a shape display.