In response to this challenge, significant effort has been
devoted to developing technical solutions that
can support user identification on touchscreens. One approach
is not to uniquely identify each user per se, but rather
to distinguish between multiple users operating at the
same time. For example, in Medusa [1], the presence of
multiple users can be inferred by proximity sensing around
the periphery of an augmented table. Touches to the surface
can be attributed to a particular user by using arm orientation
sensed by an array of proximity sensors on the table
bezel. If a user exits the sensing zone, knowledge of that user
is lost; upon returning, the user is treated as new. Similarly,
Dang et al. [8] used finger orientation cues to back-project
touches to a user, achieving a similar outcome. In both systems,
occlusion and users in close proximity (i.e., side by side)
are problematic.
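To illustrate the general idea behind this family of techniques, the sketch below back-projects a touch along its sensed orientation to the table edge and matches the exit point against user positions obtained from bezel proximity sensing. It is not the implementation of either cited system; the normalized table coordinates, the user positions, and the matching threshold are all assumptions for illustration.

```python
import math

# Normalized table surface (assumption for this sketch).
TABLE_W, TABLE_H = 1.0, 1.0

def back_project_to_edge(x, y, angle):
    """Follow the sensed finger/arm orientation backwards from the
    touch at (x, y) until the ray leaves the table; return the exit
    point on the bezel."""
    dx, dy = -math.cos(angle), -math.sin(angle)  # reversed orientation
    # Smallest positive t at which the ray crosses a table boundary.
    ts = []
    if dx > 0: ts.append((TABLE_W - x) / dx)
    if dx < 0: ts.append((0.0 - x) / dx)
    if dy > 0: ts.append((TABLE_H - y) / dy)
    if dy < 0: ts.append((0.0 - y) / dy)
    t = min(ts)
    return (x + t * dx, y + t * dy)

def attribute_touch(x, y, angle, users):
    """users: dict mapping user id -> (bezel x, bezel y), as would be
    reported by proximity sensing. Returns the id whose bezel position
    is closest to the back-projected exit point, or None if every user
    is farther than an assumed matching threshold."""
    ex, ey = back_project_to_edge(x, y, angle)
    best, best_d = None, 0.25  # threshold is an assumption
    for uid, (ux, uy) in users.items():
        d = math.hypot(ex - ux, ey - uy)
        if d < best_d:
            best, best_d = uid, d
    return best

# A user at the bottom edge touches mid-table with the finger pointing
# "up" the table, away from the body (angle = pi/2).
users = {"alice": (0.5, 0.0), "bob": (0.5, 1.0)}
print(attribute_touch(0.5, 0.5, math.pi / 2, users))  # prints "alice"
```

Note that this simple geometric matching exhibits exactly the failure mode described above: when two users stand side by side, their bezel positions fall within the same threshold and attribution becomes ambiguous.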