Perceptual intelligence for a computer cannot be achieved
merely by collecting and displaying sensory information, as
most webcams do. Perceptual intelligence results from an
understanding of what the sensory data reveal about the state
of the environment and people. The key research problem to
be addressed with the sensing chair, therefore, is the automatic
processing and interpretation of touch sensor information, and
the modeling of user behavior leading to such sensory data. We
envision tomorrow’s computing environment where all objects
are outfitted with a layer of artificial skin (for example, a sensing
chair, a sensing floor, a sensing file folder). We expect the algorithms
and behavior models that we develop with the sensing
chair to be extensible to large-scale distributed haptic sensing
and interpretation.
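
To make the interpretation problem concrete, the following minimal sketch (in Python) shows one way a chair's pressure-sensor map could be mapped to a sitting posture by nearest-template matching. The sensor grid size, posture labels, and matching scheme are illustrative assumptions, not the method described in this paper.

# A minimal sketch, not the authors' pipeline: classify a chair's pressure
# map by comparing it against labeled posture templates. The 42x48 grid and
# the posture labels are assumptions made for illustration.
import numpy as np

GRID_SHAPE = (42, 48)  # assumed seat/back sensor resolution

def normalize(pressure_map: np.ndarray) -> np.ndarray:
    """Scale a raw pressure map to unit energy so the comparison is
    insensitive to the sitter's overall weight."""
    flat = pressure_map.astype(float).ravel()
    norm = np.linalg.norm(flat)
    return flat / norm if norm > 0 else flat

def classify_posture(pressure_map: np.ndarray,
                     templates: dict) -> str:
    """Return the label of the stored template closest (in Euclidean
    distance) to the observed pressure map."""
    sample = normalize(pressure_map)
    distances = {label: np.linalg.norm(sample - normalize(t))
                 for label, t in templates.items()}
    return min(distances, key=distances.get)

# Toy usage with random data standing in for real sensor frames.
rng = np.random.default_rng(0)
templates = {
    "upright": rng.random(GRID_SHAPE),
    "leaning_forward": rng.random(GRID_SHAPE),
    "slouching": rng.random(GRID_SHAPE),
}
frame = templates["slouching"] + 0.05 * rng.random(GRID_SHAPE)  # noisy observation
print(classify_posture(frame, templates))  # -> "slouching"

A template matcher of this kind illustrates only the lowest level of the problem; modeling user behavior over time would require richer statistical models trained on sequences of such frames.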