In this paper, a paradigm for shared control is described
in which a machine’s manual control interface is motorized
to allow a human and an automatic controller to simultaneously
exert control. The manual interface becomes a haptic
display, relaying information to the human about the intentions
of the automatic controller while retaining its role as a
manual control interface. The human may express his control
intentions in a way that either overrides the automation
or conforms to it. The automatic controller, by design,
aims to create, in the mind of the human, images of fixtures
in the shared workspace that can be incorporated into efficient
task-completion strategies. The fixtures are animated
under the guidance of an algorithm designed to automate
part of the human/machine task. Results are presented from
two experiments in which 11 subjects completed a path-following
task using a motorized steering wheel on a fixed-base
driving simulator. These results indicate that the haptic assist
through the steering wheel improves lane keeping by at
least 30%, reduces visual demand by 29% (p