One potentially very useful application for increasing the mobility of
disabled and/or elderly persons is the motorized wheelchair. A standard
motorized wheelchair aids the mobility of disabled people who cannot walk,
provided that their disability allows them to control the joystick safely.
Persons with a serious disability or handicap, however, may find it
difficult or impossible to use such a wheelchair because it requires fine
control; a case in point is tetraplegics, who may be capable only of
operating an on-off sensor or of making certain very limited movements,
such as eye movements. This makes control of the wheelchair particularly
difficult, especially during delicate maneuvers.
For such cases it is necessary to develop more complex human-wheelchair
interfaces adapted to the disability of the user, allowing movement
commands to be entered in a safe and simple way. Among these interfaces,
the least developed at present are those based on visual information,
mainly because of the vast amount of information that must be processed.
One form of communication of particular interest here is the detection and
tracking of the eye gaze, that is, eye-control systems. Many people with
severe disabilities retain intact control over the oculomotor system, so
eye movements could be used to develop new human-machine communication
systems. Furthermore, this type of interface would not be limited to
severely disabled persons but could be extended to anyone capable of
controlling their eye movements.
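
To make the idea of a gaze-driven command interface concrete, the following
minimal sketch shows one possible way an estimated gaze direction could be
translated into coarse wheelchair motion commands. It is only an
illustration under assumed names and parameters (GazeSample,
select_command, the command set, the screen regions, and the dwell-time
threshold are all hypothetical), not the interface proposed in this work.

```python
from dataclasses import dataclass
from enum import Enum


class Command(Enum):
    """Hypothetical coarse motion commands for the wheelchair."""
    STOP = "stop"
    FORWARD = "forward"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"


@dataclass
class GazeSample:
    """Estimated gaze point in normalized screen coordinates (0..1)."""
    x: float
    y: float
    timestamp: float  # seconds


def select_command(samples, dwell_time=1.0):
    """Issue a command only if the gaze has dwelt in one screen region
    for at least `dwell_time` seconds, a common safeguard in gaze
    interfaces against accidental commands."""
    if not samples:
        return Command.STOP

    def region(s):
        # Map screen regions to commands (assumed layout).
        if s.y < 0.33:
            return Command.FORWARD
        if s.x < 0.33:
            return Command.TURN_LEFT
        if s.x > 0.66:
            return Command.TURN_RIGHT
        return Command.STOP  # central region acts as a neutral/stop zone

    last = region(samples[-1])
    start = samples[-1].timestamp
    # Walk backwards while the gaze stays in the same region.
    for s in reversed(samples):
        if region(s) != last:
            break
        start = s.timestamp
    if samples[-1].timestamp - start >= dwell_time:
        return last
    return Command.STOP


if __name__ == "__main__":
    # Simulated gaze held in the upper part of the screen for ~1.5 s.
    trace = [GazeSample(x=0.5, y=0.2, timestamp=0.05 * i) for i in range(30)]
    print(select_command(trace))  # -> Command.FORWARD
```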