Because traditional assistive robotic systems and rehabilitation
devices rely on conventional user interfaces, such as joysticks
and keyboards, many disabled people have difficulty accessing
them, and more advanced hands-free human–machine interfaces
become necessary. EMG (electromyography signal: the electrical
activity generated during the contraction of a skeletal muscle)
and EEG (electroencephalography signal: the electrical activity
of the brain recorded from the scalp) are two kinds of
bio-signals, i.e. physical quantities that vary with time [1].
They carry rich information from which a user's intention,
expressed as a muscular contraction or a brainwave pattern,
can be detected through surface electrodes. These
detected bio-signals can be used in a control system to
operate rehabilitation devices and robots.
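
To make this concrete, the following is a minimal sketch of how a muscular contraction could be detected from the short-window RMS amplitude of a surface-EMG recording held in a NumPy array. The helper `detect_contraction` and its parameters (window length, baseline estimate, threshold factor) are illustrative assumptions, not values taken from this paper.

```python
import numpy as np

def detect_contraction(emg, fs=1000, win_ms=200, factor=3.0):
    """Flag analysis windows whose RMS amplitude exceeds a multiple
    of the recording's baseline RMS (illustrative intent detector)."""
    win = int(fs * win_ms / 1000)              # samples per window
    n = len(emg) // win
    windows = emg[: n * win].reshape(n, win)   # non-overlapping segments
    rms = np.sqrt(np.mean(windows ** 2, axis=1))
    baseline = np.median(rms)                  # assumes rest dominates the recording
    return rms > factor * baseline             # True where a contraction is likely

# Synthetic demo: 2 s of background noise with a burst in the middle
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.05, 2000)
sig[800:1200] += rng.normal(0.0, 0.5, 400)     # simulated contraction
print(detect_contraction(sig))                 # windows covering the burst are flagged
```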
In general, the development of EMG and EEG control
systems can be divided into four stages [2–4], namely (1) data
acquisition and segmentation, (2) feature extraction,
(3) classification and (4) the controller; a minimal code sketch
of this pipeline is given below.
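The following sketch chains the four stages end to end, assuming raw surface-EMG samples in a NumPy array. The window length, the two time-domain features (mean absolute value and waveform length) and the threshold classifier are illustrative stand-ins rather than the methods prescribed by this paper or its references.

```python
import numpy as np

# --- Stage 1: data acquisition and segmentation (acquisition simulated) ---
def segment(signal, fs=1000, win_ms=250):
    """Split a recorded signal into fixed-length analysis windows."""
    win = int(fs * win_ms / 1000)
    n = len(signal) // win
    return signal[: n * win].reshape(n, win)

# --- Stage 2: feature extraction ---
def features(windows):
    """Two classic time-domain EMG features per window:
    mean absolute value (MAV) and waveform length (WL)."""
    mav = np.mean(np.abs(windows), axis=1)
    wl = np.sum(np.abs(np.diff(windows, axis=1)), axis=1)
    return np.column_stack([mav, wl])

# --- Stage 3: classification (a stand-in threshold rule) ---
def classify(feats, mav_threshold=0.1):
    """Map each feature vector to a command: 1 = 'move', 0 = 'rest'."""
    return (feats[:, 0] > mav_threshold).astype(int)

# --- Stage 4: controller ---
def control(commands):
    """Translate class labels into device actions (printed here)."""
    for c in commands:
        print("move actuator" if c else "hold position")

raw = np.random.default_rng(1).normal(0.0, 0.05, 4000)
raw[2000:3000] += 0.5  # simulated contraction
control(classify(features(segment(raw))))
```

In a practical system, the threshold rule of stage 3 would be replaced by a trained classifier, and stage 4 would drive an actuator rather than print commands.

As shown in Fig. 1, the bio-signals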
are acquired from the human body and then filtered to
reduce the noise produced by other electrical activities of the