Errors are traditionally classified in terms of
‘‘errors of action’’ (skill-based slips and lapses) and
‘‘errors of intention’’ (rule- and knowledge-based mistakes)
(23). An error may arise at any stage of a process
(development, configuration, management, action),
and the likelihood of error increases exponentially
with the number of steps through which the process
develops. Moreover, when an adverse event follows
an error, it is the logical consequence
of a failure in the corresponding defense system.
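The compounding of error chances across steps can be made concrete with a small probabilistic sketch; the per-step error rate and step counts below are illustrative assumptions, not figures from the text:

```python
# Illustrative sketch: if each of n independent steps fails with
# probability p, the chance that at least one error occurs is
# 1 - (1 - p)**n.  The error-free probability (1 - p)**n decays
# geometrically, so overall risk climbs rapidly as steps are added.
def overall_error_probability(p: float, n: int) -> float:
    """Probability of at least one error across n independent steps."""
    return 1 - (1 - p) ** n

# With an assumed 1% per-step error rate, roughly:
# 10 steps -> ~10%, 50 steps -> ~40%, 100 steps -> ~63%
```

The independence assumption is a simplification; in real processes, errors in early stages (e.g., configuration) can raise the failure rate of later ones.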
According to this interpretation, primary risks usually
depend on the characteristics of the process
being controlled, whereas secondary risks arise
from the other systems on which that process depends. Indeed,
accidents may occur not only because of system failures
but also because of defective controls (whether
within the organization or within a particular job). In
the ideal condition, each layer of a complex defense
system would be intact, so that, assuming correct
operation, an accident trajectory would be blocked
(23). In reality, although most hazardous technologies
possess several preventive or defensive
layers, there is always a chance of system failure. As
depicted by the classical Swiss cheese model, each
defensive layer (slice of cheese) has a number of vulnerabilities
(holes) that continually open, shut,
and shift location; when the holes in successive layers
momentarily align, they leave an opening through which
an accident trajectory can penetrate every barrier (24).
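The model's core arithmetic can be sketched as follows; the per-layer failure probabilities are assumed for illustration and, as above, the layers are treated as independent:

```python
from math import prod

# Swiss cheese model sketch: an accident trajectory must find a
# "hole" in every defensive layer, so with independent layers the
# penetration probability is the product of the per-layer failure
# probabilities -- adding even a modestly reliable layer cuts the
# overall risk multiplicatively.
def penetration_probability(layer_failure_probs: list[float]) -> float:
    """Probability that an accident trajectory passes every layer."""
    return prod(layer_failure_probs)

# Three layers, each failing 10% of the time, yield a combined
# penetration probability of about 0.1% (0.1 * 0.1 * 0.1).
```

This multiplicative picture also explains why the shifting holes matter: the product only describes average behavior, and a transient alignment of holes corresponds to a moment when every factor is effectively 1.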