Trust
Many of the complex issues involving mutual human-machine modeling,
awareness, and coordination are captured by the anthropomorphic term trust. If we examine the considerations that enter into our decision to delegate
a task to a subordinate, instruct the subordinate in how to perform the task, monitor the performance, or authorize some class of tasks without follow-up, our trust in the subordinate will almost certainly play an explanatory role. Closer consideration will
show our use of the term to be multidimensional.
The trust we have that our secretary will remember to pick up the mail is distinct from our trust that he/she will compose a postable business letter, which, in turn, is distinct
from our trust in the lawyer who assures us that the letter is not actionable.
Bonnie Muir (1996, 1994, 1987) adopted a taxonomy of trust for human-machine relations from sociologist Barber (1983), giving a nod to social psychologists (Rempel, Holmes, and Zanna 1985) for a complementary taxonomy and a source of conjectures about the dynamic character of trust. Barber (1983) defines trust in terms of three specific expectations: (1) persistence of natural, biological, and social
“laws,” for example, gravity, pain following injury, and parents protecting their offspring; (2) competence of others to perform their technical roles, for example, our trust that a bus driver will take us safely to our stop; and (3) fiduciary responsibility of others to fulfill their obligations, for example, our trust that a lawyer will administer an assigned estate without theft.