2 Information
Information has historically been understood and measured in a wide variety of
ways. These models are usually discipline-specific or limited in scope. For example,
physicists speak of information in thermodynamic terms, while epistemologists
describe information as something that occurs within the context of higher-level
human cognitive processes. We briefly present here a number of different
ideas about information, concluding with a discipline-independent definition of
information that can be used to provide a basis for a general definition of communication.
The most widely understood notion of information among English-speaking children
may be Cookie Monster’s definition of information as “news or facts
about something.” Other definitions tend to be more explicitly human-centered,
such as “information is knowledge.” Similarly, Dretske [Dre81] views information as something that brings us to certainty, a definition that annoys those who
use probability theory and assume that one can never be absolutely certain about
anything. More formal than this is Bar-Hillel’s model of information as what is
excluded by a statement [BHC53, BH55]. Bar-Hillel’s definition provides a bridge
between formal, rigorous definitions of information and the idea of meaning.
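As a rough sketch of this idea (the notation here is ours, following the Bar-Hillel and Carnap treatment of semantic information), the content of a statement $s$ may be identified with the set of possibilities it excludes, and a quantitative measure defined from a logical probability $m(s)$:
\[
\mathrm{cont}(s) = 1 - m(s), \qquad \mathrm{inf}(s) = \log_2 \frac{1}{1 - \mathrm{cont}(s)} = -\log_2 m(s),
\]
so that a statement excluding more possibilities has a smaller $m(s)$ and therefore carries more information.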
Based upon earlier thermodynamic models of entropy, Brillouin [Bri56] suggests
that “information is a function of the ratio of the number of possible answers before
and after....” and that anything that reduces the size of the set of possible answers
provides information. Shannon’s model of information and communication, with
which we assume the reader is familiar, measures the information carried by a signal
as a decreasing function of the signal’s probability [Sha93b, Rit86, Rit91, Los90]. Most
measures of information produce numbers similar to those produced by the familiar
Shannon model: the less probable an event, the more information its occurrence
provides.
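As a minimal illustration (the symbols below are our own: $P_0$ and $P_1$ denote the number of equally likely answers before and after a message arrives, and $p(x)$ the probability of a signal $x$), Brillouin’s ratio measure and Shannon’s measure of the information in a single signal may be written
\[
I_{\mathrm{Brillouin}} = \log_2 \frac{P_0}{P_1}, \qquad I_{\mathrm{Shannon}}(x) = -\log_2 p(x).
\]
Both assign more information to what is less expected: halving the set of possible answers, or receiving a signal of probability $1/2$, yields one bit.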