If we must decide whether a man is less than six feet tall or more than six feet tall, and if we know that the
chances are 50-50, then we need one bit of information. Notice that this unit of information does not
refer in any way to the unit of length that we use: feet, inches, centimeters, etc. However we measure the
man's height, we still need just one bit of information.
Two bits of information enable us to decide among four equally likely alternatives. Three bits of
information enable us to decide among eight equally likely alternatives. Four bits of information decide
among 16 alternatives, five among 32, and so on. That is to say, if there are 32 equally likely alternatives,
we must make five successive binary decisions, worth one bit each, before we know which alternative is
correct. So the general rule is simple: every time the number of alternatives is increased by a factor of
two, one bit of information is added.
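The rule that each doubling of the alternatives adds one bit is just the base-2 logarithm. A minimal sketch (the function name here is ours, chosen for illustration):

```python
import math

def bits_needed(alternatives: int) -> float:
    """Bits of information required to choose among
    equally likely alternatives: log2 of their number."""
    return math.log2(alternatives)

# Each doubling of the alternatives adds exactly one bit:
for n in (2, 4, 8, 16, 32):
    print(n, "alternatives ->", bits_needed(n), "bits")
# 2 alternatives -> 1.0 bits
# 4 alternatives -> 2.0 bits
# 8 alternatives -> 3.0 bits
# 16 alternatives -> 4.0 bits
# 32 alternatives -> 5.0 bits
```

Equivalently, five successive binary decisions, each worth one bit, single out one of 32 alternatives, since 2 to the fifth power is 32.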
There are two ways we might increase the amount of input information. We could increase the rate at
which we give information to the observer, so that the amount of information per unit time would
increase.