Information Theory
of Claude Shannon & Warren Weaver
(From the Third Edition of A First Look at Communication Theory by Em Griffin (1991; 8th ed., 2014), McGraw-Hill, Inc. This text-only version of the article appears on the World Wide Web site www.afirstlook.com. The text version does not contain any figures. A facsimile of the original article, which includes all figures, is also available in PDF format.)
In the late 1940s, a Bell Telephone Company research scientist by the name of Claude Shannon developed a mathematical theory of signal transmission. As you might expect from a telephone engineer, his goal was to get maximum line capacity with minimum distortion.
Shannon showed little interest in the semantic meaning of a message or its pragmatic effect on the listener. Like today’s manufacturers of state-of-the-art compact disc players, he wasn’t concerned whether the channel carried Beethoven, the Beatles, or The Boss. He didn’t care whether the listener preferred the beat of rock or the counterpoint of Bach. His theory merely aimed at solving the technical problems of high-fidelity transfer of sound.
Technical Solutions To Social Problems
In the wake of scientific discoveries spawned by World War II, Americans were optimistic that all social problems could be recast into mechanical terms susceptible to engineering solutions. Shannon was somewhat wary about the wholesale application of his mathematical equations to the semantic and pragmatic issues of interpersonal communication. But his hesitation was not shared by Warren Weaver, an executive with the Rockefeller Foundation and the Sloan-Kettering Institute for Cancer Research, and a consultant to a number of private scientific foundations. Shannon’s published theory was paired with an interpretive essay by Weaver that presented information theory as “exceedingly general in its scope, fundamental in the problems it treats, and of classic simplicity and power in the results it reaches.”1 The essay suggested that whatever the communication problem, reducing information loss was the solution.
Most people working in the field of human communication had trouble following the mathematics of Shannon’s theory, but Weaver’s translation and commentary were easy to understand. Since the discipline was ripe for a model of communication and information theory was there to fill the need, its source-channel-receiver diagram quickly became the standard description of what happens when one person talks to another. Many of the terms we use today originated with Shannon and Weaver—message fidelity, multiple channels, information loss, source credibility, and feedback.
Because Shannon’s theory explores the electronic transmission of messages, it might seem appropriate to discuss it in the context of mass media theories. But his twenty-three theorems focus on syntax, the relationship between words. Research since the theory’s introduction contributes mainly to the field of applied linguistics. For these reasons, I include information theory in the section on messages.
A Linear Model Of Communication
Since Bell Laboratories paid the bill for Shannon’s research, it seems only fair to use a telephone example to explain his model, which is shown in Figure 4.1. Imagine you have a summer job at a camp located far from civilization. A few weeks’ absence from a romantic partner has given you a strong desire to “reach out and touch someone.” Finances, work schedule, and a line of others wanting to use the only pay phone available limit you to a three-minute long-distance call.
Shannon would see you as the information source. You speak your message into the telephone mouthpiece, which transmits a signal through the telephone-wire channel. The signal picks up static noise along the way, and this altered signal is reconverted to sound by the receiver in the earpiece at the destination end of the line. Information loss occurs every step of the way, so the message received differs from the one you sent.
During his lifetime, Weaver applied the model to the interpersonal features of conversation. Your brain is the information source, your voice the transmitter. Noise could include a hoarse throat from yelling at the campers, background chatter of those waiting to use the phone, or the distraction of mosquitoes drawing blood. The received signal may be diminished by an ear that’s been overexposed to hard rock, and your friend is quite capable of altering the message as it moves from ear to brain.
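The model can be pictured as a simple chain of stages. Here is a minimal sketch in Python (not part of Griffin’s text; the stage functions, the sample message, and the noise level are hypothetical stand-ins) showing a message degrading as it moves from source to destination:

```python
import random

def transmit(message: str) -> str:
    """The transmitter: turn the source's message into a signal (here, the text itself)."""
    return message

def channel(signal: str, noise_level: float = 0.1) -> str:
    """The channel: carry the signal, randomly corrupting characters to stand in for static."""
    return "".join("#" if random.random() < noise_level else ch for ch in signal)

def receive(signal: str) -> str:
    """The receiver: reconvert whatever arrives for the destination."""
    return signal

sent = "Only good friends, I promise."        # information source
arrived = receive(channel(transmit(sent)))    # transmitter -> channel (+ noise) -> receiver
print(sent)
print(arrived)  # the destination gets a version altered by noise
```

The point of the sketch is only the shape of the pipeline: every stage after the source can alter the signal, and nothing downstream can recover what the noise destroyed.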
Shannon concentrated on the technical center of his model. (Will the phone system work well enough for you to get your message across?) Weaver focused on the source-destination relationship. (What’s going on between the two of you?) But all information theorists share a common goal of maximizing the amount of information the system can carry.
Information: The Reduction of Uncertainty
Most of us are comfortable with Wilbur Schramm’s notion that information is simply stuff that matters or anything that makes a difference.2 Shannon, however, has a technical definition for the word that doesn’t equate information with the idea of meaning. He emphasizes that “the semantic aspects of communication are irrelevant to the engineering aspects.”3 For Shannon, information refers to the opportunity to reduce uncertainty. It gives us a chance to reduce entropy.
Shannon borrowed the idea of entropy from the second law of thermodynamics, which states that the universe is winding down from an organized state to chaos, moving from predictability to uncertainty. Entropy is randomness. How much information a message contains is measured by the extent to which it combats entropy. The less predictable the message, the more information it carries.
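Shannon quantified this with a formula Griffin doesn’t reproduce here: the entropy of a set of possible messages, measured in bits, is the average number of yes/no answers needed to resolve the uncertainty. A minimal sketch (the function name and the example probabilities are mine, for illustration only):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the possible messages."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))       # 1.0 bit: a genuinely uncertain yes/no answer
print(entropy([0.99, 0.01]))     # ~0.08 bits: an answer the listener can already guess
print(entropy([1 / 16] * 16))    # 4.0 bits: one possibility out of sixteen equally likely ones
```

The less predictable the distribution of possible messages, the higher the entropy, and the more an actual answer can tell the person waiting to hear it.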
Picture yourself making that long-distance call, but this time in response to a blistering letter from your friend, who has heard that you’re having a summer fling with a co-worker. The letter is clear: “Call me and just say yes it’s true, or no it’s not—nothing more!” That either/or demand means you won’t require the three-minute channel capacity of the telephone line. But since your wavering friend has only an even chance of predicting your answer, that one bit of information will reduce his or her uncertainty by 50 percent. As a matter of fact, that’s how the theory defines a bit (taken from binary digit) of information. It’s communication that can cut entropy in half. Let’s play out the scene a few bits further.
Reducing Entropy Bit by Bit
At the beginning of the telephone conversation, you truthfully acknowledge romantic feelings for someone else. Your former friend breaks the rule stated in the letter and demands to know which one of the sixteen potential staff workers is the object of your affection. The conversation could literally narrow down the alternatives bit by bit.
Friend: Is this special someone on the sports staff or the kitchen crew?
You: Sports staff. [Cut in half to eight.]
Friend: Which cabin does this new friend live in, Sequoia or Cherokee?
You: Sequoia. [Cut in half to four.]
Friend: First-year staff or an old-timer?
You: First year. [Cut in half to two.]
Friend: The redhead or the blond?
You: The blond. [All uncertainty gone.]
Removing all uncertainty took four bits of information. Of course, it would have been less cumbersome simply to say the name in the first place, but either way four bits of entropy were eliminated.
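The arithmetic behind the exchange is just a base-2 logarithm: identifying one person among sixteen equally likely candidates takes log2(16) = 4 either/or answers. A quick sketch (the candidate names and the halving step are illustrative, not from Griffin’s text):

```python
import math

candidates = [f"staffer_{i}" for i in range(16)]   # sixteen equally likely possibilities
print(math.log2(len(candidates)))                  # 4.0 bits needed to single one out

# Each either/or answer keeps only the half of the pool consistent with the reply.
pool = candidates
for answer in ["sports staff", "Sequoia", "first year", "the blond"]:
    pool = pool[: len(pool) // 2]                  # stand-in for "keep the matching half"
    print(f"after '{answer}': {len(pool)} candidates left")
```

Each reply prints a pool half the size of the one before: eight, four, two, one.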
If this conversation really happened, and if you truly cared about the person on the other end of the line, you would try to squeeze every bit of innovative explanation for your conduct into the three-minute period. That’s what Shannon and Weaver mean by information. As they use the term, it “relates not so much to what you do say, as to what you could say.”4 Their focus on message possibilities inspired a touch of doggerel from University of Colorado professor Don Darnell:
What one does is only one
Of several things he might have done.
One must know the things rejected
To appreciate the one selected.5
A good connection for three minutes provides lots of opportunity to draw on a wide repertoire of messages. If lack of imagination or situational constraints limit you to a few predictable clichés such as “only good friends” or “doesn’t mean a thing to me,” Shannon, Weaver, and probably your ex-friend will regard your efforts as uninformative and redundant.
There are many fine things that can be said over a communication channel that don’t qualify as information. Perhaps your phone call wasn’t crisis-motivated, but was merely a way to announce “I just called to say I love you.” If the person on the other end had no doubt that you cared in the first place, the call is a warm ritual rather than information. If the destination party already knows what’s coming, or the source isn’t free to choose the message sent, information is zero.
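In the terms of the entropy sketch above, a message the destination expects with certainty has probability 1, and the base-2 logarithm of 1 is 0, so it contributes exactly 0 bits. Only a message that could have been otherwise carries information.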
Noise vs. Information
Noise is the enemy of information. For Shannon and Weaver, noise is more than an irritating sound or static on the line. It is anything added to the signal that’s not intended by the source. Usually that kind of interference is an unintended by-product of the situation. In nonelectrical channels, noise can be smudged newsprint, ah-um-er vocal filler, or visual movement that distracts the listener. There is a ground-floor seminar room at my college that overlooks a grassy knoll. The first warm day in May brings out a flock of sunbathers to soak up the rays. No teacher can begin to compete with the view; the room is too noisy.
Noise may be intentional. For many years, the government of the former Soviet Union jammed the Voice of America broadcasts so that its citizens wouldn’t hear news from the West. Hecklers try to drown out the words of a speaker in order to prevent the audience from considering an opposing viewpoint. We can even generate white noise to mask more disruptive sounds. That’s the purpose of Muzak.