The efficiency of a distributed algorithm is assessed by at least one of three
classic distributed complexity measures: time complexity (number of rounds for
synchronous algorithms), communication or bit complexity (total number of bits
transmitted), and message complexity (total number of messages transmitted).
Depending on the application, one or another measure might be more relevant.
Generally speaking, time complexity has received most attention; but communication
complexity (bandwidth constraints) or message complexity (accounting
for message overhead) play a vital role as well. One cannot simply ignore any of the
measures, as there are tradeoffs: one may, for instance, sometimes cut down on
time by exchanging larger messages. Alternatively, one may save messages and
bits by communicating “silently”. Two parties may for instance communicate for
free by telephone by simply never picking up the phone, and instead letting the
phone ring for a long time when transmitting a binary 1, and just a short time
for a binary 0. A more sophisticated example of silent communication employs
time-coding to convey information through time. As an illustration, consider
pulse-position modulation, as used in wireless and optical communication. A k-bit
message can be dispersed over time by encoding it as a single pulse in one of
2^k possible slots. Employing a single pulse within time t allows one to
communicate at most log t bits. Reducing message complexity is harder