One of the noteworthy things about this field, as in many other areas of technology, is how
little the fundamental principles change over time. Systems are dramatically smaller, operating
speeds are truly remarkable, and new gadgets surface every day, leaving us to
wonder where technology is taking us. However, if we take a moment to consider that the
majority of the devices in use today were invented decades ago and that design techniques
appearing in texts as far back as the 1930s are still in use, we realize that most of what we
see is a steady improvement in construction techniques, general characteristics,
and methods of application rather than the development of new elements and fundamentally
new designs. The result is that most of the devices discussed in this text have been
around for some time, and that texts on the subject written a decade ago are still good references
with content that has not changed very much. The major changes have been in the
understanding of how these devices work and their full range of capabilities, and in
improved methods of teaching the fundamentals associated with them. The benefit of all
this to the new student of the subject is that the material in this text has, we hope,
reached a level where it is relatively easy to grasp, and the information will remain
applicable for years to come.
The miniaturization that has occurred in recent years leaves us to wonder about its limits.
Complete systems now appear on wafers thousands of times smaller than a single element
of earlier networks. The first integrated circuit (IC) was developed by Jack Kilby while
working at Texas Instruments in 1958 (Fig. 1.1). Today, the Intel® Core™ i7 Extreme