Although a variety of instruments can measure power, the most accurate combination is a power meter used with a power sensor. The sensor is an RF power-to-voltage transducer, and the power meter displays the detected voltage as a power value in logarithmic (dBm) or linear (watts) units. Typical power meter instrumentation accuracy is on the order of hundredths of a dB, while other instruments (e.g., spectrum analyzers and network analyzers) have power measurement accuracies of tenths of a dB or worse.

One of the main differences between these instruments is frequency selectivity. A frequency-selective measurement determines the power within a specified bandwidth. The traditional power meter is not frequency selective: it measures the average power over the full frequency range of the sensor, so its reading includes the power of the carrier as well as any harmonics that may be present. A spectrum analyzer, by contrast, provides a frequency-selective measurement because it measures within a particular resolution bandwidth. This lack of frequency selectivity is the main reason power meters measure only down to around -70 dBm, whereas instruments such as spectrum analyzers can measure much lower levels when narrow resolution bandwidths are used.
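The relationship between the two display units, and the effect of a broadband sensor summing carrier and harmonic power, can be sketched numerically. This is an illustrative example only; the helper names and the -20 dBm harmonic level are assumptions chosen for demonstration, not values from the text.

```python
import math

def watts_to_dbm(p_watts):
    """Convert power in watts to dBm (dB relative to 1 mW)."""
    return 10.0 * math.log10(p_watts / 1e-3)

def dbm_to_watts(p_dbm):
    """Convert power in dBm back to watts."""
    return 1e-3 * 10.0 ** (p_dbm / 10.0)

# A broadband power sensor responds to everything within its frequency
# range, so its reading is the sum of all spectral components in watts.
# Assumed example: a 0 dBm carrier plus a -20 dBm second harmonic.
carrier_w = dbm_to_watts(0.0)      # 1 mW
harmonic_w = dbm_to_watts(-20.0)   # 0.01 mW
broadband_reading_dbm = watts_to_dbm(carrier_w + harmonic_w)

# The harmonic inflates the broadband reading by roughly 0.04 dB,
# whereas a frequency-selective instrument (e.g. a spectrum analyzer
# tuned to the carrier) would report the 0 dBm carrier alone.
print(f"Broadband reading: {broadband_reading_dbm:.3f} dBm")
```

Note that powers must be summed in linear units (watts), not in dBm; adding dBm values directly corresponds to multiplying powers.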