A digital voltmeter (DVM) measures an unknown input voltage by converting it to a digital value and then displaying that value in numeric form. DVMs are usually designed around a special type of analog-to-digital converter called an integrating converter.
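The dual-slope method is the classic integrating-converter technique: the unknown voltage is integrated for a fixed time, then a reference of opposite polarity is integrated back to zero while a counter runs. The sketch below models this in idealized form; the function name, count values, and voltages are illustrative assumptions, not part of any particular DVM design.

```python
# Idealized model of a dual-slope (integrating) ADC, the converter
# type commonly used in DVMs.  All names and values are illustrative.

def dual_slope_reading(v_in, v_ref, run_up_counts=1000):
    """Reconstruct v_in via the dual-slope principle.

    Phase 1: integrate the unknown v_in for a fixed number of clock
    counts (run_up_counts).  Phase 2: integrate the reference of
    opposite polarity until the integrator returns to zero, counting
    clock ticks.  The count is proportional to v_in / v_ref, and the
    integrator's RC constant cancels out, so it is omitted here.
    """
    ramp = v_in * run_up_counts              # integrator level after phase 1
    run_down_counts = round(ramp / v_ref)    # ticks to ramp back to zero
    return v_ref * run_down_counts / run_up_counts

print(dual_slope_reading(1.234, 2.0))  # ≈ 1.234
```

Because both phases use the same clock and integrator, component tolerances largely cancel, which is why integrating converters suit precision voltage measurement.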
DVM measurement accuracy is affected by many factors, including temperature, input impedance, and variations in the DVM's power-supply voltage. Less expensive DVMs often have an input resistance on the order of 10 MΩ, and the resistance may differ between input voltage ranges. Precision DVMs can have input resistances of 1 GΩ or higher. To ensure that a DVM's accuracy remains within the manufacturer's specified tolerances, it must be periodically calibrated against a voltage standard such as the Weston cell.
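The effect of input resistance can be seen with a simple voltage-divider model: the meter's input resistance forms a divider with the source resistance of the circuit under test, pulling the reading low. This is a minimal sketch under that ideal-divider assumption; the function name and resistance values are illustrative.

```python
# Sketch of meter loading: a DVM with input resistance r_input
# measuring a source with open-circuit voltage v_true and source
# resistance r_source sees a voltage divider.  Values illustrative.

def measured_voltage(v_true, r_source, r_input):
    """Voltage actually seen by the DVM (ideal divider model)."""
    return v_true * r_input / (r_source + r_input)

v = 5.0        # true open-circuit voltage, volts
r_s = 100e3    # 100 kOhm source resistance

# 10 MOhm input (typical inexpensive DVM): reading is about 1% low.
print(measured_voltage(v, r_s, 10e6))
# 1 GOhm input (precision DVM): the loading error is negligible.
print(measured_voltage(v, r_s, 1e9))
```

This is why the higher input resistance of precision DVMs matters most when measuring high-impedance circuits.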
The first digital voltmeter was invented and produced in 1954 by Andrew Kay of Non-Linear Systems, who later founded Kaypro.[1]