The basic sensitivity of a meter movement might be, for instance, 100 microamperes full scale, with a voltage drop of, say, 50 millivolts at full current. Such meters are often calibrated to read some other quantity that can be converted to a current of that magnitude.

The use of current dividers, often called shunts, allows a meter to be calibrated to measure larger currents: a low-value resistor in parallel with the coil carries the excess current, and the scale is marked accordingly.

If the resistance of the coil is known, a meter can be calibrated as a DC voltmeter by calculating the voltage required to produce full-scale current. It can be configured to read higher voltages by placing it in a voltage divider circuit, generally by connecting a resistor (a multiplier) in series with the meter coil.

A meter can be used to read resistance by placing it in series with a known voltage (a battery) and an adjustable resistor. In a preparatory step, the circuit is completed and the resistor adjusted to produce full-scale deflection. When an unknown resistor is then placed in series in the circuit, the current will be less than full scale, and an appropriately calibrated scale can display the value of the previously unknown resistor.
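The scaling described above is all straightforward Ohm's-law arithmetic. A minimal sketch, using the 100 µA / 50 mV movement from the text; the 1 A current range, 10 V voltage range, and 1.5 V battery are hypothetical design targets chosen for illustration:

```python
# Meter-scaling calculations for a movement with 100 uA full-scale
# deflection and a 50 mV drop at full current (figures from the text).

I_FS = 100e-6      # full-scale current, amperes
V_FS = 50e-3       # voltage across the coil at full scale, volts

# Coil resistance follows from Ohm's law.
R_coil = V_FS / I_FS                      # 500 ohms

# Ammeter: a shunt in parallel carries the excess current at the
# same 50 mV drop. For a hypothetical 1 A full-scale range:
I_range = 1.0
R_shunt = V_FS / (I_range - I_FS)         # about 0.05 ohms

# Voltmeter: a series multiplier resistor drops the excess voltage.
# For a hypothetical 10 V full-scale range:
V_range = 10.0
R_mult = V_range / I_FS - R_coil          # 99,500 ohms

# Ohmmeter: with an assumed 1.5 V battery, the adjustable resistor
# is set so that the total series resistance gives full-scale
# current when the circuit is completed (leads shorted).
V_bat = 1.5
R_total = V_bat / I_FS                    # 15,000 ohms

def ohmmeter_current(r_unknown):
    """Current through the meter with an unknown resistor in series."""
    return V_bat / (R_total + r_unknown)

# Half-scale deflection occurs when the unknown equals R_total,
# which is why ohmmeter scales are nonlinear and crowded at the
# high-resistance end.
```

Note that the ohmmeter scale runs backwards (zero ohms gives full-scale deflection, infinite ohms gives no deflection) and is nonlinear, unlike the evenly divided current and voltage scales.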