Why is analog "voltmeter sensitivity" defined in ohms per volt?
Figure 1. DC voltmeter with range selector switch.
A simple DC voltmeter with range selector switches is shown in Figure 1.
- The meter movement will read full scale at 0.25 V and has a resistance of 5 kΩ. Adding 15 kΩ in series with the coil will convert the meter to 1 V full scale and the resistance will be 20 kΩ.
- For each successive range the total resistance must increase in proportion to the full-scale voltage: 20 kΩ per volt of range.
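The range-resistor arithmetic above can be sketched as follows. This is a minimal example assuming the figures given for the movement in Figure 1 (50 µA full-scale current, 5 kΩ coil); the range values chosen in the loop are illustrative only.

```python
# Assumed movement parameters, per Figure 1: 0.25 V / 5 kohm coil,
# i.e. a 50 uA full-scale current and 20 kohm/V sensitivity.
I_FS = 50e-6      # full-scale current (A)
R_COIL = 5e3      # meter coil resistance (ohms)

def series_resistor(v_full_scale):
    """Series resistance needed to extend the meter to a given
    full-scale voltage: total resistance minus the coil itself."""
    return v_full_scale / I_FS - R_COIL

for v in (1, 2.5, 10, 100):
    print(f"{v:>5} V range: add {series_resistor(v) / 1e3:.0f} kohm in series")
```

For the 1 V range this reproduces the 15 kΩ series resistor from the list above, and each higher range simply scales the total resistance at 20 kΩ per volt.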
The Ω/V figure is used to work out how much the meter loads, and therefore disturbs, the circuit being measured.
Digital meters tend to have fixed input impedance. Typically this is 10 MΩ and can be ignored in most applications where high accuracy is not required.
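To make the loading effect concrete, here is a hedged sketch with assumed values: a 20 kΩ/V analog meter on its 10 V range measuring the midpoint of a 100 kΩ + 100 kΩ divider, compared with a typical 10 MΩ DMM. The divider values and supply voltage are illustrative, not from the question.

```python
# Assumed test circuit: 10 V supply into a 100k / 100k divider,
# so the unloaded midpoint voltage is exactly 5.0 V.
R_TOP = 100e3
R_BOT = 100e3
V_SUPPLY = 10.0

def reading(r_meter):
    """Voltage indicated when the meter resistance appears in
    parallel with the bottom divider resistor."""
    r_parallel = R_BOT * r_meter / (R_BOT + r_meter)
    return V_SUPPLY * r_parallel / (R_TOP + r_parallel)

analog = reading(20e3 * 10)   # 20 kohm/V meter on its 10 V range = 200 kohm
dmm = reading(10e6)           # typical fixed DMM input impedance
print(f"analog meter reads {analog:.2f} V, DMM reads {dmm:.2f} V")
```

The 200 kΩ analog meter pulls the 5 V midpoint down to 4.0 V, while the 10 MΩ DMM reads about 4.98 V, which is why the DMM's loading can usually be ignored.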
Figure 2. AVO 8 multimeter. Photo by Zureks on Wikipedia.
There are circuit schematics and some history for the AVO (amps, volts, ohms) meters on Richard's Radios; these are quite instructive and show some very clever circuits that maximise the utility of these instruments.
Because the meter movement reaches full-scale deflection (FSD) at some specified current, a series resistor is used to turn it into a voltage-measuring device. The total resistance (series resistor plus coil) will be V(full scale) / I(full scale). So the resistance of the meter per volt of full scale, its sensitivity, is 1 / I(full scale) ohms per volt, regardless of range.
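The relation above can be checked numerically. A small sketch, assuming the 50 µA movement from Figure 1: the total resistance on every range is V_fs / I_fs, so resistance divided by range is the same constant 1 / I_fs on all ranges.

```python
# Assumed 50 uA full-scale movement, as in Figure 1.
I_FS = 50e-6

# Sensitivity in ohms per volt is simply the reciprocal of the
# full-scale current.
sensitivity = 1 / I_FS

# On any range V_fs, total meter resistance is V_fs / I_FS, so the
# ohms-per-volt figure is identical for every range.
for v_fs in (0.25, 1.0, 10.0, 100.0):
    total_r = v_fs / I_FS
    assert abs(total_r / v_fs - sensitivity) < 1e-6

print(f"sensitivity = {sensitivity / 1e3:.0f} kohm/V")
```

For a 50 µA movement this gives the 20 kΩ/V figure used throughout the range table above.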