What causes a fuse to blow, the current or the power?
It's the watts dissipated in the fuse element itself, not the watts in the system. Since the fuse has its own resistance \$R\$, it is the current that determines that power: \$I^2 R\$.
The voltage has nothing to do with it: at 6 V, 12 V or 240 V, the fuse still blows at 20 A. However, you cannot use a low-voltage fuse in a high-voltage application: it will still blow at (strictly, slightly above) its rated current, but it may sustain an arc that an HV fuse would extinguish.
So if a fuse is rated for 12V DC and 20 Amps, this would be equal to 240 Watts. If a different voltage is supplied, will this change the Amps at which the fuse will break? Does the fuse technically 'blow' at 240 Watts?
All the fuse knows (before it blows) is the current passing through it. This might be:
- 20 amps from a 1 volt supply feeding a 0.05 ohm load, or
- 20 amps from a 100 volt supply feeding a 5 ohm load.
The fuse knows nothing about load power. It is \$I^2 R_{FUSE}\$ dissipation in the fuse that causes it to heat and eventually blow (due to a combination of internal power dissipation and time).
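To make the point concrete, here is a minimal sketch of the two scenarios above. The fuse resistance of 0.01 Ω is an assumed illustrative value (and is neglected when computing the load current, as in the figures above); real fuse element resistances vary by type and rating.

```python
R_FUSE = 0.01  # assumed fuse element resistance in ohms (illustrative only)

def fuse_dissipation(current_a, r_fuse=R_FUSE):
    """Power dissipated in the fuse element itself: I^2 * R."""
    return current_a ** 2 * r_fuse

# Two very different circuits, same 20 A through the fuse:
cases = [
    {"supply_v": 1,   "load_ohm": 0.05},  # 20 A from a 1 V supply
    {"supply_v": 100, "load_ohm": 5.0},   # 20 A from a 100 V supply
]

for c in cases:
    i = c["supply_v"] / c["load_ohm"]  # load sets the current (fuse R neglected)
    print(f"{c['supply_v']:>3} V supply: I = {i:.0f} A, "
          f"load power = {i * c['supply_v']:.0f} W, "
          f"fuse dissipation = {fuse_dissipation(i):.1f} W")
```

The load powers differ by a factor of 100 (20 W vs 2000 W), yet the heating inside the fuse element is identical in both cases, because the current through it is the same.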
Make sure the voltage rating is also sufficient, or the fuse may not disconnect correctly. Also make sure the fuse can handle the large rupture current that could flow in some circuits; for example, you can get fuses rated at only 100 mA that have a rupture current rating of hundreds of amps.
The current rating is the characteristic that defines when the fuse will blow. The voltage rating is the characteristic that defines how high the voltage can be without producing an arc while or after the fuse blows. Multiplying the two values has no meaning.