How current works

Instead of saying the consumer (circuit) draws as much current as it needs, it is better to say that as much current flows as the circuit allows.

How much current a circuit will allow is based on its resistance (or impedance). An LED has a forward voltage drop but very little resistance of its own, so once the supply exceeds that drop, more current will flow through it than is healthy.

A resistor is an easy and common way to limit this current. There are many, many examples on this site alone of how to do that. Search for "LED current limiting".

To get you started, use Ohm's Law to calculate the resistor you want:

Subtract the LED's voltage drop from your source voltage. This is how much voltage you need to drop across your resistor. Now use the amount of current you want to pass through your LED to calculate (using Ohm's Law) the size of resistor that will do this.
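
As a minimal sketch of that calculation, using the 5 V supply and 15 mA target from this question and a typical IR LED forward drop of about 1.25 V (substitute the forward voltage from your own LED's datasheet):

    # Minimal sketch: series resistor sizing by Ohm's Law.
    # Values assumed from this question; check your LED's datasheet.
    v_supply = 5.0      # supply voltage (V)
    v_led = 1.25        # LED forward voltage drop (V)
    i_led = 0.015       # desired LED current (A)

    v_resistor = v_supply - v_led       # voltage the resistor must drop: 3.75 V
    r = v_resistor / i_led              # Ohm's Law: R = V / I  ->  250 ohms
    print(f"Resistor must drop {v_resistor:.2f} V, so R = {r:.0f} ohms")

250 Ω is not a standard value, so in practice you would pick the nearest preferred value, e.g. 270 Ω, which gives slightly less than 15 mA.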


I am using an infrared LED in a circuit and it seems like it is drawing around 15 mA at 5 V.

That is unlikely unless it has a built-in resistor.


Figure 1. The I versus V curves for a range of typical LEDs show that the forward voltage of an IR LED is about 1.25 V at 20 mA. Source: LED IV curves.

Note that if you were to apply even 2 V to the IR LED, the current would exceed 100 mA and it might not last long. At 5 V a bare IR LED would almost certainly die instantly.
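
To see why the curve is so unforgiving, here is a rough sketch using the Shockley diode equation with purely illustrative assumed parameters (ideality factor, saturation current and a few ohms of internal resistance, picked so the model passes roughly 20 mA at 1.25 V); your LED's real curve will differ, so treat the numbers as order-of-magnitude only:

    # Rough, illustrative model only: Shockley diode equation plus a small
    # assumed internal series resistance, fitted to pass about 20 mA at 1.25 V.
    import math

    N_VT = 2.0 * 0.02585   # assumed ideality factor (2) x thermal voltage (~25.85 mV)
    I_S = 4.4e-12          # assumed saturation current (A)
    R_INT = 5.0            # assumed internal series resistance (ohms)

    def led_current(v_applied):
        """Solve v_applied = n*Vt*ln(I/Is) + I*R_int for I by bisection."""
        lo, hi = 1e-12, 10.0                  # bracketing currents (A)
        for _ in range(100):
            mid = (lo + hi) / 2
            if N_VT * math.log(mid / I_S) + mid * R_INT > v_applied:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    for v in (1.1, 1.25, 1.5, 2.0, 5.0):
        print(f"{v:4.2f} V across the LED -> about {led_current(v) * 1000:6.0f} mA")

The exact figures are made up, but the shape is the point: a small increase in voltage produces a very large increase in current, which is why a bare IR LED on 5 V does not survive.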

I was told today that I am running a chance of burning it, since infrared LEDs require 1 to 2 volts, and that I should consider adding a resistor to protect it from higher currents.

A better explanation would be that you need to limit the current through the LED to a safe value. A resistor is the simplest way of doing this.

I got confused because my understanding of current is that the consumer draws as much as it needs ...

Yes, but the graph is showing us that at 5 V it "needs" a very high current.

... and the only requirement for the power source regarding current is that it must supply at least that.

That's correct (subject to my previous comment).

For example, if my power source is a 5 V 1 Ah battery the current flowing would still be 15 mA regardless of the higher capability of the battery.

Only if there is a current limiter in the circuit. Again, a resistor would be the normal solution.

I got further more confused by the second part. I thought resistors affect the EMF, not current, ...

The resistor will both limit the current and create a voltage drop.
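
For example, with a 5 V supply and an LED dropping 1.25 V, a series resistor R sets the current at (5 − 1.25) V / R while dropping the remaining 3.75 V itself; change R and both the current and the resistor's share of the voltage change together.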

How can I calculate what resistance I need in order to limit current to 15 mA?

There are many, many sites on the Internet that explain this. The load-line tool below may be of interest.


Figure 2. A loadline graphic calculator for various LEDs on a 5 V supply. Source: Loadline resistance graphic tool.

To use the tool:

  • Select your LED colour: IR in your case.
  • Select the current you want: 15 mA in your case.
  • On the If axis move up to the 15 mA line and move across to the IR curve.
  • Select the nearest loadline curve: 270 Ω.

A 270 Ω resistor will limit the current through your IR LED to about 15 mA on a 5 V supply. There will be 1.25 V across your LED and about 3.75 V across your 270 Ω resistor.
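
As a quick cross-check with Ohm's Law: (5 V − 1.25 V) / 270 Ω ≈ 13.9 mA, which agrees with the roughly 15 mA read off the loadline, and errs on the safe (lower-current) side.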


I got confused because my understanding of current is that the consumer draws as much as it needs and the only requirement for the power source regarding current is that it must supply at least that.

This is an extremely common misunderstanding. There is nothing fundamental about electronics that makes this the case. It just happens to be a property of some devices.

For example, you can buy two 120 V light bulbs and be assured that even if they are different brightnesses, they will both run correctly on 120 V. But this is because you bought 120 V light bulbs.

You could imagine an alternate universe where you bought two 1 A light bulbs. You could be assured that they would both run correctly on 1 A even if they were different brightnesses, but they would operate at different voltages. In this world, we might have 1 A power supplies that provide whatever voltage the device needs, say up to 500 V. As long as your supply was the right current and its maximum voltage was sufficient, the device would work safely.

So "the consumer draws as much as it needs and the only requirement for the power source regarding current is that it must supply at least that" is an attribute of devices that are specified to work with a specific input voltage. An infrared LED is not such a device. In fact, it has a specified forward current that you are expected to supply and the voltage will be whatever is required to drive that current.