How are LEDs considered efficient?
You seem to be conflating the efficiency of the LED itself with the efficiency of the circuit that drives it.
In terms of light output per unit of energy consumed, LEDs are an efficient way to generate light. In absolute terms they aren't great: they are around 10%[1] efficient in that respect. However, that is still far better than the ~1-2% of a conventional incandescent bulb.
But what of the power wasted in the resistor? A series resistor is the simplest way to drive an LED, but it is far from the only way to do so.
Even sticking with a resistor, what if we put 20 of your 2 V LEDs in series and supply the string with 45 V? Now you are using 45 V × 20 mA = 900 mW, of which 800 mW goes into the LEDs and only 100 mW (11%) is dissipated in the series resistor.
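As a quick check of those numbers, here is a minimal sketch (the 2 V forward voltage and 20 mA current are the figures from the question, not universal LED values):

```python
# Series LED string fed from one supply through a single resistor.
n_leds = 20
v_forward = 2.0   # V per LED (figure from the question)
i = 0.020         # A string current
v_supply = 45.0   # V

v_leds = n_leds * v_forward       # 40 V across the LED string
v_resistor = v_supply - v_leds    # 5 V left across the resistor
r = v_resistor / i                # 250 ohm series resistor

p_total = v_supply * i            # 0.900 W drawn from the supply
p_leds = v_leds * i               # 0.800 W delivered to the LEDs
p_resistor = v_resistor * i       # 0.100 W wasted as heat

print(f"R = {r:.0f} ohm, resistor loss = {p_resistor / p_total:.0%}")  # ~11%
```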
But we can make it even more efficient. The reason for the resistor is that an LED needs a constant current, while most electronics are designed to supply a constant voltage. The easiest way to convert from one to the other (assuming a constant load) is to throw in a series resistor.
You can also get constant-current power supplies. If you use one of those to drive your LEDs, the resistor can be eliminated entirely and well over 90% of your total system power can go into the LEDs.
For a home project or a simple indicator, a resistor is a lot cheaper and simpler. But if you are driving a lot of LEDs, the logical choice is to pay a bit more, accept a slightly more complex circuit, and use a dedicated constant-current LED driver IC.
[1] As noted in the comments, 10% is a good ballpark for current household lighting and probably also about right for cheap commodity LEDs using older processes. Newer single-colour parts can achieve significantly higher efficiency.
The efficiency of an LED refers to how efficiently the LED itself converts electrical power to light. This has nothing to do with how efficient the driving circuit is.
In many cases, the overall circuit efficiency of LEDs is not much of an issue. If the LED is just being used as an indicator, it is low power in the first place. A typical green LED drops 2.1 V and is plenty bright enough for indicator use at 20 mA. That's 42 mW of power going into the LED. Even if an additional 50 mW is lost in the circuit driving the LED, the total power consumption is still inconsequential in many cases.
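To put rough numbers on that (a minimal sketch; the 5 V supply rail is an assumption for illustration, not something the answer specifies):

```python
# Indicator LED driven from a logic rail through a series resistor.
v_supply = 5.0   # V (assumed rail, for illustration only)
v_led = 2.1      # V drop of a typical green LED (figure from the answer)
i = 0.020        # A

r = (v_supply - v_led) / i           # 145 ohm; use the nearest standard value
p_led = v_led * i                    # 42 mW into the LED
p_resistor = (v_supply - v_led) * i  # ~58 mW lost in the resistor

print(f"R = {r:.0f} ohm, LED {p_led * 1e3:.0f} mW, resistor {p_resistor * 1e3:.0f} mW")
```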
In some low-power applications, 100 mW can be a major amount of power. In such cases, more care is taken with the circuit than just a cheap and simple series resistor to some handy supply. Various tricks include using a higher-efficiency LED and running it at lower current, using a supply that is only a little above the LED voltage, adjusting the user interface so that blinking or otherwise keeping the LED off part of the time is acceptable, and using a high-efficiency constant-current power supply to drive the LED.
Efficiency also matters in high-power applications, such as lighting. In such cases, more effort and production cost are put into the electronics to minimize the power dissipated outside the LED. Often the main reason for maximizing efficiency isn't so much avoiding wasted power as not having to deal with the heat that the wasted power produces.
The question of how efficient actual LEDs are is a good one, but the answer is more complex than you might expect. Illumination capability is usually expressed in "lumens".
LED efficiency is usually expressed in terms of either

- light energy output, or
- illumination capability

per unit of energy input.
Efficiency is therefore usually quoted either in lumens per Watt (lm/W) or in light energy output per Watt (W/W). The first figure is more useful in practical illumination applications, but the second is more meaningful in terms of energy conversion efficiency.
If lumens and light energy had a fixed relationship, then determining efficiency would be simple. However, what a given lumen figure represents in terms of "light energy" varies with the spectral composition of the light.
Lumens are defined in terms of the theoretical response curve of the human eye. The same amount of light energy will produce a different number of lumens as the wavelength, or mix of wavelengths, varies. As a consequence, the wavelength or wavelengths of the source play an important part in the lumens produced per unit of energy input.
At the short-wavelength end of the visible spectrum (not quite UV), eye sensitivity is extremely low, so lumens per Watt are low - so much so that it is usual to quote the output of deep blue and "Royal Blue" sources in terms of mW/W (light energy per unit of electrical energy). This is highly useful, because an LED family that includes both phosphor-less and phosphor-based parts allows some direct comparisons to be made.
For example, the "top flux bin" of the Cree Royal Blue XT-E LED, when operated at Vf = 2.85 V and If = 350 mA, produces 613 mW typical (600 / 613 / 625 mW min/typ/max) at a wavelength of 465 nm.
The electrical input is 2.85 V × 350 mA = 997.5 mW, so that equals an electrical-to-light conversion efficiency of 60.2% / 61.5% / 62.7% min/typ/max.
See page 19 of the Cree XT-E datasheet, top right of the table, part number XTEARY-00-0000-000000Q01.
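A quick sketch of that arithmetic, using only the datasheet figures quoted above:

```python
# Wall-plug efficiency of the Royal Blue XT-E from the quoted figures.
v_f = 2.85    # V forward voltage
i_f = 0.350   # A drive current
p_electrical = v_f * i_f   # 0.9975 W electrical input

for label, p_optical_mw in [("min", 600), ("typ", 613), ("max", 625)]:
    efficiency = (p_optical_mw / 1000) / p_electrical
    print(f"{label}: {efficiency:.1%}")   # 60.2% / 61.5% / 62.7%
```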
The top white phosphor-converted version of the same LED produces 180 lumens at 25 °C when driven at 2.77 V and 350 mA = 970 mW DC in, or 186 lm/W.
If the conversion efficiency of the Royal Blue and white LEDs were the same, then the white LED would have a 100%-efficiency figure of 186 / 61.5% = 302 lm/W. However, the two are not (quite) identical, because in the white LED a portion of the die's blue light is used directly while the remainder excites the phosphor(s), with some loss in the light-to-light conversion.
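The extrapolation as a sketch (this assumes, as the answer does, that the white part's blue die matches the Royal Blue part's 61.5% typical efficiency):

```python
# Extrapolate the white LED's measured efficacy to a hypothetical
# 100%-efficient emitter, using the Royal Blue part as a proxy for
# the blue die's electrical-to-light efficiency.
lumens = 180
v_f, i_f = 2.77, 0.350
p_in = v_f * i_f                  # ~0.970 W electrical input
efficacy = lumens / p_in          # ~186 lm/W as measured
blue_efficiency = 0.615           # typ figure for the Royal Blue part

efficacy_at_100pct = efficacy / blue_efficiency   # ~302 lm/W
print(f"{efficacy:.0f} lm/W measured -> {efficacy_at_100pct:.0f} lm/W at 100%")
```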
As has been noted, Wikipedia (correctly) states that the maximum theoretical figure is 683 lm/W.
How can this be reconciled with the claim that 100% white LED efficiency is ~300 lm/W, and with the fact that various manufacturers are now making white LEDs with efficiencies > 300 lm/W?
The answer lies in the useful but arcane (or arcane but useful) fact that the lumen rating is tied to eye response. Maximum eye sensitivity occurs at a wavelength of 555 nm, so the maximum possible efficacy in lm/W is achievable only with a monochromatic source at 555 nm. ANY other source, monochromatic or multi-wavelength, will have a lower theoretical 100% lm/W figure.
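To illustrate (a minimal sketch; the V(λ) values below are approximate CIE 1931 photopic luminosity figures, rounded):

```python
# Maximum possible luminous efficacy of a monochromatic source:
# 683 lm/W scaled by the eye's photopic response V(lambda).
V = {             # approximate CIE 1931 photopic values
    465: 0.074,   # "Royal Blue": the eye barely sees it
    510: 0.503,
    555: 1.000,   # peak eye sensitivity
    610: 0.503,
    650: 0.107,
}
for wavelength_nm, v in sorted(V.items()):
    print(f"{wavelength_nm} nm: at most {683 * v:.0f} lm/W")
```

This is why the Royal Blue parts above are rated in mW/W rather than lm/W: even a perfect emitter at 465 nm could only manage around 50 lm/W.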
The "ideal" white light source is a black body radiator at 5800k with its spectrum truncated to the 400-700 nm range and has a max efficiency of 251 l/W !!!!
By making various adjustments that keep the light "white" while altering the proportions of the various wavelengths, ever higher white efficacies can be achieved. A 2800 K black body truncated asymmetrically to achieve a CRI of 95 has a maximum theoretical efficacy of 370 lm/W.
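Figures like these come from integrating a truncated Planck spectrum against the eye's response. Here is a rough sketch; it uses a common Gaussian approximation to V(λ) rather than the exact CIE table, so it lands near, not exactly on, the 251 lm/W value:

```python
import math

def planck(wl_m, temp_k):
    """Relative black-body spectral radiance at wavelength wl_m (metres)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wl_m**5) / math.expm1(h * c / (wl_m * k * temp_k))

def v_photopic(wl_nm):
    """Gaussian approximation to the CIE photopic response V(lambda)."""
    um = wl_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def truncated_efficacy(temp_k, lo_nm=400, hi_nm=700, steps=3000):
    """Luminous efficacy (lm/W) of a black body truncated to [lo, hi] nm."""
    lum = rad = 0.0
    for i in range(steps):
        wl = lo_nm + (hi_nm - lo_nm) * (i + 0.5) / steps
        b = planck(wl * 1e-9, temp_k)
        lum += v_photopic(wl) * b
        rad += b
    return 683 * lum / rad

print(f"{truncated_efficacy(5800):.0f} lm/W")  # close to the quoted 251 lm/W
```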
But wait - there's more. Later, maybe.
I'll come back and add sources and more detail, but the above shows that the answer is harder than the question, and demonstrates that in true energy-out-per-energy-in terms the top modern LEDs achieve energy conversion efficiencies of > 50%.
More anon - light fades - rooftop job beckons ...
References (WIP):

- https://en.wikipedia.org/wiki/Luminous_efficacy
- "Analysis on the Luminous Efficiency of Phosphor-Conversion White Light-Emitting Diode"
- https://en.wikipedia.org/wiki/Light-emitting_diode#Efficiency_and_operational_parameters
- http://www.hi-led.eu/wp-content/themes/hiled/pdf/led_energy_efficiency.pdf
- http://www.philips.com/consumerfiles/newscenter/main/design/resources/pdf/Inside-Innovation-Backgrounder-Lumens-per-Watt.pdf
- (2014) http://www.forbes.com/sites/peterdetwiler/2014/03/27/leds-will-get-even-more-efficient-cree-passes-300-lumens-per-watt/
- http://www.cree.com/News-and-Events/Cree-News/Press-Releases/2014/March/300LPW-LED-barrier

Useful:

- http://boards.straightdope.com/sdmb/showthread.php?t=719499