Is there any reason why 5 volts is so ubiquitous for powering small consumer electronics?

While the original reason for 5V was doubtless TTL (as mentioned in the comments, specifically the reverse-biased B-E junctions of bipolar transistors, which are almost universally rated for about 6V), several other things play nicely with the 5V standard:

  • the reverse voltage rating of LEDs is usually 5-6 volts as well, so 5V is suitable for charlieplexing
  • charging single Li-Ion cells in devices like the phones and tablets mentioned above. Take the USB standard voltage, which is 5V ±5%, so 4.75V minimum at the USB socket. The cable then adds some resistance, so what arrives at the device can be lower by a few hundred millivolts. You can then use a simple linear battery-charging IC like the MCP73831, which charges the battery to e.g. 4.2V. Since the source and target voltages are close, the losses are manageable. Linear chargers are much simpler than the alternative (DC-DC converters) and are especially suitable for low-power devices, meaning simpler, more compact power electronics and lower cost. If the standard were 4.5V instead of 5V, the battery probably wouldn't charge completely in the worst case (and a DC-DC converter would be a must). If it were 6V instead of 5V, the linear regulator's losses would start to become considerable. 5V is a good middle ground; a rough voltage budget is sketched after this list.
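
To put rough numbers on that trade-off, here is a minimal back-of-the-envelope sketch. The cable drop, charger dropout, and tolerance figures are illustrative assumptions, not values from any particular datasheet:

```python
# Rough voltage/efficiency budget for linear Li-Ion charging from a USB-style
# rail. All drop values below are illustrative assumptions.

CELL_FULL_V = 4.2      # typical single Li-Ion termination voltage
CABLE_DROP_V = 0.25    # assumed loss in cable + connectors at charge current
DROPOUT_V = 0.15       # assumed dropout of a simple linear charger IC

def budget(nominal_v, tolerance=0.05):
    worst_case_in = nominal_v * (1 - tolerance) - CABLE_DROP_V
    can_terminate = worst_case_in >= CELL_FULL_V + DROPOUT_V
    # A linear charger passes the same current in and out, so efficiency
    # is roughly V_cell / V_in.
    efficiency = CELL_FULL_V / (nominal_v - CABLE_DROP_V)
    return worst_case_in, can_terminate, efficiency

for v in (4.5, 5.0, 6.0):
    vin, ok, eff = budget(v)
    print(f"{v:.1f} V rail: worst-case input {vin:.2f} V, "
          f"full charge possible: {ok}, rough efficiency {eff:.0%}")
```

With these assumptions a 4.5V rail can't reach the 4.2V termination voltage in the worst case, a 6V rail burns roughly a quarter of the power in the pass element, and 5V lands comfortably in between.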

On the last point, though, I'm not sure which way the causality runs. It could be that single-cell Li-Ion fitted perfectly with the supply rails typical of small electronics, which led to its ubiquitous use.


One of the reasons was that the European Commission facilitated an agreement among major handset manufacturers to adopt a common charger, based on the micro-USB connector, for data-enabled mobile phones sold in the EU.
And USB specifies 5V.

From the "One mobile phone charger for all" campaign:

Background:
In the past, mobile telephones were only compatible with specific mobile telephone chargers. An estimated 500 million mobile phones were in use in 2009 in all EU countries. The chargers used often varied according to the manufacturer and model, and more than 30 different types of charger were on the market. Apart from causing inconvenience to the consumer, this created unnecessary electronic waste.

Almost every household is believed to have gathered a number of old chargers – estimated to generate more than 51 000 tons of electronic waste per year in the EU.

What the Commission is doing
In response, the European Commission facilitated an agreement among major handset manufacturers to adopt a common charger for data-enabled mobile phones sold in the EU.

In June 2009, a Memorandum of Understanding (MoU) was signed in which mobile phone manufacturers agreed to harmonise chargers for new models of data-enabled handsets, coming onto the market as of 2011.

As a result, Europe's major mobile phone manufacturers agreed to adopt a universal charger for data-enabled mobile phones sold in the EU. The MoU committed the industry to provide charger compatibility on the basis of the micro-USB connector.


I have a strong suspicion that it's related to the typical lead acid battery voltage of 2.1V/cell.

Back in the day, lead-acid cells (rechargeable) were a convenient way to power vacuum tube heaters (which took a fairly high current), while dry cells (primary) were adequate for the anode supply.

So most vacuum tubes used increments of 2V for the heater supply: 2V for portable wireless sets, 4V was the UK standard pre-war, while "6V" (more precisely 6.3V) was a transatlantic standard that pretty much took over during/after WW2, around the time large vacuum tube equipment (radars, early computers, etc.) started being made.

Consequently, entering the 1960s, most high current AC transformers produced ... 6.3V AC.

Multiply by sqrt(2) for the peak voltage, subtract a couple of diode drops for a bridge rectifier, another volt for capacitor ripple, a volt or so for headroom in a linear regulator and you're left with ...

yup, 5V would be about the nearest round number.
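
For concreteness, here is that back-of-the-envelope chain with typical assumed drop values (the exact diode, ripple, and headroom numbers are guesses, not measurements):

```python
import math

# From a 6.3 V AC heater winding to a regulated DC rail, step by step.
v_rms = 6.3
v_peak = v_rms * math.sqrt(2)          # ~8.9 V at the rectifier input
v_after_bridge = v_peak - 2 * 0.7      # two silicon diode drops in a bridge rectifier
v_after_ripple = v_after_bridge - 1.0  # ripple valley on the smoothing capacitor
v_regulated = v_after_ripple - 1.5     # headroom for a linear regulator

print(f"peak {v_peak:.1f} V, after bridge {v_after_bridge:.1f} V, "
      f"ripple valley {v_after_ripple:.1f} V, usable rail ~{v_regulated:.1f} V")
# -> the usable rail comes out at almost exactly 5 V
```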

This is a plausible rationale for picking 5V in the early days of IC logic (not TTL but something earlier, maybe RTL). I have no documentary proof that this is the case; evidence either way would be welcome.

However, sometime in the 1960s it DID become established as the "normal" logic voltage, and everything else (unless it had a compelling reason, like ECL with -5.2V) was made compatible with that.

(Until process shrinks made 5V divided by the gate width an unreasonably large number in volts/metre, at which point voltage rails started coming down.)

(I can't trace it back to the width of a horse's ass in Roman times, but perhaps somebody else can...)