How can embedded systems estimate their battery status so precisely?
There is one thing that is obvious once stated, but not until then.
Your phone tells you it has "37% Charge Remaining". How do you know that's accurate? It's probably not.
The software may be doing some estimating based on average current draw since it was fully charged, average time between charges, and of course the discharge characteristics for the specific battery. Then it presents you with its best guess.
Over time, it can build up a reasonably accurate profile for the battery and use that to improve the estimates. But it is usually an estimate.
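To make that concrete, here's a minimal coulomb-counting sketch in C. The capacity, sample period, and the `read_current_ma()` helper are hypothetical stand-ins for whatever the real hardware provides, so treat it as an illustration of the bookkeeping rather than production firmware:

```c
/* Hypothetical battery parameters - real firmware would use the
 * pack's datasheet values and a learned capacity. */
#define FULL_CAPACITY_MAH  3000.0f   /* assumed rated capacity    */
#define SAMPLE_PERIOD_S       1.0f   /* assumed sampling interval */

/* Stand-in for an ADC or fuel-gauge read; positive = discharging. */
extern float read_current_ma(void);

static float consumed_mah = 0.0f;    /* charge drawn since last full */

/* Call once per SAMPLE_PERIOD_S, e.g. from a timer interrupt. */
void soc_sample(void)
{
    /* mAh = mA * hours: integrate the current over each sample. */
    consumed_mah += read_current_ma() * (SAMPLE_PERIOD_S / 3600.0f);
}

/* Best guess at state of charge, clamped to 0..100%. */
float soc_estimate_percent(void)
{
    float soc = 100.0f * (1.0f - consumed_mah / FULL_CAPACITY_MAH);
    if (soc < 0.0f)   soc = 0.0f;
    if (soc > 100.0f) soc = 100.0f;
    return soc;
}

/* Reset the integrator when the charger reports "full". */
void soc_mark_full(void)
{
    consumed_mah = 0.0f;
}
```

Note that nothing here is a measured SOC; it is all inference from an integrated current and an assumed capacity, which is why the estimate drifts between full charges.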
In my experience developing battery-based systems (smart batteries, dumb NiCad, and everything in between), the only times you can be confident of the charge level are 100% and 0%.
Usually, a smart battery will let you know when it's fully charged, and with a dumb one you are probably doing some calculations with current and temperature. That takes care of the 100% case.
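As an illustration of the "dumb battery" 100% case: for a lithium cell charged constant-current/constant-voltage, "full" is commonly declared once the cell sits at the CV setpoint and the charge current has tapered below roughly C/10. A hedged C sketch, where the thresholds and helper functions are assumptions, not values from any particular charger:

```c
#include <stdbool.h>

/* Hypothetical reads supplied by the charger hardware. */
extern float charge_current_ma(void);
extern float cell_voltage_mv(void);

/* Assumed figures for a single 3000 mAh Li-ion cell charged CC/CV. */
#define CV_SETPOINT_MV   4200.0f
#define TAPER_LIMIT_MA    300.0f   /* roughly C/10 */

bool battery_is_full(void)
{
    /* Full = sitting near the CV setpoint with the current tapered off.
     * A NiCad/NiMH charger would instead watch for the negative
     * delta-V dip and the temperature rise at end of charge. */
    return cell_voltage_mv() >= (CV_SETPOINT_MV - 50.0f)
        && charge_current_ma() <= TAPER_LIMIT_MA;
}
```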
The 0% case is where the sneakiness comes in. Whatever the battery chemistry, there is often a distinctive pattern in the discharge curve as you approach voltage collapse. But allowing a battery to go into deep discharge is generally a "Bad Thing" (TM).
So firmware looks for that pattern and decides when the battery is at a virtual "0%". Then it shuts the system down so that there's enough residual charge in the battery to avoid deep discharge and, more importantly, a sudden loss of power. This allows a graceful shutdown.
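A minimal version of that "virtual empty" check might look like the following. The cutoff voltage and debounce count are assumptions for a single Li-ion cell; real firmware would also compensate for load current and temperature so a heavy momentary load doesn't trigger a false shutdown:

```c
#include <stdint.h>

extern float cell_voltage_mv(void);           /* hypothetical ADC read */
extern void  request_graceful_shutdown(void); /* hypothetical hook     */

/* Assumed cutoff for one Li-ion cell: well above the ~2.5 V deep-
 * discharge region, leaving enough residual charge for a clean
 * shutdown and the "please charge me" screen on the next power-up. */
#define VIRTUAL_EMPTY_MV  3300.0f
#define DEBOUNCE_SAMPLES  5       /* ignore brief sags under load */

/* Call periodically, e.g. once a second from a timer. */
void empty_check_sample(void)
{
    static uint8_t below = 0;

    if (cell_voltage_mv() < VIRTUAL_EMPTY_MV) {
        if (++below >= DEBOUNCE_SAMPLES)
            request_graceful_shutdown();   /* declare virtual 0% */
    } else {
        below = 0;                         /* voltage recovered */
    }
}
```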
If this seems a little unlikely, let your phone "run down" and shut itself off. Then turn it back on again. If the battery were truly at 0%, it could not boot and power up the screen to tell you it needed charging.
The 5% (or perhaps 10%, depending on the precision of the measurements and the battery's tolerances) warning is also often somewhat artificial, again representing a point on the discharge curve where the firmware starts thinking "going to shut down soon".
Ironically, this is the level at which someone in marketing insists that you turn on that bright LED to tell the user they are about to run out of battery power.
As you mention, the voltage changes a little bit on charge/discharge. Millivolt-level measurements are reasonably straightforward, and every battery chemistry I'm familiar with has a voltage change of at least a few hundred millivolts between "full" and "effectively empty".
Most battery discharge curves are roughly linear, at least over the range in which most devices use them. Because of this, you can get a rough estimate of the remaining charge by remembering the last voltage peak (corresponding to full charge), knowing the voltage level at shutoff, and interpolating between them. For more precision, you can either program the device with the typical discharge curve for the battery chemistry you're using, or have the device measure it during a "conditioning" charge-discharge cycle.
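The interpolation itself is tiny. In this sketch the voltages are illustrative figures for a single Li-ion cell, not anything dictated by the method:

```c
/* Linear SOC estimate between the last observed full-charge voltage
 * and the firmware's shutoff voltage. Clamped to 0..100%. */
float soc_from_voltage(float v_now_mv, float v_peak_mv, float v_cutoff_mv)
{
    if (v_now_mv <= v_cutoff_mv) return 0.0f;
    if (v_now_mv >= v_peak_mv)   return 100.0f;
    return 100.0f * (v_now_mv - v_cutoff_mv) / (v_peak_mv - v_cutoff_mv);
}
```

With a remembered peak of 4200 mV and a cutoff of 3300 mV, a reading of 3800 mV comes out at about 56% - plausible, but only as good as the linearity assumption.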
The "charge icon" represents the State of Charge (SOC) of the battery - which is normally a percentage figure.
Different battery technologies are managed in differing ways...
Some have a sloped discharge curve - you know that a given voltage at a given temperature represents a given SOC.
Others are less helpful (e.g. lead/acid) and have a very flat discharge curve, in that they provide X volts right up until the point of expiry, then pretty much 0 volts thereafter! These require a level of charge in/out counting (coulomb counting), plus recalibration at the 0%/100% levels, as in the sketch below.
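A hedged sketch of that in/out counting with endpoint recalibration (all names and figures here are hypothetical); the interesting part is folding the accumulated error back into the capacity estimate whenever the battery actually hits empty:

```c
/* Net charge bookkeeping for a flat-discharge-curve chemistry. */
static float capacity_mah  = 7000.0f;  /* learned capacity estimate */
static float remaining_mah = 7000.0f;

/* Call each sample: current in mA (positive = discharge), dt in hours. */
void coulomb_count(float current_ma, float dt_h)
{
    remaining_mah -= current_ma * dt_h;
}

/* Recalibrate at the only two points we actually trust. */
void on_full_charge(void)
{
    remaining_mah = capacity_mah;          /* pin the 100% end */
}

void on_empty_detected(void)
{
    /* Whatever we thought was left when the battery actually expired
     * is error; fold it into the learned capacity and pin 0%. */
    capacity_mah -= remaining_mah;
    remaining_mah = 0.0f;
}

float soc_percent(void)
{
    return 100.0f * remaining_mah / capacity_mah;
}
```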
Most consumer devices offer a fairly crude SOC - but it is also dependent on the State of Health (SOH) - which represents the condition of the battery over its lifetime.
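SOH is often expressed the same way: the capacity the battery can hold now as a percentage of its capacity when new. A trivial sketch under that (common, but not universal) definition:

```c
/* SOH as learned full capacity over design capacity, in percent.
 * A pack that holds 2400 mAh against a 3000 mAh design is at 80% SOH. */
float soh_percent(float learned_capacity_mah, float design_capacity_mah)
{
    return 100.0f * learned_capacity_mah / design_capacity_mah;
}
```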