What is the difference between millions and billions in printer/scanner colour settings?
Old photos are unlikely to have colour depth in the millions, let alone billions. In such a case, there is no point scanning images at anything higher than 24-bit. (See comments below.)
In a nutshell, some users may opt for billions of colours in certain circumstances (high-end photography, graphic design, large-format printing, etc.), provided their hardware supports it.
So, using scanners as an example, colour depth (or bit depth) is the amount of colour information a scanner captures from what you're scanning. Basically, the higher the bit depth, the more colours can be recognised and the better the quality of your scan.
At the risk of oversimplifying things, in summary:
- Grayscale scans are 8-bit images recognising 256 levels of gray
- a 16-bit colour scan can recognise up to 65,536 colours
- a 24-bit colour scan can recognise up to 16,777,216 colours
- a 32-bit colour scan can recognise up to 4,294,967,296 colours
- and so on
Note: The above is only a basic explanation and doesn't get into alpha channels etc.
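If it helps to see where those figures come from, here's a quick sketch of the arithmetic (plain Python; the number of representable values is simply 2 raised to the bit depth):

```python
# Number of distinct values a given bit depth can represent: 2 ** bits.
for bits in (8, 16, 24, 32, 48):
    print(f"{bits}-bit: {2 ** bits:,} values")

# 8-bit:  256
# 16-bit: 65,536
# 24-bit: 16,777,216
# 32-bit: 4,294,967,296
# 48-bit: 281,474,976,710,656
```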
Hardware support
As I mentioned above, opting for billions of colours will also depend on whether your hardware supports it. Obviously, displaying more colours requires more memory. Most computers today will have a GPU with enough memory to support 32-bit colour. Older computers, on the other hand, may only support up to 16-bit colour. Regardless, your display needs to support it too.
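As a rough illustration of the memory side, here's a small sketch; the 2560×1440 resolution is just an assumed example, and it only counts a single framebuffer, not everything else the GPU holds:

```python
# Rough framebuffer size for a single full-screen image at various bits per pixel.
width, height = 2560, 1440                     # assumed display resolution
for bits_per_pixel in (16, 32):
    size_bytes = width * height * bits_per_pixel // 8
    print(f"{bits_per_pixel}-bit: {size_bytes / 1024 / 1024:.1f} MiB")

# 16-bit: 7.0 MiB
# 32-bit: 14.1 MiB
```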
Likewise with scanners. Not all scanners can scan at the same bit depth. Using my summary above, a scanner only capable of 24-bit can recognise up to 16,777,216 colours, well short of the billions that some scanners can capture.
I can't tell the difference
You stated in your question, "I can't tell the difference".
This isn't surprising, as I doubt many users could tell the difference between 16-bit and 32-bit colour scans, prints or displays. However, users working with software that renders gradients, shadows, transparency, etc., which require a wide range of colour combinations, may notice a difference, and this is where it comes back to the specific circumstances you alluded to in your question.
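To make the banding point a bit more concrete, here's a small sketch that quantises the same smooth gradient at 8 and 16 bits per channel and counts the distinct steps left; fewer steps is what shows up visually as banding:

```python
# Quantise the same smooth 0.0-1.0 gradient at different bit depths per channel
# and count the distinct levels that survive; fewer levels means visible banding.
def quantise(value, bits):
    levels = 2 ** bits - 1
    return round(value * levels)

samples = [i / 99999 for i in range(100000)]   # a very smooth gradient
for bits in (8, 16):
    distinct = len({quantise(v, bits) for v in samples})
    print(f"{bits} bits/channel: {distinct:,} distinct steps")

# 8 bits/channel:  256 distinct steps
# 16 bits/channel: 65,536 distinct steps
```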
[EDIT]
IconDaemon's comment prompted me to add an example of when a user may want to scan at a higher resolution and colour depth than their own computer/display supports.
Last December I had to produce some large-format posters for my sister-in-law and, to do this, I had to use her much older Mac to design them. While her older Mac wasn't capable of displaying billions of colours, her scanner was capable of doing up to 48-bit scans. So I scanned in the required images using 36-bit for colour depth and 600 dpi for the resolution.
Then, on her outdated Mac I used Photoshop CS5 to put it all together and export the files as high-quality press-ready PDFs. These files were then taken to a service bureau which printed the large-format posters. However, had the original images not been scanned at a high enough quality, the large-format prints would have been pixelated, and the lack of colour information would have meant that some of the effects (e.g. transparency) would not come across very well (if at all).
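As a rough sense-check of why the scan resolution mattered, here's a quick sketch; the 6×4 inch original and 150 dpi output figures are assumptions for illustration, not the actual photos or the bureau's specs:

```python
# Pixels from a 600 dpi scan of an (assumed) 6 x 4 inch photo,
# and the largest clean print that supports at 150 dpi output.
scan_dpi = 600
original_w_in, original_h_in = 6, 4
px_w, px_h = original_w_in * scan_dpi, original_h_in * scan_dpi
print(f"Scan size: {px_w} x {px_h} px")        # 3600 x 2400 px

print_dpi = 150                                # typical large-format output
print(f"Max print at {print_dpi} dpi: {px_w // print_dpi} x {px_h // print_dpi} inches")
# 24 x 16 inches before pixelation becomes obvious
```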
This is color depth.
The Billions setting takes more disk space, so unless you need very fine color resolution, you're seeing banding, or you simply don't mind larger files and slower scan times, choose Millions.
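For a rough idea of the disk-space difference, here's a quick calculation for an uncompressed scan (the 6×4 inch, 600 dpi figures are just an assumed example; real file sizes vary with format and compression):

```python
# Uncompressed size of a 6 x 4 inch, 600 dpi scan at 24-bit vs 48-bit.
width_px, height_px = 6 * 600, 4 * 600         # 3600 x 2400 pixels
for bits_per_pixel in (24, 48):
    size_mib = width_px * height_px * bits_per_pixel / 8 / 1024 / 1024
    print(f"{bits_per_pixel}-bit: {size_mib:.1f} MiB")

# 24-bit ("Millions"): 24.7 MiB
# 48-bit ("Billions"): 49.4 MiB, double the data for the same pixels
```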
Your reasoning is perfectly sound. Larger files for no discernible benefit - choose the lower fidelity.