Why does the sun have to be nearly fully covered to notice any darkening in an eclipse?
Human perception is generally logarithmic. For example, the perceived loudness of a sound is measured in decibels, where a decrease of $10 \text{ dB}$ corresponds to dividing the sound intensity by $10$. So if the eclipse were heard instead of seen, "90% coverage" would mean reducing the intensity from, say, $120 \text{ dB}$ to $110 \text{ dB}$: a small perceived change.
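To make that concrete, here's a minimal sketch (in Python, with illustrative numbers) of how blocking a fraction of the intensity translates into decibels:

```python
import math

def db_change(intensity_ratio: float) -> float:
    """Change in decibels for a given ratio of new/old intensity."""
    return 10 * math.log10(intensity_ratio)

# 90% coverage leaves 10% of the original intensity.
print(db_change(0.10))  # -10.0 dB: a single "step" on the decibel scale
# Even 99% coverage is only a 20 dB drop.
print(db_change(0.01))  # -20.0 dB
```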
Perceived brightness works the same way. There's a huge range of light intensities that we see every day: direct sunlight is hundreds of times brighter than typical indoor lighting, yet both look fairly bright to us. So a 90% reduction wouldn't make the sky look dark at all.
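A back-of-the-envelope comparison makes this vivid. The illuminance figures below are rough, commonly quoted ballpark values, assumed here purely for illustration:

```python
import math

full_sunlight = 100_000  # lux, approximate direct sunlight
indoor_office = 500      # lux, approximate bright indoor lighting

# 90% solar coverage leaves ~10% of full sunlight.
eclipsed = 0.10 * full_sunlight  # 10,000 lux, roughly an overcast day

# On a log scale, that drop is only a fraction of the sun-to-indoors span.
drop = math.log10(full_sunlight / eclipsed)            # 1.0 decade
full_span = math.log10(full_sunlight / indoor_office)  # ~2.3 decades
print(f"{drop:.1f} of {full_span:.1f} perceptual 'decades'")
```

Even at 90% coverage, the sky is still brighter than any indoor room you've ever been in.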
The shape of the graph "looks like an exponential" because the $y$-axis is the log of the intensity. This is done so that the graph roughly tracks "perceived brightness" vs. time.
The graph looks exponential because the vertical axis is logarithmic! If you re-plotted it with a linear axis in lumens per square meter (lux), it would look much more V-shaped, or even U-shaped.
It so happens that a logarithmic plot matches our subjective perception of light intensity better than a linear one would. That's a consequence of our eyes having evolved to work well across an extremely wide range of light levels.
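To see the difference between the two axes, here's a sketch assuming a toy model in which the uncovered fraction of the sun's disk shrinks linearly toward totality; the timing and lux values are made up for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy model: uncovered fraction of the disk, reaching ~0 at maximum (t = 0).
t = np.linspace(-60, 60, 500)                 # minutes from maximum eclipse
uncovered = np.clip(np.abs(t) / 60, 0.001, 1.0)
intensity = 100_000 * uncovered               # lux, taking full sun ~100k lux

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(t, intensity)
ax1.set_title("Linear axis: V-shaped")
ax1.set_ylabel("illuminance (lux)")
ax2.plot(t, intensity)
ax2.set_yscale("log")                         # log axis gives the "exponential" look
ax2.set_title("Log axis: closer to perceived brightness")
for ax in (ax1, ax2):
    ax.set_xlabel("minutes from maximum")
plt.tight_layout()
plt.show()
```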
Based on my own anecdotal evidence, it doesn't. Several years ago there was a partial solar eclipse in my area. I don't remember precisely how much of the sun's disk was covered (it wasn't much, certainly nowhere near 90%), but I do remember stepping out of the house in the morning, thinking "hmm, it's quite dark today", and then having the eerie realization that the sky was perfectly clear, with none of the haze or clouds I was expecting. So yes, the darkening is noticeable.