Why do we generally round 5's up instead of down?

We do this because the numbers we actually round usually have long, messy decimal expansions and are rarely exact halves. A value like 5.000000000000000001 clearly belongs with the larger neighbor. Take the square root of 26, about 5.09901951359: if you round it to the nearest ten, the leading 5 is followed by more nonzero digits, so the number sits strictly above the midpoint and is genuinely closer to 10 than to 0; there is no case for rounding it down. As Brett Frankel said: among other reasons, rounding 5's up makes matters a bit simpler. If I want to round 1.5002 to the nearest integer, I only have to look at the tenths place. If 5's rounded down, I'd have to look as far as the ten-thousandths place to make sure my number was strictly closer to 2 than to 1. Exact ties mostly come from artificially tidy numbers; with irrational numbers the digits never stop. Rounding $\pi$ to the nearest thousandth gives $3.142$: $\pi = 3.14159265\ldots$, and the digits after the 5 (9, 2, 6, ...) push the value above the halfway point $3.1415$, so it really is closer to $3.142$ than to $3.141$.
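
To make Frankel's point concrete, here is a small sketch using Python's `decimal` module (my own illustration, not part of the answer above): under round-half-up, the first dropped digit alone decides the result, whereas under round-half-down you also have to check whether anything nonzero follows the 5.

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_DOWN

# Round-half-up: the tenths digit (5) alone tells us 1.5002 rounds to 2.
print(Decimal("1.5002").quantize(Decimal("1"), rounding=ROUND_HALF_UP))    # 2

# Round-half-down: 1.5 would go to 1, but 1.5002 still goes to 2,
# so you must scan past the 5 to the ten-thousandths place to decide.
print(Decimal("1.5").quantize(Decimal("1"), rounding=ROUND_HALF_DOWN))     # 1
print(Decimal("1.5002").quantize(Decimal("1"), rounding=ROUND_HALF_DOWN))  # 2

# Pi to the nearest thousandth: the digits after the 5 make it closer to 3.142.
print(Decimal("3.14159265").quantize(Decimal("0.001"), rounding=ROUND_HALF_UP))  # 3.142
```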


For small numbers like $15$, it may feel closer to $20$. We have a certain (vague) logarithmic appreciation of numbers, so $15$ feels farther from $10$ than from $20$. As the numbers get larger, this becomes less important. Even $25$ doesn't feel to me much closer to $30$ than to $20$. But you have to do something. Sometimes you round to evens, which has the advantage of not accumulating a systematic error if you add up a lot of rounded values.
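
As a rough illustration of that last point (my own sketch, not part of the original answer): rounding a run of exact halves with round-half-up pushes every tie in the same direction, while round-half-to-even splits the ties, so the rounded total stays close to the true one.

```python
import math

halves = [n + 0.5 for n in range(10)]              # 0.5, 1.5, ..., 9.5 (exact in binary)

half_up   = [math.floor(x + 0.5) for x in halves]  # every tie rounds up
half_even = [round(x) for x in halves]             # Python's round(): ties go to even

print(sum(halves))     # 50.0  (true sum)
print(sum(half_up))    # 55    (systematic upward bias)
print(sum(half_even))  # 50    (ties split up and down, the bias cancels)
```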


It isn't always. A popular rounding method called banker's rounding rounds 15 to 20 but 45 to 40.
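
Python's built-in `round` happens to use this rule (round half to even), so it gives a quick way to see the behavior; here a negative `ndigits` targets the tens place:

```python
# Python 3's round() uses banker's rounding (round half to even).
print(round(15, -1))  # 20  (tie goes to the even multiple of 10)
print(round(45, -1))  # 40
print(round(2.5))     # 2
print(round(3.5))     # 4
```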

But one reason it might be rounded that way is that round(x) is often implemented as $\lfloor x+1/2\rfloor.$
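
A minimal sketch of that implementation (ignoring edge cases such as NaN, infinities, or very large floats):

```python
import math

def round_half_up(x: float) -> int:
    """Round to the nearest integer with .5 going up: floor(x + 1/2)."""
    return math.floor(x + 0.5)

print(round_half_up(1.5))   # 2
print(round_half_up(2.5))   # 3
print(round_half_up(-2.5))  # -2  (halves go up toward +infinity, not away from zero)
```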