Why use 95% confidence interval?
From the Wikipedia article 1.96:
The use of this number in applied statistics can be traced to the influence of Ronald Fisher's classic textbook, Statistical Methods for Research Workers, first published in 1925:
"The value for which P = .05, or 1 in 20, is 1.96 or nearly 2; it is convenient to take this point as a limit in judging whether a deviation is to be considered significant or not."
$95\%$ is just the conventionally accepted boundary for "reasonably certain" in general cases. It has nothing to do with any specific formulas, and is rather an arbitrary choice that statisticians have agreed is a good compromise between getting results at all and getting results we can trust.
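As a quick sanity check on the quoted value, the 1.96 in Fisher's remark is the standard normal quantile that leaves 2.5% in each tail (5% total). This can be reproduced with Python's standard library alone; `statistics.NormalDist` is assumed to be available (Python 3.8+):

```python
from statistics import NormalDist  # stdlib, Python 3.8+

# Two-sided P = .05 splits into 2.5% per tail, so we ask for the 97.5th percentile.
z = NormalDist().inv_cdf(0.975)
print(round(z, 4))  # 1.96
```

The exact value is 1.9599..., which is why Fisher calls it "nearly 2".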
I don't think it is entirely arbitrary, because for a normal distribution:
68.27% of all values lie within 1 standard deviation
95.45% of all values lie within 2 standard deviations and
99.73% of all values lie within 3 standard deviations