Error function etymology: Why the name?
http://en.wikipedia.org/wiki/Errors_and_residuals_in_statistics
An "error" is the difference between a measurement and the value it would have had if the process of measurement were infallible and infinitely accurate. If one uses a single observed value as an estimate of the average of the population of values from which it was taken, then that observed value minus the population average is the error.
Sometimes (often) errors are modeled as being distributed normally, with probability distribution $$ \varphi_\sigma(x)\,dx = \frac 1 {\sqrt{2\pi}} \exp\left( \frac{-1} 2 \left(\frac x \sigma\right)^2 \right) \, \frac{dx} \sigma $$ with expected value $0$ and standard deviation $\sigma$.
The cumulative probability distribution function is $$ \Phi_\sigma(x) = \int_{-\infty}^x \varphi_\sigma(u)\,du. $$ Up to a rescaling of $x$, this is the error function. The usual definition of the "error function" omits the factor of $1/2$ in the exponent, and thus the standard deviation of the distribution whose cumulative distribution function is the "error function" is not $1$ but $1/\sqrt{2}$. I am far from convinced that it ought to be rescaled in that way.
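For concreteness, taking the usual definition $\operatorname{erf}(x) = \frac{2}{\sqrt\pi}\int_0^x e^{-t^2}\,dt$, the substitution $u = \sigma\sqrt{2}\,t$ in the integral above makes the rescaling explicit: $$ \Phi_\sigma(x) = \frac{1}{\sqrt\pi}\int_{-\infty}^{x/(\sigma\sqrt{2})} e^{-t^2}\,dt = \frac12\left(1 + \operatorname{erf}\!\left(\frac{x}{\sigma\sqrt{2}}\right)\right). $$ The extra $\sqrt{2}$ in the argument is precisely the trace of the omitted factor of $1/2$.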