Standard Deviation in LINQ
Dynami's answer works but makes multiple passes through the data to get a result. This single-pass method calculates the sample standard deviation:
public static double StdDev(this IEnumerable<double> values)
{
    // ref: http://warrenseen.com/blog/2006/03/13/how-to-calculate-standard-deviation/
    double mean = 0.0;
    double sum = 0.0;
    double stdDev = 0.0;
    int n = 0;
    foreach (double val in values)
    {
        n++;
        double delta = val - mean;
        mean += delta / n;
        sum += delta * (val - mean);
    }
    if (n > 1)
        stdDev = Math.Sqrt(sum / (n - 1));
    return stdDev;
}
This is the sample standard deviation, since it divides by n - 1. For the population standard deviation, divide by n instead.
This uses Welford's method, which has higher numerical accuracy than the naive Average(x^2) - Average(x)^2 approach.
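To see the single-pass update in action, here is a minimal sketch of the same Welford recurrence in Python (chosen so it can be checked against the standard library's statistics.stdev; the data values are arbitrary):

```python
import statistics

def std_dev(values):
    """Single-pass sample standard deviation via Welford's method.

    Mirrors the C# extension above: the running mean and the sum of
    squared deviations are updated incrementally, so the sequence is
    enumerated exactly once.
    """
    mean = 0.0
    sum_sq = 0.0
    n = 0
    for val in values:
        n += 1
        delta = val - mean
        mean += delta / n                # running mean
        sum_sq += delta * (val - mean)   # running sum of squared deviations
    return (sum_sq / (n - 1)) ** 0.5 if n > 1 else 0.0

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(std_dev(data))           # single-pass result
print(statistics.stdev(data))  # library two-pass result, for comparison
```

Both lines print the same value (about 2.138 for this data), confirming that the incremental updates produce the ordinary sample standard deviation.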
You can write your own extension method to calculate it:
public static class Extensions
{
    public static double StdDev(this IEnumerable<double> values)
    {
        double ret = 0;
        int count = values.Count();
        if (count > 1)
        {
            // Compute the average
            double avg = values.Average();
            // Sum of (value - avg)^2
            double sum = values.Sum(d => (d - avg) * (d - avg));
            // Put it all together
            ret = Math.Sqrt(sum / count);
        }
        return ret;
    }
}
If you have a sample of the population rather than the whole population, use ret = Math.Sqrt(sum / (count - 1)); instead.
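The two divisors can be sketched side by side in Python (an illustrative port of the extension above, not the original C#; the sample flag and data are mine), checked against the standard library:

```python
import statistics

def std_dev(values, sample=False):
    """Two-pass standard deviation, mirroring the C# extension above.

    sample=False divides by n (population, as the extension is written);
    sample=True divides by n - 1 (the suggested change for samples).
    """
    values = list(values)
    n = len(values)
    if n <= 1:
        return 0.0
    avg = sum(values) / n                        # first pass: mean
    total = sum((v - avg) ** 2 for v in values)  # second pass: squared deviations
    return (total / (n - 1 if sample else n)) ** 0.5

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(std_dev(data))               # population: 2.0
print(std_dev(data, sample=True))  # sample: ~2.138
```

These match statistics.pstdev and statistics.stdev respectively, which use the n and n - 1 divisors.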
Adapted into an extension method from Adding Standard Deviation to LINQ by Chris Bennett.