Why use the C# class System.Random at all instead of System.Security.Cryptography.RandomNumberGenerator?

Speed and intent. If you're generating random numbers with no need for security, why use a slow cryptographic function? And why signal to other developers that the number might be used for something security-sensitive when it won't be?


Apart from the speed and the more convenient interface (NextDouble() etc.), Random also lets you produce a repeatable sequence from a fixed seed value, which is quite useful during testing, among other things.

Random gen1 = new Random();     // auto seeded by the clock
Random gen2 = new Random(0);    // Next(10) always yields 7,8,7,5,2,....

First of all, the presentation you linked only talks about random numbers for security purposes, so it doesn't claim that Random is bad for non-security purposes.

But I do claim it is. The .NET 4 implementation of Random is flawed in several ways. Use it only if you don't care about the quality of your random numbers; otherwise I recommend a better third-party implementation.

Flaw 1: The seeding

The default constructor seeds with the current time, so all instances of Random created with the default constructor within a short time frame (about 10 ms) return the same sequence. This is documented and "by design". It is particularly annoying if you want to multi-thread your code, since you can't simply create an instance of Random at the start of each thread's execution.

The workaround is to be extra careful when using the default constructor and to seed manually when necessary; one possible approach is sketched below.
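For example, here is a minimal sketch of such a workaround (the class name SeededRandomFactory is made up): draw each new instance's seed from a single, lock-protected global Random, so that instances created in quick succession don't share the clock-based seed.

static class SeededRandomFactory
{
    // One global instance, seeded once by the clock.
    private static readonly Random Global = new Random();

    public static Random Create()
    {
        // Random is not thread-safe, so take a lock while drawing the seed.
        lock (Global)
        {
            // Note: the seed still comes from a 31-bit space (see below).
            return new Random(Global.Next());
        }
    }
}

// Usage: each thread creates its own, differently seeded instance.
Random localRandom = SeededRandomFactory.Create();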

Another problem here is that the seed space is rather small (31 bits). By the birthday paradox, a collision among randomly chosen seeds becomes likely once you have about sqrt(2^31) ≈ 46,000 of them, so if you generate 50k instances of Random with perfectly random seeds you will probably get one sequence of random numbers twice. So manual seeding isn't easy to get right either.
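A quick sketch to check that claim, counting duplicate seeds (which is equivalent, since the seed fully determines the sequence):

using System;
using System.Collections.Generic;

var rng = new Random();
var seen = new HashSet<int>();
int collisions = 0;
for (int i = 0; i < 50000; i++)
{
    int seed = rng.Next();      // stand-in for a "perfectly random" 31-bit seed
    if (!seen.Add(seed))        // Add returns false if the seed was drawn before
        collisions++;
}
Console.WriteLine(collisions);  // expect a duplicate in roughly 4 out of 10 runs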

Flaw 2: The distribution of random numbers returned by Next(int maxValue) is biased

There are parameters for which Next(int maxValue) is clearly not uniform. For example, if you calculate r.Next(1431655765) % 2 you will get 0 in about 2/3 of the samples. (Sample code at the end of the answer.) Presumably this happens because Next(maxValue) scales a single 31-bit sample down to the requested range instead of using rejection sampling: with maxValue roughly equal to two-thirds of 2^31, even results are hit by two internal samples while odd results are hit by only one.
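If you need an unbiased Next(maxValue) on top of this, the standard fix is rejection sampling. A minimal sketch, assuming the parameterless Next() is itself uniform over [0, int.MaxValue) (UnbiasedNext is a made-up name):

static int UnbiasedNext(Random r, int maxValue)
{
    // Largest multiple of maxValue that fits below int.MaxValue.
    int limit = int.MaxValue - (int.MaxValue % maxValue);
    int sample;
    do
    {
        sample = r.Next();      // uniform over [0, int.MaxValue)
    } while (sample >= limit);  // reject the biased tail
    return sample % maxValue;
}

At worst just under half of the samples are rejected, so the loop rarely runs more than twice.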

Flaw 3: The NextBytes() method is inefficient

The per-byte cost of NextBytes() is about as big as the cost of generating a full integer sample with Next(). From this I suspect that it indeed creates one sample per byte.

A better implementation that used 3 bytes out of each sample (3 rather than 4, because Next() only produces 31 random bits, so a fourth byte would not be uniform) would speed NextBytes() up by almost a factor of 3.

Thanks to this flaw, Random.NextBytes() is only about 25% faster than System.Security.Cryptography.RNGCryptoServiceProvider.GetBytes on my machine (Win7, Core i3 2600 MHz).

I'm sure that if somebody inspected the source or decompiled the bytecode, they'd find even more flaws than I found with my black-box analysis.


Code samples

r.Next(0x55555555) % 2 is strongly biased:

Random r = new Random();
const int mod = 2;
int[] hist = new int[mod];
for (int i = 0; i < 10000000; i++)
{
    int num = r.Next(0x55555555);   // 0x55555555 == 1431655765
    int num2 = num % 2;
    hist[num2]++;                   // count how often each residue occurs
}
for (int i = 0; i < mod; i++)
    Console.WriteLine(hist[i]);
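With 10,000,000 samples this prints roughly 6.7 million zeros against 3.3 million ones, matching the 2/3 bias described above.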

Performance:

byte[] bytes = new byte[8 * 1024];
var cr = new System.Security.Cryptography.RNGCryptoServiceProvider();
Random r = new Random();

// Random.NextBytes
for (int i = 0; i < 100000; i++)
{
    r.NextBytes(bytes);
}

// One sample per byte
for (int i = 0; i < 100000; i++)
{
    for (int j = 0; j < bytes.Length; j++)
        bytes[j] = (byte)r.Next();
}

// One sample per 3 bytes
for (int i = 0; i < 100000; i++)
{
    for (int j = 0; j + 2 < bytes.Length; j += 3)
    {
        int num = r.Next();
        bytes[j + 2] = (byte)(num >> 16);
        bytes[j + 1] = (byte)(num >> 8);
        bytes[j] = (byte)num;
    }
    // Yes, I know I'm not handling the last few bytes, but that won't have
    // a noticeable impact on performance.
}

// Crypto
for (int i = 0; i < 100000; i++)
{
    cr.GetBytes(bytes);
}
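These loops only exercise the four variants; to compare them you need to time each loop separately, for example like this (a sketch using System.Diagnostics.Stopwatch):

var sw = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < 100000; i++)
{
    r.NextBytes(bytes);
}
sw.Stop();
Console.WriteLine("Random.NextBytes: " + sw.ElapsedMilliseconds + " ms");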