Pseudo Random Number Generation on the GPU

I asked fellow students and they brought this paper to my attention: https://developer.nvidia.com/gpugems/GPUGems3/gpugems3_ch37.html

Since the paper is quite long and might not be online forever, here is the general idea:

Linear Congruential Generators (LCGs) are attractive for use on the GPU because they are simple and require very little state (only the previously generated number). But they are not 'random enough' for, for example, Monte Carlo-based simulations. A generator like the Mersenne Twister would be better, but it requires far too much state to be stored per thread (the standard MT19937 keeps 624 32-bit words of state).
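
For illustration, a bare LCG needs just one multiply-add per number and a single uint of state. A minimal HLSL sketch (my own, not code from the paper, using the same Numerical Recipes constants that appear in the combined generator below):

float NextLCG(inout uint state)
{
    // state = A * state + C (mod 2^32 via uint overflow); the low-order bits
    // are of poor quality, which is why an LCG alone is not 'random enough'
    // for Monte Carlo work.
    state = 1664525u * state + 1013904223u;
    return 2.3283064365387e-10 * state; // scale to [0, 1)
}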

The solution proposed by the paper is a hybrid generator: three Tausworthe steps combined with a single LCG step, with the outputs XORed together. This gives much better statistical quality and a far longer period than an LCG alone, without having to store anywhere near as much state as the Mersenne Twister. The final algorithm looks like this:

struct RandomResult
{
    uint4 state;
    float value;
};

// One step of a Tausworthe generator; S1, S2, S3 and M are per-component constants.
uint TausStep(uint z, int S1, int S2, int S3, uint M)
{
    uint b = (((z << S1) ^ z) >> S2);
    return (((z & M) << S3) ^ b);
}

// One step of a linear congruential generator (modulo 2^32 via uint overflow).
uint LCGStep(uint z, uint A, uint C)
{
    return (A * z + C);
}

RandomResult Random(uint4 state)
{
    // Three Tausworthe components plus one LCG component.
    state.x = TausStep(state.x, 13, 19, 12, 4294967294);
    state.y = TausStep(state.y, 2, 25, 4, 4294967288);
    state.z = TausStep(state.z, 3, 11, 17, 4294967280);
    state.w = LCGStep(state.w, 1664525, 1013904223);

    RandomResult result;
    result.state = state;
    // XOR the components and scale by 1/2^32 to get a float in [0, 1).
    result.value = 2.3283064365387e-10 * (state.x ^ state.y ^ state.z ^ state.w);

    return result;
}
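
To draw more than one number you feed the returned state back into the next call. A sketch of how this might look in a shader (the sample count and what is done with the values are just placeholders of my own):

float AverageOfSamples(uint4 seed)
{
    uint4 state = seed;
    float sum = 0.0;

    for (int i = 0; i < 16; i++)
    {
        RandomResult r = Random(state);
        state = r.state;   // carry the updated state into the next call
        sum += r.value;    // r.value is uniform in [0, 1)
    }

    return sum / 16.0;
}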

Note that the initial seed values for state should be larger than 128! (For background information read the paper.) Also, you should fill the seed with four good random numbers from the CPU combined with four values unique to that pixel to get a nice result.
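
One possible way to build such a seed (my own sketch, not from the paper): pass four random numbers from the CPU as shader constants, mix in a per-pixel index, and push every component above 128:

// Four random uints generated on the CPU and uploaded as a shader constant (assumed).
uint4 cpuSeed;

uint4 SeedForPixel(uint2 pixel, uint screenWidth)
{
    // A value unique to this pixel, spread over the four components with arbitrary odd multipliers.
    uint pixelIndex = pixel.y * screenWidth + pixel.x;
    uint4 seed = cpuSeed ^ (pixelIndex * uint4(1u, 1973u, 9277u, 26699u));

    // Keep the seed values above 128 as the paper requires.
    return max(seed, uint4(129u, 129u, 129u, 129u));
}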

Tags:

Random