Why erase files with random patterns instead of all 0's or 1's?
The short version: theoretically, the original pattern can still be read by certain sophisticated hardware and software. To ensure the security of your "erased" data, it must be overwritten, not merely deleted.
The long answer: http://en.wikipedia.org/wiki/Data_remanence
Edit: In fairness to those that have already voted, I'm leaving my answer as originally written; however, do read the comments for full disclosure.
Many people, myself included, feel that anything more than one pass with cryptographically-sound pseudorandom data is a waste of time and CPU cycles.
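For illustration, here is a minimal single-pass sketch in Python, assuming a regular file named secret.bin (the path, chunk size, and function name are placeholders; a purpose-built tool handles edge cases like journaling filesystems and SSD wear levelling better than this):

    import os

    def overwrite_once(path, chunk_size=1024 * 1024):
        """Overwrite a file in place with cryptographically sound pseudorandom bytes."""
        size = os.path.getsize(path)
        with open(path, "r+b") as f:            # open for in-place binary read/write
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))          # OS-provided CSPRNG output
                remaining -= n
            f.flush()
            os.fsync(f.fileno())                # force the data out of the OS cache

    overwrite_once("secret.bin")                # hypothetical target file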
I think I remember reading that, because of the magnetic properties of the platters, any given random set of bytes overwriting a block would not necessarily fully demagnetize or return that area of the disk to a neutral state, so some information about the prior data was left behind. Granted, I don't think it was much, but it sounded like enough that a determined forensic analysis could retrieve at least the nature of the wiped data.
The idea behind the specific patterns of 1's and 0's is that they are chosen to work with the hard drive's 8B/10B encoding (or whatever encoding it actually uses), so that the overall magnetic storage block is returned to a neutral state.
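As a rough sketch of that idea (not the actual Gutmann pass list), the code below writes a few fixed bit patterns over a file; the pattern bytes and file name are purely illustrative assumptions:

    import os

    # Illustrative alternating-bit patterns only; Gutmann's published scheme
    # uses 35 passes, many of them tuned to specific MFM/RLL encodings.
    PATTERNS = [b"\x55", b"\xAA", b"\x92\x49\x24"]

    def overwrite_with_pattern(path, pattern, chunk_size=1024 * 1024):
        """Overwrite a file in place by repeating a fixed byte pattern."""
        size = os.path.getsize(path)
        block = (pattern * (chunk_size // len(pattern) + 1))[:chunk_size]
        with open(path, "r+b") as f:
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(block[:n])
                remaining -= n
            f.flush()
            os.fsync(f.fileno())

    for p in PATTERNS:
        overwrite_with_pattern("secret.bin", p)  # hypothetical target file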
Have a look at SpinRite: it can apparently show you the various magnetic "levels" at which data is stored in order to recover and "refresh" the data on a drive, or at least that's what it claims.
Normal software-only recovery methods cannot recover data that has been overwritten even once by any pattern; it takes a big budget and sophisticated techniques to recover data that has been overwritten only once. One overwrite is good enough unless you have the FBI, NSA, NASA, etc., wanting your data. But if you're paranoid, overwrite it 35 times, then disassemble the hard drive, grind the platters into fine dust, and scatter that dust in the open ocean over a 100-mile voyage. Hopefully you won't get stranded on an island in the process. ;-)
Of course, modern operating systems can leave copies of "deleted" files scattered in unallocated sectors, temporary directories, swap files, remapped bad blocks, etc., but Gutmann believes that an overwritten sector can be recovered under examination by a sophisticated microscope, and this claim has been accepted uncritically by numerous observers. I don't think these observers have followed up on the references in Gutmann's paper, however; having done so, I can say that Gutmann doesn't cite anyone who claims to be reading the under-data in overwritten sectors, nor does he cite any articles suggesting that ordinary wipe-disk programs wouldn't be completely effective.
http://www.nber.org/sys-admin/overwritten-data-guttman.html