The Random Quine
Python 2, 3^-3 ≈ 0.037
exec abuse is quite handy for reducing the token count. Now updated to not read the source file!
exec '' """
s = '''{a}
s = {b}
s = s.format(a='"'*3, b="'"*3+s+"'"*3)
import random
tokens = ['exec', "''", s]
print random.choice(tokens), random.choice(tokens), random.choice(tokens),
{a}'''
s = s.format(a='"'*3, b="'"*3+s+"'"*3)
import random
tokens = ['exec', "''", s]
print random.choice(tokens), random.choice(tokens), random.choice(tokens),
"""
The extra '' between exec and the giant triple-quoted string is just to pad the token count to the required minimum of 3. It gets merged into the second string due to implicit string literal concatenation.
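That padding trick is easy to verify with the tokenize module (a Python 3 sketch for illustration; the answer itself is Python 2): adjacent string literals merge into a single value at runtime, yet remain two separate tokens to the tokenizer.

```python
import io
import tokenize

# At runtime, adjacent string literals concatenate into one value...
value = '' 'padding'
assert value == 'padding'

# ...but the tokenizer still sees two separate STRING tokens,
# which is how the extra '' pads the program to the 3-token minimum.
src = "'' 'padding'\n"
strings = [tok.string for tok in
           tokenize.generate_tokens(io.StringIO(src).readline)
           if tok.type == tokenize.STRING]
print(strings)  # ["''", "'padding'"]
```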
Original, opening-the-source-file version:
exec '''
# String literals are one token!
import random
import tokenize
with open(__file__) as f:
    tokens = [x[1] for x in tokenize.generate_tokens(f.readline)][:-1]
''' '''
# Splitting the string into two strings pads the token count to the minimum of 3.
print random.choice(tokens), random.choice(tokens), random.choice(tokens),
'''
Strictly speaking, the Python grammar places an ENDMARKER token at the end of the source file, and we can't produce a source file with ENDMARKERs randomly strewn about. We pretend it doesn't exist.
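The [:-1] slice in the code above exists precisely to drop that trailing ENDMARKER. A quick Python 3 illustration (the answer itself is Python 2):

```python
import io
import tokenize

src = "import random\n"
toks = list(tokenize.generate_tokens(io.StringIO(src).readline))

# The tokenizer always emits an ENDMARKER as the final token...
print(tokenize.tok_name[toks[-1].type])  # ENDMARKER

# ...so the quine slices it off before sampling.
tokens = [t.string for t in toks[:-1]]
print(tokens)  # ['import', 'random', '\n']
```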
JavaScript, 102 tokens, 33 unique, 7.73×10^-154
Note, this is a true quine. It doesn't read the file or use eval or Function.toString.
meta = "meta = ; out = '' ; tokens = meta . split ( '\\u0020' ) ; tokens . push ( '\"' + meta + '\"' ) ; length = tokens . length ; tmp = length ; unique = { } ; while ( tmp -- ) unique [ tokens [ tmp ] ] = unique ; unique = Object . keys ( unique ) ; tmp = unique . length ; while ( length -- ) out += tokens [ ~~ ( Math . random ( ) * tmp ) ] + '\\u0020' ; console . log ( out )";
out = '';
tokens = meta.split('\u0020');
tokens.push('"' + meta + '"');
//console.log(tokens);
length = tokens.length;
tmp = length;
unique = { };
while(tmp--) unique[tokens[tmp]] = unique;
unique = Object.keys(unique);
//console.log(unique);
tmp = unique.length;
while(length--)
    out += tokens[~~(Math.random() * tmp)] + '\u0020';
console.log(out)
Python: P(generating program in 1 trial) = 3.0317 × 10^-123
34 unique tokens, 80 total tokens. Note that there is a space at the end of each line.
import tokenize , random
tokens = [ x [ 1 ] for x in tokenize . generate_tokens ( open ( __file__ , 'r' ) . readline ) ] [ : -1 ]
s = ''
for x in tokens : s += random . choice ( list ( set ( tokens ) ) ) ; s += [ ' ' , '' ] [ s [ -1 ] == '\n' ]
print s
Sample output:
' ' random len set 'r' , for ( list , import ] ] tokens : random [ for '\n' import readline readline 'r' tokens [ len 'r' import '' choice '' '' for in ( readline ( = open readline , list 1 list s += for s 1 , '' : 1 += list len - __file__ ; open __file__ print . - ] 'r' for import [ print . ,
; . [ [ print print __file__ generate_tokens ] ; open ] , readline
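The stated score is just 34^-80: each of the 80 token positions must independently hit the single correct choice among the 34 unique tokens. A quick check (Python 3):

```python
unique_count, total = 34, 80

# Each draw is uniform over the unique tokens, so the chance of
# reproducing the exact 80-token source sequence is 34**-80.
p = unique_count ** -total
print(f"{p:.4e}")  # 3.0317e-123
```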
Thanks to the other Python solution by user2357112 for reminding me to discard the last token and use __file__, which I was previously ignorant of.