Split large files using Python

If there's nothing special about having a specific number of lines in each output file, the readlines() function also accepts a size 'hint' parameter that behaves like this:

If given an optional parameter sizehint, it reads that many bytes from the file and enough more to complete a line, and returns the lines from that. This is often used to allow efficient reading of a large file by lines, but without having to load the entire file in memory. Only complete lines will be returned.

...so you could write that code something like this:

# Assume an average line is about 80 characters long, and that we want
# roughly 40,000 lines in each output file.

SIZE_HINT = 80 * 40000

fileNumber = 0
with open("inputFile.txt", "rt") as f:
    while True:
        buf = f.readlines(SIZE_HINT)
        if not buf:
            # We've read the entire input file, so we're done.
            break
        # readlines() returns a list of lines, so write them with writelines().
        with open("outFile%d.txt" % fileNumber, "wt") as outFile:
            outFile.writelines(buf)
        fileNumber += 1
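
For a quick sense of how the size hint behaves: even a very small hint still returns only complete lines. This is a minimal sketch, assuming the same inputFile.txt exists and its lines are longer than the hint:

# readlines() with a tiny hint never returns a partial line; it reads at
# least one whole line and stops once the hint has been exceeded.
with open("inputFile.txt", "rt") as f:
    chunk = f.readlines(10)   # hint of about 10 characters
    print(len(chunk))         # number of complete lines returned
    print(chunk)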

For a 10 GB file, reading and writing one line at a time is clearly the way to go, since it never holds more than a single line in memory. Here is an outline of what you need to do:

  1. Open the input file.
  2. Open the first output file.
  3. Read one line from the input file and write it to the output file.
  4. Maintain a count of how many lines you've written to the current output file; as soon as it reaches 40000, close the output file, and open the next one.
  5. Repeat steps 3-4 until you've reached the end of the input file.
  6. Close the input file and the last output file.

NUM_OF_LINES = 40000
filename = 'myinput.txt'
with open(filename) as fin:
    fout = open("output0.txt", "w")
    for i, line in enumerate(fin):
        fout.write(line)
        # Every NUM_OF_LINES lines, close the current chunk and start the next one.
        if (i + 1) % NUM_OF_LINES == 0:
            fout.close()
            fout = open("output%d.txt" % (i // NUM_OF_LINES + 1), "w")

    fout.close()
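
If you prefer not to manage the open/close bookkeeping by hand, the same line-count split can be sketched with itertools.islice, which pulls a fixed-size chunk of lines from the input iterator on each pass (the file names and chunk size here are assumptions, not part of the code above):

from itertools import islice

NUM_OF_LINES = 40000

with open('myinput.txt') as fin:
    part = 0
    while True:
        # Take the next NUM_OF_LINES lines (fewer at the end of the file).
        chunk = list(islice(fin, NUM_OF_LINES))
        if not chunk:
            break
        with open("output%d.txt" % part, "w") as fout:
            fout.writelines(chunk)
        part += 1

A side benefit of this variant is that it never creates an empty trailing file when the input's line count is an exact multiple of NUM_OF_LINES.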

Tags: Python, Split