Efficiently insert a massive number of rows in psycopg2
Based on the answers given here, COPY is the fastest method. COPY reads from a file or file-like object. Since memory I/O is orders of magnitude faster than disk I/O, it is faster to write the data to a StringIO file-like object than to an actual file.
The psycopg docs show an example of calling copy_from with a StringIO as input.
Therefore, you could use something like:
try:
    # Python 2
    from cStringIO import StringIO
except ImportError:
    # Python 3
    from io import StringIO

def db_insert_spectrum(curs, visual_data, recording_id):
    f = StringIO()
    # visual_data is a 2D array (an n x 63 matrix)
    items = []
    for rowIndex, rowData in enumerate(visual_data):
        for colIndex, colData in enumerate(rowData):
            value = (rowIndex, colIndex, colData, recording_id)
            # COPY's default text format: tab-separated columns, one row per line
            items.append('\t'.join(map(str, value)) + '\n')
    f.writelines(items)
    f.seek(0)
    curs.copy_from(f, 'spectrums', columns=('row', 'col', 'value', 'recording_id'))
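
For reference, a minimal sketch of how you might call it; the connection parameters, visual_data, and recording_id below are placeholders, not part of the original question:

import psycopg2

# Hypothetical connection parameters; adjust for your environment.
conn = psycopg2.connect(dbname='mydb', user='myuser')

visual_data = [[0.0] * 63 for _ in range(100)]  # placeholder n x 63 matrix
recording_id = 1                                # placeholder id

# `with conn:` commits the transaction on success, rolls back on error.
with conn, conn.cursor() as curs:
    db_insert_spectrum(curs, visual_data, recording_id)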
I don't know whether .execute_batch can accept a generator, but you could try something like:
import psycopg2.extras

def db_insert_spectrum(curs, visual_data, recording_id):
    # One placeholder per column: execute_batch passes each row tuple
    # as the query arguments, so a bare "VALUES %s" would raise a
    # "not all arguments converted" error here.
    sql = """
        INSERT INTO spectrums (row, col, value, recording_id)
        VALUES (%s, %s, %s, %s)
    """
    data_gen = ((rIdx, cIdx, value, recording_id)
                for rIdx, cData in enumerate(visual_data)
                for cIdx, value in enumerate(cData))
    psycopg2.extras.execute_batch(curs, sql, data_gen, page_size=1000)
It might be faster.
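
If you do want the VALUES %s form, psycopg2.extras.execute_values is built for exactly that: it expands the single %s into a multi-row VALUES list and sends page_size rows per statement, which is usually faster than execute_batch. A sketch under the same assumptions as above (same hypothetical spectrums table; a generator works here as well):

import psycopg2.extras

def db_insert_spectrum_values(curs, visual_data, recording_id):
    # The single %s is replaced by (row1), (row2), ... for each page.
    sql = """
        INSERT INTO spectrums (row, col, value, recording_id)
        VALUES %s
    """
    data_gen = ((rIdx, cIdx, value, recording_id)
                for rIdx, cData in enumerate(visual_data)
                for cIdx, value in enumerate(cData))
    psycopg2.extras.execute_values(curs, sql, data_gen, page_size=1000)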