Inserting file records into a Postgres DB with clojure.java.jdbc takes a long time compared to Python's psycopg2
It's probably because the Clojure version isn't batching: you insert rows one by one, and each insert triggers its own commit.
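The cost of per-row commits versus one batched insert can be sketched in Python. This uses sqlite3 purely as a stand-in database so the snippet is self-contained; with psycopg2 the pattern is the same (cursor.executemany, or psycopg2.extras.execute_values, followed by a single commit):

```python
import sqlite3

# In-memory stand-in DB; table mirrors the :records table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (col1 TEXT, col2 TEXT, col3 TEXT, col4 INTEGER)")

rows = [("a", "b", "c", i) for i in range(1000)]

# Slow pattern (what the Clojure version is effectively doing):
#   for row in rows:
#       conn.execute("INSERT INTO records VALUES (?, ?, ?, ?)", row)
#       conn.commit()          # one round-trip + commit per row

# Fast pattern: one batched INSERT, one commit at the end.
conn.executemany("INSERT INTO records VALUES (?, ?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 1000
```

Against a real Postgres server the per-commit overhead is far larger than with an in-memory SQLite file, so the gap is even more pronounced.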
If you want to do it in Clojure, then you need to partition the rows from the CSV file and insert! each chunk as one batched statement and commit, using the arity of insert! that accepts multiple col-val-vecs. Sample code (not checked, just to show the idea):
(require '[clojure.java.io :as io]
         '[clojure-csv.core :as csv]
         '[clojure.java.jdbc :as jdbc])

;; parse-csv yields vectors of strings; parse the fourth column to an int.
(defn row->col-spec [row]
  [(row 0) (row 1) (row 2) (Integer/parseInt (row 3))])

(with-open [csv-file (io/reader "/path/to/foo.txt")]
  (try
    (->> (csv/parse-csv csv-file)
         (map row->col-spec)
         (partition-all 50)   ; partition-all keeps the final short chunk
         (map (fn [batch]
                ;; apply spreads the chunk so insert! receives
                ;; multiple value vectors in a single call
                (apply jdbc/insert! db :records [:col1 :col2 :col3 :col4] batch)))
         (dorun))
    (catch Exception e
      (println e))))
If you don't have to do it in Clojure, then psql's COPY command is the easiest and fastest option:
COPY records FROM '/path/to/foo.txt' WITH (FORMAT csv, DELIMITER ',', NULL 'NULL');
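One caveat: COPY ... FROM '/path' reads the file on the database server, so the file must exist there and be readable by the server process. If foo.txt lives on the client machine instead, psql's \copy meta-command streams it over the connection; a sketch (the database name mydb is a placeholder):

```
psql -d mydb -c "\copy records FROM '/path/to/foo.txt' WITH (FORMAT csv, DELIMITER ',', NULL 'NULL')"
```

The options are the same as for server-side COPY; only where the file is read from differs.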