Export specific rows from a PostgreSQL table as INSERT SQL script
To export data only, use COPY:
COPY (SELECT * FROM nyummy.cimory WHERE city = 'tokio') TO '/path/to/file.csv';
You can export a whole table, only selected columns, or the result of a query as demonstrated. No need to create a table explicitly.
You get a file with one table row per line as plain text (not INSERT commands). Smaller and faster than INSERT commands.
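By default, COPY writes PostgreSQL's tab-separated text format. If you want an actual CSV file (as the file name suggests), request it with the standard COPY options, for example:
COPY (SELECT * FROM nyummy.cimory WHERE city = 'tokio')
TO '/path/to/file.csv' WITH (FORMAT csv, HEADER);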
To import the same into another Postgres table of matching structure anywhere (columns in the same order, data types compatible!):
COPY other_tbl FROM '/path/to/file.csv';
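If the target table has more columns or a different column order, you can list the columns the file supplies. The column list here is an assumption based on the example query:
COPY other_tbl (id, name, city) FROM '/path/to/file.csv';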
COPY writes and reads files local to the server, unlike client programs like pg_dump or psql, which read and write files local to the client. If both run on the same machine it doesn't matter much, but it does for remote connections.
There is also the \copy command of psql:
Performs a frontend (client) copy. This is an operation that runs an SQL COPY command, but instead of the server reading or writing the specified file, psql reads or writes the file and routes the data between the server and the local file system. This means that file accessibility and privileges are those of the local user, not the server, and no SQL superuser privileges are required.
Same syntax as above. Just replace COPY with \copy.
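For example, to write the file on the client machine rather than the server (note that \copy is a psql meta-command, so the whole command goes on a single line):
\copy (SELECT * FROM nyummy.cimory WHERE city = 'tokio') TO '/path/to/file.csv'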
Create a table with the set you want to export and then use the command line utility pg_dump to export to a file:
create table export_table as
select id, name, city
from nyummy.cimory
where city = 'tokyo';
$ pg_dump --table=export_table --data-only --column-inserts my_database > data.sql
--column-inserts dumps the data as INSERT commands with explicit column names.
--data-only does not dump the schema.
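With both options, the resulting data.sql contains one INSERT per exported row, roughly of this shape (the values shown are purely illustrative):
INSERT INTO export_table (id, name, city) VALUES (1, 'some name', 'tokyo');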
As commented below, creating a view instead of a table obviates the table creation whenever a new export is necessary.