How do I load data from a text file into a PostgreSQL database?
I have a file (CSV-like) such as:
value1|value2|value2....
value1|value2|value2....
value1|value2|value2....
value1|value2|value2....
and would like to load this data into a PostgreSQL table.
The slightly modified version of COPY below worked better for me: specifying the CSV format means backslash characters in the text are handled without any fuss, whereas the default (and somewhat quirky) TEXT format treats them as escape characters.
COPY myTable FROM '/path/to/file/on/server' (FORMAT csv, DELIMITER '|');
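To see what the CSV format's handling of backslashes means in practice, here is a small sketch using Python's standard csv module, whose parsing of a pipe-delimited file matches COPY's (FORMAT csv, DELIMITER '|') behavior on this point: backslashes are ordinary data characters, not escape introducers. The sample values are placeholders mirroring the question's file.

```python
import csv
import io

# Pipe-delimited sample in the same shape as the question's file;
# the second row contains a literal backslash in one field.
data = "value1|value2|value3\nvalue1|a \\ backslash|value3\n"

# delimiter='|' mirrors COPY ... (FORMAT csv, DELIMITER '|'):
# no backslash escaping is applied, the character passes through as-is.
rows = list(csv.reader(io.StringIO(data), delimiter="|"))

for row in rows:
    print(row)
```

Under the TEXT format, by contrast, that backslash would be interpreted as the start of an escape sequence, which is exactly the "fuss" the CSV format avoids.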
Let's say your data is in the file values.txt and you want to import it into the database table myTable; then the following query does the job:
COPY myTable FROM 'values.txt' (DELIMITER '|');
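One caveat worth knowing: COPY ... FROM 'file' reads the file on the database server, so it needs a server-side path and appropriate server privileges. If the file lives on your client machine instead, psql's \copy meta-command streams it through the connection and accepts the same options; a sketch with the same hypothetical file and table names:

```
\copy myTable FROM 'values.txt' WITH (FORMAT csv, DELIMITER '|')
```

Here the path is resolved on the client, relative to where psql is running.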
Check out the COPY command of Postgres.
There's pgloader, which uses the aforementioned COPY command and can load data from CSV files (and also MySQL, SQLite, and dBase). It also uses separate threads for reading and copying data, so it's quite fast. (Interestingly enough, it was rewritten from Python to Common Lisp and gained a 20 to 30x speedup; see the blog post.)
To load the CSV file you write a little configuration file, like:
LOAD CSV
     FROM 'path/to/file.csv' (x, y, a, b, c, d)
     INTO postgresql:///pgloader?csv (a, b, d, c)
…
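Assuming that configuration is saved to a file (the name csv.load below is just an example), pgloader is invoked with the command file as its argument:

```
pgloader csv.load
```

pgloader then reads the file, copies the data, and prints a summary of rows loaded and any rejected lines.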