how to insert selected columns from a very large pipe delimited text/csv file to mysql using LOAD DATA INFILE

There is a similar question already, but I would like to know how to accomplish this for very large data sets.

The @dummy method etc. will take very long.

Also, what should I do if the column names differ between the spreadsheet and the table?

Answers


LOAD DATA INFILE '/tmp/actor.tsv'
REPLACE INTO TABLE actor
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '"'
(@f1, @f2, @f3)
SET first_name = @f1, last_name = @f3
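The question mentions a pipe-delimited file, while the example above uses tabs. A minimal sketch of the same technique for pipe-delimited input (the path /tmp/data.csv and the column positions are assumptions; adjust them to your file): read every input field into a user variable, then map only the variables you need onto the table's own column names in the SET clause. Variables you never reference are simply discarded, which also covers the case where the file's column names differ from the table's.

```sql
-- Assumed file path and column layout; adjust to your data.
LOAD DATA INFILE '/tmp/data.csv'
REPLACE INTO TABLE actor
FIELDS TERMINATED BY '|'       -- pipe-delimited input
OPTIONALLY ENCLOSED BY '"'
(@f1, @f2, @f3)                -- read every field into a user variable
SET first_name = @f1,          -- map onto the table's column names
    last_name  = @f3;          -- @f2 is never referenced, so it is skipped
```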

UPDATE: If the number of fields is big, use this script to generate the LOAD command:

echo -n "LOAD DATA INFILE '/tmp/actor.tsv'
REPLACE INTO TABLE actor
FIELDS TERMINATED BY '\\t'
OPTIONALLY ENCLOSED BY '\"'
("
comma=""
for i in $(seq 300)
do
    echo -n "$comma @f$i"
    comma=","
done
echo ")"
echo "SET first_name = @f1, last_name = @f3"
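To sanity-check what the script emits before feeding it to mysql, the same loop can be captured into a variable first. This is a sketch with a reduced field count of 5; the file path and table name follow the answer above:

```shell
#!/bin/sh
# Build the column list (@f1 ... @f5) the same way the loop above does,
# but accumulate it in a variable instead of printing piecemeal.
cols=""
comma=""
for i in $(seq 5)
do
    cols="$cols$comma@f$i"
    comma=", "
done

# Assemble the full statement so it can be inspected before running it.
stmt="LOAD DATA INFILE '/tmp/actor.tsv'
REPLACE INTO TABLE actor
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '\"'
($cols)
SET first_name = @f1, last_name = @f3"

printf '%s\n' "$stmt"
```

Once the printed statement looks right, pipe it to the mysql client.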
