Having tried the mysqlimport command, Navicat's import wizard, and MySQL Workbench, I found that the best way to import my kinda large CSV into MySQL is the LOAD DATA INFILE
command (in the mysql CLI).
- As you can see from the thumbnail, using an import wizard to import the Wukong dataset's CSV file just doesn't work.
- mysqlimport (highly upvoted on Stack Overflow) doesn't work either; it truncates the data in the CSV file.
- Importing the CSV file with MySQL Workbench just leads to a straight-up crash.
Finally, the approach that worked, and worked both reliably and in a user-friendly way, is LOAD DATA INFILE,
which the mysql CLI supports as part of the SQL language.
Here's the code:
LOAD DATA INFILE '/Users/**/Desktop/Cabin/yagoo/resources/wukong_test.csv'
INTO TABLE wukong_data
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(dir, text, token, url, alt_id);
You will have to create the table first. Use head -n 1 wukong_test.csv
to see the column names you need to work with.
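For reference, a minimal table definition matching the CSV header shown below might look like this; the column types and VARCHAR sizes here are assumptions, so adjust them to your actual data:

```sql
-- Hypothetical schema for wukong_data; sizes are guesses, not measured.
CREATE TABLE wukong_data (
    dir    VARCHAR(255),
    text   TEXT,
    token  TEXT,
    url    VARCHAR(2048),
    alt_id VARCHAR(64)
);
```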
output:
~/Desktop/Cabin/yagoo/resources$ head -n 1 *.csv
dir,text,token,url,alt_id
Most importantly, and unlike any of the failing methods above, LOAD DATA INFILE
can tell you exactly why an import failed. For me, it was mostly because a string in some column was too long; I hadn't given the VARCHAR enough capacity. When that happens, it outputs a message telling you which row the error occurred at, and why.
mysql> LOAD DATA INFILE '/Users/**/Desktop/Cabin/yagoo/resources/wukong_test.csv'
-> INTO TABLE wukong_data_test
-> FIELDS TERMINATED BY ','
-> ENCLOSED BY '"'
-> LINES TERMINATED BY '\n'
-> IGNORE 1 ROWS
-> (dir, text, token, url, alt_id);;
ERROR 1406 (22001): Data too long for column 'url' at row 32357
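With an error like that, one possible fix is to widen the offending column and rerun the import; the new size below is just an illustrative guess:

```sql
-- Widen the column the error pointed at, then rerun LOAD DATA INFILE.
ALTER TABLE wukong_data_test MODIFY url VARCHAR(2048);
```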
So please, stick with SQL more!
P.S. You might need to tweak things a little to make LOAD DATA INFILE
work on your machine.
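Two tweaks that commonly come up (whether you need them depends on your server configuration):

```sql
-- 1. The server may restrict which directories it can read files from.
--    If this variable is set, your CSV must live under that path:
SHOW VARIABLES LIKE 'secure_file_priv';

-- 2. If the file lives on the client side rather than the server,
--    enable local_infile and use LOAD DATA LOCAL INFILE instead:
SET GLOBAL local_infile = 1;
```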