csv - MySQL LOAD DATA LOCAL INFILE syntax issues with SET fields


I'm trying to use MySQL's LOAD DATA LOCAL INFILE syntax to load a .csv file into an existing table. Here is one record from my .csv file (with headers):

    prod,       plant,pord,  revn,a_cpn,     a_crev,brdi,       dte,     ltme
    100100128144,12t1,2070000,04,3db18194acaa,05_01,ala13320004,20130807,171442

The issue is that I want three things done during the import:

  1. A recordid INT NOT NULL AUTO_INCREMENT PRIMARY KEY field should be incremented as each row gets inserted (this table column and structure already exists within the MySQL table).

  2. The dte and ltme fields should be concatenated, converted to MySQL DATETIME format, and inserted into an existing MySQL column named trans_ocr.

  3. A created TIMESTAMP field should be set to the current timestamp on row insertion (this table column and structure already exists within the MySQL table).
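
For reference, a table shaped roughly like the following would satisfy all three requirements. This is only a sketch: recordid, trans_ocr, and created are described in the list above, the other column names come from the import column list below, and the VARCHAR types and lengths (plus the connection details) are assumptions, not the actual schema:

    import MySQLdb

    # Hypothetical DDL matching the three requirements above; the
    # VARCHAR types/lengths are guesses, not the real schema.
    ddl = """CREATE TABLE IF NOT EXISTS seriallog (
        recordid   INT NOT NULL AUTO_INCREMENT PRIMARY KEY,  -- requirement 1
        flex_pn    VARCHAR(32),
        flex_plant VARCHAR(16),
        flex_ordid VARCHAR(16),
        flex_rev   VARCHAR(8),
        cust_pn    VARCHAR(32),
        cust_rev   VARCHAR(8),
        serialid   VARCHAR(32),
        trans_ocr  DATETIME,                                 -- requirement 2
        created    TIMESTAMP DEFAULT CURRENT_TIMESTAMP       -- requirement 3
    )"""

    # Placeholder connection details.
    conn = MySQLdb.connect(host="localhost", user="user", passwd="pass", db="mydb")
    conn.cursor().execute(ddl)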

I'm trying to import the data into the MySQL table with the following command:

    LOAD DATA LOCAL INFILE 'myfile.csv'
    INTO TABLE seriallog
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (flex_pn, flex_plant, flex_ordid, flex_rev, cust_pn, cust_rev, serialid)
    SET created = CURRENT_TIMESTAMP;

I think I have the created column set correctly, but one of the others is causing MySQL warnings to be issued:

    Warning: Out of range value for column 'flex_pn' at row 1
    Warning: Row 1 was truncated; it contained more data than there were input columns
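
The second warning can be checked mechanically: each line of the file has nine fields, while the column list above names only seven. A quick sketch (run against the same myfile.csv) that makes the mismatch visible:

    import csv

    # Columns named in the LOAD DATA column list above.
    named_columns = ["flex_pn", "flex_plant", "flex_ordid", "flex_rev",
                     "cust_pn", "cust_rev", "serialid"]

    # Count the fields in the file's header row.
    with open("myfile.csv") as f:
        header = next(csv.reader(f))

    print("%d fields per line vs. %d named columns" % (len(header), len(named_columns)))
    # -> 9 fields per line vs. 7 named columns (dte and ltme are dropped)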

Can anyone help me with the syntax? This LOAD DATA LOCAL INFILE command is confusing me...

I figured out the proper syntax to make this work:

sql = """load data local infile %s table seriallog_dev           fields terminated ',' optionally enclosed '\"'           lines terminated '\\n'           ignore 1 lines           (flex_pn, flex_plant, flex_ordid, flex_rev, cust_pn, cust_rev, serialid, @dte, @ltme)           set recordid = null,               trans_ocr = str_to_date(concat(@dte,'',@ltme), "%%y%%m%%d%%h%%i%%s"),               created = current_timestamp;"""  params = (file,) self.db.query( sql, params ) 

Mind you, this was done with Python's MySQLdb module.
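
To sanity-check the date conversion in isolation, the same expression from the SET clause can be run on its own (connection details are placeholders):

    import MySQLdb

    conn = MySQLdb.connect(host="localhost", user="user", passwd="pass", db="mydb")
    cur = conn.cursor()

    # Same expression as in the SET clause; the %% doubling is required
    # because MySQLdb performs its own %s substitution first.
    cur.execute("SELECT STR_TO_DATE(CONCAT(%s, '', %s), '%%Y%%m%%d%%H%%i%%s')",
                ("20130807", "171442"))
    print(cur.fetchone()[0])  # -> datetime.datetime(2013, 8, 7, 17, 14, 42)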

Caveat

The one issue with this solution is that, for some reason, the bulk insert only inserts the first 217 rows of data from my file. The total file size is only 19KB, so I can't imagine that it's too large for MySQL's buffers... What gives?

More info

Also, I just tried the same syntax directly within the mysql-server CLI and it works for all 255 records. So the problem is obviously with Python, the Python MySQLdb module, or the MySQL connection that the MySQLdb module makes...
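
One way to narrow down where the rows go missing (a diagnostic sketch with placeholder connection details) is to compare the file's data-line count against the table's row count right after the import; if the two numbers agree, the shortfall is in the file itself rather than in the load:

    import MySQLdb

    conn = MySQLdb.connect(host="localhost", user="user", passwd="pass",
                           db="mydb", local_infile=1)
    cur = conn.cursor()

    # Data lines in the file (header excluded).
    with open("myfile.csv") as f:
        file_rows = sum(1 for _ in f) - 1

    # Rows that actually landed in the table.
    cur.execute("SELECT COUNT(*) FROM seriallog_dev")
    table_rows = cur.fetchone()[0]

    print("%d data rows in file, %d rows in table" % (file_rows, table_rows))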

Done

I figured out the problem! It had nothing to do with the LOAD DATA LOCAL INFILE command, but rather with the method I was using to convert my original .dbf file into the .csv before attempting the import. For some reason the MySQL import method was running on the .csv before the .dbf-to-.csv conversion method had finished, resulting in a partial data set being found in the .csv file and imported... Sorry to waste everyone's time!
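
For anyone who hits the same thing: the fix is simply to make sure the .csv is fully written and closed before LOAD DATA runs. A minimal illustration (the function name is hypothetical; the real conversion code isn't shown):

    import csv

    def rows_to_csv(rows, csv_path):
        # Writing inside a `with` block guarantees the file is flushed
        # and closed before the function returns, so the import that
        # follows can never see a half-written file.
        with open(csv_path, "w") as f:
            csv.writer(f).writerows(rows)

    rows = [("prod", "plant", "pord", "revn", "a_cpn", "a_crev", "brdi", "dte", "ltme"),
            ("100100128144", "12t1", "2070000", "04", "3db18194acaa",
             "05_01", "ala13320004", "20130807", "171442")]
    rows_to_csv(rows, "myfile.csv")
    # ...only now run LOAD DATA LOCAL INFILE against myfile.csv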

