mysql - Update in SQOOP Export not working
I am using Hadoop 2.5.0 and Sqoop 1.4.5.
I am trying to update rows in MySQL with the following Sqoop export command:
sqoop export --connect "connect_string" --username "user" --password "pass" --table "table_name" --columns "a,b,created_dtm,d,e" --export-dir "export_dir" --input-null-string "" --input-null-non-string "" --input-escaped-by \\ --input-fields-terminated-by "," --input-lines-terminated-by "\\n" --update-key --update-mode "allowinsert" -m 1;
Column a is defined as a unique key in the table structure.
But I am getting the exception below and the job hangs:
org.apache.sqoop.mapreduce.AsyncSqlOutputFormat: Got exception in update thread: java.sql.BatchUpdateException: Unknown column 'created_dtm' in 'field list'
        at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1819)
        at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:1281)
        at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:231)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'created_dtm' in 'field list'
I tried the following ways of specifying the columns:
--columns a,b,created_dtm,d,e
--columns 'a','b','created_dtm','d','e'
--columns "a,b,created_dtm,d,e"
None of the above worked. When the Sqoop command is executed for the first time, while the table is empty, it inserts the records, since I have specified --update-mode "allowinsert". Afterwards, when I try to update the same data, the job fails every time.
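For context, my understanding is that --update-mode allowinsert against MySQL should behave roughly like an upsert. A minimal sketch of the per-row statement I expect, using the table and column names from my command (this is only an illustration of the intended behaviour, not the exact SQL Sqoop generates):

    INSERT INTO table_name (a, b, created_dtm, d, e)
    VALUES (?, ?, ?, ?, ?)
    ON DUPLICATE KEY UPDATE
      b = VALUES(b),
      created_dtm = VALUES(created_dtm),
      d = VALUES(d),
      e = VALUES(e);

Since the duplicate would be detected through the unique key on a, I expected the second run to update the existing rows rather than fail.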
Could you please suggest what is wrong here?
The table structure is as follows:
CREATE TABLE IF NOT EXISTS schema.`table` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT,
  `a` varchar(254) NOT NULL,
  `b` bigint(20) DEFAULT NULL,
  `created_dtm` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
  `c` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
  `e` bigint(20) NOT NULL,
  `d` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00',
  PRIMARY KEY (`id`),
  UNIQUE KEY `u_table` (`a`),
  KEY `i_table_c` (`c`)
)