sqoop export from Hive table stored in Parquet format to Oracle CLOB column results in (null) value


I am trying to export a string column from a Hive table (stored in Parquet format) to an Oracle CLOB column using sqoop export. Below are the commands I ran to create the tables in Oracle and Hive, and the sqoop command I used to export the data.

Table creation & insert in Hive:
create table default.sqoop_oracle_clob_test (sample_id int, verylargestring string) stored as parquet;
[success]

insert into default.sqoop_oracle_clob_test (sample_id, verylargestring) values (123, "really large string");
insert into default.sqoop_oracle_clob_test (sample_id, verylargestring) values (456, "another large string");
[success]

Table creation in Oracle:
create table sqoop_exported_oracle (sample_id number, verylargestring clob);
[success]

Sqoop export command:
sqoop \
export \
--connect jdbc:oracle:thin:@//host:port/database_name \
--username ****** \
--password ****** \
--table sqoop_exported_oracle \
--columns sample_id,verylargestring \
--map-column-java "verylargestring=string" \
--hcatalog-table "sqoop_oracle_clob_test" \
--hcatalog-database "default"
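One detail that may matter (an assumption on my part, not verified against this schema): --map-column-java takes Java class names, which are case-sensitive, so the value should likely be String rather than string. A sketch of the same command with only that change:

```shell
# Sketch only: identical to the command above except that the Java type
# name is capitalized ("String"), since Java class names are case-sensitive.
# Host, port, database name, and credentials are placeholders as in the original.
sqoop export \
  --connect jdbc:oracle:thin:@//host:port/database_name \
  --username '******' \
  --password '******' \
  --table sqoop_exported_oracle \
  --columns sample_id,verylargestring \
  --map-column-java "verylargestring=String" \
  --hcatalog-table "sqoop_oracle_clob_test" \
  --hcatalog-database "default"
```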

The sqoop job executes fine without error messages and reports that it exported 2 records.

The result in the Oracle table is shown below:

select * from sqoop_exported_oracle;

sample_id | verylargestring
123 | (null)
456 | (null)
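To narrow this down, it may help to check whether Oracle stored a true NULL or a zero-length CLOB, since some clients can render both similarly. A diagnostic query sketch against the table above (nvl2 and dbms_lob.getlength are standard Oracle built-ins; the column alias names are my own):

```sql
-- A truly NULL CLOB shows clob_state = 'NULL' and a null clob_length;
-- an empty CLOB (e.g. from empty_clob()) shows 'NOT NULL' and length 0.
select sample_id,
       nvl2(verylargestring, 'NOT NULL', 'NULL') as clob_state,
       dbms_lob.getlength(verylargestring)       as clob_length
  from sqoop_exported_oracle;
```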

I also tried using --staging-table, but it gave the same result. Can anyone help me out here? Thanks.

