
hadoop sqoop - a doubt


mettastar


I'm exporting data from HDFS to a Netezza (nz) table using sqoop export.

My question is this:

My Hive table in HDFS is partitioned, so the data sits in multiple folders. In sqoop export, can I mention multiple folders to export the data?

I haven't seen this anywhere.. has anyone done it?

The other option is to concat the files and then export, but I posted this to find out whether there is some other way.
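For reference, since `--export-dir` takes a single directory, one common workaround (short of concatenating files) is to run one export per partition folder. A minimal sketch under stated assumptions: the base path and table name are hypothetical, `list_partitions` is a stand-in for `hdfs dfs -ls -C "$BASE"`, and the sqoop call is echoed rather than executed so the loop can be dry-run:

```shell
#!/bin/sh
# Sketch: one sqoop export per partition folder, because --export-dir
# accepts a single directory. BASE and the table name are hypothetical;
# list_partitions stands in for `hdfs dfs -ls -C "$BASE"`.
BASE='/user/hive/warehouse/ods.db/ASN00'

list_partitions() {
  printf '%s\n' "$BASE/custom_day=3" "$BASE/custom_day=4"
}

list_partitions | while read -r dir; do
  # drop the leading `echo` to actually run the export
  echo sqoop export --connect jdbc:netezza://host:5480/ods \
    --table ASN00 --export-dir "$dir" --batch
done
```

Each run appends into the same target table, so the net effect matches one big export over all folders, at the cost of one job per partition.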


27 minutes ago, mettastar said:

in sqoop export can I mention multiple folders to export the data?

Keep all the folders in a warehouse dir and use this

sqoop export \
  --connect jdbc:oracle:thin:@enkx3-scan:1521:dbm1 \
  --username wzhou \
  --password wzhou \
  --direct \
  --export-dir '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop' \
  --table WZHOU.TEST_IMPORT_FROM_SCOOP \
  --fields-terminated-by '\001'


1 minute ago, kasi said:

Keep all the folders in a warehouse dir and use this

That's the thing, uncle.. I don't want to add any extra step (consolidating the files) before exporting.

Is there no option to export from multiple folders at a time?


Never encountered this issue; honestly this is a bad design.

but

try this

--export-dir '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop' '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop1' \

 

not sure if it works 

 


4 minutes ago, kasi said:

try space delimited folder directories  

Yeah, I'll try that and see.. Basically the Hive table is partitioned by date, so there is one folder per day. When pulling, I have to fetch the data with a >= date filter.. So first I'm thinking of resolving the folders, and if there is an option I want to pass all the folders in the export command itself.
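One hedged way to do that folder resolution in shell: normalize each `custom_year=/custom_month=/custom_day=` path to a sortable yyyymmdd key and keep only partitions at or after the cutoff, then feed each surviving folder to its own export. The partition layout follows the export command shown later in this thread; the hard-coded list is a stand-in for real `hdfs dfs -ls -C` output:

```shell
#!/bin/sh
# Sketch: select date partitions >= a cutoff before exporting each one.
# The hard-coded list stands in for `hdfs dfs -ls -C` output; the
# custom_year=/custom_month=/custom_day= layout mirrors this thread.
CUTOFF=20160904

select_partitions() {
  printf '%s\n' \
    'custom_year=2016/custom_month=9/custom_day=3' \
    'custom_year=2016/custom_month=9/custom_day=4' \
    'custom_year=2016/custom_month=9/custom_day=5' |
  while read -r p; do
    y=${p#custom_year=};   y=${y%%/*}
    m=${p#*custom_month=}; m=${m%%/*}
    d=${p#*custom_day=}
    # normalize year/month/day to a sortable yyyymmdd key
    key=$(printf '%04d%02d%02d' "$y" "$m" "$d")
    [ "$key" -ge "$CUTOFF" ] && echo "$p"
  done
}

select_partitions   # prints the folders to pass to sqoop export, one run each
```

The zero-padding matters because the folder names use unpadded month/day values (`custom_month=9`), which would not compare correctly as plain strings.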


One more question bro, I'm getting this error while trying to export the data.. it is related to a decimal field:

java.lang.ClassCastException: org.apache.hadoop.io.BytesWritable cannot be cast to java.math.BigDecimal

My data in HDFS is in Avro format.


22 minutes ago, kasi said:

what serde are you using??

Not sure how the data is brought into Hadoop, bro, but they convert the data into Avro format, place it in one location, and create Hive tables on top of it..

My team consumes the data from those Hive tables/HDFS locations.. so the data there is in Avro.

In sqoop export I'm not mentioning any serde, since sqoop can handle Avro files.

My command:

sqoop export -Dsqoop.avro.logical_types.decimal.enable=true --connect jdbc:netezza://abcdefgh:5480/ods --username xyz -P --export-dir /dev/dev_ods/DTA/ASN00/custom_year=2016/custom_month=9/custom_day=4/ --table ASN00 --input-fields-terminated-by "," --batch;

