Sqoop Commands
Password File
Step 1: Create the password file (echo -n avoids a trailing newline, which would otherwise be read as part of the password)
echo -n "cloudera" > /home/cloudera/mysql/pass.txt
Step 2: Copy the file to HDFS
hdfs dfs -put ./mysql/pass.txt ./pass.txt
Step 3: Run the import using the --password-file option
sqoop-import --connect jdbc:mysql://localhost/training --table emp_addr \
  --target-dir /user/cloudera/emp_addr --username root \
  --password-file /user/cloudera/pass.txt -m 1
Import
1. Primary key (by default Sqoop uses the table's primary key as the split column)
2. With mapper configuration (any number of mappers can be set using the -m argument)
------------------------------------
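For reference, a default import and one with an explicit mapper count might look like the sketch below (connection details and paths follow the earlier examples; the target directory is an assumption):

sqoop-import --connect jdbc:mysql://localhost/training --table emp \
  --username root --password cloudera --target-dir /user/cloudera/emp

sqoop-import --connect jdbc:mysql://localhost/training --table emp \
  --username root --password cloudera --target-dir /user/cloudera/emp -m 4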
8. Incremental Import
Incremental append (appends newly created records; updated records are appended again, so duplicates are created for them)
sqoop-import --connect jdbc:mysql://localhost/training --table emp --username root \
  --password cloudera --incremental append --check-column id --last-value 1202 -m 1
Incremental lastmodified (appends newly created records and updates the records whose values have changed)
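No command is given in these notes for the lastmodified mode; a typical sketch follows. The updated_at column name and the timestamp value are assumptions; --check-column must be a timestamp column, and --merge-key tells Sqoop which column to reconcile updated rows on:

sqoop-import --connect jdbc:mysql://localhost/training --table emp \
  --username root --password cloudera --incremental lastmodified \
  --check-column updated_at --last-value "2014-01-01 00:00:00" \
  --merge-key id -m 1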
HIVE Import:
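These notes leave this section empty; a typical Hive import sketch (connection details as in the earlier examples, Hive table name is an assumption) is:

sqoop-import --connect jdbc:mysql://localhost/training --table emp \
  --username root --password cloudera \
  --hive-import --hive-table emp -m 1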
-----------------------------------------------------------------------------------
Once the import in Avro format completes, the schema of each table is stored as an
.avsc file on the local system; the files can be found under /home/cloudera/.
Move all the .avsc files into HDFS using the put command. Let's say the files are
loaded into a directory called sqoop_import.
Then run the command below to create the table.
Repeat the above for every table imported with sqoop import.
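The create-table command itself does not appear in these notes; for Avro-backed data a typical Hive DDL looks like the sketch below (the table name and HDFS paths are assumptions):

CREATE EXTERNAL TABLE emp
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/user/cloudera/emp'
TBLPROPERTIES ('avro.schema.url'='hdfs:///user/cloudera/sqoop_import/emp.avsc');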
-----------------------------------------------------------------------------------
Sqoop Export
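This section has no command in the notes; a typical export sketch (the target MySQL table must already exist, and the export directory is an assumption) is:

sqoop-export --connect jdbc:mysql://localhost/training --table emp \
  --username root --password cloudera \
  --export-dir /user/cloudera/emp -m 1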
Sqoop Job
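No commands appear here in the notes; a saved-job sketch (the job name emp_job is an assumption) is shown below. Note the space after the bare -- that separates job options from the tool's own arguments:

sqoop job --create emp_job -- import --connect jdbc:mysql://localhost/training \
  --table emp --username root --password-file /user/cloudera/pass.txt -m 1
sqoop job --list
sqoop job --exec emp_job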
-----------------------------------------------------------------------------------
Sqoop Eval
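The notes end before giving a command; sqoop-eval runs a quick SQL statement against the source database and prints the result to the console, which is useful for sanity-checking a connection before an import. A sketch (query is an assumption):

sqoop-eval --connect jdbc:mysql://localhost/training --username root \
  --password cloudera --query "SELECT * FROM emp LIMIT 5"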