Sqoop data import

Full import into Hive

# Specify the MySQL server time zone in the JDBC URL; otherwise timestamp columns imported into HDFS may be shifted by the time-zone difference
# Quote the JDBC URL so the shell does not treat '&' as a background operator
sqoop import \
--connect 'jdbc:mysql://ip:3306/hive_test?characterEncoding=utf8&serverTimezone=Asia/Shanghai' \
--username root \
--password password \
--table user_account \
--target-dir '/input/user_account' \
--hive-import \
--hive-overwrite \
--hive-database mysql \
--fields-terminated-by '\003' \
-m 1
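
To check that the import landed in Hive with the intended '\003' field delimiter, the table can be inspected from the hive CLI. A minimal check, assuming the hive command is on the PATH and the database/table names used above:

# Hypothetical verification; the row format in the generated DDL should reflect the '\003' delimiter
hive -e "SHOW CREATE TABLE mysql.user_account"
# Row count should match the source table in MySQL
hive -e "SELECT COUNT(*) FROM mysql.user_account"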

# Incremental import: --target-dir must point to the HDFS directory of the Hive table
# --merge-key id makes Sqoop merge updated rows into the existing data by id instead of appending duplicates
sqoop import \
--connect 'jdbc:mysql://ip:3306/hive_test?characterEncoding=utf8&serverTimezone=Asia/Shanghai' \
--username root \
--password password \
--table user_account \
--target-dir '/hadoop/hive-3.1.2/data/warehouse/mysql.db/user_account' \
--check-column update_time \
--incremental lastmodified \
--last-value '2020-05-29 13:05:49' \
--fields-terminated-by '\003' \
-m 1 \
--merge-key id
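
Instead of tracking --last-value by hand, the incremental import can be wrapped in a saved Sqoop job; the Sqoop metastore then records the last checked value after every run and feeds it into the next one. A sketch under the same connection settings; the job name incr_user_account is made up here:

# Create the job once (note the space between "--" and "import")
sqoop job --create incr_user_account -- import \
--connect 'jdbc:mysql://ip:3306/hive_test?characterEncoding=utf8&serverTimezone=Asia/Shanghai' \
--username root \
--password password \
--table user_account \
--target-dir '/hadoop/hive-3.1.2/data/warehouse/mysql.db/user_account' \
--check-column update_time \
--incremental lastmodified \
--last-value '2020-05-29 13:05:49' \
--fields-terminated-by '\003' \
--merge-key id \
-m 1

# Execute the job; by default it prompts for the database password unless a --password-file is configured
sqoop job --exec incr_user_account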

