Sqoop export to a MySQL database fails with: ERROR tool.ExportTool: Error during export: Export job failed!

Problem description:

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

21/10/10 08:51:52 INFO mapreduce.Job:  map 100% reduce 0%
21/10/10 08:51:53 INFO mapreduce.Job: Job job_1633826412371_0001 failed with state FAILED due to: Task failed task_1633826412371_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

21/10/10 08:51:54 INFO mapreduce.Job: Counters: 9
    Job Counters 
        Failed map tasks=4
        Launched map tasks=4
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=52317
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=52317
        Total vcore-milliseconds taken by all map tasks=52317
        Total megabyte-milliseconds taken by all map tasks=53572608
21/10/10 08:51:54 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/10/10 08:51:54 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.2385 seconds (0 bytes/sec)
21/10/10 08:51:54 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/10/10 08:51:54 INFO mapreduce.ExportJobBase: Exported 0 records.
21/10/10 08:51:54 ERROR tool.ExportTool: Error during export: Export job failed!
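Exit code 143 only means the container received SIGTERM from the ApplicationMaster; the real error is in the failed map task's log. Assuming log aggregation is enabled, that log can be fetched with the application id derived from the job id in the output above:

```shell
# Fetch the aggregated container logs for the failed export job.
# The application id corresponds to job_1633826412371_0001 above.
yarn logs -applicationId application_1633826412371_0001
```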

Solution:

① First check whether the table structure in Hive matches the one in MySQL (run "desc table_name" in each).

 If the structures match, the failure is most likely caused by a character-set mismatch on the MySQL table.
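The comparison can be done from the shell, for instance (the Hive table name `diyu_resaults` is an assumption inferred from the export directory, and the MySQL credentials are the ones from the Sqoop command below):

```shell
# Show the column layout of the Hive source table
hive -e "DESC diyu_resaults;"

# Show the column layout of the MySQL target table
mysql -uroot -p000000 -e "DESC QX_diyu_results;" mysql

# SHOW CREATE TABLE also reveals the table's character set
mysql -uroot -p000000 -e "SHOW CREATE TABLE QX_diyu_results;" mysql
```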

② The Sqoop command used to export the data to MySQL:

bin/sqoop export \
--connect "jdbc:mysql://master:3306/mysql?useUnicode=true&characterEncoding=utf-8" \
--username root \
--password 000000 \
--table QX_diyu_results \
--num-mappers 1 \
--export-dir /user/hive/warehouse/diyu_resaults \
--input-fields-terminated-by ","

As the command shows, the connection uses utf8 encoding, so the character set of the MySQL table should be changed to utf8.
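One way to make that change (a sketch, reusing the credentials and table name from the command above) is MySQL's `CONVERT TO CHARACTER SET`, which changes the table default and converts the existing text columns in one statement:

```shell
# Convert the target table's character set to utf8 so it matches the JDBC URL
mysql -uroot -p000000 -e \
  "ALTER TABLE QX_diyu_results CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;" mysql
```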

 After the change, run the command again; a successful run reports the number of exported records instead of "Exported 0 records."

Original post: https://www.cnblogs.com/zyj3955/p/15388617.html