[Original] Experience Sharing (60): Reading Kudu Tables from Hive and Spark

After creating a Kudu table through Impala, trying to read it directly from Hive or Spark SQL fails with the following error:

Caused by: java.lang.ClassNotFoundException: com.cloudera.kudu.hive.KuduStorageHandler
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getStorageHandler(HiveUtils.java:309)

The official explanation is:

You will encounter this exception when you try to access a Kudu table using Hive. This is not a case of a missing jar, but simply that Impala stores Kudu metadata in Hive in a format that is unreadable to other tools, including Hive itself and Spark. Currently, there is no workaround for Hive users. Spark users can work around this by creating temporary tables.

So a Kudu table created by Impala cannot be read directly from Hive or Spark SQL. For Spark, however, there is a fairly simple workaround: read the table through the Kudu data source and register it as a temporary view.

spark.read.format("kudu").options(Map("kudu.master" -> kuduMaster, "kudu.table" -> kuduTableName)).load.createOrReplaceTempView("tmp_kudu_table")
spark.sql("select * from tmp_kudu_table limit 5")

Reference:

https://www.cloudera.com/documentation/enterprise/5-14-x/topics/kudu_troubleshooting.html

Original post (Chinese): https://www.cnblogs.com/barneywill/p/10907630.html