Kettle error "Couldn't open file hdfs" when running a transformation locally against a remote HDFS

When Kettle runs a transformation locally that writes to a remote HDFS, the following error appears:

ToHDFS.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Couldn't open file hdfs://hadoop:***@192.168.

Resolution:

  • Copy core-site.xml, mapred-site.xml, and yarn-site.xml from the server into the version directory that plugin.properties points to under data-integration/plugins/pentaho-big-data-plugin
  • Add the following two variables to your user or system environment variables:
export HADOOP_HOME='/Users/shenfeng/greenware/hadoop-2.7.2'
export HADOOP_USER_NAME=hadoop
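The two steps above can be sketched as a shell script. Note that the install path, the shim directory name (`hdp25`), and the namenode host below are all placeholders, not values from the original post; the real shim directory is the one named by `active.hadoop.configuration` in plugin.properties. The script uses a scratch directory and empty placeholder files so the resulting layout is visible:

```shell
# Placeholders (hypothetical): adjust to your actual Kettle install and shim.
PDI_HOME=${PDI_HOME:-/tmp/pdi-demo/data-integration}
SHIM=hdp25   # value of active.hadoop.configuration in plugin.properties
CONF_DIR="$PDI_HOME/plugins/pentaho-big-data-plugin/hadoop-configurations/$SHIM"
mkdir -p "$CONF_DIR"

# Step 1: copy the cluster's Hadoop config files into the active shim
# directory. In practice, fetch them from the server, e.g.:
#   scp hadoop@<namenode-host>:/etc/hadoop/conf/{core-site.xml,mapred-site.xml,yarn-site.xml} "$CONF_DIR/"
# Here we create empty placeholders just to show the expected layout:
touch "$CONF_DIR"/core-site.xml "$CONF_DIR"/mapred-site.xml "$CONF_DIR"/yarn-site.xml

# Step 2: point Kettle at a local Hadoop client and set the HDFS user
# it should connect as.
export HADOOP_HOME=/Users/shenfeng/greenware/hadoop-2.7.2
export HADOOP_USER_NAME=hadoop

ls "$CONF_DIR"
```

Setting HADOOP_USER_NAME matters because, without it, Kettle connects to HDFS as your local OS user, which typically lacks write permission on the cluster.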
Original post: https://www.cnblogs.com/shenfeng/p/kettle_Couldnot_open_file_hdfs.html