Hadoop HttpFS

Hadoop HttpFS: the client submits file operations to the HttpFS service, and HttpFS interacts with the cluster on its behalf. Advantage: the client does not need direct access to the cluster nodes.

WebHDFS API: https://archive.cloudera.com/cdh5/cdh/5/hadoop/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
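The request shape is the same as plain WebHDFS, only pointed at the HttpFS port (14000 by default). A minimal sketch of a few common operations, assuming an HttpFS host named cdd03, example paths, and simple (non-Kerberos) authentication where the user is passed via user.name; with Kerberos enabled you would authenticate as shown in the usage example at the end:

# List a directory (LISTSTATUS)
curl "http://cdd03:14000/webhdfs/v1/tmp?op=LISTSTATUS&user.name=hdfs"

# Read a file (OPEN); HttpFS streams the data back itself
curl "http://cdd03:14000/webhdfs/v1/tmp/test.txt?op=OPEN&user.name=hdfs"

# Create a directory (MKDIRS)
curl -X PUT "http://cdd03:14000/webhdfs/v1/tmp/newdir?op=MKDIRS&user.name=hdfs"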

Cloudera install guide: https://www.cloudera.com/documentation/enterprise/5-10-x/topics/cdh_ig_httpfs_install.html

yum install hadoop-httpfs

1. /etc/hadoop-httpfs/conf/httpfs-site.xml 

<property>
  <name>httpfs.hadoop.config.dir</name>
  <value>/etc/hadoop/conf</value>
</property>
<property>
  <name>httpfs.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>httpfs.hadoop.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>httpfs.authentication.kerberos.principal</name>
  <value>HTTP/cdd03@PASC.COM</value>
</property>
<property>
  <name>httpfs.authentication.kerberos.keytab</name>
  <value>/etc/security/keytab/httpfs.keytab</value>
</property>
<property>
  <name>httpfs.hadoop.authentication.kerberos.principal</name>
  <value>httpfs/cdd03@PASC.COM</value>
</property>
<property>
  <name>httpfs.hadoop.authentication.kerberos.keytab</name>
  <value>/etc/security/keytab/httpfs.keytab</value>
</property>
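Both principals referenced above must exist in the KDC and be exported into the keytab the config points at. A minimal sketch using kadmin.local, assuming an MIT Kerberos KDC and the realm/host names taken from the config; adjust to your environment:

# Create the service principals used by HttpFS
kadmin.local -q "addprinc -randkey HTTP/cdd03@PASC.COM"
kadmin.local -q "addprinc -randkey httpfs/cdd03@PASC.COM"

# Export both into the keytab referenced in httpfs-site.xml
kadmin.local -q "ktadd -k /etc/security/keytab/httpfs.keytab HTTP/cdd03@PASC.COM httpfs/cdd03@PASC.COM"

# The httpfs service user must be able to read the keytab
chown httpfs:httpfs /etc/security/keytab/httpfs.keytab
chmod 400 /etc/security/keytab/httpfs.keytab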

2. /etc/hadoop/conf/core-site.xml    

<property>
  <name>hadoop.proxyuser.httpfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.httpfs.groups</name>
  <value>*</value>
</property>
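Note: on most Hadoop versions the proxyuser settings alone can also be reloaded on a running NameNode without a full restart; whether you rely on this or simply restart as in step 3 is up to you:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration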

3. Restart the NameNode, then start the HttpFS service.
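On a CDH package install this is typically done along these lines (the service names are assumptions and may differ in your environment):

sudo service hadoop-hdfs-namenode restart
sudo service hadoop-httpfs start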

Usage: curl --negotiate -u : http://cdd03:14000/webhdfs/v1/tmp/?op=liststatus

The hostname (cdd03 in the URL above) must be used rather than an IP address, because Kerberos authentication depends on it.
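--negotiate makes curl authenticate via Kerberos/SPNEGO, so the client needs a valid ticket in its credential cache first. A minimal sketch, with testuser@PASC.COM standing in for whatever end-user principal you actually use:

# Obtain and check a ticket for the end user (principal name is only an example)
kinit testuser@PASC.COM
klist

# -u : tells curl to take the identity from the Kerberos ticket rather than prompting
curl --negotiate -u : "http://cdd03:14000/webhdfs/v1/tmp/?op=liststatus"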

Original article: https://www.cnblogs.com/moonypog/p/11051335.html