Pitfalls when setting up pseudo-distributed Hadoop on Ubuntu 16.04

I won't walk through the single-node/pseudo-distributed Hadoop setup on Ubuntu 16.04 again here; for the details, follow this link:

Hadoop Installation Tutorial: Single-Node/Pseudo-Distributed Configuration (Hadoop 2.6.0 / Ubuntu 14.04)

My problem showed up after the pseudo-distributed configuration files were in place and the NameNode had been formatted successfully: starting HDFS produced the following error:

hadoop@litao-virtual-machine:/usr/local/hadoop$ ./sbin/start-dfs.sh 
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.

This error usually appears when your Java environment is not configured correctly; for how to set up Java properly on Ubuntu, a quick search on Baidu will turn up plenty of guides.
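Before blaming Hadoop, it is worth confirming what JAVA_HOME should actually point to. A small sketch; the binary path below is just the example JDK from this post, and on a real machine you would obtain it with `readlink -f "$(command -v java)"`, which follows Ubuntu's update-alternatives symlinks:

```shell
# Derive the JAVA_HOME directory from a resolved `java` binary path.
# The path below is the example JDK from this post; on your machine,
# get the real one with: readlink -f "$(command -v java)"
JAVA_BIN=/usr/local/java/jdk1.8.0_151/jre/bin/java
JAVA_HOME_GUESS=${JAVA_BIN%/jre/bin/java}   # strip the jre/bin/java tail (JDK 8 layout)
echo "$JAVA_HOME_GUESS"                     # -> /usr/local/java/jdk1.8.0_151
```

Whatever directory this prints is the value that should end up in hadoop-env.sh below.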

If, however, you are on Ubuntu 16.04 with Hadoop 2.7.4, you have made sure Java is configured correctly, and you have already tried every other fix for this error you could find online, then perhaps my solution below will be the light at the end of the tunnel:

vim ./etc/hadoop/hadoop-env.sh  # open the Hadoop environment script under your Hadoop install directory

# The only required environment variable is JAVA_HOME.  All others are
# optional.  When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.

# The java implementation to use.
JAVA_HOME=/usr/local/java/jdk1.8.0_151   # your JDK path
export JAVA_HOME                         # <-- add these two lines
export JAVA_HOME=${JAVA_HOME}

# The jsvc implementation to use. Jsvc is required to run secure datanodes
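If you would rather script the change than edit the file in vim, something like the following works. This is a sketch, not Hadoop's own tooling: it rewrites the `export JAVA_HOME` line in place with sed, and the scratch-file setup exists only so the example runs anywhere. On a real node, set HADOOP_ENV to ./etc/hadoop/hadoop-env.sh and delete the two scratch lines.

```shell
#!/bin/sh
# Sketch: pin JAVA_HOME in hadoop-env.sh with sed instead of editing by hand.
JDK_PATH=/usr/local/java/jdk1.8.0_151               # your JDK path
HADOOP_ENV=$(mktemp)                                # scratch stand-in for hadoop-env.sh
echo 'export JAVA_HOME=${JAVA_HOME}' > "$HADOOP_ENV"  # the stock pass-through line

# Replace the pass-through with an absolute path so daemon shells see it.
sed -i "s|^export JAVA_HOME=.*|export JAVA_HOME=$JDK_PATH|" "$HADOOP_ENV"
grep '^export JAVA_HOME=' "$HADOOP_ENV"   # -> export JAVA_HOME=/usr/local/java/jdk1.8.0_151
rm -f "$HADOOP_ENV"
```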
After that, re-run:

./sbin/start-dfs.sh 

# Since I had already started HDFS by the time I wrote this post, it reports that the daemons are already running
hadoop@litao-virtual-machine:/usr/local/hadoop$ ./sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: namenode running as process 5085. Stop it first.
localhost: datanode running as process 5239. Stop it first.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 5407. Stop it first.
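Since the daemons refuse to start twice, the clean sequence is stop, then start, then verify with `jps`. A sketch (paths as in this post, guarded so it degrades gracefully when not run from a Hadoop root):

```shell
#!/bin/sh
# Restart HDFS after fixing hadoop-env.sh; run from the Hadoop root
# (/usr/local/hadoop in this post).
for cmd in ./sbin/stop-dfs.sh ./sbin/start-dfs.sh; do
  if [ -x "$cmd" ]; then
    "$cmd"
  else
    echo "skipping $cmd (not run from a Hadoop root)"
  fi
done
# jps (shipped with the JDK) should then list NameNode, DataNode
# and SecondaryNameNode.
command -v jps >/dev/null 2>&1 && jps || true
```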

Root cause:

Honestly, the exact root cause is unknown to me; this fix came out of verifying every setting on the machine and then a round of educated-guess experiments. A plausible explanation, though: start-dfs.sh launches the daemons through non-interactive SSH sessions, which do not source ~/.bashrc, so a JAVA_HOME exported only there never reaches them, while hadoop-env.sh is sourced by the Hadoop scripts themselves; that would be why hardcoding the path there works.

I hope this post helps you!

原文地址:https://www.cnblogs.com/crawer-1/p/8053801.html