1. Pseudo-distributed setup:

For steps, see:

    http://wenku.baidu.com/link?url=N_Sc7dqaO5HB47SmhntYZQI2tvvAjYt0mWT0fx28FDSMRYKTLUTcWlxe2kaYWRhjN7y1z3jc0As5NV5Np3nq1fOznL3WxR-6atCCfEpcYjy
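
For orientation, here is a minimal sketch of the usual pseudo-distributed configuration for Hadoop 0.20.x. The localhost addresses and the /home/tmp path are assumptions (the tmp path is inferred from the errors later in these notes); adjust them to your environment:

    # Run from the Hadoop install directory.
    # conf/core-site.xml -- HDFS address and the shared temp directory
    cat > conf/core-site.xml <<'EOF'
    <configuration>
      <property><name>fs.default.name</name><value>hdfs://localhost:9000</value></property>
      <property><name>hadoop.tmp.dir</name><value>/home/tmp</value></property>
    </configuration>
    EOF

    # conf/hdfs-site.xml -- single node, so keep one replica
    cat > conf/hdfs-site.xml <<'EOF'
    <configuration>
      <property><name>dfs.replication</name><value>1</value></property>
    </configuration>
    EOF

    # conf/mapred-site.xml -- JobTracker address
    cat > conf/mapred-site.xml <<'EOF'
    <configuration>
      <property><name>mapred.job.tracker</name><value>localhost:9001</value></property>
    </configuration>
    EOF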

2. Setup with one slave node:

  For steps, see:

    http://wenku.baidu.com/link?url=wltxZ0bSJ-Wmmg80B0-YQIh9gVbotsU-KeQPCCDCFnb6eMEU81ASJae4O18VZzZ2dFaA2YApZxsv2ogcYPIuFHLhUhVxKcxK-DOtM_djhkG
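
In rough terms, adding one slave amounts to listing its hostname in conf/slaves and making passwordless SSH work from the master, so start-all.sh can launch the remote daemons. A sketch, where the hostname slave1 is an assumption:

    # On the master: register the slave node (hostname is assumed)
    echo slave1 >> conf/slaves

    # Passwordless SSH from the hadoop user on the master to the slave
    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa   # skip if a key already exists
    ssh-copy-id hadoop@slave1
    ssh hadoop@slave1 'hostname'               # should work without a password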

Problems encountered:

1.

Command:

./hadoop namenode -format

Errors:

1) java.io.FileNotFoundException: /usr/hadoop-0.20.2-cdh3u6/logs/SecurityAuth-hadoop.audit (No such file or directory)

2) ERROR namenode.NameNode: java.io.IOException: Cannot create directory /home/tmp/dfs/name/current

Fix:

chown -R hadoop /home/tmp

This gives the hadoop user ownership of the tmp directory.
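
To verify the ownership change took effect, and optionally clear error 1) as well (which, as noted under problem 2 below, can also simply be ignored):

    ls -ld /home/tmp    # owner column should now read hadoop

    # error 1) is just the missing logs directory; create it with the right owner
    mkdir -p /usr/hadoop-0.20.2-cdh3u6/logs
    chown -R hadoop /usr/hadoop-0.20.2-cdh3u6/logs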

2.

Command:

start-all.sh

The fix in problem 1 only resolved error 2) there; error 1) can be ignored, so just go ahead and run the command above.

After startup:

Command: hadoop dfs -ls

Problem: Bad connection to FS. command aborted. exception: ....... Connection refused

Debugging:

Command: jps

Neither the DataNode nor the NameNode had come up.
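
For comparison, on a healthy pseudo-distributed 0.20.x node jps should list all five daemons; the PIDs here are of course illustrative:

    $ jps
    4687 NameNode
    4829 DataNode
    4971 SecondaryNameNode
    5057 JobTracker
    5199 TaskTracker
    5321 Jps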

Check the log files:

Command: more XXXX.log  (remember: read XXXX.log, not XXXX.out)
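
The daemon logs sit under $HADOOP_HOME/logs, named hadoop-<user>-<daemon>-<hostname>.log; for example (the user hadoop and host master are assumptions):

    more /usr/hadoop-0.20.2-cdh3u6/logs/hadoop-hadoop-namenode-master.log
    more /usr/hadoop-0.20.2-cdh3u6/logs/hadoop-hadoop-datanode-master.log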

Found:

ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: All directories in dfs.data.dir are invalid.

WARN org.apache.hadoop.util.DiskChecker: Incorrect permissions were set on /home/hadoop/dfs/data, expected: rwx------, while actual: rwxr-xr-x, Fixing...

So it was a permissions problem again.

Fix:

Command: chown -R hadoop dfs/ tmp/  (tmp is a directory I created myself)
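
Since the DiskChecker warning above expects rwx------ on the data directory, it may also be necessary to tighten the mode explicitly (the path is taken from that warning):

    chmod 700 /home/hadoop/dfs/data
    ls -ld /home/hadoop/dfs/data   # should now show drwx------ with owner hadoop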

Started again, and this time the DataNode came up.

Problem:

But still no NameNode:

ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.io.IOException: Cannot lock storage /home/tmp/dfs/name. The directory is already locked.

Fix:

1. Delete the storage directory (dfs.data.dir).

2. Delete the temporary storage directory (hadoop.tmp.dir).

3. Restart the environment and format the namenode again (when asked to confirm, type uppercase Y, not lowercase y); see the sketch below.
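
Put together, the reset looks roughly like this. The two paths are taken from the errors above (dfs.data.dir = /home/hadoop/dfs/data, hadoop.tmp.dir = /home/tmp); double-check yours before deleting, since this wipes all HDFS data:

    stop-all.sh
    rm -rf /home/hadoop/dfs/data      # dfs.data.dir
    rm -rf /home/tmp                  # hadoop.tmp.dir
    mkdir -p /home/tmp && chown -R hadoop /home/tmp   # re-create, as in problem 1
    ./hadoop namenode -format         # answer the prompt with uppercase Y
    start-all.sh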

Reference:

http://f.dataguru.cn/thread-139654-1-1.html

Error at startup: (screenshot missing)