Summary of Common Hadoop Problems and Solutions
1. ERROR: Unable to write in /opt/hadoop-3.3.0/logs. Aborting. (reported while starting the datanodes)
Solution: the user starting HDFS cannot write to the log directory. Grant the hadoop user ownership of the installation directory named in the error (adjust the path to your install): sudo chown -R hadoop:hadoop /opt/hadoop-3.3.0
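If you would rather not change ownership of the whole installation tree, a narrower check and fix of just the logs directory is sketched below; it assumes the daemons run as the hadoop user and uses the path from the error above.
ls -ld /opt/hadoop-3.3.0/logs        # check who currently owns the log directory
sudo mkdir -p /opt/hadoop-3.3.0/logs # create it if it does not exist yet
sudo chown -R hadoop:hadoop /opt/hadoop-3.3.0/logs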
2. WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Solution: the warning is harmless (Hadoop falls back to the built-in Java classes). To silence it, add the following line to /opt/hadoop-3.3.0/etc/hadoop/log4j.properties:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
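A minimal way to apply and verify this, assuming the install path above; hadoop checknative is only a diagnostic that lists which native libraries could be loaded.
echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' >> /opt/hadoop-3.3.0/etc/hadoop/log4j.properties
hadoop checknative -a   # shows native library availability; HDFS still works without the native code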
3. localhost: ERROR: Cannot set priority of datanode process
Solution: point HDFS at explicit storage directories by adding the following to hdfs-site.xml (the paths are examples from this install):
<property>
<name>dfs.name.dir</name>
<value>/opt/hadoop-3.3.0/hdfsDir/name</value>
<description>Where the namenode stores the HDFS namespace metadata</description>
</property>
<property>
<name>dfs.data.dir</name>
<value>/opt/hadoop-3.3.0/hdfsDir/data</value>
<description>Where the datanode physically stores HDFS data blocks</description>
</property>
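A sketch of the follow-up steps, assuming the paths above and a hadoop service user (both taken from this document's earlier examples):
sudo mkdir -p /opt/hadoop-3.3.0/hdfsDir/name /opt/hadoop-3.3.0/hdfsDir/data  # create the configured storage directories
sudo chown -R hadoop:hadoop /opt/hadoop-3.3.0/hdfsDir                        # make them writable by the hadoop user
stop-dfs.sh && start-dfs.sh                                                  # restart HDFS so the new settings take effect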
4. Cannot create ... Name node is in safe mode.
Solution: hadoop dfsadmin -safemode leave
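hdfs dfsadmin is the current, non-deprecated form of the same admin tool; a quick way to check the safe-mode state before or after forcing it off:
hdfs dfsadmin -safemode get    # prints whether the namenode is currently in safe mode
hdfs dfsadmin -safemode leave  # forces the namenode out of safe mode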
5. There are 0 datanode(s) running and 0 node(s) are excluded in this operation.
Solution: find and delete the current folder under ./data/dfs/data in the Hadoop installation directory,
then re-run hadoop namenode -format.
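Those two steps as commands, as a sketch only: the data path is assumed from the description above and may differ on your system, and reformatting the namenode erases existing HDFS metadata, so stop HDFS first and back up anything you need.
stop-dfs.sh
rm -rf /opt/hadoop-3.3.0/data/dfs/data/current   # assumed location of the datanode's current folder
hdfs namenode -format
start-dfs.sh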
6. ERROR org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Failed to start secondary namenode
Check current/VERSION inside the data and name folders under the Hadoop directory: the clusterID values do not match.
Cause: the NameNode was formatted more than once, so the namespaceID/clusterID recorded by the namenode and the datanode no longer agree.
Solution: delete all files under the HDFS data directory (the hadoop.tmp.dir directory configured in core-site.xml):
1. Delete the data and name folders under …/tmp/dfs in the Hadoop directory
2. Re-initialize the NameNode: hdfs namenode -format
3. Start HDFS: start-dfs.sh
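Before deleting anything, it is worth confirming the mismatch described above; a sketch with hypothetical storage paths that should be replaced by your actual dfs.name.dir / dfs.data.dir values:
grep clusterID /opt/hadoop-3.3.0/hdfsDir/name/current/VERSION   # clusterID recorded by the namenode
grep clusterID /opt/hadoop-3.3.0/hdfsDir/data/current/VERSION   # clusterID recorded by the datanode
# If they differ, either follow steps 1-3 above, or (to keep existing data) copy the
# namenode's clusterID into the datanode's VERSION file and restart HDFS.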