
Hadoop Namenode Will Not Start

 
Michael Knapp
Greenhorn
Posts: 8
I'm having trouble getting my namenode to start after I reboot my machine. I followed the instructions here very carefully:

http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Standalone_Operation

and it works: I can start both DFS and YARN and make directories in HDFS. However, when I reboot my machine and try to start HDFS again, the namenode will not start. I sifted through the Hadoop logs a bit and found this in the namenode log:



So I believe the problem is that it is using the wrong temp directory: it is trying to use /tmp/hadoop-michael/, but the file $HADOOP_HOME/etc/hadoop/core-site.xml explicitly sets it to /share/hadoop/hdfs. That directory exists and has the appropriate permissions.
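One way to check which value the namenode actually resolves (a diagnostic sketch; `hdfs getconf` is part of the standard Hadoop distribution, and the dotted key name is what Hadoop looks up internally):

```shell
# Ask Hadoop what it actually resolves for the base temp directory.
# If this prints /tmp/hadoop-<user> instead of the path set in
# core-site.xml, the override is not being picked up.
$HADOOP_HOME/bin/hdfs getconf -confKey hadoop.tmp.dir
```

If the printed value does not match core-site.xml, the problem is in how the override is written rather than in directory permissions.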

I have run $HADOOP_HOME/bin/hdfs namenode -format in the past. If I run it again, I can start the namenode; however, the filesystem is then empty. So formatting it after every boot is not really an option.
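If the root cause is that the namenode's storage is landing under /tmp (which many distros clear on reboot), one recovery path is to fix the configured directory first and then format exactly once. A hedged sketch, assuming the single-node setup from the linked guide:

```shell
# One-time recovery (single-node setup assumed).

# 1. Stop HDFS if it is partially running.
$HADOOP_HOME/sbin/stop-dfs.sh

# 2. Make sure core-site.xml points the temp directory at a location
#    that survives reboots, e.g. /share/hadoop/hdfs -- not under /tmp.

# 3. Format ONCE. This wipes HDFS, so only do it after fixing the config.
$HADOOP_HOME/bin/hdfs namenode -format

# 4. Start HDFS; after this, reboots should no longer lose the metadata.
$HADOOP_HOME/sbin/start-dfs.sh
```

After this, the namenode metadata lives in the configured directory, so rebooting no longer requires a re-format.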

Somebody please tell me what I am doing wrong.
 
Michael Knapp
Greenhorn
Posts: 8
I think it may be because I was using 'hadoop-tmp-dir' as the property name instead of 'hadoop.tmp.dir'. My 'Pro Hadoop' book used hyphens, but I'm seeing other people use periods. Going to try rebooting now to see if that fixes things.
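That would explain the /tmp/hadoop-michael path in the log: Hadoop's configuration loader silently ignores property names it does not recognize, so a hyphenated entry is stored but never read, and hadoop.tmp.dir falls back to its default of /tmp/hadoop-${user.name}. The dotted form in core-site.xml looks like this (path taken from the post above):

```xml
<!-- core-site.xml: Hadoop property names use dots, not hyphens -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>          <!-- recognized by Hadoop -->
    <value>/share/hadoop/hdfs</value>
  </property>
  <!-- A <name>hadoop-tmp-dir</name> entry would load without error
       but never be consulted, so the /tmp default is used instead. -->
</configuration>
```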
 