Ubuntu – NoNode for HBase master in pseudo-distributed mode

hadoop, hbase, Ubuntu

I am using Ubuntu 18.04, Hadoop 3.1.3 and HBase 2.2.1.

It seems to me that my Hadoop and HBase are not configured correctly to interact. When I try to create a table through the HBase shell, it gives me the following error:

ERROR: KeeperErrorCode = NoNode for /hbase/master

And when I try to scan the table, it gives me the following error:

ERROR: No meta znode available

JPS

hadoop@jonas:~/HBase/bin$ jps
10107 Jps
hadoop@jonas:~/HBase/bin$ 

Starting Hadoop

hadoop@jonas:~/hadoop/sbin$ ./start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [jonas]
Starting resourcemanager
Starting nodemanagers
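
For what it's worth, the same daemons can also be started separately, which avoids that warning (a sketch using the standard Hadoop 3 sbin scripts):

# Start HDFS and YARN separately instead of via start-all.sh
./start-dfs.sh
./start-yarn.sh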

JPS

hadoop@jonas:~/hadoop/sbin$ jps
10562 DataNode
11267 NodeManager
10823 SecondaryNameNode
11545 Jps
10347 NameNode
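
HDFS itself appears to come up. A basic sanity check I can run (a sketch, nothing HBase-specific) to confirm the NameNode accepts reads and writes:

# Report live datanodes and try a simple write against HDFS
hdfs dfsadmin -report
hdfs dfs -mkdir -p /tmp/hdfs-check
hdfs dfs -ls /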

Starting HBase

The output seems to be mostly errors, even though the daemons do appear to start; it looks suspicious to me.

hadoop@jonas:~/HBase/bin$ ./start-hbase.sh
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/HBase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/HBase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
localhost: running zookeeper, logging to /home/hadoop/HBase/bin/../logs/hbase-hadoop-zookeeper-jonas.out
running master, logging to /home/hadoop/HBase/logs/hbase-hadoop-master-jonas.out
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/HBase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
: running regionserver, logging to /home/hadoop/HBase/logs/hbase-hadoop-regionserver-jonas.out
: /home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
: /home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
: SLF4J: Class path contains multiple SLF4J bindings.
: SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
: SLF4J: Found binding in [jar:file:/home/hadoop/HBase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

JPS

hadoop@jonas:~/HBase/bin$ jps
13825 HRegionServer
10562 DataNode
11267 NodeManager
14148 Jps
10823 SecondaryNameNode
13577 HQuorumPeer
10347 NameNode
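
Notably, no HMaster process shows up in this jps output, which would explain the NoNode for /hbase/master error. A sketch of inspecting why it stopped, using the .out path printed by start-hbase.sh above (the .log file name is my assumption based on the usual HBase log naming):

# Look at the master's startup output and log for the failure
tail -n 100 /home/hadoop/HBase/logs/hbase-hadoop-master-jonas.out
tail -n 100 /home/hadoop/HBase/logs/hbase-hadoop-master-jonas.log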

Starting the HBase shell and running a command

hadoop@jonas:~/HBase/bin$ ./hbase shell
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2360: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/home/hadoop/hadoop/libexec/hadoop-functions.sh: line 2455: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/HBase/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.2.3, r6a830d87542b766bd3dc4cfdee28655f62de3974, 2020年 01月 10日 星期五 18:27:51 CST
Took 0.0037 seconds                                                                                      
hbase(main):001:0> create 'wiki', 'text'

ERROR: KeeperErrorCode = NoNode for /hbase/master

For usage try 'help "create"'

Took 0.0642 seconds                                                                                      
hbase(main):002:0> 
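
As a further check (a sketch, assuming hbase zkcli forwards its arguments to the ZooKeeper CLI bundled with HBase), I can list the znodes to confirm that /hbase/master was never created:

# List the znodes under /hbase; a healthy master registers /hbase/master
./hbase zkcli ls /hbase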

Hadoop config files

Core-site.xml

<configuration>
<property>
  <name>fs.default.name</name>
    <value>hdfs://localhost:8030</value>
</property>
</configuration>
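
One thing I can verify (a sketch) is the effective default filesystem URI as Hadoop resolves it, since hbase.rootdir below has to point at the same NameNode address:

# Print the effective default FS (fs.default.name is the deprecated alias of fs.defaultFS)
hdfs getconf -confKey fs.defaultFS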

hdfs-site.xml

<configuration>
<property>
 <name>dfs.replication</name>
 <value>1</value>
</property>

<property>
  <name>dfs.name.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/namenode</value>
</property>

<property>
  <name>dfs.data.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/datanode</value>
</property>
</configuration>

If mapred-site.xml or yarn-site.xml is relevant, please point it out in the comments.

HBase config

hbase-site.xml

<configuration>

   <property>
      <name>hbase.rootdir</name>
      <value>hdfs://localhost:8030/hbase</value>
   </property>

   <property>
      <name>hbase.zookeeper.property.dataDir</name>
      <value>/hadoop/zookeeper</value>
   </property>


   <property>
     <name>hbase.cluster.distributed</name>
     <value>true</value>
   </property>
</configuration>
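
For completeness, a sketch of checking that the rootdir URI is reachable from this machine (the /hbase directory itself is only created once the master starts successfully):

# hbase.rootdir must use the same URI as fs.default.name; check it is reachable
hdfs dfs -ls hdfs://localhost:8030/
# after a successful master start, the /hbase directory should appear
hdfs dfs -ls /hbase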

At this point, the official documentation for the individual files does not point me in the right direction, and I am not really sure what to do.

Best Answer

This may help (HBase 2.3.7):

Add this line to /etc/profile (or ~/.bashrc), then run source /etc/profile:

export HBASE_DISABLE_HADOOP_CLASSPATH_LOOKUP=true

In $HBASE_HOME/bin/hbase there is some code like this:

#If avail, add Hadoop to the CLASSPATH and to the JAVA_LIBRARY_PATH
# Allow this functionality to be disabled
if [ "$HBASE_DISABLE_HADOOP_CLASSPATH_LOOKUP" != "true" ] ; then
    HADOOP_IN_PATH=$(PATH="${HADOOP_HOME:-${HADOOP_PREFIX}}/bin:$PATH"  which hadoop 2>/dev/null)
fi
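
To apply it (a sketch, assuming the paths from the question are unchanged):

# Disable the Hadoop classpath lookup for HBase, then restart it
echo 'export HBASE_DISABLE_HADOOP_CLASSPATH_LOOKUP=true' >> ~/.bashrc
source ~/.bashrc
cd ~/HBase/bin
./stop-hbase.sh
./start-hbase.sh
jps   # HMaster should now appear in the list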