loadFileSystems error & ExceptionUtils: Root-Cause Analysis
一见, 2014/5/7
A C/C++ program that accesses HDFS through hdfs.h fails at runtime with the errors below. What could the cause be? (Note: Hadoop is installed in /data/hadoop/hadoop-2.4.0, and /data/hadoop/current is a symlink pointing to it.)
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=172.25.40.171, port=9001, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
E0507 19:02:57.251287 17859 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
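For context, the failing call is roughly the standard libhdfs builder sequence. A minimal sketch (the builder functions are the hdfs.h API shipped with Hadoop 2.x; the article's actual program is not shown):

/* Build with something like:
 *   gcc connect.c -I$HADOOP_HOME/include -L$HADOOP_HOME/lib/native -lhdfs
 * (exact paths depend on the installation). */
#include <stdio.h>
#include "hdfs.h"

int main(void)
{
    /* Point a builder at the NameNode from the error log. */
    struct hdfsBuilder *builder = hdfsNewBuilder();
    hdfsBuilderSetNameNode(builder, "172.25.40.171");
    hdfsBuilderSetNameNodePort(builder, 9001);

    /* hdfsBuilderConnect() starts an embedded JVM via JNI; this is
     * the call that fails with NoClassDefFoundError in the log. */
    hdfsFS fs = hdfsBuilderConnect(builder);
    if (fs == NULL) {
        fprintf(stderr, "connect hdfs://172.25.40.171:9001 failed\n");
        return 1;
    }
    hdfsDisconnect(fs);
    return 0;
}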
The key items in the message are "NoClassDefFoundError" and "ExceptionUtils": the JVM cannot find the ExceptionUtils class, which usually means the jar containing it is missing from the classpath. Googling "ExceptionUtils jar" shows that ExceptionUtils belongs to Apache Commons Lang (apache-commons-lang.jar).
A further search for "apache-commons-lang.jar" turns up the download page http://commons.apache.org/proper/commons-lang/download_lang.cgi, which offers commons-lang3-3.3.2-bin.tar.gz; unpacking it yields commons-lang3-3.3.2.jar.
The Hadoop binary distribution should already ship this jar, and sure enough commons-lang-2.6.jar turns up under share/hadoop/tools/lib in the Hadoop installation directory. That should be the one (it is the 2.x line, matching the org.apache.commons.lang package that Hadoop actually uses, rather than lang3).
The command "hadoop classpath" prints Hadoop's classpath:
./hadoop classpath
/data/hadoop/hadoop-2.4.0/etc/hadoop:/data/hadoop/hadoop-2.4.0/share/hadoop/common/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/common/*:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs/*:/data/hadoop/hadoop-2.4.0/share/hadoop/yarn/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/yarn/*:/data/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/*:/data/hadoop/current/contrib/capacity-scheduler/*.jar
Unfortunately, the tools directory does not appear anywhere in it, which looks like the problem, so append it by hand (Hadoop being installed at /data/hadoop/current):
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib
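Before re-running, a detail worth knowing about how libhdfs finds these jars: it creates its embedded JVM through the JNI invocation API and, in Hadoop 2.x, builds -Djava.class.path from the CLASSPATH environment variable at the moment of the first libhdfs call in the process. So CLASSPATH must be in the environment before that call, whether exported in the shell as above or set programmatically. A sketch of the latter (the single jar path is just an illustrative entry):

#include <stdlib.h>
#include "hdfs.h"

int main(void)
{
    /* libhdfs reads CLASSPATH once, when the first API call creates
     * the JVM; changing the variable afterwards has no effect. */
    setenv("CLASSPATH",
           "/data/hadoop/current/share/hadoop/common/hadoop-common-2.4.0.jar",
           1 /* overwrite any existing value */);

    struct hdfsBuilder *builder = hdfsNewBuilder();
    hdfsBuilderSetNameNode(builder, "172.25.40.171");
    hdfsBuilderSetNameNodePort(builder, 9001);
    hdfsFS fs = hdfsBuilderConnect(builder);  /* JVM created here */
    return (fs != NULL) ? 0 : 1;
}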
Re-running the program: still no luck:
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=172.25.40.171, port=9001, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
E0507 19:52:48.197748 27787 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
Fine, take the blunt approach and point the classpath at commons-lang-2.6.jar itself (a bare directory entry on a Java classpath only picks up .class files; it never loads the jars sitting inside that directory):
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar
Re-running again, the ExceptionUtils error is gone, but a new one appears:
loadFileSystems error:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
hdfsBuilderConnect(forceNewInstance=0, nn=172.25.40.171, port=9001, kerbTicketCachePath=(NULL), userName=(NULL)) error:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
E0507 19:12:56.522274 19834 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
Still NoClassDefFoundError, presumably for the same reason: something the program needs is not actually reachable on the classpath. The question is which jars hold FileSystem and Configuration.
FileSystem lives in hadoop-common-2.4.0.jar, and so, in fact, does org.apache.hadoop.conf.Configuration (commons-configuration-1.6.jar only supplies one of hadoop-common's own dependencies). Both locations are already covered by the "hadoop classpath" output above, so why does it still fail? The likely culprit: that output consists mostly of wildcard entries such as .../share/hadoop/common/*, and the JNI invocation API that libhdfs uses to launch its JVM does not expand classpath wildcards the way the java command-line launcher does, so each of those entries matches nothing.
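If one wanted to keep using the wildcard-laden "hadoop classpath" output, the wildcards can be expanded before libhdfs ever sees them. A minimal sketch of such a helper (hypothetical, not part of the original article), using POSIX glob(); error handling is kept to a minimum:

#include <glob.h>
#include <stdlib.h>
#include <string.h>

/* Append one entry to a ':'-separated classpath buffer. */
static void append_entry(char **buf, size_t *len, const char *s)
{
    size_t add = strlen(s);
    *buf = realloc(*buf, *len + add + 2);   /* room for ':' and '\0' */
    if (*len > 0)
        (*buf)[(*len)++] = ':';
    memcpy(*buf + *len, s, add + 1);
    *len += add;
}

/* Replace "dir/*" entries with the files they match, since the JNI
 * invocation API does not expand wildcards. Caller frees the result. */
char *expand_classpath(const char *cp)
{
    char *copy = strdup(cp);
    char *out = NULL;
    size_t len = 0;

    for (char *e = strtok(copy, ":"); e != NULL; e = strtok(NULL, ":")) {
        glob_t g;
        if (strchr(e, '*') != NULL && glob(e, 0, NULL, &g) == 0) {
            for (size_t i = 0; i < g.gl_pathc; ++i)
                append_entry(&out, &len, g.gl_pathv[i]);
            globfree(&g);
        } else {
            append_entry(&out, &len, e);  /* keep literal entries */
        }
    }
    free(copy);
    return out;
}

The expanded string would then go to setenv("CLASSPATH", ..., 1) before the first libhdfs call, as in the earlier sketch.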
Not knowing Java well, the next experiment is to add hadoop-common-2.4.0.jar and commons-configuration-1.6.jar to the classpath explicitly:
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar:/data/hadoop/current/share/hadoop/common/hadoop-common-2.4.0.jar:/data/hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar
The FileSystem and Configuration errors disappear, confirming the approach works; the failure just moves one class further down the dependency chain:
loadFileSystems error:
java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.fs.FileSystem.<clinit>(FileSystem.java:95)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
hdfsBuilderConnect(forceNewInstance=0, nn=172.25.40.171, port=9001, kerbTicketCachePath=(NULL), userName=(NULL)) error:
java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:169)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
A little digging shows LogFactory is in commons-logging-1.1.3.jar, under share/hadoop/common/lib; add that to the classpath too:
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar:/data/hadoop/current/share/hadoop/common/hadoop-common-2.4.0.jar:/data/hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar:/data/hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar
Run it again; still an error:
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated: java.lang.NoClassDefFoundError: com/google/common/collect/Maps
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
The same kind of error yet again, this time for Guava's com.google.common.collect.Maps; chasing missing classes one jar at a time like this would go on forever. By now the pattern is clear: every required jar needs to appear on the classpath individually. Without deeper Java knowledge, the pragmatic fix is to generate an export line for every jar under the installation:
find /data/hadoop/current/ -name '*.jar' | awk '{ printf("export CLASSPATH=%s:$CLASSPATH\n", $0); }'
(Note the quotes around '*.jar', so the shell does not expand the pattern itself.) Execute the generated export lines in the shell, for example by writing them to a file and sourcing it, and finally everything works ^_^. What an ordeal.
