While installing Hive, I thought I had followed the tutorial exactly, but when I started Hive it still failed with the following error:
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hive/ql/io/NullScanFileSystem : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:808)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:442)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:64)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:354)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:347)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:430)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:323)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:363)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:363)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2558)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2569)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2586)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:339)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:788)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:717)
    at org.apache.hadoop.hbase.util.MapreduceDependencyClasspathTool.run(MapreduceDependencyClasspathTool.java:59)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.hbase.util.MapreduceDependencyClasspathTool.main(MapreduceDependencyClasspathTool.java:70)
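For context, "Unsupported major.minor version 52.0" means a class was compiled for Java 8 (class-file major version 52) but is being loaded by an older JVM. The major version is stored big-endian in bytes 6-7 of the .class file header, right after the CAFEBABE magic number. With a JDK available, `javap -verbose SomeClass | grep major` shows it directly; the sketch below instead writes a synthetic Java 8 header and reads the version back with standard tools, just to illustrate where the number lives (the /tmp path is only an example):

```shell
# Major version 52 = Java 8, 51 = Java 7, 50 = Java 6.
# Write the 8-byte header of a Java 8 class file:
# magic CAFEBABE, minor 0x0000, major 0x0034 (= 52).
printf '\xca\xfe\xba\xbe\x00\x00\x00\x34' > /tmp/header.bin

# Read the two major-version bytes at offset 6 and combine them big-endian:
od -An -j6 -N2 -tu1 /tmp/header.bin | awk '{print $1 * 256 + $2}'   # prints 52
```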
I then searched online for solutions. Most posts said the JDK version was too low and that upgrading would fix it, or suggested changing a path in one of Hadoop's files. But after running these two commands:
java -version
javac -version
I found that both reported the same version number, 1.8, so I started looking for another cause. A blog post about locating the JDK installation path on Ubuntu gave me an idea:
perhaps my JDK path was wrong. So I first changed into the relevant directory:
cd /usr/bin
and looked up the java entry:
ls -l java
Following the symlink target shown in green, I entered that directory and looked up java again:
cd /etc/alternatives
ls -l java
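The two-step symlink chase above (/usr/bin/java pointing to /etc/alternatives/java, which points to the real JDK) can also be collapsed into a single command: `readlink -f` follows every link in the chain to its final target. A sketch (the example output path is typical for Ubuntu, not guaranteed on your machine):

```shell
# Resolve the full symlink chain for the java binary in one step:
readlink -f "$(which java)"
# e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java

# On Debian/Ubuntu, update-alternatives can also list every registered JDK:
update-alternatives --list java
```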
This revealed the actual installation path. I then went back to my home directory and opened the .bashrc file to check whether the path recorded there was correct.
In the screenshot I had already corrected it; the original JAVA_HOME entry was wrong.
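The fix, then, is to point JAVA_HOME in ~/.bashrc at the JDK directory found through the symlink chain, not at some stale location. A minimal sketch, assuming a common Ubuntu install path (substitute whatever path your own system reported):

```shell
# In ~/.bashrc -- JAVA_HOME must be the JDK root directory, not its bin/ subdirectory.
# Example path only; use the directory found via the symlink chain above.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
```

After `source ~/.bashrc`, running `java -version` again should report the intended JDK.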
After making that change, Hive started normally. I hope this helps.
Reference blogs:
Locating the JDK installation path: https://blog.csdn.net/shuzhuchengfu/article/details/78546010
Hive troubleshooting: https://blog.csdn.net/u012808902/article/details/77658033