After packaging the code into a jar and running it in the deployment environment, the following error appears:
Exception in thread "main" java.io.IOException: No FileSystem for scheme: file
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:115)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
    at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
    at org.elasticsearch.spark.sql.EsSparkSQL$.esDF(EsSparkSQL.scala:52)
    at org.elasticsearch.spark.sql.EsSparkSQL$.esDF(EsSparkSQL.scala:66)
    at org.elasticsearch.spark.sql.package$SparkSessionFunctions.esDF(package.scala:58)
    at SQLAttack$.getDayDataByES(SQLAttack.scala:51)
    at SQLAttack$.main(SQLAttack.scala:25)
    at SQLAttack.main(SQLAttack.scala)
This happens because the HDFS configuration file does not declare the filesystem implementations. Fix it as follows:
First locate your local Maven repository. In IntelliJ IDEA, go to: File -> Settings -> Build, Execution, Deployment -> Build Tools -> Maven -> Local repository. The path shown next to Local repository is where the project's dependencies are stored. Inside it, navigate to:
\repository\org\apache\hadoop\hadoop-common\2.7.2
Open hadoop-common-2.7.2.jar with an archive tool (e.g. WinRAR), extract core-default.xml to a local folder, and edit it, adding the following properties after the <!--- global properties --> comment:
<!--- global properties -->
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
  <description>The FileSystem for hdfs: uris.</description>
</property>
<property>
  <name>fs.file.impl</name>
  <value>org.apache.hadoop.fs.LocalFileSystem</value>
  <description>The FileSystem for file: uris.</description>
</property>
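For reference, the same two keys can also be supplied at submit time instead of patching the jar: Spark forwards any property prefixed with spark.hadoop. into the Hadoop Configuration. A sketch of the submit command (the --class name comes from the stack trace above; the jar file name is a hypothetical placeholder):

```shell
spark-submit \
  --class SQLAttack \
  --conf spark.hadoop.fs.hdfs.impl=org.apache.hadoop.hdfs.DistributedFileSystem \
  --conf spark.hadoop.fs.file.impl=org.apache.hadoop.fs.LocalFileSystem \
  sql-attack.jar   # hypothetical jar name, substitute your own
```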
Put the modified core-default.xml back into hadoop-common-2.7.2.jar, rebuild the project jar, and it will run.