This article explains how to resolve the spark-shell failure "The root scratch dir: /tmp/hive on HDFS should be writable", and should be a useful reference for anyone hitting the same problem.

Problem Description

I am a Spark newbie on Windows 10, trying to get Spark to work. I have set the environment variables correctly, and I also have winutils. When I go into spark/bin and type spark-shell, Spark starts, but it prints the following errors.

It also doesn't create the Spark context or the Spark session.

C:\Users\Akshay\Downloads\spark\bin>spark-shell
    17/06/19 23:45:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/06/19 23:45:19 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Users/Akshay/Downloads/spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Users/Akshay/Downloads/spark/jars/datanucleus-api-jdo-3.2.6.jar."
    17/06/19 23:45:20 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Users/Akshay/Downloads/spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Users/Akshay/Downloads/spark/jars/datanucleus-core-3.2.10.jar."
    17/06/19 23:45:20 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Users/Akshay/Downloads/spark/bin/../jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Users/Akshay/Downloads/spark/jars/datanucleus-rdbms-3.2.9.jar."
    java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
      at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
      at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
      at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
      at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
      at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
      at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
      at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
      at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
      at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
      ... 47 elided
    Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
      at java.lang.reflect.Constructor.newInstance(Unknown Source)
      at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
      ... 58 more
    Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
      at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
      at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
      at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
      at scala.Option.getOrElse(Option.scala:121)
      at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
      at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
      at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
      at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
      ... 63 more
    Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
      at java.lang.reflect.Constructor.newInstance(Unknown Source)
      at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
      ... 71 more
    Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
      at java.lang.reflect.Constructor.newInstance(Unknown Source)
      at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
      at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
      at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
      at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
      ... 76 more
    Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
      at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
      ... 84 more
    Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ---------
      at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
      ... 85 more
    <console>:14: error: not found: value spark
           import spark.implicits._
                  ^
    <console>:14: error: not found: value spark
           import spark.sql
                  ^
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
          /_/

    Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101)
    Type in expressions to have them evaluated.
    Type :help for more information.

    scala>    

How do I resolve this?

Recommended Answer

Please refer to this article, which describes how to run Spark on Windows 10 with Hadoop support: Spark on Windows.
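
The "Current permissions are: ---------" line in the stack trace means Hive's scratch directory has no permissions at all, so the usual remedy (and the core of guides like the one linked above) is to grant write access on \tmp\hive using winutils.exe. A minimal sketch, assuming winutils.exe sits at C:\hadoop\bin (substitute your own %HADOOP_HOME%) and that spark-shell is launched from the C: drive, so /tmp/hive resolves to C:\tmp\hive:

    REM Inspect the current permissions on the Hive scratch dir
    C:\hadoop\bin\winutils.exe ls \tmp\hive

    REM Grant full read/write/execute permissions, recursively
    C:\hadoop\bin\winutils.exe chmod -R 777 \tmp\hive

    REM Relaunch the shell; the session should now initialize
    spark-shell

After the chmod, the error should disappear and spark-shell should report the Spark context (sc) and Spark session (spark) before the scala> prompt.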

That concludes this article on why spark-shell fails with "The root scratch dir: /tmp/hive on HDFS should be writable". We hope the recommended answer helps.
