This article covers how to deal with java.lang.NoSuchMethodError when using s3a with Spark. It should be a useful reference for anyone who hits the same problem.

Problem description



I'm working with the combination of spark_with_hadoop2.7 (2.4.3), hadoop (3.2.0), and Ceph Luminous. When I try to use Spark to access Ceph (for example, by starting spark-sql in the shell), an exception like the one below is thrown:

 INFO impl.MetricsSystemImpl: s3a-file-system metrics system started
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.ProviderUtils.excludeIncompatibleCredentialProviders(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/Class;)Lorg/apache/hadoop/conf/Configuration;
        at org.apache.hadoop.fs.s3a.S3AUtils.getAWSAccessKeys(S3AUtils.java:740)
        at org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider.<init>(SimpleAWSCredentialsProvider.java:58)
        at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:600)


For a NoSuchMethodError, the most likely cause is that the class version seen at compile time differs from the class version loaded at run time, as explained in how-do-i-fix-a-nosuchmethoderror.


To access Ceph, the AWS-related jars aws-java-sdk-bundle-1.11.375.jar and hadoop-aws-3.2.0.jar under $HADOOP_HOME/share/hadoop/tools/lib are what is actually used. I did the following:


1. Copy those two jars to $SPARK_HOME/jars
2. Modify $HADOOP_HOME/etc/hadoop/hadoop-env.sh to add the line below:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/tools/lib/*
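The two steps above can be sketched as shell commands. The version below runs against a throwaway directory tree so that the paths are safe placeholders; in practice, point HADOOP_HOME and SPARK_HOME at your real installations instead:

```shell
# Placeholder tree standing in for real Hadoop/Spark installs.
HADOOP_HOME=$(mktemp -d)/hadoop
SPARK_HOME=$(mktemp -d)/spark
mkdir -p "$HADOOP_HOME"/share/hadoop/tools/lib \
         "$HADOOP_HOME"/etc/hadoop \
         "$SPARK_HOME"/jars
touch "$HADOOP_HOME"/share/hadoop/tools/lib/aws-java-sdk-bundle-1.11.375.jar \
      "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-aws-3.2.0.jar

# Step 1: copy the two AWS-related jars into Spark's jar directory.
cp "$HADOOP_HOME"/share/hadoop/tools/lib/aws-java-sdk-bundle-1.11.375.jar \
   "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-aws-3.2.0.jar \
   "$SPARK_HOME"/jars/

# Step 2: put the tools/lib jars on Hadoop's classpath.
echo 'export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/tools/lib/*' \
  >> "$HADOOP_HOME"/etc/hadoop/hadoop-env.sh
```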


After these steps, I can start HDFS and access Ceph; for example, hdfs dfs -ls lists the folders under a Ceph bucket. This proves that the AWS-related jars work fine (at least as far as I understand).
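For reference, such a listing check against a Ceph RGW endpoint looks roughly like the configuration sketch below. The endpoint, bucket, and credentials are hypothetical placeholders; Ceph RGW exposes an S3-compatible API, which is why s3a can talk to it at all. The same properties are usually set once in core-site.xml rather than on the command line:

```shell
# Hypothetical endpoint/bucket/credentials -- substitute your own.
hdfs dfs \
  -Dfs.s3a.endpoint=http://ceph-rgw.example.com:7480 \
  -Dfs.s3a.access.key=MY_ACCESS_KEY \
  -Dfs.s3a.secret.key=MY_SECRET_KEY \
  -Dfs.s3a.path.style.access=true \
  -ls s3a://mybucket/
```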


So why are AWS s3a exceptions thrown when I invoke Spark?

Recommended answer


All of the hadoop-* JARs need to match 100% on version; otherwise, you will see stack traces like this one.
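This version-matching requirement can be checked mechanically. A minimal sketch, using illustrative filenames that deliberately mismatch (Spark 2.4.3's bundled Hadoop 2.7 jars next to the copied-in hadoop-aws-3.2.0.jar); in practice, substitute the output of `ls $SPARK_HOME/jars/hadoop-*.jar`:

```shell
# Illustrative stand-ins for a real jar listing; this pair mismatches
# on purpose (2.7.3 from Spark's bundle vs 3.2.0 from hadoop-aws).
jars="hadoop-common-2.7.3.jar hadoop-aws-3.2.0.jar"

# Strip each name down to its version suffix, then de-duplicate.
versions=$(for j in $jars; do
  echo "$j" | sed -E 's/.*-([0-9]+\.[0-9]+\.[0-9]+)\.jar$/\1/'
done | sort -u)

# More than one distinct version means a mismatch.
if [ "$(echo "$versions" | wc -l)" -gt 1 ]; then
  echo "version mismatch among hadoop-* jars:"
  echo "$versions"
else
  echo "hadoop-* jars consistent: $versions"
fi
```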

For more information, reread the reference above.

That concludes this article on the Spark s3a java.lang.NoSuchMethodError. We hope the recommended answer helps.


09-06 21:54