This article looks at the Spark 2.3.0 Netty version problem: NoSuchMethodError io.netty.buffer.PooledByteBufAllocator.metric(), and how to deal with it.

Problem description

I just upgraded my Spark project from 2.2.1 to 2.3.0, only to hit the versioning exception below. I have dependencies on spark-cassandra-connector 2.0.7 and cassandra-driver-core 3.4.0 from DataStax, which in turn depend on Netty 4.x, whereas Spark 2.3.0 uses 3.9.x.

The class raising the exception, org.apache.spark.network.util.NettyMemoryMetrics, was introduced in Spark 2.3.0.

Is downgrading my Cassandra dependencies the only way around the exception? Thanks!

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)

Recommended answer

It seems like you are using a "too old" Netty 4 version. Maybe you have multiple Netty jars on your classpath? Having Netty 4.x and 3.x together on the classpath should not be a problem (they use different package names), but mixing two Netty 4 versions will conflict: PooledByteBufAllocator.metric() only exists in Netty 4.1.x, so if an older 4.0.x jar (for example one pulled in transitively by the Cassandra driver) shadows the 4.1.x version Spark 2.3.0 needs, you get exactly this NoSuchMethodError.
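A quick way to confirm which Netty actually gets loaded is a small reflection check. This is a diagnostic sketch, not part of the original answer; the class name `io.netty.buffer.PooledByteBufAllocator` and the `metric()` method are real Netty APIs, while the class and method names of the checker itself are made up for illustration:

```java
public class NettyVersionCheck {

    // Reports whether the Netty jar the JVM actually loads
    // provides PooledByteBufAllocator.metric() (added in Netty 4.1).
    static String check() {
        try {
            Class<?> alloc = Class.forName("io.netty.buffer.PooledByteBufAllocator");
            alloc.getMethod("metric"); // throws if the loaded jar is pre-4.1
            // Print the jar the winning class came from.
            return "ok: " + alloc.getProtectionDomain().getCodeSource().getLocation();
        } catch (ClassNotFoundException e) {
            return "missing: Netty is not on the classpath";
        } catch (NoSuchMethodException e) {
            return "old: the loaded Netty has no metric(), i.e. it is pre-4.1";
        }
    }

    public static void main(String[] args) {
        System.out.println(check());
    }
}
```

Running this with the application's classpath shows which jar "wins"; `mvn dependency:tree -Dincludes=io.netty` then shows where the losing version comes from. Pinning Netty to the 4.1.x line that Spark 2.3.0 ships (for example via Maven's dependencyManagement) is a common alternative to downgrading the Cassandra dependencies.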

That concludes this article on the Spark 2.3.0 Netty version problem: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric(). Hopefully the recommended answer above helps.
