Problem description
I am trying to run the standalone app in Scala from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.

This line:

val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)

throws the error

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
logData.flatMap(line => line.split(" ")).map(word => (word, 1))

returns a MappedRDD, but I cannot find this type in http://spark.apache.org/docs/0.9.1/api/core/index.html#org.apache.spark.rdd.RDD
I am running this code from the Spark source, so it could be a classpath problem, but the required dependencies are on my classpath.
Recommended answer
You should import the implicit conversions from SparkContext:
import org.apache.spark.SparkContext._
They use the "pimp my library" pattern to add methods to RDDs of specific types. If curious, see SparkContext:1296.
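To make the fix concrete, here is a minimal sketch of a standalone app with the import in place. The input path and app name are hypothetical, and a local Spark installation (0.9.x/1.x era, matching the docs linked above) is assumed:

```scala
import org.apache.spark.{SparkConf, SparkContext}
// This import brings the implicit conversions into scope, including the one
// that wraps an RDD[(K, V)] in PairRDDFunctions and so adds reduceByKey.
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local")
    val sc = new SparkContext(conf)

    val logData = sc.textFile("README.md") // hypothetical input file

    // With the implicit conversions imported, reduceByKey now resolves.
    val wordCounts = logData
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b)

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}
```

Note that in later Spark versions (1.3 and onward) these conversions were moved so that pair-RDD methods resolve without the explicit import, which is why newer examples omit it.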