This article looks at how to handle java.lang.NoSuchMethodException: <Class>.<init>(java.lang.String) when copying a custom Transformer, which may be useful as a reference if you run into the same problem.

Problem description

I am currently playing with custom transformers in my spark-shell, using both Spark 2.0.1 and 2.2.1.

While writing a custom ML transformer in order to add it to a pipeline, I noticed that there is an issue with the override of the copy method.

In my case, the copy method is called by the fit method of the TrainValidationSplit.
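
For context, a minimal sketch of how a transformer like this might end up inside a TrainValidationSplit is shown below; the LinearRegression estimator, evaluator and param grid are illustrative placeholders, not part of the original post:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.evaluation.RegressionEvaluator
import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.ml.tuning.{ParamGridBuilder, TrainValidationSplit}

// Hypothetical downstream estimator so the pipeline has something to tune.
val lr = new LinearRegression().setLabelCol("views").setFeaturesCol("features")
val pipeline = new Pipeline().setStages(Array(new Custom(), lr))

val tvs = new TrainValidationSplit()
  .setEstimator(pipeline)
  .setEvaluator(new RegressionEvaluator().setLabelCol("views"))
  .setEstimatorParamMaps(new ParamGridBuilder().addGrid(lr.regParam, Array(0.01, 0.1)).build())
  .setTrainRatio(0.8)

// fit copies the estimator, and the Pipeline copy in turn calls copy(extra) on every stage,
// which is where defaultCopy's reflective constructor lookup fails.
// val model = tvs.fit(trainingData) // trainingData assumed to carry a "features" column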

The error I get:

java.lang.NoSuchMethodException: Custom.<init>(java.lang.String)
  at java.lang.Class.getConstructor0(Class.java:3082)
  at java.lang.Class.getConstructor(Class.java:1825)
  at org.apache.spark.ml.param.Params$class.defaultCopy(params.scala:718)
  at org.apache.spark.ml.PipelineStage.defaultCopy(Pipeline.scala:42)
  at Custom.copy(<console>:16)
  ... 48 elided

I then tried to call the copy method directly, but I still get the same error.

Here is my class and the call I perform:

import org.apache.spark.ml.Transformer
import org.apache.spark.sql.{Dataset, DataFrame}
import org.apache.spark.sql.types.{StructField, StructType, DataTypes}
import org.apache.spark.ml.param.{Param, ParamMap}

// Simple DF
val doubles = Seq((0, 5d, 100d), (1, 4d, 500d), (2, 9d, 700d)).toDF("id", "rating", "views")

class Custom(override val uid: String) extends org.apache.spark.ml.Transformer {
  def this() = this(org.apache.spark.ml.util.Identifiable.randomUID("custom"))

  def copy(extra: org.apache.spark.ml.param.ParamMap): Custom = {
    defaultCopy(extra)
  }

  override def transformSchema(schema: org.apache.spark.sql.types.StructType): org.apache.spark.sql.types.StructType = {
    schema.add(org.apache.spark.sql.types.StructField("trending", org.apache.spark.sql.types.IntegerType, false))
  }

  def transform(df: org.apache.spark.sql.Dataset[_]): org.apache.spark.sql.DataFrame = {
    df.withColumn("trending", (df.col("rating") > 4 && df.col("views") > 40))
  }
}


val mycustom = new Custom("Custom")
// This call throws the exception.
mycustom.copy(new org.apache.spark.ml.param.ParamMap())

Does anyone know if this is a known issue? I can't seem to find it anywhere.

Is there another way to implement the copy method in a custom transformer?

Thanks

Recommended answer

These are a couple of things I would change about your custom Transformer (also to enable SerDe operations of your PipelineModel):

  • Implement the DefaultParamsWritable trait
  • Add a companion object that extends the DefaultParamsReadable trait

For example:

class Custom(override val uid: String) extends Transformer
    with DefaultParamsWritable {
  ...
  ...
}

object Custom extends DefaultParamsReadable[Custom]
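
A fuller sketch combining these two additions with the transformer from the question might look like the following. Note that the trending column is declared as BooleanType here, since > and && produce a boolean column, whereas the question's transformSchema declared IntegerType:

import org.apache.spark.ml.Transformer
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.ml.util.{DefaultParamsReadable, DefaultParamsWritable, Identifiable}
import org.apache.spark.sql.{DataFrame, Dataset}
import org.apache.spark.sql.types.{BooleanType, StructField, StructType}

class Custom(override val uid: String) extends Transformer with DefaultParamsWritable {

  def this() = this(Identifiable.randomUID("custom"))

  override def copy(extra: ParamMap): Custom = defaultCopy(extra)

  override def transformSchema(schema: StructType): StructType =
    schema.add(StructField("trending", BooleanType, nullable = false))

  override def transform(df: Dataset[_]): DataFrame =
    df.withColumn("trending", df.col("rating") > 4 && df.col("views") > 40)
}

// The companion object makes Custom.load(path) work after saving a fitted PipelineModel.
object Custom extends DefaultParamsReadable[Custom]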

Do take a look at UnaryTransformer if you have only one input column and one output column.
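
For illustration, a single-column variant built on UnaryTransformer could look like the sketch below; the class name, column logic and threshold are made up for the example and are not from the original post:

import org.apache.spark.ml.UnaryTransformer
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.ml.util.{DefaultParamsReadable, DefaultParamsWritable, Identifiable}
import org.apache.spark.sql.types.{BooleanType, DataType, DoubleType}

// Flags whether a single numeric column exceeds a fixed threshold (illustrative logic only).
class ThresholdFlagger(override val uid: String)
    extends UnaryTransformer[Double, Boolean, ThresholdFlagger] with DefaultParamsWritable {

  def this() = this(Identifiable.randomUID("thresholdFlagger"))

  override protected def createTransformFunc: Double => Boolean = _ > 4.0

  override protected def validateInputType(inputType: DataType): Unit =
    require(inputType == DoubleType, s"Input type must be DoubleType but got $inputType")

  override protected def outputDataType: DataType = BooleanType

  override def copy(extra: ParamMap): ThresholdFlagger = defaultCopy(extra)
}

object ThresholdFlagger extends DefaultParamsReadable[ThresholdFlagger]

// Usage: new ThresholdFlagger().setInputCol("rating").setOutputCol("trending").transform(doubles)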

Finally, what exactly is the need to call mycustom.copy(new ParamMap())?
