Fix for "scala.collection.mutable.WrappedArray$ofRef cannot be cast to ..." on Spark 1.5

Below is the help post I sent to the Spark user list; I got the correct answer very quickly. If you run into a problem you cannot solve, you can also ask there.

This works under Spark 1.4.1 but throws an error on Spark 1.5.1. How can I deal with this problem?

// define the schema for the JSON input
import org.apache.spark.sql.types._

val struct = StructType(
  StructField("app_name", StringType, true) ::
    StructField("apply_time", LongType, true) ::
    StructField("final_decision", StringType, true) ::
    StructField("final_score", IntegerType, true) ::
    StructField("partner_code", StringType, true) ::
    StructField("person_info", MapType(StringType, StringType, true), true) ::
    StructField("report_id", StringType, true) ::
    StructField("report_time", LongType, true) ::
    StructField("risk_items", ArrayType(MapType(StringType, StringType, true)), true) ::
    Nil
)

// read the JSON with the schema above and process each row
val rdd2 = sqlContext.read.schema(struct).json(jsonData).map(r => {
  ...
  // this line throws the exception on Spark 1.5.1
  val risk_items = r.getAs[List[Map[String, String]]]("risk_items")
  ...
})

The error message is as follows:

java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to scala.collection.immutable.List

The correct answer came quickly: just change List to Seq. The reason is that Spark returns an ArrayType column as a WrappedArray, which implements Seq but not List, so the cast to List fails; the reply boils down to the one-line change shown below.

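A minimal sketch of the corrected map function, reusing the struct schema, sqlContext, and jsonData input from above; the trailing .size call is only a hypothetical placeholder for whatever you actually do with each row:

// Request a Seq instead of a List: the runtime value of an ArrayType column
// is a scala.collection.mutable.WrappedArray, which is a Seq but not a List.
val rdd2 = sqlContext.read.schema(struct).json(jsonData).map(r => {
  val risk_items = r.getAs[Seq[Map[String, String]]]("risk_items")
  // hypothetical placeholder: count the risk items per row
  risk_items.size
})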

Original post: https://www.cnblogs.com/zhangyunlin/p/6168158.html