Cannot cast org.apache.spark.sql.types.Decimal to org.apache.spark.unsafe.types.UTF8String with Spark/Java

lh80um4z · posted 2021-05-27 in Spark

I have the following DataFrame:

+-------------+-----------------+------------------+
|longitude    |latitude         |geom              |
+-------------+-----------------+------------------+
|-7.07378166  |33.826661        |[00 00 00 00 01 0.|
|-7.5952683   |33.544191        |[00 00 00 00 01 0.|
+-------------+-----------------+------------------+

I am using the following code:

Dataset<Row> result_f = sparkSession.sql("select * from data_f where ST_WITHIN(ST_PointFromText(CAST(data_f.latitude_f AS Decimal(24,20)), CAST(data_f.longitude_f AS Decimal(24,20))), geom)");
result_f.show();

When it runs, I get the following exception:

java.lang.ClassCastException: org.apache.spark.sql.types.Decimal cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.geosparksql.expressions.ST_PointFromText.eval(Constructors.scala:44)
at org.apache.spark.sql.geosparksql.expressions.ST_Within.eval(Predicates.scala:104)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate.And_0$(Unknown Source)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate.eval(Unknown Source)
at org.apache.spark.sql.execution.joins.CartesianProductExec$$anonfun$doExecute$1$$anonfun$2.apply(CartesianProductExec.scala:89)
at org.apache.spark.sql.execution.joins.CartesianProductExec$$anonfun$doExecute$1$$anonfun$2.apply(CartesianProductExec.scala:88)
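For context, the trace points into GeoSpark's `ST_PointFromText` constructor (`Constructors.scala:44`), which evaluates its input as a `UTF8String`: that function expects a delimited coordinate *string*, not numeric values, so the `CAST(... AS Decimal(24,20))` hands it a `Decimal` where it expects a string, producing the `ClassCastException`. A sketch of a likely rewrite, assuming GeoSpark's `ST_Point` constructor (which takes numeric x/y arguments) and keeping the question's `data_f` table and `*_f` column names; note the longitude-first (x, y) axis order here is also an assumption:

```sql
-- ST_Point(x, y) takes numeric arguments, so the Decimal casts can stay;
-- pass longitude as x and latitude as y rather than a coordinate string.
SELECT *
FROM data_f
WHERE ST_Within(
        ST_Point(CAST(data_f.longitude_f AS Decimal(24, 20)),
                 CAST(data_f.latitude_f  AS Decimal(24, 20))),
        data_f.geom);
```

Alternatively, if `ST_PointFromText` must be kept, its arguments would need to be cast to a string (e.g. `CAST(... AS STRING)`) rather than to `Decimal`.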

Any idea what is going wrong?
I would appreciate your help.
Thank you.

