Spark double-type addition/subtraction error

p1tboqfb  posted 2021-05-27 in Spark

This must be a known issue, but I can't find any information about it:

spark.sql("""
 select 48.85 + 6.95 + -55.80 x, 
        '48.85' + '6.95' + '-55.80' y,
        cast('48.85' as double) + cast('6.95' as double) + cast('-55.80' as double) z
""").show()
+----+--------------------+--------------------+
|   x|                   y|                   z|
+----+--------------------+--------------------+
|0.00|7.105427357601002...|7.105427357601002...|
+----+--------------------+--------------------+

I'm on AWS EMR, Spark 2.4.4.

iibxawm4 1#

Your result isn't actually 7.10..., it's 7.10...E-15; the default show() truncates long values and hides the exponent.

val df = spark.sql("""
 select 48.85 + 6.95 + -55.80 x, 
        '48.85' + '6.95' + '-55.80' y,
        cast('48.85' as double) + cast('6.95' as double) + cast('-55.80' as double) z
""")
df.printSchema()
df.show(false)

root
 |-- x: decimal(6,2) (nullable = true)
 |-- y: double (nullable = true)
 |-- z: double (nullable = true)

+----+---------------------+---------------------+
|x   |y                    |z                    |
+----+---------------------+---------------------+
|0.00|7.105427357601002E-15|7.105427357601002E-15|
+----+---------------------+---------------------+
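
The same value falls out of plain JVM double arithmetic, so this is not Spark-specific; a quick illustrative check in the Scala REPL:

println(48.85 + 6.95 + -55.80)   // should print 7.105427357601002E-15, the same value as columns y and z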

As you can see, this comes from the double type. If you need exact values, use the decimal type, as in column x.
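
For example, casting the string values to decimal instead of double should give exactly 0.00 (a minimal sketch; the precision decimal(10,2) and the alias exact_sum are just illustrative choices):

spark.sql("""
 select cast('48.85' as decimal(10,2))
      + cast('6.95' as decimal(10,2))
      + cast('-55.80' as decimal(10,2)) as exact_sum
""").show()

Decimal arithmetic keeps a fixed scale, so the intermediate 55.80 and the final 0.00 are represented exactly rather than rounded to the nearest binary double.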
