Why can't I find spark.shuffle.minNumPartitionsToHighlyCompress in the official Spark 2.4.6 configuration documentation?

pn9klfpd  posted on 2021-05-29 in Spark

Why isn't spark.shuffle.minNumPartitionsToHighlyCompress listed in the Spark 2.4.6 configuration documentation?
Here is the code from package.scala below:

private[spark] val SHUFFLE_MIN_NUM_PARTS_TO_HIGHLY_COMPRESS =
  ConfigBuilder("spark.shuffle.minNumPartitionsToHighlyCompress")
    .internal()
    .doc("Number of partitions to determine if MapStatus should use HighlyCompressedMapStatus")
    .intConf
    .checkValue(v => v > 0, "The value should be a positive integer.")
    .createWithDefault(2000)
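For context, the threshold in the snippet above gates which MapStatus implementation a shuffle map task reports. A minimal sketch of that decision logic (an assumption for illustration, not Spark's verbatim source; `statusKind` and `threshold` are hypothetical names):

```scala
// Default of spark.shuffle.minNumPartitionsToHighlyCompress in the snippet above.
val threshold = 2000

// Sketch: once the number of shuffle partitions exceeds the threshold,
// Spark would switch from the exact per-partition-size status to the
// compressed, approximate one to keep map-status messages small.
def statusKind(numPartitions: Int): String =
  if (numPartitions > threshold) "HighlyCompressedMapStatus"
  else "CompressedMapStatus"
```

The exact comparison (strict vs. non-strict) and call site may differ between Spark versions; the snippet only illustrates why a positive-integer check on the config value makes sense.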

Will this change in the future?

No answers yet!
