Usage and code examples of the org.apache.spark.SparkContext.broadcast() method

x33g5p2x · reposted 2022-01-30 in: Other

This article collects Java code examples of the org.apache.spark.SparkContext.broadcast() method, showing how SparkContext.broadcast() is used in practice. The examples come from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, so they should be useful as references. Details of SparkContext.broadcast() follow:
Package path: org.apache.spark.SparkContext
Class name: SparkContext
Method name: broadcast

SparkContext.broadcast overview

SparkContext.broadcast(value, classTag) ships a read-only value to the cluster once and returns a Broadcast<T> handle; tasks then read the value with Broadcast.value(). Because this is the Scala-facing API, a ClassTag must be supplied explicitly when calling it from Java, which is why the examples below build one with ClassTag$.MODULE$.apply(...).
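Before the extracted snippets, here is a minimal end-to-end sketch of calling broadcast() from Java, assuming spark-core is on the classpath; the app name, local master, and class name are placeholders, not from any of the projects below:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.broadcast.Broadcast;
import scala.reflect.ClassTag;
import scala.reflect.ClassTag$;

public class BroadcastDemo {
    // Build a ClassTag for a runtime class, as the Scala-facing API requires.
    static <T> ClassTag<T> tagOf(Class<T> cls) {
        return ClassTag$.MODULE$.apply(cls);
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("broadcast-demo")   // placeholder app name
                .setMaster("local[*]");         // local master for this sketch
        SparkContext sc = new SparkContext(conf);

        // Ship the value to the executors once; tasks read it via value().
        Broadcast<String> greeting = sc.broadcast("hello", tagOf(String.class));
        System.out.println(greeting.value());

        sc.stop();
    }
}
```

In application code that already uses the Java API, JavaSparkContext.broadcast(value) is usually more convenient, since it builds the ClassTag for you.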

Code examples

Example source: locationtech/geowave

public static Broadcast<? extends NumericIndexStrategy> broadcastIndexStrategy(
    SparkContext sc,
    NumericIndexStrategy indexStrategy) {
  // Build a ClassTag for the strategy's runtime class, as the Scala API requires.
  ClassTag<NumericIndexStrategy> indexClassTag =
      scala.reflect.ClassTag$.MODULE$.apply(indexStrategy.getClass());
  Broadcast<NumericIndexStrategy> broadcastStrategy = sc.broadcast(indexStrategy, indexClassTag);
  return broadcastStrategy;
}

Example source: com.stratio.deep/deep-core

public DeepRDD(SparkContext sc, S config) {
  super(sc, scala.collection.Seq$.MODULE$.empty(),
      ClassTag$.MODULE$.<T>apply(config.getEntityClass()));
  config.setRddId(id());
  // Broadcast the job configuration so every partition can read it.
  this.config = sc.broadcast(config, ClassTag$.MODULE$.<S>apply(config.getClass()));
}

Example source: Stratio/deep-spark (identical to the com.stratio.deep/deep-core snippet above)

Example source: locationtech/geowave

// stringTag is a ClassTag<String> defined earlier in the GeoWave source;
// one tag can back any number of String broadcasts.
Broadcast<String> typeName = sc.broadcast(adapter.getTypeName(), stringTag);
Broadcast<String> indexName = sc.broadcast(index.getName(), stringTag);

Example source: uk.gov.gchq.gaffer/spark-accumulo-library

public void doOperation(final ImportRDDOfElements operation, final Context context,
    final AccumuloStore store) throws OperationException {
  final String outputPath = operation.getOption(OUTPUT_PATH);
  if (null == outputPath || outputPath.isEmpty()) {
    throw new OperationException("Option outputPath must be set for this option to be run against the accumulostore");
  }
  final String failurePath = operation.getOption(FAILURE_PATH);
  if (null == failurePath || failurePath.isEmpty()) {
    throw new OperationException("Option failurePath must be set for this option to be run against the accumulostore");
  }
  // Broadcast the Accumulo key converter so the element-conversion function
  // can use it inside distributed tasks.
  final ElementConverterFunction func = new ElementConverterFunction(
      SparkContextUtil.getSparkSession(context, store.getProperties())
          .sparkContext()
          .broadcast(store.getKeyPackage().getKeyConverter(), ACCUMULO_ELEMENT_CONVERTER_CLASS_TAG));
  final RDD<Tuple2<Key, Value>> rdd = operation.getInput().flatMap(func, TUPLE2_CLASS_TAG);
  final ImportKeyValuePairRDDToAccumulo op =
      new ImportKeyValuePairRDDToAccumulo.Builder()
          .input(rdd)
          .failurePath(failurePath)
          .outputPath(outputPath)
          .build();
  store.execute(new OperationChain<>(op), context);
}
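The ElementConverterFunction in the Gaffer example takes the Broadcast handle in its constructor and reads it with value() inside the distributed call: Spark serializes only the lightweight handle with each task, and the payload is fetched once per executor. A minimal sketch of that pattern, assuming spark-core is on the classpath (the class and field names here are illustrative, not Gaffer's):

```java
import java.util.Collections;
import java.util.Iterator;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.broadcast.Broadcast;

// Holds the Broadcast handle; Spark ships the handle, not the value, with the task.
public class PrefixFunction implements FlatMapFunction<String, String> {
    private final Broadcast<String> prefix;

    public PrefixFunction(final Broadcast<String> prefix) {
        this.prefix = prefix;
    }

    @Override
    public Iterator<String> call(final String input) {
        // value() fetches the broadcast payload lazily on the executor.
        return Collections.singletonList(prefix.value() + input).iterator();
    }
}
```

Capturing the Broadcast rather than the raw value keeps the serialized task small and avoids re-sending the converter with every task.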
