Usage and code examples for org.apache.flink.api.java.operators.MapOperator.reduceGroup()


This article collects Java code examples for the org.apache.flink.api.java.operators.MapOperator.reduceGroup() method and shows how MapOperator.reduceGroup() is used in practice. The examples were extracted from selected open-source projects on platforms such as GitHub, Stack Overflow, and Maven, and should make useful references. Details of MapOperator.reduceGroup():
Package path: org.apache.flink.api.java.operators.MapOperator
Class name: MapOperator
Method name: reduceGroup

About MapOperator.reduceGroup

reduceGroup(GroupReduceFunction) applies a GroupReduceFunction to the entire DataSet at once (or to each group, when called on a grouped data set). Unlike reduce(), the function receives all elements through an iterator in a single call and may emit zero, one, or many result elements. MapOperator inherits this method from DataSet, so it can be chained directly after a map() transformation, as the examples below do.
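Since the article's examples rely on a Top1GroupReducer whose implementation is not shown, here is a plain-Java sketch of the reduceGroup contract, with no Flink runtime involved: the function sees the whole group once and may emit any number of elements. The class and method names are illustrative, and the assumption that a "top 1" reducer keeps the largest element is mine.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Sketch of reduceGroup semantics in plain Java (no Flink runtime):
// the function sees the entire group once and may emit any number of results.
public class ReduceGroupSketch {

    // Mimics a hypothetical GroupReduceFunction<String, String> that keeps
    // only the lexicographically largest element of the group.
    static List<String> top1GroupReduce(List<String> group) {
        List<String> out = new ArrayList<>();
        Iterator<String> it = group.iterator();
        if (!it.hasNext()) {
            return out; // an empty group may legitimately emit nothing
        }
        String best = it.next();
        while (it.hasNext()) {
            String next = it.next();
            if (next.compareTo(best) > 0) {
                best = next;
            }
        }
        out.add(best); // emit exactly one element for the whole group
        return out;
    }

    public static void main(String[] args) {
        List<String> result = top1GroupReduce(Arrays.asList("apple", "pear", "banana"));
        System.out.println(result); // [pear]
    }
}
```

The key difference from reduce() is visible here: the function controls how many elements it emits, which is why reduceGroup can return a single element (or none) for a whole data set.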

Code examples

Code example source: apache/flink

@Test
public void testBCVariableClosure() {
  ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

  DataSet<String> input = env.readTextFile(IN_FILE).name("source1");

  // Reduce the whole data set to a single element with a group reduce.
  DataSet<String> reduced = input
      .map(new IdentityMapper<String>())
      .reduceGroup(new Top1GroupReducer<String>());

  // Attach the reduced result as a broadcast set ("bc") outside the iteration ...
  DataSet<String> initialSolution = input.map(new IdentityMapper<String>()).withBroadcastSet(reduced, "bc");

  IterativeDataSet<String> iteration = initialSolution.iterate(100);

  // ... and again inside the iteration under a different name ("red").
  iteration.closeWith(iteration.map(new IdentityMapper<String>()).withBroadcastSet(reduced, "red"))
      .output(new DiscardingOutputFormat<String>());

  Plan plan = env.createProgramPlan();

  try {
    compileNoStats(plan);
  } catch (Exception e) {
    e.printStackTrace();
    Assert.fail(e.getMessage());
  }
}
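The test above feeds the reduceGroup result into withBroadcastSet twice, under the names "bc" and "red". As a rough plain-Java sketch of what a broadcast set provides, every parallel instance of an operator gets read-only access to the complete (small) broadcast data set alongside its own partition of the main input. All names below are illustrative, not Flink API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of broadcast-set semantics: a mapper working on one partition of the
// main input can consult the full broadcast data set (here used as a lookup).
public class BroadcastSketch {

    static List<String> mapWithBroadcast(List<String> partition, List<String> broadcast) {
        // The broadcast set is small enough to materialize on every worker.
        Set<String> lookup = new HashSet<>(broadcast);
        List<String> out = new ArrayList<>();
        for (String s : partition) {
            // Example use: keep only elements that appear in the broadcast set.
            if (lookup.contains(s)) {
                out.add(s);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> out = mapWithBroadcast(
            Arrays.asList("a", "b", "c"),   // this worker's partition
            Arrays.asList("b", "c"));        // full broadcast set
        System.out.println(out); // [b, c]
    }
}
```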

Code example source: apache/flink

.reduceGroup(new Top1GroupReducer<String>())
  .withBroadcastSet(input3, "bc");

Code example source: apache/flink

.reduceGroup(new Top1GroupReducer<String>());

Code example source: org.gradoop/gradoop-flink

/**
 * Creates a new data source. Paths can be local (file://) or HDFS (hdfs://).
 *
 * @param tlfPath tlf data file
 * @param tlfVertexDictionaryPath tlf vertex dictionary file
 * @param tlfEdgeDictionaryPath tlf edge dictionary file
 * @param config Gradoop Flink configuration
 */
public TLFDataSource(String tlfPath, String tlfVertexDictionaryPath,
 String tlfEdgeDictionaryPath, GradoopFlinkConfig config) throws Exception {
 super(tlfPath, tlfVertexDictionaryPath, tlfEdgeDictionaryPath, config);
 ExecutionEnvironment env = config.getExecutionEnvironment();
 if (hasVertexDictionary()) {
  DataSet<Map<Integer, String>> dictionary = env.createInput(HadoopInputs.readHadoopFile(
   new TextInputFormat(), LongWritable.class, Text.class, getTLFVertexDictionaryPath()))
    .filter(t -> !t.f1.toString().isEmpty())
    .map(new DictionaryEntry())
    .reduceGroup(new Dictionary());
  setVertexDictionary(dictionary);
 }
 if (hasEdgeDictionary()) {
  DataSet<Map<Integer, String>> dictionary = env.createInput(HadoopInputs.readHadoopFile(
   new TextInputFormat(), LongWritable.class, Text.class, getTLFEdgeDictionaryPath()))
    .filter(t -> !t.f1.toString().isEmpty())
    .map(new DictionaryEntry())
    .reduceGroup(new Dictionary());
  setEdgeDictionary(dictionary);
 }
}
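In this data source, reduceGroup(new Dictionary()) collapses all parsed dictionary entries into a single Map&lt;Integer, String&gt; — one output element for the whole group, which a pairwise reduce() could not produce directly. Gradoop's actual Dictionary and DictionaryEntry implementations are not shown in the source, so the following plain-Java sketch only illustrates the shape of such a group reducer; the consecutive-id assignment is my assumption:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

// Plain-Java sketch of a Dictionary-style group reduce: collect every label
// of the group into one Map<Integer, String>. Illustrative only - not the
// actual Gradoop Dictionary class.
public class DictionarySketch {

    static Map<Integer, String> buildDictionary(Iterable<String> labels) {
        Map<Integer, String> dictionary = new LinkedHashMap<>();
        int id = 0; // assumed: ids are assigned consecutively in input order
        for (String label : labels) {
            dictionary.put(id++, label);
        }
        return dictionary; // a single Map for the entire group
    }

    public static void main(String[] args) {
        Map<Integer, String> d = buildDictionary(Arrays.asList("person", "knows"));
        System.out.println(d); // {0=person, 1=knows}
    }
}
```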

Code example source: dbs-leipzig/gradoop (identical to the org.gradoop/gradoop-flink example above)
