This article collects Java code examples of the org.apache.flink.api.java.operators.UnsortedGrouping.min() method, showing how UnsortedGrouping.min() is used in practice. The examples are extracted from selected open-source projects hosted on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of the UnsortedGrouping.min() method:
Package path: org.apache.flink.api.java.operators.UnsortedGrouping
Class: UnsortedGrouping
Method: min
Description: Syntactic sugar for aggregate(MIN, field).
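Since min(field) is shorthand for aggregate(Aggregations.MIN, field), its per-group behavior can be sketched in plain Java without a Flink dependency. The T3 record and the groupByF1MinF0 helper below are hypothetical stand-ins for Flink's Tuple3 and for ds.groupBy(1).min(0); note that Flink's min() returns whole tuples (with non-aggregated fields not guaranteed to come from the minimal element), whereas this sketch keeps only the aggregated value per key:

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class MinSemanticsSketch {

    // Hypothetical stand-in for Flink's Tuple3<Integer, Long, String>.
    record T3(int f0, long f1, String f2) {}

    // Sketch of what ds.groupBy(1).min(0) computes: the minimum of
    // field 0 within each group keyed by field 1. A TreeMap keeps the
    // keys sorted so the output is deterministic.
    static Map<Long, Integer> groupByF1MinF0(List<T3> data) {
        return data.stream().collect(Collectors.groupingBy(
                T3::f1,
                TreeMap::new,
                Collectors.mapping(T3::f0,
                        Collectors.collectingAndThen(
                                Collectors.minBy(Integer::compare),
                                Optional::get))));
    }

    public static void main(String[] args) {
        List<T3> ds = List.of(
                new T3(1, 1L, "Hi"),
                new T3(2, 2L, "Hello"),
                new T3(3, 2L, "Hello world"));
        System.out.println(groupByF1MinF0(ds)); // {1=1, 2=2}
    }
}
```

This is only a semantic sketch; in a real Flink job the grouping and aggregation run distributed across the cluster, as in the examples below.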
Code example source: apache/flink
@Test
public void testNestedAggregate() throws Exception {
    /*
     * Nested Aggregate
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet<Tuple3<Integer, Long, String>> ds = CollectionDataSets.get3TupleDataSet(env);

    DataSet<Tuple1<Integer>> aggregateDs = ds.groupBy(1)
            .min(0)
            .min(0)
            .project(0);

    List<Tuple1<Integer>> result = aggregateDs.collect();

    String expected = "1\n";

    compareResultAsTuples(result, expected);
}
Code example source: apache/flink
@Test
public void testConnectedComponentsWithParametrizableConvergence() throws Exception {
    // name of the aggregator that checks for convergence
    final String updatedElements = "updated.elements.aggr";

    // the iteration stops if fewer than this number of elements change value
    final long convergenceThreshold = 3;

    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

    DataSet<Tuple2<Long, Long>> initialSolutionSet = env.fromCollection(verticesInput);
    DataSet<Tuple2<Long, Long>> edges = env.fromCollection(edgesInput);

    IterativeDataSet<Tuple2<Long, Long>> iteration = initialSolutionSet.iterate(10);

    // register the convergence criterion
    iteration.registerAggregationConvergenceCriterion(updatedElements,
            new LongSumAggregator(), new UpdatedElementsConvergenceCriterion(convergenceThreshold));

    DataSet<Tuple2<Long, Long>> verticesWithNewComponents = iteration.join(edges).where(0).equalTo(0)
            .with(new NeighborWithComponentIDJoin())
            .groupBy(0).min(1);

    DataSet<Tuple2<Long, Long>> updatedComponentId =
            verticesWithNewComponents.join(iteration).where(0).equalTo(0)
                    .flatMap(new MinimumIdFilter(updatedElements));

    List<Tuple2<Long, Long>> result = iteration.closeWith(updatedComponentId).collect();
    Collections.sort(result, new TestBaseUtils.TupleComparator<Tuple2<Long, Long>>());

    assertEquals(expectedResult, result);
}
Code example source: apache/flink
@Test
public void testDeltaConnectedComponentsWithParametrizableConvergence() throws Exception {
    // name of the aggregator that checks for convergence
    final String updatedElements = "updated.elements.aggr";

    // the iteration stops if fewer than this number of elements change value
    final long convergenceThreshold = 3;

    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

    DataSet<Tuple2<Long, Long>> initialSolutionSet = env.fromCollection(verticesInput);
    DataSet<Tuple2<Long, Long>> edges = env.fromCollection(edgesInput);

    DeltaIteration<Tuple2<Long, Long>, Tuple2<Long, Long>> iteration =
            initialSolutionSet.iterateDelta(initialSolutionSet, 10, 0);

    // register the convergence criterion
    iteration.registerAggregationConvergenceCriterion(updatedElements,
            new LongSumAggregator(), new UpdatedElementsConvergenceCriterion(convergenceThreshold));

    DataSet<Tuple2<Long, Long>> verticesWithNewComponents = iteration.getWorkset().join(edges).where(0).equalTo(0)
            .with(new NeighborWithComponentIDJoin())
            .groupBy(0).min(1);

    DataSet<Tuple2<Long, Long>> updatedComponentId =
            verticesWithNewComponents.join(iteration.getSolutionSet()).where(0).equalTo(0)
                    .flatMap(new MinimumIdFilter(updatedElements));

    List<Tuple2<Long, Long>> result = iteration.closeWith(updatedComponentId, updatedComponentId).collect();
    Collections.sort(result, new TestBaseUtils.TupleComparator<Tuple2<Long, Long>>());

    assertEquals(expectedResult, result);
}
Code example source: apache/flink
.groupBy(0).min(1);
Code example source: apache/flink
.min(1)
        .name("Find Minimum Candidate Id");