Usage and Code Examples of the org.apache.spark.api.java.JavaRDD.glom() Method


This article collects code examples of the org.apache.spark.api.java.JavaRDD.glom() method in Java and shows how JavaRDD.glom() is used in practice. The examples are drawn from selected projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as a useful reference. Details of JavaRDD.glom() are as follows:
Package path: org.apache.spark.api.java.JavaRDD
Class name: JavaRDD
Method name: glom

Introduction to JavaRDD.glom

glom() coalesces all of the elements within each partition of the RDD into a single List and returns a JavaRDD<List<T>> with exactly one element per partition. It is typically used to inspect how data is distributed across partitions, or to process a whole partition at once.
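To make the behavior concrete, here is a minimal, self-contained sketch (the class name GlomExample and the local[2] master are illustrative assumptions, not taken from the examples below):

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class GlomExample {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("GlomExample").setMaster("local[2]");
    JavaSparkContext sc = new JavaSparkContext(conf);

    // Two partitions: [1, 2] and [3, 4]
    JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3, 4), 2);

    // glom() collapses each partition into a single List<Integer>
    JavaRDD<List<Integer>> glommed = rdd.glom();

    System.out.println(glommed.collect());  // [[1, 2], [3, 4]]

    sc.stop();
  }
}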

Code Examples

Code example source: org.apache.spark/spark-core_2.10 (the same test appears verbatim in org.apache.spark/spark-core and spark-core_2.11)

@Test
public void glom() {
 JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3, 4), 2);
 assertEquals("[1, 2]", rdd.glom().first().toString());
}

Code example source: org.apache.spark/spark-core_2.11 (the same test appears verbatim in org.apache.spark/spark-core and spark-core_2.10)

@Test
public void repartition() {
 // Shrinking number of partitions
 JavaRDD<Integer> in1 = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8), 2);
 JavaRDD<Integer> repartitioned1 = in1.repartition(4);
 List<List<Integer>> result1 = repartitioned1.glom().collect();
 assertEquals(4, result1.size());
 for (List<Integer> l : result1) {
  assertFalse(l.isEmpty());
 }
 // Growing number of partitions
 JavaRDD<Integer> in2 = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8), 4);
 JavaRDD<Integer> repartitioned2 = in2.repartition(2);
 List<List<Integer>> result2 = repartitioned2.glom().collect();
 assertEquals(2, result2.size());
 for (List<Integer> l: result2) {
  assertFalse(l.isEmpty());
 }
}
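
As the test above shows, glom() combined with collect() is a convenient way to check how elements end up distributed across partitions after a repartition. A minimal follow-up sketch along the same lines (the variable names are illustrative and an existing JavaSparkContext sc is assumed), mapping each glommed partition to its size:

JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8), 4);

// glom() yields one List per partition; mapping each List to its size
// gives the number of elements in every partition.
List<Integer> partitionSizes = numbers.glom()
    .map(List::size)
    .collect();

System.out.println(partitionSizes);  // e.g. [2, 2, 2, 2]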

Code example source: usc-isi-i2/Web-Karma

input = input.values().glom().flatMapToPair(
    new PairFlatMapFunction<List<String>, String, String>() {

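A hedged, more complete sketch of the pattern used above (the variable names, the body of call(), and the Spark 2.x PairFlatMapFunction signature returning an Iterator are assumptions, not taken from Web-Karma): glom() hands each whole partition to flatMapToPair() as a single List, which the function then expands back into key/value pairs.

// Assumes an existing JavaSparkContext 'sc' and the usual imports
// (scala.Tuple2, org.apache.spark.api.java.*, org.apache.spark.api.java.function.*, java.util.*).
JavaRDD<String> values = sc.parallelize(Arrays.asList("a", "b", "c", "d"), 2);

JavaPairRDD<String, String> output = values.glom().flatMapToPair(
    new PairFlatMapFunction<List<String>, String, String>() {
      @Override
      public Iterator<Tuple2<String, String>> call(List<String> partition) {
        // The whole partition arrives as a single List, so its records can be
        // processed together; here each record is keyed by its
        // partition-local index (purely illustrative).
        List<Tuple2<String, String>> pairs = new ArrayList<>();
        for (int i = 0; i < partition.size(); i++) {
          pairs.add(new Tuple2<>(String.valueOf(i), partition.get(i)));
        }
        return pairs.iterator();
      }
    });
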
Code example source: spirom/learning-spark-with-java

JavaRDD<List<Double>> partitionsRDD = transformedRDD.glom();
System.out.println("*** We _should_ have 4 partitions");
System.out.println("*** (They can't be of equal size)");
