This article collects code examples of the Java method org.apache.spark.api.java.JavaRDD.flatMapToDouble(), showing how JavaRDD.flatMapToDouble() is used in practice. The examples are taken from selected open-source projects found on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of JavaRDD.flatMapToDouble():
Package: org.apache.spark.api.java
Class: JavaRDD
Method: flatMapToDouble
Description: none provided in the source. The method maps each element of the RDD to zero or more doubles and returns a JavaDoubleRDD.
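Before the full Spark tests below, it may help to see what the lambda passed to flatMapToDouble actually computes. The same word-length expansion can be reproduced with plain java.util.stream, since Stream.flatMapToDouble is the JDK analogue of JavaRDD.flatMapToDouble; this sketch runs without any Spark context, and the class name WordLengths is just for illustration:

```java
import java.util.Arrays;
import java.util.stream.Stream;

public class WordLengths {
  // Plain-JDK analogue of the flatMapToDouble lambda in the Spark tests:
  // each line expands to one double per word (the word's length).
  static double[] wordLengths(Stream<String> lines) {
    return lines
        .flatMapToDouble(s -> Arrays.stream(s.split(" "))
            .mapToDouble(String::length))
        .toArray();
  }

  public static void main(String[] args) {
    // Same two input lines as the Spark tests below.
    double[] lengths = wordLengths(Stream.of("Hello World!",
        "The quick brown fox jumps over the lazy dog."));
    System.out.println(lengths.length); // 11 words in total
    System.out.println(lengths[0]);     // 5.0 -- length of "Hello"
  }
}
```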
Example source: origin: org.apache.spark/spark-core_2.11 (the same test also appears in spark-core and spark-core_2.10)

@Test
public void flatMap() {
  JavaRDD<String> rdd = sc.parallelize(Arrays.asList("Hello World!",
      "The quick brown fox jumps over the lazy dog."));

  // flatMap: each line expands to many words
  JavaRDD<String> words = rdd.flatMap(x -> Arrays.asList(x.split(" ")).iterator());
  Assert.assertEquals("Hello", words.first());
  Assert.assertEquals(11, words.count());

  // flatMapToPair: each line expands to many (word, word) tuples
  JavaPairRDD<String, String> pairs = rdd.flatMapToPair(s -> {
    List<Tuple2<String, String>> pairs2 = new LinkedList<>();
    for (String word : s.split(" ")) {
      pairs2.add(new Tuple2<>(word, word));
    }
    return pairs2.iterator();
  });
  Assert.assertEquals(new Tuple2<>("Hello", "Hello"), pairs.first());
  Assert.assertEquals(11, pairs.count());

  // flatMapToDouble: each line expands to many doubles (the word lengths),
  // producing a JavaDoubleRDD
  JavaDoubleRDD doubles = rdd.flatMapToDouble(s -> {
    List<Double> lengths = new LinkedList<>();
    for (String word : s.split(" ")) {
      lengths.add((double) word.length());
    }
    return lengths.iterator();
  });
  Assert.assertEquals(5.0, doubles.first(), 0.01);
  Assert.assertEquals(11, doubles.count());
}
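It is worth noting why one would reach for flatMapToDouble at all: the JavaDoubleRDD it returns adds numeric methods such as sum(), mean(), and stats() that a plain JavaRDD<Double> lacks. The JDK's DoubleStream offers the analogous summaryStatistics(); below is a minimal local sketch over the same eleven word lengths (values hand-computed from the two test sentences; the class name LengthStats is just for illustration):

```java
import java.util.DoubleSummaryStatistics;
import java.util.stream.DoubleStream;

public class LengthStats {
  // Summarize a set of doubles, analogous to what JavaDoubleRDD.stats()
  // provides over a distributed collection in Spark.
  static DoubleSummaryStatistics summarize(double... values) {
    return DoubleStream.of(values).summaryStatistics();
  }

  public static void main(String[] args) {
    // The 11 word lengths the flatMapToDouble lambda above produces:
    // "Hello"(5) "World!"(6) "The"(3) "quick"(5) "brown"(5) "fox"(3)
    // "jumps"(5) "over"(4) "the"(3) "lazy"(4) "dog."(4)
    DoubleSummaryStatistics stats = summarize(5, 6, 3, 5, 5, 3, 5, 4, 3, 4, 4);
    System.out.println(stats.getCount()); // 11
    System.out.println(stats.getSum());   // 47.0
  }
}
```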
Example source: origin: org.apache.spark/spark-core_2.10 (this variant, using statically imported assertEquals, also appears in spark-core and spark-core_2.11)

@Test
public void flatMap() {
  JavaRDD<String> rdd = sc.parallelize(Arrays.asList("Hello World!",
      "The quick brown fox jumps over the lazy dog."));

  // flatMap: each line expands to many words
  JavaRDD<String> words = rdd.flatMap(x -> Arrays.asList(x.split(" ")).iterator());
  assertEquals("Hello", words.first());
  assertEquals(11, words.count());

  // flatMapToPair: each line expands to many (word, word) tuples
  JavaPairRDD<String, String> pairsRDD = rdd.flatMapToPair(s -> {
    List<Tuple2<String, String>> pairs = new LinkedList<>();
    for (String word : s.split(" ")) {
      pairs.add(new Tuple2<>(word, word));
    }
    return pairs.iterator();
  });
  assertEquals(new Tuple2<>("Hello", "Hello"), pairsRDD.first());
  assertEquals(11, pairsRDD.count());

  // flatMapToDouble: each line expands to many doubles (the word lengths),
  // producing a JavaDoubleRDD
  JavaDoubleRDD doubles = rdd.flatMapToDouble(s -> {
    List<Double> lengths = new LinkedList<>();
    for (String word : s.split(" ")) {
      lengths.add((double) word.length());
    }
    return lengths.iterator();
  });
  assertEquals(5.0, doubles.first(), 0.01);
  assertEquals(11, doubles.count());
}