Usage of the org.apache.spark.api.java.JavaRDD.countAsync() method, with code examples


This article collects code examples for the Java method org.apache.spark.api.java.JavaRDD.countAsync() and shows how it is used in practice. The examples are taken from selected projects hosted on GitHub/Stack Overflow/Maven and are intended as practical references. Details of the JavaRDD.countAsync() method:
Package path: org.apache.spark.api.java.JavaRDD
Class name: JavaRDD
Method name: countAsync

About JavaRDD.countAsync

The asynchronous version of count(): it submits a job that counts the number of elements in the RDD and immediately returns a JavaFutureAction<Long>, which can be used to wait for the result, inspect the job IDs, or cancel the job.
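
Since countAsync() returns as soon as the job is submitted, the driver thread is free to do other work until the result is needed. Below is a minimal, self-contained sketch; the class name CountAsyncExample and the local[2] master are illustrative assumptions, not taken from the examples further down.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaFutureAction;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class CountAsyncExample {
  public static void main(String[] args) throws Exception {
    // Illustrative local setup; any existing JavaSparkContext works the same way.
    SparkConf conf = new SparkConf().setAppName("countAsync-example").setMaster("local[2]");
    try (JavaSparkContext sc = new JavaSparkContext(conf)) {
      JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5), 2);

      // Submit the count job; this call returns immediately.
      JavaFutureAction<Long> future = rdd.countAsync();

      // ... the driver could do other work here while the job runs ...

      // Block only when the result is needed; a failed job surfaces
      // as an ExecutionException thrown by get().
      long count = future.get();
      System.out.println("count = " + count);            // 5
      System.out.println("jobIds = " + future.jobIds()); // one job ID for the count
    }
  }
}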

Code examples

Code example source (origin: org.apache.spark/spark-core_2.10); the same test also ships in org.apache.spark/spark-core and spark-core_2.11:

@Test
public void testAsyncActionErrorWrapping() throws Exception {
 List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
 JavaRDD<Integer> rdd = sc.parallelize(data, 1);
 // BuggyMapFunction (defined in the surrounding test suite) throws for every
 // element, so the asynchronous count job is expected to fail.
 JavaFutureAction<Long> future = rdd.map(new BuggyMapFunction<>()).countAsync();
 try {
  future.get(2, TimeUnit.SECONDS);
  fail("Expected future.get() for failed job to throw ExecutionException");
 } catch (ExecutionException ee) {
  // The job failure is wrapped in an ExecutionException whose stack trace
  // still contains the original exception message.
  assertTrue(Throwables.getStackTraceAsString(ee).contains("Custom exception!"));
 }
 assertTrue(future.isDone());
}

Code example source (origin: org.apache.spark/spark-core_2.11); the same test also ships in org.apache.spark/spark-core and spark-core_2.10:

@Test
public void countAsync() throws Exception {
 List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
 JavaRDD<Integer> rdd = sc.parallelize(data, 1);
 JavaFutureAction<Long> future = rdd.countAsync();
 long count = future.get();
 assertEquals(data.size(), count);
 assertFalse(future.isCancelled());
 assertTrue(future.isDone());
 assertEquals(1, future.jobIds().size());
}
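
Because JavaFutureAction extends java.util.concurrent.Future, the same handle can also be used to cancel a long-running count. A hedged sketch, assuming an rdd created as in the examples above:

// Submit the count, then cancel it if it has not finished yet.
JavaFutureAction<Long> future = rdd.countAsync();
if (!future.isDone()) {
  // Asks Spark to cancel the underlying job; a later get() on a
  // cancelled future throws CancellationException.
  future.cancel(true);
}
System.out.println("cancelled = " + future.isCancelled());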
