Usage of the org.apache.spark.SparkContext.cancelAllJobs() method, with code examples

x33g5p2x · Reposted 2022-01-30 in Other

This article collects code examples of the Java method org.apache.spark.SparkContext.cancelAllJobs(), showing how SparkContext.cancelAllJobs() is used in practice. The examples are drawn from selected open-source projects hosted on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of SparkContext.cancelAllJobs() are as follows:
Package: org.apache.spark.SparkContext
Class: SparkContext
Method: cancelAllJobs

About SparkContext.cancelAllJobs

Cancels all jobs that have been scheduled or are currently running in this SparkContext. It is typically called from a thread other than the one that submitted the jobs, for example a monitoring or shutdown thread.
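Because the examples below depend on a live SparkContext, here is a minimal, Spark-free sketch of the same cancel-everything pattern using a plain Java ExecutorService. CancelAllDemo and cancelAll are illustrative names and not part of any Spark API; shutdownNow() plays the role that cancelAllJobs() plays in Spark.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class CancelAllDemo {
    // Interrupt every queued or running task, then wait for the pool to wind down.
    static boolean cancelAll(ExecutorService pool) throws InterruptedException {
        pool.shutdownNow();                             // analogous to sc.cancelAllJobs()
        return pool.awaitTermination(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        for (int i = 0; i < 2; i++) {
            pool.submit(() -> {
                try {
                    TimeUnit.HOURS.sleep(1);            // stand-in for a long-running job
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // the "job" observed the cancellation
                }
            });
        }
        System.out.println("all cancelled: " + cancelAll(pool));
    }
}
```

As with cancelAllJobs(), cancellation here is cooperative: the running task only stops because it reacts to the interrupt.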

Code examples

Code example origin: twosigma/beakerx

public void cancelAllJobs() {
  getSparkSession().sparkContext().cancelAllJobs();
}

Code example origin: uber/marmaray

private void monitorTimeout() {
  try {
    while (true) {
      FREQUENCY_UNIT.sleep(FREQUENCY_IN_MINS);
      log.info("Checking whether the job or any Spark stage has timed out...");
      if (jobTimeout()) {
        log.error("The spark job is taking longer than {} ms. Cancelling all jobs...",
            this.jobTimeoutMillis);
        this.sc.cancelAllJobs();
        throw new TimeoutException("The spark job is timing out");
      }
      final List<Stage> stalledStages = this.stalledStages();
      if (stalledStages.size() > 0) {
        for (Stage stage: stalledStages) {
          log.error("Cancelling stage {}-{} and its related jobs due to inactivity... details: {}",
              stage.id(), stage.name(), stage.details());
          this.sc.cancelStage(stage.id());
        }
      }
      log.info("The job and all stages are running fine within the timeout limits.");
    }
  } catch (InterruptedException | TimeoutException e) {
    log.info("Shutting down timeout monitor thread");
    throw new JobRuntimeException(e);
  }
}
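The monitor above boils down to a generic poll-and-cancel loop. The following stdlib-only sketch isolates that pattern; TimeoutMonitor, checkIntervalMs, and the two callbacks are hypothetical stand-ins for FREQUENCY_IN_MINS, jobTimeout(), and sc.cancelAllJobs(), not names from marmaray.

```java
import java.util.concurrent.TimeUnit;
import java.util.function.BooleanSupplier;

public class TimeoutMonitor {
    // Polls `timedOut` at a fixed interval; runs `cancel` once if it ever fires.
    static boolean monitor(BooleanSupplier timedOut, Runnable cancel,
                           long checkIntervalMs, int maxChecks) throws InterruptedException {
        for (int i = 0; i < maxChecks; i++) {
            TimeUnit.MILLISECONDS.sleep(checkIntervalMs);
            if (timedOut.getAsBoolean()) {
                cancel.run();   // e.g. sc.cancelAllJobs() in the Spark version
                return true;    // timed out; work was cancelled
            }
        }
        return false;           // completed all checks without timing out
    }

    public static void main(String[] args) throws InterruptedException {
        long deadline = System.currentTimeMillis() + 50;
        boolean fired = monitor(() -> System.currentTimeMillis() > deadline,
                () -> System.out.println("cancelling all jobs"), 20, 10);
        System.out.println("timed out: " + fired);
    }
}
```

marmaray's version runs this loop on a dedicated thread and converts a timeout into a JobRuntimeException so the driver fails fast instead of hanging.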

Code example origin: dibbhatt/kafka-spark-consumer

try {
  jsc.awaitTermination();
} catch (Exception ex) {
  // On failure, cancel any outstanding Spark jobs before shutting down.
  jsc.ssc().sc().cancelAllJobs();
  jsc.stop(true, false);
  System.exit(-1);
}