Usage of org.apache.spark.SparkContext.stop(), with code examples

x33g5p2x · Reposted 2022-01-30 · Category: Other

This article collects code examples of the Java method org.apache.spark.SparkContext.stop() and shows how it is used in practice. The examples were extracted from selected open-source projects hosted on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of SparkContext.stop():
Package: org.apache.spark
Class: SparkContext
Method: stop

About SparkContext.stop

stop() shuts down the SparkContext and releases the resources held by the application (executors, the driver's web UI, scheduler threads). Once stopped, the context can no longer be used to run jobs; and since by default only one SparkContext may be active per JVM, an existing context must be stopped before a new one is created. stop() is idempotent, so calling it on an already-stopped context is a harmless no-op.
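Before the extracted examples below, here is a minimal sketch of the typical lifecycle: create a context, run jobs, and stop it in a finally block so resources are released even when a job fails. The class and application names are illustrative.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;

public class StopExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setMaster("local[*]")        // in-process execution, no cluster required
                .setAppName("stop-example");
        SparkContext sc = new SparkContext(conf);
        try {
            // ... submit jobs against sc here ...
        } finally {
            sc.stop();                        // always release resources, even on failure
        }
        System.out.println(sc.isStopped());   // prints: true
    }
}
```

Putting stop() in a finally block (or a shutdown hook) matters in long-running services; several of the examples below follow the same pattern with an explicit null check instead.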

Code examples

Code example source: org.apache.spark/spark-core_2.11 (the same test also appears in org.apache.spark/spark-core)

@Test
public void scalaSparkContext() {
  // Empty Scala collections, created from Java via the Scala companion objects
  List<String> jars = List$.MODULE$.empty();
  Map<String, String> environment = Map$.MODULE$.empty();

  // Each overloaded constructor creates a context, which is stopped immediately
  new SparkContext(new SparkConf().setMaster("local").setAppName("name")).stop();
  new SparkContext("local", "name", new SparkConf()).stop();
  new SparkContext("local", "name").stop();
  new SparkContext("local", "name", "sparkHome").stop();
  new SparkContext("local", "name", "sparkHome", jars).stop();
  new SparkContext("local", "name", "sparkHome", jars, environment).stop();
}

Code example source: twosigma/beakerx

private void applicationStart() {
 this.statusPanel = new SparkUIStatus(() -> getSparkSession().sparkContext().stop());
 this.sparkUIForm.setDomClasses(new ArrayList<>(asList("bx-disabled")));
 add(0, this.statusPanel);
 sendUpdate(SPARK_APP_ID, sparkEngine.getSparkAppId());
 sendUpdate("sparkUiWebUrl", sparkEngine.getSparkUiWebUrl());
 sendUpdate("sparkMasterUrl", sparkEngine.getSparkMasterUrl());
}

Code example source: org.apache.spark/spark-core_2.11 (the same code also appears in org.apache.spark/spark-core)

public static void main(String[] args) throws Exception {
 assertNotEquals(0, args.length);
 assertEquals(args[0], "hello");
 new SparkContext().stop();
 synchronized (LOCK) {
  LOCK.notifyAll();
 }
}

Code example source: apache/tinkerpop (also published as org.apache.tinkerpop/spark-gremlin)

public static void close() {
  NAME_TO_RDD.clear();
  if (null != CONTEXT)
    CONTEXT.stop();
  CONTEXT = null;
}

Code example source: Impetus/Kundera

@Override
public void destroy()
{
  indexManager.close();
  if (schemaManager != null)
  {
    schemaManager.dropSchema();
  }
  if (sparkContext != null)
  {
    logger.info("Closing connection to spark.");
    sparkContext.stop();
    logger.info("Closed connection to spark.");
  }
  else
  {
    logger.warn("Can't close connection to Spark, it was already disconnected");
  }
  externalProperties = null;
  schemaManager = null;
}

Code example source: org.wso2.carbon.analytics/org.wso2.carbon.analytics.spark.core

public void stop() {
  if (this.sqlCtx != null) {
    this.sqlCtx.sparkContext().stop();
  }
}

Code example source: uber/marmaray

@After
public void tearDown() {
  final SparkArgs sparkArgs = getSampleMarmaraySparkArgs();
  // gets existing sc
  this.sparkFactory.get().getSparkContext(sparkArgs).sc().stop();
  this.sparkFactory = Optional.absent();
}

Code example source: io.zipkin.dependencies/zipkin-dependencies-cassandra

public void run() {
 long microsLower = day * 1000;
 long microsUpper = (day * 1000) + TimeUnit.DAYS.toMicros(1) - 1;
 log.info("Running Dependencies job for {}: {} ≤ Span.timestamp {}", dateStamp, microsLower,
   microsUpper);
 SparkContext sc = new SparkContext(conf);
 List<DependencyLink> links = javaFunctions(sc)
  .cassandraTable(keyspace, "traces")
  .spanBy(ROW_TRACE_ID, Long.class)
  .flatMapValues(new CassandraRowsToDependencyLinks(logInitializer, microsLower, microsUpper))
  .values()
  .mapToPair(LINK_TO_PAIR)
  .reduceByKey(MERGE_LINK)
  .values()
  .collect();
 sc.stop();
 saveToCassandra(links);
}

Code example source: io.zipkin.dependencies/zipkin-dependencies-cassandra3 (fragment)

sc.stop();

Code example source: bhdrkn/Java-Examples (fragment)

sparkContext.stop();
sc.stop();
