Usage of the org.apache.spark.SparkContext.jarOfClass() method, with code examples

x33g5p2x · reposted 2022-01-30

This article collects Java code examples for the org.apache.spark.SparkContext.jarOfClass() method and shows how it is used in practice. The examples were extracted from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of SparkContext.jarOfClass():
Package path: org.apache.spark.SparkContext
Class name: SparkContext
Method name: jarOfClass

About SparkContext.jarOfClass

SparkContext.jarOfClass(Class<?> cls) is a static helper that returns the path of the JAR file from which the given class was loaded, wrapped in a Scala Option&lt;String&gt;. The Option is empty when the class was not loaded from a JAR (for example, when running from a directory of .class files). It is typically used to locate an application's own JAR so the path can be passed to Spark and shipped to executors.
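Conceptually, jarOfClass asks the JVM where a class's bytecode came from. A rough plain-Java sketch of that lookup, using the standard ProtectionDomain/CodeSource API (this is illustrative, not Spark's actual implementation):

```java
import java.security.CodeSource;
import java.util.Optional;

public class JarOfClassDemo {
    // Rough plain-Java equivalent of SparkContext.jarOfClass: ask the
    // class's ProtectionDomain where its bytecode was loaded from.
    static Optional<String> jarOfClass(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        if (src == null || src.getLocation() == null) {
            return Optional.empty(); // e.g. JDK bootstrap classes
        }
        return Optional.of(src.getLocation().getPath());
    }

    public static void main(String[] args) {
        // Our own class has a code source (a directory or JAR on the classpath).
        System.out.println("JarOfClassDemo -> " + jarOfClass(JarOfClassDemo.class));
        // Bootstrap classes such as String usually have none.
        System.out.println("String -> " + jarOfClass(String.class));
    }
}
```

Like Spark's version, the caller must handle the empty case rather than assume a JAR path exists.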

Code examples

Example source: apache/hive

public static String findKryoRegistratorJar(HiveConf conf) throws FileNotFoundException {
 // find the jar in local maven repo for testing
 if (HiveConf.getBoolVar(conf, HiveConf.ConfVars.HIVE_IN_TEST)) {
  String repo = System.getProperty("maven.local.repository");
  String version = System.getProperty("hive.version");
  String jarName = HIVE_KRYO_REG_JAR_NAME + "-" + version + ".jar";
  String[] parts = new String[]{repo, "org", "apache", "hive",
    HIVE_KRYO_REG_JAR_NAME, version, jarName};
  String jar = Joiner.on(File.separator).join(parts);
  if (!new File(jar).exists()) {
   throw new FileNotFoundException(jar + " doesn't exist.");
  }
  return jar;
 }
 Option<String> option = SparkContext.jarOfClass(SparkClientUtilities.class);
 if (!option.isDefined()) {
  throw new FileNotFoundException("Cannot find the path to hive-exec.jar");
 }
 File path = new File(option.get());
 File[] jars = path.getParentFile().listFiles((dir, name) ->
   name.startsWith(HIVE_KRYO_REG_JAR_NAME));
 if (jars != null && jars.length > 0) {
  return jars[0].getAbsolutePath();
 }
 throw new FileNotFoundException("Cannot find the " + HIVE_KRYO_REG_JAR_NAME +
   " jar under " + path.getParent());
}
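The fallback branch above uses jarOfClass only to find the directory containing hive-exec.jar, then scans that directory for a sibling JAR by name prefix. That scan can be reproduced with just the JDK; a minimal sketch using a hypothetical jar name to exercise it:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class PrefixScanDemo {
    // Return the absolute path of the first file in `dir` whose name starts
    // with `prefix`, or null if none matches -- the same directory scan the
    // Hive example performs next to hive-exec.jar.
    static String findJarByPrefix(File dir, String prefix) {
        File[] hits = dir.listFiles((d, name) -> name.startsWith(prefix));
        return (hits != null && hits.length > 0) ? hits[0].getAbsolutePath() : null;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical jar name, created in a temp directory just for the demo.
        File dir = Files.createTempDirectory("jars").toFile();
        new File(dir, "hive-kryo-registrator-3.1.2.jar").createNewFile();
        System.out.println(findJarByPrefix(dir, "hive-kryo-registrator") != null
                ? "found" : "missing");
    }
}
```

Note that listFiles returns null (not an empty array) on I/O errors or non-directories, which is why the Hive code checks `jars != null` before indexing.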

Example source: apache/hive

// Fragment: falls through to jarOfClass when no jar was configured explicitly.
if (SparkContext.jarOfClass(this.getClass()).isDefined()) {
  jar = SparkContext.jarOfClass(this.getClass()).get();
}
