Usage of the org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter() method, with code examples


This article collects Java code examples for the org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter() method and shows how it is used in practice. The examples are drawn from selected projects published on GitHub, Stack Overflow, and Maven, and are intended as a practical reference. Details of Utilities.createSequenceWriter() are as follows:
Package: org.apache.hadoop.hive.ql.exec
Class: Utilities
Method: createSequenceWriter

About Utilities.createSequenceWriter

Creates a SequenceFile output stream based on the job configuration.
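
As a quick orientation, here is a minimal, hypothetical sketch of calling the method directly. The output path, the BytesWritable key/value choice, and the no-op Progressable are illustrative assumptions, not taken from the projects quoted below.

import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.exec.Utilities;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Progressable;

public class CreateSequenceWriterSketch {
  public static void main(String[] args) throws IOException {
    JobConf jc = new JobConf();
    Path out = new Path("/tmp/example.seq");      // hypothetical output path
    FileSystem fs = out.getFileSystem(jc);
    Progressable progress = () -> { };            // no-op progress reporter

    // Key/value classes must match the Writable types appended below.
    SequenceFile.Writer writer = Utilities.createSequenceWriter(jc, fs, out,
        BytesWritable.class, BytesWritable.class, /* isCompressed */ false, progress);
    try {
      byte[] payload = "hello".getBytes(StandardCharsets.UTF_8);
      writer.append(new BytesWritable(payload), new BytesWritable(payload));
    } finally {
      writer.close();
    }
  }
}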

Code examples

Example source: apache/hive

@Override
public RecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
  Class<? extends Writable> valueClass, boolean isCompressed,
  Properties tableProperties, Progressable progress) throws IOException {
 FileSystem fs = finalOutPath.getFileSystem(jc);
 final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);
 return new PTFRecordWriter(outStream);
}

Example source: apache/hive

// Excerpt: BytesWritable keys paired with the caller-supplied value class.
final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);

Example source: apache/drill

@Override
public RecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
  Class<? extends Writable> valueClass, boolean isCompressed,
  Properties tableProperties, Progressable progress) throws IOException {
 FileSystem fs = finalOutPath.getFileSystem(jc);
 final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);
 return new PTFRecordWriter(outStream);
}

Example source: apache/drill

final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);

Example source: apache/hive

/**
 * Create a sequencefile output stream based on job configuration.
 *
 * @param jc
 *          Job configuration
 * @param fs
 *          File System to create file in
 * @param file
 *          Path to be created
 * @param keyClass
 *          Java Class for key
 * @param valClass
 *          Java Class for value
 * @return output stream over the created sequencefile
 */
public static SequenceFile.Writer createSequenceWriter(JobConf jc, FileSystem fs, Path file,
  Class<?> keyClass, Class<?> valClass, Progressable progressable) throws IOException {
 boolean isCompressed = FileOutputFormat.getCompressOutput(jc);
 return createSequenceWriter(jc, fs, file, keyClass, valClass, isCompressed, progressable);
}
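
This overload does not take an isCompressed flag; it reads the setting back from the job configuration via FileOutputFormat.getCompressOutput(jc). As a hedged caller-side sketch (not part of the Hive source), the flag and codec could be set up front through the standard mapred FileOutputFormat setters:

import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;

public class CompressedOutputConfSketch {
  // Returns a JobConf in which getCompressOutput(jc) is true, so the overload
  // above will create a compressed SequenceFile. GzipCodec is just one choice.
  public static JobConf compressedJobConf() {
    JobConf jc = new JobConf();
    FileOutputFormat.setCompressOutput(jc, true);
    FileOutputFormat.setOutputCompressorClass(jc, GzipCodec.class);
    return jc;
  }
}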

Example source: apache/drill

/**
 * Create a sequencefile output stream based on job configuration.
 *
 * @param jc
 *          Job configuration
 * @param fs
 *          File System to create file in
 * @param file
 *          Path to be created
 * @param keyClass
 *          Java Class for key
 * @param valClass
 *          Java Class for value
 * @return output stream over the created sequencefile
 */
public static SequenceFile.Writer createSequenceWriter(JobConf jc, FileSystem fs, Path file,
  Class<?> keyClass, Class<?> valClass, Progressable progressable) throws IOException {
 boolean isCompressed = FileOutputFormat.getCompressOutput(jc);
 return createSequenceWriter(jc, fs, file, keyClass, valClass, isCompressed, progressable);
}

Example source: apache/hive

// Excerpt: key-only output, HiveKey keys with NullWritable values.
final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  HiveKey.class, NullWritable.class, isCompressed, progress);

Example source: apache/drill

final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  HiveKey.class, NullWritable.class, isCompressed, progress);

Example source: com.facebook.presto.hive/hive-apache

@Override
public RecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
  Class<? extends Writable> valueClass, boolean isCompressed,
  Properties tableProperties, Progressable progress) throws IOException {
 FileSystem fs = finalOutPath.getFileSystem(jc);
 final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);
 return new PTFRecordWriter(outStream);
}

Example source: org.apache.hadoop.hive/hive-exec

final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc,
  fs, finalOutPath, BytesWritable.class, valueClass, isCompressed);

Example source: com.facebook.presto.hive/hive-apache

final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  BytesWritable.class, valueClass, isCompressed, progress);

Example source: com.facebook.presto.hive/hive-apache

/**
 * Create a sequencefile output stream based on job configuration.
 *
 * @param jc
 *          Job configuration
 * @param fs
 *          File System to create file in
 * @param file
 *          Path to be created
 * @param keyClass
 *          Java Class for key
 * @param valClass
 *          Java Class for value
 * @return output stream over the created sequencefile
 */
public static SequenceFile.Writer createSequenceWriter(JobConf jc, FileSystem fs, Path file,
  Class<?> keyClass, Class<?> valClass, Progressable progressable) throws IOException {
 boolean isCompressed = FileOutputFormat.getCompressOutput(jc);
 return createSequenceWriter(jc, fs, file, keyClass, valClass, isCompressed, progressable);
}

Example source: org.apache.hadoop.hive/hive-exec

/**
 * Create a sequencefile output stream based on job configuration.
 *
 * @param jc
 *          Job configuration
 * @param fs
 *          File System to create file in
 * @param file
 *          Path to be created
 * @param keyClass
 *          Java Class for key
 * @param valClass
 *          Java Class for value
 * @return output stream over the created sequencefile
 */
public static SequenceFile.Writer createSequenceWriter(JobConf jc, FileSystem fs, Path file,
  Class<?> keyClass, Class<?> valClass) throws IOException {
 boolean isCompressed = FileOutputFormat.getCompressOutput(jc);
 return createSequenceWriter(jc, fs, file, keyClass, valClass, isCompressed);
}

Example source: org.apache.hadoop.hive/hive-exec

@Override
public RecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
  Class<? extends Writable> valueClass, boolean isCompressed,
  Properties tableProperties, Progressable progress) throws IOException {
 FileSystem fs = finalOutPath.getFileSystem(jc);
 final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc,
   fs, finalOutPath, HiveKey.class, NullWritable.class, isCompressed);
 keyWritable = new HiveKey();
 keyIsText = valueClass.equals(Text.class);
 return new RecordWriter() {
  public void write(Writable r) throws IOException {
   if (keyIsText) {
    Text text = (Text) r;
    keyWritable.set(text.getBytes(), 0, text.getLength());
   } else {
    BytesWritable bw = (BytesWritable) r;
    // Once we drop support for old Hadoop versions, change these
    // to getBytes() and getLength() to fix the deprecation warnings.
    // Not worth a shim.
    keyWritable.set(bw.get(), 0, bw.getSize());
   }
   keyWritable.setHashCode(r.hashCode());
   outStream.append(keyWritable, NULL_WRITABLE);
  }
  public void close(boolean abort) throws IOException {
   outStream.close();
  }
 };
}
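
For context, here is a hypothetical caller-side sketch of driving the RecordWriter returned above; the HiveOutputFormat parameter and the imports are assumptions standing in for whichever output format class declares that method. Each written Text row has its bytes copied into the reusable HiveKey and appended as a key-only record.

import java.io.IOException;
import java.util.Properties;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.exec.FileSinkOperator.RecordWriter;
import org.apache.hadoop.hive.ql.io.HiveOutputFormat;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Progressable;

public class KeyOnlyWriterSketch {
  // `format` is assumed to be the output format implementing the method shown above.
  static void writeTwoRows(HiveOutputFormat<?, ?> format, JobConf jc, Path out,
      Properties tableProperties, Progressable progress) throws IOException {
    RecordWriter writer = format.getHiveRecordWriter(jc, out, Text.class,
        /* isCompressed */ false, tableProperties, progress);
    try {
      writer.write(new Text("row-1"));   // bytes are copied into the reusable HiveKey
      writer.write(new Text("row-2"));
    } finally {
      writer.close(false);               // closes the underlying SequenceFile.Writer
    }
  }
}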

Example source: com.facebook.presto.hive/hive-apache

final SequenceFile.Writer outStream = Utilities.createSequenceWriter(jc, fs, finalOutPath,
  HiveKey.class, NullWritable.class, isCompressed, progress);
