Usage and code examples for the org.apache.hadoop.io.file.tfile.Utils.readVLong() method


This article collects a number of Java code examples for the org.apache.hadoop.io.file.tfile.Utils.readVLong() method and shows how it is used in practice. The examples are drawn from selected projects on GitHub, Stack Overflow, Maven, and similar sources, and should serve as a useful reference. Details of Utils.readVLong() are as follows:
Package path: org.apache.hadoop.io.file.tfile.Utils
Class name: Utils
Method name: readVLong

About Utils.readVLong

Decodes a variable-length integer. Suppose the value of the first byte is FB, and the following bytes are NB[*]:

  • if (FB >= -32), return (long)FB;
  • if (FB in [-72, -33]), return (FB+52)<<8 + NB[0]&0xff;
  • if (FB in [-104, -73]), return (FB+88)<<16 + (NB[0]&0xff)<<8 + NB[1]&0xff;
  • if (FB in [-120, -105]), return (FB+112)<<24 + (NB[0]&0xff)<<16 + (NB[1]&0xff)<<8 + NB[2]&0xff;
  • if (FB in [-128, -121]), read the next FB+129 bytes and interpret them as a signed big-endian integer.
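
A minimal standalone sketch of these rules is shown below, for illustration only: it is not the Hadoop implementation, and the class and method names are hypothetical.

import java.io.DataInput;
import java.io.IOException;

class VLongDecodeSketch {
  // Illustrative restatement of the documented decoding rules; not the Hadoop source.
  static long decode(DataInput in) throws IOException {
    int fb = in.readByte();                       // FB
    if (fb >= -32) {
      return fb;                                  // value stored in the first byte itself
    } else if (fb >= -72) {                       // FB in [-72, -33]: one extra byte
      return ((long) (fb + 52) << 8) | (in.readByte() & 0xffL);
    } else if (fb >= -104) {                      // FB in [-104, -73]: two extra bytes
      return ((long) (fb + 88) << 16)
          | ((in.readByte() & 0xffL) << 8)
          | (in.readByte() & 0xffL);
    } else if (fb >= -120) {                      // FB in [-120, -105]: three extra bytes
      return ((long) (fb + 112) << 24)
          | ((in.readByte() & 0xffL) << 16)
          | ((in.readByte() & 0xffL) << 8)
          | (in.readByte() & 0xffL);
    } else {                                      // FB in [-128, -121]: FB+129 bytes, signed big-endian
      int len = fb + 129;
      long value = 0;
      for (int i = 0; i < len; i++) {
        value = (value << 8) | (in.readByte() & 0xffL);
      }
      int pad = 64 - 8 * len;                     // sign-extend the len-byte value
      return (value << pad) >> pad;
    }
  }
}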

Code examples

Code example source: origin: org.apache.hadoop/hadoop-common

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}
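
Since all three fields are plain VLong values, a round trip is easy to demonstrate. The sketch below assumes Utils.writeVLong in the same Utils class is the symmetric encoder; the values and class name are chosen only for illustration.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.file.tfile.Utils;

class VLongRoundTrip {
  public static void main(String[] args) throws IOException {
    // Encode three longs with the variable-length encoding.
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(bytes);
    Utils.writeVLong(out, 1024L);   // e.g. offset
    Utils.writeVLong(out, 512L);    // e.g. compressedSize
    Utils.writeVLong(out, 2048L);   // e.g. rawSize
    out.flush();

    // Decode them back in the same order.
    DataInputStream in =
        new DataInputStream(new ByteArrayInputStream(bytes.toByteArray()));
    System.out.println(Utils.readVLong(in));   // 1024
    System.out.println(Utils.readVLong(in));   // 512
    System.out.println(Utils.readVLong(in));   // 2048
  }
}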

Code example source: origin: org.apache.hadoop/hadoop-common

/**
 * Decoding the variable-length integer. Synonymous to
 * <code>(int)Utils#readVLong(in)</code>.
 * 
 * @param in
 *          input stream
 * @return the decoded integer
 * @throws IOException
 * 
 * @see Utils#readVLong(DataInput)
 */
public static int readVInt(DataInput in) throws IOException {
 long ret = readVLong(in);
 if ((ret > Integer.MAX_VALUE) || (ret < Integer.MIN_VALUE)) {
  throw new RuntimeException(
    "Number too large to be represented as Integer");
 }
 return (int) ret;
}
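
The range check above means readVInt fails loudly rather than silently truncating. A small sketch of that behavior, again assuming Utils.writeVLong as the encoder (class name hypothetical):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.file.tfile.Utils;

class ReadVIntRangeCheck {
  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(bytes);
    Utils.writeVLong(out, Integer.MAX_VALUE + 1L);   // too large for an int

    DataInputStream in =
        new DataInputStream(new ByteArrayInputStream(bytes.toByteArray()));
    try {
      Utils.readVInt(in);                            // decodes the VLong, then range-checks it
    } catch (RuntimeException e) {
      System.out.println(e.getMessage());            // "Number too large to be represented as Integer"
    }
  }
}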

Code example source: origin: org.apache.hadoop/hadoop-common

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}
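
The constructor above reads a VInt length prefix, the raw key bytes, and a VLong entry count. A sketch of the matching write side might look like this (illustration only; the helper name is hypothetical, using Utils.writeVInt and Utils.writeVLong from the same class):

import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.file.tfile.Utils;

class IndexEntryWriteSketch {
  // Sketch of the write side matching the constructor above; not copied from Hadoop.
  static void writeIndexEntry(DataOutput out, byte[] key, long kvEntries)
      throws IOException {
    Utils.writeVInt(out, key.length);   // length prefix
    out.write(key, 0, key.length);      // raw key bytes
    Utils.writeVLong(out, kvEntries);   // number of key-value entries
  }
}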

Code example source: origin: org.apache.hadoop/hadoop-common

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(TFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}
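
TFileMeta combines a version header with a VLong record count and a comparator name read via Utils.readString. A sketch of how the two Utils-encoded fields could be written back out, assuming Utils.writeString is the counterpart of Utils.readString (illustration only; the version header is omitted and the class name is hypothetical):

import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.file.tfile.Utils;

class TFileMetaWriteSketch {
  // Illustration only: the real TFileMeta also writes its version header first.
  static void writeMeta(DataOutput out, long recordCount, String strComparator)
      throws IOException {
    Utils.writeVLong(out, recordCount);      // record count as a VLong
    Utils.writeString(out, strComparator);   // comparator name, length-prefixed
  }
}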

Code example source: origin: io.hops/hadoop-common

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}

Code example source: origin: io.prestosql.hadoop/hadoop-apache

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}

Code example source: origin: com.github.jiayuhan-it/hadoop-common

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}

Code example source: origin: ch.cern.hadoop/hadoop-common

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}

Code example source: origin: org.apache.apex/malhar-library

public BlockRegion(DataInput in) throws IOException {
 offset = Utils.readVLong(in);
 compressedSize = Utils.readVLong(in);
 rawSize = Utils.readVLong(in);
}

Code example source: origin: org.apache.apex/malhar-library

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: ch.cern.hadoop/hadoop-common

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: io.hops/hadoop-common

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: com.facebook.hadoop/hadoop-core

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: io.prestosql.hadoop/hadoop-apache

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: com.github.jiayuhan-it/hadoop-common

public TFileIndexEntry(DataInput in) throws IOException {
 int len = Utils.readVInt(in);
 key = new byte[len];
 in.readFully(key, 0, len);
 kvEntries = Utils.readVLong(in);
}

Code example source: origin: io.hops/hadoop-common

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(TFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}

Code example source: origin: org.apache.apex/malhar-library

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(DTFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}

Code example source: origin: com.github.jiayuhan-it/hadoop-common

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(TFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}

Code example source: origin: com.facebook.hadoop/hadoop-core

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(TFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}

Code example source: origin: ch.cern.hadoop/hadoop-common

public TFileMeta(DataInput in) throws IOException {
 version = new Version(in);
 if (!version.compatibleWith(TFile.API_VERSION)) {
  throw new RuntimeException("Incompatible TFile fileVersion.");
 }
 recordCount = Utils.readVLong(in);
 strComparator = Utils.readString(in);
 comparator = makeComparator(strComparator);
}
