Usage of org.apache.hadoop.io.UTF8.getLength() with code examples


This article collects a number of Java code examples for the org.apache.hadoop.io.UTF8.getLength() method and shows how UTF8.getLength() is used in practice. The examples come mainly from selected projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of UTF8.getLength() follow:

Package path: org.apache.hadoop.io.UTF8
Class name: UTF8
Method name: getLength

UTF8.getLength overview

The number of bytes in the encoded string.
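
Before the examples, a minimal sketch of my own (not taken from the projects below) illustrates the point: getLength() reports the byte length of the encoded string, which can exceed the character count. It assumes the UTF8(String) constructor from hadoop-common; note that UTF8 itself is deprecated in favor of org.apache.hadoop.io.Text.

import org.apache.hadoop.io.UTF8;

public class Utf8LengthDemo {
 @SuppressWarnings("deprecation") // UTF8 is deprecated in favor of org.apache.hadoop.io.Text
 public static void main(String[] args) {
  UTF8 utf8 = new UTF8("héllo");                 // 'é' needs two bytes in UTF-8
  System.out.println(utf8.toString().length());  // 5 characters
  System.out.println(utf8.getLength());          // 6 bytes in the encoded string
 }
}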

Code examples

Code example source: org.apache.hadoop/hadoop-common

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}
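
A quick usage sketch of the snippet above (my own example, not taken from hadoop-common): digest(UTF8) hashes exactly getLength() bytes of the encoded string, starting at offset 0.

import org.apache.hadoop.io.MD5Hash;
import org.apache.hadoop.io.UTF8;

public class Md5OfUtf8Demo {
 @SuppressWarnings("deprecation") // UTF8 is deprecated; used here only to mirror the snippet above
 public static void main(String[] args) {
  UTF8 key = new UTF8("/user/hadoop/file.txt");
  MD5Hash hash = MD5Hash.digest(key);  // digests key.getLength() bytes from key.getBytes()
  System.out.println(hash);            // hex digest via MD5Hash.toString()
 }
}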

Code example source: elastic/elasticsearch-hadoop

// pass the backing buffer with its valid byte length; getBytes() may return a larger, reused array
generator.writeUTF8String(utf8.getBytes(), 0, utf8.getLength());

Code example source: com.facebook.hadoop/hadoop-core

@SuppressWarnings("deprecation")
public static byte[] readBytes(DataInputStream in) throws IOException {
 UTF8 ustr = TL_DATA.get().U_STR;   // reusable, thread-local UTF8 instance
 ustr.readFields(in);
 int len = ustr.getLength();        // number of valid bytes; the backing buffer may be larger
 byte[] bytes = new byte[len];
 System.arraycopy(ustr.getBytes(), 0, bytes, 0, len);
 return bytes;
}
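
The pattern is easier to try out in a self-contained sketch. This is my own round-trip example, not the com.facebook.hadoop class: the thread-local TL_DATA holder is replaced by a locally created UTF8, and it assumes the no-argument UTF8() constructor plus the Writable write()/readFields() methods.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.io.UTF8;

public class ReadBytesDemo {
 @SuppressWarnings("deprecation")
 public static void main(String[] args) throws IOException {
  // Serialize a UTF8 value the way the writer side would
  ByteArrayOutputStream bos = new ByteArrayOutputStream();
  new UTF8("/tmp/data").write(new DataOutputStream(bos));

  // Deserialize into a reusable instance, then copy out exactly getLength() bytes;
  // getBytes() exposes the backing buffer, which may hold more than the valid data
  UTF8 ustr = new UTF8();
  ustr.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
  byte[] bytes = new byte[ustr.getLength()];
  System.arraycopy(ustr.getBytes(), 0, bytes, 0, ustr.getLength());
  System.out.println(new String(bytes, StandardCharsets.UTF_8)); // prints /tmp/data
 }
}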

Code example source: org.jvnet.hudson.hadoop/hadoop-core

static byte[] readBytes(DataInputStream in) throws IOException {
 U_STR.readFields(in);          // U_STR is a reusable UTF8 instance
 int len = U_STR.getLength();   // valid bytes; the backing buffer may be larger
 byte[] bytes = new byte[len];
 System.arraycopy(U_STR.getBytes(), 0, bytes, 0, len);
 return bytes;
}

Code example source: com.github.jiayuhan-it/hadoop-common

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: org.jvnet.hudson.hadoop/hadoop-core

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: io.hops/hadoop-common

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: com.facebook.hadoop/hadoop-core

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: ch.cern.hadoop/hadoop-common

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: io.prestosql.hadoop/hadoop-apache

/** Construct a hash value for a String. */
public static MD5Hash digest(UTF8 utf8) {
 return digest(utf8.getBytes(), 0, utf8.getLength());
}

Code example source: com.facebook.hadoop/hadoop-core

/**
 * Read the path from the image and convert it to byte[][] directly;
 * this saves an array copy and the conversions to and from String.
 * @param in the input stream to read from
 * @return an array whose elements are byte[] representations
 *            of the path components
 * @throws IOException
 */
@SuppressWarnings("deprecation")
public static byte[][] readPathComponents(DataInputStream in)
  throws IOException {
 UTF8 ustr = TL_DATA.get().U_STR;
 
 ustr.readFields(in);
 return DFSUtil.bytes2byteArray(ustr.getBytes(),
  ustr.getLength(), (byte) Path.SEPARATOR_CHAR);
}
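
A hedged, self-contained variant of the same idea (my own sketch, not the com.facebook.hadoop class): it assumes DFSUtil.bytes2byteArray(byte[], int, byte) is publicly accessible, as the call above suggests, and that the HDFS classes are on the classpath.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DFSUtil;
import org.apache.hadoop.io.UTF8;

public class PathComponentsDemo {
 @SuppressWarnings("deprecation")
 public static void main(String[] args) throws IOException {
  // Serialize a path string as the image writer would
  ByteArrayOutputStream bos = new ByteArrayOutputStream();
  new UTF8("/user/hadoop/input").write(new DataOutputStream(bos));

  // Read it back and split the raw bytes on '/' without converting to String
  UTF8 ustr = new UTF8();
  ustr.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
  byte[][] components = DFSUtil.bytes2byteArray(
    ustr.getBytes(), ustr.getLength(), (byte) Path.SEPARATOR_CHAR);
  for (byte[] component : components) {
   System.out.println(new String(component, StandardCharsets.UTF_8));
  }
 }
}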

Code example source: org.elasticsearch/elasticsearch-spark

generator.writeUTF8String(utf8.getBytes(), 0, utf8.getLength());

Code example source: org.elasticsearch/elasticsearch-hadoop

generator.writeUTF8String(utf8.getBytes(), 0, utf8.getLength());

Code example source: org.elasticsearch/elasticsearch-spark-13

generator.writeUTF8String(utf8.getBytes(), 0, utf8.getLength());

Code example source: org.elasticsearch/elasticsearch-hadoop-mr

generator.writeUTF8String(utf8.getBytes(), 0, utf8.getLength());
