Usage of the android.media.Image.getFormat() method, with code examples


This article collects a number of code examples for the Java method android.media.Image.getFormat() and shows how Image.getFormat() is used in practice. The examples are drawn from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of Image.getFormat() are as follows:
Package: android.media.Image
Class: Image
Method: getFormat

Image.getFormat overview

Get the format for this image. This format determines the number of ByteBuffers needed to represent the image, and the general layout of the pixel data in each ByteBuffer.

The format is one of the values from android.graphics.ImageFormat. The mapping between the formats and the planes is as follows:

android.graphics.ImageFormat#JPEG (1 plane): Compressed data, so row and pixel strides are 0. To uncompress, use android.graphics.BitmapFactory#decodeByteArray.

android.graphics.ImageFormat#YUV_420_888 (3 planes): A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have half the width and height of the luminance plane (4:2:0 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride.

android.graphics.ImageFormat#YUV_422_888 (3 planes): A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have half the width and the full height of the luminance plane (4:2:2 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride.

android.graphics.ImageFormat#YUV_444_888 (3 planes): A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have the same width and height as the luminance plane (4:4:4 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride.

android.graphics.ImageFormat#FLEX_RGB_888 (3 planes): An R (red) plane followed by the G (green) and B (blue) planes. All planes have the same widths and heights. Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride.

android.graphics.ImageFormat#FLEX_RGBA_8888 (4 planes): An R (red) plane followed by the G (green), B (blue), and A (alpha) planes. All planes have the same widths and heights. Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride.

android.graphics.ImageFormat#RAW_SENSOR (1 plane): A single plane of raw sensor image data, with 16 bits per color sample. The details of the layout need to be queried from the source of the raw sensor data, such as android.hardware.camera2.CameraDevice.

android.graphics.ImageFormat#RAW_PRIVATE (1 plane): A single plane of raw sensor image data with a private layout. The details of the layout are implementation-specific. Row stride and pixel stride are undefined for this format. Calling Plane#getRowStride() or Plane#getPixelStride() on a RAW_PRIVATE image will cause an UnsupportedOperationException to be thrown.
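The format-to-plane-count mapping above can be sketched in plain Java. The numeric values below are hard-coded copies of the documented android.graphics.ImageFormat constants, included only so the sketch compiles without the Android SDK; on a device you would compare image.getFormat() against ImageFormat.* directly.

```java
// Plane counts per format, mirroring the table above.
// The constant values are assumed copies of android.graphics.ImageFormat
// constants so this sketch runs without the Android SDK.
public class FormatPlanes {
    static final int JPEG = 0x100;           // ImageFormat.JPEG
    static final int YUV_420_888 = 0x23;     // ImageFormat.YUV_420_888
    static final int YUV_422_888 = 0x27;     // ImageFormat.YUV_422_888
    static final int YUV_444_888 = 0x28;     // ImageFormat.YUV_444_888
    static final int FLEX_RGB_888 = 0x29;    // ImageFormat.FLEX_RGB_888
    static final int FLEX_RGBA_8888 = 0x2A;  // ImageFormat.FLEX_RGBA_8888
    static final int RAW_SENSOR = 0x20;      // ImageFormat.RAW_SENSOR
    static final int RAW_PRIVATE = 0x24;     // ImageFormat.RAW_PRIVATE

    /** Number of planes Image#getPlanes() returns for the given format. */
    static int planeCount(int format) {
        switch (format) {
            case JPEG:
            case RAW_SENSOR:
            case RAW_PRIVATE:
                return 1; // a single compressed, raw, or opaque plane
            case YUV_420_888:
            case YUV_422_888:
            case YUV_444_888:
            case FLEX_RGB_888:
                return 3; // Y/Cb/Cr or R/G/B
            case FLEX_RGBA_8888:
                return 4; // R/G/B/A
            default:
                throw new IllegalArgumentException("unknown format " + format);
        }
    }
}
```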

Code examples

Code example source: zhantong/Android-VideoToImages

private static boolean isImageFormatSupported(Image image) {
  int format = image.getFormat();
  switch (format) {
    case ImageFormat.YUV_420_888:
    case ImageFormat.NV21:
    case ImageFormat.YV12:
      return true;
  }
  return false;
}

Code example source: zhantong/Android-VideoToImages

// Excerpt: unsupported formats are rejected before the conversion starts.
if (!isImageFormatSupported(image)) {
    throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
}
int format = image.getFormat();
int width = crop.width();
int height = crop.height();
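The excerpt above converts an Image to a byte array; the core of any such conversion is walking each plane with its row stride and pixel stride, because rows may be padded past the image width. Below is a minimal plain-Java sketch of that loop, using a bare java.nio.ByteBuffer in place of Plane#getBuffer(); packPlane is a hypothetical helper, not part of any library.

```java
import java.nio.ByteBuffer;

public class PlanePack {
    /**
     * Copies one (possibly row-padded) image plane into a tightly packed array.
     * rowStride and pixelStride play the roles of Plane#getRowStride() and
     * Plane#getPixelStride() on a real android.media.Image plane.
     */
    static byte[] packPlane(ByteBuffer buf, int width, int height,
                            int rowStride, int pixelStride) {
        byte[] out = new byte[width * height];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                // Absolute get: row padding beyond 'width' samples is skipped.
                out[row * width + col] = buf.get(row * rowStride + col * pixelStride);
            }
        }
        return out;
    }
}
```

For the Y plane of a YUV_420_888 image, pixelStride is typically 1 and rowStride may exceed the width; the loop above handles both cases.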

Code example source: googlesamples/android-Camera2Raw

@Override
public void run() {
    boolean success = false;
    int format = mImage.getFormat();
    switch (format) {
        case ImageFormat.JPEG: {
            // ... JPEG handling elided in this excerpt ...
            break;
        }
        // ... remaining cases elided in this excerpt ...
    }
}

Code example source: gqjjqg/android-extend

size += buffer.remaining();
CameraFrameData data = new CameraFrameData(image.getWidth(), image.getHeight(), image.getFormat(), size);
// ... elided in this excerpt: a parameter object is built from
//     bytes, image.getWidth(), image.getHeight(), image.getFormat(), image.getTimestamp()
data.setParams(param);

Code example source: org.boofcv/boofcv-android

public static void imageToBoof(Image yuv, ColorFormat colorOutput, ImageBase output, byte[] work) {
    if( BOverrideConvertAndroid.invokeYuv420ToBoof(yuv,colorOutput,output,work))
      return;

    if(ImageFormat.YUV_420_888 != yuv.getFormat() )
      throw new RuntimeException("Unexpected format");

    Image.Plane[] planes = yuv.getPlanes();

    ByteBuffer bufferY = planes[0].getBuffer();
    ByteBuffer bufferU = planes[2].getBuffer();
    ByteBuffer bufferV = planes[1].getBuffer();

    int width = yuv.getWidth();
    int height = yuv.getHeight();

    int strideY = planes[0].getRowStride();
    int strideUV = planes[1].getRowStride();
    int stridePixelUV = planes[1].getPixelStride();

    ConvertYuv420_888.yuvToBoof(
        bufferY,bufferU,bufferV,
        width,height,strideY,strideUV,stridePixelUV,
        colorOutput,output,work);
  }
}

Code example source: DuckDeck/AndroidDemo

@Override
  public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image == null)
      return;
    // sanity checks - 3 planes
    Image.Plane[] planes = image.getPlanes();
    assert (planes.length == 3);
    assert (image.getFormat() == mPreviewFormat);
    // see also https://developer.android.com/reference/android/graphics/ImageFormat.html#YUV_420_888
    // Y plane (0) non-interleaved => stride == 1; U/V plane interleaved => stride == 2
    assert (planes[0].getPixelStride() == 1);
    assert (planes[1].getPixelStride() == 2);
    assert (planes[2].getPixelStride() == 2);
    ByteBuffer y_plane = planes[0].getBuffer();
    ByteBuffer uv_plane = planes[1].getBuffer();
    Mat y_mat = new Mat(h, w, CvType.CV_8UC1, y_plane);
    Mat uv_mat = new Mat(h / 2, w / 2, CvType.CV_8UC2, uv_plane);
    JavaCamera2Frame tempFrame = new JavaCamera2Frame(y_mat, uv_mat, w, h);
    deliverAndDrawFrame(tempFrame);
    tempFrame.release();
    image.close();
  }
}, mBackgroundHandler);
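The asserts in the snippet above encode the semi-planar layout most devices use for YUV_420_888: the chroma planes report a pixel stride of 2 because the U and V samples are interleaved in one buffer, which is what lets the sample wrap them in a single CV_8UC2 Mat. Below is a plain-Java sketch of reading one channel out of such an interleaved plane; extractChannel is a hypothetical helper, not part of OpenCV or Android.

```java
import java.nio.ByteBuffer;

public class UvDeinterleave {
    /**
     * Extracts one chroma channel from an interleaved UV plane buffer
     * (pixel stride 2, as asserted for planes[1]/planes[2] above).
     * offset 0 selects the first interleaved channel, offset 1 the second.
     */
    static byte[] extractChannel(ByteBuffer uv, int samples, int offset) {
        byte[] out = new byte[samples];
        for (int i = 0; i < samples; i++) {
            out[i] = uv.get(offset + 2 * i); // step by the pixel stride of 2
        }
        return out;
    }
}
```

Whether planes[1] holds U-first or V-first interleaving varies by device, which is why real code should check the buffers rather than assume NV12 or NV21.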

Code example source: leadrien/opencv_native_androidstudio

(This snippet is identical to the DuckDeck/AndroidDemo example above.)
