This article explains how to convert a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange buffer to YUV420 using the libyuv library on iOS.

Problem Description

I captured video using AVFoundation. I set the video output settings so that the sample buffers I receive in my output callback are in kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format, but I need YUV420 (I420) for further processing.
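For reference, here is a minimal sketch of how such a capture output is typically configured to deliver this format (the queue label and the surrounding session setup are assumptions for illustration, not taken from the question):

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Ask for NV12 full-range frames (the format named in the question).
    videoOutput.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey :
            @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    };
    // Frames are delivered to the delegate callback on this serial queue.
    dispatch_queue_t queue = dispatch_queue_create("video.capture", DISPATCH_QUEUE_SERIAL);
    [videoOutput setSampleBufferDelegate:self queue:queue];
    [captureSession addOutput:videoOutput]; // 'captureSession' is assumed to be configured already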

For that I use the libyuv framework.

LIBYUV_API
int NV12ToI420(const uint8* src_y, int src_stride_y,
               const uint8* src_uv, int src_stride_uv,
               uint8* dst_y, int dst_stride_y,
               uint8* dst_u, int dst_stride_u,
               uint8* dst_v, int dst_stride_v,
               int width, int height);

libyuv::NV12ToI420(src_yplane, inWidth,
                   src_uvplane, inWidth,
                   dst_yplane, inWidth,
                   dst_vplane, inWidth / 2,
                   dst_uplane, inWidth / 2,
                   inWidth, inHeight);
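
Note that in the signature above dst_u comes before dst_v, while this call passes dst_vplane first, so the chroma planes end up swapped. A corrected form of the same call would be (still assuming, as the question does, that every row is exactly inWidth bytes with no padding):

    libyuv::NV12ToI420(src_yplane, inWidth,
                       src_uvplane, inWidth,
                       dst_yplane, inWidth,
                       dst_uplane, inWidth / 2,
                       dst_vplane, inWidth / 2,
                       inWidth, inHeight);

A U/V swap alone would typically show up as shifted colors rather than a solid green frame, which more often points to wrong strides or base addresses; the answer below reads the real per-plane strides from the pixel buffer, which matters because CoreVideo may pad each row.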

But the output buffer I get is solid green. Did I make a mistake somewhere in this process? Please help.

Recommended Answer

Here is how I do it on iOS in my captureOutput callback, after I get a raw video frame (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) from AVCaptureSession:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef videoFrame = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Keep the sample buffer alive while we read from its pixel buffer.
    CFRetain(sampleBuffer);

    CVPixelBufferLockBaseAddress(videoFrame, 0);
    size_t _width = CVPixelBufferGetWidth(videoFrame);
    size_t _height = CVPixelBufferGetHeight(videoFrame);

    // NV12 layout: plane 0 is Y, plane 1 is interleaved UV.
    const uint8* plane1 = (uint8*)CVPixelBufferGetBaseAddressOfPlane(videoFrame, 0);
    const uint8* plane2 = (uint8*)CVPixelBufferGetBaseAddressOfPlane(videoFrame, 1);
    size_t plane1_stride = CVPixelBufferGetBytesPerRowOfPlane(videoFrame, 0);
    size_t plane2_stride = CVPixelBufferGetBytesPerRowOfPlane(videoFrame, 1);

    size_t plane1_size = plane1_stride * CVPixelBufferGetHeightOfPlane(videoFrame, 0);
    size_t plane2_size = plane2_stride * CVPixelBufferGetHeightOfPlane(videoFrame, 1);

    size_t frame_size = plane1_size + plane2_size;

    // Destination I420 layout: Y plane, then U plane, then V plane.
    uint8* buffer = new uint8[frame_size];
    uint8* dst_u = buffer + plane1_size;
    uint8* dst_v = dst_u + plane1_size / 4;

    // Let libyuv convert, using the real per-plane strides from the pixel buffer.
    libyuv::NV12ToI420(/*const uint8* src_y=*/plane1, /*int src_stride_y=*/(int)plane1_stride,
                       /*const uint8* src_uv=*/plane2, /*int src_stride_uv=*/(int)plane2_stride,
                       /*uint8* dst_y=*/buffer, /*int dst_stride_y=*/(int)plane1_stride,
                       /*uint8* dst_u=*/dst_u, /*int dst_stride_u=*/(int)(plane2_stride / 2),
                       /*uint8* dst_v=*/dst_v, /*int dst_stride_v=*/(int)(plane2_stride / 2),
                       (int)_width, (int)_height);

    CVPixelBufferUnlockBaseAddress(videoFrame, 0);
    CFRelease(sampleBuffer);

    // TODO: call your method here with the 'buffer' variable. Note that you
    // need to deallocate the buffer after using it.
}
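
As a sketch of that TODO, one way to finish the method (processI420Frame is a hypothetical function standing in for your own processing; the important part is releasing the buffer with delete[], since it was allocated with new[]):

    // Hypothetical consumer; replace with your actual processing of the I420 data.
    processI420Frame(buffer, (int)_width, (int)_height);
    delete[] buffer; // must match the new[] allocation above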

I made the code a bit more descriptive for clarity.
