
I'm implementing software to capture video from a webcam. I've looked at the MyRecorder sample from Apple's developer site, and it works fine.

I've tried to add a button that takes a snapshot from the video with this code:

- (IBAction)addFrame:(id)sender
{
    CVImageBufferRef imageBuffer;
    @synchronized (self) {
        imageBuffer = CVBufferRetain(mCurrentImageBuffer);
    }
    if (imageBuffer) {
        [ bla bla bla ]
    }
}

but mCurrentImageBuffer is always empty. How can I take the current frame from my webcam and put it into mCurrentImageBuffer?

I've tried to use

- (void)captureOutput:(QTCaptureOutput *)captureOutput
                    didOutputVideoFrame:(CVImageBufferRef)videoFrame
                    withSampleBuffer:(QTSampleBuffer *)sampleBuffer
                    fromConnection:(QTCaptureConnection *)connection
{
    CVImageBufferRef imageBufferToRelease;

    CVBufferRetain(videoFrame);

    @synchronized (self) {
        imageBufferToRelease = mCurrentImageBuffer;
        mCurrentImageBuffer = videoFrame;
    }
    CVBufferRelease(imageBufferToRelease);
}

but it's never called. How can I get the captureOutput delegate method to be called? Any ideas?

Thanks,
Andrea

Solution

It looks like you're trying to use the QTKit Capture API for capturing video from your webcam. The MyRecorder sample application is pretty much the simplest functioning video capture program you can make using this API. It wasn't clear from your description, but you need to make sure that you follow their example, and initialize your video session in the same manner as they do in the -awakeFromNib method within MyRecorderController. If you don't, you won't get any video being captured.
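As a rough guide, the session setup in MyRecorder's -awakeFromNib looks something like the following sketch (the ivar names mCaptureSession, mCaptureDeviceInput, and mCaptureView follow the sample's naming; error handling is abbreviated):

    - (void)awakeFromNib
    {
        NSError *error = nil;
        mCaptureSession = [[QTCaptureSession alloc] init];

        // Find and open the default video input device (the webcam).
        QTCaptureDevice *device =
            [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
        if (![device open:&error]) {
            // handle error
            return;
        }

        // Attach the device to the session as an input.
        mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
        if (![mCaptureSession addInput:mCaptureDeviceInput error:&error]) {
            // handle error
            return;
        }

        // Show the preview and start capturing.
        [mCaptureView setCaptureSession:mCaptureSession];
        [mCaptureSession startRunning];
    }

Without a running session wired up this way, no frames flow to any output, so mCurrentImageBuffer never gets set.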

As far as the method you're trying to use, -captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: is a delegate method for QTCaptureDecompressedVideoOutput. An instance of this class is not present in the MyRecorder sample, because that sample only records compressed video to disk. To use this, you'll need to create an instance of QTCaptureDecompressedVideoOutput, attach it to your QTCaptureSession using -addOutput:error:, and set the delegate for the QTCaptureDecompressedVideoOutput instance to be your class.
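A minimal sketch of those three steps, assuming the session ivar is named mCaptureSession as in the sample (add this alongside the session setup, before startRunning):

    NSError *error = nil;

    // Create a decompressed-video output; its delegate receives
    // -captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection:
    // for every frame.
    QTCaptureDecompressedVideoOutput *decompressedOutput =
        [[QTCaptureDecompressedVideoOutput alloc] init];
    [decompressedOutput setDelegate:self];

    // Attach the output to the capture session.
    if (![mCaptureSession addOutput:decompressedOutput error:&error]) {
        NSLog(@"Could not add decompressed video output: %@", error);
    }

Once the output is attached and the session is running, your delegate method will be called on each frame, and your @synchronized swap into mCurrentImageBuffer should start working.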

For more information on how QTKit handles this sort of thing, you can consult the QTKit Capture section of the QTKit Application Programming Guide.
