I'm trying to apply a simple Core Image filter to a live camera feed. I think my code is OK, but calling drawImage:inRect:fromRect: in the captureOutput method crashes with EXC_BAD_ACCESS, or with [__NSCFNumber drawImage:inRect:fromRect:]: unrecognized selector, which makes me think my context has already been released by the time I try to call drawImage on it. That doesn't make sense to me, since my CIContext is an instance variable of the class.

The problem doesn't seem to come from OpenGL, because I tried a plain context (not created from an EAGLContext) and ran into the same issue.

I'm testing on an iPhone 5 running iOS 6, since the camera doesn't work properly in the simulator.

Can you help me? Thanks a lot for your time.

Here is my .h file:

<!-- language: objective-c -->

    //  CameraController.h

    #import <UIKit/UIKit.h>
    #import <OpenGLES/EAGL.h>
    #import <AVFoundation/AVFoundation.h>
    #import <GLKit/GLKit.h>
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #import <QuartzCore/QuartzCore.h>
    #import <CoreImage/CoreImage.h>
    #import <ImageIO/ImageIO.h>

    @interface CameraController : GLKViewController <AVCaptureVideoDataOutputSampleBufferDelegate>{

        AVCaptureSession *avCaptureSession;
        CIContext *coreImageContext;
        CIContext *ciTestContext;
        GLuint _renderBuffer;
        EAGLContext *glContext;
    }

    @end


And my .m file:

<!-- language: objective-c -->

    //  CameraController.m

    #import "CameraController.h"

    @interface CameraController ()

    @end

    @implementation CameraController

    - (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
    {
        self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
        if (self) {

        }
        return self;
    }

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        // Initialize Open GL ES2 Context
        glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        if (!glContext) {
            NSLog(@"Failed to create ES context");
        }
        [EAGLContext setCurrentContext:nil];

        // Gets the GL View and sets the depth format to 24 bits, and the context of the view to be the Open GL context created above
        GLKView *view = (GLKView *)self.view;
        view.context = glContext;
        view.drawableDepthFormat = GLKViewDrawableDepthFormat24;

        // Creates CI Context from  EAGLContext
        NSMutableDictionary *options = [[NSMutableDictionary alloc] init];
        [options setObject: [NSNull null] forKey: kCIContextWorkingColorSpace];
        coreImageContext = [CIContext contextWithEAGLContext:glContext options:options];

        glGenRenderbuffers(1, &_renderBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);

        // Initialize Video Capture Device
        NSError *error;
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        // Initialize Video Output object and set output settings
        AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];

        [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
        [dataOutput setVideoSettings:[NSDictionary  dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey]];


        // Delegates the SampleBuffer to the current object which implements the AVCaptureVideoDataOutputSampleBufferDelegate interface via the captureOutput method
        [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        // Initialize the capture session, add the input and output, and start running
        avCaptureSession = [[AVCaptureSession alloc] init];
        [avCaptureSession beginConfiguration];
        [avCaptureSession setSessionPreset:AVCaptureSessionPreset1280x720];
        [avCaptureSession addInput:input];
        [avCaptureSession addOutput:dataOutput];
        [avCaptureSession commitConfiguration];
        [avCaptureSession startRunning];


    }

    -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        // Creates a CIImage from the sample buffer of the camera frame
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        // Creates the relevant filter
        CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
        [filter setValue:inputImage forKey:kCIInputImageKey];
        [filter setValue:[NSNumber numberWithFloat:0.8f] forKey:kCIInputIntensityKey];

        // Creates a reference to the output of the filter
        CIImage *result = [filter valueForKey:kCIOutputImageKey];

        // Draw to the context
        [coreImageContext drawImage:result inRect:[result extent] fromRect:[result extent]];

        [glContext presentRenderbuffer:GL_RENDERBUFFER];
    }

    - (void)didReceiveMemoryWarning
    {
        [super didReceiveMemoryWarning];
        // Dispose of any resources that can be recreated.
    }


    @end

Best answer

In your viewDidLoad method, you have:

    coreImageContext = [CIContext contextWithEAGLContext:glContext options:options];


If you want to use coreImageContext in the captureOutput method, you need to retain it.
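A minimal sketch of that fix, assuming the project uses manual reference counting (plausible for an iOS 6-era project; under ARC the ivar assignment would already retain the context): contextWithEAGLContext:options: returns an autoreleased CIContext, so retain it when storing it in the ivar and release it in dealloc. The dealloc below is hypothetical, added for illustration.

<!-- language: objective-c -->

    //  CameraController.m (hypothetical MRC sketch)

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        // ... OpenGL and capture-session setup as in the question ...

        NSMutableDictionary *options = [[NSMutableDictionary alloc] init];
        [options setObject:[NSNull null] forKey:kCIContextWorkingColorSpace];

        // contextWithEAGLContext:options: returns an autoreleased CIContext;
        // retain it so it is still alive when captureOutput: fires later.
        coreImageContext = [[CIContext contextWithEAGLContext:glContext options:options] retain];
        [options release];
    }

    - (void)dealloc
    {
        // Balance the retain taken in viewDidLoad.
        [coreImageContext release];
        [super dealloc];
    }

Alternatively, converting the project to ARC (or declaring coreImageContext as a strong property) gives the same lifetime guarantee without the explicit retain/release.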

For ios - CIContext drawImage causes EXC_BAD_ACCESS - iOS 6, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/16843093/
