This article covers a simple way in Objective-C to take a photo without a camera interface - just getting a picture from the camera and saving it to a file. The approach below should be a useful reference if you need to solve the same problem.

Problem description

I can't find a simple way of taking a photo without a camera interface. I just need to get a picture from the camera and save it to a file.

Recommended answer

I used this code to take a photo with the front camera. Not all of the code is mine, but I couldn't find a link to the original source. This code also produces a shutter sound. Image quality is not great (it's quite dark), so the code needs a tweak or two.

// Requires #import <AVFoundation/AVFoundation.h> (and UIKit for UIImage).
-(void) takePhoto
{
    AVCaptureDevice *frontalCamera;

    // Find the front camera among all available video capture devices.
    NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for ( int i = 0; i < allCameras.count; i++ )
    {
        AVCaptureDevice *camera = [allCameras objectAtIndex:i];

        if ( camera.position == AVCaptureDevicePositionFront )
        {
            frontalCamera = camera;
        }
    }

    if ( frontalCamera != nil )
    {
        photoSession = [[AVCaptureSession alloc] init];

        NSError *error;
        AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];

        if ( !error && [photoSession canAddInput:input] )
        {
            [photoSession addInput:input];

            AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

            [output setOutputSettings:
             [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil]];

            if ( [photoSession canAddOutput:output] )
            {
                [photoSession addOutput:output];

                // Pick the video connection that feeds the still image output.
                AVCaptureConnection *videoConnection = nil;

                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection) { break; }
                }

                if ( videoConnection )
                {
                    [photoSession startRunning];

                    [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

                        if (imageDataSampleBuffer != NULL)
                        {
                            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                            UIImage *photo = [[UIImage alloc] initWithData:imageData];
                            [self processImage:photo]; //this is a custom method
                        }
                    }];
                }
            }
        }
    }
}


photoSession is an AVCaptureSession * ivar of the class holding the takePhoto method.
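Since the question only asks to get the picture and save it to a file, the custom processImage: method called above could be as simple as the following sketch. The file name photo.jpg and the Documents directory are just example choices, not part of the original answer:

-(void) processImage:(UIImage *)photo
{
    // Re-encode the UIImage as JPEG data (0.9 = compression quality).
    NSData *jpegData = UIImageJPEGRepresentation(photo, 0.9);

    // Build a destination path in the app's Documents directory (example location).
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"photo.jpg"];

    NSError *writeError = nil;
    if ( ![jpegData writeToFile:filePath options:NSDataWritingAtomic error:&writeError] )
    {
        NSLog(@"Could not save photo: %@", writeError);
    }

    // The still image has been captured, so the session can be stopped.
    [photoSession stopRunning];
}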

EDIT (tweak): If you change the if ( videoConnection ) block to the code below, you will add a 1-second delay and get a good image.

if ( videoConnection )
{
    [photoSession startRunning];

    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){

        [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *photo = [[UIImage alloc] initWithData:imageData];
                [self processImage:photo];
            }
        }];
    });
}

If lag is not acceptable for your application, you could split the code in two parts and start the photoSession at viewDidAppear (or somewhere similar) and simply take an immediate snapshot whenever needed - usually after some user interaction (a sketch of that split follows below).

dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.25 * NSEC_PER_SEC);

also produces a good result - so there is no need for a whole second lag.
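A rough sketch of that split might look like the following. The names setupPhotoSession, photoOutput and snapPhoto are assumptions on my part; setupPhotoSession would contain the same device/input/output configuration shown in takePhoto above, keeping the still image output in a photoOutput ivar:

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // Configure photoSession / photoOutput once (same setup as in takePhoto above)
    // and let the session warm up while the user interacts with the UI.
    [self setupPhotoSession];
    [photoSession startRunning];
}

- (void)snapPhoto
{
    // Grab the video connection of the still image output kept in the photoOutput ivar.
    AVCaptureConnection *videoConnection = [photoOutput connectionWithMediaType:AVMediaTypeVideo];
    if ( !videoConnection ) { return; }

    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                             completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *photo = [[UIImage alloc] initWithData:imageData];
            [self processImage:photo];
        }
    }];
}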

Note that this code is written to take a photo with the front camera - I'm sure you will know how to adapt it if you need to use the back camera.

That concludes this article on a simple way to take a photo in Objective-C without a camera interface - just getting a picture from the camera and saving it to a file. Hopefully the answer above is helpful.
