AVPlayer plays the video composition result incorrectly

Problem description

我需要一件简单的事情:在旋转并在其上应用CIFilter时播放视频.

I need a simple thing: play a video while rotating and applying CIFilter on it.

首先,我创建播放器项目:

First, I create the player item:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:videoURL];

// DEBUG LOGGING
AVAssetTrack *track = [[playerItem.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSLog(@"Natural size is: %@", NSStringFromCGSize(track.naturalSize));
NSLog(@"Preferred track transform is: %@", NSStringFromCGAffineTransform(track.preferredTransform));
NSLog(@"Preferred asset transform is: %@", NSStringFromCGAffineTransform(playerItem.asset.preferredTransform));

Then I need to apply the video composition. Originally, I was thinking to create an AVVideoComposition with two instructions: one would be an AVVideoCompositionLayerInstruction for the rotation, and the other would apply the CIFilter. However, an exception was thrown saying "Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction", which means Apple does not allow combining those two instruction types. As a result, I combined both inside the filtering handler; here is the code:

AVAsset *asset = playerItem.asset;
CGAffineTransform rotation = [self transformForItem:playerItem];

AVVideoComposition *composition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
    // Step 1: get the input frame image (screenshot 1)
    CIImage *sourceImage = request.sourceImage;

    // Step 2: rotate the frame, then translate it so its extent's origin is back at (0, 0)
    CIFilter *transformFilter = [CIFilter filterWithName:@"CIAffineTransform"];
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:rotation] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;
    CGRect extent = sourceImage.extent;
    CGAffineTransform translation = CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y);
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:translation] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;

    // Step 3: apply the custom filter chosen by the user
    // ('filter' is the user-selected CIFilter, configured elsewhere; clamping
    // avoids edge artifacts, then we crop back to the pre-filter extent)
    extent = sourceImage.extent;
    sourceImage = [sourceImage imageByClampingToExtent];
    [filter setValue:sourceImage forKey:kCIInputImageKey];
    sourceImage = filter.outputImage;
    sourceImage = [sourceImage imageByCroppingToRect:extent];

    // Step 4: finish processing the frame (screenshot 2)
    [request finishWithImage:sourceImage context:nil];
}];

playerItem.videoComposition = composition;
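To see why Step 2 needs the second, translation pass, here is a small standalone sketch (Python standing in for `CGRectApplyAffineTransform`; the pure-rotation transform values are illustrative, not taken from AVFoundation) of what a 90-degree rotation does to the frame's extent:

```python
def apply_transform(rect, t):
    """Apply affine transform t = (a, b, c, d, tx, ty) to the rect's four
    corners and return the bounding box, like CGRectApplyAffineTransform."""
    x, y, w, h = rect
    a, b, c, d, tx, ty = t
    corners = [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]
    pts = [(a * px + c * py + tx, b * px + d * py + ty) for px, py in corners]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))

frame = (0, 0, 1920, 1080)   # natural size of the landscape recording
rot90 = (0, 1, -1, 0, 0, 0)  # pure 90-degree rotation, no translation
extent = apply_transform(frame, rot90)
print(extent)  # (-1080, 0, 1080, 1920): the origin went negative

# Step 2's second pass translates by (-origin.x, -origin.y) to fix this:
shift = (1, 0, 0, 1, -extent[0], -extent[1])
print(apply_transform(extent, shift))  # (0, 0, 1080, 1920)
```

This is why the handler reads `sourceImage.extent` after the rotation and feeds a compensating `CGAffineTransformMakeTranslation` back through the same `CIAffineTransform` filter.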

The screenshots I took during debugging show that the image is successfully rotated and the filter is applied (in this example it was an identity filter, which does not change the image). Here are screenshot 1 and screenshot 2, taken at the points marked in the comments above:

As you can see, the rotation is successful and the extent of the resulting frame is also correct.

The problem starts when I try to play this video in a player. Here is what I get:

So it seems like all the frames are scaled and shifted down. The green area is empty frame space; when I clamp to the extent to make the frame infinite in size, it shows border pixels instead of green. I have a feeling the player still takes some old, pre-rotation size info from the AVPlayerItem, which is why I logged the sizes and transforms in the first code snippet above. Here are the logs:

Natural size is: {1920, 1080}
Preferred track transform is: [0, 1, -1, 0, 1080, 0]
Preferred asset transform is: [1, 0, 0, 1, 0, 0]
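As a cross-check of these logs: applying the logged track transform [0, 1, -1, 0, 1080, 0] to the 1920×1080 natural size yields a 1080×1920 portrait display size, i.e. the rotation lives in the track metadata while the stored pixel buffer stays landscape. A small standalone sketch (Python standing in for `CGRectApplyAffineTransform`):

```python
def apply_transform(rect, t):
    """Apply affine transform t = (a, b, c, d, tx, ty) to the rect's four
    corners and return the bounding box, like CGRectApplyAffineTransform."""
    x, y, w, h = rect
    a, b, c, d, tx, ty = t
    corners = [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]
    pts = [(a * px + c * py + tx, b * px + d * py + ty) for px, py in corners]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))

natural = (0, 0, 1920, 1080)
preferred = (0, 1, -1, 0, 1080, 0)  # the logged preferredTransform
print(apply_transform(natural, preferred))  # (0, 0, 1080, 1920)
```

Note that, unlike a bare rotation, this transform already includes the tx = 1080 translation, so the transformed extent lands exactly at the origin.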

The player is set up as follows:

layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
layer.needsDisplayOnBoundsChange = YES;

PLEASE NOTE the most important thing: this only happens to videos that were recorded by the app itself, using the camera in landscape orientation on an iPhone 6s, and previously saved to device storage. Videos the app records in portrait mode are totally fine (by the way, the portrait videos produce exactly the same size and transform logs as the landscape videos! Strange... maybe the iPhone puts the rotation info into the video and fixes it). So the zooming and shifting of the video looks like a combination of "aspect fill" and the old, pre-rotation resolution info. By the way, the portrait video frames are shown only partially because of scaling to fill a player area with a different aspect ratio, but this is expected behavior.

Let me know your thoughts on this, and if you know a better way to accomplish what I need, it would be great to hear it.

Recommended answer

UPDATE: There turns out to be an easier way to "change" the AVPlayerItem video dimensions during playback: set the renderSize property of the video composition (this can be done using the AVMutableVideoComposition class).

My old answer is below:

After a lot of debugging I understood the problem and found a solution. My initial guess that AVPlayer still considers the video to be its original size was correct. The image below explains what was happening:

As for the solution, I couldn't find a way to change the video size stored inside the AVAsset or the AVPlayerItem. So I simply manipulated the frames to fit the size and scale that AVPlayer was expecting; then, when playing in a player with the correct aspect ratio and the flag to scale and fill the player area, everything looks good. Here is the graphical explanation:

And here is the additional code that needs to be inserted into the applyingCIFiltersWithHandler block mentioned in the question:

// ... after Step 3 in the question's code above
// ('originalExtent' is assumed to be the source frame's extent captured at Step 1, before rotation)

// make the frame the same aspect ratio as the original input frame
// by adding empty spaces at the top and the bottom of the extent rectangle
CGFloat newHeight = originalExtent.size.height * originalExtent.size.height / extent.size.height;
CGFloat inset = (extent.size.height - newHeight) / 2;
extent = CGRectInset(extent, 0, inset);
sourceImage = [sourceImage imageByCroppingToRect:extent];

// scale down to the original frame size
CGFloat scale = originalExtent.size.height / newHeight;
CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scale, scale);
[transformFilter setValue:sourceImage forKey:kCIInputImageKey];
[transformFilter setValue:[NSValue valueWithCGAffineTransform:scaleTransform] forKey:kCIInputTransformKey];
sourceImage = transformFilter.outputImage;

// translate the frame to make its origin start at (0, 0)
CGAffineTransform translation = CGAffineTransformMakeTranslation(0, -inset * scale);
[transformFilter setValue:sourceImage forKey:kCIInputImageKey];
[transformFilter setValue:[NSValue valueWithCGAffineTransform:translation] forKey:kCIInputTransformKey];
sourceImage = transformFilter.outputImage;
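To sanity-check the arithmetic above against the logged sizes (a standalone sketch, assuming originalExtent is the 1920×1080 pre-rotation extent and the rotated extent is 1080×1920), the crop/scale steps restore exactly the frame size AVPlayer expects:

```python
original_h = 1080.0                    # originalExtent.size.height
rotated_w, rotated_h = 1080.0, 1920.0  # extent after the rotation in Step 2

new_height = original_h * original_h / rotated_h  # 607.5
inset = (rotated_h - new_height) / 2              # 656.25

# After CGRectInset(extent, 0, inset), the crop is 1080 x 607.5 -- the same
# 16:9 aspect ratio as the original 1920 x 1080 frame.
assert abs(rotated_w / new_height - 1920.0 / 1080.0) < 1e-9

# Scaling by original_h / new_height brings the crop back to 1920 x 1080,
# which is exactly the size AVPlayer believes the frames to be.
scale = original_h / new_height
assert abs(rotated_w * scale - 1920.0) < 1e-9
assert abs(new_height * scale - 1080.0) < 1e-9

# The crop's origin sits at y = inset; after scaling it is at inset * scale,
# so the final translation of (0, -inset * scale) moves it back to 0.
print(new_height, inset)  # 607.5 656.25
```

So the rotated portrait content ends up letterboxed inside a 1920×1080 frame at origin (0, 0), and the player's aspect-fill then displays it correctly.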

