iOS Video Capture in Practice (AVCaptureSession)

Goal: use AVFoundation's AVCaptureSession to set the camera's resolution and frame rate (including high frame rates), switch between the front and rear cameras, focus, handle screen rotation, adjust exposure, and more.


GitHub (with sample code): iOS Video Capture in Practice (AVCaptureSession)

Jianshu: iOS Video Capture in Practice (AVCaptureSession)

Blog: iOS Video Capture in Practice (AVCaptureSession)

Juejin: iOS Video Capture in Practice (AVCaptureSession)


1. Setting Resolution and Frame Rate

1.1. Low frame rate mode (fps <= 30)

When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one method sets the frame rate, another sets the resolution, and the two are not coupled.

  • Setting the resolution

    This method sets the camera resolution through a session preset. The available presets are listed in the API documentation (currently up to 3840x2160). If you never need a frame rate above 30 fps, this approach is all you need.

- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30, because above 30 fps we must use `activeFormat`, and `activeFormat` and `sessionPreset` are mutually exclusive
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the camera resolution repeatedly!");
        return;
    }
    
    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset !");
        return;
    }
    
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
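The helper `getSessionPresetByResolutionHeight:` called above is not shown in this excerpt (it lives in the linked GitHub project). A minimal sketch of what such a mapping might look like — the preset constants are real AVFoundation symbols, but the height-to-preset mapping itself is an assumption:

```objectivec
// Hypothetical sketch: map a requested height to an AVFoundation session preset.
- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    switch (height) {
        case 2160: return AVCaptureSessionPreset3840x2160;
        case 1080: return AVCaptureSessionPreset1920x1080;
        case 720:  return AVCaptureSessionPreset1280x720;
        default:   return AVCaptureSessionPreset640x480;
    }
}
```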
  • Setting the frame rate

    This method sets the camera frame rate; it only supports rates of at most 30 fps.

- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rate <= 30
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:NULL]) {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice unlockForConfiguration];
    }
}
1.2. High frame rate mode (fps > 30)

If you need a high frame rate (50, 60, 120 fps, ...) at a given resolution, the preset-based approach above no longer works. Apple requires setting the frame rate through setActiveVideoMinFrameDuration / setActiveVideoMaxFrameDuration in combination with the new resolution mechanism, the activeFormat property.

The new resolution mechanism, activeFormat, and sessionPreset are mutually exclusive: using one invalidates the other. It is advisable to adopt the high-frame-rate approach outright and drop the low-frame-rate methods, to avoid compatibility problems.

With this change Apple merged the previously separate resolution and frame-rate settings into one: each format pairs a resolution with the frame-rate ranges it supports (and each frame rate has the resolutions that support it), so we must enumerate the device's formats to find a match. The separate setters are effectively deprecated in high-frame-rate mode. Choose according to project needs: if you are certain the project will never exceed 30 fps, the older methods remain simple and effective.

Note: once activeFormat is used, the resolution previously set through sessionPreset automatically becomes AVCaptureSessionPresetInputPriority, so any existing if-statements that compare against canSetSessionPreset stop working. If the project must support high frame rates, abandon the sessionPreset approach entirely.

+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];
    
    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare the resolution supported by this format with the requested one
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                } else {
                    NSLog(@"%s: lock failed!", __func__);
                    [session commitConfiguration];
                    return NO;
                }
                [session commitConfiguration];
                NSLog(@"%s: set frame rate = %d, resolution height = %f", __func__, frameRate, resolutionHeight);
                return YES;
            }
        }
    }
    
    NSLog(@"%s: no matching format for frame rate %d and resolution height %f", __func__, frameRate, resolutionHeight);
    return NO;
}

+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession =  [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }
    return nil;
}
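A call site for the method above might look like this; `CameraManager` stands in for whatever class hosts these class methods in your project, and the chosen values are illustrative:

```objectivec
// Example (illustrative values): request 1080p at 60 fps in NV12 from the back camera.
BOOL isSuccess = [CameraManager setCameraFrameRateAndResolutionWithFrameRate:60
                                                         andResolutionHeight:1080
                                                                   bySession:self.session
                                                                    position:AVCaptureDevicePositionBack
                                                                 videoFormat:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
if (!isSuccess) {
    NSLog(@"The requested frame rate / resolution combination is not supported on this device.");
}
```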

2. Switching Between Front and Rear Cameras

Switching cameras looks trivial but causes plenty of problems in practice, because the front and rear cameras of the same device support different resolution and frame-rate combinations, so switching from a supported configuration to an unsupported one breaks. A concrete case:

For example, on iPhone X the rear camera supports up to (4K, 60 fps) while the front camera tops out at (2K, 30 fps). If you are capturing at (4K, 60 fps) on the rear camera and switch to the front camera without handling this, the switch fails and the session misbehaves.

Note

In the code below, the line `session.sessionPreset = AVCaptureSessionPresetLow;` matters: after switching from rear to front we must recompute the maximum resolution and frame rate the new input device supports, but we cannot query that until the input has been added to the session. So we first set an arbitrarily low, always-acceptable preset so the new input can be added, then compute the device's maximum supported values and apply the real resolution and frame rate.

- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];
        
        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];
        
        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                               error:&error];
        
        if (error) {
            NSLog(@"%s: error:%@",__func__, error.localizedDescription);
            [session commitConfiguration];
            return;
        }
        
        // e.g. the rear camera is at 4K but the front supports at most 2K, so we must downgrade;
        // until the new input is added to the session we cannot query the maximum resolution
        // the current camera supports.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput])  {
            self.input = newInput;
            [session addInput:newInput];
        }else {
            NSLog(@"%s: add input failed.",__func__);
            [session commitConfiguration];
            return;
        }
        
        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current support max resolution height = %d", __func__, maxResolutionHeight);
        }
        
        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current support max frame rate = %d",__func__, maxFrameRate);
        }

        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.",__func__);
        }
        
        [session commitConfiguration];
    }
}

3. Rotating the Video with the Screen

First we need to distinguish two concepts: the device orientation (UIDeviceOrientation) and the video orientation (AVCaptureVideoOrientation). With AVCaptureSession, supporting rotation means rotating the video frames whenever the screen rotates.

Device rotation can be observed through the UIDeviceOrientationDidChangeNotification notification; we won't elaborate on that here.
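For completeness, a minimal sketch of subscribing to that notification (the observer method name is an assumption):

```objectivec
// Start generating device-orientation notifications and observe them.
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleDeviceOrientationChange)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];
```

The handler would read `[UIDevice currentDevice].orientation` and pass it to the adjustment method shown next.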

- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];
    
    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortrait];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortraitUpsideDown];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            // Note: device landscape-left (home button on the right) corresponds to
            // AVCaptureVideoOrientationLandscapeRight, and vice versa.
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
            
        default:
            break;
            
    }
}

-(void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for(AVCaptureConnection *connection in videoOutput.connections) {
        for(AVCaptureInputPort *port in [connection inputPorts]) {
            if([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}


4. Focus Adjustment

For focus, the case that deserves special attention is manually setting a focus point, because the focus API only accepts coordinates in a normalized system whose top-left corner is (0,0) and bottom-right corner is (1,1). We therefore have to convert from the UIView coordinate system, and the conversion depends on several factors:

  • Whether the video output is mirrored: the front camera may mirror its output, in which case the x coordinate is flipped.
  • Whether landscape orientation has the Home button on the right or on the left: with Home on the right the origin is the top-left corner; with Home on the left it is the bottom-right corner.
  • How the video is rendered: aspect-preserving or filling. Depending on the device model the frame may be letterboxed (black bars) or cropped beyond the screen, and the focus point must be recomputed accordingly.

If we render directly with AVCaptureSession's AVCaptureVideoPreviewLayer, we can use the captureDevicePointOfInterestForPoint: method to compute the conversion automatically; its result accounts for all of the cases above. But if we render the frames ourselves, we must compute the focus point manually, taking every case above into consideration. Both the automatic and the manual computation are given below.

- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            if ([device isExposurePointOfInterestSupported]) {
                [device setExposurePointOfInterest:point];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            }
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
4.1. Automatically computing the focus point
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;
    
    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    // Convert the UIKit coordinate to a normalized focus point in (0,0)~(1,1)
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];
    
    // NSLog(@"Focus - Auto test: %@",NSStringFromCGPoint(pointOfInterest));
    
    return pointOfInterest;
}

4.2. Manually computing the focus point
  • If the screen's aspect ratio exactly matches the video's, simply normalize the coordinates into the (0,0)–(1,1) range.
  • If the ratios differ, the rendering mode must be taken into account: in aspect-fit mode black bars remain, and their length must be subtracted when computing the focus point; in aspect-fill mode some pixels are cropped away to fill the screen, and the cropped amount must be added back.
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    
    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }
    
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;
            
            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;
        
            if (resolutionRatio == screenSizeRatio) {
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            }else if (resolutionRatio > screenSizeRatio) {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                }else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]){
                    CGFloat needScreenHeight = frameSize.width * (1/resolutionRatio);
                    CGFloat blackBarLength   = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                }else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }else {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenHeight = (1/resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                }else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]){
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength   = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                }else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }
            pointOfInterest = CGPointMake(xc, yc);
        }
    }
    
    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1-pointOfInterest.x, 1-pointOfInterest.y);
        }
    }else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1-pointOfInterest.y);
    }
    
    //NSLog(@"Focus - manu test: %@",NSStringFromCGPoint(pointOfInterest));
    return pointOfInterest;
}

5. Exposure Adjustment

If a UISlider is used as the control, the simplest approach is to give it the same range as the exposure bias itself (-8 to 8), so the value can be passed straight through without conversion. A gesture or other control can be adapted similarly; it is straightforward, so we won't dwell on it.

- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
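Rather than hard-coding the -8 to 8 range, the supported range can be queried from the device itself via minExposureTargetBias / maxExposureTargetBias; a minimal sketch (the method name is an assumption):

```objectivec
// Clamp the requested exposure bias to the device's supported range before applying it.
- (void)setClampedExposureBias:(CGFloat)bias device:(AVCaptureDevice *)device {
    CGFloat clamped = MAX(device.minExposureTargetBias,
                          MIN(device.maxExposureTargetBias, bias));
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:clamped completionHandler:nil];
        [device unlockForConfiguration];
    }
}
```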

6. Torch Mode

  • AVCaptureTorchModeAuto: automatic
  • AVCaptureTorchModeOn: on
  • AVCaptureTorchModeOff: off
- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    }else {
        NSLog(@"The device does not support torch!");
    }
}

7. Video Stabilization

Note: on some device models and at some resolutions, rendering with this property enabled may cause problems (e.g. iPhone XS with custom rendering).

-(void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession =  [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for(AVCaptureDevice *device in devices){
        if([device hasMediaType:AVMediaTypeVideo]){
            if([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeStandard]) {
                for(AVCaptureConnection *connection in output.connections) {
                    for(AVCaptureInputPort *port in [connection inputPorts]) {
                        if([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if(connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld",(long)connection.activeVideoStabilizationMode);
                            }else {
                                NSLog(@"connection doesn't support video stabilization");
                            }
                        }
                    }
                }
            }else{
                NSLog(@"device doesn't support video stabilization");
            }
        }
    }
}


8. White Balance Adjustment

  • temperature: adjusted via colour temperature (-150 ~ 250 in the demo UI; note the underlying API value, AVCaptureWhiteBalanceTemperatureAndTintValues.temperature, is a colour temperature in Kelvin)
  • tint: adjusted via tint (-150 ~ 150)

Note: before calling setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:, you must ensure the AVCaptureWhiteBalanceGains values are within the valid range (each gain between 1.0 and maxWhiteBalanceGain).

-(AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain   = MAX(MIN(tmpGains.blueGain , maxVal), minVal);
    tmpGains.redGain    = MAX(MIN(tmpGains.redGain  , maxVal), minVal);
    tmpGains.greenGain  = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    
    return tmpGains;
}

-(void)setWhiteBlanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if (![device lockForConfiguration:nil]) { return; }
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint        = currentTint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

-(void)setWhiteBlanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if (![device lockForConfiguration:nil]) { return; }
        CGFloat maxWhiteBalaceGain = device.maxWhiteBalanceGain;
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalaceGain];
        CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = currentTemperature,
            .tint        = tint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalaceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}


9. Screen Fill Modes

  • AVLayerVideoGravityResizeAspect: preserves the aspect ratio; if the screen's aspect ratio differs from the video's, black bars are left.
  • AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio while filling the screen, scaling so the shorter side fills the screen; some pixels are sacrificed because they extend beyond the screen.
  • AVLayerVideoGravityResize: stretches the video to fill the screen; no pixels are lost, but the image is distorted.
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer session:(AVCaptureSession *)session {
    [session beginConfiguration];
    [previewLayer setVideoGravity:videoGravity];
    [session commitConfiguration];
}