Capturing Media with AVFoundation

1. Overview of the Capture APIs

  • Capture session

AVCaptureSession links input and output resources: it pulls data streams from physical devices such as the camera and microphone and routes them to one or more destinations. A session can also be configured with a session preset that controls the format and quality of the captured data; the default is AVCaptureSessionPresetHigh.

  • Capture device

AVCaptureDevice defines a unified interface for physical devices, along with a large set of control methods. The default device for a given media type is obtained as follows:

self.activeVideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
  • Capture device input

An AVCaptureDevice cannot be added to an AVCaptureSession directly; it must first be wrapped in an AVCaptureDeviceInput.

NSError *videoError = nil;
self.captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:self.activeVideoDevice error:&videoError];
if (self.captureVideoInput) {
    if ([self.captureSession canAddInput:self.captureVideoInput]) {
        [self.captureSession addInput:self.captureVideoInput];
    }
} else if (videoError) {
    // handle the error, e.g. log it or surface it to the user
}
  • Capture output

AVCaptureOutput is the abstract base class for destinations of a capture session's data streams; a number of concrete, higher-level subclasses are provided:

  • AVCaptureStillImageOutput - still photos

  • AVCaptureMovieFileOutput - movie files

  • AVCaptureAudioFileOutput - audio files

  • AVCaptureAudioDataOutput - low-level audio sample buffers

  • AVCaptureVideoDataOutput - low-level video sample buffers

  • Capture connection

AVCaptureConnection identifies which inputs produce video and which produce audio; it lets you disable a specific connection or access an individual audio track.

  • Capture preview

AVCaptureVideoPreviewLayer is a CALayer subclass that provides a real-time preview of the captured video, as in the wiring sketch below.
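
Putting these pieces together, here is a minimal wiring sketch. Names are placeholders, error handling is elided, and the individual steps are covered in detail in section 2.

// A minimal wiring sketch: session + preset, camera input, still-image output,
// a connection, and a preview layer. Names are placeholders.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (cameraInput && [session canAddInput:cameraInput]) {
    [session addInput:cameraInput];
}

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}

// A connection is created implicitly when compatible inputs and outputs are added.
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if (videoConnection.isVideoOrientationSupported) {
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Preview layer for on-screen display.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;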

2. In Practice

2.1 Creating a Preview View

The simplest approach is to add an AVCaptureVideoPreviewLayer object directly to a view's layer:

self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] init];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [self.previewLayer setSession:self.cameraHelper.captureSession];
    self.previewLayer.frame = CGRectMake(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT - 50);
    [self.previewImageView.layer addSublayer:self.previewLayer];

Alternatively, you can override the view's layerClass class method so that the view's backing layer is itself an AVCaptureVideoPreviewLayer:

+ (Class)layerClass {
	return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureSession*)session {
	return [(AVCaptureVideoPreviewLayer*)self.layer session];
}

- (void)setSession:(AVCaptureSession *)session {
	[(AVCaptureVideoPreviewLayer*)self.layer setSession:session];
}

2.1.1 Coordinate Conversion

AVCaptureVideoPreviewLayer defines two methods for converting between the screen (layer) coordinate system and the device coordinate system. The device coordinate system places (0,0) at the top-left corner and (1,1) at the bottom-right. A tap-to-focus conversion example follows the list below.

  • (CGPoint)captureDevicePointOfInterestForPoint:(CGPoint)pointInLayer converts a point in the layer's coordinate system to the device coordinate system
  • (CGPoint)pointForCaptureDevicePointOfInterest:(CGPoint)captureDevicePointOfInterest converts a point in the device coordinate system to the layer's coordinate system
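
For example, a tap-to-focus handler might convert the tapped point before passing it on to the focus code shown in section 2.6. This is only a sketch: the gesture wiring, the cameraHelper property, and the focusAtPoint: call are assumptions, and the preview layer is assumed to cover the tapped view.

- (void)handlePreviewTap:(UITapGestureRecognizer *)gesture {
    CGPoint layerPoint = [gesture locationInView:gesture.view];
    // Convert from layer coordinates (points) to the device's coordinate
    // space, where (0,0) is the top-left and (1,1) the bottom-right corner.
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:layerPoint];
    [self.cameraHelper focusAtPoint:devicePoint];
}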

2.2 Setting Up the Capture Session

First, initialize the capture session:

self.captureSession = [[AVCaptureSession alloc]init];
    [self.captureSession setSessionPreset:(self.isVideoMode)?AVCaptureSessionPreset1280x720:AVCaptureSessionPresetPhoto];

A different preset is chosen depending on whether we are recording video or taking photos. Next, configure the session inputs.

- (void)configSessionInput
{
    // Camera input
    NSError *videoError = nil;
    self.activeVideoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.flashMode = self.activeVideoDevice.flashMode;
    self.captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:self.activeVideoDevice error:&videoError];
    if (self.captureVideoInput) {
        if ([self.captureSession canAddInput:self.captureVideoInput]) {
            [self.captureSession addInput:self.captureVideoInput];
        }
    } else if (videoError) {
        // handle the error
    }
    
    if (self.isVideoMode) {
        // Microphone input
        NSError *audioError = nil;
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:&audioError];
        if (audioInput) {
            if ([self.captureSession canAddInput:audioInput]) {
                [self.captureSession addInput:audioInput];
            }
        } else if (audioError) {
            // handle the error
        }
    }
}

Both the camera and the microphone are wrapped in AVCaptureDeviceInput instances before being added to the session.

Then configure the session outputs.

- (void)configSessionOutput
{
    if (self.isVideoMode) {
        // Movie file output
        self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([self.captureSession canAddOutput:self.movieFileOutput]) {
            [self.captureSession addOutput:self.movieFileOutput];
        }
    } else {
        // Still image output
        self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
        self.imageOutput.outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG}; // request JPEG output
        if ([self.captureSession canAddOutput:self.imageOutput]) {
            [self.captureSession addOutput:self.imageOutput];
        }
    }
}

2.3 Starting and Stopping the Session

The session is typically started and stopped within a view controller's lifecycle:

- (void)startSession {
	if (![self.captureSession isRunning]) {                                 // 1
		dispatch_async([self globalQueue], ^{
			[self.captureSession startRunning];
		});
	}
}

- (void)stopSession {
	if ([self.captureSession isRunning]) {                                  // 2
		dispatch_async([self globalQueue], ^{
			[self.captureSession stopRunning];
		});
	}
}

因爲這個操做是比較耗時的同步操做,所以建議在異步線程裏執行此方法。

2.4 Requesting Permissions

If camera and microphone permissions have not been granted, setting up captureVideoInput will fail.

/// Check the AVAuthorization status
/// Pass the AVMediaType to check: AVMediaTypeVideo or AVMediaTypeAudio
/// Returns whether the permission is currently available
- (BOOL)ifAVAuthorizationValid:(NSString *)targetAVMediaType grantedCallback:(void (^)(void))grantedCallback
{
    NSString *mediaType = targetAVMediaType;
    BOOL result = NO;
    if ([AVCaptureDevice respondsToSelector:@selector(authorizationStatusForMediaType:)]) {
        AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:mediaType];
        switch (authStatus) {
            case AVAuthorizationStatusNotDetermined: { // authorization has not been requested yet
                [AVCaptureDevice requestAccessForMediaType:targetAVMediaType completionHandler:^(BOOL granted) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        if (granted) {
                            grantedCallback();
                        }
                    });
                }];
                break;
            }
            case AVAuthorizationStatusDenied: { // explicitly denied by the user
                if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                    [METSettingPermissionAlertView showAlertViewWithPermissionType:METSettingPermissionTypeCamera]; // prompt for camera permission
                } else if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                    [METSettingPermissionAlertView showAlertViewWithPermissionType:METSettingPermissionTypeMicrophone]; // prompt for microphone permission
                }
                break;
            }
            case AVAuthorizationStatusRestricted: { // access restricted and cannot be changed by the user
                break;
            }
            case AVAuthorizationStatusAuthorized: { // already authorized
                result = YES;
                break;
            }
            default: // fallback
                break;
        }
    }
    return result;
}

This method lets you handle each authorization state appropriately and avoid failures caused by missing permissions. Because the user can change these settings at any time in the system preferences, the check should be performed every time before the camera is started; a usage sketch follows.
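
A hedged usage sketch: prepareCameraIfAuthorized is a hypothetical name, and startSession is the method from section 2.3.

- (void)prepareCameraIfAuthorized {
    __weak typeof(self) weakSelf = self;
    BOOL alreadyAuthorized = [self ifAVAuthorizationValid:AVMediaTypeVideo grantedCallback:^{
        [weakSelf startSession]; // the user granted access in the system prompt just now
    }];
    if (alreadyAuthorized) {
        [self startSession]; // access was granted on an earlier launch
    }
}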

2.5 Switching Cameras

Most iOS devices have both a front and a back camera; the AVCaptureDevicePosition enum identifies them:

typedef NS_ENUM(NSInteger, AVCaptureDevicePosition) {
    AVCaptureDevicePositionUnspecified = 0, // unspecified
    AVCaptureDevicePositionBack        = 1, // back camera
    AVCaptureDevicePositionFront       = 2, // front camera
};

Before switching cameras, first check whether switching is possible at all:

- (BOOL)canSwitchCameras {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] > 1;
}

Next, obtain the currently active device:

- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}

The active device can be read from the AVCaptureDeviceInput; then find the device on the opposite side:

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position { // 1
	NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
	for (AVCaptureDevice *device in devices) {                              // 2
		if (device.position == position) {
			return device;
		}
	}
	return nil;
}

Once the target device has been found, wrap it in an AVCaptureDeviceInput and reconfigure the session:

[self.captureSession beginConfiguration]; // start configuring the new video input
[self.captureSession removeInput:self.captureVideoInput]; // the old input must be removed before the new one can be added
if ([self.captureSession canAddInput:newInput]) {
    [self.captureSession addInput:newInput];
    self.activeVideoDevice = newActiveDevice;
    self.captureVideoInput = newInput;
} else {
    [self.captureSession addInput:self.captureVideoInput]; // fall back to the previous input
}
[self.captureSession commitConfiguration];

Here beginConfiguration and commitConfiguration turn the changes into a single atomic update, so the session keeps running safely while the inputs are swapped. A complete switch routine is sketched below.
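
A sketch of a complete camera switch combining the pieces above. Property and method names follow the earlier snippets; switchCameras itself is an assumed name.

- (BOOL)switchCameras {
    if (![self canSwitchCameras]) {
        return NO;
    }
    AVCaptureDevicePosition targetPosition =
        (self.activeVideoDevice.position == AVCaptureDevicePositionBack)
            ? AVCaptureDevicePositionFront
            : AVCaptureDevicePositionBack;
    AVCaptureDevice *newActiveDevice = [self cameraWithPosition:targetPosition];
    NSError *error = nil;
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newActiveDevice error:&error];
    if (!newInput) {
        return NO; // could not create an input for the target camera
    }
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.captureVideoInput];
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
        self.activeVideoDevice = newActiveDevice;
        self.captureVideoInput = newInput;
    } else {
        [self.captureSession addInput:self.captureVideoInput]; // restore the previous input
    }
    [self.captureSession commitConfiguration];
    return YES;
}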

2.6 Adjusting Focus and Exposure

The key points here are testing whether a setting is supported before applying it, and locking/unlocking the device for configuration around the change.

  • Focus
- (BOOL)cameraSupportsTapToFocus {
    return [self.activeVideoInput.device isFocusPointOfInterestSupported];
}

- (void)focusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.activeVideoInput.device;
    if (device.isFocusPointOfInterestSupported &&
        [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.focusPointOfInterest = point;
            device.focusMode = AVCaptureFocusModeAutoFocus;
            [device unlockForConfiguration];
        } else {
            // handle the lock error
        }
    }
}

isFocusPointOfInterestSupported indicates whether the device supports focusing at a point of interest, and isFocusModeSupported: checks whether a given focus mode is available; AVCaptureFocusModeAutoFocus is single-shot autofocus. Once both checks pass, the focus settings are applied.

  • Exposure

Exposure works very much like focus; the core calls are shown below, and a complete, locked version is sketched after the snippet.

[self.activeVideoDevice setExposurePointOfInterest:focusPoint];
[self.activeVideoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
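
The snippet above omits the configuration lock. A minimal sketch of a complete exposure method, mirroring the focus example; the method name exposeAtPoint: is an assumption.

- (void)exposeAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.activeVideoDevice;
    if (device.isExposurePointOfInterestSupported &&
        [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            [device unlockForConfiguration];
        } else {
            // handle the lock error
        }
    }
}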

2.7 Adjusting Flash and Torch Modes

The flash and the torch are two different modes, defined as follows:

typedef NS_ENUM(NSInteger, AVCaptureFlashMode) {
    AVCaptureFlashModeOff  = 0,
    AVCaptureFlashModeOn   = 1,
    AVCaptureFlashModeAuto = 2,
};

typedef NS_ENUM(NSInteger, AVCaptureTorchMode) {
    AVCaptureTorchModeOff  = 0,
    AVCaptureTorchModeOn   = 1,
    AVCaptureTorchModeAuto = 2,
};

Typically the flash is configured when taking photos and the torch when recording video. The configuration code is as follows:

- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}

- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}

- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    AVCaptureDevice *device = [self activeCamera];
    if (device.flashMode != flashMode &&
        [device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = flashMode;
            [device unlockForConfiguration];
        } else {
            // handle the lock error
        }
    }
}

- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}

- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}

- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if (device.torchMode != torchMode &&
        [device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            // handle the lock error
        }
    }
}

2.8 Capturing Still Images

When the capture session was set up we added an AVCaptureStillImageOutput instance to it; this output is used to capture still images.

AVCaptureConnection *connection = [self.cameraHelper.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoOrientationSupported]) {
        [connection setVideoOrientation:self.cameraHelper.videoOrientation];
    }
    if (!connection.enabled || !connection.isActive) { // the connection is not usable
        // handle the invalid state
        return;
    }

Here we obtain an AVCaptureConnection from the AVCaptureStillImageOutput instance and then need to set the connection's video orientation. There are two ways to obtain the orientation value.

  • Derive the orientation by monitoring the gravity sensor
// Monitor the gravity sensor and adjust the orientation accordingly
    CMMotionManager *motionManager = [[CMMotionManager alloc] init];
    motionManager.deviceMotionUpdateInterval = 1/15.0;
    if (motionManager.deviceMotionAvailable) {
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                           withHandler: ^(CMDeviceMotion *motion, NSError *error){
                                               double x = motion.gravity.x;
                                               double y = motion.gravity.y;
                                               if (fabs(y) >= fabs(x)) { // the y component dominates
                                                   if (y >= 0) { // top of the device points down
                                                       self.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown; // UIDeviceOrientationPortraitUpsideDown
                                                   } else { // top of the device points up
                                                       self.videoOrientation = AVCaptureVideoOrientationPortrait; // UIDeviceOrientationPortrait
                                                   }
                                               } else {
                                                   if (x >= 0) { // top of the device points right
                                                       self.videoOrientation = AVCaptureVideoOrientationLandscapeLeft; // UIDeviceOrientationLandscapeRight
                                                   } else { // top of the device points left
                                                       self.videoOrientation = AVCaptureVideoOrientationLandscapeRight; // UIDeviceOrientationLandscapeLeft
                                                   }
                                               }
                                           }];
        self.motionManager = motionManager;
    } else {
        self.videoOrientation = AVCaptureVideoOrientationPortrait;
    }

Pay attention to the enum names here: AVCaptureVideoOrientationLandscapeLeft means the home button is on the left, while AVCaptureVideoOrientationLandscapeRight means the home button is on the right.

  • Read the orientation from UIDevice
AVCaptureVideoOrientation orientation;

    switch ([UIDevice currentDevice].orientation) {                         // 3
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }

    return orientation;

Note again that UIDeviceOrientationLandscapeRight means the home button is on the left, while UIDeviceOrientationLandscapeLeft means the home button is on the right.

Finally, the capture method is called to obtain a CMSampleBufferRef. CMSampleBufferRef is a Core Foundation object defined by Core Media; it can be converted to NSData using AVCaptureStillImageOutput's class method jpegStillImageNSDataRepresentation:.

@weakify(self)
    [self.cameraHelper.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        @strongify(self)
        if (!error && imageDataSampleBuffer) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            if (!imageData) { return; }
            UIImage *image = [UIImage imageWithData:imageData];
            if (!image) { return; }
            // use the captured image, e.g. save it to the photo library
        }
    }];

2.9 Saving Images

The Assets Library framework covered in 《AVFoundation 開發祕籍》 (Learning AV Foundation) has been superseded by the Photos framework since iOS 8, so the Photos framework is used here to save the image.

__block NSString *imageIdentifier;
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:targetImage];
        imageIdentifier = changeRequest.placeholderForCreatedAsset.localIdentifier;
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        // `success` indicates whether the image was added to the photo library
    }];

The imageIdentifier captured while saving can later be used to find the image in the photo library, for example as sketched below.
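
A sketch of fetching the saved asset by its local identifier and requesting a thumbnail from it; the target size and delivery mode are arbitrary examples.

PHAsset *asset = [PHAsset fetchAssetsWithLocalIdentifiers:@[imageIdentifier] options:nil].firstObject;
if (asset) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:CGSizeMake(300, 300)
                                              contentMode:PHImageContentModeAspectFill
                                                  options:options
                                            resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
                                                // use `result`, e.g. as a thumbnail in the UI
                                            }];
}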

2.10 Capturing Video

In a QuickTime movie the metadata sits at the beginning of the file, which lets a player quickly read the header to determine the file's content, structure, and sample locations. However, the header can only be created after all samples have been captured, and is then appended at the end of the file. As a result, if recording crashes or is interrupted, no movie header can be written and an unreadable file is left on disk.

AVFoundation's AVCaptureMovieFileOutput therefore supports fragmented capture: a minimal header is written when recording starts, and the header is updated at regular intervals as fragments complete, so the movie is built up incrementally. By default a fragment is written every 10 seconds; this can be changed via the movieFragmentInterval property, as shown below.
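
A minimal sketch of changing the fragment interval; the 5-second value is just an example.

self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
self.movieFileOutput.movieFragmentInterval = CMTimeMake(5, 1); // write a fragment every 5 seconds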

First, start recording:

AVCaptureConnection *videoConnection = [self.cameraHelper.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([videoConnection isVideoOrientationSupported]) {
        [videoConnection setVideoOrientation:self.cameraHelper.videoOrientation];
    }
    
    if ([videoConnection isVideoStabilizationSupported]) {
        [videoConnection setPreferredVideoStabilizationMode:AVCaptureVideoStabilizationModeAuto];
    }
    
    [videoConnection setVideoScaleAndCropFactor:1.0];
    if (![self.cameraHelper.movieFileOutput isRecording] && videoConnection.isActive && videoConnection.isEnabled) {
        // the video connection is usable; start recording
        self.countTimer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(refreshTimeLabel) userInfo:nil repeats:YES];
        NSString *urlString = [NSTemporaryDirectory() stringByAppendingString:[NSString stringWithFormat:@"%.0f.mov", [[NSDate date] timeIntervalSince1970] * 1000]];
        NSURL *url = [NSURL fileURLWithPath:urlString];
        [self.cameraHelper.movieFileOutput startRecordingToOutputFileURL:url recordingDelegate:self];
        [self.captureButton setTitle:@"Stop" forState:UIControlStateNormal];
    } else {
        // the connection is not available
    }

Setting preferredVideoStabilizationMode improves the stability and quality of the recorded video, but the stabilization is only visible in the recorded footage; it cannot be seen in the live preview.

The video is written to a temporary file. When recording finishes, AVCaptureFileOutputRecordingDelegate's captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: method is called, as sketched below. That is the place to save the video and generate a thumbnail for it.
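
A sketch of that delegate callback, forwarding the recorded file to the saveVideo: method shown next; error handling is kept minimal.

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        // handle the recording error
        return;
    }
    [self saveVideo:outputFileURL];
}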

- (void)saveVideo:(NSURL *)videoURL
{
    __block NSString *imageIdentifier;
    @weakify(self)
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // Save the video to the photo library
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:videoURL];
        imageIdentifier = changeRequest.placeholderForCreatedAsset.localIdentifier;
    } completionHandler:^( BOOL success, NSError * _Nullable error ) {
        @strongify(self)
        dispatch_async(dispatch_get_main_queue(), ^{
            @strongify(self)
            [self resetTimeCounter];
            if (!success) {
                // 錯誤處理
            } else {
                PHAsset *asset = [PHAsset fetchAssetsWithLocalIdentifiers:@[imageIdentifier] options:nil].firstObject;
                if (asset && asset.mediaType == PHAssetMediaTypeVideo) {
                    PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
                    options.version = PHImageRequestOptionsVersionCurrent;
                    options.deliveryMode = PHVideoRequestOptionsDeliveryModeAutomatic;
                    [[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset * _Nullable obj, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
                        @strongify(self)
                        [self resolveAVAsset:obj identifier:asset.localIdentifier];
                    }];
                }
            }
        });
    }];
}
    
- (void)resolveAVAsset:(AVAsset *)asset identifier:(NSString *)identifier
{
    if (!asset) {
        return;
    }
    if (![asset isKindOfClass:[AVURLAsset class]]) {
        return;
    }
    AVURLAsset *urlAsset = (AVURLAsset *)asset;
    NSURL *url = urlAsset.URL;
    NSData *data = [NSData dataWithContentsOfURL:url];
    
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the video's orientation so the thumbnail is not rotated incorrectly
    CMTime snaptime = kCMTimeZero;
    CGImageRef cgImageRef = [generator copyCGImageAtTime:snaptime actualTime:NULL error:nil];
    UIImage *assetImage = [UIImage imageWithCGImage:cgImageRef];
    CGImageRelease(cgImageRef);
}