How to Design a General, Stable Audio/Video Framework with AVFoundation

Preface

Following up on the previous post, 《AV Foundation開發祕籍——實踐掌握iOS & OS X應用的視聽處理技術 閱讀指南》, this article explains how to design a general, stable audio/video framework on top of AVFoundation.

Core Idea

AVCaptureSession runs the capture task; AVCaptureDeviceInput configures the task's input sources (the various cameras); the various data outputs in AVFoundation deliver the data (metadata, video frames, audio frames); and AVAssetWriter runs the write task, archiving the audio/video data into a media file.
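Before diving in, a minimal sketch of that pipeline may help orient you. It is heavily simplified: no error handling, and the AVAssetWriter wiring is only hinted at in comments; the sections below show how the framework actually assembles each piece.

//Minimal capture pipeline sketch (simplified, error handling omitted).
AVCaptureSession *session = [[AVCaptureSession alloc] init];

//Input: the default camera.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) {
    [session addInput:input];
}

//Output: raw video frames, delivered on a serial queue.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("video", DISPATCH_QUEUE_SERIAL)];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

[session startRunning];
//Each frame then arrives in captureOutput:didOutputSampleBuffer:fromConnection:,
//where it can be handed to an AVAssetWriter input to archive it into a media file.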

Features

Video stream preview, recording to file, photo capture, camera switching, face detection, frame-rate configuration, and detailed camera configuration.

Framework Source

github.com/caixindong/…

Design Details

Core modules: XDCaptureService & XDVideoWritter

XDCaptureService is the single entry point of the public API and the core class of the framework; it does the audio/video input/output configuration and the scheduling. XDVideoWritter is the writing module; it provides the basic operations for writing and archiving data and is not exposed publicly. The public API:

@class XDCaptureService;


@protocol XDCaptureServiceDelegate <NSObject>

@optional
//service lifecycle
- (void)captureServiceDidStartService:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service serviceDidFailWithError:(NSError *)error;

- (void)captureServiceDidStopService:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;

- (void)captureService:(XDCaptureService *)service outputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

//recording
- (void)captureServiceRecorderDidStart:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service recorderDidFailWithError:(NSError *)error;

- (void)captureServiceRecorderDidStop:(XDCaptureService *)service;

//photo capture
- (void)captureService:(XDCaptureService *)service capturePhoto:(UIImage *)photo;

//face detection
- (void)captureService:(XDCaptureService *)service outputFaceDetectData:(NSArray <AVMetadataFaceObject*>*) faces;

//depth data
- (void)captureService:(XDCaptureService *)service captureTrueDepth:(AVDepthData *)depthData API_AVAILABLE(ios(11.0));

@end

@protocol XDCaptureServicePreViewSource <NSObject>

- (AVCaptureVideoPreviewLayer *)preViewLayerSource;

@end

@interface XDCaptureService : NSObject

//whether to record audio; defaults to NO
@property (nonatomic, assign) BOOL shouldRecordAudio;

//native iOS face detection; defaults to NO
@property (nonatomic, assign) BOOL openNativeFaceDetect;

//camera position; defaults to AVCaptureDevicePositionFront (front camera)
@property (nonatomic, assign) AVCaptureDevicePosition devicePosition;

//whether depth capture is supported; currently only the rear cameras of the 7 Plus/8 Plus/X and the front camera of the X, and requires iOS 11 or later
@property (nonatomic, assign, readonly) BOOL depthSupported;

//whether depth capture is enabled; defaults to NO
@property (nonatomic, assign) BOOL openDepth;

//depth data is only available with these session presets: AVCaptureSessionPresetPhoto, AVCaptureSessionPreset1280x720, AVCaptureSessionPreset640x480
@property (nonatomic, copy) AVCaptureSessionPreset sessionPreset;

//frame rate; defaults to 30
@property (nonatomic, assign) int frameRate;

//temporary URL of the recording; relocating the file after each recording is recommended
@property (nonatomic, strong, readonly) NSURL *recordURL;

//if preViewSource is set, no AVCaptureVideoPreviewLayer is created internally
@property (nonatomic, assign) id<XDCaptureServicePreViewSource> preViewSource;

@property (nonatomic, assign) id<XDCaptureServiceDelegate> delegate;

@property (nonatomic, assign, readonly) BOOL isRunning;


//video encoding settings (affect the codec and size of the recorded video)
@property (nonatomic, strong) NSDictionary *videoSetting;

///advanced camera settings; usually left alone unless you have specific needs
//ISO (iOS 8 and later)
@property (nonatomic, assign, readonly) CGFloat deviceISO;
@property (nonatomic, assign, readonly) CGFloat deviceMinISO;
@property (nonatomic, assign, readonly) CGFloat deviceMaxISO;

//lens aperture
@property (nonatomic, assign, readonly) CGFloat deviceAperture;

//exposure
@property (nonatomic, assign, readonly) BOOL supportsTapToExpose;
@property (nonatomic, assign) AVCaptureExposureMode exposureMode;
@property (nonatomic, assign) CGPoint exposurePoint;
@property (nonatomic, assign, readonly) CMTime deviceExposureDuration;

//focus
@property (nonatomic, assign, readonly) BOOL supportsTapToFocus;
@property (nonatomic, assign) AVCaptureFocusMode focusMode;
@property (nonatomic, assign) CGPoint focusPoint;

//white balance
@property (nonatomic, assign) AVCaptureWhiteBalanceMode whiteBalanceMode;

//torch
@property (nonatomic, assign, readonly) BOOL hasTorch;
@property (nonatomic, assign) AVCaptureTorchMode torchMode;

//flash
@property (nonatomic, assign, readonly) BOOL hasFlash;
@property (nonatomic, assign) AVCaptureFlashMode flashMode;

//camera permission check
+ (BOOL)videoGranted;

//microphone permission check
+ (BOOL)audioGranted;

//switch between front and back cameras
- (void)switchCamera;

//start
- (void)startRunning;

//stop
- (void)stopRunning;

//start recording
- (void)startRecording;

//cancel recording
- (void)cancleRecording;

//stop recording
- (void)stopRecording;

//capture a photo
- (void)capturePhoto;

@end
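For context, here is a hypothetical usage sketch from a host view controller. The service property and the view wiring are illustrative assumptions; the calls themselves come from the API above:

//Hypothetical host-side usage of XDCaptureService (illustrative only).
- (void)setupCaptureService {
    if (![XDCaptureService videoGranted] || ![XDCaptureService audioGranted]) {
        NSLog(@"camera or microphone permission missing");
        return;
    }
    XDCaptureService *service = [[XDCaptureService alloc] init];
    service.delegate = self;
    service.shouldRecordAudio = YES;
    service.devicePosition = AVCaptureDevicePositionBack;
    service.frameRate = 30;
    self.service = service; //assumed strong property on this controller
    [service startRunning];
}

//Receive the internally created preview layer and attach it to the view hierarchy.
- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer {
    dispatch_async(dispatch_get_main_queue(), ^{
        previewLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:previewLayer];
    });
}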

Splitting work across GCD queues

Starting capture and reading/writing audio/video data on the main thread would block it, so these tasks must be dispatched to background threads, and we use GCD queues for the job. The framework configures three queues: sessionQueue, writtingQueue, and outputQueue. All three are serial, because audio/video operations are order-sensitive (timing matters) and only one operation (configuration, writing, or reading) should execute on a queue at a time. sessionQueue schedules starting and stopping the capture task, writtingQueue schedules writes so that frames are archived to file correctly, and outputQueue handles delivering frames to the outside.

@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t writtingQueue;
@property (nonatomic, strong) dispatch_queue_t outputQueue;

_sessionQueue = dispatch_queue_create("com.caixindong.captureservice.session", DISPATCH_QUEUE_SERIAL);
_writtingQueue = dispatch_queue_create("com.caixindong.captureservice.writting", DISPATCH_QUEUE_SERIAL);
_outputQueue = dispatch_queue_create("com.caixindong.captureservice.output", DISPATCH_QUEUE_SERIAL);

Audio/Video Capture

Initializing the capture session

sessionPreset specifies the pixel dimensions of the output video frames, e.g. 640×480.

@property (nonatomic, strong) AVCaptureSession *captureSession;
_captureSession = [[AVCaptureSession alloc] init];
_captureSession.sessionPreset = _sessionPreset;

Configuring the capture input

First get the input device: _cameraWithPosition returns the abstract representation of a camera. Because the infrared (TrueDepth) camera and the dual camera are only available through newer APIs, the method already handles backward compatibility. The device is then used to build the AVCaptureDeviceInput.

@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;

- (BOOL)_setupVideoInputOutput:(NSError **) error {
    self.currentDevice = [self _cameraWithPosition:_devicePosition];
    
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_currentDevice error:error];
    if (_videoInput) {
        if ([_captureSession canAddInput:_videoInput]) {
            [_captureSession addInput:_videoInput];
        } else {
            *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2200 userInfo:@{NSLocalizedDescriptionKey:@"add video input fail"}];
            return NO;
        }
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2201 userInfo:@{NSLocalizedDescriptionKey:@"video input is nil"}];
        return NO;
    }
    
    //stabilize the frame rate
    CMTime frameDuration = CMTimeMake(1, _frameRate);
    if ([_currentDevice lockForConfiguration:error]) {
        _currentDevice.activeVideoMaxFrameDuration = frameDuration;
        _currentDevice.activeVideoMinFrameDuration = frameDuration;
        [_currentDevice unlockForConfiguration];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2203 userInfo:@{NSLocalizedDescriptionKey:@"device lock fail(input)"}];
        
        return NO;
    }

……Other code
}

- (AVCaptureDevice *)_cameraWithPosition:(AVCaptureDevicePosition)position {
    if (@available(iOS 10.0, *)) {
        //AVCaptureDeviceTypeBuiltInWideAngleCamera is the default wide-angle camera, AVCaptureDeviceTypeBuiltInTelephotoCamera the telephoto camera, AVCaptureDeviceTypeBuiltInDualCamera the rear dual camera, and AVCaptureDeviceTypeBuiltInTrueDepthCamera the front TrueDepth (infrared) camera
        NSMutableArray *mulArr = [NSMutableArray arrayWithObjects:AVCaptureDeviceTypeBuiltInWideAngleCamera,AVCaptureDeviceTypeBuiltInTelephotoCamera,nil];
        if (@available(iOS 10.2, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInDualCamera];
        }
        if (@available(iOS 11.1, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInTrueDepthCamera];
        }
        AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:[mulArr copy] mediaType:AVMediaTypeVideo position:position];
        return discoverySession.devices.firstObject;
    } else {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in videoDevices) {
            if (device.position == position) {
                return device;
            }
        }
    }
    return nil;
}


Configuring the capture outputs

Depending on the features required, we add different outputs to the capture session: AVCaptureVideoDataOutput to capture raw video frames, AVCaptureAudioDataOutput to capture audio, and AVCaptureMetadataOutput to capture face data. Since the audio output is configured in much the same way as the video output, only the key video-output code is listed here (a sketch of the audio path follows the code). A few key design points:

1. Because of the camera sensor orientation, the output video stream comes out rotated 90°, so we fetch the videoConnection attached to the output and configure the rotation on it.

2. Video (and audio) frames are delivered as CMSampleBufferRef. A video frame may pass through several consumers (writing to file, forwarding to the upper layer), so each consumer retains the sample buffer before use, keeping the references independent; see _processVideoData.

3. To release temporaries promptly (processing a frame can require a fair amount of memory), the frame delivery to the delegate is wrapped in an autorelease pool, preventing memory spikes.

@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;

- (BOOL)_setupVideoInputOutput:(NSError **) error {
……Other code

self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    //drop frames that arrive late
    _videoOutput.alwaysDiscardsLateVideoFrames = YES;
    
    dispatch_queue_t videoQueue = dispatch_queue_create("com.caixindong.captureservice.video", DISPATCH_QUEUE_SERIAL);
    //set the sample buffer delegate for the data output
    [_videoOutput setSampleBufferDelegate:self queue:videoQueue];
    
    if ([_captureSession canAddOutput:_videoOutput]) {
        [_captureSession addOutput:_videoOutput];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2204 userInfo:@{NSLocalizedDescriptionKey:@"device lock fail(output)"}];
        return NO;
    }
    
    self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    //the recorded video comes out rotated 90° because of the camera sensor, so set the orientation of the output stream here
    if (_videoConnection.isVideoOrientationSupported) {
        _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return YES;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate && AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //callbacks can arrive on different threads
    if (connection == _videoConnection) {
        @synchronized(self) {
            [self _processVideoData:sampleBuffer];
        };
    } else if (connection == _audioConnection) {
        @synchronized(self) {
            [self _processAudioData:sampleBuffer];
        };
    }
}

#pragma mark - process Data
- (void)_processVideoData:(CMSampleBufferRef)sampleBuffer {
    //CFRetain ensures that each consumer (file writing, frame delivery) works with its own independent reference to the sampleBuffer
    if (_videoWriter && _videoWriter.isWriting) {
        CFRetain(sampleBuffer);
        dispatch_async(_writtingQueue, ^{
            [_videoWriter appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        });
    }
    
    CFRetain(sampleBuffer);
    //clean up temporaries promptly to avoid memory spikes
    dispatch_async(_outputQueue, ^{
        @autoreleasepool{
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputSampleBuffer:)]) {
                [self.delegate captureService:self outputSampleBuffer:sampleBuffer];
            }
        }
        CFRelease(sampleBuffer);
    });
}
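The audio path, which the listing above omits, presumably mirrors the video path. A minimal sketch under that assumption: the _setupAudioInputOutput name and the audioOutput property are inferred to match the framework's naming pattern, while _audioConnection is the ivar compared against in the sample buffer callback above.

//Assumed audio counterpart to _setupVideoInputOutput (sketch).
- (BOOL)_setupAudioInputOutput:(NSError **)error {
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
    if (audioInput && [_captureSession canAddInput:audioInput]) {
        [_captureSession addInput:audioInput];
    } else {
        return NO;
    }

    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    dispatch_queue_t audioQueue = dispatch_queue_create("com.caixindong.captureservice.audio", DISPATCH_QUEUE_SERIAL);
    //reuse the same delegate; audio frames arrive in the same callback as video frames
    [self.audioOutput setSampleBufferDelegate:self queue:audioQueue];
    if ([_captureSession canAddOutput:self.audioOutput]) {
        [_captureSession addOutput:self.audioOutput];
    } else {
        return NO;
    }
    //_audioConnection is what the sample buffer callback compares against
    self.audioConnection = [self.audioOutput connectionWithMediaType:AVMediaTypeAudio];
    return YES;
}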

Configuring the still image output

The still image output exists to implement photo capture; setOutputSettings lets us configure the format of the output image.

@property (nonatomic, strong) AVCaptureStillImageOutput *imageOutput;

- (BOOL)_setupImageOutput:(NSError **)error {
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSetting = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [_imageOutput setOutputSettings:outputSetting];
    if ([_captureSession canAddOutput:_imageOutput]) {
        [_captureSession addOutput:_imageOutput];
        return YES;
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.image" code:-2205 userInfo:@{NSLocalizedDescriptionKey:@"device lock fail(output)"}];
        return NO;
    }
}

//photo capture implementation
- (void)capturePhoto {
    AVCaptureConnection *connection = [_imageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    
    __weak typeof(self) weakSelf = self;
    [_imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef  _Nullable imageDataSampleBuffer, NSError * _Nullable error) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            if (strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureService:capturePhoto:)]) {
                [strongSelf.delegate captureService:strongSelf capturePhoto:image];
            }
        }
    }];
}

Configuring the face metadata output

The key is to configure an AVCaptureMetadataOutput and set its metadataObjectTypes to AVMetadataObjectTypeFace. The captured data contains every face in the current frame, from which we can extract each face's bounds, position, and rotation angles. One caveat: the raw face data is in the camera's coordinate space and must be converted to screen coordinates before the business layer can use it conveniently, as the face data output code below shows.

@property (nonatomic, strong) AVCaptureMetadataOutput *metadataOutput;

-(void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSMutableArray *transformedFaces = [NSMutableArray array];
    for (AVMetadataObject *face in metadataObjects) {
        @autoreleasepool{
            AVMetadataFaceObject *transformedFace = (AVMetadataFaceObject*)[self.previewLayer transformedMetadataObjectForMetadataObject:face];
            if (transformedFace) {
                [transformedFaces addObject:transformedFace];
            }
        };
    }
    @autoreleasepool{
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputFaceDetectData:)]) {
            [self.delegate captureService:self outputFaceDetectData:[transformedFaces copy]];
        }
    };
}

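Because the faces handed to the delegate have already been converted to preview-layer coordinates, a host app can use their bounds directly. A hypothetical consumer that draws a box around each face (the faceLayers array and the view are assumptions of this sketch):

//Hypothetical delegate implementation drawing a box around each detected face.
- (void)captureService:(XDCaptureService *)service outputFaceDetectData:(NSArray<AVMetadataFaceObject *> *)faces {
    dispatch_async(dispatch_get_main_queue(), ^{
        //self.faceLayers is an assumed NSMutableArray of CALayers owned by the host app
        [self.faceLayers makeObjectsPerformSelector:@selector(removeFromSuperlayer)];
        [self.faceLayers removeAllObjects];
        for (AVMetadataFaceObject *face in faces) {
            CALayer *box = [CALayer layer];
            box.frame = face.bounds; //already in preview-layer coordinates
            box.borderColor = [UIColor greenColor].CGColor;
            box.borderWidth = 2.0;
            [self.view.layer addSublayer:box];
            [self.faceLayers addObject:box];
        }
    });
}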

Configuring the preview source

There are two options: either the host provides the preview source by implementing the preview-source protocol method, or the framework creates its own AVCaptureVideoPreviewLayer and uses that as the preview source.

if (self.preViewSource && [self.preViewSource respondsToSelector:@selector(preViewLayerSource)]) {
        self.previewLayer = [self.preViewSource preViewLayerSource];
        [_previewLayer setSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    } else {
        self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
        //fill the whole screen
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:getPreviewLayer:)]) {
            [self.delegate captureService:self getPreviewLayer:_previewLayer];
        }
    }

Handling foreground/background state changes

iOS's audio/video foreground/background behavior is fairly complex, with many lifecycle transitions. To make sure the framework does the right thing in the right state, the reading and writing of frames are decoupled, and each module maintains its own notification handling. The upper layer never has to observe AVFoundation notifications or manage the stream state by hand. Notification setup for the read (capture) module:

//CaptureService and VideoWritter each maintain their own lifecycle: the capture state is decoupled from the write state, and all state transitions are managed inside the capture service, so the upper layer never handles stream state changes manually
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_captureSessionNotification:) name:nil object:self.captureSession];
    
    //Compatibility with iOS versions below 9: before iOS 9, if the app was backgrounded before the session start finished, AVCaptureSessionRuntimeErrorNotification fired on returning to the foreground and the session had to be restarted manually. From iOS 9 on, the system caches the pending session start and replays it automatically on re-entering the foreground, so no manual call is needed
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_enterForegroundNotification:) name:UIApplicationWillEnterForegroundNotification object:nil];

#pragma mark - CaptureSession Notification
- (void)_captureSessionNotification:(NSNotification *)notification {
    NSLog(@"_captureSessionNotification:%@",notification.name);
    if ([notification.name isEqualToString:AVCaptureSessionDidStartRunningNotification]) {
        if (!_firstStartRunning) {
            NSLog(@"session start running");
            _firstStartRunning = YES;
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStartService:)]) {
                [self.delegate captureServiceDidStartService:self];
            }
        } else {
            NSLog(@"session resunme running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionDidStopRunningNotification]) {
        if (!_isRunning) {
            NSLog(@"session stop running");
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStopService:)]) {
                [self.delegate captureServiceDidStopService:self];
            }
        } else {
            NSLog(@"interupte session stop running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionWasInterruptedNotification]) {
        NSLog(@"session was interupted, userInfo: %@",notification.userInfo);
    } else if ([notification.name isEqualToString:AVCaptureSessionInterruptionEndedNotification]) {
        NSLog(@"session interupted end");
    } else if ([notification.name isEqualToString:AVCaptureSessionRuntimeErrorNotification]) {
        NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
        if (error.code == AVErrorDeviceIsNotAvailableInBackground) {
            NSLog(@"session runtime error : AVErrorDeviceIsNotAvailableInBackground");
            _startSessionOnEnteringForeground = YES;
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    } else {
        NSLog(@"handel other notification : %@",notification.name);
    }
}

#pragma mark - UIApplicationWillEnterForegroundNotification
- (void)_enterForegroundNotification:(NSNotification *)notification {
    if (_startSessionOnEnteringForeground == YES) {
        NSLog(@"爲了適配低於iOS 9的版本,在iOS 9之前,當session start 還沒完成就退到後臺,回到前臺會捕獲AVCaptureSessionRuntimeErrorNotification,這時須要手動從新啓動session,iOS 9之後系統對此作了優化,系統退到後臺後會將session start緩存起來,回到前臺會自動調用緩存的session start,無需手動調用");
        _startSessionOnEnteringForeground = NO;
        [self startRunning];
    }
}

Notification setup for the write module:

//the write module registers its own notifications and only handles write-related state
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];

- (void)_assetWritterInterruptedNotification:(NSNotification *)notification {
    NSLog(@"assetWritterInterruptedNotification");
    [self cancleWriting];
}

啓動捕獲 & 關閉捕獲

Start asynchronously so the main thread is never blocked. Both start and stop run on the serial session queue, which rules out abnormal cases such as stopping a session that is only half started.

- (void)startRunning {
    dispatch_async(_sessionQueue, ^{
        NSError *error = nil;
        BOOL result =  [self _setupSession:&error];
        if (result) {
            _isRunning = YES;
            [_captureSession startRunning];
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    });
}

- (void)stopRunning {
    dispatch_async(_sessionQueue, ^{
        _isRunning = NO;
        NSError *error = nil;
        [self _clearVideoFile:&error];
        if (error) {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
        [_captureSession stopRunning];
    });
}


Switching cameras

Switching is more than swapping the device: the old capture input has to be removed and the new device input added. The videoConnection also changes when the camera is switched, so it has to be fetched again.

- (void)switchCamera {
    if (_openDepth) {
        return;
    }
    
    NSError *error;
    AVCaptureDevice *videoDevice = [self _inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    
    if (videoInput) {
        [_captureSession beginConfiguration];
        
        [_captureSession removeInput:self.videoInput];
        
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.videoInput = videoInput;
            //the videoConnection changes after a camera switch, so fetch it again
            self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
            if (_videoConnection.isVideoOrientationSupported) {
                _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
            }
        } else {
            [self.captureSession addInput:self.videoInput];
        }
        
        [self.captureSession commitConfiguration];
    }
    
    _devicePosition = _devicePosition == AVCaptureDevicePositionFront?AVCaptureDevicePositionBack:AVCaptureDevicePositionFront;
}

- (AVCaptureDevice *)_inactiveCamera {
    AVCaptureDevice *device = nil;
    if (_devicePosition == AVCaptureDevicePositionBack) {
        device = [self _cameraWithPosition:AVCaptureDevicePositionFront];
    } else {
        device = [self _cameraWithPosition:AVCaptureDevicePositionBack];
    }
    return device;
}

Recording

videoSetting configures the encoding of the recorded video. The framework defaults to H.264, an efficient and very common video codec (a follow-up article will cover it in more depth; see AVVideoCodecType for the other options). XDCaptureService's startRecording method initializes the write module, XDVideoWritter, which sets up the corresponding encoder from videoSetting.
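As an illustration, a plausible videoSetting for portrait 720p H.264 might look like this; the keys are standard AVFoundation video settings keys, while the bitrate and keyframe interval values are merely examples, not framework defaults:

//Example H.264 video settings (values are illustrative).
NSDictionary *videoSetting = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @720,
    AVVideoHeightKey: @1280,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @(2000 * 1024), //~2 Mbps
        AVVideoMaxKeyFrameIntervalKey: @30, //one keyframe per second at 30 fps
    },
};
service.videoSetting = videoSetting;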

- (void)startRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            NSString *videoFilePath = [_videoDir stringByAppendingPathComponent:[NSString stringWithFormat:@"Record-%llu.mp4",mach_absolute_time()]];
            
            _recordURL = [[NSURL alloc] initFileURLWithPath:videoFilePath];
            
            if (_recordURL) {
                _videoWriter = [[XDVideoWritter alloc] initWithURL:_recordURL VideoSettings:_videoSetting audioSetting:_audioSetting];
                _videoWriter.delegate = self;
                [_videoWriter startWriting];
                if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceRecorderDidStart:)]) {
                    [self.delegate captureServiceRecorderDidStart:self];
                }
            } else {
                NSLog(@"No record URL");
            }
        }
    });
}

//XDVideoWritter.m
- (void)startWriting {
    if (_assetWriter) {
        _assetWriter = nil;
    }
    NSError *error = nil;
    
    NSString *fileType = AVFileTypeMPEG4;
    _assetWriter = [[AVAssetWriter alloc] initWithURL:_outputURL fileType:fileType error:&error];
    
    if (!_assetWriter || error) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]){
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    
    if (_videoSetting) {
        _videoInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:_videoSetting];
        
        _videoInput.expectsMediaDataInRealTime = YES;
        
        if ([_assetWriter canAddInput:_videoInput]) {
            [_assetWriter addInput:_videoInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2210 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add video input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no video setting");
    }
    
    if (_audioSetting) {
        _audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:_audioSetting];
        
        _audioInput.expectsMediaDataInRealTime = YES;
        
        if ([_assetWriter canAddInput:_audioInput]) {
            [_assetWriter addInput:_audioInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2211 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add audio input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no audio setting");
    }
    
    if ([_assetWriter startWriting]) {
        self.isWriting = YES;
    } else {
        NSError *error = [NSError errorWithDomain:@"com.xindong.captureservice.writter" code:-2212 userInfo:@{NSLocalizedDescriptionKey: [NSString stringWithFormat: @"VideoWritter startWriting fail error: %@",_assetWriter.error.localizedDescription]}];
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];
}


Recording works like this: as frames are delivered, XDVideoWritter's appendSampleBuffer: writes them into a temporary file; when stopRecording is called, XDVideoWritter's stopWriting stops appending and archives the temporary file as an MP4.

- (void)stopRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            if (_videoWriter) {
                [_videoWriter stopWriting];
            }
        }
    });
}

//XDVideoWritter.m
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CMFormatDescriptionRef formatDesc = CMSampleBufferGetFormatDescription(sampleBuffer);
    
    CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDesc);
    
    if (mediaType == kCMMediaType_Video) {
        CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        
        if (self.firstSample) {
            [_assetWriter startSessionAtSourceTime:timestamp];
            self.firstSample = NO;
        }
        
        if (_videoInput.readyForMoreMediaData) {
            if (![_videoInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2213 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat: @"VideoWritter appending video sample buffer fail error:%@",_assetWriter.error.localizedDescription]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    } else if (!self.firstSample && mediaType == kCMMediaType_Audio) {
        if (_audioInput.readyForMoreMediaData) {
            if (![_audioInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2214 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"VideoWritter appending audio sample buffer fail error: %@",_assetWriter.error]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    }
}

- (void)stopWriting {
    if (_assetWriter.status == AVAssetWriterStatusWriting) {
        self.isWriting = NO;
        [_assetWriter finishWritingWithCompletionHandler:^{
            if (_assetWriter.status == AVAssetWriterStatusCompleted) {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:nil];
                }
            } else {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:_assetWriter.error];
                }
            }
        }];
    } else {
        NSLog(@"warning : stop writing with unsuitable state : %ld",_assetWriter.status);
    }
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureSessionWasInterruptedNotification object:nil];
}

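Since recordURL points at a temporary location (the API header above recommends relocating the file after each recording), the host app can move the finished archive once recording stops. A hedged sketch, assuming a delegate implementation on the host side:

//Hypothetical handling of a finished recording: move it out of the temporary directory.
- (void)captureServiceRecorderDidStop:(XDCaptureService *)service {
    NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSURL *destURL = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:service.recordURL.lastPathComponent]];
    NSError *moveError = nil;
    if (![[NSFileManager defaultManager] moveItemAtURL:service.recordURL toURL:destURL error:&moveError]) {
        NSLog(@"failed to move recording: %@", moveError);
    }
}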

Thanks

Thanks to everyone who has filed issues against XDCaptureService or sent usage feedback; open source gets better with everyone's help!
