iOS Video Recording


I recently ran into a requirement: record a short video the way WeChat does, play it back on top of the recording layer, then post it to Moments, where it plays silently and scrolling stays smooth. When I first got the requirement I was excited to finally dig into AVFoundation properly. The ride had plenty of ups and downs and wore me out, but that's how we grow as programmers. Digression aside: today we'll look at how to record video with AVCaptureSession + AVCaptureMovieFileOutput, then compress it and convert it to MP4 with AVAssetExportSession.

First, let's look at which AVFoundation classes are involved in video capture and what each one does.

AVCaptureSession

AVCaptureSession: the media (audio/video) capture session. It routes the captured audio and video data to its output objects; a single AVCaptureSession can own multiple inputs and outputs.

AVCaptureDevice: an input device, such as the microphone or a camera. Physical device properties (camera focus, white balance, and so on) are configured through this object.

AVCaptureDeviceInput: the input-data manager. You create an AVCaptureDeviceInput from an AVCaptureDevice and add it to the AVCaptureSession, which then manages it.

AVCaptureVideoPreviewLayer: the camera preview layer, a CALayer subclass that shows the photo or video feed in real time. Creating one requires the corresponding AVCaptureSession.

AVCaptureOutput: the output-data manager, which receives the various kinds of output data. You normally use one of its subclasses: AVCaptureAudioDataOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, or AVCaptureFileOutput. The output object is added to the AVCaptureSession, which manages it.
Note: the data outputs above deliver their data in memory, while AVCaptureFileOutput writes data out as a file. Like AVCaptureOutput itself, AVCaptureFileOutput is not instantiated directly; you use one of its subclasses, AVCaptureAudioFileOutput or AVCaptureMovieFileOutput. Once an input or output has been added to the AVCaptureSession, the session builds connections (AVCaptureConnection) between all compatible inputs and outputs.

The steps to set up video capture are as follows:
1. Create the AVCaptureSession object.

// Create the capture session (AVCaptureSession) object
_captureSession = [[AVCaptureSession alloc] init];
if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    // The sessionPreset property controls the capture resolution
    [_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
}

2. Use AVCaptureDevice's class methods to obtain the devices you need: the camera for photos and video, the microphone for audio recording.

// Get the camera device used to create the AVCaptureDeviceInput.
// There is a front and a back camera, so we use a helper that looks one up by position.
AVCaptureDevice *videoCaptureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
if (!videoCaptureDevice) {
    NSLog(@"---- failed to get the back camera ----");
    return;
}

// Get an audio input device.
// We can simply take the first device in the array.
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];

3. Initialize an AVCaptureDeviceInput object from each AVCaptureDevice.

// Video input object
// Initialized from the input device; the session obtains the captured data through it
_videoCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoCaptureDevice error:&error];
if (error) {
    NSLog(@"---- error creating the video input object ---- %@", error);
    return;
}

// Audio input object
// Initialized from the input device; the session obtains the captured data through it
_audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
if (error) {
    NSLog(@"---- error creating the audio input object ---- %@", error);
    return;
}

4. Initialize the output object: AVCaptureStillImageOutput for still photos, AVCaptureMovieFileOutput for video recording.

// Movie file output object
// The recorded data is written out through this object
_caputureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

5. Add the AVCaptureDeviceInput inputs and the AVCaptureOutput output to the AVCaptureSession.

// Add the video input to the session (AVCaptureSession)
if ([_captureSession canAddInput:_videoCaptureDeviceInput]) {
    [_captureSession addInput:_videoCaptureDeviceInput];
}

// Add the audio input to the session (AVCaptureSession)
if ([_captureSession canAddInput:_audioCaptureDeviceInput]) {
    [_captureSession addInput:_audioCaptureDeviceInput];
}

// Add the movie file output to the session; without this step nothing gets recorded
if ([_captureSession canAddOutput:_caputureMovieFileOutput]) {
    [_captureSession addOutput:_caputureMovieFileOutput];
    AVCaptureConnection *captureConnection = [_caputureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Enable video stabilization on the connection; here we let the system choose the mode
    if ([captureConnection isVideoStabilizationSupported]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
}

6. Create the AVCaptureVideoPreviewLayer with the session, add it to a container view's layer, and call the session's startRunning method to begin capturing.

// Create the preview layer from the session (AVCaptureSession)
_captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];

// The layer of the view the preview is displayed in
CALayer *layer = self.viewContainer.layer;
layer.masksToBounds = true;

_captureVideoPreviewLayer.frame = layer.bounds;
_captureVideoPreviewLayer.masksToBounds = true;
_captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
[layer addSublayer:_captureVideoPreviewLayer];

// Start the session: it wires the inputs to the outputs and renders into the preview layer
[_captureSession startRunning];

7. Write the captured audio and video to the output file.

Create a record button; tapping it starts recording and writes the video into the temp directory.

- (IBAction)takeMovie:(id)sender {
    [(UIButton *)sender setSelected:![(UIButton *)sender isSelected]];
    if ([(UIButton *)sender isSelected]) {
        AVCaptureConnection *captureConnection = [self.caputureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        // Turn on cinematic video stabilization
        AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
        if ([self.captureDeviceInput.device.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
            [captureConnection setPreferredVideoStabilizationMode:stabilizationMode];
        }

        // If multitasking is supported, start a background task so recording can finish
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        // Keep the video orientation in sync with the preview layer. This matters:
        // without it, the recorded video can come out rotated onto its side.
        captureConnection.videoOrientation = [self.captureVideoPreviewLayer connection].videoOrientation;

        // Output file path for the recording; here we use the temp directory
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:MOVIEPATH];

        // Convert the path to a URL with fileURLWithPath:; building the URL via
        // NSBundle methods can produce a URL the recorder fails to read
        NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];

        // Start writing the recording buffer to the URL, writing as it records
        [self.caputureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    }
    else {
        // Stop recording
        [self.caputureMovieFileOutput stopRecording];
        [self.captureSession stopRunning];
        [self completeHandle];
    }
}

Both the start and the end of recording have delegate callbacks; AVCaptureFileOutputRecordingDelegate is where we can hook in our own handling.

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"---- recording started ----");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"---- recording finished ----");
}
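In practice, the finish callback is also where you would check for errors before using the file. A possible extension of that callback (a sketch; the error cases mentioned are illustrative):

```objectivec
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // Recording can finish with a non-nil error (disk full, interruption, ...)
    if (error) {
        NSLog(@"---- recording failed: %@ ----", error);
        return;
    }
    NSLog(@"---- recording finished at %@ ----", outputFileURL);
    // From here it is safe to hand the file to playback or compression
}
```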

That completes recording. But does that mean we can immediately upload the clip to a server and share it with our friends?
We can measure how large the recorded file is (in MB) with the following method:

- (CGFloat)getfileSize:(NSString *)path
{
    NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:path error:nil];
    NSLog(@"file size: %f", (unsigned long long)[outputFileAttributes fileSize] / 1024.00 / 1024.00);
    return (CGFloat)[outputFileAttributes fileSize] / 1024.00 / 1024.00;
}

In my own test, a 10-second clip came out at about 4.1 MB, and that was at 640x480. Far too big, right?
If the recording has to be uploaded to a server afterwards, we clearly should not upload it as-is; the video needs to be compressed first. For that, we turn to the AVAssetExportSession class.

// 這裏咱們建立一個按鈕,當點擊這個按鈕,咱們就會調用壓縮視頻的方法,而後再去從新計算大小,這樣就會跟未被壓縮前的大小有個明顯的對比了


// Compress the video
- (IBAction)compressVideo:(id)sender
{
    // Load the recording from the temp directory we recorded into above
    NSString *sourcePath = [NSTemporaryDirectory() stringByAppendingPathComponent:MOVIEPATH];
    NSURL *sourceUrl = [NSURL fileURLWithPath:sourcePath];

    // Open the file as an asset through its URL
    AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:sourceUrl options:nil];
    // Ask AVAssetExportSession which export presets this asset supports
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];

    // Compress the video
    if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality]) { // is the low-quality preset available?
        // Create an export session that re-encodes the asset with the chosen preset
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetLowQuality];
        // Build the output path for the exported file
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"yyyy-MM-dd-HH-mm-ss"];
        NSDate *date = [[NSDate alloc] init];
        NSString *outPutPath = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, true) lastObject] stringByAppendingPathComponent:[NSString stringWithFormat:@"output-%@.mp4", [formatter stringFromDate:date]]];
        exportSession.outputURL = [NSURL fileURLWithPath:outPutPath];

        // Optimize the file for network use
        exportSession.shouldOptimizeForNetworkUse = true;

        // Convert to MP4
        exportSession.outputFileType = AVFileTypeMPEG4;

        // Start the export; the completion block runs when it finishes
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            if ([exportSession status] == AVAssetExportSessionStatusCompleted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    // Update the label with the compressed file size
                    self.videoSize.text = [NSString stringWithFormat:@"%f MB", [self getfileSize:outPutPath]];
                });
            } else if ([exportSession status] == AVAssetExportSessionStatusFailed) {
                NSLog(@"export failed: %@", exportSession.error);
            }
        }];
    }
}

After compression, the 10-second, roughly 4 MB clip comes in at under 1 MB.

Below are a few extensions.

Auto flash

- (IBAction)flashAutoClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeAuto];
    [self setFlashModeButtonStatus];
}

Flash on

- (IBAction)flashOnClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOn];
    [self setFlashModeButtonStatus];
}

Flash off

- (IBAction)flashOffClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOff];
    [self setFlashModeButtonStatus];
}

Notifications

/**
 *  Add notifications for an input device
 */
- (void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice {
    // Note: subject-area-change notifications only fire after monitoring is enabled on the device
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // The subject area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
- (void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
/**
 *  Remove all notifications
 */
- (void)removeNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

- (void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // The session hit a runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

/**
 *  A device was connected
 *
 *  @param notification the notification object
 */
- (void)deviceConnected:(NSNotification *)notification {
    NSLog(@"device connected...");
}
/**
 *  A device was disconnected
 *
 *  @param notification the notification object
 */
- (void)deviceDisconnected:(NSNotification *)notification {
    NSLog(@"device disconnected.");
}
/**
 *  The subject area changed
 *
 *  @param notification the notification object
 */
- (void)areaChange:(NSNotification *)notification {
    NSLog(@"subject area changed...");
}

/**
 *  The session hit a runtime error
 *
 *  @param notification the notification object
 */
- (void)sessionRuntimeError:(NSNotification *)notification {
    NSLog(@"session runtime error.");
}
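The `deviceConnected:` and `deviceDisconnected:` handlers above need matching observer registrations, which this article does not show. A sketch of what that registration might look like (these AVFoundation notifications are system-wide, so no `object:` filter is needed):

```objectivec
NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
[center addObserver:self
           selector:@selector(deviceConnected:)
               name:AVCaptureDeviceWasConnectedNotification
             object:nil];
[center addObserver:self
           selector:@selector(deviceDisconnected:)
               name:AVCaptureDeviceWasDisconnectedNotification
             object:nil];
```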

Private methods

/**
 *  Get the camera at the given position
 *
 *  @param position camera position
 *
 *  @return the camera device, or nil if none matches
 */
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position] == position) {
            return camera;
        }
    }
    return nil;
}
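Note that `devicesWithMediaType:` was deprecated in iOS 10. On newer systems the same position lookup can be written with `AVCaptureDeviceDiscoverySession` (a sketch, assuming iOS 10+ and the built-in wide-angle camera):

```objectivec
// iOS 10+ replacement for the deprecated devicesWithMediaType: lookup
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    // Returns nil when no camera matches the requested position
    return discovery.devices.firstObject;
}
```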

/**
 *  Shared helper for changing device properties
 *
 *  @param propertyChange block that performs the property change
 */
- (void)changeDeviceProperty:(PropertyChangeBlock)propertyChange {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Always call lockForConfiguration: before changing device properties,
    // and unlockForConfiguration when done
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"error while setting device property: %@", error.localizedDescription);
    }
}

/**
 *  Set the flash mode
 *
 *  @param flashMode flash mode
 */
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}
/**
 *  Set the focus mode
 *
 *  @param focusMode focus mode
 */
- (void)setFocusMode:(AVCaptureFocusMode)focusMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}
/**
 *  Set the exposure mode
 *
 *  @param exposureMode exposure mode
 */
- (void)setExposureMode:(AVCaptureExposureMode)exposureMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}

/**
 *  Set the focus point
 *
 *  @param point focus point
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

/**
 *  Add a tap gesture; tapping focuses at the tapped point
 */
- (void)addGenstureRecognizer {
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}
- (void)tapScreen:(UITapGestureRecognizer *)tapGesture {
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    // Convert UI coordinates to camera coordinates
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 *  Update the flash buttons to match the current flash state
 */
- (void)setFlashModeButtonStatus {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    AVCaptureFlashMode flashMode = captureDevice.flashMode;
    if ([captureDevice isFlashAvailable]) {
        self.flashAutoButton.hidden = NO;
        self.flashOnButton.hidden = NO;
        self.flashOffButton.hidden = NO;
        self.flashAutoButton.enabled = YES;
        self.flashOnButton.enabled = YES;
        self.flashOffButton.enabled = YES;
        switch (flashMode) {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled = NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled = NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled = NO;
                break;
            default:
                break;
        }
    } else {
        self.flashAutoButton.hidden = YES;
        self.flashOnButton.hidden = YES;
        self.flashOffButton.hidden = YES;
    }
}

/**
 *  Position and animate the focus cursor
 *
 *  @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point {
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}