In live-streaming apps, video capture is usually done with the AVFoundation framework, because it lets us customize the capture parameters, perform the whole range of camera operations (switching cameras, taking photos, toggling the torch, and so on), and, most importantly, access the raw video data for encoding. This article walks through setting up a capture session, monitoring and changing the resolution and frame rate, pinch-to-zoom, switching cameras, toggling the torch, taking photos, and adjusting focus, exposure, and white balance.
AVCaptureDevice: represents a hardware device; through this class we can get hold of the phone's camera, microphone, and other sensors. When we need to change a device property (for example the flash mode or the focus mode), we must call lockForConfiguration to lock the device before changing it, and call unlockForConfiguration to unlock it afterwards.
AVCaptureDeviceInput: the input-management object. You create an AVCaptureDeviceInput from an AVCaptureDevice and add it to an AVCaptureSession, which then manages it. It represents an input device and configures that device's ports; typical input devices are the microphone and the camera.
AVCaptureOutput: represents the output data. The output can be a still image (AVCaptureStillImageOutput) or a movie file (AVCaptureMovieFileOutput); the AVCaptureVideoDataOutput used in this article delivers raw video frames to a delegate.
AVCaptureSession: the capture session, responsible for delivering the captured audio and video data to the outputs. One AVCaptureSession can have multiple inputs and outputs; it is the bridge between AVCaptureInput and AVCaptureOutput and coordinates the flow of data from input to output. Its startRunning and stopRunning methods start and stop the session.
Each session is a running conversation with the capture hardware: if you need to change its configuration while the app is running (e.g. to switch cameras), you must first begin the configuration and then commit it once you are done (beginConfiguration/commitConfiguration).
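For example, reconfiguring a running session looks like this (a minimal sketch; session is assumed to be a running AVCaptureSession):

[session beginConfiguration];
// reconfigure inputs, outputs, or the preset here, e.g.:
session.sessionPreset = AVCaptureSessionPreset1280x720;
[session commitConfiguration]; // the changes are applied as one atomic batch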
AVCaptureConnection: represents a connection between one or more AVCaptureInputPorts and an AVCaptureOutput or AVCaptureVideoPreviewLayer in an AVCaptureSession. In other words, it is the link between an input port and an output, or between the session and its preview layer.
AVCaptureVideoPreviewLayer: the preview layer. How do our photos and video end up displayed on the phone? By adding this object as a sublayer of a UIView's layer.
Below is the video-capture code; the frame rate is 30 fps and the resolution is 1920x1080.
#import "MiVideoCollectVC.h"
#import <AVFoundation/AVFoundation.h>
@interface MiVideoCollectVC ()<AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic,strong) AVCaptureVideoDataOutput *video_output;
@property (nonatomic,strong) AVCaptureSession *m_session;
@property (weak, nonatomic) IBOutlet UIView *m_displayView;
@end
@implementation MiVideoCollectVC
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view.
[self startCaptureSession];
}
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self startPreview];
}

- (IBAction)onpressedBtnDismiss:(id)sender {
    [self dismissViewControllerAnimated:YES completion:^{
        [self stopPreview];
    }];
}
- (void)startCaptureSession
{
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    } else {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error || !input) {
        NSLog(@"get input device error...");
        return;
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    _video_output = [[AVCaptureVideoDataOutput alloc] init];
    if ([session canAddOutput:_video_output]) {
        [session addOutput:_video_output];
    }

    // Specify the pixel format
    _video_output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                               forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _video_output.alwaysDiscardsLateVideoFrames = NO;
    dispatch_queue_t video_queue = dispatch_queue_create("MIVideoQueue", NULL);
    [_video_output setSampleBufferDelegate:self queue:video_queue];

    // Fix the frame rate at 30 fps, but only if the active format supports it
    CMTime frameDuration = CMTimeMake(1, 30);
    BOOL frameRateSupported = NO;
    for (AVFrameRateRange *range in [device.activeFormat videoSupportedFrameRateRanges]) {
        if (CMTIME_COMPARE_INLINE(frameDuration, >=, range.minFrameDuration) &&
            CMTIME_COMPARE_INLINE(frameDuration, <=, range.maxFrameDuration)) {
            frameRateSupported = YES;
        }
    }
    if (frameRateSupported && [device lockForConfiguration:&error]) {
        [device setActiveVideoMaxFrameDuration:frameDuration];
        [device setActiveVideoMinFrameDuration:frameDuration];
        [device unlockForConfiguration];
    }

    [self adjustVideoStabilization];
    _m_session = session;

    // Attach a preview layer so the captured frames are visible on screen
    CALayer *previewViewLayer = [self.m_displayView layer];
    previewViewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_m_session];
    [newPreviewLayer setFrame:[UIApplication sharedApplication].keyWindow.bounds];
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [previewViewLayer insertSublayer:newPreviewLayer atIndex:0];
}
- (void)adjustVideoStabilization
{
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in _video_output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"now videoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection does not support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device does not support video stabilization");
            }
        }
    }
}
- (void)startPreview
{
    if (![_m_session isRunning]) {
        [_m_session startRunning];
    }
}

- (void)stopPreview
{
    if ([_m_session isRunning]) {
        [_m_session stopRunning];
    }
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
// Called for every captured frame
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"%s", __func__);
}

// Called when a frame is dropped
- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"MediaIOS: dropped a frame...");
}
@end
The concrete steps of video capture can be summarized as follows:
1. Create an AVCaptureSession and choose a sessionPreset (the resolution).
2. Get the video AVCaptureDevice, wrap it in an AVCaptureDeviceInput, and add the input to the session.
3. Create an AVCaptureVideoDataOutput, specify its pixel format, set its sample-buffer delegate on a serial queue, and add it to the session.
4. Lock the device and set activeVideoMinFrameDuration/activeVideoMaxFrameDuration to fix the frame rate.
5. Create an AVCaptureVideoPreviewLayer from the session and insert it into the display view's layer.
6. Call startRunning; captured frames then arrive in captureOutput:didOutputSampleBuffer:fromConnection:.
Before covering how to change the resolution and frame rate, let's first look at how to monitor these capture parameters, since only by monitoring them can we tell whether our settings actually took effect.
Monitoring the resolution: we can read it directly from the AVCaptureSession's sessionPreset. It is a string, so simply print it after setting it.
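For instance, a quick sanity check right after configuration (m_session is the session created in the code above):

NSLog(@"current sessionPreset: %@", self.m_session.sessionPreset); // e.g. AVCaptureSessionPreset1920x1080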
Monitoring the frame rate: the frame rate is the number of frames captured per second. We can start a timer that fires once per second and prints the current capture frame rate. Below is the code that counts the frames captured within one second:
// Count how many video frames are captured per second
static int captureVideoFPS;
+ (void)calculatorCaptureFPS
{
    static int count = 0;
    static float lastTime = 0;
    CMClockRef hostClockRef = CMClockGetHostTimeClock();
    CMTime hostTime = CMClockGetTime(hostClockRef);
    float nowTime = CMTimeGetSeconds(hostTime);
    if (nowTime - lastTime >= 1) {
        captureVideoFPS = count;
        lastTime = nowTime;
        count = 0;
    } else {
        count++;
    }
}
// Read the most recent frame-rate measurement
+ (int)getCaptureVideoFPS
{
    return captureVideoFPS;
}
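These helpers only measure anything if calculatorCaptureFPS is called once per captured frame. A sketch of the wiring, assuming the two methods above live in a hypothetical MiVideoFPSTool class:

// In the AVCaptureVideoDataOutputSampleBufferDelegate callback:
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    [MiVideoFPSTool calculatorCaptureFPS]; // count this frame
}

// And a 1 s timer somewhere in setup that logs the measurement:
[NSTimer scheduledTimerWithTimeInterval:1.0 repeats:YES block:^(NSTimer *timer) {
    NSLog(@"MediaIOS, capture FPS: %d", [MiVideoFPSTool getCaptureVideoFPS]);
}];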
Changing the resolution: reset the sessionPreset between beginConfiguration and commitConfiguration, falling back to a safe preset when the requested one is unsupported:

/**
 *  Reset the capture resolution
 *
 *  @param m_session  AVCaptureSession instance
 *  @param resolution Target vertical resolution: 1080, 720, 480, or 360
 */
+ (void)resetSessionPreset:(AVCaptureSession *)m_session resolution:(int)resolution
{
    [m_session beginConfiguration];
    switch (resolution) {
        case 1080:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1920x1080] ? AVCaptureSessionPreset1920x1080 : AVCaptureSessionPresetHigh;
            break;
        case 720:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1280x720] ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPresetMedium;
            break;
        case 480:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset640x480] ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetMedium;
            break;
        case 360:
            m_session.sessionPreset = AVCaptureSessionPresetMedium;
            break;
        default:
            break;
    }
    [m_session commitConfiguration];
}
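A possible call site, assuming the helper lives in a hypothetical MiVideoTool class; printing sessionPreset afterwards verifies the change, as described above:

[MiVideoTool resetSessionPreset:self.m_session resolution:720]; // drop to 720p while running
NSLog(@"preset is now %@", self.m_session.sessionPreset);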
Changing the frame rate: lock the device, set the min/max frame duration, and unlock. An unsupported frame duration throws an exception, which is why the call is wrapped in @try (the supported-range check shown in the capture code above is the cleaner alternative):

+ (void)settingFrameRate:(int)frameRate
{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [captureDevice lockForConfiguration:NULL];
    @try {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    } @catch (NSException *exception) {
        NSLog(@"MediaIOS, device does not support the requested frame rate, error: %@", exception.description);
    } @finally {
    }
    [captureDevice unlockForConfiguration];
}
With a two-finger pinch gesture, you can zoom the previewed video in and out:
#define MiMaxZoomFactor 5.0f
#define MiPrinchVelocityDividerFactor 20.0f

+ (void)zoomCapture:(UIPinchGestureRecognizer *)recognizer
{
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            // Derive the zoom step from the pinch velocity, then clamp to the valid range
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(recognizer.velocity, MiPrinchVelocityDividerFactor);
            videoDevice.videoZoomFactor = desiredZoomFactor <= MiMaxZoomFactor ? MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor)) : MiMaxZoomFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
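The gesture itself still has to be attached to the preview view. A sketch, assuming zoomCapture: is the class method above on a hypothetical MiVideoTool class (the class object receives the class-method selector):

UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:[MiVideoTool class] action:@selector(zoomCapture:)];
[self.m_displayView addGestureRecognizer:pinch];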
Alongside capture itself, you will often also need to switch between the front and back cameras, toggle the torch, and take photos.
After switching cameras here, I default the resolution to 720p, because some devices' front cameras do not support 1080p. In a real project this should be whatever preset you configured earlier, with a fallback strategy for front cameras that cannot support it.
// Switch between the front and back cameras
- (void)switchCamera
{
    [_m_session beginConfiguration];
    if ([[_video_input device] position] == AVCaptureDevicePositionBack) {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo]) {
                if ([device position] == AVCaptureDevicePositionFront) {
                    [self rePreviewWithCameraType:MiCameraType_Front device:device];
                    break;
                }
            }
        }
    } else {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo]) {
                if ([device position] == AVCaptureDevicePositionBack) {
                    [self rePreviewWithCameraType:MiCameraType_Back device:device];
                    break;
                }
            }
        }
    }
    [_m_session commitConfiguration];
}
- (void)rePreviewWithCameraType:(MiCameraType)cameraType device:(AVCaptureDevice *)device
{
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) return;

    [_m_session removeInput:_video_input];
    // Temporarily drop to a low preset so the new camera's input can always be added
    _m_session.sessionPreset = AVCaptureSessionPresetLow;
    if ([_m_session canAddInput:input]) {
        [_m_session addInput:input];
    } else {
        return;
    }
    _video_input = input;
    _m_cameraType = cameraType;

    NSString *preset = AVCaptureSessionPreset1280x720;
    if ([device supportsAVCaptureSessionPreset:preset] && [_m_session canSetSessionPreset:preset]) {
        _m_session.sessionPreset = preset;
    } else if ([_m_session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
        // Fall back when the new camera cannot do 720p
        _m_session.sessionPreset = AVCaptureSessionPresetMedium;
    }
}
// Toggle the torch on and off
- (void)switchTorch
{
    [_m_session beginConfiguration];
    [[_video_input device] lockForConfiguration:NULL];
    AVCaptureTorchMode newMode = [_video_input device].torchMode == AVCaptureTorchModeOn ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    if ([[_video_input device] isTorchModeSupported:newMode]) {
        self.m_torchMode = newMode;
        [_video_input device].torchMode = self.m_torchMode;
    }
    [[_video_input device] unlockForConfiguration];
    [_m_session commitConfiguration];
}
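Not every camera has a torch (front cameras usually don't), so a defensive check before toggling is worthwhile (a sketch):

if ([[_video_input device] hasTorch]) {
    [self switchTorch];
} else {
    NSLog(@"MediaIOS, torch not available on this camera");
}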
Taking photos: the approach is to convert the CMSampleBufferRef delivered by the data output into a UIImage, and then save that UIImage to the photo album. Note: the following code only saves a correctly colored photo when the pixel format is set to RGB.
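For the conversion below to produce correct colors, the data output must deliver BGRA frames rather than the biplanar YUV format configured earlier; a minimal sketch (kCVPixelFormatType_32BGRA matches the little-endian, alpha-first bitmap context used below):

_video_output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };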
- (UIImage *)convertSameBufferToUIImage:(CMSampleBufferRef)sampleBuffer
{
    // Get the Core Video image buffer that backs the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the width and height of the pixel buffer
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context from the sample buffer's pixel data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Release the context and the color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Create a UIImage from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return (image);
}
// Saving requires #import <AssetsLibrary/AssetsLibrary.h> (ALAssetsLibrary is deprecated; the Photos framework is the modern replacement)
+ (void)saveImageToSysphotos:(UIImage *)image
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"MediaIos, save photo to photos error, error info: %@", error.description);
        } else {
            NSLog(@"MediaIos, save photo success...");
        }
    }];
}
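A sketch of how the two pieces fit together inside the sample-buffer delegate, assuming a hypothetical shouldTakePhoto flag that the shutter button sets:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (self.shouldTakePhoto) {
        self.shouldTakePhoto = NO; // take exactly one frame as the photo
        UIImage *photo = [self convertSameBufferToUIImage:sampleBuffer];
        [[self class] saveImageToSysphotos:photo];
    }
}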
// Tap to trigger auto-focus
- (void)mifocus:(UITapGestureRecognizer *)sender
{
    CGPoint point = [sender locationInView:self.m_displayView];
    [self miAutoFocusWithPoint:point];
    NSLog(@"MediaIos, auto focus requested...");
}

- (void)miAutoFocusWithPoint:(CGPoint)touchPoint
{
    // NOTE: focusPointOfInterest and exposurePointOfInterest expect normalized
    // (0,0)-(1,1) device coordinates; convert view coordinates with
    // AVCaptureVideoPreviewLayer's -captureDevicePointOfInterestForPoint: first.
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice isFocusPointOfInterestSupported] && [captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([captureDevice lockForConfiguration:&error]) {
            // Set the exposure point
            [captureDevice setExposurePointOfInterest:touchPoint];
            [captureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            // Set the focus point
            [captureDevice setFocusPointOfInterest:touchPoint];
            [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [captureDevice unlockForConfiguration];
        }
    }
}
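Wiring the tap gesture to the preview view (a sketch, assumed to run in viewDidLoad):

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(mifocus:)];
[self.m_displayView addGestureRecognizer:tap];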
// Exposure adjustment
- (void)changeExposure:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self michangeExposure:slider.value];
}

- (void)michangeExposure:(CGFloat)value
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:value completionHandler:nil];
        [device unlockForConfiguration];
    }
}
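setExposureTargetBias:completionHandler: expects a value inside the device's supported bias range, so the slider should be bounded accordingly (a sketch; exposureSlider is a hypothetical outlet):

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.exposureSlider.minimumValue = device.minExposureTargetBias;
self.exposureSlider.maximumValue = device.maxExposureTargetBias;
self.exposureSlider.value = 0.0f; // a bias of 0 EV means no adjustment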
// Clamp each white-balance gain to the valid range [minValue, maxValue]
- (AVCaptureWhiteBalanceGains)recalcGains:(AVCaptureWhiteBalanceGains)gains
                                 minValue:(CGFloat)minValue
                                 maxValue:(CGFloat)maxValue
{
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain = MAX(MIN(tmpGains.blueGain, maxValue), minValue);
    tmpGains.redGain = MAX(MIN(tmpGains.redGain, maxValue), minValue);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxValue), minValue);
    return tmpGains;
}
- (void)setWhiteBlanceUseTemperature:(CGFloat)temperature
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        // Keep the current tint; change only the temperature
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint = currentTint,
        };
        // Convert temperature/tint to device gains and clamp them to the supported range
        AVCaptureWhiteBalanceGains gains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        gains = [self recalcGains:gains minValue:1 maxValue:maxWhiteBalanceGain];
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
        [device unlockForConfiguration];
    }
}
// White balance adjustment
- (void)whiteBlanceChange:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self setWhiteBlanceUseTemperature:slider.value];
}
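Driving it from a slider (a sketch; wbSlider is a hypothetical outlet, and the temperature is in kelvin, with roughly 3000-8000 K being a practical UI range):

self.wbSlider.minimumValue = 3000.0f;
self.wbSlider.maximumValue = 8000.0f;
[self.wbSlider addTarget:self action:@selector(whiteBlanceChange:) forControlEvents:UIControlEventValueChanged];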