Before reading this article, if you are not yet familiar with how live streaming works, please see: How to quickly develop a complete iOS live-streaming app (principles).
To build a live-streaming app, you first need to capture the host's video and audio and then push them to a streaming media server. This article covers the capture side: the current version can switch between the front and rear cameras and shows a tap-to-focus indicator. The beautification (filter) feature is not done yet, so you will see your unfiltered self. Articles on the other live-streaming features will follow.
For the capture screenshots I really put myself out there, so please ignore the person and focus on the technique.
- AVFoundation: the framework used for audio/video capture.
- AVCaptureDevice: a hardware device (microphone, camera); use this object to configure physical device properties such as focus and white balance.
- AVCaptureDeviceInput: a hardware input object. You create one from an AVCaptureDevice; it manages the data coming in from that device.
- AVCaptureOutput: a hardware output object that receives the captured data, usually via its subclasses AVCaptureAudioDataOutput (audio data output) and AVCaptureVideoDataOutput (video data output).
- AVCaptureConnection: once an input and an output are added to an AVCaptureSession, the session establishes connections between them; you can obtain the connection object from an AVCaptureOutput.
- AVCaptureVideoPreviewLayer: the camera preview layer, which shows the live photo/video feed. It must be created with an AVCaptureSession, because the session is what carries the video data to display.
- AVCaptureSession: coordinates the flow of data between the inputs and outputs.
Capture steps:

1. Create an AVCaptureSession object.
2. Get the AVCaptureDevice objects for video recording (camera) and audio recording (microphone). Note that a device object does not itself supply input data; it is only used to configure the hardware.
3. From each audio/video AVCaptureDevice, create the corresponding hardware input object (AVCaptureDeviceInput), which manages the data input.
4. Create a video data output object (AVCaptureVideoDataOutput) and set its sample buffer delegate (setSampleBufferDelegate) to receive the captured video data.
5. Create an audio data output object (AVCaptureAudioDataOutput) and set its sample buffer delegate (setSampleBufferDelegate) to receive the captured audio data.
6. Add the input objects (AVCaptureDeviceInput) and output objects (AVCaptureOutput) to the AVCaptureSession; the session then automatically connects audio input to audio output and video input to video output.
7. Create a video preview layer (AVCaptureVideoPreviewLayer), point it at the session, and add it to the container view's layer.
8. Start the AVCaptureSession; data only flows from inputs to outputs while the session is running.

```objc
// Capture audio and video
- (void)setupCaputureVideo
{
    // 1. Create the capture session; it must be strongly referenced or it will be released
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    _captureSession = captureSession;

    // 2. Get the camera device (front camera here)
    AVCaptureDevice *videoDevice = [self getVideoDevice:AVCaptureDevicePositionFront];

    // 3. Get the audio device
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    // 4. Create the video device input object
    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    _currentVideoDeviceInput = videoDeviceInput;

    // 5. Create the audio device input object
    AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];

    // 6. Add the inputs to the session
    // Note: always check whether the input can be added; the session cannot take a nil input
    // 6.1 Add video
    if ([captureSession canAddInput:videoDeviceInput]) {
        [captureSession addInput:videoDeviceInput];
    }
    // 6.2 Add audio
    if ([captureSession canAddInput:audioDeviceInput]) {
        [captureSession addInput:audioDeviceInput];
    }

    // 7. Create the video data output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // 7.1 Set the delegate to capture video sample data
    // Note: the queue must be a serial queue and must not be nil
    dispatch_queue_t videoQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
    [videoOutput setSampleBufferDelegate:self queue:videoQueue];
    if ([captureSession canAddOutput:videoOutput]) {
        [captureSession addOutput:videoOutput];
    }

    // 8. Create the audio data output
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    // 8.1 Set the delegate to capture audio sample data
    // Note: the queue must be a serial queue and must not be nil
    dispatch_queue_t audioQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
    [audioOutput setSampleBufferDelegate:self queue:audioQueue];
    if ([captureSession canAddOutput:audioOutput]) {
        [captureSession addOutput:audioOutput];
    }

    // 9. Get the video connection, used to tell video data apart from audio data
    _videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];

    // 10. Add the video preview layer
    AVCaptureVideoPreviewLayer *previedLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previedLayer.frame = [UIScreen mainScreen].bounds;
    [self.view.layer insertSublayer:previedLayer atIndex:0];
    _previedLayer = previedLayer;

    // 11. Start the session
    [captureSession startRunning];
}

// Get the camera device for the given position
- (AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
// Receives captured data; it may be audio or video
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_videoConnection == connection) {
        NSLog(@"captured video data");
    } else {
        NSLog(@"captured audio data");
    }
}
```
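One step the walkthrough above skips is asking the user for camera and microphone permission (and since iOS 10 the app must also declare NSCameraUsageDescription and NSMicrophoneUsageDescription in Info.plist, or it will crash on access). A minimal sketch of a hypothetical helper that requests both before calling the setup method shown above:

```objc
// Hypothetical helper: request camera and microphone access before starting capture.
- (void)requestCaptureAccess
{
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (!granted) return; // user declined camera access
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL audioGranted) {
            if (!audioGranted) return; // user declined microphone access
            // The completion handler runs on an arbitrary queue; hop back to main for UI/session setup
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupCaputureVideo];
            });
        }];
    }];
}
```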
Steps to switch cameras:

1. Get the current video device input object.
2. Check whether the current camera is the front or the rear one.
3. Work out the position to switch to.
4. Get the camera device for that position.
5. Create the corresponding camera input object.
6. Remove the previous video input object from the session.
7. Add the new video input object to the session.

```objc
// Switch camera
- (IBAction)toggleCapture:(id)sender
{
    // Get the current camera position
    AVCaptureDevicePosition curPosition = _currentVideoDeviceInput.device.position;
    // Work out the position to switch to
    AVCaptureDevicePosition togglePosition = curPosition == AVCaptureDevicePositionFront ? AVCaptureDevicePositionBack : AVCaptureDevicePositionFront;
    // Get the camera device for that position
    AVCaptureDevice *toggleDevice = [self getVideoDevice:togglePosition];
    // Create the new camera input object
    AVCaptureDeviceInput *toggleDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toggleDevice error:nil];
    // Remove the previous camera input
    [_captureSession removeInput:_currentVideoDeviceInput];
    // Add the new camera input
    [_captureSession addInput:toggleDeviceInput];
    // Remember the current camera input
    _currentVideoDeviceInput = toggleDeviceInput;
}
```
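When you batch several changes to a running session, as in the camera toggle above, Apple recommends wrapping them in beginConfiguration/commitConfiguration so the session applies them atomically, and checking canAddInput: before adding. A sketch of the same remove/add sequence with that wrapper (this is an addition to the article's code, not part of it):

```objc
[_captureSession beginConfiguration];
[_captureSession removeInput:_currentVideoDeviceInput];
if ([_captureSession canAddInput:toggleDeviceInput]) {
    [_captureSession addInput:toggleDeviceInput];
    _currentVideoDeviceInput = toggleDeviceInput;
} else {
    // Roll back: keep the old camera if the new input cannot be added
    [_captureSession addInput:_currentVideoDeviceInput];
}
// All queued changes take effect here, in one step
[_captureSession commitConfiguration];
```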
Steps for tap-to-focus:

1. Listen for taps on the screen.
2. Get the tap location and convert it into a point in the camera's coordinate space; the conversion must go through the video preview layer (AVCaptureVideoPreviewLayer).
3. Position the focus cursor image at the tap location and animate it.
4. Set the device's focus mode and exposure mode (note: you must lock the configuration with lockForConfiguration first, otherwise an error is raised).

```objc
// On tap, show the focus view
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    // Get the tap location
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    // Convert it to a point in the camera's coordinate space
    CGPoint cameraPoint = [_previedLayer captureDevicePointOfInterestForPoint:point];
    // Position the focus cursor
    [self setFocusCursorWithPoint:point];
    // Set focus and exposure
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 *  Position the focus cursor
 *
 *  @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point
{
    self.focusCursorImageView.center = point;
    self.focusCursorImageView.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursorImageView.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursorImageView.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursorImageView.alpha = 0;
    }];
}

/**
 *  Set focus and exposure
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point
{
    AVCaptureDevice *captureDevice = _currentVideoDeviceInput.device;
    // Lock the configuration
    [captureDevice lockForConfiguration:nil];
    // Set the focus point of interest first; it only takes effect once the focus mode is set
    if ([captureDevice isFocusPointOfInterestSupported]) {
        [captureDevice setFocusPointOfInterest:point];
    }
    if ([captureDevice isFocusModeSupported:focusMode]) {
        [captureDevice setFocusMode:focusMode];
    }
    // Same ordering for exposure: point of interest, then mode
    if ([captureDevice isExposurePointOfInterestSupported]) {
        [captureDevice setExposurePointOfInterest:point];
    }
    if ([captureDevice isExposureModeSupported:exposureMode]) {
        [captureDevice setExposureMode:exposureMode];
    }
    // Unlock the configuration
    [captureDevice unlockForConfiguration];
}
```
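The focus method above passes nil for the lockForConfiguration: error parameter. In practice the lock can fail (for example, if another process holds the device), so it is safer to check the BOOL it returns and only apply settings when the lock succeeds. A small sketch:

```objc
NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    // ... apply the focus/exposure settings shown above ...
    [captureDevice unlockForConfiguration];
} else {
    // Could not get exclusive access to the device's configuration
    NSLog(@"lockForConfiguration failed: %@", error);
}
```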
More live-streaming material is on the way; the goal is to walk every reader through building a live-streaming app from scratch, and the demo will be fleshed out over time.
Demo: click to download
When you open the project, do not drag jkplayer into the project as shown below; instead, simply copy the jkplayer library into the same directory level as Classes.