This article describes how to push a live stream over RTMP on the iOS platform to implement a simple live-streaming feature. Outline:
- Integrating the librtmp library into the app, with sample code
- Code structure
- Screenshots of the running demo
The following describes the two most common ways to set up an RTMP streaming server:
Compiling nginx depends on the gcc toolchain:
yum -y install gcc gcc-c++
nginx's HTTP module uses PCRE to parse regular expressions:
yum install -y pcre pcre-devel
nginx uses zlib to gzip-compress HTTP response contents:
yum install -y zlib zlib-devel
OpenSSL is a robust secure-sockets-layer cryptography library that bundles the major cryptographic algorithms, common key and certificate management functions, and the SSL protocol, along with a rich set of utility programs for testing and other purposes. nginx supports not only HTTP but also HTTPS (HTTP over SSL), so OpenSSL must be installed on CentOS:
yum install -y openssl openssl-devel
# Download the rtmp module package
wget https://github.com/arut/nginx-rtmp-module/archive/master.zip
# Unzip it (CentOS does not ship unzip by default; install it with yum)
unzip -o master.zip
# Rename the folder
mv nginx-rtmp-module-master nginx-rtmp-module
Install nginx:
# Download nginx
wget http://nginx.org/download/nginx-1.13.8.tar.gz
# Extract nginx
tar -zxvf nginx-1.13.8.tar.gz
# Enter the nginx directory
cd nginx-1.13.8
# Generate the build configuration, pointing configure at the module downloaded above
./configure --prefix=/usr/local/nginx --add-module=/home/nginx-rtmp-module --with-http_ssl_module
# Compile
make
# Install
make install
# Inspect the compiled-in nginx modules
nginx -V
vi /usr/local/nginx/conf/nginx.conf
# Worker processes
worker_processes 1;
# Event settings
events {
    worker_connections 1024;
}
# RTMP settings
rtmp {
    server {
        # Listen port
        listen 1935;
        # Live application
        application myapp {
            live on;
        }
        # HLS application
        application hls {
            live on;
            hls on;
            hls_path /tmp/hls;
        }
    }
}
http {
    include mime.types;
    default_type application/octet-stream;
    sendfile on;
    keepalive_timeout 65;
    gzip on;
    server {
        listen 80;
        server_name localhost;
        location / {
            root html;
            index index.html index.htm;
        }
        # HLS playback location
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
            add_header Cache-Control no-cache;
        }
        error_page 500 502 503 504 /50x.html;
        location = /50x.html {
            root html;
        }
    }
}
Start nginx:
/usr/local/nginx/sbin/nginx
Push a test file to the server with ffmpeg:
ffmpeg -re -i "/home/123.mp4" -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -s 640x480 -q 10 rtmp://localhost:1935/myapp/test1
The second approach sets up the server on macOS with Homebrew. Install Homebrew first if needed:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Then install nginx with the RTMP module:
brew tap denji/nginx
brew install nginx-full --with-rtmp-module
During installation you may be prompted to install the Xcode command line tools. Recent Xcode releases no longer bundle the Command Line Tools, so they must be installed separately: either run the prompted command xcode-select --install,
or download the matching version from Apple's developer site: developer.apple.com/download/mo…
My current development environment, and the corresponding download version, are shown below:
Check the nginx installation info:
brew info nginx-full
The output looks like this:
==> Caveats
Docroot is: /usr/local/var/www
The default port has been set in /usr/local/etc/nginx/nginx.conf to 8080 so that
nginx can run without sudo.
nginx will load all files in /usr/local/etc/nginx/servers/.
- Tips -
Run port 80:
$ sudo chown root:wheel /usr/local/Cellar/nginx-full/1.17.1/bin/nginx
$ sudo chmod u+s /usr/local/Cellar/nginx-full/1.17.1/bin/nginx
Reload config:
$ nginx -s reload
Reopen Logfile:
$ nginx -s reopen
Stop process:
$ nginx -s stop
Waiting on exit process
$ nginx -s quit
To have launchd start denji/nginx/nginx-full now and restart at login:
brew services start denji/nginx/nginx-full
Or, if you don't want/need a background service you can just run: nginx
nginx install location:
/usr/local/Cellar/nginx-full/
nginx configuration file location:
/usr/local/etc/nginx/nginx.conf
nginx web root location:
/usr/local/var/www
To verify the installation, run:
nginx
Then open http://localhost:8080 in a browser; if the nginx welcome page appears, the installation succeeded.
The nginx configuration file lives in the /usr/local/etc/nginx directory. Open nginx.conf in an editor and add the rtmp block after the http block:
http {
    ...
}
# Add the rtmp block after the http block
rtmp {
    server {
        # Listen port
        listen 1935;
        # Chunk size
        chunk_size 4000;
        # RTMP live stream application
        application rtmplive {
            # Enable live mode
            live on;
            # Maximum number of connections
            max_connections 1024;
        }
        # HLS live stream application
        application hls {
            live on;
            hls on;
            # Where the HLS segment files are stored
            hls_path /usr/local/var/www/hls;
            # HLS fragment length
            hls_fragment 5s;
        }
    }
}
Inside the server block of the http block, add:
http {
    ...
    server {
        ...
        location / {
            root html;
            index index.html index.htm;
        }
        location /hls {
            # Response MIME types
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /usr/local/var/www;
            # Disable caching
            add_header Cache-Control no-cache;
        }
        ...
    }
    ...
}
After configuring, restart nginx with:
nginx -s stop    # stop nginx
nginx            # start nginx
nginx -s reload  # reload the configuration
First, push a stream to the RTMP server with ffmpeg:
ffmpeg -re -i test.mp4 -vcodec libx264 -acodec aac -f flv rtmp://localhost:1935/rtmplive/room1
Play it back with ffplay or VLC:
ffplay -i rtmp://localhost:1935/rtmplive/room1
Push an HLS stream the same way:
ffmpeg -re -i Test.MOV -vcodec libx264 -acodec aac -f flv rtmp://localhost:1935/hls/stream
Play it back with ffplay or VLC:
ffplay -i rtmp://localhost:1935/hls/stream
You can also open http://localhost:8080/hls/stream.m3u8 in a browser to view the HLS stream.
The sample program follows the pipeline audio/video capture -> hardware encoding -> RTMP push, as shown in the figure below:
I have tried to build the librtmp library for the iOS platform myself, but for various reasons never succeeded; I will keep trying. The librtmp build used here is one found online, which can be downloaded here:
Link: pan.baidu.com/s/1ATEqt31W… Password: v63u
After unpacking, add the library and headers to the project, then set the library and header search paths accordingly.
The audio/video captured in the demo has the following parameters (matching the metadata sent below): H.264 video at 480x640 and 20 fps; AAC audio at 44100 Hz, 16-bit, mono.
After the connection to the RTMP server is established, the audio/video metadata must be sent first. The code is as follows:
- (void)sendMetaData {
    RTMPPacket packet;
    char pbuf[2048], *pend = pbuf + sizeof(pbuf);
    packet.m_nChannel = 0x03; // control channel (invoke)
    packet.m_headerType = RTMP_PACKET_SIZE_LARGE;
    packet.m_packetType = RTMP_PACKET_TYPE_INFO;
    packet.m_nTimeStamp = 0;
    packet.m_nInfoField2 = self->rtmp->m_stream_id;
    packet.m_hasAbsTimestamp = TRUE;
    packet.m_body = pbuf + RTMP_MAX_HEADER_SIZE;
    char *enc = packet.m_body;
    enc = AMF_EncodeString(enc, pend, &av_setDataFrame);
    enc = AMF_EncodeString(enc, pend, &av_onMetaData);
    *enc++ = AMF_OBJECT;
    enc = AMF_EncodeNamedNumber(enc, pend, &av_duration, 0.0);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_fileSize, 0.0);
    // video size
    enc = AMF_EncodeNamedNumber(enc, pend, &av_width, 480);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_height, 640);
    // video
    enc = AMF_EncodeNamedString(enc, pend, &av_videocodecid, &av_avc1);
    // 640x480
    enc = AMF_EncodeNamedNumber(enc, pend, &av_videodatarate, 480 * 640 / 1000.f);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_framerate, 20);
    // audio
    enc = AMF_EncodeNamedString(enc, pend, &av_audiocodecid, &av_mp4a);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_audiodatarate, 96000);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_audiosamplerate, 44100);
    enc = AMF_EncodeNamedNumber(enc, pend, &av_audiosamplesize, 16.0);
    enc = AMF_EncodeNamedBoolean(enc, pend, &av_stereo, NO);
    // sdk version
    enc = AMF_EncodeNamedString(enc, pend, &av_encoder, &av_SDKVersion);
    *enc++ = 0;
    *enc++ = 0;
    *enc++ = AMF_OBJECT_END;
    packet.m_nBodySize = enc - packet.m_body;
    if (!RTMP_SendPacket(self->rtmp, &packet, FALSE)) {
        return;
    }
}
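The AMF_Encode* helpers above come from librtmp and serialize the metadata in AMF0. For intuition, an AMF0 number is simply a 0x00 type marker followed by the value as a big-endian IEEE-754 double. A minimal standalone sketch of that encoding (an illustration, not librtmp code):

```c
#include <stdint.h>
#include <string.h>

/* Encode an AMF0 number: 0x00 type marker + 8-byte big-endian IEEE-754 double.
   Returns the number of bytes written (always 9). */
static int amf0_encode_number(unsigned char *out, double value) {
    uint64_t bits;
    memcpy(&bits, &value, sizeof(bits)); /* reinterpret the double's bit pattern */
    out[0] = 0x00;                       /* AMF0 type marker: number */
    for (int i = 0; i < 8; i++)          /* most significant byte first */
        out[1 + i] = (unsigned char)(bits >> (56 - 8 * i));
    return 9;
}
```

For example, the width 640.0 encodes as 00 40 84 00 00 00 00 00 00, since 640.0 has the IEEE-754 bit pattern 0x4084000000000000.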
The SPS and PPS must be packaged and pushed to the server before any other NALUs. Because the audio/video stream that RTMP pushes is packaged much like the FLV format, when pushing H.264 and AAC live streams to a media server such as FMS you must first send an "AVC sequence header" and an "AAC sequence header" (these packets carry essential codec configuration; without them the decoder cannot decode). The "AVC sequence header" is what packages the SPS and PPS.
The AVC sequence header is really the AVCDecoderConfigurationRecord structure, described in detail in section 5.2.4.1 of the standard ISO/IEC 14496-15:2004.
The code below can be found in many places online, but rarely with a detailed explanation of what each byte means. I annotated it in detail based on the official documentation. The relevant packet structures are:
VIDEODATA:
Definition of AVCDecoderConfigurationRecord:
The fully annotated code is as follows:
- (void)sendVideoSps:(NSData *)spsData pps:(NSData *)ppsData
{
    unsigned char *sps = (unsigned char *)spsData.bytes;
    unsigned char *pps = (unsigned char *)ppsData.bytes;
    long sps_len = spsData.length;
    long pps_len = ppsData.length;
    dispatch_async(self.rtmpQueue, ^{
        if (self->rtmp != NULL) {
            unsigned char *body = NULL;
            NSInteger iIndex = 0;
            NSInteger rtmpLength = 1024;
            body = (unsigned char *)malloc(rtmpLength);
            memset(body, 0, rtmpLength);
            /*** VideoTagHeader: for AVC this header is 5 bytes long ***/
            body[iIndex++] = 0x17; // Frame type and CodecID, 4 bits each. 1 = keyframe (for AVC, a seekable frame); 7 = AVC. The config is sent as a keyframe.
            body[iIndex++] = 0x00; // AVCPacketType (1 byte): 0 = AVC sequence header
            body[iIndex++] = 0x00; // CompositionTime (3 bytes): 0
            body[iIndex++] = 0x00;
            body[iIndex++] = 0x00;
            /*** AVCDecoderConfigurationRecord: carries the SPS and PPS the H.264 decoder depends on.
                 The SPS/PPS must be sent before any stream data is fed to the AVC decoder, otherwise it
                 cannot work; they must also be re-sent whenever the decoder is stopped and restarted
                 (seek, fast-forward/rewind, state switches, etc.). In an FLV file the
                 AVCDecoderConfigurationRecord normally appears only once, in the first video tag. ***/
            body[iIndex++] = 0x01;   // configurationVersion = 1
            body[iIndex++] = sps[1]; // AVCProfileIndication, 1 byte
            body[iIndex++] = sps[2]; // profile_compatibility, 1 byte
            body[iIndex++] = sps[3]; // AVCLevelIndication, 1 byte
            body[iIndex++] = 0xff;
            // sps
            body[iIndex++] = 0xe1; // Low 5 bits give the SPS count: 0xe1 = 1110 0001, low five bits 00001 = 1, i.e. one SPS
            body[iIndex++] = (sps_len >> 8) & 0xff; // SPS length, 2 bytes big-endian (high byte: sps_len >> 8 & 0xff; low byte: sps_len & 0xff)
            body[iIndex++] = sps_len & 0xff;
            memcpy(&body[iIndex], sps, sps_len);
            iIndex += sps_len;
            // pps
            body[iIndex++] = 0x01; // PPS count: one PPS here
            body[iIndex++] = (pps_len >> 8) & 0xff; // PPS length, 2 bytes, same layout as the SPS length
            body[iIndex++] = (pps_len) & 0xff;
            memcpy(&body[iIndex], pps, pps_len);
            iIndex += pps_len;
            [self sendPacket:RTMP_PACKET_TYPE_VIDEO data:body size:iIndex nTimestamp:0];
            free(body);
        }
    });
}
Based on the official documentation, the key bytes are explained below (full details in the comments above):
body[iIndex++] = 0x17;
body[iIndex++] = 0xe1;
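The bit layout of these two bytes can be checked with a few lines of C (a standalone illustration, not part of the demo):

```c
/* Split the first byte of a VideoTagHeader into its two 4-bit fields. */
static int flv_frame_type(unsigned char b) { return (b >> 4) & 0x0f; } /* 1 = keyframe, 2 = inter frame */
static int flv_codec_id(unsigned char b)   { return b & 0x0f; }        /* 7 = AVC */

/* The SPS count lives in the low 5 bits of the byte after lengthSizeMinusOne. */
static int avcc_num_sps(unsigned char b)   { return b & 0x1f; }
```

So flv_frame_type(0x17) is 1 (keyframe) and flv_codec_id(0x17) is 7 (AVC), while avcc_num_sps(0xe1) is 1, matching the comments above.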
First, look at the structure of a video data packet:
The implementation:
- (void)sendVideoData:(NSData *)data isKeyFrame:(BOOL)isKeyFrame
{
    dispatch_async(self.rtmpQueue, ^{
        if (self->rtmp != NULL) {
            uint32_t timeoffset = [[NSDate date] timeIntervalSince1970] * 1000 - self->start_time; /* start_time is the timestamp at which the broadcast started */
            NSInteger i = 0;
            NSInteger rtmpLength = data.length + 9;
            unsigned char *body = (unsigned char *)malloc(rtmpLength);
            memset(body, 0, rtmpLength);
            if (isKeyFrame) {
                body[i++] = 0x17; // 1: I-frame, 7: AVC
            } else {
                body[i++] = 0x27; // 2: P-frame, 7: AVC
            }
            body[i++] = 0x01; // AVCPacketType: 0 = AVC sequence header; 1 = AVC NALU; 2 = AVC end of sequence
            body[i++] = 0x00; // CompositionTime, 3 bytes: non-zero only when there is a composition time offset
            body[i++] = 0x00;
            body[i++] = 0x00;
            body[i++] = (data.length >> 24) & 0xff; // NALU size, 4 bytes big-endian
            body[i++] = (data.length >> 16) & 0xff;
            body[i++] = (data.length >> 8) & 0xff;
            body[i++] = (data.length) & 0xff;
            memcpy(&body[i], data.bytes, data.length); // NALU data
            [self sendPacket:RTMP_PACKET_TYPE_VIDEO data:body size:(rtmpLength) nTimestamp:timeoffset];
            free(body);
        }
    });
}
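The four shift-and-mask lines that write the NALU size are just a 32-bit big-endian encoding; a standalone C sketch of the write and the matching read a decoder would perform:

```c
#include <stdint.h>

/* Write a 32-bit NALU size big-endian (most significant byte first),
   exactly as the four body[i++] lines above do. */
static void write_be32(unsigned char *p, uint32_t v) {
    p[0] = (v >> 24) & 0xff;
    p[1] = (v >> 16) & 0xff;
    p[2] = (v >> 8) & 0xff;
    p[3] = v & 0xff;
}

/* The receiving side reassembles the value from the same four bytes. */
static uint32_t read_be32(const unsigned char *p) {
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8) | (uint32_t)p[3];
}
```

For instance, write_be32(buf, 0x0001F3AB) yields the bytes 00 01 F3 AB, and read_be32 recovers the original value.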
The audio header carries the audio configuration; for this packet the second byte must be 0.
The code:
- (void)sendAudioHeader:(NSData *)data {
    NSInteger audioLength = data.length;
    dispatch_async(self.rtmpQueue, ^{
        NSInteger rtmpLength = audioLength + 2; /* the spec data (AAC sequence header) is normally 2 bytes */
        unsigned char *body = (unsigned char *)malloc(rtmpLength);
        memset(body, 0, rtmpLength);
        /* AE 00 + AAC sequence header data */
        body[0] = 0xAE; // SoundFormat (4 bits): 10 = AAC, hex A; SoundRate (2 bits): 3 = 44100 Hz, binary 11; SoundSize (1 bit): 1 = 16-bit (0 would be 8-bit); SoundType (1 bit): 0 = mono (1 would be stereo). 1110 = E
        body[1] = 0x00; // AACPacketType: 0 = audio config (AAC sequence header)
        memcpy(&body[2], data.bytes, audioLength); /* data is the AAC sequence header */
        [self sendPacket:RTMP_PACKET_TYPE_AUDIO data:body size:rtmpLength nTimestamp:0];
        free(body);
    });
}
Based on the official documentation, the two header bytes are:
body[0] = 0xAE:
body[1] = 0x00:
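body[0] packs four fields into one byte: SoundFormat (4 bits), SoundRate (2 bits), SoundSize (1 bit), SoundType (1 bit). A small C sketch (illustration only) composes the byte from those fields:

```c
/* Pack the first byte of an FLV AudioTagHeader.
   format: 10 = AAC; rate: 3 = 44100 Hz; size: 1 = 16-bit; type: 0 = mono, 1 = stereo. */
static unsigned char flv_audio_tag_byte(int format, int rate, int size, int type) {
    return (unsigned char)(((format & 0x0f) << 4) |
                           ((rate & 0x03) << 2) |
                           ((size & 0x01) << 1) |
                           (type & 0x01));
}
```

AAC at 44100 Hz, 16-bit mono gives 0xAE, matching the code; the stereo variant would be 0xAF, the value most often seen in other examples online.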
Compared with the audio header, only the second byte changes.
- (void)sendAudioData:(NSData *)data {
    NSInteger audioLength = data.length;
    dispatch_async(self.rtmpQueue, ^{
        uint32_t timeoffset = [[NSDate date] timeIntervalSince1970] * 1000 - self->start_time;
        NSInteger rtmpLength = audioLength + 2; /* 2 bytes of tag header before the payload */
        unsigned char *body = (unsigned char *)malloc(rtmpLength);
        memset(body, 0, rtmpLength);
        /* AE 01 + AAC raw data */
        body[0] = 0xAE;
        body[1] = 0x01; // AACPacketType: 1 = AAC raw frame data
        memcpy(&body[2], data.bytes, audioLength);
        [self sendPacket:RTMP_PACKET_TYPE_AUDIO data:body size:rtmpLength nTimestamp:timeoffset];
        free(body);
    });
}