The previous two posts covered the basic concepts and flow of audio/video calling, along with an example of a call inside a single LAN.
This post walks through a pseudo real-world, cross-network audio/video call example to analyze the WebRTC calling process.
In the last post both clients sat behind the same router, so no NAT traversal was needed and the two clients could exchange media streams directly; using XMPP as the signaling channel was also very simple.
This post adds a STUN server and a TURN server so that the ICE framework can do its job and a complete audio/video call can be established. Because the two clients now live in different network environments, they have to be placed into the same virtual network (i.e. a room server), which requires server-side support; server-side development is not covered here.
On the caller's side, the call is started like this:
- (void)startCommunication:(BOOL)isVideo
{
    WebRTCClient *client = [WebRTCClient sharedInstance];
    client.myJID = [HLIMCenter sharedInstance].xmppStream.myJID.full;
    client.remoteJID = self.chatJID.full;
    [client showRTCViewByRemoteName:self.chatJID.full isVideo:isVideo isCaller:YES];
}
While the audio/video call view is being presented, a series of setup operations also has to happen, starting with the ICE server list:
instance.ICEServers = [NSMutableArray arrayWithObject:[instance defaultSTUNServer]];
ICE server entries are created with a dedicated class, RTCICEServer:
- (RTCICEServer *)defaultSTUNServer {
    NSURL *defaultSTUNServerURL = [NSURL URLWithString:RTCSTUNServerURL];
    return [[RTCICEServer alloc] initWithURI:defaultSTUNServerURL
                                    username:@""
                                    password:@""];
}
For the STUN server you can use the one Google provides, stun:stun.l.google.com:19302, or you can ask your server developers to set one up for you.
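RTCSTUNServerURL above is just a string constant; assuming the Google server is used, it would look roughly like this:
static NSString * const RTCSTUNServerURL = @"stun:stun.l.google.com:19302";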
As for the TURN server (a relay server), calls will usually work without one, but it is still best to provide a few as backup.
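If the server team simply hands you fixed TURN credentials instead of the dynamic lookup shown (commented out) in the setup code below, the relay can be appended to the same ICE-server array. The URI and credentials here are placeholders:
RTCICEServer *turnServer = [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:@"turn:turn.example.com:3478"]
                                                    username:@"placeholderUser"
                                                    password:@"placeholderPassword"];
[_ICEServers addObject:turnServer];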
/**
 *  RTC setup
 */
- (void)initRTCSetting
{
    // Add TURN servers
//    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:RTCTRUNServerURL]];
//    [request addValue:@"Mozilla/5.0" forHTTPHeaderField:@"user-agent"];
//    [request addValue:RTCRoomServerURL forHTTPHeaderField:@"origin"];
//    [request setTimeoutInterval:5];
//    [request setCachePolicy:NSURLRequestReloadIgnoringCacheData];
//
//    NSURLSessionDataTask *turnTask = [[NSURLSession sharedSession] dataTaskWithRequest:request completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
//        NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers error:NULL];
//        NSLog(@"TURN server response: %@", dict);
//        NSString *username = dict[@"username"];
//        NSString *password = dict[@"password"];
//        NSArray *uris = dict[@"uris"];
//
//        for (NSString *uri in uris) {
//            RTCICEServer *server = [[RTCICEServer alloc] initWithURI:[NSURL URLWithString:uri] username:username password:password];
//            [_ICEServers addObject:server];
//        }
//    }];
//    [turnTask resume];

    self.peerConnection = [self.peerConnectionFactory peerConnectionWithICEServers:_ICEServers constraints:self.pcConstraints delegate:self];

    // Set up the local media stream
    RTCMediaStream *mediaStream = [self.peerConnectionFactory mediaStreamWithLabel:@"ARDAMS"];

    // Add the local video track
    RTCAVFoundationVideoSource *source = [[RTCAVFoundationVideoSource alloc] initWithFactory:self.peerConnectionFactory constraints:self.videoConstraints];
    RTCVideoTrack *localVideoTrack = [[RTCVideoTrack alloc] initWithFactory:self.peerConnectionFactory source:source trackId:@"AVAMSv0"];
    [mediaStream addVideoTrack:localVideoTrack];
    self.localVideoTrack = localVideoTrack;

    // Add the local audio track
    RTCAudioTrack *localAudioTrack = [self.peerConnectionFactory audioTrackWithID:@"ARDAMSa0"];
    [mediaStream addAudioTrack:localAudioTrack];

    // Add the mediaStream to the peer connection
    [self.peerConnection addStream:mediaStream];

    // Local preview view (mirrored)
    RTCEAGLVideoView *localVideoView = [[RTCEAGLVideoView alloc] initWithFrame:self.rtcView.ownImageView.bounds];
    localVideoView.transform = CGAffineTransformMakeScale(-1, 1);
    localVideoView.delegate = self;
    [self.rtcView.ownImageView addSubview:localVideoView];
    self.localVideoView = localVideoView;
    [self.localVideoTrack addRenderer:self.localVideoView];

    // Remote video view
    RTCEAGLVideoView *remoteVideoView = [[RTCEAGLVideoView alloc] initWithFrame:self.rtcView.adverseImageView.bounds];
    remoteVideoView.transform = CGAffineTransformMakeScale(-1, 1);
    remoteVideoView.delegate = self;
    [self.rtcView.adverseImageView addSubview:remoteVideoView];
    self.remoteVideoView = remoteVideoView;
}
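The three constraints objects referenced above (pcConstraints, videoConstraints, sdpConstraints) are not listed in this post; a minimal sketch of how they might be built with the legacy RTCMediaConstraints/RTCPair API could look like this (the DtlsSrtpKeyAgreement setting is an assumption, mirroring the usual AppRTC-style configuration):
// Peer connection constraints: enable DTLS-SRTP key agreement
self.pcConstraints = [[RTCMediaConstraints alloc]
    initWithMandatoryConstraints:nil
             optionalConstraints:@[[[RTCPair alloc] initWithKey:@"DtlsSrtpKeyAgreement" value:@"true"]]];

// Video source constraints: no particular restrictions on the camera capture
self.videoConstraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil optionalConstraints:nil];

// SDP constraints: declare that we want to receive both audio and video
self.sdpConstraints = [[RTCMediaConstraints alloc]
    initWithMandatoryConstraints:@[[[RTCPair alloc] initWithKey:@"OfferToReceiveAudio" value:@"true"],
                                   [[RTCPair alloc] initWithKey:@"OfferToReceiveVideo" value:@"true"]]
             optionalConstraints:nil];
With the peer connection configured, the caller then creates an offer: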
// Create an offer
[self.peerConnection createOfferWithDelegate:self constraints:self.sdpConstraints];
- (void)peerConnection:(RTCPeerConnection *)peerConnection
didCreateSessionDescription:(RTCSessionDescription *)sdp
                 error:(NSError *)error
{
    if (error) {
        NSLog(@"Failed to create SessionDescription");
#warning Creating the SessionDescription failed; the calling UI should be dismissed and the user notified.
    } else {
        NSLog(@"SessionDescription created successfully");
        RTCSessionDescription *sdpH264 = [self descriptionWithDescription:sdp videoFormat:@"H264"];
        [self.peerConnection setLocalDescriptionWithDelegate:self sessionDescription:sdpH264];

        if ([sdp.type isEqualToString:@"offer"]) {
            // The caller first tells the callee which room to join.
            NSDictionary *dict = @{@"roomId":self.roomId};
            NSData *data = [NSJSONSerialization dataWithJSONObject:dict options:0 error:nil];
            NSString *message = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            [[HLIMClient shareClient] sendSignalingMessage:message toUser:self.remoteJID];
        }

        // Then send the SDP itself to the remote side.
        NSDictionary *jsonDict = @{ @"type" : sdp.type, @"sdp" : sdp.description };
        NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDict options:0 error:nil];
        NSString *jsonStr = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
        [[HLIMClient shareClient] sendSignalingMessage:jsonStr toUser:self.remoteJID];
    }
}
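The helper -descriptionWithDescription:videoFormat: is not listed in the post; it rewrites the SDP so the given video codec is preferred, much like ARDSDPUtils in Google's AppRTC demo. A simplified sketch (the project's actual implementation may differ) could look like this:
- (RTCSessionDescription *)descriptionWithDescription:(RTCSessionDescription *)description videoFormat:(NSString *)codec
{
    NSString *sdp = description.description;
    NSMutableArray *lines = [[sdp componentsSeparatedByString:@"\r\n"] mutableCopy];

    NSInteger mLineIndex = -1;          // index of the "m=video ..." line
    NSString *codecPayloadType = nil;   // payload number assigned to the codec

    // Find the m=video line and the rtpmap entry for the preferred codec.
    for (NSUInteger i = 0; i < lines.count; i++) {
        NSString *line = lines[i];
        if ([line hasPrefix:@"m=video"]) {
            mLineIndex = i;
        } else if ([line hasPrefix:@"a=rtpmap:"] && [line rangeOfString:codec].location != NSNotFound) {
            // "a=rtpmap:<payload> H264/90000" -> "<payload>"
            NSString *payloadAndCodec = [line substringFromIndex:@"a=rtpmap:".length];
            codecPayloadType = [payloadAndCodec componentsSeparatedByString:@" "].firstObject;
        }
    }
    if (mLineIndex == -1 || codecPayloadType == nil) {
        return description;   // nothing to change, keep the original SDP
    }

    // "m=video 9 UDP/TLS/RTP/SAVPF 100 101 ..." -> move the preferred payload to the front of the list.
    NSMutableArray *mLineParts = [[lines[mLineIndex] componentsSeparatedByString:@" "] mutableCopy];
    if (mLineParts.count > 3 && [mLineParts containsObject:codecPayloadType]) {
        [mLineParts removeObject:codecPayloadType];
        [mLineParts insertObject:codecPayloadType atIndex:3];
        lines[mLineIndex] = [mLineParts componentsJoinedByString:@" "];
    }

    NSString *preferredSdp = [lines componentsJoinedByString:@"\r\n"];
    return [[RTCSessionDescription alloc] initWithType:description.type sdp:preferredSdp];
}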
The delegate callbacks above are handled essentially the same way as in the previous post; only two of them change. In -peerConnection:iceConnectionChanged:, once a disconnect is detected, the call view is dismissed.
Key code:
        case RTCICEConnectionDisconnected:
        {
            NSLog(@"newState = RTCICEConnectionDisconnected");
            dispatch_async(dispatch_get_main_queue(), ^{
                [self.rtcView dismiss];
                [self cleanCache];
            });
        }
            break;
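-cleanCache is not listed in the post either; conceptually it just has to tear down the session. A rough sketch of what it might do, with property names assumed from the code above:
- (void)cleanCache
{
    [self.localVideoTrack removeRenderer:self.localVideoView];
    self.localVideoTrack = nil;

    // Closing the peer connection stops ICE and releases the media streams.
    [self.peerConnection close];
    self.peerConnection = nil;

    [self.messages removeAllObjects];
    _hasReceivedSdp = NO;
}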
As for -peerConnection:gotICECandidate:, the local side generates a candidate for every network interface and every supported protocol. Each candidate essentially describes one way to reach this peer; a STUN-type candidate, for example, contains this peer's IP and port as seen outside its NAT. Because STUN and TURN servers have been added, there are more possible ways to communicate, so this callback fires more often.
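The post doesn't list this callback, but judging from the receiving code below, each candidate is presumably packaged with the same JSON keys ("label", "id", "sdp") and pushed through the signaling channel, roughly like this:
- (void)peerConnection:(RTCPeerConnection *)peerConnection
       gotICECandidate:(RTCICECandidate *)candidate
{
    NSDictionary *jsonDict = @{ @"type"  : @"candidate",
                                @"label" : @(candidate.sdpMLineIndex),
                                @"id"    : candidate.sdpMid,
                                @"sdp"   : candidate.sdp };
    NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDict options:0 error:nil];
    NSString *jsonStr = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
    [[HLIMClient shareClient] sendSignalingMessage:jsonStr toUser:self.remoteJID];
}
On the receiving end, the signaling messages arriving over XMPP are queued and processed as follows: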
- (void)receiveSignalingMessage:(NSNotification *)notification
{
    NSDictionary *dict = [notification object];
    [self handleSignalingMessage:dict];
    [self drainMessages];
}

- (void)handleSignalingMessage:(NSDictionary *)dict
{
    NSString *type = dict[@"type"];
    if ([type isEqualToString:@"offer"] || [type isEqualToString:@"answer"]) {
        [self.messages insertObject:dict atIndex:0];
        _hasReceivedSdp = YES;
    } else if ([type isEqualToString:@"candidate"]) {
        [self.messages addObject:dict];
    } else if ([type isEqualToString:@"bye"]) {
        [self processMessageDict:dict];
    }
}

- (void)drainMessages
{
    if (!_peerConnection || !_hasReceivedSdp) {
        return;
    }
    for (NSDictionary *dict in self.messages) {
        [self processMessageDict:dict];
    }
    [self.messages removeAllObjects];
}

- (void)processMessageDict:(NSDictionary *)dict
{
    NSString *type = dict[@"type"];
    if ([type isEqualToString:@"offer"]) {
        RTCSessionDescription *remoteSdp = [[RTCSessionDescription alloc] initWithType:type sdp:dict[@"sdp"]];
        [self.peerConnection setRemoteDescriptionWithDelegate:self sessionDescription:remoteSdp];
        [self.peerConnection createAnswerWithDelegate:self constraints:self.sdpConstraints];
    } else if ([type isEqualToString:@"answer"]) {
        RTCSessionDescription *remoteSdp = [[RTCSessionDescription alloc] initWithType:type sdp:dict[@"sdp"]];
        [self.peerConnection setRemoteDescriptionWithDelegate:self sessionDescription:remoteSdp];
    } else if ([type isEqualToString:@"candidate"]) {
        NSString *mid = [dict objectForKey:@"id"];
        NSNumber *sdpLineIndex = [dict objectForKey:@"label"];
        NSString *sdp = [dict objectForKey:@"sdp"];
        RTCICECandidate *candidate = [[RTCICECandidate alloc] initWithMid:mid index:sdpLineIndex.intValue sdp:sdp];
        [self.peerConnection addICECandidate:candidate];
    } else if ([type isEqualToString:@"bye"]) {
        if (self.rtcView) {
            NSData *jsonData = [NSJSONSerialization dataWithJSONObject:dict options:0 error:nil];
            NSString *jsonStr = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
            if (jsonStr.length > 0) {
                [[HLIMClient shareClient] sendSignalingMessage:jsonStr toUser:self.remoteJID];
            }
            [self.rtcView dismiss];
            [self cleanCache];
        }
    }
}
Once the answer has been applied, the two sides can start sending media streams to each other peer-to-peer.
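On each end, the remote media arrives through the -peerConnection:addedStream: delegate callback; a minimal sketch of attaching the remote video track to the remote view created earlier (the remoteVideoTrack property is an assumption, mirroring the local track handling above):
- (void)peerConnection:(RTCPeerConnection *)peerConnection addedStream:(RTCMediaStream *)stream
{
    // The delegate is called on a background thread; UI work goes back to the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (stream.videoTracks.count > 0) {
            self.remoteVideoTrack = stream.videoTracks.firstObject;
            [self.remoteVideoTrack addRenderer:self.remoteVideoView];
        }
    });
}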
On the callee's side the flow is:
Step one: after receiving the room number that the caller sends over XMPP, show the incoming-call screen, but postpone the RTC setup until the answer button is tapped.
Step two: register and join that room; since the room has already been created by the caller, the callee simply joins it.
Step three: when the answer button is tapped, stop playing the ringtone and then do the RTC setup.
- (void)acceptAction
{
    [self.audioPlayer stop];
    [self initRTCSetting];
    [self drainMessages];
}
Step four: handle the signaling messages.
Before processing the queued signaling messages, check whether the offer has already been received. Only once the offer has arrived are the messages processed: the offer's SDP is first set as the peerConnection's remote description, then an answer is created and sent to the other side.
Once both ends have set their remote and local SDPs, peer-to-peer transmission of the media streams begins.
The first WebRTC post already mentioned that signaling can be carried over many transports; besides XMPP, other protocols such as WebSocket work just as well. Note, however, that the room number is not itself a signaling message.
So how is WebSocket used to carry the signaling messages?
After registering and successfully joining the room, the server returns the address of its WebSocket endpoint. At that point a WebSocket is created and then registered with the room number and clientId, which really just means wrapping the two values up and sending them to the server over the WebSocket.
Key code:
NSURL *webSocketURL = [NSURL URLWithString:dict[kARDJoinWebSocketURLKey]];
_webSocket = [[SRWebSocket alloc] initWithURL:webSocketURL];
_webSocket.delegate = self;
[_webSocket open];
[self registerForRoomId:self.roomId clientId:self.clientId];
And the key code for registering with the room number and clientId:
NSDictionary *registerMessage = @{
    @"cmd": @"register",
    @"roomid" : _roomId,
    @"clientid" : _clientId,
};
NSData *message = [NSJSONSerialization dataWithJSONObject:registerMessage
                                                  options:NSJSONWritingPrettyPrinted
                                                    error:nil];
NSString *messageString = [[NSString alloc] initWithData:message encoding:NSUTF8StringEncoding];
NSLog(@"Registering on WSS for rid:%@ cid:%@", _roomId, _clientId);
// Registration can fail if server rejects it. For example, if the room is full.
[_webSocket send:messageString];
A sample of the key code for sending a signaling message:
NSDictionary *jsonDict = @{ @"type" : sdp.type, @"sdp" : sdp.description };
NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDict options:0 error:nil];
NSString *jsonStr = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];

NSDictionary *messageDict = @{@"cmd": @"send", @"msg": jsonStr};
NSData *messageJSONObject = [NSJSONSerialization dataWithJSONObject:messageDict
                                                            options:NSJSONWritingPrettyPrinted
                                                              error:nil];
NSString *messageString = [[NSString alloc] initWithData:messageJSONObject
                                                encoding:NSUTF8StringEncoding];
[_webSocket send:messageString];
The WebSocket delegate methods fire when the socket opens successfully, fails to open, closes, and when a message is received. The interesting one here is receiving a message: incoming messages are signaling messages, and they come in several kinds. Candidate messages have to be queued (they can only be applied after the remote description has been set), while offer, answer, and bye messages are acted on right away.
Here is a sample handler for an incoming signaling message:
- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message {
    NSString *messageString = message;
    NSData *messageData = [messageString dataUsingEncoding:NSUTF8StringEncoding];
    id jsonObject = [NSJSONSerialization JSONObjectWithData:messageData
                                                    options:0
                                                      error:nil];
    if (![jsonObject isKindOfClass:[NSDictionary class]]) {
        NSLog(@"Unexpected message: %@", jsonObject);
        return;
    }
    NSDictionary *wssMessage = jsonObject;
    NSLog(@"WebSocket received message: %@", wssMessage);
    NSString *errorString = wssMessage[@"error"];
    if (errorString.length) {
        NSLog(@"WebSocket received an error message");
        return;
    }
    NSString *msg = wssMessage[@"msg"];
    NSData *data = [msg dataUsingEncoding:NSUTF8StringEncoding];
    NSDictionary *signalingMsg = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];

    [self handleSignalingMessage:signalingMsg];
    [self drainMessages];
}
The sample project that uses XMPP for signaling: RemoteXMPPRTC
The sample project that uses WebSocket for signaling: RemoteWebRTC
The WebRTC static library used in the projects has been uploaded to Baidu Netdisk.
That wraps up this introduction to WebRTC. Have fun!