Code Walkthrough
Original author: Tony Parisi
So how exactly does Unity drive the Oculus Rift? First, let's look at how the Unity scene is put together. The Unity integration package ships with a camera prefab that provides the core VR plumbing, namely Oculus stereo rendering and head tracking. Let's take a closer look.
In the Hierarchy panel, locate the OVRCameraRig object and click the arrow to its left to expand its children. The camera rig contains a child named TrackingSpace, which in turn has four children: LeftEyeAnchor, CenterEyeAnchor, RightEyeAnchor, and TrackerAnchor. The left and right eye anchors are the heart of the rig: each carries a Camera component that renders the view for one eye. In the Inspector both cameras show only default values; their real parameters are filled in at runtime.
Select the OVRCameraRig object again and double-click the OVRCameraRig script component attached to it. The Unity editor opens the script's source file, OVRCameraRig.cs, in MonoDevelop.
Search the source in MonoDevelop for the LateUpdate function:
#if !UNITY_ANDROID || UNITY_EDITOR
private void LateUpdate()
#else
private void Update()
#endif
{
    EnsureGameObjectIntegrity();

    if (!Application.isPlaying)
        return;

    UpdateCameras();
    UpdateAnchors();
}
Since we are not building for Android, the #if condition is true and LateUpdate is the function used. At runtime Unity repeatedly calls several per-frame functions on every script, among them Update and LateUpdate. LateUpdate is the better place for camera work because the engine guarantees that all Update calls have finished before LateUpdate runs; that matters here because we want the freshest head-tracking data available before updating the cameras.
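The same rule of thumb applies to any camera script: let gameplay code move things in Update, then position the camera in LateUpdate once everything has settled. Here is a minimal, hypothetical sketch of the pattern; the FollowCamera class, its target field, and the offset are illustrative and not part of the Oculus integration:

```csharp
using UnityEngine;

// Illustrative only: a simple follow camera that waits until LateUpdate,
// so the target has finished moving in Update before the camera reads it.
public class FollowCamera : MonoBehaviour
{
    public Transform target;                          // assumed: assigned in the Inspector
    public Vector3 offset = new Vector3(0f, 2f, -5f); // arbitrary example offset

    private void LateUpdate()
    {
        if (target == null)
            return;

        // By now every Update() has run, so target.position is final for this frame.
        transform.position = target.position + offset;
        transform.LookAt(target);
    }
}
```

Doing the same work in Update instead would race against the target's own movement and can produce a one-frame lag or jitter, which is exactly what the Oculus rig avoids by updating its cameras in LateUpdate.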
The first thing LateUpdate does is call EnsureGameObjectIntegrity, which makes sure the scene actually contains the objects the script depends on (that is, an instance of the OVRCameraRig prefab). This guards against the case where the script is included in a scene but the OVRCameraRig objects were never instantiated.
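We don't reproduce the body of EnsureGameObjectIntegrity here, but the idea is a defensive self-check. A hypothetical sketch of that pattern (not the actual OVRCameraRig code; the class and helper names below are made up for illustration):

```csharp
using UnityEngine;

// Hypothetical sketch of an integrity check: look up the required child
// anchors and recreate them as bare stand-ins if any are missing.
public class CameraRigCheck : MonoBehaviour
{
    private Transform leftEyeAnchor;
    private Transform rightEyeAnchor;

    public void EnsureGameObjectIntegrity()
    {
        if (leftEyeAnchor == null)
            leftEyeAnchor = FindOrCreateAnchor("LeftEyeAnchor");
        if (rightEyeAnchor == null)
            rightEyeAnchor = FindOrCreateAnchor("RightEyeAnchor");
    }

    private Transform FindOrCreateAnchor(string name)
    {
        Transform anchor = transform.Find(name);
        if (anchor == null)
        {
            // The prefab hierarchy is incomplete; create a bare stand-in child.
            anchor = new GameObject(name).transform;
            anchor.SetParent(transform, false);
        }
        return anchor;
    }
}
```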
After a quick check that the application is actually playing, the real work begins. First, UpdateCameras refreshes the parameters of the two cameras:
private void UpdateCameras()
{
    if (needsCameraConfigure)
    {
        leftEyeCamera = ConfigureCamera(OVREye.Left);
        rightEyeCamera = ConfigureCamera(OVREye.Right);

#if !UNITY_ANDROID || UNITY_EDITOR
        needsCameraConfigure = false;
#endif
    }
}
On the desktop this function does its work only once: it reads the configuration parameters from the Oculus configuration profile, then clears a flag so that subsequent calls know configuration is already done.
Next comes ConfigureCamera, which sets up the parameters for each camera:
private Camera ConfigureCamera(OVREye eye)
{
    Transform anchor = (eye == OVREye.Left) ? leftEyeAnchor : rightEyeAnchor;
    Camera cam = anchor.GetComponent<Camera>();

    OVRDisplay.EyeRenderDesc eyeDesc = OVRManager.display.GetEyeRenderDesc(eye);

    cam.fieldOfView = eyeDesc.fov.y;
    cam.aspect = eyeDesc.resolution.x / eyeDesc.resolution.y;
    cam.rect = new Rect(0f, 0f, OVRManager.instance.virtualTextureScale, OVRManager.instance.virtualTextureScale);
    cam.targetTexture = OVRManager.display.GetEyeTexture(eye);
    cam.hdr = OVRManager.instance.hdr;

    ...

    return cam;
}
The OVRManager class is the main interface to the Oculus SDK; it has many responsibilities, including talking to the native SDK. If you're curious about it, go back to the Unity editor, select the OVRCameraRig object, and you will find the OVRManager script among its components. So far, treating the SDK as a black box, we have populated both cameras' parameters: field of view, aspect ratio, viewport rectangle, render target, and HDR support.
The cameras' basic parameters are now set, but we still have to adjust the cameras' position and orientation every frame based on the HMD's tracking data. That is the job of UpdateAnchors:
private void UpdateAnchors()
{
    bool monoscopic = OVRManager.instance.monoscopic;

    OVRPose tracker = OVRManager.tracker.GetPose();
    OVRPose hmdLeftEye = OVRManager.display.GetEyePose(OVREye.Left);
    OVRPose hmdRightEye = OVRManager.display.GetEyePose(OVREye.Right);

    trackerAnchor.localRotation = tracker.orientation;
    centerEyeAnchor.localRotation = hmdLeftEye.orientation; // using left eye for now
    leftEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : hmdLeftEye.orientation;
    rightEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : hmdRightEye.orientation;

    trackerAnchor.localPosition = tracker.position;
    centerEyeAnchor.localPosition = 0.5f * (hmdLeftEye.position + hmdRightEye.position);
    leftEyeAnchor.localPosition = monoscopic ? centerEyeAnchor.localPosition : hmdLeftEye.position;
    rightEyeAnchor.localPosition = monoscopic ? centerEyeAnchor.localPosition : hmdRightEye.position;

    if (UpdatedAnchors != null)
    {
        UpdatedAnchors(this);
    }
}
This function obtains the HMD's current position and orientation through OVRTracker and OVRDisplay and copies them onto the corresponding transforms. The left and right eye anchors drive the actual rendering; the center anchor exists as a convenience marker, so the application can look up the midpoint between the eyes rather than recomputing it; and the tracker anchor holds the positional-tracking pose. Finally, if anything has subscribed to UpdatedAnchors, the callback fires with the rig itself as its argument.
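Other scripts can hook that UpdatedAnchors callback to run logic immediately after the rig's poses have been refreshed each frame. A minimal sketch, assuming UpdatedAnchors is exposed as an event or delegate field on OVRCameraRig that takes the rig as its argument (the AnchorListener class below is illustrative):

```csharp
using UnityEngine;

// Illustrative only: react whenever OVRCameraRig finishes updating its anchors.
public class AnchorListener : MonoBehaviour
{
    public OVRCameraRig rig; // assumed: assigned in the Inspector

    private void OnEnable()
    {
        rig.UpdatedAnchors += OnUpdatedAnchors;
    }

    private void OnDisable()
    {
        rig.UpdatedAnchors -= OnUpdatedAnchors;
    }

    private void OnUpdatedAnchors(OVRCameraRig updatedRig)
    {
        // The anchors now hold this frame's head pose; e.g. read the eye midpoint.
        Vector3 headPosition = updatedRig.centerEyeAnchor.position;
        Debug.Log(headPosition);
    }
}
```

Subscribing here, rather than reading the anchors from your own Update, guarantees you always see the current frame's pose rather than last frame's.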
And that's it: just by adding a single prefab instance, we get Oculus Rift stereo rendering and positional tracking. The prefab is a bit complex, but digging into it carefully reveals where the magic lives.