Notes on developing a Unity VR video player plugin

Demo

First, a screenshot of the result:

(demo screenshot)

Background

Our company has recently been building a VR live-streaming platform. We use Unity for the VR development, and playing video inside Unity requires a video plugin. We evaluated several plugins and recorded two of them, as follows:

Survey of Unity video plugins

Searching around online, the two most popular Unity plugins are the following:

  • AVPro This one sells for $150 on the Unity Asset Store; the latest release is 1.6.15. Its features include:

Powerful cross-platform video playback solution for Unity.

Native video playback on Android, iOS, macOS and tvOS (Apple TV), WebGL, Windows, Windows Phone and UWP.

Features include:

  • New: Unity 2017 supported
  • New: iOS video playback path that uses less memory
  • One API for video playback on all supported platforms
  • Unity 4.6 - 5.x supported
  • 8K video (on supported hardware)
  • VR Support (mono, stereo, equirectangular and cubemap)
  • Transparency support (native and packed)
  • Subtitles support (external SRT)
  • Fast flexible video playback
  • In-editor playback support for Windows and macOS
  • Free watermarked trial version available
  • Components for IMGUI, uGUI and NGUI
  • Over 64 PlayMaker actions included
  • Easy to use drag and drop components
  • Linear and Gamma colour spaces supported
  • Fast native Direct3D, OpenGL and Metal texture updates
  • Desktop support for Hap, Hap Alpha, Hap Q and Hap Q Alpha
  • Streaming video from URL (when supported by platform)

This plugin supports HLS video playback and its documentation is very detailed, but it ships without source code, so it is not suitable for the customized development we will need later.

  • EasyMovieTexture This is the second plugin we looked at. Its store description includes the following:

Supported resolutions:

  • Android: General devices support up to 1920 * 1080.
    The latest devices support up to 4K.
  • iOS: General devices support up to 1920 * 1080.
    The latest devices support up to 2560 * 1440.
    iPhone 6s Plus supports up to 4K.
  • It also supports StreamingAssets, external storage, and streaming services.
  • Android streaming support list: http, HLS (HTTP Live Streaming), rtsp
  • iOS streaming support list: http, HLS (HTTP Live Streaming)
  • EasyMovieTexture requires Android 4.0 or above.
  • EasyMovieTexture requires iOS 6.0 or above.
  • Unity 4.X requires an iOS Pro license.
  • Unity 5.X does not require a Pro license.
  • Supports multithreaded rendering options. (Only supports Unity 5.X.)

This plugin seems to be maintained by an individual developer. There is no documentation, and while part of the Java source is included, the native code is not provided. We need a plugin with full source code so that we can customize it later.

Doing it ourselves

Based on the findings above, we decided to build a simple Unity player plugin of our own that meets our requirements. There were two hurdles to clear:

  • Find a suitable open-source player.
  • Work out how to map the video frames onto the surface of an object in Unity; this is the most critical part.

Finding reference material

The following post provided some material worth referring to.

How can an object's surface be used as the place where a video plays in Unity 3D, for example playing a video on a wall? (a question on Zhihu)

Finding an open-source player

We originally planned to use the VLC player, but a colleague found an open-source player that is used in production and has quite a few users: Bilibili's ijkplayer. A reply in the post above also happened to mention this player, so we decided to use it.

How to map the video frames

With no Unity development experience at all, I had to learn everything from scratch. One reply in the Zhihu thread suggested looking at the example inside OVR. I read through that code and also studied the source that ships with EasyMovieTexture (which only contains the Java code; the crucial native code is not included). I only half understood it, but after some experimenting it actually worked.

The most critical point can be summed up as follows:

Bind the texture ID of ijkplayer's AndroidSurfaceTexture and the texture ID of Unity's Texture2D to two different targets at the same time: the AndroidSurfaceTexture is bound to GL_TEXTURE_EXTERNAL_OES, while Unity's texture ID is bound to GL_TEXTURE_2D.
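To make this concrete from the Unity side, here is a minimal C# sketch of how the Texture2D's native ID can be handed to the native layer. The OVR_Media_Surface entry point appears in the native code later in this post; the plugin library name and the P/Invoke wrapper itself are assumptions rather than code from the original project.

using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class VideoSurfaceBridge : MonoBehaviour
{
    // Hypothetical P/Invoke wrapper for the native OVR_Media_Surface() shown
    // later in this post; the library name "OculusMediaSurface" is an assumption.
    [DllImport("OculusMediaSurface")]
    private static extern IntPtr OVR_Media_Surface(IntPtr texPtr, int width, int height);

    private Texture2D m_VideoTexture;
    private IntPtr m_AndroidSurface;   // jobject (android.view.Surface) returned by the native side

    public IntPtr CreateVideoSurface(int width, int height)
    {
        // Unity-side texture: its GL name is what the native code binds to GL_TEXTURE_2D.
        m_VideoTexture = new Texture2D(width, height, TextureFormat.RGB565, false);

        // Hand the GL texture name to the native plugin, which creates the
        // SurfaceTexture (bound to GL_TEXTURE_EXTERNAL_OES) and returns a Surface.
        m_AndroidSurface = OVR_Media_Surface(m_VideoTexture.GetNativeTexturePtr(), width, height);
        return m_AndroidSurface;
    }
}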

Walking through the whole flow

Initialization

  • Unity

On the Unity side, initialize a Texture2D whose texture ID will be used to display the video frames.

m_VideoTexture = new Texture2D (Call_GetVideoWidth (), Call_GetVideoHeight (), TextureFormat.RGB565, false);
  • OVR

Here we reuse the native code from OVR, which initializes the AndroidSurfaceTexture and looks up the related methods:

static const char * className = "android/graphics/SurfaceTexture";
    const jclass surfaceTextureClass = jni->FindClass(className);
    if ( surfaceTextureClass == 0 ) {
        FAIL( "FindClass( %s ) failed", className );
    }

    // find the constructor that takes an int
    const jmethodID constructor = jni->GetMethodID( surfaceTextureClass, "<init>", "(I)V" );
    if ( constructor == 0 ) {
        FAIL( "GetMethodID( <init> ) failed" );
    }

    jobject obj = jni->NewObject( surfaceTextureClass, constructor, textureId );
    if ( obj == 0 ) {
        FAIL( "NewObject() failed" );
    }

    javaObject = jni->NewGlobalRef( obj );
    if ( javaObject == 0 ) {
        FAIL( "NewGlobalRef() failed" );
    }

    // Now that we have a globalRef, we can free the localRef
    jni->DeleteLocalRef( obj );

    updateTexImageMethodId = jni->GetMethodID( surfaceTextureClass, "updateTexImage", "()V" );
    if ( !updateTexImageMethodId ) {
        FAIL( "couldn't get updateTexImageMethodId" );
    }

    getTimestampMethodId = jni->GetMethodID( surfaceTextureClass, "getTimestamp", "()J" );
    if ( !getTimestampMethodId ) {
        FAIL( "couldn't get getTimestampMethodId" );
    }

    setDefaultBufferSizeMethodId = jni->GetMethodID( surfaceTextureClass, "setDefaultBufferSize", "(II)V" );
    if ( !setDefaultBufferSizeMethodId ) {
        FAIL( "couldn't get setDefaultBufferSize" );
    }

    // jclass objects are localRefs that need to be freed
    jni->DeleteLocalRef( surfaceTextureClass );

Generate a texture ID and bind it to the GL_TEXTURE_EXTERNAL_OES target (this is the texture the AndroidSurfaceTexture renders into):

glGenTextures( 1, &textureId );
    glBindTexture( GL_TEXTURE_EXTERNAL_OES, textureId );
    glTexParameterf( GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameterf( GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameterf( GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
    glTexParameterf( GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
    glBindTexture( GL_TEXTURE_EXTERNAL_OES, 0 );

Pass Unity's texture ID into OVR, where it will be bound to the GL_TEXTURE_2D target:

jobject OVR_Media_Surface( void * texPtr, int const width, int const height )
{
    GLuint texId = (GLuint)(size_t)(texPtr);
    LOG( "OVR_Media_Surface(%i, %i, %i)", texId, width, height );
    return _msp.VideoSurface.Bind( texId, width, height );
}
  • Ijkplayer

Create a player. Note that we initialize it with the Surface created from the AndroidSurfaceTexture that OVR has already instantiated (see the C# sketch after this list for what the call might look like from Unity):

m_IjkMediaPlayer.setSurface(m_Surface);
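From C#, one way to hand the Surface returned by OVR_Media_Surface (see the sketch earlier) to ijkplayer is through Unity's AndroidJNI, because setSurface() expects an android.view.Surface rather than a managed object. This is a hedged sketch assuming the stock tv.danmaku.ijk.media.player.IjkMediaPlayer class; it is not the original project's code.

using System;
using UnityEngine;

public static class IjkPlayerSetup
{
    // androidSurface is the jobject returned by OVR_Media_Surface() in the
    // earlier sketch; dataSource is the URL of the video to play.
    public static AndroidJavaObject CreatePlayer(IntPtr androidSurface, string dataSource)
    {
        var player = new AndroidJavaObject("tv.danmaku.ijk.media.player.IjkMediaPlayer");

        // Call setSurface(android.view.Surface) with the raw jobject through AndroidJNI.
        IntPtr setSurface = AndroidJNI.GetMethodID(
            player.GetRawClass(), "setSurface", "(Landroid/view/Surface;)V");
        jvalue[] args = new jvalue[1];
        args[0].l = androidSurface;
        AndroidJNI.CallVoidMethod(player.GetRawObject(), setSurface, args);

        // Standard MediaPlayer-style lifecycle on the ijkplayer side.
        player.Call("setDataSource", dataSource);
        player.Call("prepareAsync");
        return player;
    }
}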

Refresh

The refresh is triggered by Unity's Update function and is ultimately carried out inside OVR. It first calls the AndroidSurfaceTexture's Update function and then does the texture binding: ijkplayer's texture ID is bound on every refresh, whereas Unity's texture ID is only rebound when the width or height of the video image changes.

void MediaSurface::Update()
{
    if ( !AndroidSurfaceTexture )
    {
        LOG( "!AndroidSurfaceTexture" );
        return;
    }
    if ( TexId <= 0 )
    {
        //LOG( "TexId <= 0" );
        return;
    }
    AndroidSurfaceTexture->Update();
    if ( AndroidSurfaceTexture->GetNanoTimeStamp() == LastSurfaceTexNanoTimeStamp )
    {
        //LOG( "No new surface!" );
        return;
    }
    LastSurfaceTexNanoTimeStamp = AndroidSurfaceTexture->GetNanoTimeStamp();

    // If the SurfaceTexture has changed dimensions, we need to
    // reallocate the texture and FBO.
    glActiveTexture( GL_TEXTURE0 );
    glBindTexture( GL_TEXTURE_EXTERNAL_OES, AndroidSurfaceTexture->GetTextureId() );
    if ( TexIdWidth != BoundWidth || TexIdHeight != BoundHeight )
    {
        LOG( "New surface size: %ix%i", BoundWidth, BoundHeight );

        TexIdWidth = BoundWidth;
        TexIdHeight = BoundHeight;

        if ( Fbo )
        {
            glDeleteFramebuffers( 1, &Fbo );
        }

        glActiveTexture( GL_TEXTURE1 );
        glBindTexture( GL_TEXTURE_2D, TexId );
        glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA,
                TexIdWidth, TexIdHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );
        glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

        glBindTexture( GL_TEXTURE_2D, 0 );
        glActiveTexture( GL_TEXTURE0 );

        glGenFramebuffers( 1, &Fbo );
        glBindFramebuffer( GL_FRAMEBUFFER, Fbo );
        glFramebufferTexture2D( GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                TexId, 0 );
        glBindFramebuffer( GL_FRAMEBUFFER, 0 );
    }
}
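On the Unity side, the usual way to make sure this native Update runs on the render thread (where a GL context is current) is GL.IssuePluginEvent. The event ID below and the assumption that the plugin registers MediaSurface::Update() as its render-event handler are mine, not something stated in the original post.

using UnityEngine;

public class VideoRefresh : MonoBehaviour
{
    // Hypothetical event ID that the native plugin's render-event handler
    // maps to MediaSurface::Update().
    private const int MEDIA_SURFACE_UPDATE_EVENT = 0;

    void Update()
    {
        // Queues the plugin event for Unity's render thread, where the
        // GL calls inside MediaSurface::Update() are valid.
        GL.IssuePluginEvent(MEDIA_SURFACE_UPDATE_EVENT);
    }
}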

The end result looks roughly like this: ijkplayer keeps driving the video forward, so the player's texture is refreshed continuously; each new frame is copied through the FBO set up above into the Unity texture, which is finally displayed on a Material in Unity.
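Displaying the frames is then just a matter of assigning the Texture2D created earlier to the material of whatever object should show the video; a minimal sketch:

using UnityEngine;

public class VideoScreen : MonoBehaviour
{
    public Texture2D m_VideoTexture;   // the texture created during initialization

    void Start()
    {
        // Show the video frames on this object's material, e.g. a quad,
        // or the inside of a sphere for 360-degree playback.
        GetComponent<Renderer>().material.mainTexture = m_VideoTexture;
    }
}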
