Building a Flutter Playback Plugin on Top of a Native Player (Android)


Flutter officially ships a series of plugins that expose many native, system-level APIs to Flutter, including sensors, file I/O, databases, lightweight storage, and so on. Most of these plugins are implemented through MethodChannel and EventChannel communication between the native side and Dart. Google also provides video_player, a plugin dedicated to video playback. Its principle is that the native side supplies the player capability, the Dart layer drives it through a Channel, and the two sides communicate to control playback. But how does the picture get rendered onto the FlutterView (which on Android extends SurfaceView)? Before answering that, let's first review how views on Android get rendered to the screen.

On Android, the service responsible for compositing the UI is SurfaceFlinger. SurfaceFlinger runs in the Android System process and manages the system's frame buffer. For an application to get its own UI drawn into the system frame buffer, it must communicate with SurfaceFlinger; since applications and SurfaceFlinger run in different processes, they use the Binder inter-process communication mechanism to do so.

When an Android application asks SurfaceFlinger to draw its UI, it needs to hand the service a large amount of UI-related data (the region and position to draw, and so on). An application may have many Windows, each with its own UI metadata. To keep client/server communication efficient, each client uses Android's Anonymous Shared Memory mechanism (ashmem) to set up a block of memory shared with the SurfaceFlinger process.

That shared memory is wrapped as a SharedClient. A SharedClient stores SharedBufferStacks, each of which can loosely be thought of as corresponding to a Surface.

An Android application draws its UI through a Canvas into the frame buffer, and that Canvas belongs to the application's Surface, so SurfaceFlinger can read the frame buffer's data and composite the UI onto the screen.

In short, Android draws UI through the SurfaceFlinger service, while a Flutter application's FlutterView (a SurfaceView) hands the rendering work over to Flutter: Dart assembles a LayerTree, which is rasterized by Skia (SurfaceFlinger's own low-level 2D drawing likewise uses the open-source Skia).

其實咱們能夠發現Android的繪製流程和Flutter的回執流程大致至關,可是否能夠在Dart層將繪製能力返還給Android呢(Java),因爲Surface持有Canvas,咱們只要可以獲得Surface就能夠將Surface交由播放器繪製了,事實上FlutterSDK也提供了相關的API。下面咱們來用一個簡單的例子看下如何實現。dom

First, create a FlutterPlugin: MySurfaceTestPlugin

package com.example.darttest;

import android.annotation.TargetApi;
import android.graphics.Canvas;
import android.graphics.Color;
import android.os.Build;
import android.view.Surface;

import java.util.HashMap;
import java.util.Map;
import java.util.Random;

import io.flutter.plugin.common.MethodCall;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.plugin.common.PluginRegistry.Registrar;
import io.flutter.view.TextureRegistry;

public class MySurfaceTestPlugin implements MethodChannel.MethodCallHandler {
    private final Registrar flutterRegistrar;
    private TextureRegistry textureRegistry;
    private TextureRegistry.SurfaceTextureEntry surfaceTextureEntry;
    private Surface surface;
    Random random;

    @TargetApi(Build.VERSION_CODES.ICE_CREAM_SANDWICH)
    @Override
    public void onMethodCall(MethodCall methodCall, MethodChannel.Result result) {

        String method = methodCall.method;

        switch (method) {
            
            case "init":
                //Registrar#textures() returns the TextureRegistry, a registry of every SurfaceTexture in the current Flutter app;
                //it can also be used to create a new SurfaceTexture
                textureRegistry = flutterRegistrar.textures();
                surfaceTextureEntry = textureRegistry.createSurfaceTexture();
                //create a Surface backed by the SurfaceTexture we just created
                surface = new Surface(surfaceTextureEntry.surfaceTexture());
                Map<String, Long> reply = new HashMap<>();
                long textureId = surfaceTextureEntry.id();
                //return the textureId to Flutter, which uses it to build a Texture widget
                reply.put("textureId", textureId);
                result.success(reply);
                break;

            case "render":
                Canvas canvas = surface.lockCanvas(null);
                //the canvas here is only 1px wide and high: the Surface is not created/managed by a SurfaceView, so we can't draw real content on it, but we can still fill a background color
                //int height = canvas.getHeight();
                //int width = canvas.getWidth();
                canvas.drawColor(Color.argb(255, random.nextInt(255), random.nextInt(255), random.nextInt(255)));
                surface.unlockCanvasAndPost(canvas);
                result.success(null);
                break;
                
            default:
                break;
        }
    }

    public static void registerWith(Registrar registrar) {
        final MethodChannel channel = new MethodChannel(registrar.messenger(), "flutter.io/SurfaceTest");
        channel.setMethodCallHandler(new MySurfaceTestPlugin(registrar));
    }


    private MySurfaceTestPlugin(Registrar registrar) {
        this.flutterRegistrar = registrar;
        random = new Random();

    }


}



As you can see, we obtained a SurfaceTexture through the APIs the Flutter SDK provides and constructed a new Surface from it. The official documentation describes such a Surface as follows:

A Surface created from a SurfaceTexture object can be used as an output destination for the following APIs: android.hardware.camera2, MediaCodec, MediaPlayer, and Allocation.

In other words, the Surface can be handed directly to MediaPlayer, or to the lower-level MediaCodec, as an output target. The canvas obtained via **Surface.lockCanvas(Rect dirty)** on a Surface created with **new Surface(SurfaceTexture texture)** is only 1px, so a Surface created this way is clearly not meant for us to draw on ourselves, but to serve as output for MediaCodec or MediaPlayer. Still, we can drawColor a background to confirm that the Java layer and Dart are connected.
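Before moving on to ExoPlayer below, here is a hedged sketch (my addition, not from the article) of what handing this same Surface to the platform's built-in android.media.MediaPlayer could look like; the data-source URL is a placeholder and error handling is omitted:

```java
import android.media.MediaPlayer;
import android.view.Surface;

import java.io.IOException;

public class MediaPlayerExample {
    private MediaPlayer mediaPlayer;

    //hand the SurfaceTexture-backed Surface to MediaPlayer as its video output
    public void play(Surface surface, String url) throws IOException {
        mediaPlayer = new MediaPlayer();
        mediaPlayer.setDataSource(url);                  //a file path or http(s) URL
        mediaPlayer.setSurface(surface);                 //decoded frames go to our SurfaceTexture
        mediaPlayer.setOnPreparedListener(mp -> mp.start());
        mediaPlayer.prepareAsync();                      //prepare without blocking the main thread
    }
}
```

The key call is setSurface: any player that can render to a Surface, not just ExoPlayer, can feed frames into the Flutter Texture this way.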

Next, register the plugin:

MySurfaceTestPlugin.registerWith(getFlutterView().getPluginRegistry().registrarFor("flutter.io/SurfaceTest"));

Once registered, we can use the plugin from Dart. Let's build a simple Dart app: on top is the "playback window" rendered by the Java layer, and below it a button that triggers rendering.

import 'dart:async';

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

main() async {
  await _initVideoPlugin();
  runApp(new MyApp());
}

final MethodChannel _channel = const MethodChannel('flutter.io/SurfaceTest');
int _textureId;

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return new MaterialApp(
      title: 'Test',
      home: new Scaffold(
          body: Column(
        children: <Widget>[
          AspectRatio(child: new VideoView(), aspectRatio: 16 / 9),
          Expanded(
              child: Center(
                  child: new FlatButton(
                      padding: EdgeInsets.only(
                          left: 25.0, right: 25.0, top: 15.0, bottom: 15.0),
                      onPressed: _render,
                      child: new Text("render"))))
        ],
      )
          ),
    );
  }
}

Future<void> _render() async {
  _channel.invokeMethod(
    'render',
    <String, dynamic>{},
  );
}

class VideoView extends StatefulWidget {
  @override
  State createState() {
    return new VideoState();
  }
}

_initVideoPlugin() async {
  final Map<dynamic, dynamic> response = await _channel.invokeMethod('init');
  _textureId = response['textureId'];
}

class VideoState extends State<VideoView> {
  @override
  Widget build(BuildContext context) {
	//note: Texture is a widget provided by Flutter; its parameter must be the textureId passed back from the Java plugin
    return new Texture(textureId: _textureId);
  }
}


Note that the playback window is actually the Dart Texture widget, and this Texture corresponds one-to-one with a SurfaceTexture managed by the textureRegistry we obtained through flutterRegistrar.textures() in the plugin. Run it and take a look.

As you can see, the background drawn by the Java layer is rendered to the screen. Now all that remains is to hand the Surface to a player to get video playback. Like the official video_player, which uses ExoPlayer, we'll use ExoPlayer to quickly implement playback.

            case "init":
                //Registrar#textures() returns the TextureRegistry, a registry of every SurfaceTexture in the current Flutter app;
                //it can also be used to create a new SurfaceTexture
                textureRegistry = flutterRegistrar.textures();
                surfaceTextureEntry = textureRegistry.createSurfaceTexture();
                //create a Surface backed by the SurfaceTexture we just created
                surface = new Surface(surfaceTextureEntry.surfaceTexture());
                initPlayer();
                Map<String, Long> reply = new HashMap<>();
                long textureId = surfaceTextureEntry.id();
                //return the textureId to Flutter, which uses it to build a Texture widget
                reply.put("textureId", textureId);
                result.success(reply);
                break;

            case "render":
                //Canvas canvas = surface.lockCanvas(null);
                //canvas.drawColor(Color.argb(255, random.nextInt(255), random.nextInt(255), random.nextInt(255)));
                //surface.unlockCanvasAndPost(canvas);
                exoPlayer.setPlayWhenReady(true);
                exoPlayer.setRepeatMode(REPEAT_MODE_ALL);
                result.success(null);
                break;

Here, we call initPlayer() inside init, which hands the Surface straight to the player, and then start playback in render:

private void initPlayer() {
        DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(flutterRegistrar.activeContext(), "ExoPlayer");
        MediaSource mediaSource = new ExtractorMediaSource(Uri.parse("asset:///flutter_assets/assets/Butterfly.mp4"), dataSourceFactory, new DefaultExtractorsFactory(), null, null);
        exoPlayer = ExoPlayerFactory.newSimpleInstance(flutterRegistrar.activeContext(), new DefaultTrackSelector());
        exoPlayer.setVideoSurface(surface);
        exoPlayer.prepare(mediaSource);
    }
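One thing the snippets above leave out is teardown. Below is a hedged sketch (my addition; the "dispose" method name and trigger are assumptions, not from the article) of a cleanup branch for the same switch statement: the player, the Surface, and the SurfaceTextureEntry should all be released so the texture is unregistered from Flutter and codec resources are freed.

```java
            case "dispose":
                //hypothetical cleanup branch (not in the original article)
                if (exoPlayer != null) {
                    exoPlayer.release();           //stop playback and free codec resources
                    exoPlayer = null;
                }
                if (surface != null) {
                    surface.release();             //release the Surface wrapping the SurfaceTexture
                    surface = null;
                }
                if (surfaceTextureEntry != null) {
                    surfaceTextureEntry.release(); //unregister the texture from Flutter's TextureRegistry
                    surfaceTextureEntry = null;
                }
                result.success(null);
                break;
```

Without the SurfaceTextureEntry release, the Flutter engine keeps the texture registered and its GPU memory alive even after the Dart widget is gone.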

OK, now let's run it and see the result.

References:

source.android.com/devices/gra…
blog.csdn.net/luoshengyan…
