How Flutter Takes Photos and Records Video: A Source-Level Analysis

Preface

I've recently been reading Xianyu's 《Flutter技術解析與實戰》 (Flutter: Technical Analysis and Practice). I jumped straight into chapter two and got lost in the "capability enhancement" section — what on earth is a Texture? After a quick skim I went to the Flutter docs instead, worked through the camera example, and analyzed its source. Coming back to chapter two afterwards, it was genuinely easy going.

This article analyzes taking photos and recording video in the camera example (without digging into the Android side's interaction with the system camera APIs). Along the way we'll see how the plugins share images between native code and Flutter, and how native components are embedded in Flutter. At the end are links to the "capability enhancement — same-layer rendering based on external textures" part of chapter two of the Xianyu book.

References

Flutter技術解析與實戰 (Xianyu's technical evolution and practice), chapter two: capability enhancement

The camera example on the Flutter Chinese site

Plugin repositories

萬萬沒想到——Flutter外接紋理 (Flutter external textures)

Optimizing the rendering performance of Flutter external textures with shared memory — real-time rendering is no longer a dream

Contents

1. The Camera Example

The camera example on the Flutter Chinese site

Screenshot

The code matches the Chinese-site example. Note that camera initialization and several of its settings are asynchronous.

pubspec.yaml

dependencies:
  camera: ^0.5.2+2
  video_player: ^0.10.12+2
  path_provider: ^0.4.1

main.dart

List<CameraDescription> cameras;
void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  cameras = await availableCameras();
  runApp(MyApp());
}
class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: CameraHome(),
    );
  }
}

camera.dart

import 'dart:io';
​
import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:fluttertwo/main.dart';
import 'package:path_provider/path_provider.dart';
import 'package:video_player/video_player.dart';
​
class CameraHome extends StatefulWidget {
  @override
  _CameraHomeState createState() {
    return _CameraHomeState();
  }
}
​
class _CameraHomeState extends State<CameraHome> with WidgetsBindingObserver {
  CameraController controller;
  String imagePath; // path of the last captured photo
  String videoPath; // path of the last recorded video
  VideoPlayerController videoController;
  VoidCallback videoPlayerListener;
  bool enableAudio = true;
  final GlobalKey<ScaffoldState> _scaffoldKey = GlobalKey<ScaffoldState>();
​
  @override
  void initState() {
    super.initState();
    WidgetsBinding.instance.addObserver(this);
  }
​
  @override
  void dispose() {
    WidgetsBinding.instance.removeObserver(this);
    super.dispose();
  }
​
  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    // The app is no longer in the foreground.
    if (state == AppLifecycleState.inactive) {
      controller?.dispose();
    } else if (state == AppLifecycleState.resumed) {
      // The app is back in the foreground.
      if (controller != null) {
        onNewCameraSelected(controller.description);
      }
    }
  }
​
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      key: _scaffoldKey,
      appBar: AppBar(
        title: Text("Camera example"),
      ),
      body: Column(
        children: <Widget>[
          Expanded(
            child: Container(
              child: Padding(
                padding: EdgeInsets.all(1.0),
                child: Center(
                  child: _cameraPreviewWidget(),
                ),
              ),
              decoration: BoxDecoration(
                color: Colors.black,
                border: Border.all(
                  color: controller != null && controller.value.isRecordingVideo
                      ? Colors.redAccent
                      : Colors.grey,
                  width: 3.0,
                ),
              ),
            ),
          ),
          _captureControlRowWidget(),
          _toggleAudioWidget(),
          Padding(
            padding: EdgeInsets.all(5.0),
            child: Row(
              mainAxisAlignment: MainAxisAlignment.start,
              children: <Widget>[
                _cameraTogglesRowWidget(),
                _thumbnailWidget(),
              ],
            ),
          ),
        ],
      ),
    );
  }
​
  /// Thumbnail of the captured photo / recorded video
  Widget _thumbnailWidget() {
    return Expanded(
      child: Align(
        alignment: Alignment.centerRight,
        child: Row(
          mainAxisSize: MainAxisSize.min,
          children: <Widget>[
            videoController == null && imagePath == null
                ? Container()
                : SizedBox(
                    child: (videoController == null)
                        ? Image.file(File(imagePath), width: 64.0, height: 64.0)
                        : Container(
                            child: Center(
                              child: AspectRatio(
                                aspectRatio: videoController.value.size != null
                                    ? videoController.value.aspectRatio
                                    : 1.0,
                                child: VideoPlayer(videoController),
                              ),
                            ),
                            decoration: BoxDecoration(
                              border: Border.all(color: Colors.pink),
                            ),
                            width: 64.0,
                            height: 64.0,
                          ),
                  ),
          ],
        ),
      ),
    );
  }
​
  /// One toggle per detected camera
  Widget _cameraTogglesRowWidget() {
    final List<Widget> toggles = <Widget>[];

    if (cameras.isEmpty) {
      return Text("No camera detected");
    } else {
      for (CameraDescription cameraDescription in cameras) {
        toggles.add(SizedBox(
          width: 90.0,
          child: RadioListTile<CameraDescription>(
              title: Icon(getCameraLensIcon(cameraDescription.lensDirection)),
              groupValue: controller?.description,
              value: cameraDescription,
              onChanged: controller != null && controller.value.isRecordingVideo
                  ? null
                  : onNewCameraSelected),
        ));
      }
      return Row(
        children: toggles,
      );
    }
  }
​
  /// Toggle audio recording on or off
  Widget _toggleAudioWidget() {
    return Padding(
      padding: EdgeInsets.only(left: 25),
      child: Row(
        children: <Widget>[
          Text("Record audio"),
          Switch(
            value: enableAudio,
            onChanged: (value) {
              enableAudio = value;
              if (controller != null) {
                onNewCameraSelected(controller.description);
              }
            },
          ),
        ],
      ),
    );
  }
​
  /// Capture control bar
  Widget _captureControlRowWidget() {
    return Row(
      mainAxisAlignment: MainAxisAlignment.spaceEvenly, // evenly spaced
      mainAxisSize: MainAxisSize.max,
      children: <Widget>[
        IconButton(
          icon: Icon(Icons.camera_alt),
          color: Colors.blue,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  !controller.value.isRecordingVideo
              ? onTakePictureButtonPressed
              : null,
        ),
        IconButton(
          icon: Icon(Icons.videocam),
          color: Colors.blue,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  !controller.value.isRecordingVideo
              ? onVideoRecordButtonPressed
              : null,
        ),
        IconButton(
          icon: Icon(Icons.stop),
          color: Colors.red,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  controller.value.isRecordingVideo
              ? onStopButtonPressed
              : null,
        ),
      ],
    );
  }
​
  /// Start recording video
  void onVideoRecordButtonPressed() {
    startVideoRecording().then((value) {
      if (mounted) {
        setState(() {});
      }
      if (value != null) {
        showInSnackBar("Saving video to $value");
      }
    });
  }
​
  /// Stop recording video
  void onStopButtonPressed() {
    stopVideoRecording().then((value) {
      if (mounted) {
        setState(() {});
      }
      showInSnackBar("Video saved to: $videoPath");
    });
  }
​
  Future<void> stopVideoRecording() async {
    if (!controller.value.isRecordingVideo) {
      return null;
    }
    try {
      await controller.stopVideoRecording();
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    await _startVideoPlayer();
  }
​
  Future<void> _startVideoPlayer() async {
    final VideoPlayerController vcontroller =
        VideoPlayerController.file(File(videoPath));
    videoPlayerListener = () {
      if (videoController != null && videoController.value.size != null) {
        if (mounted) {
          setState(() {});
        }
        videoController.removeListener(videoPlayerListener);
      }
    };
    vcontroller.addListener(videoPlayerListener);
    await vcontroller.setLooping(true);
    await vcontroller.initialize();
    await videoController?.dispose();
    if (mounted) {
      setState(() {
        imagePath = null;
        videoController = vcontroller;
      });
    }
    await vcontroller.play();
  }
​
  Future<String> startVideoRecording() async {
    if (!controller.value.isInitialized) {
      showInSnackBar("Please select a camera first");
      return null;
    }
    // Work out where to save the video.
    final Directory extDir = await getApplicationDocumentsDirectory();
    final String dirPath = "${extDir.path}/Movies/flutter_test";
    await Directory(dirPath).create(recursive: true);
    final String filePath = "$dirPath/${timestamp()}.mp4";

    if (controller.value.isRecordingVideo) {
      return null; // already recording
    }
    }
    try {
      videoPath = filePath;
      await controller.startVideoRecording(filePath);
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    return filePath;
  }
​
  /// Shutter button callback
  void onTakePictureButtonPressed() {
    takePicture().then((value) {
      if (mounted) {
        setState(() {
          imagePath = value;
          videoController?.dispose();
          videoController = null;
        });
        if (value != null) {
          showInSnackBar('Picture saved to $value');
        }
      }
    });
  }
​
  Future<String> takePicture() async {
    if (!controller.value.isInitialized) {
      showInSnackBar("Error: please select a camera first");
      return null;
    }
    final Directory extDir = await getApplicationDocumentsDirectory();
    final String dirPath = '${extDir.path}/Movies/flutter_test';
    await Directory(dirPath).create(recursive: true);
    final String filePath = '$dirPath/${timestamp()}.jpg';
    if (controller.value.isTakingPicture) {
      return null;
    }
    try {
      await controller.takePicture(filePath);
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    return filePath;
  }
​
  String timestamp() => DateTime.now().millisecondsSinceEpoch.toString();
​
  /// Preview window
  Widget _cameraPreviewWidget() {
    if (controller == null || !controller.value.isInitialized) {
      return Text(
        "Select a camera",
        style: TextStyle(
          color: Colors.white,
          fontSize: 24.0,
          fontWeight: FontWeight.w900,
        ),
      );
    } else {
      // Size the child to the controller's aspect ratio.
      return AspectRatio(
        aspectRatio: controller.value.aspectRatio,
        child: CameraPreview(controller),
      );
    }
  }
​
  /// Callback when a different camera is selected
  void onNewCameraSelected(CameraDescription cameraDescription) async {
    if (controller != null) {
      await controller.dispose();
    }
    controller = CameraController(
      cameraDescription,
      ResolutionPreset.high,
      enableAudio: enableAudio,
    );
    controller.addListener(() {
      if (mounted) {
        setState(() {});
        if (controller.value.hasError) {
          showInSnackBar("Camera error ${controller.value.errorDescription}");
        }
      }
    });
    try {
      await controller.initialize();
    } on CameraException catch (e) {
      _showCameraException(e);
    }
  }
​
  _showCameraException(CameraException e) {
    logError(e.code, e.description);
    showInSnackBar("Error: ${e.code}\n${e.description}");
  }
​
  showInSnackBar(String message) {
    _scaffoldKey.currentState.showSnackBar(SnackBar(
      content: Text(message),
    ));
  }
}
​
/// Icon for a camera lens direction (front, back, external)
IconData getCameraLensIcon(CameraLensDirection direction) {
  switch (direction) {
    case CameraLensDirection.back:
      return Icons.camera_rear;
    case CameraLensDirection.front:
      return Icons.camera_front;
    case CameraLensDirection.external:
      return Icons.camera;
  }
  throw ArgumentError("Unknown lens direction");
}
​
void logError(String code, String message) =>
    print('Error: $code\nError Message: $message');
​

2. How the Flutter Camera Works

From the code above we can see that once the camera and video_player plugins are added (plugin repositories linked above), the Flutter side only needs the following key calls to take a photo and play back a recorded video:

// Take a photo
await controller.takePicture(filePath);

// Play back the recorded video
final VideoPlayerController vcontroller = VideoPlayerController.file(File(videoPath));
await vcontroller.play();

1) Taking a Photo

Let's go straight to the **takePicture** method that CameraController calls:

Future<void> takePicture(String path) async {
    if (!value.isInitialized || _isDisposed) {
      throw CameraException(
        'Uninitialized CameraController.',
        'takePicture was called on uninitialized CameraController',
      );
    }
    if (value.isTakingPicture) {
      throw CameraException(
        'Previous capture has not returned yet.',
        'takePicture was called before the previous capture returned.',
      );
    }
    try {
      // Note 1
      value = value.copyWith(isTakingPicture: true);
      await _channel.invokeMethod<void>(
        'takePicture',
        <String, dynamic>{'textureId': _textureId, 'path': path},
      );
      value = value.copyWith(isTakingPicture: false);
    } on PlatformException catch (e) {
      value = value.copyWith(isTakingPicture: false);
      throw CameraException(e.code, e.message);
    }
  }

Ignoring the rest of the code for now, the key line is at Note 1: a call to MethodChannel's invokeMethod. If that's unfamiliar, see my earlier post Flutter與Native通訊示例及源碼分析 (Flutter–native communication: examples and source analysis). Here is how the MethodChannel is constructed:

final MethodChannel _channel = const MethodChannel('plugins.flutter.io/camera');

Next we look for the matching MethodChannel in the camera plugin's android or ios module — Android here. A global search turns up the following in the android module's MethodCallHandlerImpl:

methodChannel = new MethodChannel(messenger, "plugins.flutter.io/camera");
imageStreamChannel = new EventChannel(messenger, "plugins.flutter.io/camera/imageStream");

That settles it: the Flutter-side call await _channel.invokeMethod('takePicture', <String, dynamic>{'textureId': _textureId, 'path': path}) ends up in the Android side's onMethodCall method:

@Override
  public void onMethodCall(@NonNull MethodCall call, @NonNull final Result result) {
    switch (call.method) {
      case "availableCameras": // called at the start of the example to list the cameras
        try {
          result.success(CameraUtils.getAvailableCameras(activity));
        } catch (Exception e) {
          handleException(e, result);
        }
        break;
      case "initialize":
        {
          if (camera != null) {
            camera.close();
          }
          cameraPermissions.requestPermissions(
              activity,
              permissionsRegistry,
              call.argument("enableAudio"),
              (String errCode, String errDesc) -> {
                if (errCode == null) {
                  try {
                    // Note 1
                    instantiateCamera(call, result);
                  } catch (Exception e) {
                    handleException(e, result);
                  }
                } else {
                  result.error(errCode, errDesc, null);
                }
              });
​
          break;
        }
      case "takePicture":
        {
          // Note 2
          camera.takePicture(call.argument("path"), result);
          break;
        }
      case "prepareForVideoRecording":
        {
          // This optimization is not required for Android.
          result.success(null);
          break;
        }
      case "startVideoRecording":
....
      case "stopVideoRecording":
...
      case "pauseVideoRecording":
...
      case "resumeVideoRecording":
...
      case "startImageStream":
...
      case "stopImageStream":
...
      case "dispose":
     ...
      default:
        result.notImplemented();
        break;
    }
  }
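Stripped of the camera specifics, onMethodCall is just a router: it maps an incoming method name plus an argument map to a handler and replies through the Result. A minimal, platform-free sketch of that pattern (the handler names and return values below are invented for illustration, not the plugin's real behavior):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Toy method-channel dispatcher. The real plugin does the same job with a
// switch over call.method and replies via result.success / result.error /
// result.notImplemented.
public class MethodDispatcher {
    private static final Map<String, Function<Map<String, Object>, Object>> handlers =
        new HashMap<>();

    static {
        // Hypothetical handlers standing in for the plugin's real cases.
        handlers.put("availableCameras", args -> "camera-list");
        handlers.put("takePicture", args -> "saved:" + args.get("path"));
    }

    public static Object dispatch(String method, Map<String, Object> args) {
        Function<Map<String, Object>, Object> handler = handlers.get(method);
        if (handler == null) {
            return "notImplemented"; // mirrors result.notImplemented()
        }
        return handler.apply(args);
    }
}
```

The point is that the channel name ("plugins.flutter.io/camera") selects the handler object, and the method string selects the branch inside it — exactly the two lookups the switch above performs.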

There are a lot of cases here, and we won't analyze them all — just Note 1, instantiateCamera(call, result), and Note 2, takePicture. At Note 1 you might wonder whether taking a photo ever triggers it. It does: in the example, selecting a camera runs the initialization, i.e. the method at Note 1.

instantiateCamera looks like this:

private void instantiateCamera(MethodCall call, Result result) throws CameraAccessException {
    String cameraName = call.argument("cameraName");
    String resolutionPreset = call.argument("resolutionPreset");
    boolean enableAudio = call.argument("enableAudio");
    TextureRegistry.SurfaceTextureEntry flutterSurfaceTexture =
        textureRegistry.createSurfaceTexture();
    DartMessenger dartMessenger = new DartMessenger(messenger, flutterSurfaceTexture.id());
    camera = new Camera(activity,flutterSurfaceTexture,dartMessenger,cameraName,resolutionPreset,
            enableAudio);
​
    camera.open(result);
  }

This just reads the settings the Flutter side configured for the camera; the key lines are these two:

DartMessenger dartMessenger = new DartMessenger(messenger, flutterSurfaceTexture.id());
camera.open(result);

DartMessenger is used to talk back to the Flutter side — for example, to the CameraController class on the Flutter end.
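As a rough sketch of the idea (class names, event keys, and the channel-name format below are assumptions for illustration, not the plugin's actual API), a DartMessenger-style helper simply packages camera events into maps addressed to one camera's texture id:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: the real DartMessenger is bound to a channel derived from
// the camera's texture id and sends event maps like these to the Dart side.
public class ToyDartMessenger {
    // Hypothetical per-camera event channel name.
    public static String channelNameFor(long textureId) {
        return "flutter.io/cameraPlugin/cameraEvents" + textureId;
    }

    public static Map<String, Object> cameraClosingEvent() {
        Map<String, Object> event = new HashMap<>();
        event.put("eventType", "camera_closing");
        return event;
    }

    public static Map<String, Object> errorEvent(String description) {
        Map<String, Object> event = new HashMap<>();
        event.put("eventType", "error");
        event.put("errorDescription", description);
        return event;
    }
}
```

On the Flutter side, CameraController listens on the matching channel and turns these maps into state changes (e.g. the hasError / errorDescription values we saw in onNewCameraSelected).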

So what does camera.open do internally?

public void open(@NonNull final Result result) throws CameraAccessException {
    pictureImageReader =
        ImageReader.newInstance(
            captureSize.getWidth(), captureSize.getHeight(), ImageFormat.JPEG, 2);
​
    // Used to stream image byte data to the Dart side.
    imageStreamReader =
        ImageReader.newInstance(previewSize.getWidth(), previewSize.getHeight(), ImageFormat.YUV_420_888, 2);
​
    cameraManager.openCamera(
        cameraName,
        new CameraDevice.StateCallback() {
          @Override
          public void onOpened(@NonNull CameraDevice device) {
            cameraDevice = device;
            try {
              startPreview();
            } catch (CameraAccessException e) {
              result.error("CameraAccess", e.getMessage(), null);
              close();
              return;
            }
            Map<String, Object> reply = new HashMap<>();
            reply.put("textureId", flutterTexture.id());
            reply.put("previewWidth", previewSize.getWidth());
            reply.put("previewHeight", previewSize.getHeight());
            result.success(reply);
          }
​
          @Override
          public void onClosed(@NonNull CameraDevice camera) {
            dartMessenger.sendCameraClosingEvent();
            super.onClosed(camera);
          }
​
        .......
        }

Pay attention to the onOpened callback passed to openCamera above. Once the camera opens, it first calls startPreview to start the preview, then sends data back to the Flutter side, including textureId, previewWidth and previewHeight. This is important — keep it in mind; it comes up again when we analyze how the camera view gets displayed. Starting the preview is not so simple, though. Internally it calls:

public void startPreview() throws CameraAccessException {
    // Note 1
    createCaptureSession(CameraDevice.TEMPLATE_PREVIEW, pictureImageReader.getSurface());
  }
private void createCaptureSession(
      int templateType, Runnable onSuccessCallback, Surface... surfaces)
      throws CameraAccessException {
    // Close any existing capture session.
    closeCaptureSession();
​
    // Create a new capture builder.
    captureRequestBuilder = cameraDevice.createCaptureRequest(templateType);
​
    // Build Flutter surface to render to
    SurfaceTexture surfaceTexture = flutterTexture.surfaceTexture();
    surfaceTexture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
    Surface flutterSurface = new Surface(surfaceTexture);
    captureRequestBuilder.addTarget(flutterSurface);
​
    List<Surface> remainingSurfaces = Arrays.asList(surfaces);
   ......
    // Note 4
    // Collect all surfaces we want to render to.
    List<Surface> surfaceList = new ArrayList<>();
    surfaceList.add(flutterSurface);
    surfaceList.addAll(remainingSurfaces);
    // Start the session
    cameraDevice.createCaptureSession(surfaceList, callback, null);
  }

At Note 1, pictureImageReader is an ImageReader, a class that provides direct access to image data rendered into a Surface.

At Note 4, all the Surfaces are collected and handed to the CameraDevice, which CameraManager delivered via the openCamera callback. A CameraDevice is the abstract representation of a single camera attached to an Android device (see CameraDeviceImpl for the concrete implementation; audio/video developers will know it well, so I won't go deeper). The three parameters of createCaptureSession are: the set of output Surfaces for each CaptureRequest, the callback for session creation, and the handler thread the callback runs on.

Now for the takePicture method:

camera.takePicture(call.argument("path"), result); // first argument: the file path passed from Flutter; second: the Result used to send the reply

Note that this camera object is of type io.flutter.plugins.camera.Camera; its takePicture method is:

public void takePicture(String filePath, @NonNull final Result result) {
  ......
​
    pictureImageReader.setOnImageAvailableListener(
        reader -> {
          try (Image image = reader.acquireLatestImage()) {
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            writeToFile(buffer, file);
            result.success(null);
  ......
  }
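The write path above is short: grab the latest Image from pictureImageReader, take plane 0's ByteBuffer (for ImageFormat.JPEG that buffer is the complete JPEG), and dump it to the path Flutter passed over the channel. Android's ImageReader isn't available off-device, so in this sketch a plain ByteBuffer stands in for image.getPlanes()[0].getBuffer():

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

// Sketch of the plugin's writeToFile step, with the Android pieces mocked out.
public class JpegWriter {
    public static void writeToFile(ByteBuffer buffer, Path file) {
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes); // drain the plane buffer into a heap array
        try {
            Files.write(file, bytes); // persist to the path Flutter supplied
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Self-check: write three fake "JPEG" bytes and read them back.
    public static boolean roundTripDemo() {
        try {
            Path tmp = Files.createTempFile("toy_pic", ".jpg");
            writeToFile(ByteBuffer.wrap(new byte[]{1, 2, 3}), tmp);
            return Arrays.equals(Files.readAllBytes(tmp), new byte[]{1, 2, 3});
        } catch (IOException e) {
            return false;
        }
    }
}
```

Once the bytes are on disk, result.success(null) tells the Flutter side the file at filePath is ready to load.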

To display the photo on the Flutter side, just use Image.file.

One last thing remains to analyze: how the camera feed is displayed on the Flutter side, i.e. the top half of the screenshot.

In the example, calling CameraPreview(controller) is all it takes:

class CameraPreview extends StatelessWidget {
  const CameraPreview(this.controller);
​
  final CameraController controller;
​
  @override
  Widget build(BuildContext context) {
    return controller.value.isInitialized
        ? Texture(textureId: controller._textureId)
        : Container();
  }
}

The key display code is Texture(textureId: controller._textureId). So where does controller._textureId come from?

Future<void> initialize() async {
......
      final Map<String, dynamic> reply =
          await _channel.invokeMapMethod<String, dynamic>(
        'initialize',
        <String, dynamic>{
          'cameraName': description.name,
          'resolutionPreset': serializeResolutionPreset(resolutionPreset),
          'enableAudio': enableAudio,
        },
      );
      _textureId = reply['textureId'];
.....
    return _creatingCompleter.future;
  }

As mentioned earlier, the textureId is sent back from the Android side to the Flutter side when the camera is initialized. So what is it for? Let's look at the Texture Dart class:

class Texture extends LeafRenderObjectWidget {
  /// Creates a widget backed by the texture identified by [textureId].
  const Texture({
    Key key,
    @required this.textureId,
  }) : assert(textureId != null),
       super(key: key);
​
  /// The identity of the backend texture.
  final int textureId;
​
  @override
  TextureBox createRenderObject(BuildContext context) => TextureBox(textureId: textureId);
​
  @override
  void updateRenderObject(BuildContext context, TextureBox renderObject) {
    renderObject.textureId = textureId;
  }
}
​

Texture has very little code; it extends LeafRenderObjectWidget. This touches on custom render objects — the Flutter Chinese site's custom-component docs are a good primer. When a Texture is built, createRenderObject is always invoked, so let's look at the TextureBox it returns:

class TextureBox extends RenderBox {
  TextureBox({ @required int textureId })
    : assert(textureId != null),
      _textureId = textureId;

  int _textureId;
  set textureId(int value) {
    assert(value != null);
    if (value != _textureId) {
      _textureId = value;
      markNeedsPaint();
    }
  }

  @override
  void paint(PaintingContext context, Offset offset) {
    if (_textureId == null)
      return;
    context.addLayer(TextureLayer(
      rect: Rect.fromLTWH(offset.dx, offset.dy, size.width, size.height),
      textureId: _textureId,
    ));
  }
}

In paint, it wraps the texture id in a TextureLayer and adds it to the PaintingContext. By now we can more or less guess: the textureId is the Flutter side's handle for fetching the image rendered natively.
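That handshake can be modeled in a few lines (the names below are illustrative; the real pieces are TextureRegistry on the engine side and TextureLayer in the framework): the native side registers a SurfaceTexture and receives an id, and the Texture(textureId: ...) widget later hands that same id to the compositor, which resolves it back to the native image at paint time.

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of the external-texture id handshake. Strings stand in for the
// GPU-backed frames that a real SurfaceTexture would hold.
public class ToyTextureRegistry {
    private long nextId = 0;
    private final Map<Long, String> frames = new HashMap<>(); // id -> fake frame

    // Stands in for textureRegistry.createSurfaceTexture(): hand out an id.
    public long register(String frame) {
        long id = nextId++;
        frames.put(id, frame);
        return id;
    }

    // What TextureLayer conceptually does when the frame is composited.
    public String resolve(long textureId) {
        return frames.get(textureId);
    }
}
```

The design point: no pixel data ever crosses the platform channel — only the small integer id does, and the engine composites the shared texture directly. That is why a live camera preview is feasible at all.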

2) Video

VideoPlayerController plays the same role as CameraController — one drives video playback, the other the camera. Its textureId is likewise obtained during initialize, except that VideoPlayerController gets it through _videoPlayerPlatform.create(dataSourceDescription). The rough flow is:

//1
_textureId = await _videoPlayerPlatform.create(dataSourceDescription);
//2  _videoPlayerPlatform 
final VideoPlayerPlatform _videoPlayerPlatform = VideoPlayerPlatform.instance..init();
//3 instance -> _instance 
static VideoPlayerPlatform _instance = MethodChannelVideoPlayer();
//4
  @override
  Future<int> create(DataSource dataSource) async {
    .....
    TextureMessage response = await _api.create(message);
    return response.textureId;
  }
  // 5. the type of _api
  VideoPlayerApi _api = VideoPlayerApi();
  // 6. _api's create method
    Future<TextureMessage> create(CreateMessage arg) async {
    final Map<dynamic, dynamic> requestMap = arg._toMap();
    const BasicMessageChannel<dynamic> channel = BasicMessageChannel<dynamic>(
        'dev.flutter.pigeon.VideoPlayerApi.create', StandardMessageCodec());
​
    final Map<dynamic, dynamic> replyMap = await channel.send(requestMap);
    if (replyMap == null) {
      throw PlatformException(
          code: 'channel-error',
          message: 'Unable to establish connection on channel.',
          details: null);
    } else if (replyMap['error'] != null) {
      final Map<dynamic, dynamic> error = replyMap['error'];
      throw PlatformException(
          code: error['code'],
          message: error['message'],
          details: error['details']);
    } else {
      return TextureMessage._fromMap(replyMap['result']);
    }
  }

Here a BasicMessageChannel is used for communication, with the payload encoded and decoded by a StandardMessageCodec.
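To make the round trip concrete, here is a toy stand-in for that codec: a type-tagged binary encoding for String and Integer map values. The real StandardMessageCodec supports many more types and uses a different wire format; this only shows the shape of what BasicMessageChannel does on each send and reply.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified, illustrative message codec: count, then (key, type tag, value).
public class ToyCodec {
    private static final byte T_STRING = 1;
    private static final byte T_INT = 2;

    public static byte[] encode(Map<String, Object> msg) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeInt(msg.size());
            for (Map.Entry<String, Object> e : msg.entrySet()) {
                out.writeUTF(e.getKey());
                if (e.getValue() instanceof Integer) {
                    out.writeByte(T_INT);
                    out.writeInt((Integer) e.getValue());
                } else {
                    out.writeByte(T_STRING);
                    out.writeUTF(e.getValue().toString());
                }
            }
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static Map<String, Object> decode(byte[] bytes) {
        try {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
            int n = in.readInt();
            Map<String, Object> msg = new LinkedHashMap<>();
            for (int i = 0; i < n; i++) {
                String key = in.readUTF();
                byte tag = in.readByte();
                msg.put(key, tag == T_INT ? (Object) in.readInt() : in.readUTF());
            }
            return msg;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

In the create call above, the request map carries the data-source description one way and the reply map carries the textureId back — both passing through exactly this kind of encode/decode pair.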

大致 textureId 獲取流程就是這樣,其實和拍照 相似,殊途同歸。剩下的這裏不分析了,能夠去文章參考資料中的插件去下載,看看。

3. The Capability-Enhancement Part of Flutter技術解析與實戰

I originally planned to paste the book content here, but Xianyu's tech team has since published it on Zhihu with better formatting than CSDN, so here are the links instead.

萬萬沒想到——Flutter外接紋理 (Flutter external textures)

經過共享內存優化flutter外接紋理的渲染性能,實時渲染不是夢 (Optimizing Flutter external-texture rendering with shared memory)
