1. CallActivity#onCreate runs startCall to connect to, or create, a room.
2. WebSocketClient#connectToRoom makes a single request to the server.
3. The callback arrives at CallActivity#onConnectedToRoom, which starts building the peer connection, passing in the video capturer, the local and remote VideoSinks, and the related parameters (a sketch follows this list).
localProxyVideoSink proxies the local video renderer.
remoteSinks proxies the remote video renderers; here it is a collection.
videoCapturer is the local video capturer.
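For orientation, here is a minimal sketch of that hand-off, modeled on the AppRTC demo's onConnectedToRoomInternal (treat the exact field names and signature, e.g. signalingParameters, as assumptions):

```java
// Sketch (AppRTC demo): hand the capturer and sinks to PeerConnectionClient.
VideoCapturer videoCapturer = null;
if (peerConnectionParameters.videoCallEnabled) {
  videoCapturer = createVideoCapturer(); // analyzed in detail below
}
peerConnectionClient.createPeerConnection(
    localProxyVideoSink, remoteSinks, videoCapturer, signalingParameters);
```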
4. PeerConnectionClient#createPeerConnectionInternal creates the PeerConnection object and the video track (a sketch follows this list).
factory was created back in CallActivity#onCreate.
pcObserver is a peer-connection observer used for callbacks from the native layer.
If video is enabled, the locally captured data is added to the track (done by the C++ layer).
For remote data, the remote video track is obtained via getRemoteVideoTrack (which calls down into C++), and the remoteSinks passed in earlier are attached to it.
The audio track is added here as well.
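The local half of that wiring looks roughly like the following sketch of the demo's PeerConnectionClient#createVideoTrack (VIDEO_TRACK_ID and the exact calls are assumptions based on the AppRTC sources):

```java
// Sketch: turn the capturer into a local video track and feed the preview sink.
videoSource = factory.createVideoSource(videoCapturer.isScreencast());
videoCapturer.initialize(surfaceTextureHelper, appContext, videoSource.getCapturerObserver());
videoCapturer.startCapture(videoWidth, videoHeight, videoFps);

localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
localVideoTrack.setEnabled(true);
localVideoTrack.addSink(localProxyVideoSink); // local preview shows up here
```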
At this point the AppRTC demo call flow is essentially complete, and the phone can already preview video.
Next, let's dig into where this VideoCapturer actually comes from.
First, a reference: 「webrtc源碼分析之視頻採集之一」 (a source-code analysis of WebRTC video capture, part one).
1. Back to the starting point: CallActivity#onConnectedToRoomInternal.
2. CallActivity#createVideoCapturer.
WebRTC exposes three main video-capture implementations:
ScreenCapturerAndroid: captures the screen as the video source.
FileVideoCapturer: captures a file as the video source.
CameraCapturer: captures the camera as the video source; this in turn splits into Camera1Capturer and Camera2Capturer.
Whichever is chosen, the method just has to return a VideoCapturer object (a sketch follows).
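A condensed sketch of that selection logic, modeled on CallActivity#createVideoCapturer in the demo (helper names such as screencaptureEnabled, useCamera2(), and captureToTexture() follow the demo sources; treat the details as assumptions):

```java
// Sketch: pick a VideoCapturer depending on how the call was started.
private @Nullable VideoCapturer createVideoCapturer() {
  final VideoCapturer videoCapturer;
  String videoFileAsCamera = getIntent().getStringExtra(EXTRA_VIDEO_FILE_AS_CAMERA);
  if (videoFileAsCamera != null) {
    try {
      videoCapturer = new FileVideoCapturer(videoFileAsCamera); // file source
    } catch (IOException e) {
      reportError("Failed to open video file for emulated camera");
      return null;
    }
  } else if (screencaptureEnabled) {
    return createScreenCapturer(); // wraps ScreenCapturerAndroid
  } else if (useCamera2()) {
    videoCapturer = createCameraCapturer(new Camera2Enumerator(this));
  } else {
    videoCapturer = createCameraCapturer(new Camera1Enumerator(captureToTexture()));
  }
  if (videoCapturer == null) {
    reportError("Failed to open camera");
    return null;
  }
  return videoCapturer;
}
```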
Now let's step into the method that creates the camera capturer.
3. CallActivity#createCameraCapturer
Here a CameraEnumerator object is used to create the video capturer.
Let's look at the CameraEnumerator interface in detail.
It can fetch the device names, tell whether a camera is front- or back-facing, and create a camera video capturer (a sketch follows).
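The interface looks roughly like this (a sketch of org.webrtc.CameraEnumerator; treat the exact method list as an assumption):

```java
// org.webrtc.CameraEnumerator (sketch)
public interface CameraEnumerator {
  String[] getDeviceNames();                 // names of all attached cameras
  boolean isFrontFacing(String deviceName);
  boolean isBackFacing(String deviceName);
  List<CaptureFormat> getSupportedFormats(String deviceName);
  // Creates a capturer bound to one specific camera device.
  CameraVideoCapturer createCapturer(
      String deviceName, CameraVideoCapturer.CameraEventsHandler eventsHandler);
}
```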
Take Camera1Enumerator, the Camera1 implementation of CameraEnumerator, as an example:
Camera1Enumerator#createCapturer
So ultimately it is Camera1Capturer that creates the camera capturer; a sketch of that method follows.
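(Sketch reconstructed from the WebRTC sources; the captureToTexture flag is the enumerator's constructor argument.)

```java
// Camera1Enumerator#createCapturer (sketch): just instantiate Camera1Capturer.
@Override
public CameraVideoCapturer createCapturer(
    String deviceName, CameraVideoCapturer.CameraEventsHandler eventsHandler) {
  return new Camera1Capturer(deviceName, eventsHandler, captureToTexture);
}
```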
Next, let's analyze how Camera1Capturer is implemented.
4. How Camera1Capturer is implemented.
Camera capture is implemented by CameraCapturer, which is split into Camera1Capturer and Camera2Capturer for the two camera APIs.
All the capture logic is encapsulated in CameraCapturer; only the code that creates the CameraSession differs between the two subclasses.
5. No way around it; let's go into the CameraCapturer class and keep hunting.
```java
abstract class CameraCapturer implements CameraVideoCapturer {
  enum SwitchState {
    IDLE, // No switch requested.
    PENDING, // Waiting for previous capture session to open.
    IN_PROGRESS, // Waiting for new switched capture session to start.
  }

  private static final String TAG = "CameraCapturer";
  private final static int MAX_OPEN_CAMERA_ATTEMPTS = 3;
  private final static int OPEN_CAMERA_DELAY_MS = 500;
  private final static int OPEN_CAMERA_TIMEOUT = 10000;

  private final CameraEnumerator cameraEnumerator;
  @Nullable private final CameraEventsHandler eventsHandler;
  private final Handler uiThreadHandler;

  @Nullable
  private final CameraSession.CreateSessionCallback createSessionCallback =
      new CameraSession.CreateSessionCallback() {
        @Override
        public void onDone(CameraSession session) {
          checkIsOnCameraThread();
          Logging.d(TAG, "Create session done. Switch state: " + switchState);
          uiThreadHandler.removeCallbacks(openCameraTimeoutRunnable);
          synchronized (stateLock) {
            capturerObserver.onCapturerStarted(true /* success */);
            sessionOpening = false;
            currentSession = session;
            cameraStatistics = new CameraStatistics(surfaceHelper, eventsHandler);
            firstFrameObserved = false;
            stateLock.notifyAll();
            if (switchState == SwitchState.IN_PROGRESS) {
              if (switchEventsHandler != null) {
                switchEventsHandler.onCameraSwitchDone(cameraEnumerator.isFrontFacing(cameraName));
                switchEventsHandler = null;
              }
              switchState = SwitchState.IDLE;
            } else if (switchState == SwitchState.PENDING) {
              switchState = SwitchState.IDLE;
              switchCameraInternal(switchEventsHandler);
            }
          }
        }

        @Override
        public void onFailure(CameraSession.FailureType failureType, String error) {
          checkIsOnCameraThread();
          uiThreadHandler.removeCallbacks(openCameraTimeoutRunnable);
          synchronized (stateLock) {
            capturerObserver.onCapturerStarted(false /* success */);
            openAttemptsRemaining--;
            if (openAttemptsRemaining <= 0) {
              Logging.w(TAG, "Opening camera failed, passing: " + error);
              sessionOpening = false;
              stateLock.notifyAll();
              if (switchState != SwitchState.IDLE) {
                if (switchEventsHandler != null) {
                  switchEventsHandler.onCameraSwitchError(error);
                  switchEventsHandler = null;
                }
                switchState = SwitchState.IDLE;
              }
              if (failureType == CameraSession.FailureType.DISCONNECTED) {
                eventsHandler.onCameraDisconnected();
              } else {
                eventsHandler.onCameraError(error);
              }
            } else {
              Logging.w(TAG, "Opening camera failed, retry: " + error);
              createSessionInternal(OPEN_CAMERA_DELAY_MS);
            }
          }
        }
      };

  @Nullable
  private final CameraSession.Events cameraSessionEventsHandler = new CameraSession.Events() {
    @Override
    public void onCameraOpening() {
      checkIsOnCameraThread();
      synchronized (stateLock) {
        if (currentSession != null) {
          Logging.w(TAG, "onCameraOpening while session was open.");
          return;
        }
        eventsHandler.onCameraOpening(cameraName);
      }
    }

    @Override
    public void onCameraError(CameraSession session, String error) {
      checkIsOnCameraThread();
      synchronized (stateLock) {
        if (session != currentSession) {
          Logging.w(TAG, "onCameraError from another session: " + error);
          return;
        }
        eventsHandler.onCameraError(error);
        stopCapture();
      }
    }

    @Override
    public void onCameraDisconnected(CameraSession session) {
      checkIsOnCameraThread();
      synchronized (stateLock) {
        if (session != currentSession) {
          Logging.w(TAG, "onCameraDisconnected from another session.");
          return;
        }
        eventsHandler.onCameraDisconnected();
        stopCapture();
      }
    }

    @Override
    public void onCameraClosed(CameraSession session) {
      checkIsOnCameraThread();
      synchronized (stateLock) {
        if (session != currentSession && currentSession != null) {
          Logging.d(TAG, "onCameraClosed from another session.");
          return;
        }
        eventsHandler.onCameraClosed();
      }
    }

    @Override
    public void onFrameCaptured(CameraSession session, VideoFrame frame) {
      checkIsOnCameraThread();
      synchronized (stateLock) {
        if (session != currentSession) {
          Logging.w(TAG, "onFrameCaptured from another session.");
          return;
        }
        if (!firstFrameObserved) {
          eventsHandler.onFirstFrameAvailable();
          firstFrameObserved = true;
        }
        cameraStatistics.addFrame();
        capturerObserver.onFrameCaptured(frame);
      }
    }
  };

  private final Runnable openCameraTimeoutRunnable = new Runnable() {
    @Override
    public void run() {
      eventsHandler.onCameraError("Camera failed to start within timeout.");
    }
  };

  // Initialized on initialize
  // -------------------------
  @Nullable private Handler cameraThreadHandler;
  private Context applicationContext;
  private CapturerObserver capturerObserver;
  @Nullable private SurfaceTextureHelper surfaceHelper;

  private final Object stateLock = new Object();
  private boolean sessionOpening; /* guarded by stateLock */
  @Nullable private CameraSession currentSession; /* guarded by stateLock */
  private String cameraName; /* guarded by stateLock */
  private int width; /* guarded by stateLock */
  private int height; /* guarded by stateLock */
  private int framerate; /* guarded by stateLock */
  private int openAttemptsRemaining; /* guarded by stateLock */
  private SwitchState switchState = SwitchState.IDLE; /* guarded by stateLock */
  @Nullable private CameraSwitchHandler switchEventsHandler; /* guarded by stateLock */
  // Valid from onDone call until stopCapture, otherwise null.
  @Nullable private CameraStatistics cameraStatistics; /* guarded by stateLock */
  private boolean firstFrameObserved; /* guarded by stateLock */

  public CameraCapturer(String cameraName, @Nullable CameraEventsHandler eventsHandler,
      CameraEnumerator cameraEnumerator) {
    if (eventsHandler == null) {
      eventsHandler = new CameraEventsHandler() {
        @Override
        public void onCameraError(String errorDescription) {}
        @Override
        public void onCameraDisconnected() {}
        @Override
        public void onCameraFreezed(String errorDescription) {}
        @Override
        public void onCameraOpening(String cameraName) {}
        @Override
        public void onFirstFrameAvailable() {}
        @Override
        public void onCameraClosed() {}
      };
    }

    this.eventsHandler = eventsHandler;
    this.cameraEnumerator = cameraEnumerator;
    this.cameraName = cameraName;
    uiThreadHandler = new Handler(Looper.getMainLooper());

    final String[] deviceNames = cameraEnumerator.getDeviceNames();
    if (deviceNames.length == 0) {
      throw new RuntimeException("No cameras attached.");
    }
    if (!Arrays.asList(deviceNames).contains(this.cameraName)) {
      throw new IllegalArgumentException(
          "Camera name " + this.cameraName + " does not match any known camera device.");
    }
  }

  @Override
  public void initialize(@Nullable SurfaceTextureHelper surfaceTextureHelper,
      Context applicationContext, CapturerObserver capturerObserver) {
    this.applicationContext = applicationContext;
    this.capturerObserver = capturerObserver;
    this.surfaceHelper = surfaceTextureHelper;
    this.cameraThreadHandler =
        surfaceTextureHelper == null ? null : surfaceTextureHelper.getHandler();
  }

  @Override
  public void startCapture(int width, int height, int framerate) {
    Logging.d(TAG, "startCapture: " + width + "x" + height + "@" + framerate);
    if (applicationContext == null) {
      throw new RuntimeException("CameraCapturer must be initialized before calling startCapture.");
    }
    synchronized (stateLock) {
      if (sessionOpening || currentSession != null) {
        Logging.w(TAG, "Session already open");
        return;
      }
      this.width = width;
      this.height = height;
      this.framerate = framerate;
      sessionOpening = true;
      openAttemptsRemaining = MAX_OPEN_CAMERA_ATTEMPTS;
      createSessionInternal(0);
    }
  }

  private void createSessionInternal(int delayMs) {
    uiThreadHandler.postDelayed(openCameraTimeoutRunnable, delayMs + OPEN_CAMERA_TIMEOUT);
    cameraThreadHandler.postDelayed(new Runnable() {
      @Override
      public void run() {
        createCameraSession(createSessionCallback, cameraSessionEventsHandler, applicationContext,
            surfaceHelper, cameraName, width, height, framerate);
      }
    }, delayMs);
  }

  @Override
  public void stopCapture() {
    Logging.d(TAG, "Stop capture");
    synchronized (stateLock) {
      while (sessionOpening) {
        Logging.d(TAG, "Stop capture: Waiting for session to open");
        try {
          stateLock.wait();
        } catch (InterruptedException e) {
          Logging.w(TAG, "Stop capture interrupted while waiting for the session to open.");
          Thread.currentThread().interrupt();
          return;
        }
      }
      if (currentSession != null) {
        Logging.d(TAG, "Stop capture: Nulling session");
        cameraStatistics.release();
        cameraStatistics = null;
        final CameraSession oldSession = currentSession;
        cameraThreadHandler.post(new Runnable() {
          @Override
          public void run() {
            oldSession.stop();
          }
        });
        currentSession = null;
        capturerObserver.onCapturerStopped();
      } else {
        Logging.d(TAG, "Stop capture: No session open");
      }
    }
    Logging.d(TAG, "Stop capture done");
  }

  @Override
  public void changeCaptureFormat(int width, int height, int framerate) {
    Logging.d(TAG, "changeCaptureFormat: " + width + "x" + height + "@" + framerate);
    synchronized (stateLock) {
      stopCapture();
      startCapture(width, height, framerate);
    }
  }

  @Override
  public void dispose() {
    Logging.d(TAG, "dispose");
    stopCapture();
  }

  @Override
  public void switchCamera(final CameraSwitchHandler switchEventsHandler) {
    Logging.d(TAG, "switchCamera");
    cameraThreadHandler.post(new Runnable() {
      @Override
      public void run() {
        switchCameraInternal(switchEventsHandler);
      }
    });
  }

  @Override
  public boolean isScreencast() {
    return false;
  }

  public void printStackTrace() {
    Thread cameraThread = null;
    if (cameraThreadHandler != null) {
      cameraThread = cameraThreadHandler.getLooper().getThread();
    }
    if (cameraThread != null) {
      StackTraceElement[] cameraStackTrace = cameraThread.getStackTrace();
      if (cameraStackTrace.length > 0) {
        Logging.d(TAG, "CameraCapturer stack trace:");
        for (StackTraceElement traceElem : cameraStackTrace) {
          Logging.d(TAG, traceElem.toString());
        }
      }
    }
  }

  private void reportCameraSwitchError(
      String error, @Nullable CameraSwitchHandler switchEventsHandler) {
    Logging.e(TAG, error);
    if (switchEventsHandler != null) {
      switchEventsHandler.onCameraSwitchError(error);
    }
  }

  private void switchCameraInternal(@Nullable final CameraSwitchHandler switchEventsHandler) {
    Logging.d(TAG, "switchCamera internal");
    final String[] deviceNames = cameraEnumerator.getDeviceNames();
    if (deviceNames.length < 2) {
      if (switchEventsHandler != null) {
        switchEventsHandler.onCameraSwitchError("No camera to switch to.");
      }
      return;
    }
    synchronized (stateLock) {
      if (switchState != SwitchState.IDLE) {
        reportCameraSwitchError("Camera switch already in progress.", switchEventsHandler);
        return;
      }
      if (!sessionOpening && currentSession == null) {
        reportCameraSwitchError("switchCamera: camera is not running.", switchEventsHandler);
        return;
      }
      this.switchEventsHandler = switchEventsHandler;
      if (sessionOpening) {
        switchState = SwitchState.PENDING;
        return;
      } else {
        switchState = SwitchState.IN_PROGRESS;
      }
      Logging.d(TAG, "switchCamera: Stopping session");
      cameraStatistics.release();
      cameraStatistics = null;
      final CameraSession oldSession = currentSession;
      cameraThreadHandler.post(new Runnable() {
        @Override
        public void run() {
          oldSession.stop();
        }
      });
      currentSession = null;
      int cameraNameIndex = Arrays.asList(deviceNames).indexOf(cameraName);
      cameraName = deviceNames[(cameraNameIndex + 1) % deviceNames.length];
      sessionOpening = true;
      openAttemptsRemaining = 1;
      createSessionInternal(0);
    }
    Logging.d(TAG, "switchCamera done");
  }

  private void checkIsOnCameraThread() {
    if (Thread.currentThread() != cameraThreadHandler.getLooper().getThread()) {
      Logging.e(TAG, "Check is on camera thread failed.");
      throw new RuntimeException("Not on camera thread.");
    }
  }

  protected String getCameraName() {
    synchronized (stateLock) {
      return cameraName;
    }
  }

  abstract protected void createCameraSession(
      CameraSession.CreateSessionCallback createSessionCallback, CameraSession.Events events,
      Context applicationContext, SurfaceTextureHelper surfaceTextureHelper, String cameraName,
      int width, int height, int framerate);
}
```
Ah, so this is an abstract class, and the crucial createCameraSession is left to subclasses to implement.
That sends us back to the subclasses of CameraCapturer to see the concrete implementation.
6. Camera1Capturer#createCameraSession, from point 4 above.
It turns out the real source is CameraSession: that is what captures the video stream. A sketch of the override follows.
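(Sketch of the Camera1Capturer override, reconstructed from the WebRTC sources; captureToTexture is a constructor field, so treat the details as approximate.)

```java
// Camera1Capturer#createCameraSession (sketch): delegate straight to Camera1Session.
@Override
protected void createCameraSession(CameraSession.CreateSessionCallback createSessionCallback,
    CameraSession.Events events, Context applicationContext,
    SurfaceTextureHelper surfaceTextureHelper, String cameraName, int width, int height,
    int framerate) {
  Camera1Session.create(createSessionCallback, events, captureToTexture, applicationContext,
      surfaceTextureHelper, Camera1Enumerator.getCameraIndex(cameraName), width, height,
      framerate);
}
```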
7. First, look at the CameraSession interface.
It consists mainly of two callback interfaces:
one for session creation, with onDone on success and onFailure on failure;
one for events: camera opening, errors, disconnects, closing, and frame capture.
Plus stop() to stop capturing. A sketch of the whole interface follows.
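(A sketch of org.webrtc.CameraSession; treat the exact shape as an assumption.)

```java
// org.webrtc.CameraSession (sketch)
interface CameraSession {
  enum FailureType { ERROR, DISCONNECTED }

  // Session-creation callbacks: onDone on success, onFailure otherwise.
  interface CreateSessionCallback {
    void onDone(CameraSession session);
    void onFailure(FailureType failureType, String error);
  }

  // Events fired while the session is alive, including each captured frame.
  interface Events {
    void onCameraOpening();
    void onCameraError(CameraSession session, String error);
    void onCameraDisconnected(CameraSession session);
    void onCameraClosed(CameraSession session);
    void onFrameCaptured(CameraSession session, VideoFrame frame);
  }

  // Stops capturing.
  void stop();
}
```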
8. Now look at an implementation class of CameraSession: Camera1Session.
```java
class Camera1Session implements CameraSession {
  private static final String TAG = "Camera1Session";
  private static final int NUMBER_OF_CAPTURE_BUFFERS = 3;

  private static final Histogram camera1StartTimeMsHistogram =
      Histogram.createCounts("WebRTC.Android.Camera1.StartTimeMs", 1, 10000, 50);
  private static final Histogram camera1StopTimeMsHistogram =
      Histogram.createCounts("WebRTC.Android.Camera1.StopTimeMs", 1, 10000, 50);
  private static final Histogram camera1ResolutionHistogram = Histogram.createEnumeration(
      "WebRTC.Android.Camera1.Resolution", CameraEnumerationAndroid.COMMON_RESOLUTIONS.size());

  private static enum SessionState { RUNNING, STOPPED }

  private final Handler cameraThreadHandler;
  private final Events events;
  private final boolean captureToTexture;
  private final Context applicationContext;
  private final SurfaceTextureHelper surfaceTextureHelper;
  private final int cameraId;
  private final android.hardware.Camera camera;
  private final android.hardware.Camera.CameraInfo info;
  private final CaptureFormat captureFormat;
  // Used only for stats. Only used on the camera thread.
  private final long constructionTimeNs; // Construction time of this class.

  private SessionState state;
  private boolean firstFrameReported = false;

  // TODO(titovartem) make correct fix during webrtc:9175
  @SuppressWarnings("ByteBufferBackingArray")
  public static void create(final CreateSessionCallback callback, final Events events,
      final boolean captureToTexture, final Context applicationContext,
      final SurfaceTextureHelper surfaceTextureHelper, final int cameraId, final int width,
      final int height, final int framerate) {
    final long constructionTimeNs = System.nanoTime();
    Logging.d(TAG, "Open camera " + cameraId);
    events.onCameraOpening();

    final android.hardware.Camera camera;
    try {
      camera = android.hardware.Camera.open(cameraId);
    } catch (RuntimeException e) {
      callback.onFailure(FailureType.ERROR, e.getMessage());
      return;
    }

    if (camera == null) {
      callback.onFailure(FailureType.ERROR,
          "android.hardware.Camera.open returned null for camera id = " + cameraId);
      return;
    }

    try {
      camera.setPreviewTexture(surfaceTextureHelper.getSurfaceTexture());
    } catch (IOException | RuntimeException e) {
      camera.release();
      callback.onFailure(FailureType.ERROR, e.getMessage());
      return;
    }

    final android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
    android.hardware.Camera.getCameraInfo(cameraId, info);

    final CaptureFormat captureFormat;
    try {
      final android.hardware.Camera.Parameters parameters = camera.getParameters();
      captureFormat = findClosestCaptureFormat(parameters, width, height, framerate);
      final Size pictureSize = findClosestPictureSize(parameters, width, height);
      updateCameraParameters(camera, parameters, captureFormat, pictureSize, captureToTexture);
    } catch (RuntimeException e) {
      camera.release();
      callback.onFailure(FailureType.ERROR, e.getMessage());
      return;
    }

    if (!captureToTexture) {
      final int frameSize = captureFormat.frameSize();
      for (int i = 0; i < NUMBER_OF_CAPTURE_BUFFERS; ++i) {
        final ByteBuffer buffer = ByteBuffer.allocateDirect(frameSize);
        camera.addCallbackBuffer(buffer.array());
      }
    }

    // Calculate orientation manually and send it as CVO insted.
    camera.setDisplayOrientation(0 /* degrees */);

    callback.onDone(new Camera1Session(events, captureToTexture, applicationContext,
        surfaceTextureHelper, cameraId, camera, info, captureFormat, constructionTimeNs));
  }

  private static void updateCameraParameters(android.hardware.Camera camera,
      android.hardware.Camera.Parameters parameters, CaptureFormat captureFormat, Size pictureSize,
      boolean captureToTexture) {
    final List<String> focusModes = parameters.getSupportedFocusModes();

    parameters.setPreviewFpsRange(captureFormat.framerate.min, captureFormat.framerate.max);
    parameters.setPreviewSize(captureFormat.width, captureFormat.height);
    parameters.setPictureSize(pictureSize.width, pictureSize.height);
    if (!captureToTexture) {
      parameters.setPreviewFormat(captureFormat.imageFormat);
    }

    if (parameters.isVideoStabilizationSupported()) {
      parameters.setVideoStabilization(true);
    }
    if (focusModes.contains(android.hardware.Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
      parameters.setFocusMode(android.hardware.Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
    }
    camera.setParameters(parameters);
  }

  private static CaptureFormat findClosestCaptureFormat(
      android.hardware.Camera.Parameters parameters, int width, int height, int framerate) {
    // Find closest supported format for |width| x |height| @ |framerate|.
    final List<CaptureFormat.FramerateRange> supportedFramerates =
        Camera1Enumerator.convertFramerates(parameters.getSupportedPreviewFpsRange());
    Logging.d(TAG, "Available fps ranges: " + supportedFramerates);

    final CaptureFormat.FramerateRange fpsRange =
        CameraEnumerationAndroid.getClosestSupportedFramerateRange(supportedFramerates, framerate);

    final Size previewSize = CameraEnumerationAndroid.getClosestSupportedSize(
        Camera1Enumerator.convertSizes(parameters.getSupportedPreviewSizes()), width, height);
    CameraEnumerationAndroid.reportCameraResolution(camera1ResolutionHistogram, previewSize);

    return new CaptureFormat(previewSize.width, previewSize.height, fpsRange);
  }

  private static Size findClosestPictureSize(
      android.hardware.Camera.Parameters parameters, int width, int height) {
    return CameraEnumerationAndroid.getClosestSupportedSize(
        Camera1Enumerator.convertSizes(parameters.getSupportedPictureSizes()), width, height);
  }

  private Camera1Session(Events events, boolean captureToTexture, Context applicationContext,
      SurfaceTextureHelper surfaceTextureHelper, int cameraId, android.hardware.Camera camera,
      android.hardware.Camera.CameraInfo info, CaptureFormat captureFormat,
      long constructionTimeNs) {
    Logging.d(TAG, "Create new camera1 session on camera " + cameraId);

    this.cameraThreadHandler = new Handler();
    this.events = events;
    this.captureToTexture = captureToTexture;
    this.applicationContext = applicationContext;
    this.surfaceTextureHelper = surfaceTextureHelper;
    this.cameraId = cameraId;
    this.camera = camera;
    this.info = info;
    this.captureFormat = captureFormat;
    this.constructionTimeNs = constructionTimeNs;

    startCapturing();
  }

  @Override
  public void stop() {
    Logging.d(TAG, "Stop camera1 session on camera " + cameraId);
    checkIsOnCameraThread();
    if (state != SessionState.STOPPED) {
      final long stopStartTime = System.nanoTime();
      stopInternal();
      final int stopTimeMs = (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - stopStartTime);
      camera1StopTimeMsHistogram.addSample(stopTimeMs);
    }
  }

  private void startCapturing() {
    Logging.d(TAG, "Start capturing");
    checkIsOnCameraThread();

    state = SessionState.RUNNING;

    camera.setErrorCallback(new android.hardware.Camera.ErrorCallback() {
      @Override
      public void onError(int error, android.hardware.Camera camera) {
        String errorMessage;
        if (error == android.hardware.Camera.CAMERA_ERROR_SERVER_DIED) {
          errorMessage = "Camera server died!";
        } else {
          errorMessage = "Camera error: " + error;
        }
        Logging.e(TAG, errorMessage);
        stopInternal();
        if (error == android.hardware.Camera.CAMERA_ERROR_EVICTED) {
          events.onCameraDisconnected(Camera1Session.this);
        } else {
          events.onCameraError(Camera1Session.this, errorMessage);
        }
      }
    });

    if (captureToTexture) {
      listenForTextureFrames();
    } else {
      listenForBytebufferFrames();
    }
    try {
      camera.startPreview();
    } catch (RuntimeException e) {
      stopInternal();
      events.onCameraError(this, e.getMessage());
    }
  }

  private void stopInternal() {
    Logging.d(TAG, "Stop internal");
    checkIsOnCameraThread();
    if (state == SessionState.STOPPED) {
      Logging.d(TAG, "Camera is already stopped");
      return;
    }
    state = SessionState.STOPPED;
    surfaceTextureHelper.stopListening();
    // Note: stopPreview or other driver code might deadlock. Deadlock in
    // android.hardware.Camera._stopPreview(Native Method) has been observed on
    // Nexus 5 (hammerhead), OS version LMY48I.
    camera.stopPreview();
    camera.release();
    events.onCameraClosed(this);
    Logging.d(TAG, "Stop done");
  }

  private void listenForTextureFrames() {
    surfaceTextureHelper.startListening(new SurfaceTextureHelper.OnTextureFrameAvailableListener() {
      @Override
      public void onTextureFrameAvailable(
          int oesTextureId, float[] transformMatrix, long timestampNs) {
        checkIsOnCameraThread();

        if (state != SessionState.RUNNING) {
          Logging.d(TAG, "Texture frame captured but camera is no longer running.");
          surfaceTextureHelper.returnTextureFrame();
          return;
        }

        if (!firstFrameReported) {
          final int startTimeMs =
              (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - constructionTimeNs);
          camera1StartTimeMsHistogram.addSample(startTimeMs);
          firstFrameReported = true;
        }

        int rotation = getFrameOrientation();
        if (info.facing == android.hardware.Camera.CameraInfo.CAMERA_FACING_FRONT) {
          // Undo the mirror that the OS "helps" us with.
          // http://developer.android.com/reference/android/hardware/Camera.html#setDisplayOrientation(int)
          transformMatrix = RendererCommon.multiplyMatrices(
              transformMatrix, RendererCommon.horizontalFlipMatrix());
        }
        final VideoFrame.Buffer buffer = surfaceTextureHelper.createTextureBuffer(
            captureFormat.width, captureFormat.height,
            RendererCommon.convertMatrixToAndroidGraphicsMatrix(transformMatrix));
        final VideoFrame frame = new VideoFrame(buffer, rotation, timestampNs);
        events.onFrameCaptured(Camera1Session.this, frame);
        frame.release();
      }
    });
  }

  private void listenForBytebufferFrames() {
    camera.setPreviewCallbackWithBuffer(new android.hardware.Camera.PreviewCallback() {
      @Override
      public void onPreviewFrame(final byte[] data, android.hardware.Camera callbackCamera) {
        checkIsOnCameraThread();

        if (callbackCamera != camera) {
          Logging.e(TAG, "Callback from a different camera. This should never happen.");
          return;
        }

        if (state != SessionState.RUNNING) {
          Logging.d(TAG, "Bytebuffer frame captured but camera is no longer running.");
          return;
        }

        final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

        if (!firstFrameReported) {
          final int startTimeMs =
              (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - constructionTimeNs);
          camera1StartTimeMsHistogram.addSample(startTimeMs);
          firstFrameReported = true;
        }

        VideoFrame.Buffer frameBuffer = new NV21Buffer(
            data, captureFormat.width, captureFormat.height, () -> cameraThreadHandler.post(() -> {
              if (state == SessionState.RUNNING) {
                camera.addCallbackBuffer(data);
              }
            }));
        final VideoFrame frame = new VideoFrame(frameBuffer, getFrameOrientation(), captureTimeNs);
        events.onFrameCaptured(Camera1Session.this, frame);
        frame.release();
      }
    });
  }

  private int getDeviceOrientation() {
    int orientation = 0;

    WindowManager wm = (WindowManager) applicationContext.getSystemService(Context.WINDOW_SERVICE);
    switch (wm.getDefaultDisplay().getRotation()) {
      case Surface.ROTATION_90:
        orientation = 90;
        break;
      case Surface.ROTATION_180:
        orientation = 180;
        break;
      case Surface.ROTATION_270:
        orientation = 270;
        break;
      case Surface.ROTATION_0:
      default:
        orientation = 0;
        break;
    }
    return orientation;
  }

  private int getFrameOrientation() {
    int rotation = getDeviceOrientation();
    if (info.facing == android.hardware.Camera.CameraInfo.CAMERA_FACING_BACK) {
      rotation = 360 - rotation;
    }
    return (info.orientation + rotation) % 360;
  }

  private void checkIsOnCameraThread() {
    if (Thread.currentThread() != cameraThreadHandler.getLooper().getThread()) {
      throw new IllegalStateException("Wrong thread");
    }
  }
}
```
In Camera1Session#create a Camera1Session object is constructed, and capture starts right in the constructor.
So what does Camera1Session#startCapturing actually do?
Step into Camera1Session#listenForBytebufferFrames to see what is going on.
This is where capture really happens:
Camera#setPreviewCallbackWithBuffer fills a buffer with the camera's preview data.
The preview bytes are then wrapped in a VideoFrame.Buffer; an NV21Buffer handles this step.
Finally, the frame travels back up to Camera1Capturer through the events callbacks, carrying the VideoFrame with it. A usage sketch follows.
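To close the loop, here is a minimal sketch of driving this pipeline yourself, outside the AppRTC demo (eglBase, factory, and the chosen resolution are assumptions for illustration):

```java
// Sketch: wire a Camera1 capturer to a PeerConnectionFactory video source.
CameraEnumerator enumerator = new Camera1Enumerator(/* captureToTexture= */ true);
VideoCapturer capturer = null;
for (String name : enumerator.getDeviceNames()) {
  if (enumerator.isFrontFacing(name)) {
    capturer = enumerator.createCapturer(name, /* eventsHandler= */ null);
    break;
  }
}

SurfaceTextureHelper helper =
    SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());
VideoSource source = factory.createVideoSource(capturer.isScreencast());
// initialize() gives the capturer its camera thread and the frame observer;
// every VideoFrame produced by Camera1Session ends up in this observer.
capturer.initialize(helper, applicationContext, source.getCapturerObserver());
capturer.startCapture(1280, 720, 30);
```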