This is the first article in my Android audio/video series, and it finally brings me back to my own field. I have never really done any Camera-related development before, so this area is still somewhat unfamiliar to me and I am learning it from scratch.
In Android API 21, Google introduced Camera2 to replace the original Camera API, and the changes between the two are fairly significant.
In Camera2, Google adopted the concept of a pipeline connecting the camera device (Camera Device) and the Android device (Android Device): the Android device sends CaptureRequests to the camera device through the pipeline, and the camera device returns CameraMetadata to the Android device through the pipeline. All of this takes place inside a session called CameraCaptureSession.
The core classes involved in the Camera2 architecture are: CameraManager, CameraDevice, CameraCharacteristics, CaptureRequest and CaptureRequest.Builder, CameraCaptureSession, and CaptureResult.
CameraManager lives in the android.hardware.camera2 package and was likewise added in Android 21 (5.0). Like other system services, it is obtained through Context.getSystemService(Context.CAMERA_SERVICE) and is mainly used to manage the system's cameras.
manager.getCameraIdList()
returns the list of camera IDs available on the Android device.
manager.getCameraCharacteristics(cameraId)
returns the characteristics of the specified camera.
manager.openCamera(String cameraId, CameraDevice.StateCallback callback, Handler handler)
opens the camera with the given ID. StateCallback is a callback that reports the open state, and Handler specifies which thread handles the callback; if it is null, the current thread is used.
CameraDevice is the object Camera2 abstracts for the camera, and it maps directly to the system's hardware camera.
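Putting these pieces together, here is a minimal sketch of looking up and opening a back-facing camera. It assumes the CAMERA permission has already been granted; context, stateCallback and backgroundHandler are placeholder names, not identifiers from the sample.

CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    for (String cameraId : manager.getCameraIdList()) {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
        // Pick the first back-facing camera
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_BACK) {
            // Success or failure is reported asynchronously through stateCallback
            manager.openCamera(cameraId, stateCallback, backgroundHandler);
            break;
        }
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}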
The camera's state is monitored through a CameraDevice.StateCallback:
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback(){
@Override
public void onOpened(@NonNull CameraDevice camera) {
// The camera is open; we can create the session and start the preview
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
}
};
CameraDevice manages the CameraCaptureSession, which acts as the pipe between the Android device and the camera device; all subsequent data exchange takes place within this session.
It also manages CaptureRequests, mainly by creating capture requests through createCaptureRequest(int templateType); whenever you need to preview, take a picture, or resume the preview afterwards, you create a request this way.
As mentioned earlier, the system sends capture requests to the camera and the camera returns CameraMetadata, and all of this happens inside a CameraCaptureSession created by the corresponding CameraDevice; whenever the app needs to preview, take a picture, or resume the preview, it has to go through this session. Once created, a CameraCaptureSession stays alive until the corresponding CameraDevice is closed. Although a CameraCaptureSession is used to capture images from the camera, only one session at a time can capture images from a given camera.
CameraCharacteristics is an object describing the properties of a CameraDevice; it can be queried through CameraManager via getCameraCharacteristics(String cameraId).
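For example, a rough sketch of the properties that are typically queried before opening the camera (cameraId and manager come from the calls above; the checked CameraAccessException is omitted here for brevity):

CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
// Sensor orientation in degrees (0, 90, 180 or 270)
Integer sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
// Supported output sizes for each format / Surface type
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map != null) {
    Size[] previewSizes = map.getOutputSizes(SurfaceTexture.class);
    Size[] jpegSizes = map.getOutputSizes(ImageFormat.JPEG);
}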
CaptureRequest represents a single capture request.
CaptureRequest.Builder is used to describe the parameters of a capture, including the capture hardware (sensor, lens, flash), focus mode, exposure mode, the processing pipeline, the control algorithms and the configuration of the output buffers; the built request is then submitted to the corresponding session. CaptureRequest.Builder is responsible for producing CaptureRequest objects.
CaptureResult is the object describing the subset of results produced when a single image is captured from the image sensor.
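To make the relationship between these classes concrete, here is a hedged sketch (cameraDevice, previewSurface and callback are placeholders, and the checked CameraAccessException is omitted): the Builder produces a CaptureRequest, and every completed capture reports the state the camera actually used back through a CaptureResult.

CaptureRequest.Builder builder =
        cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
builder.addTarget(previewSurface);
builder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
CaptureRequest request = builder.build();

CameraCaptureSession.CaptureCallback callback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        // Read the auto-focus state the camera reported for this frame
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
    }
};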
Google has published two sample apps (Camera2Basic and Camera2Video) that show how to use Camera2.
The samples cover preview, still capture, video recording and more, and they are very good code to start learning from.
Here I will still walk through the whole flow once more to deepen the understanding.
private void openCamera(int width, int height) {
// Check the camera permission first
if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED) {
requestCameraPermission();
return;
}
// Set up parameters: pick the camera ID, choose the preview size, and so on
setUpCameraOutputs(width, height);
// Configure the TextureView's transform so the camera preview is not distorted
configureTransform(width, height);
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
// A Semaphore ensures that only one thread uses the camera device at a time
if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
// Actually open the camera; success or failure is reported through mStateCallback
manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
}
}
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
// This method is called when the camera is opened. We start camera preview here.
mCameraOpenCloseLock.release();
mCameraDevice = cameraDevice;
// Create the session and start the preview
createCameraPreviewSession();
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int error) {
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
Activity activity = getActivity();
if (null != activity) {
activity.finish();
}
}
};
private void createCameraPreviewSession() {
try {
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
// We configure the size of default buffer to be the size of camera preview we want.
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// This is the output Surface we need to start preview.
Surface surface = new Surface(texture);
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder
= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
// This is the output target for the live image data; video recording, live streaming, etc. will later add their own Target here
mPreviewRequestBuilder.addTarget(surface);
// Here, we create a CameraCaptureSession for camera preview.
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (null == mCameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
try {
// Enable continuous auto-focus for the preview
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
setAutoFlash(mPreviewRequestBuilder);
// Finally, we start displaying the camera preview.
mPreviewRequest = mPreviewRequestBuilder.build();
// Capture frames repeatedly to keep the preview updated
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed( @NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed");
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/** Initiate a still image capture. */
private void takePicture() {
lockFocus();
}
/** Lock the focus as the first step for a still image capture. */
private void lockFocus() {
try {
// Tell the camera to start auto-focusing
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_START);
// Tell #mCaptureCallback to wait for the lock.
mState = STATE_WAITING_LOCK;
// Send a CaptureRequest asking the camera to capture a frame
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private CameraCaptureSession.CaptureCallback mCaptureCallback
= new CameraCaptureSession.CaptureCallback() {
private void process(CaptureResult result) {
// Handle focus, flash, etc. according to the current state
.......
case STATE_WAITING_LOCK: {
.......
// Focus is locked; capture and save a still picture
captureStillPicture();
.......
break;
}
}
}
private void captureStillPicture() {
try {
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice) {
return;
}
// This is the CaptureRequest.Builder that we use to take a picture.
// mImageReader is added as the target, so the captured image data is handed to mImageReader.
// During initialization we set mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);
// so the data ends up being processed in mOnImageAvailableListener.
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
// Use the same AE and AF modes as the preview.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
setAutoFlash(captureBuilder);
// Orientation
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));
// Final callback of the capture; onCaptureCompleted is invoked once the picture has been processed
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
showToast("Saved: " + mFile);
Log.d(TAG, mFile.toString());
unlockFocus();
}
};
mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
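onCaptureCompleted() above calls unlockFocus(), which is not shown in this excerpt. A sketch of it, following the same sample's pattern, cancels the auto-focus trigger and then resumes the repeating preview request:

private void unlockFocus() {
    try {
        // Reset the auto-focus trigger
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
        setAutoFlash(mPreviewRequestBuilder);
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                mBackgroundHandler);
        // Go back to the normal preview state and restart the repeating request
        mState = STATE_PREVIEW;
        mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
                mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}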
// Data processing and saving
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
//mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
Image mImage = reader.acquireNextImage();
ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
// 文件操做
// ......
// What we get here is the raw frame data; any further algorithmic processing, compression with a third-party encoder, network transmission and so on can all start from the data obtained here.
mImage.close();
}
};
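Assuming mImageReader was created with ImageFormat.JPEG (as in the Camera2Basic sample), bytes above already holds an encoded JPEG frame. A minimal sketch of persisting it (the real sample does this inside an ImageSaver Runnable posted to the background handler; mFile is the same output File used above):

FileOutputStream output = null;
try {
    output = new FileOutputStream(mFile);
    output.write(bytes);
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (output != null) {
        try {
            output.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}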
Now let's mainly look at the code for recording video with MediaRecorder.
private void startRecordingVideo() {
if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
return;
}
try {
// Close the previous session; the new session will add the recording Target
closePreviewSession();
// Configure the MediaRecorder: audio/video sources, encoding formats, etc.
setUpMediaRecorder();
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// Create a request suitable for video recording
mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
List<Surface> surfaces = new ArrayList<>();
// Set up Surface for the camera preview
Surface previewSurface = new Surface(texture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);
// Set up Surface for the MediaRecorder. This is the key step: the video frames are handed over to mMediaRecorder
Surface recorderSurface = mMediaRecorder.getSurface();
surfaces.add(recorderSurface);
mPreviewBuilder.addTarget(recorderSurface);
// Start a capture session
// Once the session starts, we can update the UI and start recording
mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
mPreviewSession = cameraCaptureSession;
updatePreview();
getActivity().runOnUiThread(new Runnable() {
@Override
public void run() {
// UI
mButtonVideo.setText(R.string.stop);
mIsRecordingVideo = true;
// Start recording
mMediaRecorder.start();
}
});
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Activity activity = getActivity();
if (null != activity) {
Toast.makeText(activity, "Failed", Toast.LENGTH_SHORT).show();
}
}
}, mBackgroundHandler);
} catch (CameraAccessException | IOException e) {
e.printStackTrace();
}
}
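Stopping the recording is not shown above; a rough sketch that follows the same sample's structure stops and resets the MediaRecorder and then rebuilds the plain preview session (startPreview() here stands for whatever method recreates the preview-only session):

private void stopRecordingVideo() {
    // Update the UI state
    mIsRecordingVideo = false;
    mButtonVideo.setText(R.string.record);
    // Stop and reset the recorder; the MP4 file at mNextVideoAbsolutePath is now complete
    mMediaRecorder.stop();
    mMediaRecorder.reset();
    Activity activity = getActivity();
    if (null != activity) {
        Toast.makeText(activity, "Video saved: " + mNextVideoAbsolutePath,
                Toast.LENGTH_SHORT).show();
    }
    mNextVideoAbsolutePath = null;
    // Recreate the preview-only capture session
    startPreview();
}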
// Configure the MediaRecorder
private void setUpMediaRecorder() throws IOException {
final Activity activity = getActivity();
if (null == activity) {
return;
}
// Set the audio source to be used for recording.
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
// Set the video source to be used for recording.
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// Set the format of the output file produced during recording.
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
// Generate the MP4 output file path
if (mNextVideoAbsolutePath == null || mNextVideoAbsolutePath.isEmpty()) {
mNextVideoAbsolutePath = getVideoFilePath(getActivity());
}
mMediaRecorder.setOutputFile(mNextVideoAbsolutePath);
// Set the video encoding bit rate for the recording.
mMediaRecorder.setVideoEncodingBitRate(10000000);
// Set the frame rate of the video to be captured.
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoSize(mVideoSize.getWidth(), mVideoSize.getHeight());
// Set the video encoder to be used for recording.
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// Set the audio encoder to be used for recording.
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
switch (mSensorOrientation) {
case SENSOR_ORIENTATION_DEFAULT_DEGREES:
mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
break;
case SENSOR_ORIENTATION_INVERSE_DEGREES:
mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
break;
}
// This step is required before calling start()
mMediaRecorder.prepare();
}
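The getVideoFilePath() helper used above is not included in this excerpt; a simple implementation (an assumption modeled on the sample, not necessarily the exact code) just builds a timestamped .mp4 path under the app's external files directory:

private String getVideoFilePath(Context context) {
    // App-specific external storage; no extra storage permission is required for this directory
    final File dir = context.getExternalFilesDir(null);
    return (dir == null ? "" : (dir.getAbsolutePath() + "/"))
            + System.currentTimeMillis() + ".mp4";
}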
/**
* A typical example of using MediaRecorder on its own to record looks like this:
* MediaRecorder recorder = new MediaRecorder();
* recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
* recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
* recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
* recorder.setOutputFile(PATH_NAME);
* recorder.prepare();
* recorder.start(); // Recording is now started
* ...
* recorder.stop();
* recorder.reset(); // You can reuse the object by going back to setAudioSource() step
* recorder.release(); // Now the object cannot be reused
**/
That is roughly the whole flow of previewing, taking pictures and recording video with Camera2. It is fairly complex, but also very important; later articles will keep digging into the underlying principles and source code.