camera2 (api2) Open and Preview Flow (Part 2)

The camera usage flow: openCamera() -> applySettings() -> setPreviewTexture() -> startPreview() -> autoFocus() -> takePicture().
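The method names above are the Camera app's CameraAgent wrapper calls. Purely to illustrate the same call order, here is a minimal sketch using the legacy android.hardware.Camera API; the texture, camera id, and callbacks are placeholders, not code from the app, and applySettings() is only roughly equivalent to setParameters().

import android.graphics.SurfaceTexture;
import android.hardware.Camera;

public class LegacyCameraFlow {
    // Schematic only: error handling, threading and camera release are omitted.
    public static void openAndShoot(SurfaceTexture texture) throws Exception {
        Camera camera = Camera.open(0);                   // openCamera()
        Camera.Parameters params = camera.getParameters();
        camera.setParameters(params);                     // applySettings()
        camera.setPreviewTexture(texture);                // setPreviewTexture()
        camera.startPreview();                            // startPreview()
        camera.autoFocus((success, cam) ->                // autoFocus()
                cam.takePicture(null, null,               // takePicture()
                        (data, c) -> { /* save the JPEG bytes */ }));
    }
}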

The rough flow of opening the camera device:

1. Instantiate the CameraModule object; mCurrentModule holds the current module, which defaults to PhotoModule.

2. Show the first-run dialog (FirstRunDialog); once the dialog finishes normally, resume is executed.

3. Based on the type of mCurrentModule, the resume that actually runs is the one in PhotoModule.java. It indirectly calls requestCamera on CameraController, the implementation of the CameraProvider interface. If the current API is api2, openCamera() is called on the AndroidCamera2AgentImpl.java instance, which actually runs the openCamera method of its parent class CameraAgent.java. The camera is then opened asynchronously: a CameraActions.OPEN_CAMERA message is posted, and while handling that message CameraManager.java's openCamera is called.

4. CameraManager.java's openCamera starts the real open. At the same time a CameraDevice.StateCallback named mCameraDeviceStateCallback (defined in AndroidCamera2AgentImpl.java) is created so that its onOpened method runs once the camera is open. Inside onOpened, the CameraOpenCallback openCallback's onCameraOpened is invoked; its implementation lives in CameraController.java. From CameraController, onCameraOpened propagates the "camera opened" notification to CameraActivity and on to onCameraAvailable in PhotoModule.java, which starts the preview.

A CameraDeviceImpl.java instance is created.

5. openCameraDeviceUserAsync in CameraManager.java opens a connection to the camera device: it first obtains the CameraService handle and then connects to the camera HAL layer through cameraService's connectDevice.

At the end of openCameraDeviceUserAsync, deviceImpl.setRemoteDevice(cameraUser) is called with the opened camera client as its argument. Reaching this point means the camera opened successfully, so onOpened of CameraDevice.StateCallback is executed, as well as StateCallbackKK's onOpened.

6. connectDevice in CameraService creates a CameraDeviceClient instance through makeClient; this instance corresponds to CameraService's BasicClient type.

The inheritance hierarchy of CameraDeviceClient:

class CameraDeviceClient :

    public Camera2ClientBase<CameraDeviceClientBase>,

        public camera2::FrameProcessorBase::FilteredListener

It inherits Camera2ClientBase (and through it CameraDeviceClientBase) and implements the listener FrameProcessorBase::FilteredListener.

Camera2ClientBase is a template class; its TClientBase parameter here is CameraDeviceClientBase.

CameraDeviceClientBase in turn inherits CameraService::BasicClient and camera2::BnCameraDeviceUser; inheriting BnCameraDeviceUser is what gives it cross-process communication capability.

So when CameraDeviceClient is instantiated, the constructors of this whole chain of classes are invoked.

7. The Camera2ClientBase constructor creates the Camera3Device instance: sp<CameraDeviceBase> mDevice;

After CameraDeviceClient is instantiated, its initialize method runs. On one hand it calls Camera2ClientBase's initializeImpl, which performs the permission check; the actual call is CameraService::BasicClient::startCameraOps().

On the other hand it calls Camera3Device's initialize, which opens the HAL device and performs the HAL-layer initialization. During this process a Camera3BufferManager is created and the RequestThread is started; both capture requests and preview requests are processed in that thread's threadLoop.

In CameraDeviceClient's initializeImpl, a FrameProcessorBase instance is created. This is an output-frame metadata processing thread; whenever the device has new frames available, its onResultAvailable method is called.

8. After the camera is opened successfully, the settings are applied and the preview texture is set, and only then is the preview started. While setting the preview texture, a CameraCaptureSession is created; it is the basis for the later preview and capture requests. Once the CameraCaptureSession is created successfully, onConfigured of CameraCaptureSession.StateCallback (the instance lives in AndroidCamera2AgentImpl.java) is called back with the newly created CameraCaptureSession session object for preview and capture to use, and the camera state is changed to AndroidCamera2StateHolder.CAMERA_PREVIEW_READY, meaning preview can start.

(The above is the call flow on Android O.)
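To make the framework-level half of this overview (steps 3 to 5) concrete, here is a minimal sketch using only the public camera2 API: CameraManager.openCamera() leads to CameraDevice.StateCallback.onOpened() once CameraService has connected the device. The cameraId, handler and the preview-start hook are placeholders, not the Camera app's code.

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

public class OpenCameraSketch {
    // Opens a camera device asynchronously; onOpened fires once the
    // framework/CameraService side has finished connectDevice.
    public static void open(Context context, String cameraId, Handler handler)
            throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice device) {
                // Equivalent of onCameraOpened propagating up to the module:
                // preview setup (createCaptureSession) would start here.
            }

            @Override
            public void onDisconnected(CameraDevice device) {
                device.close();
            }

            @Override
            public void onError(CameraDevice device, int error) {
                device.close();
            }
        }, handler);
    }
}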

Continuing from the previous post: the next step is to obtain the CameraService handle and call the connectDevice function in CameraService.

frameworks/base/core/java/android/hardware/camera2/CameraManager.java

private CameraDevice openCameraDeviceUserAsync(String cameraId,
        CameraDevice.StateCallback callback, Handler handler, final int uid)
        throws CameraAccessException {
    // First obtain the CameraService handle.
    ICameraService cameraService = CameraManagerGlobal.get().getCameraService();
    // Then send the connect request to cameraService.
    cameraUser = cameraService.connectDevice(callbacks, cameraId,
            mContext.getOpPackageName(), uid);
}
First, let's look at how the CameraService handle is obtained:

frameworks/base/core/java/android/hardware/camera2/CameraManager.java

CameraManagerGlobal.get().getCameraService();
public ICameraService getCameraService() {
    connectCameraServiceLocked();
}
private void connectCameraServiceLocked() {
    // Query the cameraservice handle through ServiceManager. The service name is
    // private static final String CAMERA_SERVICE_BINDER_NAME = "media.camera", the same
    // name that was registered with ServiceManager when cameraservice started, so what
    // we get back here is the CameraService handle.
    IBinder cameraServiceBinder =
            ServiceManager.getService(CAMERA_SERVICE_BINDER_NAME);
    // Convert the queried IBinder into an ICameraService. When cameraservice was registered,
    // the ICameraService was stored as an IBinder; here the conversion goes the other way,
    // from which we can infer that CameraService.cpp must ultimately derive from IBinder.
    ICameraService cameraService = ICameraService.Stub.asInterface(cameraServiceBinder);
    // Register an ICameraServiceListener so there are callbacks when a new camera becomes available.
    cameraService.addListener(this);
}
接着看下CameraService.cpp是否是繼承自IBinder,

CameraService.h

class CameraService : public ::android::hardware::BnCameraService,

The rest is omitted; CameraService derives from BnCameraService.

BnCameraService.h

// The corresponding namespace is android::hardware.
namespace android {
namespace hardware {
class BnCameraService : public ::android::BnInterface<ICameraService>
}
}
The ICameraService here is generated automatically from ICameraService.aidl by the aidl tool. After conversion, ICameraService.aidl produces ICameraService.java, ICameraService.h, and ICameraService.cpp; in earlier versions only the .java file was generated. If a *.aidl file is added to an Android.mk whose build target is a library, for example include $(BUILD_SHARED_LIBRARY), then the .h and .cpp files are generated automatically as well.

Let's continue and see whether BnInterface is related to IBinder:

IInterface.h
 
template<typename INTERFACE>
 
class BnInterface : public INTERFACE, public BBinder

We can see that BnInterface is a template class; here INTERFACE is ICameraService, and it also derives from BBinder.

frameworks/native/include/binder/Binder.h

class BBinder : public IBinder {}

From this we can see that BBinder derives from IBinder.

From the inheritance relationships above, we can work out the call flow of connectDevice:

CameraManager.java

    private CameraDevice openCameraDeviceUserAsync(String cameraId,
            CameraDevice.StateCallback callback, Handler handler, final int uid)
            throws CameraAccessException {
......
            try {
                if (supportsCamera2ApiLocked(cameraId)) {
                    // Use cameraservice's cameradeviceclient implementation for HAL3.2+ devices
                    ICameraService cameraService = CameraManagerGlobal.get().getCameraService();
                    if (cameraService == null) {
                        throw new ServiceSpecificException(
                            ICameraService.ERROR_DISCONNECTED,
                            "Camera service is currently unavailable");
                    }
                    cameraUser = cameraService.connectDevice(callbacks, cameraId,
                            mContext.getOpPackageName(), uid);
                } else {
                    
                }
            } catch (ServiceSpecificException e) {
               
            } catch (RemoteException e) {
               
            }
        }
 
        return device;
    }

out/target/common/obj/java_libraries/framework_intermediates/.../ICameraService.java

This is the .java file auto-generated from ICameraService.aidl.

public interface ICameraService extends android.os.IInterface
{
/** Local-side IPC implementation stub class. */
public static abstract class Stub extends android.os.Binder implements android.hardware.ICameraService
{
private static class Proxy implements android.hardware.ICameraService
{
/**
     * Open a camera device through the new camera API
     * Only supported for device HAL versions >= 3.2
     */
@Override public android.hardware.camera2.ICameraDeviceUser connectDevice(android.hardware.camera2.ICameraDeviceCallbacks callbacks, java.lang.String cameraId, java.lang.String opPackageName, int clientUid) throws android.os.RemoteException
{
android.os.Parcel _data = android.os.Parcel.obtain();
android.os.Parcel _reply = android.os.Parcel.obtain();
android.hardware.camera2.ICameraDeviceUser _result;
try {
_data.writeInterfaceToken(DESCRIPTOR);
_data.writeStrongBinder((((callbacks!=null))?(callbacks.asBinder()):(null)));
_data.writeString(cameraId);
_data.writeString(opPackageName);
_data.writeInt(clientUid);
mRemote.transact(Stub.TRANSACTION_connectDevice, _data, _reply, 0);
_reply.readException();
_result = android.hardware.camera2.ICameraDeviceUser.Stub.asInterface(_reply.readStrongBinder());
}
return _result;
}
}
}
}
mRemote.transact() starts the cross-process communication: the request travels through IBinder, BpBinder, and IPCThreadState to the binder driver, which delivers it to the cameraService server side. For the same request, the transaction code is identical on the client and server sides.

CameraService.cpp's onTransact() method is then invoked:

status_t CameraService::onTransact(uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags) {

    return BnCameraService::onTransact(code, data, reply, flags);

}
ICameraService.cpp

::android::status_t BnCameraService::onTransact(uint32_t _aidl_code, const ::android::Parcel& _aidl_data, ::android::Parcel* _aidl_reply, uint32_t _aidl_flags) {

    case Call::CONNECTDEVICE:

    ::android::sp<::android::hardware::camera2::ICameraDeviceUser> _aidl_return;

    // connectDevice is called from CameraManager.java with four arguments; here an extra
    // ICameraDeviceUser out-parameter is added and returned to the client as part of the reply.
    // This connectDevice is what actually calls into CameraService.cpp's connectDevice method.

    ::android::binder::Status _aidl_status(connectDevice(in_callbacks, in_cameraId, in_opPackageName, in_clientUid, &_aidl_return));

    // Write _aidl_return into the reply parcel.

    _aidl_ret_status = _aidl_reply->writeStrongBinder(::android::hardware::camera2::ICameraDeviceUser::asBinder(_aidl_return));

}

Now let's see how _aidl_return gets back to the client. cameraservice first writes the result into the _aidl_reply parcel, and the binder driver then delivers it to the client. The detail is that after the client issues the request it goes to sleep; once the server has a result and has written it to the binder driver, the driver wakes the client up to read it. The client that receives the result here is:

the Proxy in ICameraService.java:

public interface ICameraService :: public static abstract class Stub :: private static class Proxy {

    @Override public android.hardware.camera2.ICameraDeviceUser connectDevice(android.hardware.camera2.ICameraDeviceCallbacks callbacks, int cameraId, java.lang.String opPackageName, int clientUid) throws android.os.RemoteException {

        // This line is where the request is sent.

        mRemote.transact(Stub.TRANSACTION_connectDevice, _data, _reply, 0);

        // This is the result returned after the server side finishes processing: it is read
        // from the parcel via _reply.readStrongBinder() and then returned to CameraManager.

        _result = android.hardware.camera2.ICameraDeviceUser.Stub.asInterface(_reply.readStrongBinder());

        return _result;

    }

}
Next, let's see what connectDevice in CameraService.cpp does:

CameraService.cpp

Status CameraService::connectDevice(
        const sp<hardware::camera2::ICameraDeviceCallbacks>& cameraCb,
        int cameraId, const String16& clientPackageName, int clientUid,
        /*out*/ sp<hardware::camera2::ICameraDeviceUser>* device) {

    // device is the out-parameter, of type ICameraDeviceUser, which is also auto-generated from
    // ICameraDeviceUser.aidl. This object corresponds to the CameraDeviceClient instance client:
    // CameraDeviceClient inherits BnCameraDeviceUser and therefore ICameraDeviceUser.

    sp<CameraDeviceClient> client = nullptr;

    // connectHelper is declared in CameraService.h.

    ret = connectHelper<hardware::camera2::ICameraDeviceCallbacks, CameraDeviceClient>
            (cameraCb, id, CAMERA_HAL_API_VERSION_UNSPECIFIED, clientPackageName,
            clientUid, USE_CALLING_PID, API_2, /*legacyMode*/ false, /*shimUpdateOnly*/ false,
            /*out*/ client);

    *device = client;

}

CameraService.h

On Android O, the implementation of connectHelper was moved into frameworks/av/services/camera/libcameraservice/CameraService.cpp.

// This is a template method. CALLBACK is hardware::camera2::ICameraDeviceCallbacks and
// CLIENT is CameraDeviceClient. Its main job is to create the camera client instance and
// call its initialize method.

template<class CALLBACK, class CLIENT>
binder::Status CameraService::connectHelper(const sp<CALLBACK>& cameraCb, const String8& cameraId, int halVersion, const String16& clientPackageName, int clientUid, int clientPid, apiLevel effectiveApiLevel, bool legacyMode, bool shimUpdateOnly, /*out*/ sp<CLIENT>& device) {

    ret = makeClient(this, cameraCb, clientPackageName, id, facing, clientPid,
            clientUid, getpid(), legacyMode, halVersion, deviceVersion, effectiveApiLevel,
            /*out*/ &tmp);

    client = static_cast<CLIENT*>(tmp.get());

    err = client->initialize(mModule);

}

CameraService.cpp

Status CameraService::makeClient(const sp<CameraService>& cameraService,
        const sp<IInterface>& cameraCb, const String16& packageName, int cameraId,
        int facing, int clientPid, uid_t clientUid, int servicePid, bool legacyMode,
        int halVersion, int deviceVersion, apiLevel effectiveApiLevel,
        /*out*/ sp<BasicClient>* client) {

    // Depending on the API version, a different camera client is created;
    // here a CameraDeviceClient instance is created.

    *client = new CameraDeviceClient(cameraService, tmp, packageName, cameraId,
            facing, clientPid, clientUid, servicePid);

}

Let's look at CameraDeviceClient's inheritance hierarchy:

CameraDeviceClient.h

class CameraDeviceClient :
 
         public Camera2ClientBase<CameraDeviceClientBase>,
 
         public camera2::FrameProcessorBase::FilteredListener{}
 
struct CameraDeviceClientBase :
 
         public CameraService::BasicClient,
 
         public hardware::camera2::BnCameraDeviceUser{}

We can see that CameraDeviceClient inherits CameraService::BasicClient, implements the ICameraDeviceUser binder API, and also implements the frame-processing thread's listener.

Next, the CameraDeviceClient constructor:

CameraDeviceClient.cpp

CameraDeviceClient::CameraDeviceClient(const sp<CameraService>& cameraService,
        const sp<hardware::camera2::ICameraDeviceCallbacks>& remoteCallback,
        const String16& clientPackageName, int cameraId, int cameraFacing, int clientPid,
        uid_t clientUid, int servicePid) :
        Camera2ClientBase(cameraService, remoteCallback, clientPackageName,
                cameraId, cameraFacing, clientPid, clientUid, servicePid),

The main thing it does is call the parent Camera2ClientBase constructor in its initializer list.

Camera2ClientBase.cpp

Camera2ClientBase is a template class, and TClientBase here is CameraDeviceClientBase, as can be seen from CameraDeviceClient's inheritance. Besides calling the constructor of its parent TClientBase (CameraDeviceClientBase), it also creates the Camera3Device instance.

template <typename TClientBase>
Camera2ClientBase<TClientBase>::Camera2ClientBase(
        const sp<CameraService>& cameraService, const sp<TCamCallbacks>& remoteCallback,
        const String16& clientPackageName, int cameraId, int cameraFacing, int clientPid,
        uid_t clientUid, int servicePid) :
        TClientBase(cameraService, remoteCallback, clientPackageName,
                cameraId, cameraFacing, clientPid, clientUid, servicePid) {

    mDevice = new Camera3Device(cameraId);

}

To finish the constructor chain: CameraDeviceClientBase in turn calls the constructor of its parent CameraService::BasicClient. BasicClient's constructor is implemented in CameraService and mainly deals with app permission bookkeeping (I am not very familiar with this permission handling).

Of this whole chain of constructors, the most important part is still the creation of the Camera3Device instance in Camera2ClientBase and the initialize call that follows.

Now the call flow of the initialize method:

CameraDeviceClient.cpp

status_t CameraDeviceClient::initialize(CameraModule *module) {

    res = Camera2ClientBase::initialize(module);

    // A listener is registered here. mFrameProcessor is a thread that processes output-frame
    // metadata and handles preview-callback related work. It waits for new frames from the
    // camera device and then calls the listener method onResultAvailable, i.e.
    // CameraDeviceClient::onResultAvailable, which in turn invokes the callback
    // remoteCb->onResultReceived(result.mMetadata, result.mResultExtras). remoteCb is of type
    // hardware::camera2::ICameraDeviceCallbacks; the callback instance was created in
    // CameraManager.java when the camera device was opened and handed all the way down through
    // CameraService's connectDevice to CameraDeviceClient, so the callback's actual
    // implementation is onResultReceived of the inner class CameraDeviceCallbacks in
    // CameraDeviceImpl.java.

    mFrameProcessor->registerListener(FRAME_PROCESSOR_LISTENER_MIN_ID,
            FRAME_PROCESSOR_LISTENER_MAX_ID, /*listener*/ this, /*sendPartials*/ true);

}
As for the inner class CameraDeviceCallbacks in CameraDeviceImpl.java: in its onResultReceived callback it looks up, via mCaptureCallbackMap, the callback (CameraDeviceImpl.CaptureCallback) that was passed in when preview or capture was started. That CameraDeviceImpl.CaptureCallback is actually a wrapper around the CameraCaptureSession.CaptureCallback supplied by the application, created by createCaptureCallbackProxy. So when a frame of metadata becomes available, the chain of callbacks ultimately invokes the CameraCaptureSession.CaptureCallback's onCaptureProgressed and onCaptureCompleted methods, delivering the metadata to the application.
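On the application side, that final hop lands in a CameraCaptureSession.CaptureCallback. A minimal sketch of such a callback (the logging here is illustrative, not code from the Camera app) looks like this:

import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.util.Log;

public class PreviewResultCallback extends CameraCaptureSession.CaptureCallback {
    private static final String TAG = "PreviewResultCallback";

    @Override
    public void onCaptureProgressed(CameraCaptureSession session,
            CaptureRequest request, CaptureResult partialResult) {
        // Partial metadata forwarded from CameraDeviceCallbacks.onResultReceived.
        Log.d(TAG, "partial result, frame " + partialResult.getFrameNumber());
    }

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
            CaptureRequest request, TotalCaptureResult result) {
        // Full metadata for this frame, e.g. the timestamp reported by the sensor.
        Long timestamp = result.get(CaptureResult.SENSOR_TIMESTAMP);
        Log.d(TAG, "capture completed, sensor timestamp = " + timestamp);
    }
}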

Camera2ClientBase.cpp

status_t Camera2ClientBase<TClientBase>::initialize(CameraModule *module) {

    // mDevice here is the Camera3Device instance.

    res = mDevice->initialize(module);

}
Camera3Device.cpp

status_t Camera3Device::initialize(CameraModule *module) {

    // Call CameraModule's open method to open the HAL device; from here on we are in the HAL
    // layer. The HAL device corresponds to the struct type camera3_device_t. module is the
    // CameraModule instance that was created the first time CameraService was referenced, in
    // CameraService::onFirstRef() (mModule = new CameraModule(rawModule)); that is part of
    // CameraService startup.

    res = module->open(deviceName.string(),
            reinterpret_cast<hw_device_t**>(&device));

    // Initialize the HAL device.

    res = device->ops->initialize(device, this);

    // Create the buffer manager.

    mBufferManager = new Camera3BufferManager();

    res = find_camera_metadata_ro_entry(info.static_camera_characteristics,
            ANDROID_CONTROL_AE_LOCK_AVAILABLE, &aeLockAvailableEntry);

    // Start the request queue thread; once run() is called, its threadLoop executes.

    mRequestThread = new RequestThread(this, mStatusTracker, device, aeLockAvailable);

    res = mRequestThread->run(String8::format("C3Dev-%d-ReqQueue", mId).string());

    // Create the stream-preparer thread. It is not run immediately; it is started on demand
    // when Camera3Device's prepare method is called. When is prepare called? I did not trace
    // it with logs; one likely case is when a session is created and buffers are preallocated.

    mPreparerThread = new PreparerThread();

}
On Android O, opening the HAL device is done through CameraProviderManager instead.

-------------------------------------------------------------------------------------------------------------------

At this point the camera device has been opened. The next step is starting the preview.

Back to the application layer, in CaptureModule.java.

The camera-open flow above started from the open call in openCameraAndStartPreview. Once the camera has opened successfully, onCameraOpened is called back, and in that callback the preview is started via camera.startPreview.

private void openCameraAndStartPreview() {
    mOneCameraOpener.open(cameraId, captureSetting, mCameraHandler, mainThread,
            imageRotationCalculator, mBurstController, mSoundPlayer,
            new OpenCallback() {
                @Override
                public void onCameraOpened(@Nonnull final OneCamera camera) {
                    mCamera = camera;
                    updatePreviewBufferDimension();
                    updatePreviewBufferSize();
                    camera.startPreview(new Surface(getPreviewSurfaceTexture()),
                            new CaptureReadyCallback() {
                                @Override
                                public void onReadyForCapture() {
                                    // Starting the preview first requires creating the capture
                                    // session. If the session is created successfully, this is
                                    // called back: the preview is ready and we can prepare to
                                    // take pictures.
                                    mMainThread.execute(new Runnable() {
                                        public void run() {
                                            onPreviewStarted();
                                            onReadyStateChanged(true);
                                        }
                                    });
                                }
                            });
                }
            }, ...);
}

OneCameraImpl.java

public void startPreview(Surface previewSurface, CaptureReadyCallback listener) {
    setupAsync(mPreviewSurface, listener);
}

Start the capture session asynchronously:

private void setupAsync(final Surface previewSurface, final CaptureReadyCallback listener) {
    mCameraHandler.post(new Runnable() {
        @Override
        public void run() {
            setup(previewSurface, listener);
        }
    });
}

private void setup(Surface previewSurface, final CaptureReadyCallback listener) {
    mDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
        public void onConfigured(CameraCaptureSession session) {
            mCaptureSession = session;
            boolean success = repeatingPreview(null);
            if (success) {
                listener.onReadyForCapture();
            }
        }
    });
}

Session creation goes to createCaptureSession in CameraDeviceImpl.java, which then calls configureStreamsChecked to configure the streams. Whether the session is "created successfully" really means whether the input/output streams were configured successfully. If they were, the device blocks until idle and StateCallbackKK.onIdle() is called back; the configuration can also fail, for example when a format or size is not supported, in which case StateCallbackKK.onUnconfigured() is called back. Either way a new CameraCaptureSessionImpl instance is created. If the configuration succeeded, onConfigured of the CameraCaptureSession.StateCallback above is called back with the CameraCaptureSessionImpl instance as its argument, which is stored in OneCameraImpl.java as mCaptureSession = session. In other words, only after a successful configuration is the preview request (repeatingPreview) actually issued.
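For reference, a minimal sketch of this configuration step using only the public API could look like the following; the surfaces and handler are placeholders rather than the app's actual fields:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.os.Handler;
import android.view.Surface;
import java.util.Arrays;

public class SessionSetupSketch {
    // Configures the output streams; onConfigured corresponds to a successful
    // stream configuration, onConfigureFailed to an unsupported configuration.
    public static void createSession(CameraDevice device, Surface previewSurface,
            Surface captureSurface, Handler handler) throws CameraAccessException {
        device.createCaptureSession(Arrays.asList(previewSurface, captureSurface),
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        // Streams configured: the session can now accept requests,
                        // e.g. a repeating preview request.
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) {
                        // Stream configuration failed (e.g. unsupported size/format).
                    }
                }, handler);
    }
}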

OneCameraImpl.java

private boolean repeatingPreview(Object tag) {

    CaptureRequest.Builder builder = mDevice.
            createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

    mCaptureSession.setRepeatingRequest(builder.build(), mCaptureCallback,
            mCameraHandler);

}

If the request builds successfully, we are ready to take pictures.
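Note that the excerpt above does not show the step of adding output targets to the builder, and a capture request needs at least one target surface. A minimal self-contained sketch of issuing a repeating preview request (the surface, callback, and handler names are placeholders) could look like this:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

public class RepeatingPreviewSketch {
    // Builds a TEMPLATE_PREVIEW request targeting the preview surface and submits
    // it as a repeating request on the already-configured session.
    public static void startRepeatingPreview(CameraDevice device,
            CameraCaptureSession session, Surface previewSurface,
            CameraCaptureSession.CaptureCallback callback, Handler handler)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);   // the surface frames are rendered into
        session.setRepeatingRequest(builder.build(), callback, handler);
    }
}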

CameraCaptureSessionImpl.java

public synchronized int setRepeatingRequest(CaptureRequest request, CaptureCallback callback,
        Handler handler) throws CameraAccessException {

    // The requestId of the submitted request is queued here. Creating a session has to
    // configure the camera device's internal pipeline and allocate memory buffers, which is
    // time-consuming, so the capture request is queued first and capture starts once the
    // session is ready.

    return addPendingSequence(mDeviceImpl.setRepeatingRequest(request,
            createCaptureCallbackProxy(handler, callback), mDeviceHandler));

}

The argument createCaptureCallbackProxy(handler, callback) sets up the bridge from CameraDeviceImpl.CaptureCallback to CameraCaptureSession.CaptureCallback; its most important method is onCaptureCompleted. The callback here is mCaptureCallback in OneCameraImpl.java.

Let's continue with how the request is created and submitted.

CameraDeviceImpl.java

public int setRepeatingRequest(CaptureRequest request, CaptureCallback callback,
        Handler handler) throws CameraAccessException {

    return submitCaptureRequest(requestList, callback, handler, /*streaming*/ true);

}

private int submitCaptureRequest(List<CaptureRequest> requestList, CaptureCallback callback,
        Handler handler, boolean repeating) throws CameraAccessException {

    requestInfo = mRemoteDevice.submitRequestList(requestArray, repeating);

    if (callback != null) {
        mCaptureCallbackMap.put(requestInfo.getRequestId(),
                new CaptureCallbackHolder(
                        callback, requestList, handler, repeating, mNextSessionId - 1));
    }
}

The request is submitted through mRemoteDevice, an instance of ICameraDeviceUserWrapper.

ICameraDeviceUserWrapper.java

public SubmitInfo submitRequest(CaptureRequest request, boolean streaming) {

    return mRemoteDevice.submitRequest(request, streaming);

}

Here mRemoteDevice is of type ICameraDeviceUser; the instance is the one returned by cameraService's connectDevice method.

As noted earlier, ICameraDeviceUser corresponds to CameraDeviceClient, and CameraDeviceClient corresponds to the inner Client type of CameraService.

Both ICameraDeviceUser.java and ICameraDeviceUser.cpp are auto-generated from the aidl file.

In this way the request crosses process boundaries via aidl, from ICameraDeviceUser.java over to CameraDeviceClient.cpp, establishing the link with cameraservice.

Back to how submitCaptureRequest handles the callback: it wraps the callback and stores it in mCaptureCallbackMap keyed by the requestId. So when does this callback get invoked?

As mentioned when discussing the initialization of CameraDeviceClient.cpp, mFrameProcessor is an output-frame metadata processing thread that handles preview-callback work. It waits for new frames from the camera device and then calls the listener method onResultAvailable, i.e. CameraDeviceClient::onResultAvailable, which invokes the callback remoteCb->onResultReceived(result.mMetadata, result.mResultExtras). remoteCb is of type hardware::camera2::ICameraDeviceCallbacks; the callback instance was created in CameraManager.java when the camera device was opened and passed all the way down through CameraService's connectDevice to CameraDeviceClient, so the callback's actual implementation is:

the onResultReceived method of the inner class CameraDeviceCallbacks in CameraDeviceImpl.java.

Let's look at the onResultReceived method in CameraDeviceImpl.java:

CameraDeviceImpl.java

public void onResultReceived(CameraMetadataNative result,
        CaptureResultExtras resultExtras) throws RemoteException {

    // Get the holder for this requestId.

    int requestId = resultExtras.getRequestId();

    final CaptureCallbackHolder holder =
            CameraDeviceImpl.this.mCaptureCallbackMap.get(requestId);

    final CaptureRequest request = holder.getRequest(resultExtras.getSubsequenceId());

    // Get the callback from the holder and invoke it, passing in the result data resultAsCapture.

    holder.getCallback().onCaptureProgressed(CameraDeviceImpl.this,
            request, resultAsCapture);

    holder.getCallback().onCaptureCompleted(CameraDeviceImpl.this,
            request, resultAsCapture);

}
In this way the data from the lower layers reaches the framework and then the application layer.

Alternatively, the image data can also be obtained through an ImageReader.

When the camera instance is created, an ImageReader object is obtained and its listener is set. When a new image becomes available, its onImageAvailable callback fires, and inside that callback the image data is read and stored.


OneCameraImpl.java
private final ImageReader mCaptureImageReader;
    ImageReader.OnImageAvailableListener mCaptureImageListener =
            new ImageReader.OnImageAvailableListener() {
                @Override
                public void onImageAvailable(ImageReader reader) {
                    // Add the image data to the latest in-flight capture.
                    // If all the data for that capture is complete, store the
                    // image data.
                    InFlightCapture capture = null;
                    synchronized (mCaptureQueue) {
                        if (mCaptureQueue.getFirst().setImage(reader.acquireLatestImage())
                                .isCaptureComplete()) {
                            capture = mCaptureQueue.removeFirst();
                        }
                    }
                    if (capture != null) {
                        onCaptureCompleted(capture);
                    }
                }
            };

Create the ImageReader instance and set the listener.

OneCameraImpl(CameraDevice device, CameraCharacteristics characteristics, Size pictureSize) {
    mCaptureImageReader = ImageReader.newInstance(pictureSize.getWidth(),
            pictureSize.getHeight(),
            sCaptureImageFormat, 2);
    mCaptureImageReader.setOnImageAvailableListener(mCaptureImageListener,
            mCameraHandler);
}
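For the ImageReader to actually receive images, its surface has to be one of the session's output surfaces and a target of the capture request. A minimal sketch of that wiring follows; the reader size, format, and template here are illustrative assumptions, not the app's code.

import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.media.ImageReader;
import android.view.Surface;

public class ImageReaderWiringSketch {
    // Creates a JPEG ImageReader and builds a still-capture request that targets it,
    // so that onImageAvailable fires once the capture produces an image.
    public static CaptureRequest buildStillCaptureRequest(CameraDevice device,
            int width, int height) throws CameraAccessException {
        ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 2);
        Surface readerSurface = reader.getSurface(); // must also be passed to createCaptureSession
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        builder.addTarget(readerSurface);
        return builder.build();
    }
}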
 


When a capture is complete, onCaptureCompleted is invoked.

OneCameraImpl.java

private void onCaptureCompleted(InFlightCapture capture) {
 
        // Experimental support for writing RAW. We do not have a usable JPEG
        // here, so we don't use the usual capture session mechanism and instead
        // just store the RAW file in its own directory.
        // TODO: If we make this a real feature we should probably put the DNGs
        // into the Camera directly.
        // The RAW (DNG) data can be stored here.
        if (sCaptureImageFormat == ImageFormat.RAW_SENSOR) {
            if (!RAW_DIRECTORY.exists()) {
                if (!RAW_DIRECTORY.mkdirs()) {
                    throw new RuntimeException("Could not create RAW directory.");
                }
            }
            File dngFile = new File(RAW_DIRECTORY, capture.session.getTitle() + ".dng");
            writeDngBytesAndClose(capture.image, capture.totalCaptureResult,
                    mCharacteristics, dngFile);
        } else {
            // Or a JPEG can be stored instead.
            // Since this is not an HDR+ session, we will just save the
            // result.
            byte[] imageBytes = acquireJpegBytesAndClose(capture.image);
            saveJpegPicture(imageBytes, capture.parameters, capture.session,
                    capture.totalCaptureResult);
        }
        broadcastReadyState(true);
        capture.parameters.callback.onPictureTaken(capture.session);
}

writeDngBytesAndClose is called to write out the DNG data:

private static void writeDngBytesAndClose(Image image, TotalCaptureResult captureResult,
        CameraCharacteristics characteristics, File dngFile) {
    try (DngCreator dngCreator = new DngCreator(characteristics, captureResult);
            FileOutputStream outputStream = new FileOutputStream(dngFile)) {
        // TODO: Add DngCreator#setThumbnail and add the DNG to the normal
        // filmstrip.
        dngCreator.writeImage(outputStream, image);
        outputStream.close();
        image.close();
    } catch (IOException e) {
        Log.e(TAG, "Could not store DNG file", e);
        return;
    }
    Log.i(TAG, "Successfully stored DNG file: " + dngFile.getAbsolutePath());
}
