From the Android CameraHal class-diagram analysis we know that the channel through which CameraAdapter interacts with the other classes is FrameNotifier, so FrameNotifier is where we start.
FrameNotifier inherits from the interface class MessageNotifier, so CameraAdapter has to implement every interface of both classes. The foundation of these calls is enableMsgType, which is used to enable message types and register the corresponding callbacks.
1. enableMsgType
As the diagram above shows, the interface function enableMsgType declared in MessageNotifier is ultimately implemented by BaseCameraAdapter, while the MessageNotifier interface is referenced by EventProvider. EventProvider in turn is referenced by two classes — AppCallbackNotifier and CameraHal. EventProvider is essentially analogous to FrameProvider.
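For orientation, here is a minimal sketch of how the two interfaces relate, reconstructed from the calls seen later in this walkthrough (the exact declarations live in CameraHal.h and may differ in detail):

class MessageNotifier
{
public:
    // Bit-field positions used when frame and event masks share one 32-bit word
    static const uint32_t EVENT_BIT_FIELD_POSITION;
    static const uint32_t FRAME_BIT_FIELD_POSITION;

    // Enable a set of message types and register the callbacks to fire for them
    virtual void enableMsgType(int32_t msgs, frame_callback frameCb = NULL,
                               event_callback eventCb = NULL, void *cookie = NULL) = 0;
    virtual void disableMsgType(int32_t msgs, void *cookie) = 0;

    virtual ~MessageNotifier() {};
};

class FrameNotifier : public MessageNotifier
{
public:
    // Frames handed out with *_SYNC types must be returned so the camera can refill them
    virtual void returnFrame(void *frameBuf, CameraFrame::FrameType frameType) = 0;

    virtual ~FrameNotifier() {};
};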
Step 1.
When the HAL's setParameters() runs, setEventProvider() is called once the following condition is met:
params.get(ExCameraParameters::KEY_TEMP_BRACKETING) != NULL (stored in valstr) && strcmp(valstr, ExCameraParameters::BRACKET_ENABLE) == 0,
in other words, it is only used when the BRACKETING feature is enabled.
BRACKETING: capturing the same scene with different exposure settings, for later image composition. Take a vast grassland with mountains in the distance: shooting everything with one set of exposure parameters is bound to give a poor result, but if the nearby grassland and the distant mountains are shot with different exposure parameters and then composited, both can be rendered well.
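A hedged sketch of what that check inside CameraHal::setParameters() looks like, based only on the condition quoted above (the surrounding code is omitted and the exact structure may differ):

const char *valstr = params.get(ExCameraParameters::KEY_TEMP_BRACKETING);
if ( (NULL != valstr) && (strcmp(valstr, ExCameraParameters::BRACKET_ENABLE) == 0) ) {
    // Temporal bracketing requested by the application: remember it and hook up
    // the event path so focus events can trigger startImageBracketing() later.
    mBracketingEnabled = true;
    setEventProvider(CameraHalEvent::ALL_EVENTS, mCameraAdapter);
}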
setEventProvider() is called with ALL_EVENTS and mCameraAdapter as its arguments.
ALL_EVENTS is defined in the CameraHalEventType enum:

enum CameraHalEventType {
    NO_EVENTS = 0x0,
    EVENT_FOCUS_LOCKED = 0x1,
    EVENT_FOCUS_ERROR = 0x2,
    EVENT_ZOOM_INDEX_REACHED = 0x4,
    EVENT_SHUTTER = 0x8,
    EVENT_FACE = 0x10,
    ///@remarks Future enum related to display, like frame displayed event, could be added here
    ALL_EVENTS = 0xFFFF ///Maximum of 16 event types supported
};
Steps 2–4.
In setEventProvider(), an EventProvider is constructed and registered with ALL_EVENTS.
void CameraHal::setEventProvider(int32_t eventMask, MessageNotifier *eventNotifier)
{
    if ( NULL != mEventProvider ) {
        mEventProvider->disableEventNotification(CameraHalEvent::ALL_EVENTS);
        delete mEventProvider;
        mEventProvider = NULL;
    }

    mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
    if ( NULL == mEventProvider ) {
        CAMHAL_LOGEA("Error in creating EventProvider");
    } else {
        mEventProvider->enableEventNotification(eventMask);
    }
}
mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
The first argument is mCameraAdapter cast to the MessageNotifier type, and the third is eventCallbackRelay from the HAL. There is nothing special here — the constructor only performs initialization assignments.
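EventProvider itself is little more than a holder for these three values; roughly (a sketch based on how it is used here, not a verbatim copy of the header):

class EventProvider
{
public:
    EventProvider(MessageNotifier *eventNotifier, void *cookie, event_callback eventCallback)
        : mEventNotifier(eventNotifier), mCookie(cookie), mEventCallback(eventCallback) {}

    int enableEventNotification(int32_t eventTypes);
    int disableEventNotification(int32_t eventTypes);

private:
    MessageNotifier *mEventNotifier;  // the CameraAdapter, seen through its MessageNotifier interface
    void            *mCookie;         // here: the CameraHal (or AppCallbackNotifier) instance
    event_callback   mEventCallback;  // here: eventCallbackRelay
};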
mEventProvider->enableEventNotification(eventMask);
int EventProvider::enableEventNotification(int32_t frameTypes)
{
    status_t ret = NO_ERROR;

    ///Enable the frame notification to CameraAdapter (which implements FrameNotifier interface)
    mEventNotifier->enableMsgType(frameTypes << MessageNotifier::EVENT_BIT_FIELD_POSITION,
                                  NULL,
                                  mEventCallback,
                                  mCookie);
    return ret;
}
mEventNotifier is the cast of mCameraAdapter, so this enableMsgType call ultimately lands in the implementation in BaseCameraAdapter (the shift by MessageNotifier::EVENT_BIT_FIELD_POSITION is presumably there to keep event bits distinct from frame bits in the shared message mask).
5. enableMsgType in BaseCameraAdapter
void BaseCameraAdapter::enableMsgType(int32_t msgs, frame_callback callback, event_callback eventCb, void *cookie)
{
    Mutex::Autolock lock(mSubscriberLock);

    LOG_FUNCTION_NAME;

    if ( CameraFrame::PREVIEW_FRAME_SYNC == msgs ) {
        mFrameSubscribers.add((int) cookie, callback);
    } else if ( CameraFrame::FRAME_DATA_SYNC == msgs ) {
        mFrameDataSubscribers.add((int) cookie, callback);
    } else if ( CameraFrame::IMAGE_FRAME == msgs) {
        mImageSubscribers.add((int) cookie, callback);
    } else if ( CameraFrame::RAW_FRAME == msgs) {
        mRawSubscribers.add((int) cookie, callback);
    } else if ( CameraFrame::VIDEO_FRAME_SYNC == msgs) {
        mVideoSubscribers.add((int) cookie, callback);
    } else if ( CameraHalEvent::ALL_EVENTS == msgs) {
        mFocusSubscribers.add((int) cookie, eventCb);
        mShutterSubscribers.add((int) cookie, eventCb);
        mZoomSubscribers.add((int) cookie, eventCb);
        mFaceSubscribers.add((int) cookie, eventCb);
    } else {
        CAMHAL_LOGEA("Message type subscription no supported yet!");
    }

    LOG_FUNCTION_NAME_EXIT;
}
Note that the CameraHalEvent types behind ALL_EVENTS are not distinguished individually here the way the CameraFrame types are — ALL_EVENTS subscribes the callback to the focus, shutter, zoom and face lists all at once.
Now let's look back at the callback implementation:
void CameraHal::eventCallback(CameraHalEvent *event)
{
    if ( NULL != event ) {
        switch( event->mEventType ) {
            case CameraHalEvent::EVENT_FOCUS_LOCKED:
            case CameraHalEvent::EVENT_FOCUS_ERROR:
            {
                if ( mBracketingEnabled ) {
                    startImageBracketing();
                }
                break;
            }
            default:
            {
                break;
            }
        };
    }
}
The callback handles two events, EVENT_FOCUS_LOCKED and EVENT_FOCUS_ERROR, with startImageBracketing(). Since we will rarely use the Bracketing feature, we won't dig into it here.
That wraps up the event handling initiated from the HAL. To recap: in setParameters(), if the Bracketing feature is enabled, a new EventProvider is created and the callback is registered with BaseCameraAdapter using ALL_EVENTS. When an event occurs, the CameraAdapter invokes the callback, and if the event type is EVENT_FOCUS_LOCKED or EVENT_FOCUS_ERROR, the handler startImageBracketing() is called.
Events registered by AppCallbackNotifier follow exactly the same flow as those initiated by CameraHal — they even go through the same CameraAdapter. The only difference is the callback: AppCallbackNotifier's handler is eventCallbackRelay → eventCallback:
void AppCallbackNotifier::eventCallback(CameraHalEvent *chEvt)
{
    ///Post the event to the event queue of AppCallbackNotifier
    MSGUTILS::Message msg;
    CameraHalEvent *event;

    LOG_FUNCTION_NAME;

    if ( NULL != chEvt ) {
        event = new CameraHalEvent(*chEvt);
        if ( NULL != event ) {
            msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT;
            msg.arg1 = event;

            {
                Mutex::Autolock lock(mLock);
                mEventQ.put(&msg);
            }
        } else {
            CAMHAL_LOGEA("Not enough resources to allocate CameraHalEvent");
        }
    }

    LOG_FUNCTION_NAME_EXIT;
}
The event is posted onto the message queue mEventQ and consumed by the processing thread AppCallbackNotifier::notificationThread():
bool AppCallbackNotifier::notificationThread()
{
    bool shouldLive = true;
    status_t ret;

    LOG_FUNCTION_NAME;

    //CAMHAL_LOGDA("Notification Thread waiting for message");
    ret = MSGUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
                                             &mEventQ,
                                             &mFrameQ,
                                             AppCallbackNotifier::NOTIFIER_TIMEOUT);
    //CAMHAL_LOGDA("Notification Thread received message");

    if (mNotificationThread->msgQ().hasMsg()) {
        ///Received a message from CameraHal, process it
        CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
        shouldLive = processMessage();
        if(!shouldLive) {
            CAMHAL_LOGDA("Notification Thread exiting.");
        }
    }

    if(mEventQ.hasMsg()) {
        ///Received an event from one of the event providers
        CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
        notifyEvent();
    }

    if(mFrameQ.hasMsg()) {
        ///Received a frame from one of the frame providers
        //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
        notifyFrame();
    }

    LOG_FUNCTION_NAME_EXIT;
    return shouldLive;
}
Messages on mEventQ are handled by notifyEvent():
void AppCallbackNotifier::notifyEvent()
{
    ///Receive and send the event notifications to app
    MSGUTILS::Message msg;

    LOG_FUNCTION_NAME;

    {
        Mutex::Autolock lock(mLock);
        mEventQ.get(&msg);
    }

    bool ret = true;
    CameraHalEvent *evt = NULL;
    CameraHalEvent::FocusEventData *focusEvtData;
    CameraHalEvent::ZoomEventData *zoomEvtData;
    CameraHalEvent::FaceEventData faceEvtData;

    if(mNotifierState != AppCallbackNotifier::NOTIFIER_STARTED) {
        return;
    }

    switch(msg.command) {
        case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT:

            evt = ( CameraHalEvent * ) msg.arg1;

            if ( NULL == evt ) {
                CAMHAL_LOGEA("Invalid CameraHalEvent");
                return;
            }

            switch(evt->mEventType) {
                case CameraHalEvent::EVENT_SHUTTER:

                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb ) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_SHUTTER) ) ) {
                        mNotifyCb(CAMERA_MSG_SHUTTER, 0, 0, mCallbackCookie);
                    }
                    mRawAvailable = false;

                    break;

                case CameraHalEvent::EVENT_FOCUS_LOCKED:
                case CameraHalEvent::EVENT_FOCUS_ERROR:

                    focusEvtData = &evt->mEventData->focusEvent;
                    if ( ( focusEvtData->focusLocked ) &&
                         ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb ) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS) ) ) {
                        mNotifyCb(CAMERA_MSG_FOCUS, true, 0, mCallbackCookie);
                        mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
                    } else if ( focusEvtData->focusError &&
                                ( NULL != mCameraHal ) &&
                                ( NULL != mNotifyCb ) &&
                                ( mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS) ) ) {
                        mNotifyCb(CAMERA_MSG_FOCUS, false, 0, mCallbackCookie);
                        mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
                    }

                    break;

                case CameraHalEvent::EVENT_ZOOM_INDEX_REACHED:

                    zoomEvtData = &evt->mEventData->zoomEvent;
                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_ZOOM) ) ) {
                        mNotifyCb(CAMERA_MSG_ZOOM,
                                  zoomEvtData->currentZoomIndex,
                                  zoomEvtData->targetZoomIndexReached,
                                  mCallbackCookie);
                    }

                    break;

                case CameraHalEvent::EVENT_FACE:

                    faceEvtData = evt->mEventData->faceEvent;
                    if ( ( NULL != mCameraHal ) &&
                         ( NULL != mNotifyCb) &&
                         ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_METADATA) ) ) {
                        // WA for an issue inside CameraService
                        camera_memory_t *tmpBuffer = mRequestMemory(-1, 1, 1, NULL);

                        mDataCb(CAMERA_MSG_PREVIEW_METADATA,
                                tmpBuffer,
                                0,
                                faceEvtData->getFaceResult(),
                                mCallbackCookie);

                        faceEvtData.clear();

                        if ( NULL != tmpBuffer ) {
                            tmpBuffer->release(tmpBuffer);
                        }
                    }

                    break;

                case CameraHalEvent::ALL_EVENTS:
                    break;
                default:
                    break;
            }

            break;
    }

    if ( NULL != evt ) {
        delete evt;
    }
}
case CameraHalEvent::EVENT_SHUTTER:
case CameraHalEvent::EVENT_FOCUS_LOCKED:
case CameraHalEvent::EVENT_FOCUS_ERROR:
case CameraHalEvent::EVENT_ZOOM_INDEX_REACHED:
all of these deliver their notifications through mNotifyCb, while
case CameraHalEvent::EVENT_FACE:
uses mDataCb, because the face data has to be passed up to the application layer.
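For reference, mNotifyCb and mDataCb correspond to the standard camera HAL callback types declared in hardware/camera.h for this generation of Android (quoted from memory, so treat the exact signatures as approximate): a notify callback carries only integer arguments, while a data callback carries a camera_memory_t buffer plus optional metadata.

// Simple notification: message type plus two integer extras (e.g. zoom index, focus status)
typedef void (*camera_notify_callback)(int32_t msg_type,
                                       int32_t ext1,
                                       int32_t ext2,
                                       void *user);

// Data notification: carries an actual buffer (preview frame, JPEG, face metadata, ...)
typedef void (*camera_data_callback)(int32_t msg_type,
                                     const camera_memory_t *data,
                                     unsigned int index,
                                     camera_frame_metadata_t *metadata,
                                     void *user);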
Those are all of the notifyEvent event types. Now, where are the frame types defined? The complete list is:

enum FrameType {
    PREVIEW_FRAME_SYNC = 0x1, ///SYNC implies that the frame needs to be explicitly returned after consuming in order to be filled by camera again
    PREVIEW_FRAME = 0x2,      ///Preview frame includes viewfinder and snapshot frames
    IMAGE_FRAME_SYNC = 0x4,   ///Image Frame is the image capture output frame
    IMAGE_FRAME = 0x8,
    VIDEO_FRAME_SYNC = 0x10,  ///Timestamp will be updated for these frames
    VIDEO_FRAME = 0x20,
    FRAME_DATA_SYNC = 0x40,   ///Any extra data assosicated with the frame. Always synced with the frame
    FRAME_DATA = 0x80,
    RAW_FRAME = 0x100,
    SNAPSHOT_FRAME = 0x200,
    ALL_FRAMES = 0xFFFF       ///Maximum of 16 frame types supported
};
Of these, the types actually used are FRAME_DATA_SYNC, IMAGE_FRAME, RAW_FRAME, PREVIEW_FRAME_SYNC and VIDEO_FRAME_SYNC, enabled in the four places examined below.
Place 1: FRAME_DATA_SYNC = 0x40, ///Any extra data assosicated with the frame. Always synced with the frame
When Measurements is enabled, the complete data is copied into mPreviewBufs[] as-is, without the 2D-to-1D conversion.
void AppCallbackNotifier::setMeasurements(bool enable)
{
    Mutex::Autolock lock(mLock);

    LOG_FUNCTION_NAME;

    mMeasurementEnabled = enable;

    if ( enable ) {
        mFrameProvider->enableFrameNotification(CameraFrame::FRAME_DATA_SYNC);
    }

    LOG_FUNCTION_NAME_EXIT;
}
else if ( ( CameraFrame::FRAME_DATA_SYNC == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) &&
          ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
    copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
}
void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame* frame, int32_t msgType)
{
    // (excerpt — earlier part of the function omitted)
    dest = (void*) mPreviewBufs[mPreviewBufCount];

    CAMHAL_LOGVB("%d:copy2Dto1D(%p, %p, %d, %d, %d, %d, %d, %d,%s)",
                 __LINE__,
                 NULL, //buf,
                 frame->mBuffer,
                 frame->mWidth,
                 frame->mHeight,
                 frame->mPixelFmt,
                 frame->mAlignment,
                 2,
                 frame->mLength,
                 mPreviewPixelFormat);

    if ( NULL != dest ) {
        // data sync frames don't need conversion
        if (CameraFrame::FRAME_DATA_SYNC == frame->mFrameType) {
            if ( (mPreviewMemory->size / MAX_BUFFERS) >= frame->mLength ) {
                memcpy(dest, (void*) src, frame->mLength);
            } else {
                memset(dest, 0, (mPreviewMemory->size / MAX_BUFFERS));
            }
        } else {
            if ((NULL == (void*)frame->mYuv[0]) || (NULL == (void*)frame->mYuv[1])) {
                CAMHAL_LOGEA("Error! One of the YUV Pointer is NULL");
                goto exit;
            } else {
                copy2Dto1D(dest,
                           frame->mYuv,
                           frame->mWidth,
                           frame->mHeight,
                           frame->mPixelFmt,
                           frame->mAlignment,
                           frame->mOffset,
                           2,
                           frame->mLength,
                           mPreviewPixelFormat);
            }
        }
    }
    // (remainder of the function omitted in the original excerpt)
Place 2: only the IMAGE_FRAME and RAW_FRAME types are registered.
void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
{
    LOG_FUNCTION_NAME;

    ///@remarks There is no NULL check here. We will check
    ///for NULL when we get the start command from CameraAdapter
    mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
    if ( NULL == mFrameProvider ) {
        CAMHAL_LOGEA("Error in creating FrameProvider");
    } else {
        //Register only for captured images and RAW for now
        //TODO: Register for and handle all types of frames
        mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
        mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
    }

    LOG_FUNCTION_NAME_EXIT;
}
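FrameProvider::enableFrameNotification is the frame-side twin of EventProvider::enableEventNotification shown earlier; it presumably looks roughly like this sketch, passing the frame callback instead of the event callback and shifting by FRAME_BIT_FIELD_POSITION:

int FrameProvider::enableFrameNotification(int32_t frameTypes)
{
    status_t ret = NO_ERROR;

    ///Enable the frame notification to CameraAdapter (which implements FrameNotifier interface)
    mFrameNotifier->enableMsgType(frameTypes << MessageNotifier::FRAME_BIT_FIELD_POSITION,
                                  mFrameCallback,   // frame callback slot instead of event callback
                                  NULL,
                                  mCookie);
    return ret;
}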
IMAGE_FRAME means JPEG-encoded data is required, and there are two paths. If the following condition is met:

else if ( (CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
          (NULL != mCameraHal) &&
          (NULL != mDataCb) &&
          ((CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks) ||
           (CameraFrame::ENCODE_RAW_RGB24_TO_JPEG & frame->mQuirks) ||
           (CameraFrame::ENCODE_RAW_YUV420SP_TO_JPEG & frame->mQuirks)) )
then a new encoder thread is created, the YUV or RGB data is handed to it, and AppCallbackNotifierEncoderCallback is passed in as the completion callback. When encoding finishes, that callback invokes AppCallbackNotifier::EncoderDoneCb(), which in turn calls mDataCb(CAMERA_MSG_COMPRESSED_IMAGE, picture, 0, NULL, mCallbackCookie); to deliver the data back to the application.
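Put differently, once the encoder thread finishes, the completion path roughly does the following (a simplified sketch; the signature and local names are illustrative — only mRequestMemory, mDataCb, mCallbackCookie, CAMERA_MSG_COMPRESSED_IMAGE and EncoderDoneCb come from the code and prose above):

// Sketch of what AppCallbackNotifier::EncoderDoneCb() boils down to
void AppCallbackNotifier::EncoderDoneCb(void *jpegBuf, size_t jpegSize)   // illustrative signature
{
    // Wrap the finished JPEG in a camera_memory_t and hand it to the application
    camera_memory_t *picture = mRequestMemory(-1, jpegSize, 1, NULL);
    if ( NULL != picture ) {
        memcpy(picture->data, jpegBuf, jpegSize);
        mDataCb(CAMERA_MSG_COMPRESSED_IMAGE, picture, 0, NULL, mCallbackCookie);
        picture->release(picture);
    }
}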
If the condition above is not met and the frame only satisfies:

else if ( ( CameraFrame::IMAGE_FRAME == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) )

then the frame does not need encoding, and copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_IMAGE); is called directly to deliver the data:
void AppCallbackNotifier::copyAndSendPictureFrame(CameraFrame* frame, int32_t msgType)
{
    camera_memory_t *picture = NULL;
    void *dest = NULL, *src = NULL;

    // scope for lock
    {
        picture = mRequestMemory(-1, frame->mLength, 1, NULL);

        if (NULL != picture) {
            dest = picture->data;
            if (NULL != dest) {
                src = (void *) ((unsigned int) frame->mBuffer + frame->mOffset);
                memcpy(dest, src, frame->mLength);
            }
        }
    }

exit:
    mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);

    if (picture) {
        if ((mNotifierState == AppCallbackNotifier::NOTIFIER_STARTED) &&
            mCameraHal->msgTypeEnabled(msgType)) {
            mDataCb(msgType, picture, 0, NULL, mCallbackCookie);
        }
        picture->release(picture);
    }
}
Note that both paths end up calling mFrameProvider->returnFrame() to recycle the buffer.
For the RAW_FRAME type, the handling in notifyFrame is simpler: it just delivers the data and then recycles the buffer through returnFrame:
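returnFrame() itself simply forwards the consumed buffer back to the CameraAdapter through the FrameNotifier interface, which is what lets the adapter refill it; roughly (a sketch, mirroring the EventProvider/FrameProvider pattern above):

void FrameProvider::returnFrame(void *frameBuf, CameraFrame::FrameType frameType)
{
    // Hand the buffer back to the CameraAdapter (the FrameNotifier) for reuse
    mFrameNotifier->returnFrame(frameBuf, frameType);
}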
if ( (CameraFrame::RAW_FRAME == frame->mFrameType ) &&
     ( NULL != mCameraHal ) &&
     ( NULL != mDataCb) &&
     ( NULL != mNotifyCb ) ) {

    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE) ) {
#ifdef COPY_IMAGE_BUFFER
        copyAndSendPictureFrame(frame, CAMERA_MSG_RAW_IMAGE);
#else
        //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
#endif
    } else {
        if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY) ) {
            mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY, 0, 0, mCallbackCookie);
        }
        mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);
    }

    mRawAvailable = true;
}
Place 3: CameraFrame::PREVIEW_FRAME_SYNC is used for the preview callback.
status_t AppCallbackNotifier::startPreviewCallbacks(CameraParameters &params, void *buffers,
                                                    uint32_t *offsets, int fd, size_t length, size_t count)
{
    // (excerpt)
    if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME) ) {
        mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC);
    }
}
In notifyFrame, if Measurement is not enabled the preview frame is copied and sent up; if Measurement is enabled, the buffer is simply returned (measurement data is delivered through the FRAME_DATA_SYNC path instead).
else if ( ( CameraFrame::PREVIEW_FRAME_SYNC == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          ( NULL != mDataCb) &&
          ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) ) {
    //When enabled, measurement data is sent instead of video data
    if ( !mMeasurementEnabled ) {
        copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
    } else {
        mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType) frame->mFrameType);
    }
}
Place 4: CameraFrame::VIDEO_FRAME_SYNC, enabled in startRecording(), is used for video recording; the data needs to be copied.
status_t AppCallbackNotifier::startRecording()
{
    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;

    Mutex::Autolock lock(mRecordingLock);

    if ( NULL == mFrameProvider ) {
        CAMHAL_LOGEA("Trying to start video recording without FrameProvider");
        ret = -1;
    }

    if (mRecording) {
        return NO_INIT;
    }

    if ( NO_ERROR == ret ) {
        mFrameProvider->enableFrameNotification(CameraFrame::VIDEO_FRAME_SYNC);
    }

    mRecording = true;

    LOG_FUNCTION_NAME_EXIT;
    return ret;
}
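The matching VIDEO_FRAME_SYNC branch in notifyFrame() is not shown in this article; conceptually it looks something like the sketch below — the frame is copied into a recording buffer and delivered with the timestamped data callback, since VIDEO_FRAME_SYNC frames carry timestamps. The helper copyVideoFrame and the members used here are illustrative; CAMERA_MSG_VIDEO_FRAME and the camera_data_timestamp_callback shape come from the standard camera HAL interface.

else if ( ( CameraFrame::VIDEO_FRAME_SYNC == frame->mFrameType ) &&
          ( NULL != mCameraHal ) &&
          mCameraHal->msgTypeEnabled(CAMERA_MSG_VIDEO_FRAME) ) {
    // Copy the frame into a recording buffer and pass it up with its timestamp;
    // the buffer is returned to the adapter once the recorder releases it.
    camera_memory_t *vidBuf = copyVideoFrame(frame);              // illustrative helper
    mDataCbTimestamp(frame->mTimestamp, CAMERA_MSG_VIDEO_FRAME,   // timestamped data callback
                     vidBuf, 0, mCallbackCookie);
}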
To summarize the four places:
Place 1: FRAME_DATA_SYNC = 0x40 ///Any extra data assosicated with the frame. Always synced with the frame — when Measurements is enabled, the complete data is copied into mPreviewBufs[] without the 2D-to-1D conversion.
Place 2: setFrameProvider registers RAW_FRAME and IMAGE_FRAME, used for delivering pictures upward and recycling their buffers.
Place 3: CameraFrame::PREVIEW_FRAME_SYNC, used for the preview callback and buffer recycling.
Place 4: CameraFrame::VIDEO_FRAME_SYNC in startRecording, used for video recording; the data needs to be copied.