Introduction
Having organized the previous notes, I now have a fairly clear picture of the overall Camera framework and have gradually worked out the logic of the control flow.
This time I plan to trace the Camera.startPreview() method to deepen my understanding of the control flow, and, combining it with the previous note on the camera module, follow the flow as far down the stack as possible.
Camera.startPreview() flow
1. Frameworks
1.1 Camera.java
Location: frameworks/base/core/java/android/hardware/Camera.java
startPreview():
Provides an interface for upper-layer applications.
Control then passes into the Runtime layer.
/**
* Starts capturing and drawing preview frames to the screen.
* Preview will not actually start until a surface is supplied
* with {@link #setPreviewDisplay(SurfaceHolder)} or
* {@link #setPreviewTexture(SurfaceTexture)}.
*
* <p>If {@link #setPreviewCallback(Camera.PreviewCallback)},
* {@link #setOneShotPreviewCallback(Camera.PreviewCallback)}, or
* {@link #setPreviewCallbackWithBuffer(Camera.PreviewCallback)} were
* called, {@link Camera.PreviewCallback#onPreviewFrame(byte[], Camera)}
* will be called when preview data becomes available.
*/
public native final void startPreview();
2. Android Runtime
2.1 android_hardware_Camera.cpp
Location: frameworks/base/core/jni/android_hardware_Camera.cpp
android_hardware_Camera_startPreview():
Calls the get_native_camera() function to obtain a Camera instance.
Calls Camera::startPreview().
static void android_hardware_Camera_startPreview(JNIEnv *env, jobject thiz)
{
ALOGV("startPreview");
sp<Camera> camera = get_native_camera(env, thiz, NULL);
if (camera == 0) return;
if (camera->startPreview() != NO_ERROR) {
jniThrowRuntimeException(env, "startPreview failed");
return;
}
}
get_native_camera():
Retrieves the Camera context from the DVM (stored in a field of the Java object).
Obtains the Camera instance from that context.
sp<Camera> get_native_camera(JNIEnv *env, jobject thiz, JNICameraContext** pContext)
{
sp<Camera> camera;
Mutex::Autolock _l(sLock);
JNICameraContext* context = reinterpret_cast<JNICameraContext*>(env->GetLongField(thiz, fields.context));
if (context != NULL) {
camera = context->getCamera();
}
ALOGV("get_native_camera: context=%p, camera=%p", context, camera.get());
if (camera == 0) {
jniThrowRuntimeException(env,
"Camera is being used after Camera.release() was called");
}
if (pContext != NULL) *pContext = context;
return camera;
}
3. Libraries
3.1 Camera.cpp
Location: frameworks/av/camera/Camera.cpp
startPreview():
mCamera is the CameraClient obtained during the connect flow (through the ICamera interface); it is what actually implements the startPreview() interface.
Calls CameraClient::startPreview().
// start preview mode
status_t Camera::startPreview()
{
ALOGV("startPreview");
sp <::android::hardware::ICamera> c = mCamera;
if (c == 0) return NO_INIT;
return c->startPreview();
}
3.2 CameraClient.cpp
Location: frameworks/av/services/camera/libcameraservice/api1/CameraClient.cpp
startPreview():
Enters the concrete implementation logic via the startCameraMode() function.
// start preview mode
status_t CameraClient::startPreview() {
LOG1("startPreview (pid %d)", getCallingPid());
return startCameraMode(CAMERA_PREVIEW_MODE);
}
startCameraMode():
The branch taken is determined by the CAMERA_PREVIEW_MODE argument passed in.
Calls startPreviewMode().
// start preview or recording
status_t CameraClient::startCameraMode(camera_mode mode) {
LOG1("startCameraMode(%d)", mode);
Mutex::Autolock lock(mLock);
status_t result = checkPidAndHardware();
if (result != NO_ERROR) return result;
switch(mode) {
case CAMERA_PREVIEW_MODE:
if (mSurface == 0 && mPreviewWindow == 0) {
LOG1("mSurface is not set yet.");
// still able to start preview in this case.
}
return startPreviewMode();
case CAMERA_RECORDING_MODE:
if (mSurface == 0 && mPreviewWindow == 0) {
ALOGE("mSurface or mPreviewWindow must be set before startRecordingMode.");
return INVALID_OPERATION;
}
return startRecordingMode();
default:
return UNKNOWN_ERROR;
}
}
startPreviewMode():
If preview is already running, success is returned immediately.
Otherwise, execution continues.
mHardware is an instance of CameraHardwareInterface, initialized at the end of the connect flow.
Calls the setPreviewWindow() and startPreview() interfaces through mHardware.
Control then enters the HAL layer.
status_t CameraClient::startPreviewMode() {
LOG1("startPreviewMode");
status_t result = NO_ERROR;
// if preview has been enabled, nothing needs to be done
if (mHardware->previewEnabled()) {
return NO_ERROR;
}
if (mPreviewWindow != 0) {
mHardware->setPreviewScalingMode(
NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
mHardware->setPreviewTransform(mOrientation);
}
mHardware->setPreviewWindow(mPreviewWindow);
result = mHardware->startPreview();
if (result == NO_ERROR) {
mCameraService->updateProxyDeviceState(
ICameraServiceProxy::CAMERA_STATE_ACTIVE,
String8::format("%d", mCameraId));
}
return result;
}
4. HAL
4.1 CameraHardwareInterface.h
Location: frameworks/av/services/camera/libcameraservice/device1/CameraHardwareInterface.h
previewEnabled():
Continues the call downward through mDevice->ops (not the main path we are tracing here).
mDevice is the device instance initialized through the hw_get_module() flow; its type is camera_device_t.
Returns true if preview is enabled.
/**
* Returns true if preview is enabled.
*/
int previewEnabled()
{
ALOGV("%s(%s)", __FUNCTION__, mName.string());
if (mDevice->ops->preview_enabled)
return mDevice->ops->preview_enabled(mDevice);
return false;
}
setPreviewWindow():
If the set_preview_window function pointer is null, an error is returned.
Otherwise, the call continues downward through mDevice->ops (not the main path we are tracing here).
/** Set the ANativeWindow to which preview frames are sent */
status_t setPreviewWindow(const sp<ANativeWindow>& buf)
{
ALOGV("%s(%s) buf %p", __FUNCTION__, mName.string(), buf.get());
if (mDevice->ops->set_preview_window) {
mPreviewWindow = buf;
if (buf != nullptr) {
if (mPreviewScalingMode != NOT_SET) {
setPreviewScalingMode(mPreviewScalingMode);
}
if (mPreviewTransform != NOT_SET) {
setPreviewTransform(mPreviewTransform);
}
}
mHalPreviewWindow.user = this;
ALOGV("%s &mHalPreviewWindow %p mHalPreviewWindow.user %p", __FUNCTION__,
&mHalPreviewWindow, mHalPreviewWindow.user);
return mDevice->ops->set_preview_window(mDevice,
buf.get() ? &mHalPreviewWindow.nw : 0);
}
return INVALID_OPERATION;
}
startPreview():
If the start_preview function pointer is null, an error is returned.
Otherwise, the next step goes through mDevice.
Combining the Camera.open() flow with the hw_get_module() logic, mDevice comes about like this:
When CameraService starts, onFirstRef() initializes the HAL module and obtains a module instance.
During the open flow, when the CameraClient successfully connects to the CameraServer, a CameraHardwareInterface is instantiated and initialized with that module instance.
During that initialization, the module's open method returns a device instance, i.e. mDevice, which corresponds to a concrete camera device (a minimal sketch of this open sequence follows the code below).
Through mDevice, commands can be delivered all the way down to the hardware device.
By tracing the camera_device_t type, we can find out what the function pointers actually point to.
/**
* Start preview mode.
*/
status_t startPreview()
{
ALOGV("%s(%s)", __FUNCTION__, mName.string());
if (mDevice->ops->start_preview)
return mDevice->ops->start_preview(mDevice);
return INVALID_OPERATION;
}
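To make the origin of mDevice more concrete, here is a minimal sketch of the legacy HAL open sequence, assuming a camera id of "0" and with error handling reduced to the bare minimum; it only illustrates the general hw_get_module() / open() pattern described above, not the actual CameraService code.
#include <hardware/hardware.h>
#include <hardware/camera.h>
static camera_device_t* openCameraDevice() {
    const hw_module_t* rawModule = nullptr;
    // CameraService::onFirstRef() loads the camera module roughly like this.
    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID, &rawModule) != 0 || rawModule == nullptr) {
        return nullptr;
    }
    const camera_module_t* module = reinterpret_cast<const camera_module_t*>(rawModule);
    hw_device_t* device = nullptr;
    // CameraHardwareInterface initialization calls the module's open method;
    // the hw_device_t it returns is really a camera_device_t, i.e. mDevice.
    if (module->common.methods->open(&module->common, "0", &device) != 0) {
        return nullptr;
    }
    return reinterpret_cast<camera_device_t*>(device);
}
// With the device in hand, the ops table is used exactly as CameraHardwareInterface does:
//   dev->ops->start_preview(dev);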
4.2 camera.h
Location: hardware/libhardware/include/hardware/camera.h
struct camera_device:
This is where the camera_device_t we want to trace is declared.
Its ops member is of type camera_device_ops_t, a struct that declares the function pointers.
typedef struct camera_device {
/**
* camera_device.common.version must be in the range
* HARDWARE_DEVICE_API_VERSION(0,0)-(1,FF). CAMERA_DEVICE_API_VERSION_1_0 is
* recommended.
*/
hw_device_t common;
camera_device_ops_t *ops;
void *priv;
} camera_device_t;
struct camera_device_ops:
Since the comments are very long, I have removed all of them except the one for start_preview.
As you can see, the function pointers for every operation on the Camera device are declared here.
However, you cannot tell from this header where the function pointers actually point.
On Linux, running find . -name "*.cpp" | xargs grep "start_preview =" turns up some matching files; where they live depends on the specific device vendor.
It is in those files that the function pointers are actually assigned.
typedef struct camera_device_ops {
int (*set_preview_window)(struct camera_device *,
struct preview_stream_ops *window);
void (*set_callbacks)(struct camera_device *,
camera_notify_callback notify_cb,
camera_data_callback data_cb,
camera_data_timestamp_callback data_cb_timestamp,
camera_request_memory get_memory,
void *user);
void (*enable_msg_type)(struct camera_device *, int32_t msg_type);
void (*disable_msg_type)(struct camera_device *, int32_t msg_type);
int (*msg_type_enabled)(struct camera_device *, int32_t msg_type);
/**
* Start preview mode.
*/
int (*start_preview)(struct camera_device *);
void (*stop_preview)(struct camera_device *);
int (*preview_enabled)(struct camera_device *);
int (*store_meta_data_in_buffers)(struct camera_device *, int enable);
int (*start_recording)(struct camera_device *);
void (*stop_recording)(struct camera_device *);
int (*recording_enabled)(struct camera_device *);
void (*release_recording_frame)(struct camera_device *,
const void *opaque);
int (*auto_focus)(struct camera_device *);
int (*cancel_auto_focus)(struct camera_device *);
int (*take_picture)(struct camera_device *);
int (*cancel_picture)(struct camera_device *);
int (*set_parameters)(struct camera_device *, const char *parms);
char *(*get_parameters)(struct camera_device *);
void (*put_parameters)(struct camera_device *, char *);
int (*send_command)(struct camera_device *,
int32_t cmd, int32_t arg1, int32_t arg2);
void (*release)(struct camera_device *);
int (*dump)(struct camera_device *, int fd);
} camera_device_ops_t;
4.3 hardware/ti/omap4-aah/camera
Using the find command, I located some files related to the function pointer assignments.
Judging from their paths, they belong to different device vendors.
I decided to dig into the ti/omap4-aah subdirectory.
4.3.1 CameraHal_Module.cpp
Location: hardware/ti/omap4-aah/camera/CameraHal_Module.cpp
camera_device_open():
The open flow is where the mapping of the pointers in ops is established.
memset(camera_device, 0, sizeof(*camera_device));
memset(camera_ops, 0, sizeof(*camera_ops));
camera_device->base.common.tag = HARDWARE_DEVICE_TAG;
camera_device->base.common.version = 0;
camera_device->base.common.module = (hw_module_t *)(module);
camera_device->base.common.close = camera_device_close;
camera_device->base.ops = camera_ops;
camera_ops->set_preview_window = camera_set_preview_window;
camera_ops->set_callbacks = camera_set_callbacks;
camera_ops->enable_msg_type = camera_enable_msg_type;
camera_ops->disable_msg_type = camera_disable_msg_type;
camera_ops->msg_type_enabled = camera_msg_type_enabled;
camera_ops->start_preview = camera_start_preview;
camera_ops->stop_preview = camera_stop_preview;
camera_ops->preview_enabled = camera_preview_enabled;
camera_ops->store_meta_data_in_buffers = camera_store_meta_data_in_buffers;
camera_ops->start_recording = camera_start_recording;
camera_ops->stop_recording = camera_stop_recording;
camera_ops->recording_enabled = camera_recording_enabled;
camera_ops->release_recording_frame = camera_release_recording_frame;
camera_ops->auto_focus = camera_auto_focus;
camera_ops->cancel_auto_focus = camera_cancel_auto_focus;
camera_ops->take_picture = camera_take_picture;
camera_ops->cancel_picture = camera_cancel_picture;
camera_ops->set_parameters = camera_set_parameters;
camera_ops->get_parameters = camera_get_parameters;
camera_ops->put_parameters = camera_put_parameters;
camera_ops->send_command = camera_send_command;
camera_ops->release = camera_release;
camera_ops->dump = camera_dump;
*device = &camera_device->base.common;
// -------- TI specific stuff --------
camera_device->cameraid = cameraid;
camera_start_preview():
Note that gCameraHals holds CameraHal * entries (indexed by camera id).
The actual work is delegated to CameraHal::startPreview().
int camera_start_preview(struct camera_device * device)
{
CAMHAL_LOG_MODULE_FUNCTION_NAME;
int rv = -EINVAL;
ti_camera_device_t* ti_dev = NULL;
if(!device)
return rv;
ti_dev = (ti_camera_device_t*) device;
rv = gCameraHals[ti_dev->cameraid]->startPreview();
return rv;
}
4.3.2 CameraHal.cpp
Location: hardware/ti/omap4-aah/camera/CameraHal.cpp
The role of this file is to map the Camera Hardware Interface onto V4L2.
Note the following two declarations:
extern "C" CameraAdapter* OMXCameraAdapter_Factory(size_t);
extern "C" CameraAdapter* V4LCameraAdapter_Factory(size_t);
They are the adapter factories for OMX and V4L respectively; this looks like the Adapter pattern combined with the Factory pattern (see the sketch below).
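As a rough illustration of how these factories might be used, here is a hypothetical sketch; the createAdapter() helper and the selection condition are assumptions for illustration, not the actual CameraHal::initialize() code.
// Hypothetical sketch: selecting a concrete adapter behind the common CameraAdapter interface.
class CameraAdapter;   // defined in the real TI sources
extern "C" CameraAdapter* OMXCameraAdapter_Factory(size_t);
extern "C" CameraAdapter* V4LCameraAdapter_Factory(size_t);
static CameraAdapter* createAdapter(bool useUsbCamera, size_t sensorIndex) {
    if (useUsbCamera) {
        return V4LCameraAdapter_Factory(sensorIndex);   // e.g. USB/UVC cameras via V4L2
    }
    return OMXCameraAdapter_Factory(sensorIndex);       // e.g. the on-board OMX pipeline
}
// CameraHal then only talks to the abstract CameraAdapter it gets back, which is
// where the Adapter and Factory patterns meet.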
startPreview():
The source contains a lot of comments; I have removed them here to focus on the call logic.
It first calls cameraPreviewInitialization() to perform initialization.
It then sends the CAMERA_START_PREVIEW command through the CameraAdapter; if that succeeds, the flow is complete.
status_t CameraHal::startPreview() {
LOG_FUNCTION_NAME;
status_t ret = cameraPreviewInitialization();
if (!mPreviewInitializationDone) return ret;
mPreviewInitializationDone = false;
if(mDisplayAdapter.get() != NULL) {
CAMHAL_LOGDA("Enabling display");
int width, height;
mParameters.getPreviewSize(&width, &height);
#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
ret = mDisplayAdapter->enableDisplay(width, height, &mStartPreview);
#else
ret = mDisplayAdapter->enableDisplay(width, height, NULL);
#endif
if ( ret != NO_ERROR ) {
CAMHAL_LOGEA("Couldn't enable display");
CAMHAL_ASSERT_X(false,
"At this stage mCameraAdapter->mStateSwitchLock is still locked, "
"deadlock is guaranteed");
goto error;
}
}
CAMHAL_LOGDA("Starting CameraAdapter preview mode");
ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_START_PREVIEW);
if(ret!=NO_ERROR) {
CAMHAL_LOGEA("Couldn't start preview w/ CameraAdapter");
goto error;
}
CAMHAL_LOGDA("Started preview");
mPreviewEnabled = true;
mPreviewStartInProgress = false;
return ret;
error:
CAMHAL_LOGEA("Performing cleanup after error");
//Do all the cleanup
freePreviewBufs();
mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
if(mDisplayAdapter.get() != NULL) {
mDisplayAdapter->disableDisplay(false);
}
mAppCallbackNotifier->stop();
mPreviewStartInProgress = false;
mPreviewEnabled = false;
LOG_FUNCTION_NAME_EXIT;
return ret;
}
cameraPreviewInitialization():
This function is fairly long, but judging from its comments it mainly does three things:
sets the relevant parameters through the Adapter;
allocates buffer space;
configures the buffers accordingly for preview.
With that outline in mind, the key calls in the code are:
mCameraAdapter->setParameters(): sets the parameters.
allocPreviewBufs(): allocates the buffers.
desc: note that this variable is a CameraAdapter::BuffersDescriptor; once the buffers are allocated, its members are filled in accordingly.
mAppCallbackNotifier->start(): starts callback notification.
mAppCallbackNotifier->startPreviewCallbacks(): binds the buffers to the corresponding callbacks so the upper-layer APP can obtain the data it needs for preview.
NOTE:
The code repeatedly uses mCameraAdapter->sendCommand() to send commands and retrieve data.
Once a command reaches the corresponding Adapter (for example the V4L Adapter), the matching function is invoked to handle it.
/**
@brief Set preview mode related initialization
-> Camera Adapter set params
-> Allocate buffers
-> Set use buffers for preview
@param none
@return NO_ERROR
@todo Update function header with the different errors that are possible
*/
status_t CameraHal::cameraPreviewInitialization()
{
status_t ret = NO_ERROR;
CameraAdapter::BuffersDescriptor desc;
CameraFrame frame;
unsigned int required_buffer_count;
unsigned int max_queueble_buffers;
#if PPM_INSTRUMENTATION || PPM_INSTRUMENTATION_ABS
gettimeofday(&mStartPreview, NULL);
#endif
LOG_FUNCTION_NAME;
if (mPreviewInitializationDone) {
return NO_ERROR;
}
if ( mPreviewEnabled ){
CAMHAL_LOGDA("Preview already running");
LOG_FUNCTION_NAME_EXIT;
return ALREADY_EXISTS;
}
if ( NULL != mCameraAdapter ) {
ret = mCameraAdapter->setParameters(mParameters);
}
if ((mPreviewStartInProgress == false) && (mDisplayPaused == false)){
ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW,( int ) &frame);
if ( NO_ERROR != ret ){
CAMHAL_LOGEB("Error: CAMERA_QUERY_RESOLUTION_PREVIEW %d", ret);
return ret;
}
///Update the current preview width and height
mPreviewWidth = frame.mWidth;
mPreviewHeight = frame.mHeight;
}
///If we don't have the preview callback enabled and display adapter,
if(!mSetPreviewWindowCalled || (mDisplayAdapter.get() == NULL)){
CAMHAL_LOGD("Preview not started. Preview in progress flag set");
mPreviewStartInProgress = true;
ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_SWITCH_TO_EXECUTING);
if ( NO_ERROR != ret ){
CAMHAL_LOGEB("Error: CAMERA_SWITCH_TO_EXECUTING %d", ret);
return ret;
}
return NO_ERROR;
}
if( (mDisplayAdapter.get() != NULL) && ( !mPreviewEnabled ) && ( mDisplayPaused ) )
{
CAMHAL_LOGDA("Preview is in paused state");
mDisplayPaused = false;
mPreviewEnabled = true;
if ( NO_ERROR == ret )
{
ret = mDisplayAdapter->pauseDisplay(mDisplayPaused);
if ( NO_ERROR != ret )
{
CAMHAL_LOGEB("Display adapter resume failed %x", ret);
}
}
//restart preview callbacks
if(mMsgEnabled & CAMERA_MSG_PREVIEW_FRAME)
{
mAppCallbackNotifier->enableMsgType (CAMERA_MSG_PREVIEW_FRAME);
}
signalEndImageCapture();
return ret;
}
required_buffer_count = atoi(mCameraProperties->get(CameraProperties::REQUIRED_PREVIEW_BUFS));
///Allocate the preview buffers
ret = allocPreviewBufs(mPreviewWidth, mPreviewHeight, mParameters.getPreviewFormat(), required_buffer_count, max_queueble_buffers);
if ( NO_ERROR != ret )
{
CAMHAL_LOGEA("Couldn't allocate buffers for Preview");
goto error;
}
if ( mMeasurementEnabled )
{
ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA,
( int ) &frame,
required_buffer_count);
if ( NO_ERROR != ret )
{
return ret;
}
///Allocate the preview data buffers
ret = allocPreviewDataBufs(frame.mLength, required_buffer_count);
if ( NO_ERROR != ret ) {
CAMHAL_LOGEA("Couldn't allocate preview data buffers");
goto error;
}
if ( NO_ERROR == ret )
{
desc.mBuffers = mPreviewDataBuffers;
desc.mOffsets = mPreviewDataOffsets;
desc.mFd = mPreviewDataFd;
desc.mLength = mPreviewDataLength;
desc.mCount = ( size_t ) required_buffer_count;
desc.mMaxQueueable = (size_t) required_buffer_count;
mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA,
( int ) &desc);
}
}
///Pass the buffers to Camera Adapter
desc.mBuffers = mPreviewBuffers;
desc.mOffsets = mPreviewOffsets;
desc.mFd = mPreviewFd;
desc.mLength = mPreviewLength;
desc.mCount = ( size_t ) required_buffer_count;
desc.mMaxQueueable = (size_t) max_queueble_buffers;
ret = mCameraAdapter->sendCommand(CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW,
( int ) &desc);
if ( NO_ERROR != ret )
{
CAMHAL_LOGEB("Failed to register preview buffers: 0x%x", ret);
freePreviewBufs();
return ret;
}
///Start the callback notifier
ret = mAppCallbackNotifier->start();
if( ALREADY_EXISTS == ret )
{
//Already running, do nothing
CAMHAL_LOGDA("AppCallbackNotifier already running");
ret = NO_ERROR;
}
else if ( NO_ERROR == ret ) {
CAMHAL_LOGDA("Started AppCallbackNotifier..");
mAppCallbackNotifier->setMeasurements(mMeasurementEnabled);
}
else
{
CAMHAL_LOGDA("Couldn't start AppCallbackNotifier");
goto error;
}
if (ret == NO_ERROR) mPreviewInitializationDone = true;
mAppCallbackNotifier->startPreviewCallbacks(mParameters, mPreviewBuffers, mPreviewOffsets, mPreviewFd, mPreviewLength, required_buffer_count);
return ret;
error:
CAMHAL_LOGEA("Performing cleanup after error");
//Do all the cleanup
freePreviewBufs();
mCameraAdapter->sendCommand(CameraAdapter::CAMERA_STOP_PREVIEW);
if(mDisplayAdapter.get() != NULL)
{
mDisplayAdapter->disableDisplay(false);
}
mAppCallbackNotifier->stop();
mPreviewStartInProgress = false;
mPreviewEnabled = false;
LOG_FUNCTION_NAME_EXIT;
return ret;
}
4.3.3 BaseCameraAdapter.cpp
Location: hardware/ti/omap4-aah/camera/BaseCameraAdapter.cpp
Note the constant table defined in this file:
Each entry maps a command name to the corresponding command.
const LUT cameraCommandsUserToHAL[] = {
{ "CAMERA_START_PREVIEW", CameraAdapter::CAMERA_START_PREVIEW },
{ "CAMERA_STOP_PREVIEW", CameraAdapter::CAMERA_STOP_PREVIEW },
{ "CAMERA_START_VIDEO", CameraAdapter::CAMERA_START_VIDEO },
{ "CAMERA_STOP_VIDEO", CameraAdapter::CAMERA_STOP_VIDEO },
{ "CAMERA_START_IMAGE_CAPTURE", CameraAdapter::CAMERA_START_IMAGE_CAPTURE },
{ "CAMERA_STOP_IMAGE_CAPTURE", CameraAdapter::CAMERA_STOP_IMAGE_CAPTURE },
{ "CAMERA_PERFORM_AUTOFOCUS", CameraAdapter::CAMERA_PERFORM_AUTOFOCUS },
{ "CAMERA_CANCEL_AUTOFOCUS", CameraAdapter::CAMERA_CANCEL_AUTOFOCUS },
{ "CAMERA_PREVIEW_FLUSH_BUFFERS", CameraAdapter::CAMERA_PREVIEW_FLUSH_BUFFERS },
{ "CAMERA_START_SMOOTH_ZOOM", CameraAdapter::CAMERA_START_SMOOTH_ZOOM },
{ "CAMERA_STOP_SMOOTH_ZOOM", CameraAdapter::CAMERA_STOP_SMOOTH_ZOOM },
{ "CAMERA_USE_BUFFERS_PREVIEW", CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW },
{ "CAMERA_SET_TIMEOUT", CameraAdapter::CAMERA_SET_TIMEOUT },
{ "CAMERA_CANCEL_TIMEOUT", CameraAdapter::CAMERA_CANCEL_TIMEOUT },
{ "CAMERA_START_BRACKET_CAPTURE", CameraAdapter::CAMERA_START_BRACKET_CAPTURE },
{ "CAMERA_STOP_BRACKET_CAPTURE", CameraAdapter::CAMERA_STOP_BRACKET_CAPTURE },
{ "CAMERA_QUERY_RESOLUTION_PREVIEW", CameraAdapter::CAMERA_QUERY_RESOLUTION_PREVIEW },
{ "CAMERA_QUERY_BUFFER_SIZE_IMAGE_CAPTURE", CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_IMAGE_CAPTURE },
{ "CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA", CameraAdapter::CAMERA_QUERY_BUFFER_SIZE_PREVIEW_DATA },
{ "CAMERA_USE_BUFFERS_IMAGE_CAPTURE", CameraAdapter::CAMERA_USE_BUFFERS_IMAGE_CAPTURE },
{ "CAMERA_USE_BUFFERS_PREVIEW_DATA", CameraAdapter::CAMERA_USE_BUFFERS_PREVIEW_DATA },
{ "CAMERA_TIMEOUT_EXPIRED", CameraAdapter::CAMERA_TIMEOUT_EXPIRED },
{ "CAMERA_START_FD", CameraAdapter::CAMERA_START_FD },
{ "CAMERA_STOP_FD", CameraAdapter::CAMERA_STOP_FD },
{ "CAMERA_SWITCH_TO_EXECUTING", CameraAdapter::CAMERA_SWITCH_TO_EXECUTING },
{ "CAMERA_USE_BUFFERS_VIDEO_CAPTURE", CameraAdapter::CAMERA_USE_BUFFERS_VIDEO_CAPTURE },
#ifdef OMAP_ENHANCEMENT_CPCAM
{ "CAMERA_USE_BUFFERS_REPROCESS", CameraAdapter::CAMERA_USE_BUFFERS_REPROCESS },
{ "CAMERA_START_REPROCESS", CameraAdapter::CAMERA_START_REPROCESS },
#endif
};
BaseCameraAdapter::sendCommand():
A switch statement dispatches each command to its own handling logic.
What BaseCameraAdapter::startPreview() actually does here is implemented in its subclasses, so next I pick one subclass, V4LCameraAdapter, and go deeper.
case CameraAdapter::CAMERA_START_PREVIEW:
{
CAMHAL_LOGDA("Start Preview");
if ( ret == NO_ERROR )
{
ret = setState(operation);
}
if ( ret == NO_ERROR )
{
ret = startPreview();
}
if ( ret == NO_ERROR )
{
ret = commitState();
}
else
{
ret |= rollbackState();
}
break;
}
4.3.4 V4LCameraAdapter.h
Location: hardware/ti/omap4-aah/camera/inc/V4LCameraAdapter/V4LCameraAdapter.h
The V4LCameraAdapter class inherits from BaseCameraAdapter.
Note that it contains a private inner class:
threadLoop() is tied to the thread's run loop.
This thread repeatedly executes the adapter's previewThread() function.
private:
class PreviewThread : public android::Thread {
V4LCameraAdapter* mAdapter;
public:
PreviewThread(V4LCameraAdapter* hw) :
Thread(false), mAdapter(hw) { }
virtual void onFirstRef() {
run("CameraPreviewThread", android::PRIORITY_URGENT_DISPLAY);
}
virtual bool threadLoop() {
mAdapter->previewThread();
// loop until we need to quit
return true;
}
};
//Used for calculation of the average frame rate during preview
status_t recalculateFPS();
char * GetFrame(int &index);
int previewThread();
4.3.5 V4LCameraAdapter.cpp
Location: hardware/ti/omap4-aah/camera/V4LCameraAdapter/V4LCameraAdapter.cpp
startPreview():
Uses v4lIoctl() (VIDIOC_QBUF) to queue the buffers that the hardware will fill with data, then starts streaming via v4lStartStreaming().
Starts a PreviewThread to receive the data coming back from the V4L camera device.
Finally, a few flags are set to indicate that preview is enabled, and the startPreview control flow is complete.
status_t V4LCameraAdapter::startPreview()
{
status_t ret = NO_ERROR;
LOG_FUNCTION_NAME;
android::AutoMutex lock(mPreviewBufsLock);
if(mPreviewing) {
ret = BAD_VALUE;
goto EXIT;
}
for (int i = 0; i < mPreviewBufferCountQueueable; i++) {
mVideoInfo->buf.index = i;
mVideoInfo->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
mVideoInfo->buf.memory = V4L2_MEMORY_MMAP;
ret = v4lIoctl(mCameraHandle, VIDIOC_QBUF, &mVideoInfo->buf);
if (ret < 0) {
CAMHAL_LOGEA("VIDIOC_QBUF Failed");
goto EXIT;
}
nQueued++;
}
ret = v4lStartStreaming();
// Create and start preview thread for receiving buffers from V4L Camera
if(!mCapturing) {
mPreviewThread = new PreviewThread(this);
CAMHAL_LOGDA("Created preview thread");
}
//Update the flag to indicate we are previewing
mPreviewing = true;
mCapturing = false;
EXIT:
LOG_FUNCTION_NAME_EXIT;
return ret;
}
previewThread():
Called repeatedly on the PreviewThread.
Fetches the data returned by the device and performs format conversion:
convertYUV422ToNV12Tiler()
Sets the necessary frame parameters, such as frame length and timestamp.
Sends the frame data to the subscribers:
sendFrameToSubscribers(&frame)
int V4LCameraAdapter::previewThread()
{
status_t ret = NO_ERROR;
int width, height;
CameraFrame frame;
void *y_uv[2];
int index = 0;
int stride = 4096;
char *fp = NULL;
mParams.getPreviewSize(&width, &height);
if (mPreviewing) {
fp = this->GetFrame(index);
if(!fp) {
ret = BAD_VALUE;
goto EXIT;
}
CameraBuffer *buffer = mPreviewBufs.keyAt(index);
CameraFrame *lframe = (CameraFrame *)mFrameQueue.valueFor(buffer);
if (!lframe) {
ret = BAD_VALUE;
goto EXIT;
}
debugShowFPS();
if ( mFrameSubscribers.size() == 0 ) {
ret = BAD_VALUE;
goto EXIT;
}
y_uv[0] = (void*) lframe->mYuv[0];
//y_uv[1] = (void*) lframe->mYuv[1];
//y_uv[1] = (void*) (lframe->mYuv[0] + height*stride);
convertYUV422ToNV12Tiler ( (unsigned char*)fp, (unsigned char*)y_uv[0], width, height);
CAMHAL_LOGVB("##...index= %d.;camera buffer= 0x%x; y= 0x%x; UV= 0x%x.",index, buffer, y_uv[0], y_uv[1] );
#ifdef SAVE_RAW_FRAMES
unsigned char* nv12_buff = (unsigned char*) malloc(width*height*3/2);
//Convert yuv422i to yuv420sp(NV12) & dump the frame to a file
convertYUV422ToNV12 ( (unsigned char*)fp, nv12_buff, width, height);
saveFile( nv12_buff, ((width*height)*3/2) );
free (nv12_buff);
#endif
frame.mFrameType = CameraFrame::PREVIEW_FRAME_SYNC;
frame.mBuffer = buffer;
frame.mLength = width*height*3/2;
frame.mAlignment = stride;
frame.mOffset = 0;
frame.mTimestamp = systemTime(SYSTEM_TIME_MONOTONIC);
frame.mFrameMask = (unsigned int)CameraFrame::PREVIEW_FRAME_SYNC;
if (mRecording)
{
frame.mFrameMask |= (unsigned int)CameraFrame::VIDEO_FRAME_SYNC;
mFramesWithEncoder++;
}
ret = setInitFrameRefCount(frame.mBuffer, frame.mFrameMask);
if (ret != NO_ERROR) {
CAMHAL_LOGDB("Error in setInitFrameRefCount %d", ret);
} else {
ret = sendFrameToSubscribers(&frame);
}
}
EXIT:
return ret;
}
4.3.6 * AppCallbackNotifier.cpp
Location: hardware/ti/omap4-aah/camera/AppCallbackNotifier.cpp
In CameraHal.cpp, the preview-initialization code calls into the AppCallbackNotifier class. This class handles the various callbacks, i.e. it is closely tied to the data flow.
Here I only want to get a rough idea of the corresponding logic.
startPreviewCallbacks():
Performs the necessary setup on the preview buffers.
Enables preview frame notification (PREVIEW_FRAME_SYNC):
mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC)
Beyond this point I am not sure where to dig further, but where do the callbacks actually get invoked?
status_t AppCallbackNotifier::startPreviewCallbacks(android::CameraParameters ¶ms, CameraBuffer *buffers, uint32_t *offsets, int fd, size_t length, size_t count)
{
unsigned int *bufArr;
int size = 0;
LOG_FUNCTION_NAME;
android::AutoMutex lock(mLock);
if ( NULL == mFrameProvider )
{
CAMHAL_LOGEA("Trying to start video recording without FrameProvider");
return -EINVAL;
}
if ( mPreviewing )
{
CAMHAL_LOGDA("+Already previewing");
return NO_INIT;
}
int w,h;
///Get preview size
params.getPreviewSize(&w, &h);
// save preview pixel format, size and stride
mPreviewWidth = w;
mPreviewHeight = h;
mPreviewStride = 4096;
mPreviewPixelFormat = CameraHal::getPixelFormatConstant(params.getPreviewFormat());
size = CameraHal::calculateBufferSize(mPreviewPixelFormat, w, h);
mPreviewMemory = mRequestMemory(-1, size, AppCallbackNotifier::MAX_BUFFERS, NULL);
if (!mPreviewMemory) {
return NO_MEMORY;
}
for (int i=0; i < AppCallbackNotifier::MAX_BUFFERS; i++) {
mPreviewBuffers[i].type = CAMERA_BUFFER_MEMORY;
mPreviewBuffers[i].opaque = (unsigned char*) mPreviewMemory->data + (i*size);
mPreviewBuffers[i].mapped = mPreviewBuffers[i].opaque;
}
if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME ) ) {
mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC);
}
if ( mCameraHal->msgTypeEnabled(CAMERA_MSG_POSTVIEW_FRAME) ) {
mFrameProvider->enableFrameNotification(CameraFrame::SNAPSHOT_FRAME);
}
mPreviewBufCount = 0;
mPreviewing = true;
LOG_FUNCTION_NAME_EXIT;
return NO_ERROR;
}
setCallbacks():
I wanted to know where the callbacks are invoked, so I looked for the function that registers them.
The data callback is named mDataCb, so I searched for mDataCb.
void AppCallbackNotifier::setCallbacks(CameraHal* cameraHal,
camera_notify_callback notify_cb,
camera_data_callback data_cb,
camera_data_timestamp_callback data_cb_timestamp,
camera_request_memory get_memory,
void *user)
{
android::AutoMutex lock(mLock);
LOG_FUNCTION_NAME;
mCameraHal = cameraHal;
mNotifyCb = notify_cb;
mDataCb = data_cb;
mDataCbTimestamp = data_cb_timestamp;
mRequestMemory = get_memory;
mCallbackCookie = user;
LOG_FUNCTION_NAME_EXIT;
}
notifyEvent():
This function contains a call to the callback mDataCb.
The branch excerpted below is the one related to PREVIEW_METADATA, i.e. preview metadata.
The metadata lives in evt->mEventData; judging from the naming, it is tied to the Event mechanism.
I am not yet clear on how the Event mechanism works and will not dig into it for now; it is enough to know that the data is obtained here.
After allocating a camera_memory_t buffer, the callback is invoked to pass the metadata up to the upper layers.
case CameraHalEvent::EVENT_METADATA:
metaEvtData = evt->mEventData->metadataEvent;
if ( ( NULL != mCameraHal ) &&
( NULL != mNotifyCb) &&
( mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_METADATA) ) )
{
// WA for an issue inside CameraService
camera_memory_t *tmpBuffer = mRequestMemory(-1, 1, 1, NULL);
mDataCb(CAMERA_MSG_PREVIEW_METADATA,
tmpBuffer,
0,
metaEvtData->getMetadataResult(),
mCallbackCookie);
metaEvtData.clear();
if ( NULL != tmpBuffer ) {
tmpBuffer->release(tmpBuffer);
}
}
break;
Flow diagram
The diagram marks the main call order of the control flow; the data flow is not covered here.
Strictly speaking there should be another Linux Kernel (drivers) layer between the HAL layer and the device, but that is beyond what I need to understand for now, so I am skipping it.
Note that within the HAL layer, CameraHardwareInterface is the common entry point, while the actual bridge to the driver layer is platform-specific; different platforms have different implementations.
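Since the diagram itself is not reproduced in this text version, here is a rough text rendering of the call chain traced above:
Camera.startPreview() (Java, Frameworks)
  -> android_hardware_Camera_startPreview() (JNI, Android Runtime)
  -> Camera::startPreview() (Libraries)
  -> CameraClient::startPreview() -> startCameraMode() -> startPreviewMode()
  -> CameraHardwareInterface::startPreview() (HAL entry)
  -> mDevice->ops->start_preview(), e.g. camera_start_preview() in CameraHal_Module.cpp
  -> CameraHal::startPreview() -> CameraAdapter::sendCommand(CAMERA_START_PREVIEW)
  -> V4LCameraAdapter::startPreview() -> PreviewThread -> previewThread() -> sendFrameToSubscribers()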
Summary
In this note we took the Camera.startPreview() method as the entry point and, together with the hw_get_module() material, traced the whole flow, gaining a more complete picture of the Camera control-flow logic that we had previously only understood at a basic level.
Within the HAL layer, different platforms provide their own implementations, and those implementations differ considerably from one another.
What is still unclear to me are Android's internal mechanisms such as Event and Binder.
In the next note I will start organizing the logic of the Camera data flow. Since the data flow explored so far is relatively simple, mainly the call logic of a few callback functions, I plan to combine it with the initialization and control-flow parts and pull the whole Camera pipeline together as the conclusion of this series of study notes.