
Android 6.0 Source Code Analysis: Camera Initialization


1. Registration of CameraService

During system startup, once ServiceManager is up, init launches the mediaserver process, whose main() instantiates a series of services, among them CameraService.

int main(int argc __unused, char** argv)
{
    ...
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    ResourceManagerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    SoundTriggerHwService::instantiate();
    RadioService::instantiate();
    registerExtensions();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}


In mediaserver's main(), CameraService::instantiate() is called to register CameraService. Continuing with instantiate():

static void instantiate() {
    publish();
}


instantiate() simply calls publish(); looking at publish() next:

public:
    static status_t publish(bool allowIsolated = false) {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(String16(SERVICE::getServiceName()),
                new SERVICE(), allowIsolated);
    }


As the code shows, a new CameraService instance is added to ServiceManager here, which completes the registration of CameraService. Note that publish() is inherited from the BinderService<SERVICE> template that CameraService derives from, so SERVICE is CameraService and new SERVICE() constructs the service object.
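For completeness, the other half of registration is the lookup: a native client can later retrieve the published service from ServiceManager by name. The snippet below is an illustrative sketch, not code from the call chain analyzed in this article (the framework's own lookup lives in CameraBase::getCameraService()); it assumes the standard libbinder API and the "media.camera" name returned by CameraService::getServiceName().

#include <binder/IServiceManager.h>
#include <camera/ICameraService.h>

using namespace android;

// Illustrative sketch: fetch the binder that publish() registered above and
// cast it to the ICameraService interface. "media.camera" is the string
// returned by CameraService::getServiceName().
static sp<ICameraService> getCameraServiceProxy() {
    sp<IServiceManager> sm = defaultServiceManager();
    sp<IBinder> binder = sm->getService(String16("media.camera"));
    if (binder == nullptr) {
        return nullptr;  // service not registered (yet)
    }
    return interface_cast<ICameraService>(binder);
}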

2. Application-Layer Initialization Flow

The code structure of the Camera subsystem was introduced in "Android 6.0 Source Code Analysis: Camera Framework Overview". At the application layer, initialization starts from the onCreate() method of Camera.java:

@Override
public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    getPreferredCameraId();
    mFocusManager = new FocusManager(mPreferences, defaultFocusModes);

    /*
     * To reduce startup time, we start the camera open and preview threads.
     * We make sure the preview is started at the end of onCreate.
     */
    mCameraOpenThread.start();

    ...

    // Make sure camera device is opened.
    try {
        mCameraOpenThread.join();
        mCameraOpenThread = null;
        if (mOpenCameraFail) {
            Util.showErrorAndFinish(this, R.string.cannot_connect_camera);
            return;
        } else if (mCameraDisabled) {
            Util.showErrorAndFinish(this, R.string.camera_disabled);
            return;
        }
    } catch (InterruptedException ex) {
        // ignore
    }

    mCameraPreviewThread.start();

    ...

    // Make sure preview is started.
    try {
        mCameraPreviewThread.join();
    } catch (InterruptedException ex) {
        // ignore
    }
    ...
}


onCreate() starts the mCameraOpenThread thread, which is responsible for opening the camera; as the comment explains, a separate thread is used to reduce startup time. Only after the open succeeds is mCameraPreviewThread started to begin the preview. Inside mCameraOpenThread, Util.openCamera() is called and returns a Camera object, mCameraDevice; all subsequent camera operations go through this mCameraDevice.

Thread mCameraOpenThread = new Thread(new Runnable() {
    public void run() {
        try {
            mCameraDevice = Util.openCamera(Camera.this, mCameraId);
        } catch (CameraHardwareException e) {
            mOpenCameraFail = true;
        } catch (CameraDisabledException e) {
            mCameraDisabled = true;
        }
    }
});


Next, look at Util.openCamera():

public static android.hardware.Camera openCamera(...) {
    ...
    try {
        return CameraHolder.instance().open(cameraId);
    } catch (CameraHardwareException e) {
        ...
    }
}


openCamera() calls CameraHolder's open() method, which opens the camera identified by cameraId and returns an android.hardware.Camera object; this is the value assigned to mCameraDevice. Next, how that camera is opened and the Camera object returned:

public synchronized android.hardware.Camera open(int cameraId) {
    ...
    mCameraDevice = android.hardware.Camera.open(cameraId);
    ...
    return mCameraDevice;
}


This completes the analysis of the application-layer open flow, whose call chain is: onCreate() → mCameraOpenThread → Util.openCamera() → CameraHolder.open() → android.hardware.Camera.open(cameraId).



3. Framework-Layer Camera Initialization Flow

The application layer calls the open() method of the framework-layer Camera class to continue the camera initialization:

public static Camera open(int cameraId) {
    return new Camera(cameraId);
}


This static method creates a new Camera object for the given cameraId. Continuing with the Camera constructor:

Camera(int cameraId) {
    int err = cameraInitNormal(cameraId);
    ...
}


The constructor only calls cameraInitNormal(), which ultimately reaches the native layer by invoking the native method native_setup through JNI.
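The hop from Java into native code happens through the JNI method table in android_hardware_Camera.cpp, where native_setup is bound to android_hardware_Camera_native_setup. Below is a trimmed sketch of that registration; the signature string is reproduced from the Android 6.0 sources as best as can be recalled and should be checked against the file.

// Trimmed sketch of the JNI registration table in android_hardware_Camera.cpp.
// Only the entry relevant here is shown; other Camera JNI entries are elided.
static const JNINativeMethod camMethods[] = {
    { "native_setup",
      "(Ljava/lang/Object;IILjava/lang/String;)I",  // weak ref, cameraId, halVersion, packageName
      (void*)android_hardware_Camera_native_setup },
    // ...
};

With that mapping in place, we can look at android_hardware_Camera_native_setup itself: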

static jint android_hardware_Camera_native_setup(...)
{
    ...
    sp<Camera> camera;
    if (halVersion == CAMERA_HAL_API_VERSION_NORMAL_CONNECT) {
        // Default path: hal version is don't care, do normal camera connect.
        camera = Camera::connect(cameraId, clientName, Camera::USE_CALLING_UID);
    } else {
        jint status = Camera::connectLegacy(cameraId, halVersion, clientName,
                Camera::USE_CALLING_UID, camera);
        if (status != NO_ERROR) {
            return status;
        }
    }
    ...
    jclass clazz = env->GetObjectClass(thiz);
    ...
    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong((void*)android_hardware_Camera_native_setup);
    camera->setListener(context);

    // save context in opaque field
    env->SetLongField(thiz, fields.context, (jlong)context.get());
    return NO_ERROR;
}


In this code, the native Camera client's connect is invoked. There are two paths: connect(), which does not care about the HAL version, and connectLegacy(), which targets a specific HAL version. Only the first path is analyzed here; connect() is shown below:

sp<Camera> Camera::connect(...)
{
    return CameraBaseT::connect(cameraId, clientPackageName, clientUid);
}


Using the cameraId, clientPackageName and clientUid arguments, it directly calls CameraBaseT::connect():

template <typename TCam, typename TCamTraits>
sp<TCam> CameraBase<TCam, TCamTraits>::connect(...)
{
    sp<TCam> c = new TCam(cameraId);
    sp<TCamCallbacks> cl = c;
    status_t status = NO_ERROR;
    const sp<ICameraService>& cs = getCameraService();
    if (cs != 0) {
        TCamConnectService fnConnectService = TCamTraits::fnConnectService;
        status = (cs.get()->*fnConnectService)(cl, cameraId, clientPackageName,
                clientUid, /*out*/ c->mCamera);
    }
    if (status == OK && c->mCamera != 0) {
        IInterface::asBinder(c->mCamera)->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}
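The (cs.get()->*fnConnectService)(...) call above goes through a pointer-to-member supplied by the traits class. For the API1 Camera class this pointer is bound to ICameraService::connect, so the call effectively becomes cs->connect(cl, cameraId, clientPackageName, clientUid, c->mCamera). The binding, trimmed from Camera.cpp in the Android 6.0 tree, looks like this:

// From frameworks/av/camera/Camera.cpp (trimmed): the traits specialization
// that tells CameraBase<Camera> which ICameraService entry point to call.
CameraTraits<Camera>::TCamConnectService CameraTraits<Camera>::fnConnectService =
        &ICameraService::connect;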


CameraBase::connect first obtains the server-side CameraService via getCameraService(), and then invokes the service's connect method, which is where the client's initialization request is actually handled (the underlying Binder IPC mechanism is not covered here). CameraService::connect itself simply forwards to connectHelper.
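A trimmed sketch of that forwarding, abridged from the Android 6.0 CameraService.cpp (logging removed); the API2 entry point, connectDevice, forwards in the same way, just with <ICameraDeviceCallbacks, CameraDeviceClient> and API_2:

// Abridged: the API1 connect entry point packages its arguments and delegates
// all real work to the connectHelper template.
status_t CameraService::connect(
        const sp<ICameraClient>& cameraClient,
        int cameraId,
        const String16& clientPackageName,
        int clientUid,
        /*out*/ sp<ICamera>& device) {
    String8 id = String8::format("%d", cameraId);
    sp<Client> client = nullptr;
    status_t ret = connectHelper<ICameraClient, Client>(cameraClient, id,
            CAMERA_HAL_API_VERSION_UNSPECIFIED, clientPackageName, clientUid,
            API_1, /*legacyMode*/ false, /*shimUpdateOnly*/ false,
            /*out*/ client);
    if (ret != NO_ERROR) {
        return ret;
    }
    device = client;
    return NO_ERROR;
}

connectHelper's definition and implementation both live in CameraService.h: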

template<class CALLBACK, class CLIENT>
status_t CameraService::connectHelper(...) {
    ...
    status_t ret = NO_ERROR;
    sp<CLIENT> client = nullptr;
    {
        // Enforce client permissions and do basic sanity checks
        if ((ret = validateConnectLocked(cameraId, /*inout*/clientUid)) != NO_ERROR) {
            return ret;
        }
        sp<BasicClient> clientTmp = nullptr;
        ...
        // give flashlight a chance to close devices if necessary
        mFlashlight->prepareDeviceOpen(cameraId);

        // TODO: Update getDeviceVersion + HAL interface to use strings for Camera IDs
        int id = cameraIdToInt(cameraId);
        ...
        sp<BasicClient> tmp = nullptr;
        if ((ret = makeClient(this, cameraCb, clientPackageName, cameraId, facing,
                clientPid, clientUid, getpid(), legacyMode, halVersion,
                deviceVersion, effectiveApiLevel, /*out*/&tmp)) != NO_ERROR) {
            return ret;
        }
        client = static_cast<CLIENT*>(tmp.get());

        if ((ret = client->initialize(mModule)) != OK) {
            return ret;
        }

        sp<IBinder> remoteCallback = client->getRemote();
        if (remoteCallback != nullptr) {
            remoteCallback->linkToDeath(this);
        }
        ...
        if (shimUpdateOnly) {
            // If only updating legacy shim parameters, immediately disconnect client
            mServiceLock.unlock();
            client->disconnect();
            mServiceLock.lock();
        } else {
            // Otherwise, add client to active clients list
            finishConnectLocked(client, partial);
        }
    } // lock is destroyed, allow further connect calls

    // Important: release the mutex here so the client can call back into the service from its
    // destructor (can be at the end of the call)
    device = client;
    return NO_ERROR;
}


As the code shows, a few calls matter here. First, makeClient creates the camera client object according to the CameraDevice's HAL version and the API in use; this article follows the API2 case, so a CameraDeviceClient is created, and inside it a Camera3Device object is created. Next, the client's initialize method is called: CameraDeviceClient::initialize creates and runs a FrameProcessorBase thread, i.e. a frame-processing thread, whose job is to process completed frames and deliver each frame's results back to the application layer through ICameraDeviceCallbacks (implemented in frameworks/av/camera/camera2/ICameraDeviceCallbacks.cpp). Finally, the client object is returned. This concludes the framework-layer analysis; the overall call chain is Camera.java → native_setup → Camera::connect → CameraBase::connect → CameraService::connect → connectHelper → makeClient + initialize. The makeClient and initialize steps are sketched below.
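The makeClient decision is essentially a switch over the device's HAL version and the effective API level. The sketch below is abridged from the Android 6.0 CameraService.cpp (older HAL branches and error handling elided, parameter lists shortened); the point is that a HAL3 device opened through the camera2 API gets a CameraDeviceClient:

// Abridged from CameraService::makeClient: choose the client class from the
// device's HAL version and the API level the caller is using.
switch (deviceVersion) {
    // ... HAL1/HAL2 branches elided ...
    case CAMERA_DEVICE_API_VERSION_3_0:
    case CAMERA_DEVICE_API_VERSION_3_1:
    case CAMERA_DEVICE_API_VERSION_3_2:
    case CAMERA_DEVICE_API_VERSION_3_3:
        if (effectiveApiLevel == API_1) {  // Camera1 API route
            sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
            *client = new Camera2Client(cameraService, tmp, packageName, cameraId,
                    facing, clientPid, clientUid, servicePid, legacyMode);
        } else {  // Camera2 API route: the case analyzed here
            sp<ICameraDeviceCallbacks> tmp =
                    static_cast<ICameraDeviceCallbacks*>(cameraCb.get());
            *client = new CameraDeviceClient(cameraService, tmp, packageName,
                    cameraId, facing, clientPid, clientUid, servicePid);
        }
        break;
    // ...
}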

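And a sketch of CameraDeviceClient::initialize, which starts the frame-processing thread described above (abridged from the Android 6.0 CameraDeviceClient.cpp; error checks trimmed):

// Abridged: after the base class brings up the Camera3Device, start the
// FrameProcessorBase thread and register this client as a listener, so that
// capture results can flow back to the application via ICameraDeviceCallbacks.
status_t CameraDeviceClient::initialize(CameraModule* module) {
    status_t res = Camera2ClientBase::initialize(module);
    if (res != OK) {
        return res;
    }

    mFrameProcessor = new FrameProcessorBase(mDevice);
    String8 threadName = String8::format("CDU-%d-FrameProc", mCameraId);
    mFrameProcessor->run(threadName.string());

    mFrameProcessor->registerListener(FRAME_PROCESSOR_LISTENER_MIN_ID,
                                      FRAME_PROCESSOR_LISTENER_MAX_ID,
                                      /*listener*/ this,
                                      /*sendPartials*/ true);
    return OK;
}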


4. Summary

The Camera framework is built on a client/server architecture and therefore relies on the Binder IPC mechanism. Under Camera API 2.0 many interfaces take an AIDL-style form in place of the API 1.0 ones, but in the end the requests are all handled by CameraService through the corresponding CameraDeviceClient.