
Android Camera System Architecture Source Code Analysis (3) ----> The Camera Display Flow

2015-10-26 16:23
The Preview Display Flow

This time we start from startPreview itself: inside startPreviewMode(), setPreviewWindow() is called right before startPreview().

// CameraClient.cpp
status_t CameraClient::startPreviewMode() {
    mHardware->previewEnabled();
    mHardware->setPreviewWindow(mPreviewWindow);
    status_t result = mHardware->startPreview();
    return result;
}


This eventually reaches Cam1DeviceBase's setPreviewWindow(). The call path down to it is the same as for startPreview(), so it is not repeated here.

Cam1DeviceBase::setPreviewWindow(preview_stream_ops* window)
{
    // (1) Initialize the display.
    status_t status = initDisplayClient(window);
    // (2) Start the display.
    status = enableDisplayClient();
}


initDisplayClient() was also called at step (2) of Cam1DeviceBase::startPreview(), which we discussed earlier.

Cam1DeviceBase::initDisplayClient(preview_stream_ops* window)
{
    Size previewSize;
    // [1] Check to see whether the passed window is NULL or not.
    // ...
    // [2] Get preview size. (get the preview parameters)
    queryPreviewSize(previewSize.width, previewSize.height);
    // [3] Initialize DisplayClient.
    // [3.1] create a DisplayClient. (this only initializes member variables)
    mpDisplayClient = IDisplayClient::createInstance();
    // [3.2] initialize the newly-created DisplayClient.
    mpDisplayClient->init();
    // [3.3] set preview_stream_ops & related window info.
    mpDisplayClient->setWindow(window, previewSize.width, previewSize.height, queryDisplayBufCount());
    // [3.4] set ImgBufProviderClient if it exists.
    if (mpCamAdapter != 0 && !mpDisplayClient->setImgBufProviderClient(mpCamAdapter)) {
        // error handling ...
    }
}


Below is a closer look at the details of initDisplayClient().

DisplayClient::init()
{
    /**
     * Construct the DisplayThread and let it run. Note that the handler passed
     * in at construction time is the DisplayClient itself.
     * Construct the ImgBufQueue; its id is eID_DISPLAY.
     * (When PreviewClient::init() runs, it also creates an ImgBufQueue, but
     *  with id eID_PRV_CB.)
     **/
    ret = createDisplayThread() && createImgBufQueue();
}

DisplayClient::setWindow(preview_stream_ops* const window, int32_t const wndWidth,
                         int32_t const wndHeight, int32_t const i4MaxImgBufCount)
{
    return set_preview_stream_ops(window, wndWidth, wndHeight, i4MaxImgBufCount);
}

DisplayClient::set_preview_stream_ops(preview_stream_ops* const window, int32_t const wndWidth,
                                      int32_t const wndHeight, int32_t const i4MaxImgBufCount)
{
    int32_t min_undequeued_buf_count = 0;
    //
    // (2) Check
    // ....
    // (3) Save info.
    mpStreamImgInfo.clear();
    mpStreamImgInfo = new ImgInfo(wndWidth, wndHeight, CAMERA_DISPLAY_FORMAT, CAMERA_DISPLAY_FORMAT_HAL, "Camera@Display");
    mpStreamOps = window;
    mi4MaxImgBufCount = i4MaxImgBufCount;
    // (4.1) Set gralloc usage bits for window.
    err = mpStreamOps->set_usage(mpStreamOps, CAMERA_GRALLOC_USAGE);
    // (4.2) Get minimum undequeued buffer count.
    err = mpStreamOps->get_min_undequeued_buffer_count(mpStreamOps, &min_undequeued_buf_count);
    // (4.3) Set the number of buffers needed for display.
    err = mpStreamOps->set_buffer_count(mpStreamOps, mi4MaxImgBufCount + min_undequeued_buf_count);
    // (4.4) Set window geometry.
    err = mpStreamOps->set_buffers_geometry(mpStreamOps, mpStreamImgInfo->mu4ImgWidth,
                                            mpStreamImgInfo->mu4ImgHeight, mpStreamImgInfo->mi4ImgFormat);
}

DisplayClient::setImgBufProviderClient(sp<IImgBufProviderClient> const& rpClient)
{
    rpClient->onImgBufProviderCreated(mpImgBufQueue);
    mpImgBufPvdrClient = rpClient;
}

// BaseCamAdapter.cpp
BaseCamAdapter::onImgBufProviderCreated(sp<IImgBufProvider> const& rpProvider)
{
    int32_t const i4ProviderId = rpProvider->getProviderId();
    mpImgBufProvidersMgr->setProvider(i4ProviderId, rpProvider);
}


That completes the initialization of DisplayClient. However, we still don't know what the parameters initialized above are used for, or how the display actually starts refreshing. Let's go back to enableDisplayClient() and keep tracing.

Cam1DeviceBase::enableDisplayClient()
{
    Size previewSize;
    // [1] Get preview size. (get the preview parameters)
    queryPreviewSize(previewSize.width, previewSize.height);
    // [2] Enable.
    mpDisplayClient->enableDisplay(previewSize.width, previewSize.height, queryDisplayBufCount(), mpCamAdapter);
}


DisplayClient::enableDisplay(int32_t const i4Width, int32_t const i4Height,
                             int32_t const i4BufCount, sp<IImgBufProviderClient> const& rpClient)
{
    // Enable.
    enableDisplay();
}

DisplayClient::enableDisplay()
{
    // Post a command to wake up the thread. (send a message to the DisplayThread)
    mpDisplayThread->postCommand(Command(Command::eID_WAKEUP));
}

DisplayThread::threadLoop()
{
    Command cmd;
    if (getCommand(cmd))
    {
        switch (cmd.eId)
        {
        case Command::eID_EXIT:
            // ....
        case Command::eID_WAKEUP:
        default:
            // What gets called is the handler's onThreadLoop().
            // As we saw earlier, DisplayThread's handler was set to the DisplayClient.
            mpThreadHandler->onThreadLoop(cmd);
            break;
        }
    }
}
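
To make the postCommand()/getCommand() hand-off above concrete, here is a minimal, generic sketch of the wake-up pattern using the C++ standard library. It is not MediaTek's implementation; the CommandQueue class and its members below are illustrative stand-ins for the real DisplayThread internals.

// A minimal, generic sketch of the postCommand()/getCommand() wake-up pattern,
// NOT the MTK implementation: it only shows how a thread can sleep until a
// command such as eID_WAKEUP arrives.
#include <condition_variable>
#include <deque>
#include <mutex>

struct Command {
    enum ID { eID_WAKEUP, eID_EXIT };
    ID eId;
};

class CommandQueue {
public:
    // What DisplayClient::enableDisplay() conceptually does: queue a command
    // and wake the sleeping thread.
    void postCommand(Command const& cmd) {
        std::lock_guard<std::mutex> lock(mMutex);
        mCommands.push_back(cmd);
        mCond.notify_one();
    }

    // What threadLoop() conceptually blocks on: wait until a command arrives,
    // then pop it.
    bool getCommand(Command& cmd) {
        std::unique_lock<std::mutex> lock(mMutex);
        mCond.wait(lock, [this] { return !mCommands.empty(); });
        cmd = mCommands.front();
        mCommands.pop_front();
        return true;
    }

private:
    std::mutex              mMutex;
    std::condition_variable mCond;
    std::deque<Command>     mCommands;
};

In the real DisplayThread, threadLoop() blocks inside getCommand(), so the single eID_WAKEUP posted by enableDisplay() is enough to kick the display loop into action.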


DisplayClient's onThreadLoop() lives in DisplayClient.BufOps.cpp. Look closely at the function below and it should feel familiar: its structure is exactly the same as PreviewClient's onClientThreadLoop(), and even the function names used inside are the same, although they are not the same functions.

DisplayClient::onThreadLoop(Command const& rCmd)
{
    // (0) lock Processor.
    sp<IImgBufQueue> pImgBufQueue;
    // (1) Prepare all TODO buffers. (prepare the buffers and put them into the queue)
    if (!prepareAllTodoBuffers(pImgBufQueue))
    {
        return true;
    }
    // (2) Start. (notify the processor to start working)
    pImgBufQueue->startProcessor();
    // (3) Do until disabled.
    while (1)
    {
        // (.1) Block waiting for notification, then handle the returned buffers.
        waitAndHandleReturnBuffers(pImgBufQueue);
        // (.2) break if disabled.
        if (!isDisplayEnabled())
        {
            MY_LOGI("Display disabled");
            break;
        }
        // (.3) re-prepare all TODO buffers, if possible,
        // since some DONE/CANCEL buffers return. (put buffers back into the queue)
        prepareAllTodoBuffers(pImgBufQueue);
    }
    //
    // (4) Stop.
    pImgBufQueue->pauseProcessor();
    pImgBufQueue->flushProcessor();
    pImgBufQueue->stopProcessor();
    //
    // (5) Cancel all un-returned buffers.
    cancelAllUnreturnBuffers();
    //
    {
        Mutex::Autolock _l(mStateMutex);
        mState = eState_Suspend;
        mStateCond.broadcast();
    }
}
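
The loop above becomes clearer once the producer/consumer handshake behind IImgBufQueue is spelled out: DisplayClient enqueues empty TODO buffers, the camera adapter fills them and returns them as DONE (or CANCEL), and DisplayClient dequeues whatever comes back. The sketch below is only a schematic stand-in for that handshake built on the C++ standard library, not MediaTek's IImgBufQueue; every name in it is illustrative.

// Schematic stand-in for the TODO/DONE buffer handshake behind IImgBufQueue.
// NOT the MTK implementation: it only shows why prepareAllTodoBuffers() and
// waitAndHandleReturnBuffers() pair up the way they do in onThreadLoop().
#include <condition_variable>
#include <deque>
#include <memory>
#include <mutex>
#include <vector>

struct ImgBuf {};                          // placeholder for a preview buffer

enum class BufStatus { TODO, DONE, CANCEL };

struct ImgBufQueNode {
    std::shared_ptr<ImgBuf> buf;
    BufStatus               status;
};

class ImgBufQueue {
public:
    // Display side ("prepareAllTodoBuffers"): hand an empty buffer to the processor.
    void enqueTodo(std::shared_ptr<ImgBuf> buf) {
        std::lock_guard<std::mutex> lock(mMutex);
        mTodo.push_back({std::move(buf), BufStatus::TODO});
    }

    // Processor side (camera adapter): take a TODO buffer, fill it with a
    // captured frame, and return it as DONE.
    bool processOne() {
        ImgBufQueNode node;
        {
            std::lock_guard<std::mutex> lock(mMutex);
            if (mTodo.empty()) return false;
            node = mTodo.front();
            mTodo.pop_front();
        }
        // ... fill node.buf with image data here ...
        node.status = BufStatus::DONE;
        {
            std::lock_guard<std::mutex> lock(mMutex);
            mDone.push_back(node);
        }
        mCond.notify_one();
        return true;
    }

    // Display side ("waitAndHandleReturnBuffers"): block until DONE/CANCEL
    // buffers come back, then take them all.
    void dequeDone(std::vector<ImgBufQueNode>& out) {
        std::unique_lock<std::mutex> lock(mMutex);
        mCond.wait(lock, [this] { return !mDone.empty(); });
        out.assign(mDone.begin(), mDone.end());
        mDone.clear();
    }

private:
    std::mutex                mMutex;
    std::condition_variable   mCond;
    std::deque<ImgBufQueNode> mTodo;
    std::deque<ImgBufQueNode> mDone;
};

In this picture, prepareAllTodoBuffers() plays the role of enqueTodo(), the camera adapter (which received mpImgBufQueue through onImgBufProviderCreated()) plays processOne(), and waitAndHandleReturnBuffers() plays dequeDone().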


Based on our earlier experience, intuition points straight at waitAndHandleReturnBuffers(), and that is indeed where the data gets handled. But how does the data actually end up on the display? Let's keep reading.

DisplayClient::waitAndHandleReturnBuffers(sp<IImgBufQueue> const& rpBufQueue)
{
    Vector<ImgBufQueNode> vQueNode;
    // (1) deque buffers from processor. (block waiting for notification, then read the buffers)
    rpBufQueue->dequeProcessor(vQueNode);
    // (2) handle buffers dequed from processor.
    ret = handleReturnBuffers(vQueNode);
}

DisplayClient::handleReturnBuffers(Vector<ImgBufQueNode> const& rvQueNode)
{
    /*
     * Notes:
     * For 30 fps, we just enque (display) the latest frame,
     * and cancel the others.
     * For frame rate > 30 fps, we should judge the timestamp here or at the source.
     */
    //
    // (3) Remove from List and enquePrvOps/cancelPrvOps, one by one.
    int32_t const queSize = rvQueNode.size();
    for (int32_t i = 0; i < queSize; i++)
    {
        sp<IImgBuf> const& rpQueImgBuf = rvQueNode[i].getImgBuf();      // ImgBuf in Queue.
        sp<StreamImgBuf> const pStreamImgBuf = *mStreamBufList.begin(); // ImgBuf in List.
        // (.1) Check valid pointers to image buffers in Queue & List.
        if (rpQueImgBuf == 0 || pStreamImgBuf == 0)
        {
            MY_LOGW("Bad ImgBuf: (Que[%d], List.begin) = (%p, %p)", i, rpQueImgBuf.get(), pStreamImgBuf.get());
            continue;
        }
        // (.2) Check the equality of image buffers between Queue & List.
        if (rpQueImgBuf->getVirAddr() != pStreamImgBuf->getVirAddr())
        {
            MY_LOGW("Bad address in ImgBuf: (Que[%d], List.begin) = (%p, %p)", i, rpQueImgBuf->getVirAddr(), pStreamImgBuf->getVirAddr());
            continue;
        }
        // (.3) Every check is ok. Now remove the node from the list.
        mStreamBufList.erase(mStreamBufList.begin());
        //
        // (.4) enquePrvOps/cancelPrvOps
        if (i == idxToDisp) {
            //
            if (mpExtImgProc != NULL)
            {
                if (mpExtImgProc->getImgMask() & ExtImgProc::BufType_Display)
                {
                    IExtImgProc::ImgInfo img;
                    //
                    img.bufType   = ExtImgProc::BufType_Display;
                    img.format    = pStreamImgBuf->getImgFormat();
                    img.width     = pStreamImgBuf->getImgWidth();
                    img.height    = pStreamImgBuf->getImgHeight();
                    img.stride[0] = pStreamImgBuf->getImgWidthStride(0);
                    img.stride[1] = pStreamImgBuf->getImgWidthStride(1);
                    img.stride[2] = pStreamImgBuf->getImgWidthStride(2);
                    img.virtAddr  = (MUINT32)(pStreamImgBuf->getVirAddr());
                    img.bufSize   = pStreamImgBuf->getBufSize();
                    // A reserved processing hook; users can plug in their own handling.
                    mpExtImgProc->doImgProc(img);
                }
            }
            // Handle the buffer that is about to be displayed.
            enquePrvOps(pStreamImgBuf);
        }
        else {
            // Handle the buffers that are skipped.
            cancelPrvOps(pStreamImgBuf);
        }
    }
}

DisplayClient::enquePrvOps(sp<StreamImgBuf> const& rpImgBuf)
{
    // [1] unlock buffer before sending to display.
    GraphicBufferMapper::get().unlock(rpImgBuf->getBufHndl());
    // [2] Dump image if wanted.
    dumpImgBuf_If(rpImgBuf);
    // [3] set timestamp.
    err = mpStreamOps->set_timestamp(mpStreamOps, rpImgBuf->getTimestamp());
    // [4] set gralloc buffer type & dirty. (set the buffer's parameters)
    ::gralloc_extra_setBufParameter(rpImgBuf->getBufHndl(),
        GRALLOC_EXTRA_MASK_TYPE | GRALLOC_EXTRA_MASK_DIRTY,
        GRALLOC_EXTRA_BIT_TYPE_CAMERA | GRALLOC_EXTRA_BIT_DIRTY);
    // [5] unlock and post the buffer to display.
    // mpStreamOps was initialized to 'window' in set_preview_stream_ops() earlier,
    // i.e. the window argument passed down through setWindow().
    // So we have to trace upwards to find out what this window actually is.
    err = mpStreamOps->enqueue_buffer(mpStreamOps, rpImgBuf->getBufHndlPtr());
}


Tracing back up, we can see that this mpStreamOps variable, i.e. the window variable, is a Surface.

// set the buffer consumer that the preview will use
status_t CameraClient::setPreviewTarget(
        const sp<IGraphicBufferProducer>& bufferProducer) {

    sp<IBinder> binder;
    sp<ANativeWindow> window;
    if (bufferProducer != 0) {
        binder = bufferProducer->asBinder();
        // Using controlledByApp flag to ensure that the buffer queue remains in
        // async mode for the old camera API, where many applications depend
        // on that behavior.
        window = new Surface(bufferProducer, /*controlledByApp*/ true);
    }
    return setPreviewWindow(binder, window);
}


Anyone who has written an app knows that Surface is a helper class used by the Android upper layers for display, so the camera preview effectively boils down to calling Surface's operations. That raises the next question: how do these Surface calls actually get things drawn on screen? I'll stop here, since that is enough for my goal of building up background knowledge; if work ever requires going deeper, I'll dig in then, and interested readers are welcome to explore further on their own. Instead, let's keep looking at where this data comes from.
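
For reference, the preview_stream_ops table that mpStreamOps points to is nothing more than a set of function pointers that the frameworks side wires onto this Surface (treated as an ANativeWindow). The sketch below is a simplified adapter in the spirit of frameworks/av's CameraHardwareInterface.h; it is not the exact AOSP code, and the PreviewWindow/hook* names are illustrative.

// Simplified sketch of how preview_stream_ops can be wired onto the
// ANativeWindow (Surface) created in setPreviewTarget(). Modeled on the
// adapter in CameraHardwareInterface.h, but NOT the exact AOSP code.
#include <cstddef>
#include <hardware/camera.h>   // preview_stream_ops, buffer_handle_t
#include <system/window.h>     // ANativeWindow, ANativeWindowBuffer

struct PreviewWindow {
    preview_stream_ops ops;    // function-pointer table handed to the HAL
    ANativeWindow*     anw;    // the Surface, seen as an ANativeWindow
};

static ANativeWindow* toANW(preview_stream_ops* w) {
    // 'ops' is the first member, so the preview_stream_ops* held by the HAL
    // is also a PreviewWindow*.
    return reinterpret_cast<PreviewWindow*>(w)->anw;
}

// set_usage / set_buffer_count map more or less 1:1 onto native_window_* helpers.
static int hookSetUsage(preview_stream_ops* w, int usage) {
    return native_window_set_usage(toANW(w), usage);
}

static int hookSetBufferCount(preview_stream_ops* w, int count) {
    return native_window_set_buffer_count(toANW(w), count);
}

// enqueue_buffer: queue the filled gralloc buffer back to the Surface so that
// SurfaceFlinger can compose and display it.
static int hookEnqueueBuffer(preview_stream_ops* w, buffer_handle_t* handle) {
    ANativeWindow* a = toANW(w);
    // Recover the ANativeWindowBuffer that owns *handle (container_of style),
    // as the real adapter does, then queue it with no acquire fence.
    ANativeWindowBuffer* anb = reinterpret_cast<ANativeWindowBuffer*>(
        reinterpret_cast<char*>(handle) - offsetof(ANativeWindowBuffer, handle));
    return a->queueBuffer(a, anb, -1 /* fenceFd */);
}

So set_usage(), set_buffer_count() and set_buffers_geometry() in set_preview_stream_ops(), and enqueue_buffer() in enquePrvOps(), all end up as ordinary ANativeWindow operations on the Surface supplied through setPreviewTarget().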

Now let's look back at the main classes involved so far and summarize.

Cam1DeviceBase (including its subclasses and parent class) contains all of the camera's operation functions, such as open, preview display, capture, recording, and auto-focus, together with the parameters those operations and the camera hardware involve. In other words, to the frameworks and app layers, Cam1Device effectively is the camera.

Cam1DeviceBase contains a ParamsManager which, as the name suggests, manages the camera's parameters; we haven't needed it so far, so let's leave it aside for now. Cam1DeviceBase also contains several Clients that take care of all the functional operations. DisplayClient is responsible for display, while CamClient is the big umbrella that gathers the remaining camera operations. CamClient in turn splits its work across several Clients: PreviewClient handles preview callbacks, RecordClient handles recording, and in the capture path FDClient, OTClient, and PreviewClient all play a part; exactly what each one does is something to study when time allows. There is also a CameraAdapter, which is very important as well.