Android MediaRecorder System Architecture
I analyzed the Camera implementation earlier; now let's look at how MediaRecorder is implemented. I won't dwell much on its layered structure here; I'm more interested in its logic!
APP layer: /path/to/aosp/frameworks/base/media/java/android/media/MediaRecorder.java
JNI layer: /path/to/aosp/frameworks/base/media/jni/android_media_MediaRecorder.cpp
The JNI layer calls into the NATIVE-layer MediaRecorder (which is the BnMediaRecorderClient):
header: /path/to/aosp/frameworks/av/include/media/mediarecorder.h
implementation: /path/to/aosp/frameworks/av/media/libmedia/mediarecorder.cpp
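As a sketch of how the JNI layer wires the two together (paraphrased from android_media_MediaRecorder.cpp, not the verbatim source; JNIMediaRecorderListener and setMediaRecorder are helper names I remember, so treat them as illustrative):

```cpp
// Paraphrased sketch of the JNI setup, not the verbatim AOSP code.
// The native MediaRecorder itself derives from BnMediaRecorderClient,
// which is how the service side can deliver notify() callbacks to us.
static void android_media_MediaRecorder_native_setup(
        JNIEnv *env, jobject thiz, jobject weak_this)
{
    sp<MediaRecorder> mr = new MediaRecorder(); // runs the constructor below

    // Forwards native notify() callbacks up to the Java MediaRecorder.
    sp<JNIMediaRecorderListener> listener =
            new JNIMediaRecorderListener(env, thiz, weak_this);
    mr->setListener(listener);

    setMediaRecorder(env, thiz, mr); // stash the sp<> in a field of the Java object
}
```

The constructor of that native MediaRecorder then connects to MediaPlayerService: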
```cpp
MediaRecorder::MediaRecorder() : mSurfaceMediaSource(NULL)
{
    ALOGV("constructor");

    const sp<IMediaPlayerService>& service(getMediaPlayerService());
    if (service != NULL) {
        mMediaRecorder = service->createMediaRecorder(getpid());
    }
    if (mMediaRecorder != NULL) {
        mCurrentState = MEDIA_RECORDER_IDLE;
    }
    doCleanUp();
}
```
getMediaPlayerService() is declared in /path/to/aosp/frameworks/av/include/media/IMediaDeathNotifier.h.
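Its core is essentially a service-lookup loop; condensed (the real code also caches sMediaPlayerService and registers a death notifier):

```cpp
// Condensed from IMediaDeathNotifier::getMediaPlayerService().
sp<IServiceManager> sm = defaultServiceManager();
sp<IBinder> binder;
do {
    binder = sm->getService(String16("media.player"));
    if (binder != 0) {
        break;
    }
    ALOGW("Media player service not published, waiting...");
    usleep(500000); // retry every 0.5 s until the service comes up
} while (true);
sMediaPlayerService = interface_cast<IMediaPlayerService>(binder);
```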
Once the MediaPlayerService handle (a BpMediaPlayerService) has been obtained, the constructor calls IMediaPlayerService's createMediaRecorder, which runs on the service side:
```cpp
sp<IMediaRecorder> MediaPlayerService::createMediaRecorder(pid_t pid)
{
    sp<MediaRecorderClient> recorder = new MediaRecorderClient(this, pid);
    wp<MediaRecorderClient> w = recorder;
    Mutex::Autolock lock(mLock);
    mMediaRecorderClients.add(w);
    ALOGV("Create new media recorder client from pid %d", pid);
    return recorder;
}
```
This creates the MediaRecorderClient (the BnMediaRecorder). What the client actually holds after the binder round trip, though, is a BpMediaRecorder, because of the interface_cast in the proxy implementation:
```cpp
virtual sp<IMediaRecorder> createMediaRecorder(pid_t pid)
{
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeInt32(pid);
    remote()->transact(CREATE_MEDIA_RECORDER, data, &reply);
    return interface_cast<IMediaRecorder>(reply.readStrongBinder());
}
```
MediaRecorderClient in turn creates a StagefrightRecorder (a MediaRecorderBase), which lives at
/path/to/aosp/frameworks/av/media/libmediaplayerservice/StagefrightRecorder.cpp
For now we can treat the APP/JNI/NATIVE side as one process, and the MediaRecorderClient/StagefrightRecorder inside MediaPlayerService as another; the two talk over binder. We now have both the Bp and Bn ends in hand, so from here on I won't carefully distinguish Bp from Bn.
On the client side:
- BnMediaRecorderClient
- BpMediaRecorder
- BpMediaPlayerService

On the service side:
- BpMediaRecorderClient (the service can obtain this Bp when it needs to notify the client)
- BnMediaRecorder
- BnMediaPlayerService
(The original post includes a diagram of these Binder relationships at this point; click through to the original article for the full-size image.)
Let's take starting a recording, start(), as an example. Here the flow forks into two paths: a CameraSource and an MPEG4Writer (sp<MediaWriter> mWriter). Both classes live under /path/to/aosp/frameworks/av/media/libstagefright/.
```cpp
status_t StagefrightRecorder::startMPEG4Recording() {
    int32_t totalBitRate;
    status_t err = setupMPEG4Recording(
            mOutputFd, mVideoWidth, mVideoHeight,
            mVideoBitRate, &totalBitRate, &mWriter);
    if (err != OK) {
        return err;
    }

    int64_t startTimeUs = systemTime() / 1000;
    sp<MetaData> meta = new MetaData;
    setupMPEG4MetaData(startTimeUs, totalBitRate, &meta);

    err = mWriter->start(meta.get());
    if (err != OK) {
        return err;
    }

    return OK;
}
```
```cpp
status_t StagefrightRecorder::setupMPEG4Recording(
        int outputFd,
        int32_t videoWidth, int32_t videoHeight,
        int32_t videoBitRate,
        int32_t *totalBitRate,
        sp<MediaWriter> *mediaWriter) {
    mediaWriter->clear();
    *totalBitRate = 0;
    status_t err = OK;
    sp<MediaWriter> writer = new MPEG4Writer(outputFd);

    if (mVideoSource < VIDEO_SOURCE_LIST_END) {
        sp<MediaSource> mediaSource;
        err = setupMediaSource(&mediaSource); // very important
        if (err != OK) {
            return err;
        }

        sp<MediaSource> encoder;
        err = setupVideoEncoder(mediaSource, videoBitRate, &encoder); // very important
        if (err != OK) {
            return err;
        }

        writer->addSource(encoder);
        *totalBitRate += videoBitRate;
    }

    // Audio source is added at the end if it exists.
    // This help make sure that the "recoding" sound is suppressed for
    // camcorder applications in the recorded files.
    if (!mCaptureTimeLapse && (mAudioSource != AUDIO_SOURCE_CNT)) {
        err = setupAudioEncoder(writer); // very important
        if (err != OK) return err;
        *totalBitRate += mAudioBitRate;
    }

    ...

    writer->setListener(mListener);
    *mediaWriter = writer;
    return OK;
}
```
```cpp
// Set up the appropriate MediaSource depending on the chosen option
status_t StagefrightRecorder::setupMediaSource(
                      sp<MediaSource> *mediaSource) {
    if (mVideoSource == VIDEO_SOURCE_DEFAULT
            || mVideoSource == VIDEO_SOURCE_CAMERA) {
        sp<CameraSource> cameraSource;
        status_t err = setupCameraSource(&cameraSource);
        if (err != OK) {
            return err;
        }
        *mediaSource = cameraSource;
    } else if (mVideoSource == VIDEO_SOURCE_GRALLOC_BUFFER) {
        // If using GRAlloc buffers, setup surfacemediasource.
        // Later a handle to that will be passed
        // to the client side when queried
        status_t err = setupSurfaceMediaSource();
        if (err != OK) {
            return err;
        }
        *mediaSource = mSurfaceMediaSource;
    } else {
        return INVALID_OPERATION;
    }
    return OK;
}
```
```cpp
status_t StagefrightRecorder::setupCameraSource(
        sp<CameraSource> *cameraSource) {
    status_t err = OK;
    if ((err = checkVideoEncoderCapabilities()) != OK) {
        return err;
    }
    Size videoSize;
    videoSize.width = mVideoWidth;
    videoSize.height = mVideoHeight;
    if (mCaptureTimeLapse) {
        if (mTimeBetweenTimeLapseFrameCaptureUs < 0) {
            ALOGE("Invalid mTimeBetweenTimeLapseFrameCaptureUs value: %lld",
                mTimeBetweenTimeLapseFrameCaptureUs);
            return BAD_VALUE;
        }

        mCameraSourceTimeLapse = CameraSourceTimeLapse::CreateFromCamera(
                mCamera, mCameraProxy, mCameraId,
                videoSize, mFrameRate, mPreviewSurface,
                mTimeBetweenTimeLapseFrameCaptureUs);
        *cameraSource = mCameraSourceTimeLapse;
    } else {
        *cameraSource = CameraSource::CreateFromCamera(
                mCamera, mCameraProxy, mCameraId, videoSize, mFrameRate,
                mPreviewSurface, true /*storeMetaDataInVideoBuffers*/);
    }
    mCamera.clear();
    mCameraProxy.clear();
    if (*cameraSource == NULL) {
        return UNKNOWN_ERROR;
    }

    if ((*cameraSource)->initCheck() != OK) {
        (*cameraSource).clear();
        *cameraSource = NULL;
        return NO_INIT;
    }

    // When frame rate is not set, the actual frame rate will be set to
    // the current frame rate being used.
    if (mFrameRate == -1) {
        int32_t frameRate = 0;
        CHECK ((*cameraSource)->getFormat()->findInt32(
                    kKeyFrameRate, &frameRate));
        ALOGI("Frame rate is not explicitly set. Use the current frame "
             "rate (%d fps)", frameRate);
        mFrameRate = frameRate;
    }

    CHECK(mFrameRate != -1);

    mIsMetaDataStoredInVideoBuffers =
        (*cameraSource)->isMetaDataStoredInVideoBuffers();
    return OK;
}
```
```cpp
status_t StagefrightRecorder::setupVideoEncoder(
        sp<MediaSource> cameraSource,
        int32_t videoBitRate,
        sp<MediaSource> *source) {
    source->clear();

    sp<MetaData> enc_meta = new MetaData;
    enc_meta->setInt32(kKeyBitRate, videoBitRate);
    enc_meta->setInt32(kKeyFrameRate, mFrameRate);

    switch (mVideoEncoder) {
        case VIDEO_ENCODER_H263:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_H263);
            break;

        case VIDEO_ENCODER_MPEG_4_SP:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_MPEG4);
            break;

        case VIDEO_ENCODER_H264:
            enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
            break;

        default:
            CHECK(!"Should not be here, unsupported video encoding.");
            break;
    }

    sp<MetaData> meta = cameraSource->getFormat();

    int32_t width, height, stride, sliceHeight, colorFormat;
    CHECK(meta->findInt32(kKeyWidth, &width));
    CHECK(meta->findInt32(kKeyHeight, &height));
    CHECK(meta->findInt32(kKeyStride, &stride));
    CHECK(meta->findInt32(kKeySliceHeight, &sliceHeight));
    CHECK(meta->findInt32(kKeyColorFormat, &colorFormat));

    enc_meta->setInt32(kKeyWidth, width);
    enc_meta->setInt32(kKeyHeight, height);
    enc_meta->setInt32(kKeyIFramesInterval, mIFramesIntervalSec);
    enc_meta->setInt32(kKeyStride, stride);
    enc_meta->setInt32(kKeySliceHeight, sliceHeight);
    enc_meta->setInt32(kKeyColorFormat, colorFormat);
    if (mVideoTimeScale > 0) {
        enc_meta->setInt32(kKeyTimeScale, mVideoTimeScale);
    }
    if (mVideoEncoderProfile != -1) {
        enc_meta->setInt32(kKeyVideoProfile, mVideoEncoderProfile);
    }
    if (mVideoEncoderLevel != -1) {
        enc_meta->setInt32(kKeyVideoLevel, mVideoEncoderLevel);
    }

    OMXClient client;
    CHECK_EQ(client.connect(), (status_t)OK);

    uint32_t encoder_flags = 0;
    if (mIsMetaDataStoredInVideoBuffers) {
        encoder_flags |= OMXCodec::kStoreMetaDataInVideoBuffers;
    }

    // Do not wait for all the input buffers to become available.
    // This give timelapse video recording faster response in
    // receiving output from video encoder component.
    if (mCaptureTimeLapse) {
        encoder_flags |= OMXCodec::kOnlySubmitOneInputBufferAtOneTime;
    }

    sp<MediaSource> encoder = OMXCodec::Create(
            client.interface(), enc_meta,
            true /* createEncoder */, cameraSource,
            NULL, encoder_flags);
    if (encoder == NULL) {
        ALOGW("Failed to create the encoder");
        // When the encoder fails to be created, we need
        // release the camera source due to the camera's lock
        // and unlock mechanism.
        cameraSource->stop();
        return UNKNOWN_ERROR;
    }

    *source = encoder;

    return OK;
}
```
This is where everything hooks up with OMXCodec. A configuration file named media_codecs.xml declares which codecs the device supports.
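For illustration, entries in media_codecs.xml look roughly like this (an abridged, hypothetical fragment; the actual component names and quirks vary per device):

```xml
<MediaCodecs>
    <Encoders>
        <!-- hypothetical entry: a hardware AVC encoder -->
        <MediaCodec name="OMX.qcom.video.encoder.avc" type="video/avc" />
    </Encoders>
    <Decoders>
        <!-- hypothetical entry: the software AVC decoder -->
        <MediaCodec name="OMX.google.h264.decoder" type="video/avc" />
    </Decoders>
</MediaCodecs>
```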
When we record MPEG-4 there is audio as well, so setupAudioEncoder is called later on; I won't expand that method here. In short, it adds the audio as one more Track in the MPEG4Writer.
A side note: Google says setupAudioEncoder is deferred so that the tone played when recording starts does not end up in the recording. In practice this still has a bug; on some devices that tone gets recorded anyway, and the apps I've seen work around it by playing the sound themselves.
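Even without expanding it, the shape of setupAudioEncoder is easy to give (a rough sketch from memory, not the verbatim source; the real method also validates the chosen audio encoder first):

```cpp
// Rough sketch, not verbatim: create the audio source + encoder chain and
// register it with the writer as one more track.
status_t StagefrightRecorder::setupAudioEncoder(const sp<MediaWriter>& writer) {
    // createAudioSource() wraps an AudioSource in an OMX audio encoder
    // via OMXCodec::Create(), much like the video path above.
    sp<MediaSource> audioEncoder = createAudioSource();
    if (audioEncoder == NULL) {
        return UNKNOWN_ERROR;
    }
    writer->addSource(audioEncoder); // becomes the audio track in MPEG4Writer
    return OK;
}
```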
Also worth noting: MPEG4Writer's start(MetaData*) kicks off two things.
a) startWriterThread
starts a thread dedicated to writing:
```cpp
void MPEG4Writer::threadFunc() {
    ALOGV("threadFunc");

    prctl(PR_SET_NAME, (unsigned long)"MPEG4Writer", 0, 0, 0);

    Mutex::Autolock autoLock(mLock);
    while (!mDone) {
        Chunk chunk;
        bool chunkFound = false;

        while (!mDone && !(chunkFound = findChunkToWrite(&chunk))) {
            mChunkReadyCondition.wait(mLock);
        }

        // Actual write without holding the lock in order to
        // reduce the blocking time for media track threads.
        if (chunkFound) {
            mLock.unlock();
            writeChunkToFile(&chunk);
            mLock.lock();
        }
    }

    writeAllChunks();
}
```
b) startTracks
```cpp
status_t MPEG4Writer::startTracks(MetaData *params) {
    for (List<Track *>::iterator it = mTracks.begin();
         it != mTracks.end(); ++it) {
        status_t err = (*it)->start(params);

        if (err != OK) {
            for (List<Track *>::iterator it2 = mTracks.begin();
                 it2 != it; ++it2) {
                (*it2)->stop();
            }

            return err;
        }
    }
    return OK;
}
```
which then calls each Track's start method:
```cpp
status_t MPEG4Writer::Track::start(MetaData *params) {
    ...
    initTrackingProgressStatus(params);
    ...
    status_t err = mSource->start(meta.get()); // this runs CameraSource::start(); the two are tied together
    ...
    pthread_create(&mThread, &attr, ThreadWrapper, this);
    return OK;
}

void *MPEG4Writer::Track::ThreadWrapper(void *me) {
    Track *track = static_cast<Track *>(me);
    status_t err = track->threadEntry();
    return (void *) err;
}
```
status_t MPEG4Writer::Track::threadEntry() then runs on yet another newly started thread. It loops, repeatedly reading data out of the CameraSource via read() and writing it to the file. The CameraSource's data of course comes back from the driver (see CameraSourceListener: CameraSource keeps a list, mFramesReceived, holding exactly the frames arriving from the driver, and calls mFrameAvailableCondition.signal when one arrives; frames received before recording has started are simply dropped. Note that MediaWriter starts the CameraSource's start method first, and only then starts writing the tracks).
Note: strictly speaking, what MPEG4Writer reads here is the OMXCodec's output. Data first arrives at CameraSource, the codec encodes it, and only then does MPEG4Writer write it to the file! For how buffers travel between CameraSource, OMXCodec, and MPEG4Writer, see the discussion of buffer flow at http://guoh.org/lifelog/2013/06/interaction-between-stagefright-and-codec/.
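To make the hand-off pattern concrete, here is a small self-contained mock in plain C++11 (my own illustration, not AOSP code): track threads queue chunks and signal a condition variable, and a single writer thread drains the queue, just like threadFunc() above:

```cpp
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>

// Toy model of MPEG4Writer's chunk hand-off: track threads produce chunks,
// one writer thread consumes them.
std::mutex gLock;
std::condition_variable gChunkReady;
std::queue<std::string> gChunks;
bool gDone = false;

void trackThread(int id) {
    for (int i = 0; i < 3; ++i) {
        {
            std::lock_guard<std::mutex> guard(gLock);
            gChunks.push("track " + std::to_string(id) +
                         " chunk " + std::to_string(i));
        }
        gChunkReady.notify_one(); // like mChunkReadyCondition.signal()
    }
}

void writerThread() {
    std::unique_lock<std::mutex> lock(gLock);
    while (!gDone || !gChunks.empty()) {
        while (!gDone && gChunks.empty()) {
            gChunkReady.wait(lock); // like mChunkReadyCondition.wait(mLock)
        }
        while (!gChunks.empty()) {
            std::string chunk = std::move(gChunks.front());
            gChunks.pop();
            lock.unlock();                   // write without holding the lock,
            std::cout << chunk << std::endl; // like writeChunkToFile()
            lock.lock();
        }
    }
}

int main() {
    std::thread writer(writerThread);
    std::vector<std::thread> tracks;
    for (int id = 0; id < 2; ++id) tracks.emplace_back(trackThread, id);
    for (auto& t : tracks) t.join();
    {
        std::lock_guard<std::mutex> guard(gLock);
        gDone = true; // like mDone: tell the writer to flush and exit
    }
    gChunkReady.notify_one();
    writer.join();
    return 0;
}
```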
Stepping back: what does Stagefright actually do? To me it serves mostly as glue. It sits at the MediaPlayerService layer and ties MediaSource, MediaWriter, the codec, and the upper-layer MediaRecorder together, and that is probably its biggest job. Replacing OpenCore with it also fits Google's usual engineering-minded style: compared with the more academic school, Google tends to solve problems in the simplest way it can, even though plenty of its work is complex too.
What feels a bit odd is that MediaRecorder lives inside MediaPlayerService; the two sound like opposites. Maybe one day they will be renamed, or split apart. Who knows.
Of course this is only a rough overview; I'll try to analyze the codec side separately later!
Some details were left out above; here are a few points worth noting:
1. Time-lapse recording
The CameraSource variant used here is CameraSourceTimeLapse.
Concretely, dataCallbackTimestamp uses skipCurrentFrame to decide whether to drop the incoming frame, with a few member variables doing the bookkeeping:
- mTimeBetweenTimeLapseVideoFramesUs (1E6 / videoFrameRate): the interval between two frames in the output video
- mLastTimeLapseFrameRealTimestampUs: the real timestamp of the last frame that was kept
From the configured rate it works out how far apart kept frames should be; every frame in between is dropped via releaseOneRecordingFrame.
In other words, what the driver delivers is unchanged; the dropping happens purely in software on our side, as the sketch below shows.
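A minimal sketch of that decision, assuming the members described above (simplified; in AOSP this logic lives in CameraSourceTimeLapse, which additionally rewrites the kept frame's timestamp so the output plays back at the normal frame rate):

```cpp
// Minimal sketch of the drop decision; shouldSkipFrame is an illustrative
// name, not the AOSP one.
bool shouldSkipFrame(int64_t timestampUs,
                     int64_t *lastKeptRealTimestampUs,
                     int64_t timeBetweenFrameCaptureUs) {
    if (*lastKeptRealTimestampUs == 0) {
        *lastKeptRealTimestampUs = timestampUs; // first frame: always keep
        return false;
    }
    if (timestampUs < *lastKeptRealTimestampUs + timeBetweenFrameCaptureUs) {
        return true;  // too early: drop via releaseOneRecordingFrame()
    }
    *lastKeptRealTimestampUs = timestampUs;     // due: keep this one
    return false;
}
```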
For background on time-lapse photography, see
https://en.wikipedia.org/wiki/Time-lapse_photography
2. When recording needs the Camera, it goes through ICameraRecordingProxy, that is, the RecordingProxy inside Camera (a BnCameraRecordingProxy).
Once the ICameraRecordingProxy has been passed over binder into the service process, it becomes a Bp there, as follows:
```cpp
case SET_CAMERA: {
    ALOGV("SET_CAMERA");
    CHECK_INTERFACE(IMediaRecorder, data, reply);
    sp<ICamera> camera = interface_cast<ICamera>(data.readStrongBinder());
    sp<ICameraRecordingProxy> proxy =
        interface_cast<ICameraRecordingProxy>(data.readStrongBinder());
    reply->writeInt32(setCamera(camera, proxy));
    return NO_ERROR;
} break;
```
Inside CameraSource it is then used like this:
```cpp
// We get the proxy from Camera, not ICamera. We need to get the proxy
// to the remote Camera owned by the application. Here mCamera is a
// local Camera object created by us. We cannot use the proxy from
// mCamera here.
mCamera = Camera::create(camera);
if (mCamera == 0) return -EBUSY;
mCameraRecordingProxy = proxy;
mCameraFlags |= FLAGS_HOT_CAMERA;
```
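When recording actually starts, CameraSource then drives the application's remote Camera through that proxy instead of through ICamera directly; roughly (paraphrased from CameraSource::startCameraRecording as I remember it):

```cpp
// Paraphrased sketch: with a "hot" camera handed over by the app,
// recording is started through the proxy; otherwise CameraSource owns
// the camera and talks to it directly.
if (mCameraFlags & FLAGS_HOT_CAMERA) {
    mCameraRecordingProxy->startRecording(new ProxyListener(this));
} else {
    mCamera->setListener(new CameraSourceListener(this));
    mCamera->startRecording();
}
// Consumed frames are returned along the same path:
// mCameraRecordingProxy->releaseRecordingFrame(frame);
```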
Open questions:
What is this member of CameraSource for?
List<sp<IMemory> > mFramesBeingEncoded;
Each frame handed out for encoding is saved here by CameraSource, and only when the buffer is released are these frames released in turn. Is this done for efficiency? Why not release each frame as soon as it has been encoded?
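My reading: the IMemory frames are shared zero-copy with the encoder, so they cannot be handed back to the camera driver while the encoder may still be reading from them; the list tracks exactly those in-flight frames. Sketched from memory (simplified, not verbatim):

```cpp
// When the encoder releases a MediaBuffer, CameraSource looks up the
// IMemory frame backing it and only then returns it to the driver.
void CameraSource::signalBufferReturned(MediaBuffer *buffer) {
    Mutex::Autolock autoLock(mLock);
    for (List<sp<IMemory> >::iterator it = mFramesBeingEncoded.begin();
         it != mFramesBeingEncoded.end(); ++it) {
        if ((*it)->pointer() == buffer->data()) {
            releaseOneRecordingFrame(*it); // give the buffer back to the driver
            mFramesBeingEncoded.erase(it);
            mFrameCompleteCondition.signal();
            return;
        }
    }
    CHECK(!"signalBufferReturned: bogus buffer");
}
```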
And once more I can't help marveling at Google's recurring delete this; idiom: ingenious, but it does look jarring!
Original article: http://guoh.org/lifelog/2013/06/android-mediarecorder-architecture/