Analysis of the Android audio subsystem (1) --- The service-call architecture

Let us take the call to static status_t setStreamMute(int stream, bool mute); in AudioSystem as an example.

The corresponding code in AudioSystem.cpp is as follows:

status_t AudioSystem::setStreamMute(int stream, bool mute)
{
    if (uint32_t(stream) >= NUM_STREAM_TYPES) return BAD_VALUE;
    const sp<IAudioFlinger>& af = AudioSystem::get_audio_flinger();
    if (af == 0) return PERMISSION_DENIED;
    af->setStreamMute(stream, mute);
    return NO_ERROR;
}

 

get_audio_flinger returns the gAudioFlinger pointer, which corresponds to the "media.audio_flinger" service:

binder = sm->getService(String16("media.audio_flinger"));
gAudioFlinger = interface_cast<IAudioFlinger>(binder);
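For context, here is a simplified sketch of get_audio_flinger itself, based on the Froyo/Gingerbread-era AudioSystem.cpp this analysis appears to follow; the AudioFlingerClient death-notification bookkeeping is omitted:

// Simplified sketch of AudioSystem::get_audio_flinger().
// Lazily looks up the "media.audio_flinger" service and caches a proxy to it.
const sp<IAudioFlinger>& AudioSystem::get_audio_flinger()
{
    Mutex::Autolock _l(gLock);
    if (gAudioFlinger == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            // Ask the service manager for the binder AudioFlinger registered.
            binder = sm->getService(String16("media.audio_flinger"));
            if (binder != 0) break;
            LOGW("AudioFlinger not published, waiting...");
            usleep(500000); // wait 0.5 s and retry until mediaserver is up
        } while (true);
        // Wrap the raw binder in a BpAudioFlinger proxy.
        gAudioFlinger = interface_cast<IAudioFlinger>(binder);
    }
    return gAudioFlinger;
}

interface_cast<IAudioFlinger> wraps the raw binder in a BpAudioFlinger proxy, which is why af->setStreamMute below lands in the proxy-side code.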

af->setStreamMute(stream, mute);

This then invokes the proxy-side implementation in IAudioFlinger.cpp:

virtual status_t setStreamMute(int stream, bool muted)
    {
        Parcel data, reply;
        data.writeInterfaceToken(IAudioFlinger::getInterfaceDescriptor());
        data.writeInt32(stream);
        data.writeInt32(muted);
        remote()->transact(SET_STREAM_MUTE, data, &reply);
        return reply.readInt32();
    }

 

This is where the Binder IPC mechanism comes in: the SET_STREAM_MUTE transaction is handed over to the service side, which dispatches it in BnAudioFlinger::onTransact:

status_t BnAudioFlinger::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
        case SET_STREAM_MUTE: {
            CHECK_INTERFACE(IAudioFlinger, data, reply);
            int stream = data.readInt32();
            reply->writeInt32( setStreamMute(stream, data.readInt32()) );
            return NO_ERROR;
        } break;
        …
}

The key point here is the call to setStreamMute(stream, data.readInt32()).

 

So, with BnAudioFlinger acting as a service, where is the actual setStreamMute(stream, data.readInt32()) operation implemented?

 

class AudioFlinger : public BnAudioFlinger, public IBinder::DeathRecipient

As this declaration shows, it is actually AudioFlinger that, acting as the service, implements the IAudioFlinger interface.
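How does AudioFlinger get bound to the name "media.audio_flinger" in the first place? A minimal sketch of the registration side, assuming the classic mediaserver setup of that era (later trees do the same thing through the BinderService<> helper):

// Sketch: how the service side becomes reachable under "media.audio_flinger".
// In the mediaserver process, AudioFlinger::instantiate() registers a new
// AudioFlinger instance with the service manager:
void AudioFlinger::instantiate() {
    defaultServiceManager()->addService(
            String16("media.audio_flinger"), new AudioFlinger());
}
// From then on, any client's sm->getService(String16("media.audio_flinger"))
// returns a binder whose transactions are dispatched through
// BnAudioFlinger::onTransact, i.e. to this AudioFlinger object.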

 

 

status_t AudioFlinger::setStreamMute(int stream, bool muted)
{
    // check calling permissions
    if (!settingsAllowed()) {
        return PERMISSION_DENIED;
    }

    if (stream < 0 || uint32_t(stream) >= AudioSystem::NUM_STREAM_TYPES ||
        uint32_t(stream) == AudioSystem::ENFORCED_AUDIBLE) {
        return BAD_VALUE;
    }

    mStreamTypes[stream].mute = muted;
    for (uint32_t i = 0; i < mPlaybackThreads.size(); i++)
       mPlaybackThreads.valueAt(i)->setStreamMute(stream, muted);

    return NO_ERROR;
}

 

DefaultKeyedVector< int, sp<PlaybackThread> > mPlaybackThreads; is a data member of AudioFlinger; structurally it is simply a map (keyed vector).

for (uint32_t i = 0; i < mPlaybackThreads.size(); i++)
       mPlaybackThreads.valueAt(i)->setStreamMute(stream, muted);

This shows that AudioFlinger can serve multiple playback threads at once. Since setStreamMute in AudioSystem is a static member function meant as a system-wide operation, muting the corresponding stream type on every playback thread is the reasonable thing to do.
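As a side note, DefaultKeyedVector comes from utils/KeyedVector.h. A small illustrative sketch of how it behaves, using a made-up FakeThread type rather than the real PlaybackThread:

#include <utils/KeyedVector.h>
#include <utils/RefBase.h>

using namespace android;

struct FakeThread : public RefBase {            // stand-in for PlaybackThread
    void setStreamMute(int /*stream*/, bool /*muted*/) {}
};

void example() {
    // Same container type AudioFlinger uses, but with a made-up value type.
    DefaultKeyedVector< int, sp<FakeThread> > threads;

    threads.add(1, new FakeThread());           // key -> value
    threads.add(2, new FakeThread());

    // Iteration by index, the way AudioFlinger::setStreamMute does it:
    for (size_t i = 0; i < threads.size(); i++) {
        threads.valueAt(i)->setStreamMute(3 /*stream*/, true);
    }

    // Lookup by key is also available; DefaultKeyedVector returns the
    // default value (here a NULL sp<>) when the key is missing.
    sp<FakeThread> t = threads.valueFor(1);
}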

 

class PlaybackThread : public ThreadBase

class ThreadBase : public Thread

class Thread : virtual public RefBase

 

mPlaybackThreads.valueAt(i)->setStreamMute(stream, muted);

This actually calls the setStreamMute function of the corresponding playback thread, shown below:

status_t AudioFlinger::PlaybackThread::setStreamMute(int stream, bool muted)
{
#ifdef LVMX
    int audioOutputType = LifeVibes::getMixerType(mId, mType);
    if (LifeVibes::audioOutputTypeIsLifeVibes(audioOutputType)) {
        LifeVibes::setStreamMute(audioOutputType, stream, muted);
    }
#endif
    mStreamTypes[stream].mute = muted;
    return NO_ERROR;
}
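Setting mStreamTypes[stream].mute by itself does not silence anything immediately; the flag takes effect the next time the mixer thread prepares its tracks. Roughly, and paraphrased rather than quoted from the source, the volume decision inside MixerThread::prepareTracks_l looks like this:

// Paraphrased sketch (not the literal source): a muted stream type forces the
// track's mix volume to zero, so the track keeps advancing but contributes
// silence. The real code uses fixed-point volumes and clamping.
float left, right;
if (track->isMuted() || masterMute || mStreamTypes[track->type()].mute) {
    left = right = 0;
} else {
    float typeVolume = mStreamTypes[track->type()].volume;
    float v = masterVolume * typeVolume;
    left  = v * cblk->volume[0];   // fold in the per-track volume taken from
    right = v * cblk->volume[1];   // the track's shared control block
}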

 

AudioSystem provides the upper layers with system-wide audio operations.

AudioTrack provides the upper layers with the playback interface.

AudioRecord provides the upper layers with the recording interface.

The service all of them ultimately rely on is still AudioFlinger (AudioFlinger.cpp).
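To make the client side concrete, here is a minimal sketch of how native code of that era might drive AudioTrack; the constructor arguments and AudioSystem constants follow the old native API and may differ slightly between versions, so treat them as illustrative rather than exact:

#include <media/AudioTrack.h>

using namespace android;

// Hypothetical helper: play one buffer of 16-bit stereo PCM at 44.1 kHz.
void playPcmOnce(const void* pcm, size_t bytes)
{
    AudioTrack track(AudioSystem::MUSIC,          // stream type
                     44100,                       // sample rate
                     AudioSystem::PCM_16_BIT,     // sample format
                     AudioSystem::CHANNEL_OUT_STEREO);

    if (track.initCheck() != NO_ERROR) {
        return;                                   // AudioFlinger refused the track
    }

    track.start();        // -> BpAudioTrack::start() -> TrackHandle::start()
    track.write(pcm, bytes);  // push PCM into the shared control-block buffer
    track.stop();
}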

 

class TrackHandle : public android::BnAudioTrack
{
private:
        sp<PlaybackThread::Track> mTrack;
};

 

Let us take AudioTrack's start as an example.

class BpAudioTrack : public BpInterface<IAudioTrack>

 

virtual status_t start()
    {
        Parcel data, reply;
        data.writeInterfaceToken(IAudioTrack::getInterfaceDescriptor());
        status_t status = remote()->transact(START, data, &reply);
        if (status == NO_ERROR) {
            status = reply.readInt32();
        } else {
            LOGW("start() error: %s", strerror(-status));
        }
        return status;
    }

 

 

status_t BnAudioTrack::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
        case START: {
            CHECK_INTERFACE(IAudioTrack, data, reply);
            reply->writeInt32(start());
            return NO_ERROR;
        } break;
        …
}

 

In the function sp<IAudioTrack> AudioFlinger::createTrack we find:

sp<TrackHandle> trackHandle;
trackHandle = new TrackHandle(track);
return trackHandle;
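For the bigger picture, here is an abbreviated, paraphrased sketch of the createTrack path that produces this TrackHandle (argument lists trimmed into comments; based on the AudioFlinger of that era, so details may differ):

// Abbreviated sketch of AudioFlinger::createTrack() (arguments trimmed):
// the requested output maps to a playback thread, which creates the real
// Track object; the Track is then wrapped in a TrackHandle so the client
// receives an IAudioTrack binder rather than direct access to the thread.
sp<IAudioTrack> AudioFlinger::createTrack(/* pid, streamType, sampleRate, ... */)
{
    sp<PlaybackThread::Track> track;
    sp<TrackHandle> trackHandle;

    PlaybackThread *thread = checkPlaybackThread_l(output);   // find the output's thread
    track = thread->createTrack_l(/* client, streamType, ... */);

    trackHandle = new TrackHandle(track);                      // Bn-side wrapper
    return trackHandle;                                        // sent back over Binder
}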

 

Combined with the declaration class TrackHandle : public android::BnAudioTrack,

it is clear that TrackHandle is what actually implements the IAudioTrack service, and the call lands in TrackHandle's start:

 

 

status_t AudioFlinger::TrackHandle::start() {
    return mTrack->start();
}

 

class TrackHandle : public android::BnAudioTrack {
private:
        sp<PlaybackThread::Track> mTrack;
    };

 

 

So in the end, the call reaches the start method of the PlaybackThread::Track class, declared as:

status_t AudioFlinger::PlaybackThread::Track::start()
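The body is not shown above; roughly, Track::start promotes its weak reference to the owning playback thread, marks the track ACTIVE, and adds it to that thread's active track list so the mixer will pick it up. A paraphrased sketch:

// Paraphrased sketch of AudioFlinger::PlaybackThread::Track::start()
// (pause/resume handling omitted):
status_t AudioFlinger::PlaybackThread::Track::start()
{
    status_t status = NO_ERROR;
    sp<ThreadBase> thread = mThread.promote();   // weak -> strong ref to the owner thread
    if (thread != 0) {
        Mutex::Autolock _l(thread->mLock);
        mState = ACTIVE;                          // track is now eligible for mixing
        PlaybackThread *playbackThread = (PlaybackThread *)thread.get();
        playbackThread->addTrack_l(this);         // queue it; wakes the mixer loop
    } else {
        status = BAD_VALUE;                       // owning thread is gone
    }
    return status;
}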

 

(PS: the analysis above is not guaranteed to be correct and remains open to improvement.)
