Android Audio and Video Deep Dive, Part 5: Recording Video Properly (with source code)

The project for this article is in the repository below, under the module named 錄視頻 (video recording). A star would be appreciated.

https://github.com/979451341/Audio-and-video-learning-materials

This time the recorded video plays in every player and shows its duration. Compared with the previous article's code, I'll explain why the two behave differently, but first let me fill in some official MediaCodec documentation I skipped earlier, plus MediaCodec.BufferInfo.

1. Supplementary notes on MediaCodec

BUFFER_FLAG_CODEC_CONFIG: the buffer so flagged contains codec initialization / codec-specific data rather than media data.

BUFFER_FLAG_END_OF_STREAM: signals the end of the stream.

BUFFER_FLAG_SYNC_FRAME: the buffer contains the data for a sync (key) frame.

INFO_OUTPUT_BUFFERS_CHANGED: the output buffers have changed; from this point on the client must use the new set of output buffers returned by getOutputBuffers().

INFO_OUTPUT_FORMAT_CHANGED: the output format has changed; subsequent data will follow the new format.

INFO_TRY_AGAIN_LATER: indicates that the call to dequeueOutputBuffer() timed out.

dequeueInputBuffer(long timeoutUs): returns the index of an input buffer to be filled with valid data, or -1 if no such buffer is currently available.

dequeueOutputBuffer(MediaCodec.BufferInfo info, long timeoutUs): dequeues an output buffer, blocking for at most timeoutUs microseconds.

flush(): flushes both the input and output ports of the component; all buffer indices previously returned by dequeueInputBuffer(long) and dequeueOutputBuffer(MediaCodec.BufferInfo, long) become invalid.

MediaCodecInfo getCodecInfo(): gets the codec info.

getInputBuffers(): call this after start() returns.

getOutputBuffers(): call this after start() returns, and whenever dequeueOutputBuffer() signals an output-buffer change by returning INFO_OUTPUT_BUFFERS_CHANGED.

MediaFormat getOutputFormat(): call this after dequeueOutputBuffer() signals a format change by returning INFO_OUTPUT_FORMAT_CHANGED.

queueInputBuffer(int index, int offset, int size, long presentationTimeUs, int flags): after filling the input buffer at the given index with data, submit it to the component.

MediaCodec.BufferInfo: per-buffer metadata, including an offset and a size that specify the range of valid data in the associated codec buffer. I think of it as the set of parameters you have to respect when writing the buffer's data out to a file; see the sketch just below.
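
To make the offset/size idea concrete, here is a minimal sketch of my own (not code from this project; TIMEOUT_USEC is a placeholder, and muxer/trackIndex are assumed to be an already-started MediaMuxer and its video track) showing how BufferInfo is typically used to bound the valid region of an output buffer before handing it to the muxer:

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = mediaCodec.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (outIndex >= 0) {
    ByteBuffer encoded = mediaCodec.getOutputBuffers()[outIndex];
    // Only the bytes in [offset, offset + size) are valid encoded data.
    encoded.position(info.offset);
    encoded.limit(info.offset + info.size);
    muxer.writeSampleData(trackIndex, encoded, info);   // muxer/trackIndex assumed set up elsewhere
    mediaCodec.releaseOutputBuffer(outIndex, false);     // give the buffer back to the encoder
}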

2. Code comparison

Without further ado, let's go straight to the video-encoding part, starting by comparing the two projects' MediaFormat setup:

// Project 1: byte-buffer input, YUV420 semi-planar color format
mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, this.mWidth, this.mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

// Project 2: Surface input color format
final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);  // API >= 18
format.setInteger(MediaFormat.KEY_BIT_RATE, calcBitRate());
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

Both set the same width, height, and video MIME type, and even the frame rate and I-frame interval are essentially the same. The only real difference is MediaFormat.KEY_COLOR_FORMAT, whose official description is:
set by the user for encoders, readable in the output format of decoders.

In other words, this key tells the encoder how it will receive frames: the first project feeds YUV byte buffers (COLOR_FormatYUV420SemiPlanar), while the second lets the encoder pull frames from a Surface (COLOR_FormatSurface). So the biggest difference between the two projects is how frames reach the encoder.
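
As a rough sketch of my own (not the project's actual code; width, height, bitRate and frameRate are placeholders, and exception handling is omitted), this is what the Surface-input path looks like: with COLOR_FormatSurface the encoder exposes an input Surface, and whatever is rendered onto that Surface (for example via OpenGL ES) gets encoded, so no YUV copying or conversion happens in Java code.

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called after configure() and before start() (API >= 18)
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// frames drawn onto inputSurface are now delivered to the encoder automatically

With that difference in mind, let's continue the comparison with the two projects' encode functions.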

private void encodeFrame(byte[] input) {
    Log.w(TAG, "VideoEncoderThread.encodeFrame()");

    // Convert the raw NV21 data to I420 semi-planar
    NV21toI420SemiPlanar(input, mFrameData, this.mWidth, this.mHeight);

    ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();

    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(mFrameData);
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, mFrameData.length, System.nanoTime() / 1000, 0);
    } else {
        Log.e(TAG, "input buffer not available");
    }

    // ...... (remainder omitted)
}
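
For context on where encodeFrame()'s input comes from (my own sketch, not the project's exact wiring): it is typically driven by the camera preview callback, which delivers frames as NV21 byte arrays by default; the handoff method below is hypothetical.

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // data is NV21 by default; hand it over to the encoder thread
        videoEncoderThread.add(data);   // hypothetical handoff method
    }
});

The second project's corresponding encode function is below.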

protected void encode(final ByteBuffer buffer, final int length, final long presentationTimeUs) {
    if (!mIsCapturing) return;
    final ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
    while (mIsCapturing) {
        final int inputBufferIndex = mMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
        if (inputBufferIndex >= 0) {
            final ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            if (buffer != null) {
                inputBuffer.put(buffer);
            }
            // if (DEBUG) Log.v(TAG, "encode:queueInputBuffer");
            if (length <= 0) {
                // send EOS
                mIsEOS = true;
                if (DEBUG) Log.i(TAG, "send BUFFER_FLAG_END_OF_STREAM");
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, 0,
                        presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                break;
            } else {
                mMediaCodec.queueInputBuffer(inputBufferIndex, 0, length,
                        presentationTimeUs, 0);
            }
            break;
        } else if (inputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // wait until the MediaCodec encoder is ready to encode
            // nothing to do here because MediaCodec#dequeueInputBuffer(TIMEOUT_USEC)
            // will wait for at most TIMEOUT_USEC (10 msec) on each call
        }
    }
}

Honestly, it is basically the same. The only difference is that the first project's encode function starts with the conversion above: what actually gets encoded is i420bytes, produced from the camera's nv21bytes, as shown below.

private static void NV21toI420SemiPlanar(byte[] nv21bytes, byte[] i420bytes, int width, int height) {
    // The Y plane (width * height bytes) is identical in both layouts; copy it as-is.
    System.arraycopy(nv21bytes, 0, i420bytes, 0, width * height);
    // NV21 interleaves the chroma plane as VUVU..., while the semi-planar format the
    // encoder expects interleaves it as UVUV..., so swap each V/U pair.
    for (int i = width * height; i < nv21bytes.length; i += 2) {
        i420bytes[i] = nv21bytes[i + 1];
        i420bytes[i + 1] = nv21bytes[i];
    }
}
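
For reference (a sketch of mine, not project code): a YUV420 frame occupies width * height * 3 / 2 bytes, so the destination buffer handed to this conversion would be allocated along these lines.

// One YUV420 frame: width*height luma bytes plus width*height/2 interleaved chroma bytes
byte[] mFrameData = new byte[width * height * 3 / 2];
NV21toI420SemiPlanar(nv21Input, mFrameData, width, height);   // nv21Input comes from the camera preview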

Now let's look at how each frame, once encoded, gets pushed into the muxer. The code below watches the encoder's state around each frame, writes the encoded data into the MP4 file, and finally releases everything.

drain();
// request stop recording
signalEndOfInputStream();
// process output data again for the EOS signal
drain();
// release all related objects
release();

Now let's see what drain() actually does. It starts by calling mMediaCodec.getOutputBuffers() to grab the output buffers, then checks the status returned by dequeueOutputBuffer(). If it keeps timing out, the loop is eventually abandoned. If the output buffers changed, getOutputBuffers() is called again. If the output format changed, the encoder's new MediaFormat is added as a track to the muxer and the muxer is started. Any other negative status is unexpected, so nothing can be done about it. Everything else is a normal output buffer, and with the help of the BufferInfo its data is written to the muxer.

protected void drain() {
    if (mMediaCodec == null) return;
    ByteBuffer[] encoderOutputBuffers = mMediaCodec.getOutputBuffers();
    int encoderStatus, count = 0;
    final MediaMuxerWrapper muxer = mWeakMuxer.get();
    if (muxer == null) {
        // throw new NullPointerException("muxer is unexpectedly null");
        Log.w(TAG, "muxer is unexpectedly null");
        return;
    }
    LOOP: while (mIsCapturing) {
        // get encoded data with maximum timeout duration of TIMEOUT_USEC(=10[msec])
        encoderStatus = mMediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
        if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // wait 5 counts(=TIMEOUT_USEC x 5 = 50msec) until data/EOS come
            if (!mIsEOS) {
                if (++count > 5)
                    break LOOP;     // out of while
            }
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            if (DEBUG) Log.v(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
            // this should not come when encoding
            encoderOutputBuffers = mMediaCodec.getOutputBuffers();
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            if (DEBUG) Log.v(TAG, "INFO_OUTPUT_FORMAT_CHANGED");
            // this status indicates that the output format of the codec has changed
            // this should come only once, before the actual encoded data,
            // but this status never comes on Android 4.3 or less;
            // in that case you should handle it when MediaCodec.BUFFER_FLAG_CODEC_CONFIG comes.
            if (mMuxerStarted) {    // a second request is an error
                throw new RuntimeException("format changed twice");
            }
            // get output format from codec and pass it to the muxer
            // getOutputFormat should be called after INFO_OUTPUT_FORMAT_CHANGED, otherwise it crashes.
            final MediaFormat format = mMediaCodec.getOutputFormat();  // API >= 16
            mTrackIndex = muxer.addTrack(format);
            mMuxerStarted = true;
            if (!muxer.start()) {
                // we should wait until the muxer is ready
                synchronized (muxer) {
                    while (!muxer.isStarted())
                        try {
                            muxer.wait(100);
                        } catch (final InterruptedException e) {
                            break LOOP;
                        }
                }
            }
        } else if (encoderStatus < 0) {
            // unexpected status
            if (DEBUG) Log.w(TAG, "drain:unexpected result from encoder#dequeueOutputBuffer: " + encoderStatus);
        } else {
            final ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
            if (encodedData == null) {
                // this should never come...may be a MediaCodec internal error
                throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
            }
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                // You should set the output format on the muxer here when targeting Android 4.3 or less,
                // but MediaCodec#getOutputFormat cannot be called here (INFO_OUTPUT_FORMAT_CHANGED hasn't come yet);
                // therefore we would have to build the output format from the buffer data.
                // This sample is for API >= 18 (>= Android 4.3), so just ignore this flag here.
                if (DEBUG) Log.d(TAG, "drain:BUFFER_FLAG_CODEC_CONFIG");
                mBufferInfo.size = 0;
            }

            if (mBufferInfo.size != 0) {
                // encoded data is ready, clear the waiting counter
                count = 0;
                if (!mMuxerStarted) {
                    // muxer is not ready...this is a programming failure.
                    throw new RuntimeException("drain:muxer hasn't started");
                }
                // write encoded data to the muxer (presentationTimeUs needs to be adjusted)
                mBufferInfo.presentationTimeUs = getPTSUs();
                muxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                prevOutputPTSUs = mBufferInfo.presentationTimeUs;
            }
            // return the buffer to the encoder
            mMediaCodec.releaseOutputBuffer(encoderStatus, false);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                // when EOS comes
                mIsCapturing = false;
                break;      // out of while
            }
        }
    }
}
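
drain() relies on getPTSUs() and prevOutputPTSUs, which are not shown in this excerpt. As a rough sketch (an assumption on my part, not necessarily the project's exact implementation), such a helper usually just produces a monotonically non-decreasing timestamp in microseconds:

private long prevOutputPTSUs = 0;

// presentationTimeUs must never go backwards, or the muxer will complain
protected long getPTSUs() {
    long result = System.nanoTime() / 1000L;   // wall-clock time in microseconds
    if (result < prevOutputPTSUs) {
        result = prevOutputPTSUs;              // clamp to keep timestamps monotonic
    }
    return result;
}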

That's a wrap. There is a lot of code here; try to grasp it at a higher level of abstraction. What matters is understanding the overall process, then the details. Summarize it for yourself.
