On Windows 8 there is a bug when switching an Image control's Source: if you swap the source at short intervals, the Image does not reliably display the corresponding picture. On top of that, the Image control does not support GIF playback at all, so playing GIF files is a real headache.
As it happened, I had recently been studying MFSource, and it occurred to me: could an MFSource be used to play GIF images?
It worked. The project lives at https://gifwin8player.codeplex.com/ and everyone is welcome to help me maintain it.
Below I'll share the technical problems I ran into during development, in the hope that some of it helps you.
First, some background: an MFSource lets us extend the media types that MediaElement supports and add new encoders/decoders. Microsoft has an excellent sample at http://code.msdn.microsoft.com/windowsapps/Media-extensions-sample-8e1b8275 that demonstrates several techniques. An MFSource can add new protocols to the player (for example, playing an encrypted URL) and can also register new decoders; an MFT can add video effects.
My sample is based on modifying that MFSource. The project has only three MFSource-related classes: GIFByteStreamHandler, GIFSrc, and GIFStream. GIFByteStreamHandler mainly handles opening the media, which can be either a stream or a URL; it also receives the parameters passed in from the app (IPropertyStore *pProps). GIFSrc does the main work: it fills in the video's metadata, such as its dimensions, and produces the data for each frame. GIFStream maintains the request/answer exchange: the player calls RequestSample(IUnknown* pToken) to request a frame, and we answer with m_pEventQueue->QueueEventParamUnk(MEMediaSample, GUID_NULL, S_OK, pSample).
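That request/answer handshake is easier to see stripped of all the COM plumbing. Here is a tiny framework-free model of the pattern (plain standard C++, not real MF code; only the method names RequestSample and DeliverPayload are borrowed from the project, everything else is my simplification):

```cpp
#include <cassert>
#include <queue>
#include <string>

// Minimal model of the MF request/answer handshake: the pipeline calls
// RequestSample(), and the stream answers by "queuing an event" (here,
// pushing onto `delivered`) once a decoded frame is available.
class ToyStream {
public:
    // Pipeline side: ask for the next frame.
    void RequestSample() { ++m_pendingRequests; Dispatch(); }

    // Source side: a decoded frame is ready.
    void DeliverPayload(const std::string& frame) {
        m_samples.push(frame);
        Dispatch();
    }

    // Frames the "pipeline" has received (stands in for MEMediaSample events).
    std::queue<std::string> delivered;

private:
    // Match each pending request with a queued sample.
    void Dispatch() {
        while (m_pendingRequests > 0 && !m_samples.empty()) {
            delivered.push(m_samples.front());
            m_samples.pop();
            --m_pendingRequests;
        }
    }

    int m_pendingRequests = 0;
    std::queue<std::string> m_samples;
};
```

The point of the model: a request with no data yet produces nothing, and data with no outstanding request sits in the queue until the player asks again.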
Media Foundation works on a message (event) request/answer model. Unlike Win32 MF, the Windows Store flavor of MF has no message-queue API, but the sample already implements one.
My modification strategy is simple: supply the frame data and its timestamp whenever they are requested.
The first change is in CGIFByteStreamHandler::BeginCreateObject. If the app sets MediaElement.Source = new Uri(...), we can get that URI from the LPCWSTR pwszURL parameter; if the app calls MediaElement.SetSource() instead, what comes in is a stream. Note that MediaElement.Source cannot point at an arbitrary file (for example, it cannot reference a picture in the Pictures library), but ms-appx style paths work fine.
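If you want to guard against the inaccessible-file case early, a trivial scheme check is enough. This helper is hypothetical (it is not in the project; the real handler just receives pwszURL and the open fails later if the app has no access to the file):

```cpp
#include <string>

// Hypothetical helper: returns true when a URI uses the in-package
// ms-appx scheme, which MediaElement.Source can hand to the handler.
// Arbitrary file paths (e.g. a picture from the Pictures library)
// are not reachable through MediaElement.Source.
bool IsPackageUri(const std::wstring& uri) {
    static const std::wstring scheme = L"ms-appx";
    return uri.compare(0, scheme.size(), scheme) == 0;
}
```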
CGIFSource runs as a singleton. CGIFSource::BeginOpen opens the stream. WIC needs an IStream; an IMFByteStream can be converted to an IStream directly, but that conversion function cannot be used in Windows Store apps, so we have to go through an IRandomAccessStream as an intermediary. The code looks like this:
```cpp
if (SUCCEEDED(hr))
{
    Microsoft::WRL::ComPtr<ABI::Windows::Storage::Streams::IRandomAccessStream> iRandomstream;
    hr = MFCreateStreamOnMFByteStreamEx(pStream, IID_PPV_ARGS(&iRandomstream));
    hr = CreateStreamOverRandomAccessStream(
        reinterpret_cast<IUnknown*>(iRandomstream.Get()),
        IID_PPV_ARGS(&istream));
}
```
This requires some extra setup: add import "windows.storage.idl"; to GIFSource.idl, include the header <Shcore.h>, and link against the corresponding Shcore.lib. I tried adding #include <Windows.winmd>, but that causes a compile error. Also, the file name in the idl import must match the casing used under C:\Program Files (x86)\Windows Kits\8.0\Include\winrt exactly.
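Concretely, the additions are small (paths and names as in the text above; adjust the library reference to however your project configures the linker):

```idl
// GIFSource.idl
import "windows.storage.idl";
```

```cpp
// In the source file that calls CreateStreamOverRandomAccessStream:
#include <Shcore.h>
#pragma comment(lib, "Shcore.lib")
```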
Asynchronous operations in MF follow the callback pattern:
```cpp
// Create an async result object. We'll use it later to invoke the callback.
if (SUCCEEDED(hr))
{
    hr = MFCreateAsyncResult(NULL, pCB, pState, &m_pBeginOpenResult);
}
if (m_pBeginOpenResult)
{
    hr = m_pBeginOpenResult->SetStatus(hr);
    if (SUCCEEDED(hr))
    {
        hr = MFInvokeCallback(m_pBeginOpenResult);
    }
}
```
This code creates the result object for an asynchronous request and then fires it; the result is ultimately handled in CGIFByteStreamHandler::Invoke.
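The shape of that pattern (create a result, set its status, invoke the callback) can be modeled without MF at all. This is a simplified stand-in for MFCreateAsyncResult / MFInvokeCallback, purely for illustration; it assumes nothing about the real interfaces:

```cpp
#include <functional>

// Toy stand-in for IMFAsyncResult: carries an HRESULT-like status and the
// callback to fire. Invoke() plays the role of MFInvokeCallback.
struct ToyAsyncResult {
    long status = 0;                                  // 0 stands in for S_OK
    std::function<void(ToyAsyncResult&)> callback;    // the "Invoke" target

    void SetStatus(long hr) { status = hr; }
    void Invoke() { if (callback) callback(*this); }  // fire the completion
};
```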
After the file is opened, we create the video's description. This happens in CGIFSource::CreateStream, which is called from DeliverPayload():
```cpp
HRESULT CGIFSource::CreateStream(long stream_id)
{
    OutputDebugString(L"call CreateStream function\n");

    HRESULT hr = S_OK;

    IMFMediaType *pType = NULL;
    IMFStreamDescriptor *pSD = NULL;
    CGIFStream *pStream = NULL;
    IMFMediaTypeHandler *pHandler = NULL;

    hr = MFCreateMediaType(&pType);

    if (SUCCEEDED(hr))
    {
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    }
    if (SUCCEEDED(hr))
    {
        hr = pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24);
    }

    // Format details.
    if (SUCCEEDED(hr))
    {
        // Frame size
        hr = MFSetAttributeSize(
            pType,
            MF_MT_FRAME_SIZE,
            m_pImage->GetWidth(),
            m_pImage->GetHeight()
            );
    }
    if (SUCCEEDED(hr))
    {
        // Frame rate
        hr = MFSetAttributeRatio(pType, MF_MT_FRAME_RATE, 24000, 1001); // 23.976 fps
    }
    if (SUCCEEDED(hr))
    {
        // Every frame is a key frame.
        hr = pType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, 1);
    }
    if (SUCCEEDED(hr))
    {
        // Create the stream descriptor from the media type.
        hr = MFCreateStreamDescriptor(stream_id, 1, &pType, &pSD);
    }

    // Set the default media type on the stream handler.
    if (SUCCEEDED(hr))
    {
        hr = pSD->GetMediaTypeHandler(&pHandler);
    }
    if (SUCCEEDED(hr))
    {
        hr = pHandler->SetCurrentMediaType(pType);
    }

    // Create the new stream.
    if (SUCCEEDED(hr))
    {
        pStream = new (std::nothrow) CGIFStream(this, pSD, hr);
        if (pStream == NULL)
        {
            hr = E_OUTOFMEMORY;
        }
    }
    if (SUCCEEDED(hr))
    {
        m_streams.AddStream(stream_id, pStream);
        pStream->AddRef();
    }

    SafeRelease(&pType);
    SafeRelease(&pSD);
    SafeRelease(&pStream);
    SafeRelease(&pHandler);   // was leaked in the original listing
    return hr;
}
```
Once the stream is created, the MF presentation descriptor is built by InitPresentationDescriptor(). When that function has run, initialization is complete.
When the user presses play, the system calls CGIFSource::CreatePresentationDescriptor to obtain the video information, then calls CGIFSource::Start. The internal message queue then runs CGIFSource::DoStart, which posts a MESourceStarted event to MF. After that event, CGIFStream::RequestSample gets triggered; if the queue maintained inside CGIFStream is empty, it asks CGIFSource for data, and the request finally lands in DeliverPayload():
```cpp
HRESULT CGIFSource::DeliverPayload()
{
    // When this method is called, the read buffer contains a complete
    // payload, and the payload belongs to a stream whose type we support.
    wchar_t buf[256];
    swprintf_s(buf, L"call do DeliverPayload function with frame %d", m_CurrentIndex);
    OutputDebugString((LPCWSTR)buf);

    HRESULT hr = S_OK;

    CGIFStream *pStream = NULL;             // not AddRef'd
    IMFMediaBuffer *pBuffer = NULL;
    IMFSample *pSample = NULL;
    BYTE *pData = NULL;                     // Pointer to the IMFMediaBuffer data.
    IWICBitmapFrameDecode *pWicFrame = NULL;

    // If we are still opening the file, then we might need to create this stream.
    if (SUCCEEDED(hr))
    {
        if (m_state == STATE_OPENING)
        {
            hr = CreateStream(0);
        }
    }

    // Create a media buffer for the payload.
    if (SUCCEEDED(hr))
    {
        hr = MFCreateMemoryBuffer(m_pImage->GetBuffSize(), &pBuffer);
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Lock(&pData, NULL, NULL);
    }
    if (SUCCEEDED(hr))
    {
        hr = m_pImage->GetFrameBuffByIndex(m_CurrentIndex, &pData);
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Unlock();
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->SetCurrentLength(m_pImage->GetBuffSize());
    }

    // Create a sample to hold the buffer.
    if (SUCCEEDED(hr))
    {
        hr = MFCreateSample(&pSample);
    }
    if (SUCCEEDED(hr))
    {
        hr = pSample->AddBuffer(pBuffer);
    }

    // Time stamp the sample.
    if (SUCCEEDED(hr))
    {
        hr = pSample->SetSampleTime(m_SampleTimespan);
        m_SampleTimespan += m_pImage->GetFrameDelay() * 10000;
    }

    // Deliver the payload to the stream.
    if (SUCCEEDED(hr))
    {
        hr = m_streams[0]->DeliverPayload(pSample);
    }

    // If the open operation is still pending, check if we're done.
    if (SUCCEEDED(hr))
    {
        if (m_state == STATE_OPENING)
        {
            hr = InitPresentationDescriptor();
            goto done;
        }
    }

    m_CurrentIndex++;
    if (m_CurrentIndex >= m_pImage->GetFrameNumbers())
    {
        hr = m_pEventQueue->QueueEventParamVar(MEEndOfStream, GUID_NULL, S_OK, NULL);
        if (FAILED(hr))
        {
            goto done;
        }
        hr = QueueAsyncOperation(SourceOp::OP_END_OF_STREAM);
        if (FAILED(hr))
        {
            goto done;
        }
    }
    else
    {
        if (SUCCEEDED(hr))
        {
            if (StreamsNeedData())
            {
                hr = RequestSample();
            }
        }
    }

done:
    SafeRelease(&pBuffer);
    SafeRelease(&pSample);
    return hr;
}
```
This function converts the image's frame data into a sample, stamps it with a time, and hands it back to CGIFStream, which then sends m_pEventQueue->QueueEventParamUnk(MEMediaSample, GUID_NULL, S_OK, pSample) to MF. That is how each frame's data gets delivered.
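The timestamping arithmetic above deserves a note: MF sample times are in 100-nanosecond units, and the code accumulates GetFrameDelay() * 10000, i.e. it treats the per-frame delay as milliseconds (1 ms = 10,000 units). A small helper makes the conversion explicit; the function name is mine, not from the project:

```cpp
#include <cstdint>

// MF timestamps are expressed in 100-ns units: 1 ms == 10,000 units.
// Accumulate a per-frame delay (in milliseconds) into the running
// sample time, mirroring m_SampleTimespan += GetFrameDelay() * 10000.
int64_t AccumulateSampleTime(int64_t currentTime100ns, uint32_t frameDelayMs) {
    return currentTime100ns + static_cast<int64_t>(frameDelayMs) * 10000;
}
```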
Once the source is written, it still has to be hooked into the project. One thing to note: extra information must be added to Package.appxmanifest. Right-click the file, choose View Code, and add this node under Package:
```xml
<Extensions>
  <Extension Category="windows.activatableClass.inProcessServer">
    <InProcessServer>
      <Path>GIFSource.dll</Path>
      <ActivatableClass ActivatableClassId="GIFSource.GIFByteStreamHandler" ThreadingModel="both" />
    </InProcessServer>
  </Extension>
</Extensions>
```
Then register the byte-stream handler with the MediaExtensionManager:
```csharp
_extensionManager.RegisterByteStreamHandler("GIFSource.GIFByteStreamHandler", ".gif", "image/gif");
```