Explanation of stride and plane in YUV images

Stride can be translated into Chinese as 跨距 (span/pitch).

Stride refers to the space each row of pixels occupies in memory. As shown in the figure below, in order to achieve memory alignment (or for some other reason), the space a row of pixels occupies in memory is not necessarily equal to the image width.
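As a rough illustration (not tied to any particular platform), the stride is often the row size in bytes rounded up to some alignment boundary. The 64-byte alignment below is only an assumed example value, which happens to reproduce the 720 → 768 case described later:

#include <assert.h>

/* Hypothetical helper: round a row size up to an assumed alignment boundary.
 * Real drivers/SDKs pick their own alignment; 64 here is just an example. */
static int aligned_stride(int width, int bytes_per_pixel, int alignment)
{
    int row_bytes = width * bytes_per_pixel;
    return (row_bytes + alignment - 1) / alignment * alignment;
}

/* For an 8-bit luma plane 720 pixels wide and a 64-byte alignment,
 * aligned_stride(720, 1, 64) == 768. */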



A plane usually appears as a luma plane or a chroma plane, which are simply the luma layer and the chroma layer; just as with RGB, three planes are needed to store the image.

I have recently been working on a project based on the HI3521, and ran into one key technical problem. Our image-processing program needs image files in RGB888 format, while the video stream I get from the hi3521 consists of frames in YUV420SP format. So the task was to convert a YUV420SP frame into an RGB888 image; what I ultimately want is the RGB888 image data. In YUV420SP the Y component is stored on its own, and the U and V components are stored interleaved. Fine, that is doable. But when I printed the frame information of the YUV420SP data, I found that the stride of a 720*576 frame was actually 768. I was puzzled; even now that I have successfully converted a YUV420SP frame into a BMP bitmap, I was still unsure what the extra 768-720 = 48 bytes per row were. Without taking the stride into account, I read the components directly from the start of each YUV plane, computed the RGB data and saved it as a BMP, but the BMP came out completely scrambled. Where was the problem? It had to be the stride. The stride is always greater than or equal to the frame width and is a multiple of 4; there are plenty of multiples of 4 between 720 and 768, so why 768? Fine. Since rows get padded with zeros at the end when the width does not meet the alignment requirement, I simply treated those 48 bytes as sitting at the end of each row, which means the addresses must be offset accordingly when reading the YUV components. I tried it, and the BMP was indeed saved correctly, looking just like a captured snapshot. Of course the remaining technical details are well known; I know no fewer than three formulas for converting YUV to RGB.
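As a minimal sketch of the idea (assuming the semi-planar NV12 layout, i.e. a Y plane followed by an interleaved UV plane with U first; NV21 swaps U and V), reading a pixel from such a frame has to index rows by the stride, not by the width:

#include <stdint.h>
#include <stddef.h>

/* Read one pixel's Y, U, V from a YUV420SP frame, honoring the stride.
 * Assumed layout: Y plane of stride*height bytes, then an interleaved UV
 * plane of stride*(height/2) bytes; U before V (NV12-style ordering). */
static void yuv420sp_read_pixel(const uint8_t *frame,
                                int stride, int height,
                                int x, int y,
                                uint8_t *Y, uint8_t *U, uint8_t *V)
{
    const uint8_t *y_plane  = frame;
    const uint8_t *uv_plane = frame + (size_t)stride * height;   /* UV plane follows Y plane */

    *Y = y_plane[(size_t)y * stride + x];                  /* rows are stepped by stride */
    *U = uv_plane[(size_t)(y / 2) * stride + (x & ~1)];    /* one U,V pair per 2x2 block */
    *V = uv_plane[(size_t)(y / 2) * stride + (x & ~1) + 1];
}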

The purpose of writing this note is to call out the stride issue: when converting video frames such as YUV420P and YUV420SP, be sure to take the stride parameter into account.

 

 

When a video image is stored in memory, the memory buffer might contain extra padding bytes after each row of pixels. The padding bytes affect how the image is stored in memory, but do not affect how the image is displayed.

The stride is the number of bytes from one row of pixels in memory to the next row of pixels in memory. Stride is also called pitch. If padding bytes are present, the stride is wider than the width of the image, as shown in the following illustration.

Diagram showing an image plus padding.

Two buffers that contain video frames with equal dimensions can have two different strides. If you process a video image, you must take the stride into account.

In addition, there are two ways that an image can be arranged in memory. In a top-down image, the top row of pixels in the image appears first in memory. In a bottom-up image, the last row of pixels appears first in memory. The following illustration shows the difference between a top-down image and a bottom-up image.

Diagram showing top-down and bottom-up images.

A bottom-up image has a negative stride, because stride is defined as the number of bytes needed to move down a row of pixels, relative to the displayed image. YUV images should always be top-down, and any image that is contained in a Direct3D surface must be top-down. RGB images in system memory are usually bottom-up.
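A minimal sketch of what a negative stride means in practice, assuming a bottom-up 8-bit buffer and purely illustrative names: the walk starts at the last row in memory (the top row on screen) and each step adds a negative stride.

#include <stdint.h>
#include <stddef.h>

/* Visit the rows of a bottom-up image in display order (top row first).
 * abs_stride is the positive row size in bytes; the effective display
 * stride is negative, because moving down the displayed image moves
 * backwards through memory. */
static void for_each_row_top_down(uint8_t *buffer, int abs_stride, int height,
                                  void (*row_fn)(uint8_t *row, void *ctx),
                                  void *ctx)
{
    uint8_t  *row    = buffer + (size_t)abs_stride * (height - 1); /* top row on screen */
    ptrdiff_t stride = -(ptrdiff_t)abs_stride;                     /* negative stride */

    for (int i = 0; i < height; i++) {
        row_fn(row, ctx);
        row += stride;
    }
}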

Video transforms in particular need to handle buffers with mismatched strides, because the input buffer might not match the output buffer. For example, suppose that you want to convert a source image and write the result to a destination image. Assume that both images have the same width and height, but might not have the same pixel format or the same image stride.

The following example code shows a generalized approach for writing this kind of function. This is not a complete working example, because it abstracts many of the specific details.
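Since that example code is not reproduced here, below is a minimal sketch of the same idea (all names are illustrative): the per-pixel conversion is abstracted behind a callback, and each row of the source and destination is located through its own stride.

#include <stdint.h>
#include <stddef.h>

/* Hypothetical per-pixel converter; the real body depends on the source and
 * destination pixel formats. */
typedef void (*convert_pixel_fn)(const uint8_t *src_pixel, uint8_t *dst_pixel);

/* Convert an image row by row. Source and destination may have different
 * strides and pixel sizes, so rows are addressed via their strides rather
 * than via width * bytes_per_pixel. */
static void convert_image(uint8_t *dst, int dst_stride, int dst_bpp,
                          const uint8_t *src, int src_stride, int src_bpp,
                          int width, int height,
                          convert_pixel_fn convert_pixel)
{
    for (int row = 0; row < height; row++) {
        const uint8_t *s = src + (size_t)row * src_stride;  /* start of source row */
        uint8_t       *d = dst + (size_t)row * dst_stride;  /* start of destination row */
        for (int col = 0; col < width; col++) {
            convert_pixel(s, d);
            s += src_bpp;
            d += dst_bpp;
        }
    }
}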

 

Below is a conversion example that illustrates this well.

I recently got hold of an LCD panel that outputs images using the NTSC interlaced standard and expects the UYVY variant of YUV 4:2:2 as its color format, while a certain video decoder outputs the I420 variant of YUV 4:2:0. A conversion between the two is therefore required. I420 is stored in planar format, whereas UYVY is stored in packed format. The conversion itself is not complicated; the principle is shown in Figure 1.

In Figure 2, each color component is represented by one byte. A stored sequence such as U0 Y0 V0 Y1 actually represents two pixels and takes 4 bytes in total, so on average each pixel occupies 2 bytes. The theoretical basis of the YUV color format is that the HVS (Human Visual System) is most sensitive to luma and less sensitive to chroma, so the chroma components of each row of pixels are subsampled to reduce the required storage. Packed YUV 4:2:2 occupies 2/3 of the storage space of YUV 4:4:4. For comparison, with YUV 4:4:4 every pixel needs all three components, i.e. 3 bytes per pixel.

Code implementation

#include <stdint.h>

void rv_csp_i420_uyvy(
    uint8_t *y_plane,   // Y plane of I420
    uint8_t *u_plane,   // U plane of I420
    uint8_t *v_plane,   // V plane of I420
    int y_stride,       // Y stride of I420, in pixels
    int uv_stride,      // U and V stride of I420, in pixels
    uint8_t *image,     // output UYVY image
    int width,          // image width
    int height)         // image height
{
    int row;
    int col;
    uint8_t *pImg = image;

    for (row = 0; row < height; row = row + 1)
    {
        for (col = 0; col < width; col = col + 2)
        {
            // Each UYVY group of 4 bytes encodes two horizontally adjacent pixels.
            pImg[0] = u_plane[row / 2 * uv_stride + col / 2];
            pImg[1] = y_plane[row * y_stride + col];
            pImg[2] = v_plane[row / 2 * uv_stride + col / 2];
            pImg[3] = y_plane[row * y_stride + col + 1];
            pImg += 4;
        }
    }
}

The code seems to have a small issue: it does not take the stride of the YUV 4:2:2 output into account when writing it out, but the code above already explains the principle clearly.
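For completeness, a possible variant (an untested sketch, not the original author's code) that also accepts an output stride for the packed UYVY buffer could look like this:

#include <stdint.h>
#include <stddef.h>

/* Same conversion as above, but with an explicit stride (in bytes) for the
 * packed UYVY output, so a padded destination buffer is handled as well. */
void rv_csp_i420_uyvy_strided(
    const uint8_t *y_plane, const uint8_t *u_plane, const uint8_t *v_plane,
    int y_stride, int uv_stride,      // I420 strides, in bytes
    uint8_t *image, int uyvy_stride,  // UYVY output stride, in bytes (>= width * 2)
    int width, int height)
{
    for (int row = 0; row < height; row++) {
        uint8_t *pImg = image + (size_t)row * uyvy_stride;  // start of this output row
        for (int col = 0; col < width; col += 2) {
            pImg[0] = u_plane[row / 2 * uv_stride + col / 2];
            pImg[1] = y_plane[row * y_stride + col];
            pImg[2] = v_plane[row / 2 * uv_stride + col / 2];
            pImg[3] = y_plane[row * y_stride + col + 1];
            pImg += 4;
        }
    }
}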
