Life is like a wayside inn, and I too am but a traveler.
- Android OpenGLES Development: Setting Up the EGL Environment
- Android OpenGLES 2.0 Development (1): A Difficult Start
- Android OpenGLES 2.0 Development (2): Environment Setup
- Android OpenGLES 2.0 Development (3): Drawing a Triangle
- Android OpenGLES 2.0 Development (4): Matrix Transformations and Camera Projection
- Android OpenGLES 2.0 Development (5): Drawing Squares and Circles
- Android OpenGLES 2.0 Development (6): The GLSL Shading Language
- Android OpenGLES 2.0 Development (7): Texture Mapping to Display Images
- Android OpenGLES 2.0 Development (8): Camera Preview
- Android OpenGLES 2.0 Development (9): Image Filters
- Android OpenGLES 2.0 Development (10): FBO Off-Screen Rendering
- Android OpenGLES 2.0 Development (11): Rendering YUV
Introduction

Remember the approach mentioned in Android OpenGLES 2.0 Development (8): Camera Preview for displaying the camera preview by rendering NV21 data? It is somewhat involved, so we did not cover it in detail at the time. In audio/video development, however, YUV is commonly used for transmission because it is much smaller than RGB, and once we receive YUV data we usually need to display it. In this chapter we take a detailed look at how to render YUV data efficiently with OpenGL.
YUV Formats

YUV is a color encoding method widely used in video processing and compression, particularly in television broadcasting, video conferencing, and video playback. It splits color information into three components: luminance (Y) and chrominance (U and V).
- Y (luminance): the brightness of the image, determining how light or dark it is.
- U (blue difference): the difference between the blue channel and the luminance.
- V (red difference): the difference between the red channel and the luminance.
Because the Y component is stored separately from U and V, YUV formats lend themselves to effective chroma compression, making them especially suitable for video transmission and storage.
Several common YUV formats:
YUV420P
The Y, U, and V planes are laid out as shown below, with U and V stored in separate planes. This is also known as the I420 format; swapping the order of the U and V planes gives the YV12 format. (Layout shown for a 4×2 image.)
I420: YYYYYYYY UU VV
YV12: YYYYYYYY VV UU
YUV420SP
In this format the U and V samples are interleaved in a single plane; depending on whether U or V comes first, it is called NV12 or NV21 (the Android default). (Layout shown for a 4×2 image.)
NV12: YYYYYYYY UVUV
NV21: YYYYYYYY VUVU
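The 1.5-bytes-per-pixel arithmetic behind these layouts can be sketched in a few lines of Java (the class and method names here are my own, not from the article's code):

```java
// Plane sizes and offsets implied by the YUV420 layouts above:
// the Y plane holds width*height bytes and each chroma plane a quarter of that.
public class Yuv420Layout {
    // Total bytes of any YUV420 frame: Y + U + V = w*h * 3/2
    public static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Offset of the U plane inside an I420 buffer (right after the Y plane)
    public static int i420UOffset(int width, int height) {
        return width * height;
    }

    // Offset of the V plane inside an I420 buffer (after Y and U)
    public static int i420VOffset(int width, int height) {
        return width * height + width * height / 4;
    }

    // Offset of the interleaved VU plane inside an NV21 buffer
    public static int nv21VuOffset(int width, int height) {
        return width * height;
    }
}
```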
YUV to RGB Conversion

We know that OpenGL ultimately renders RGBA data, so we need to convert YUV to RGB. The common conversion formulas are:
R = Y + 1.402 * (V - 128)
G = Y - 0.344136 * (U - 128) - 0.714136 * (V - 128)
B = Y + 1.772 * (U - 128)
The formulas above use the BT.601 standard. In practice there are several standards, each with different coefficients:
- ITU-R BT.601 (SDTV standard; applies to Android NV21)
- ITU-R BT.709 (HDTV standard; for 1080p video and above)
- ITU-R BT.2020 (UHDTV standard; for 4K and 8K video)
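The BT.601 formulas above can be checked on the CPU with a small sketch (the helper names are my own, not from the article; results are clamped to the valid 0–255 range):

```java
// CPU-side BT.601 full-range YUV -> RGB conversion for a single pixel,
// following the formulas R = Y + 1.402(V-128), etc.
public class Bt601 {
    private static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    // Convert one YUV pixel (each component in 0..255) to an {r, g, b} triple.
    public static int[] yuvToRgb(int yValue, int u, int v) {
        int r = clamp(yValue + 1.402 * (v - 128));
        int g = clamp(yValue - 0.344136 * (u - 128) - 0.714136 * (v - 128));
        int b = clamp(yValue + 1.772 * (u - 128));
        return new int[]{r, g, b};
    }
}
```

A neutral chroma value of 128 makes the U and V terms vanish, so a gray pixel stays gray, which is a quick sanity check for any coefficient set you swap in.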
An NV21 Mapping Example

Many readers probably share my question: the Y plane has four times as many samples as the UV plane, so how does each Y map to a UV pair? Let's illustrate with an example.
Suppose we have a 4×4 Y texture (one Y value per pixel) and a 2×2 UV texture, laid out as follows:
Y texture (4×4)
Y00 Y01 Y02 Y03
Y10 Y11 Y12 Y13
Y20 Y21 Y22 Y23
Y30 Y31 Y32 Y33
UV texture (2×2)
UV0 UV1
UV2 UV3
Each UV sample corresponds to 4 Y pixels:
(UV0) → {Y00, Y01, Y10, Y11}
(UV1) → {Y02, Y03, Y12, Y13}
(UV2) → {Y20, Y21, Y30, Y31}
(UV3) → {Y22, Y23, Y32, Y33}
In OpenGL, however, every fragment needs a UV value alongside its Y value, while the UV texture has only a quarter of the samples of the Y texture (2×2), so OpenGL interpolates the UV texture for us.
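To make the 4-to-1 mapping concrete, here is a small sketch (my own helper, not part of the article's code) that computes which UV sample a given Y pixel falls on: both coordinates are simply halved, because one UV pair covers a 2×2 block of Y samples.

```java
// Maps a Y pixel at (row, col) of a width-wide image to its index in the
// (width/2) x (height/2) UV plane, reproducing the table above.
public class UvMapping {
    public static int uvIndexFor(int row, int col, int width) {
        return (row / 2) * (width / 2) + (col / 2);
    }
}
```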
Converting YUV in OpenGL

With the theory above, we know all we need to do is upload the YUV data to OpenGL and let the shader convert it pixel by pixel. How does YUV data get into OpenGL? Through sampler2D textures. We also know that an OpenGL sampler2D texture exposes four values (RGBA), while Y is a single channel; in I420 the U and V planes are each single-channel, and in NV12/NV21 the interleaved UV plane can be treated as two channels.
So the problem becomes creating single-channel and dual-channel textures. OpenGL provides the GL_LUMINANCE and GL_LUMINANCE_ALPHA texture formats: GL_LUMINANCE is used to load the Y plane of NV21, and GL_LUMINANCE_ALPHA is used to load the UV plane.
GL_LUMINANCE:
A single-channel texture; in the resulting texture object, R, G, and B all hold the same value.

```java
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0,
        GLES20.GL_LUMINANCE, width, height, 0,
        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, imageData);
```
GL_LUMINANCE_ALPHA:
A dual-channel texture; U and V are stored in the R and A components.

```java
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0,
        GLES20.GL_LUMINANCE_ALPHA, width, height, 0,
        GLES20.GL_LUMINANCE_ALPHA, GLES20.GL_UNSIGNED_BYTE, imageData);
```
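One caveat worth noting when uploading tightly packed planes (an observation of mine; the article's code does not set this state): glTexImage2D expects each row to be padded to GL_UNPACK_ALIGNMENT bytes, 4 by default, so a plane whose width is not a multiple of 4 renders sheared unless you first call GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1). The sketch below computes the row stride GL assumes for a 1-byte-per-pixel upload:

```java
// Row stride (in bytes) that GL assumes when unpacking a 1-byte-per-pixel
// plane under a given GL_UNPACK_ALIGNMENT: the width rounded up to a
// multiple of the alignment. A mismatch with the real (tightly packed)
// stride is what produces the classic "sheared" YUV image.
public class UnpackAlignment {
    public static int assumedStride(int width, int alignment) {
        return ((width + alignment - 1) / alignment) * alignment;
    }
}
```

With alignment 1 the assumed stride always equals the packed width, which is why setting GL_UNPACK_ALIGNMENT to 1 before uploading YUV planes is the usual fix.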
Vertex Shader

The vertex shader is unchanged from the earlier Image chapter:

```java
// Vertex shader code
private final String vertexShaderCode =
        "uniform mat4 uMVPMatrix;\n" +
        // Vertex position
        "attribute vec4 vPosition;\n" +
        // Texture coordinate
        "attribute vec2 vTexCoordinate;\n" +
        "varying vec2 aTexCoordinate;\n" +
        "void main() {\n" +
        "  gl_Position = uMVPMatrix * vPosition;\n" +
        "  aTexCoordinate = vTexCoordinate;\n" +
        "}\n";
```
Fragment Shader

```java
// Fragment shader code
private final String fragmentShaderCode =
        "precision mediump float;\n" +
        "uniform sampler2D samplerY;\n" +
        "uniform sampler2D samplerU;\n" +
        "uniform sampler2D samplerV;\n" +
        "uniform sampler2D samplerUV;\n" +
        "uniform int yuvType;\n" +
        "varying vec2 aTexCoordinate;\n" +
        "void main() {\n" +
        "  vec3 yuv;\n" +
        "  if (yuvType == 0) {" +
        "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
        "    yuv.y = texture2D(samplerU, aTexCoordinate).r - 0.5;\n" +
        "    yuv.z = texture2D(samplerV, aTexCoordinate).r - 0.5;\n" +
        "  } else if (yuvType == 1) {" +
        "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
        "    yuv.y = texture2D(samplerUV, aTexCoordinate).r - 0.5;\n" +
        "    yuv.z = texture2D(samplerUV, aTexCoordinate).a - 0.5;\n" +
        "  } else {" +
        "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
        "    yuv.y = texture2D(samplerUV, aTexCoordinate).a - 0.5;\n" +
        "    yuv.z = texture2D(samplerUV, aTexCoordinate).r - 0.5;\n" +
        "  }" +
        "  vec3 rgb = mat3(1.0, 1.0, 1.0,\n" +
        "                  0.0, -0.344, 1.772,\n" +
        "                  1.402, -0.714, 0.0) * yuv;\n" +
        "  gl_FragColor = vec4(rgb, 1);\n" +
        "}\n";
```
At first glance the fragment shader looks complicated, so let me explain it in detail.
sampler2D
We declared four sampler variables above: samplerY, samplerU, samplerV, and samplerUV.
- yuvType=0: the format is I420; U and V are stored in separate planes, so each is mapped to its own texture. Uses samplerY, samplerU, and samplerV.
- yuvType=1: the format is NV12; UV is interleaved and mapped to a single texture. Uses samplerY and samplerUV.
- yuvType=2: the format is NV21; UV is interleaved and mapped to a single texture. Uses samplerY and samplerUV.
texture2D
We should be familiar with texture2D from earlier chapters: it fetches the RGBA value at the given texture coordinate.
- yuvType=0: I420; Y, U, and V are stored in separate planes, so reading .r is enough to get each component.
- yuvType=1: NV12; Y is read as above; UV is interleaved, so .r and .a give U and V.
- yuvType=2: NV21; Y is read as above; UV is interleaved with V first, so .a and .r give U and V.
Computing RGB

```glsl
vec3 rgb = mat3(1.0,   1.0,    1.0,
                0.0,   -0.344, 1.772,
                1.402, -0.714, 0.0) * yuv;
gl_FragColor = vec4(rgb, 1);
```

We used matrix multiplication here, but it is exactly the same as the formulas given earlier. If this form is unfamiliar, you can compute each channel separately:

```glsl
float r = yuv.x + 1.402 * yuv.z;
float g = yuv.x - 0.344 * yuv.y - 0.714 * yuv.z;
float b = yuv.x + 1.772 * yuv.y;
gl_FragColor = vec4(r, g, b, 1.0);
```

Both approaches produce the same result; for efficiency we truncated the coefficients to three decimal places.
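If you want to convince yourself the two forms agree, here is a plain-Java check (my own sketch, not part of the article's code). Note that GLSL mat3 constructors are column-major, so the columns of the matrix are (1, 1, 1), (0, -0.344, 1.772), and (1.402, -0.714, 0), which means each output channel is a dot product of a matrix row with (y, u, v):

```java
// Compares the mat3 form of the conversion with the per-channel formulas.
public class YuvMatrixCheck {
    // Row-times-vector expansion of the column-major mat3 used in the shader
    public static float[] viaMatrix(float y, float u, float v) {
        float r = 1.0f * y + 0.0f * u + 1.402f * v;
        float g = 1.0f * y - 0.344f * u - 0.714f * v;
        float b = 1.0f * y + 1.772f * u + 0.0f * v;
        return new float[]{r, g, b};
    }

    // The per-channel formulas written out directly
    public static float[] viaFormulas(float y, float u, float v) {
        return new float[]{
                y + 1.402f * v,
                y - 0.344f * u - 0.714f * v,
                y + 1.772f * u
        };
    }
}
```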
YUVFilter
Next, let's look at the complete YUVFilter code. This class was also copied from the earlier Image version and modified as follows:

```java
public class YUVFilter {
    /**
     * Drawing flow:
     * 1. Vertex shader - OpenGL ES code that renders the vertices of a shape.
     * 2. Fragment shader - OpenGL ES code that colors the shape with a specific color or texture.
     * 3. Program - an OpenGL ES object containing the shaders used to draw one or more shapes.
     *
     * You need at least one vertex shader to draw a shape and one fragment shader to color it.
     * The shaders must be compiled and attached to an OpenGL ES program, which is then used to draw.
     */
    // Vertex shader code
    private final String vertexShaderCode =
            "uniform mat4 uMVPMatrix;\n" +
            // Vertex position
            "attribute vec4 vPosition;\n" +
            // Texture coordinate
            "attribute vec2 vTexCoordinate;\n" +
            "varying vec2 aTexCoordinate;\n" +
            "void main() {\n" +
            "  gl_Position = uMVPMatrix * vPosition;\n" +
            "  aTexCoordinate = vTexCoordinate;\n" +
            "}\n";

    // Fragment shader code
    private final String fragmentShaderCode =
            "precision mediump float;\n" +
            "uniform sampler2D samplerY;\n" +
            "uniform sampler2D samplerU;\n" +
            "uniform sampler2D samplerV;\n" +
            "uniform sampler2D samplerUV;\n" +
            "uniform int yuvType;\n" +
            "varying vec2 aTexCoordinate;\n" +
            "void main() {\n" +
            "  vec3 yuv;\n" +
            "  if (yuvType == 0) {" +
            "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
            "    yuv.y = texture2D(samplerU, aTexCoordinate).r - 0.5;\n" +
            "    yuv.z = texture2D(samplerV, aTexCoordinate).r - 0.5;\n" +
            "  } else if (yuvType == 1) {" +
            "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
            "    yuv.y = texture2D(samplerUV, aTexCoordinate).r - 0.5;\n" +
            "    yuv.z = texture2D(samplerUV, aTexCoordinate).a - 0.5;\n" +
            "  } else {" +
            "    yuv.x = texture2D(samplerY, aTexCoordinate).r;\n" +
            "    yuv.y = texture2D(samplerUV, aTexCoordinate).a - 0.5;\n" +
            "    yuv.z = texture2D(samplerUV, aTexCoordinate).r - 0.5;\n" +
            "  }" +
            "  vec3 rgb = mat3(1.0, 1.0, 1.0,\n" +
            "                  0.0, -0.344, 1.772,\n" +
            "                  1.402, -0.714, 0.0) * yuv;\n" +
            "  gl_FragColor = vec4(rgb, 1);\n" +
            "}\n";

    private int mProgram;

    // Vertex coordinate buffer
    private FloatBuffer vertexBuffer;
    // Texture coordinate buffer
    private FloatBuffer textureBuffer;
    // Number of coordinates per vertex in the array below
    static final int COORDS_PER_VERTEX = 2;

    /**
     * Vertex coordinates.
     * The origin (0,0) is at the center of the canvas,
     * x is positive to the right and y is positive upward.
     * The four corners of the canvas are:
     * (-1, 1), (1, 1)
     * (-1,-1), (1,-1)
     */
    private float vertexCoords[] = {
            -1.0f, 1.0f,  // top left
            -1.0f, -1.0f, // bottom left
            1.0f, 1.0f,   // top right
            1.0f, -1.0f,  // bottom right
    };

    /**
     * Texture coordinates.
     * Note the texture coordinate system: the origin (0,0) is at the bottom-left
     * of the canvas, x is positive to the right and y is positive upward.
     * The four corners of the canvas are:
     * (0,1), (1,1)
     * (0,0), (1,0)
     */
    private float textureCoords[] = {
            0.0f, 1.0f, // top left
            0.0f, 0.0f, // bottom left
            1.0f, 1.0f, // top right
            1.0f, 0.0f, // bottom right
    };

    private int positionHandle;
    // Texture coordinate handle
    private int texCoordinateHandle;
    // Used to access and set the view transformation
    private int vPMatrixHandle;

    private IntBuffer mPlanarTextureHandles = IntBuffer.wrap(new int[3]);
    private int[] mSampleHandle = new int[3];
    private int mYUVTypeHandle;

    private final int vertexCount = vertexCoords.length / COORDS_PER_VERTEX;
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

    private int mTextureWidth;
    private int mTextureHeight;

    public YUVFilter() {
        // Initialize the vertex byte buffer for shape coordinates
        vertexBuffer = ByteBuffer.allocateDirect(vertexCoords.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexCoords);
        vertexBuffer.position(0);
        // Initialize the texture coordinate byte buffer
        textureBuffer = ByteBuffer.allocateDirect(textureCoords.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureCoords);
        textureBuffer.position(0);
    }

    public void setTextureSize(int width, int height) {
        mTextureWidth = width;
        mTextureHeight = height;
    }

    public void surfaceCreated() {
        // Load the vertex shader
        int vertexShader = GLESUtils.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        // Load the fragment shader
        int fragmentShader = GLESUtils.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
        // Create an empty OpenGL ES program
        mProgram = GLES20.glCreateProgram();
        // Attach the vertex shader to the program
        GLES20.glAttachShader(mProgram, vertexShader);
        // Attach the fragment shader to the program
        GLES20.glAttachShader(mProgram, fragmentShader);
        // Link the program
        GLES20.glLinkProgram(mProgram);
        // Get the handle of the vertex shader's vPosition member
        positionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        // Get the handle of the texture coordinate attribute
        texCoordinateHandle = GLES20.glGetAttribLocation(mProgram, "vTexCoordinate");
        // Get the handle of the draw matrix
        vPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
        // Get the yuvType handle
        mYUVTypeHandle = GLES20.glGetUniformLocation(mProgram, "yuvType");
        // Generate the YUV texture handles
        GLES20.glGenTextures(3, mPlanarTextureHandles);
    }

    public void surfaceChanged(int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    public void onDraw(float[] matrix, YUVFormat yuvFormat) {
        // Add the program to the OpenGL ES environment
        GLES20.glUseProgram(mProgram);
        // Redraw the background as black
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        // Enable the vertex position attribute
        GLES20.glEnableVertexAttribArray(positionHandle);
        // Write vertex data
        GLES20.glVertexAttribPointer(positionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
        // Enable the texture coordinate attribute
        GLES20.glEnableVertexAttribArray(texCoordinateHandle);
        // Write texture coordinate data
        GLES20.glVertexAttribPointer(texCoordinateHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureBuffer);
        // Pass the projection and view transformation to the shader
        GLES20.glUniformMatrix4fv(vPMatrixHandle, 1, false, matrix, 0);

        // Set yuvType: 0 is I420, 1 is NV12, 2 is NV21
        int yuvType = 0;
        if (yuvFormat == YUVFormat.I420) {
            yuvType = 0;
        } else if (yuvFormat == YUVFormat.NV12) {
            yuvType = 1;
        } else if (yuvFormat == YUVFormat.NV21) {
            yuvType = 2;
        }
        GLES20.glUniform1i(mYUVTypeHandle, yuvType);

        int planarCount = 0;
        if (yuvFormat == YUVFormat.I420) {
            planarCount = 3;
            mSampleHandle[0] = GLES20.glGetUniformLocation(mProgram, "samplerY");
            mSampleHandle[1] = GLES20.glGetUniformLocation(mProgram, "samplerU");
            mSampleHandle[2] = GLES20.glGetUniformLocation(mProgram, "samplerV");
        } else {
            // NV12 and NV21 have two planes
            planarCount = 2;
            mSampleHandle[0] = GLES20.glGetUniformLocation(mProgram, "samplerY");
            mSampleHandle[1] = GLES20.glGetUniformLocation(mProgram, "samplerUV");
        }
        for (int i = 0; i < planarCount; i++) {
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mPlanarTextureHandles.get(i));
            GLES20.glUniform1i(mSampleHandle[i], i);
        }

        // Draw
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        // Disable the vertex arrays
        GLES20.glDisableVertexAttribArray(positionHandle);
        GLES20.glDisableVertexAttribArray(texCoordinateHandle);
    }

    public void release() {
        GLES20.glDeleteProgram(mProgram);
        mProgram = -1;
    }

    /**
     * Bind image data to the texture targets; for formats whose U and V planes are separate (I420).
     *
     * @param yPlane the Y plane of the YUV data
     * @param uPlane the U plane of the YUV data
     * @param vPlane the V plane of the YUV data
     * @param width  image width
     * @param height image height
     */
    public void feedTextureWithImageData(ByteBuffer yPlane, ByteBuffer uPlane, ByteBuffer vPlane, int width, int height) {
        // Upload each plane separately, based on the YUV layout
        textureYUV(yPlane, width, height, 0);
        textureYUV(uPlane, width / 2, height / 2, 1);
        textureYUV(vPlane, width / 2, height / 2, 2);
    }

    /**
     * Bind image data to the texture targets; for formats with an interleaved UV plane (NV12, NV21).
     *
     * @param yPlane  the Y plane of the YUV data
     * @param uvPlane the UV plane of the YUV data
     * @param width   image width
     * @param height  image height
     */
    public void feedTextureWithImageData(ByteBuffer yPlane, ByteBuffer uvPlane, int width, int height) {
        // Upload the Y plane and the interleaved UV plane
        textureYUV(yPlane, width, height, 0);
        textureNV12(uvPlane, width / 2, height / 2, 1);
    }

    /**
     * Upload a single-channel plane (the Y, U, or V plane of I420).
     *
     * @param imageData one plane of the YUV data
     * @param width     plane width
     * @param height    plane height
     */
    private void textureYUV(ByteBuffer imageData, int width, int height, int index) {
        // Bind the texture object to the texture target
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mPlanarTextureHandles.get(index));
        // Use linear filtering for minification and magnification
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        // Clamp to edge on both the S and T axes
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // Upload the image data. GL_LUMINANCE indicates a luminance-only pixel format.
        // Although the third and seventh arguments are both GL_LUMINANCE, their meanings differ:
        // the former is the internal format of the texture, the latter the format of the image data.
        // In the resulting texture, the r, g, and b of every texel hold the same value:
        // the sample of the uploaded YUV plane.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0,
                GLES20.GL_LUMINANCE, width, height, 0,
                GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, imageData);
    }

    /**
     * Upload an interleaved UV plane (NV12, NV21).
     *
     * @param imageData the UV plane of the YUV data
     * @param width     plane width
     * @param height    plane height
     */
    private void textureNV12(ByteBuffer imageData, int width, int height, int index) {
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mPlanarTextureHandles.get(index));
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0,
                GLES20.GL_LUMINANCE_ALPHA, width, height, 0,
                GLES20.GL_LUMINANCE_ALPHA, GLES20.GL_UNSIGNED_BYTE, imageData);
    }
}
```
The main changes are generating the appropriate textures for the YUV type and passing them into OpenGL.
DisplayYUVGLSurfaceView
Create a new GLSurfaceView that uses YUVFilter. The complete code is as follows:

```java
public class DisplayYUVGLSurfaceView extends GLSurfaceView {
    private static final String TAG = DisplayYUVGLSurfaceView.class.getSimpleName();

    private Context mContext;
    private MyRenderer mMyRenderer;

    public DisplayYUVGLSurfaceView(Context context) {
        super(context);
        init(context);
    }

    public DisplayYUVGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        init(context);
    }

    private void init(Context context) {
        mContext = context;
        mMyRenderer = new MyRenderer();
        setEGLContextClientVersion(2);
        setRenderer(mMyRenderer);
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    public void feedYUVData(byte[] yuvData, int width, int height, YUVFormat yuvFormat, int rotate) {
        if (yuvData == null) {
            return;
        }
        mMyRenderer.feedData(yuvData, width, height, yuvFormat, rotate);
        requestRender();
    }

    public void setCameraId(int id) {
        mMyRenderer.setCameraId(id);
    }

    static class MyRenderer implements Renderer {
        private YUVFilter mYUVFilter;
        private YUVFormat mYUVFormat;
        private int mWidth;
        private int mHeight;

        // vPMatrix is an abbreviation for "Model View Projection Matrix"
        private float[] mMVPMatrix = new float[16];

        // Y plane data
        private ByteBuffer y = ByteBuffer.allocate(0);
        // U plane data
        private ByteBuffer u = ByteBuffer.allocate(0);
        // V plane data
        private ByteBuffer v = ByteBuffer.allocate(0);
        // UV plane data
        private ByteBuffer uv = ByteBuffer.allocate(0);

        // Whether the GLSurfaceView is ready
        private boolean hasVisibility = false;
        private boolean isMirror = false;
        private int mRotate;
        private int mCameraId;

        public MyRenderer() {
            mYUVFilter = new YUVFilter();
        }

        public void setCameraId(int cameraId) {
            mCameraId = cameraId;
        }

        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            mYUVFilter.surfaceCreated();
        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            mYUVFilter.surfaceChanged(width, height);
            hasVisibility = true;
        }

        @Override
        public void onDrawFrame(GL10 gl) {
            synchronized (this) {
                if (y.capacity() > 0) {
                    y.position(0);
                    if (mYUVFormat == YUVFormat.I420) {
                        u.position(0);
                        v.position(0);
                        mYUVFilter.feedTextureWithImageData(y, u, v, mWidth, mHeight);
                    } else {
                        uv.position(0);
                        mYUVFilter.feedTextureWithImageData(y, uv, mWidth, mHeight);
                    }
                    MatrixUtils.getMatrix(mMVPMatrix, MatrixUtils.TYPE_FITXY, mWidth, mHeight, mWidth, mHeight);
                    MatrixUtils.flip(mMVPMatrix, false, true);
                    if (mCameraId == 1) {
                        MatrixUtils.flip(mMVPMatrix, true, false);
                    }
                    MatrixUtils.rotate(mMVPMatrix, mRotate);
                    try {
                        long start = System.currentTimeMillis();
                        mYUVFilter.onDraw(mMVPMatrix, mYUVFormat);
                        Log.i(TAG, "drawTexture " + mWidth + "x" + mHeight + " took " + (System.currentTimeMillis() - start) + "ms");
                    } catch (Exception e) {
                        Log.w(TAG, e.getMessage());
                    }
                }
            }
        }

        /**
         * Set the width and height of the YUV data to render.
         *
         * @param width  width
         * @param height height
         */
        public void setYuvDataSize(int width, int height) {
            if (width > 0 && height > 0) {
                // (Re)allocate the buffers when the size changes
                if (width != mWidth || height != mHeight) {
                    this.mWidth = width;
                    this.mHeight = height;
                    int yarraySize = width * height;
                    int uvarraySize = yarraySize / 4;
                    synchronized (this) {
                        y = ByteBuffer.allocate(yarraySize);
                        u = ByteBuffer.allocate(uvarraySize);
                        v = ByteBuffer.allocate(uvarraySize);
                        uv = ByteBuffer.allocate(uvarraySize * 2);
                    }
                }
            }
        }

        public void feedData(byte[] yuvData, int width, int height, YUVFormat yuvFormat, int rotate) {
            setYuvDataSize(width, height);
            synchronized (this) {
                mWidth = width;
                mHeight = height;
                mYUVFormat = yuvFormat;
                mRotate = rotate;
                if (hasVisibility) {
                    if (yuvFormat == YUVFormat.I420) {
                        y.clear();
                        u.clear();
                        v.clear();
                        y.put(yuvData, 0, width * height);
                        u.put(yuvData, width * height, width * height / 4);
                        v.put(yuvData, width * height * 5 / 4, width * height / 4);
                    } else {
                        y.clear();
                        uv.clear();
                        y.put(yuvData, 0, width * height);
                        uv.put(yuvData, width * height, width * height / 2);
                    }
                }
            }
        }
    }
}
```
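The plane-splitting that feedData() performs on an I420 frame can be sketched in isolation (the class name below is hypothetical, for illustration only): the Y plane is the first width*height bytes, followed by the quarter-size U and V planes.

```java
// Splits a packed I420 byte array into its Y, U, and V planes,
// mirroring the offsets used by feedData() above.
public class PlaneSplitter {
    public static byte[][] splitI420(byte[] yuvData, int width, int height) {
        int ySize = width * height;
        int uvSize = ySize / 4;
        byte[] y = new byte[ySize];
        byte[] u = new byte[uvSize];
        byte[] v = new byte[uvSize];
        System.arraycopy(yuvData, 0, y, 0, ySize);            // Y plane
        System.arraycopy(yuvData, ySize, u, 0, uvSize);       // U plane
        System.arraycopy(yuvData, ySize + uvSize, v, 0, uvSize); // V plane
        return new byte[][]{y, u, v};
    }
}
```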
Displaying the Camera's YUV Data

We use the Camera2Manager class from the Camera series to obtain the YUV data:

```java
public class DisplayYUVActivity extends AppCompatActivity implements CameraCallback {
    private static final String TAG = DisplayYUVActivity.class.getSimpleName();

    private DisplayYUVGLSurfaceView mDisplayYUVGLSurfaceView;
    private ICameraManager mCameraManager;
    private int mCameraId = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_display_yuvactivity);
        mDisplayYUVGLSurfaceView = findViewById(R.id.displayYUVGLView);

        mCameraManager = new Camera2Manager(this);
        mCameraManager.setCameraId(mCameraId);
        mCameraManager.setCameraCallback(this);
        mCameraManager.addPreviewBufferCallback(mPreviewBufferCallback);
        mDisplayYUVGLSurfaceView.setCameraId(mCameraId);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mCameraManager.openCamera();
    }

    @Override
    protected void onPause() {
        super.onPause();
        mCameraManager.releaseCamera();
    }

    @Override
    public void onOpen() {
        mCameraManager.startPreview((SurfaceTexture) null);
    }

    @Override
    public void onOpenError(int error, String msg) {
    }

    @Override
    public void onPreview(int previewWidth, int previewHeight) {
    }

    @Override
    public void onPreviewError(int error, String msg) {
    }

    @Override
    public void onClose() {
    }

    private PreviewBufferCallback mPreviewBufferCallback = new PreviewBufferCallback() {
        @Override
        public void onPreviewBufferFrame(byte[] data, int width, int height, YUVFormat format) {
            mDisplayYUVGLSurfaceView.feedYUVData(data, width, height, format, mCameraManager.getOrientation());
        }
    };
}
```
Conclusion

In this chapter we learned how to display raw YUV data with OpenGL. This approach converts YUV to RGB on the GPU before presenting it on screen, which performs much better than converting on the CPU.
OpenGL ES series: https://github.com/xiaozhi003/AndroidOpenGLDemo.git. If it helped you, a star would be greatly appreciated ^_^