The effect looks like this:
Set a ViewOutlineProvider for the view:
public RoundTextureView(Context context, AttributeSet attrs) {
    super(context, attrs);
    setOutlineProvider(new ViewOutlineProvider() {
        @Override
        public void getOutline(View view, Outline outline) {
            Rect rect = new Rect(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());
            outline.setRoundRect(rect, radius);
        }
    });
    setClipToOutline(true);
}
Modify the corner radius value and refresh when needed:
public void setRadius(int radius) {
    this.radius = radius;
}

public void turnRound() {
    invalidateOutline();
}
The view's rounded corners then update according to the radius we set. When the view is square and the radius is half the side length, what is displayed is a circle.
First, a simple but rather limited approach: set both the camera preview size and the preview view to a 1:1 aspect ratio.
Android devices generally support multiple preview sizes. Taking the Samsung Tab S3 as an example:
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1920x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1280x720
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1440x1080
2019-08-02 13:16:08.669 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1088x1088
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 1056x864
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 960x720
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 720x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 640x480
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 352x288
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 320x240
2019-08-02 13:16:08.670 16407-16407/com.wsy.glcamerademo I/CameraHelper: supportedPreviewSize: 176x144
Among these, the 1:1 preview size is 1088x1088.
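The selection can be sketched in plain Java (the class and method names below are illustrative, not the article's CameraHelper code): keep only the sizes with equal width and height and take the largest.

```java
import java.util.ArrayList;
import java.util.List;

public class SquareSizeFilter {
    /** Width/height pair, standing in for Camera.Size (illustrative). */
    public static class Size {
        public final int width;
        public final int height;
        public Size(int width, int height) { this.width = width; this.height = height; }
    }

    /** Returns the largest supported size with an exact 1:1 aspect ratio, or null if none exists. */
    public static Size largestSquare(List<Size> supported) {
        Size best = null;
        for (Size s : supported) {
            if (s.width == s.height && (best == null || s.width > best.width)) {
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Size> sizes = new ArrayList<>();
        sizes.add(new Size(1920, 1080));
        sizes.add(new Size(1088, 1088));
        sizes.add(new Size(640, 480));
        Size best = largestSquare(sizes);
        System.out.println(best.width + "x" + best.height); // 1088x1088
    }
}
```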
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x3096
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 4128x2322
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x2448
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3264x1836
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 3024x3024
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2976x2976
2019-08-02 13:19:24.980 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2880x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2592x1944
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1920
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1440
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2560x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2160x2160
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1536
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 2048x1152
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1936x1936
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1920x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1440x1080
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x960
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 1280x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 960x720
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 720x480
2019-08-02 13:19:24.981 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 640x480
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 320x240
2019-08-02 13:19:24.982 16768-16768/com.wsy.glcamerademo I/Camera2Helper: getBestSupportedSize: 176x144
Among these, the 1:1 preview sizes are 3024x3024, 2976x2976, 2160x2160 and 1936x1936.
As long as we choose a 1:1 preview size and make the preview view square, we get a square preview; setting the view's corner radius to half its side length then yields a circular preview.
Drawbacks of choosing a 1:1 preview size

Handling devices that do not support a 1:1 preview size

Sample code:
// Keep the preview view's aspect ratio consistent with the preview size, to avoid stretching
{
    FrameLayout.LayoutParams textureViewLayoutParams = (FrameLayout.LayoutParams) textureView.getLayoutParams();
    int newHeight = 0;
    int newWidth = textureViewLayoutParams.width;
    // landscape
    if (displayOrientation % 180 == 0) {
        newHeight = textureViewLayoutParams.width * previewSize.height / previewSize.width;
    }
    // portrait
    else {
        newHeight = textureViewLayoutParams.width * previewSize.width / previewSize.height;
    }
    // When the preview is not square, insert a ViewGroup to clip the view's visible region
    if (newHeight != textureViewLayoutParams.height) {
        insertFrameLayout = new RoundFrameLayout(CoverByParentCameraActivity.this);
        int sideLength = Math.min(newWidth, newHeight);
        FrameLayout.LayoutParams layoutParams = new FrameLayout.LayoutParams(sideLength, sideLength);
        insertFrameLayout.setLayoutParams(layoutParams);
        FrameLayout parentView = (FrameLayout) textureView.getParent();
        parentView.removeView(textureView);
        parentView.addView(insertFrameLayout);
        insertFrameLayout.addView(textureView);
        FrameLayout.LayoutParams newTextureViewLayoutParams = new FrameLayout.LayoutParams(newWidth, newHeight);
        // landscape
        if (displayOrientation % 180 == 0) {
            newTextureViewLayoutParams.leftMargin = ((newHeight - newWidth) / 2);
        }
        // portrait
        else {
            newTextureViewLayoutParams.topMargin = -(newHeight - newWidth) / 2;
        }
        textureView.setLayoutParams(newTextureViewLayoutParams);
    }
}
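The margin arithmetic in that snippet can be checked in isolation. The plain-Java sketch below (class and method names are mine, not from the original code) reproduces the portrait branch: stretch the height to match the preview's aspect ratio, then offset the view to center it inside the square container.

```java
public class PreviewLayoutMath {
    /** Stretched view height for the portrait branch (displayOrientation % 180 != 0). */
    public static int portraitHeight(int viewWidth, int previewWidth, int previewHeight) {
        return viewWidth * previewWidth / previewHeight;
    }

    /** Top margin that centers the longer edge inside the square container. */
    public static int centerOffset(int newWidth, int newHeight) {
        return -(newHeight - newWidth) / 2;
    }

    public static void main(String[] args) {
        // e.g. a 1080-px-wide view showing a 1440x1080 preview in portrait
        int h = portraitHeight(1080, 1440, 1080);
        System.out.println(h); // 1440
        System.out.println(centerOffset(1080, h)); // -180
    }
}
```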
The approach above already gives us square and circular previews, but it only works with the native camera. What if our data source is not the native camera? Next we introduce a scheme that displays NV21 data with a GLSurfaceView, where we draw the preview data entirely by ourselves.
The key part is writing the renderer (Renderer). The Renderer interface is documented as follows:
/**
 * A generic renderer interface.
 * <p>
 * The renderer is responsible for making OpenGL calls to render a frame.
 * <p>
 * GLSurfaceView clients typically create their own classes that implement
 * this interface, and then call {@link GLSurfaceView#setRenderer} to
 * register the renderer with the GLSurfaceView.
 * <p>
 * <div class="special reference">
 * <h3>Developer Guides</h3>
 * <p>For more information about how to use OpenGL, read the
 * <a href="{@docRoot}guide/topics/graphics/opengl.html">OpenGL</a> developer guide.</p>
 * </div>
 *
 * <h3>Threading</h3>
 * The renderer will be called on a separate thread, so that rendering
 * performance is decoupled from the UI thread. Clients typically need to
 * communicate with the renderer from the UI thread, because that's where
 * input events are received. Clients can communicate using any of the
 * standard Java techniques for cross-thread communication, or they can
 * use the {@link GLSurfaceView#queueEvent(Runnable)} convenience method.
 * <p>
 * <h3>EGL Context Lost</h3>
 * There are situations where the EGL rendering context will be lost. This
 * typically happens when device wakes up after going to sleep. When
 * the EGL context is lost, all OpenGL resources (such as textures) that are
 * associated with that context will be automatically deleted. In order to
 * keep rendering correctly, a renderer must recreate any lost resources
 * that it still needs. The {@link #onSurfaceCreated(GL10, EGLConfig)} method
 * is a convenient place to do this.
 *
 * @see #setRenderer(Renderer)
 */
public interface Renderer {
    /**
     * Called when the surface is created or recreated.
     * <p>
     * Called when the rendering thread
     * starts and whenever the EGL context is lost. The EGL context will typically
     * be lost when the Android device awakes after going to sleep.
     * <p>
     * Since this method is called at the beginning of rendering, as well as
     * every time the EGL context is lost, this method is a convenient place to put
     * code to create resources that need to be created when the rendering
     * starts, and that need to be recreated when the EGL context is lost.
     * Textures are an example of a resource that you might want to create
     * here.
     * <p>
     * Note that when the EGL context is lost, all OpenGL resources associated
     * with that context will be automatically deleted. You do not need to call
     * the corresponding "glDelete" methods such as glDeleteTextures to
     * manually delete these lost resources.
     * <p>
     * @param gl the GL interface. Use <code>instanceof</code> to
     *           test if the interface supports GL11 or higher interfaces.
     * @param config the EGLConfig of the created surface. Can be used
     *           to create matching pbuffers.
     */
    void onSurfaceCreated(GL10 gl, EGLConfig config);

    /**
     * Called when the surface changed size.
     * <p>
     * Called after the surface is created and whenever
     * the OpenGL ES surface size changes.
     * <p>
     * Typically you will set your viewport here. If your camera
     * is fixed then you could also set your projection matrix here:
     * <pre class="prettyprint">
     * void onSurfaceChanged(GL10 gl, int width, int height) {
     *     gl.glViewport(0, 0, width, height);
     *     // for a fixed camera, set the projection too
     *     float ratio = (float) width / height;
     *     gl.glMatrixMode(GL10.GL_PROJECTION);
     *     gl.glLoadIdentity();
     *     gl.glFrustumf(-ratio, ratio, -1, 1, 1, 10);
     * }
     * </pre>
     * @param gl the GL interface. Use <code>instanceof</code> to
     *           test if the interface supports GL11 or higher interfaces.
     * @param width
     * @param height
     */
    void onSurfaceChanged(GL10 gl, int width, int height);

    /**
     * Called to draw the current frame.
     * <p>
     * This method is responsible for drawing the current frame.
     * <p>
     * The implementation of this method typically looks like this:
     * <pre class="prettyprint">
     * void onDrawFrame(GL10 gl) {
     *     gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
     *     //... other gl calls to render the scene ...
     * }
     * </pre>
     * @param gl the GL interface. Use <code>instanceof</code> to
     *           test if the interface supports GL11 or higher interfaces.
     */
    void onDrawFrame(GL10 gl);
}
void onSurfaceCreated(GL10 gl, EGLConfig config)
void onSurfaceChanged(GL10 gl, int width, int height)
void onDrawFrame(GL10 gl)

When renderMode is RENDERMODE_CONTINUOUSLY, onDrawFrame is executed continuously; when renderMode is RENDERMODE_WHEN_DIRTY, it is executed only once creation completes and after each call to requestRender. We generally choose RENDERMODE_WHEN_DIRTY to avoid overdrawing. In most cases we implement our own Renderer and set it on the GLSurfaceView; writing the Renderer is the core step of the whole process. Below are flowcharts of the initialization performed in void onSurfaceCreated(GL10 gl, EGLConfig config) and the drawing performed in void onDrawFrame(GL10 gl):
(Figure omitted: initialization and drawing flowcharts)

(Figure omitted: Android View coordinate system vs. OpenGL world coordinate system)
As the figure shows, unlike the Android View coordinate system, the OpenGL coordinate system is Cartesian.
The Android View coordinate system has its origin at the top-left corner, with x increasing to the right and y increasing downward;
the OpenGL coordinate system has its origin at the center, with x increasing to the right and y increasing upward.
/** Vertex shader */
private static String VERTEX_SHADER =
        "    attribute vec4 attr_position;\n" +
        "    attribute vec2 attr_tc;\n" +
        "    varying vec2 tc;\n" +
        "    void main() {\n" +
        "        gl_Position = attr_position;\n" +
        "        tc = attr_tc;\n" +
        "    }";

/** Fragment shader */
private static String FRAG_SHADER =
        "    varying vec2 tc;\n" +
        "    uniform sampler2D ySampler;\n" +
        "    uniform sampler2D uSampler;\n" +
        "    uniform sampler2D vSampler;\n" +
        "    const mat3 convertMat = mat3(1.0, 1.0, 1.0, 0.0, -0.344, 1.772, 1.402, -0.714, 0.0);\n" +
        "    void main()\n" +
        "    {\n" +
        "        vec3 yuv;\n" +
        "        yuv.x = texture2D(ySampler, tc).r;\n" +
        "        yuv.y = texture2D(uSampler, tc).r - 0.5;\n" +
        "        yuv.z = texture2D(vSampler, tc).r - 0.5;\n" +
        "        gl_FragColor = vec4(convertMat * yuv, 1.0);\n" +
        "    }";
Built-in variables

gl_Position
The gl_Position in the VERTEX_SHADER code represents the spatial coordinates to draw at. Since we are drawing in 2D, we directly pass in the OpenGL 2D coordinates for the bottom-left (-1,-1), bottom-right (1,-1), top-left (-1,1) and top-right (1,1) corners, i.e. {-1,-1, 1,-1, -1,1, 1,1}.

gl_FragColor
The gl_FragColor in the FRAG_SHADER code represents the color of a single fragment.

Other variables

ySampler, uSampler, vSampler
The samplers for the Y, U and V textures respectively.

convertMat
The YUV-to-RGB conversion matrix. From the conversion formulas
R = Y + 1.402 (V - 128)
G = Y - 0.34414 (U - 128) - 0.71414 (V - 128)
B = Y + 1.772 (U - 128)
we can derive a YUV-to-RGB matrix:

1.0,    1.0,    1.0,
0,     -0.344,  1.772,
1.402, -0.714,  0
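To see the conversion at work numerically, a small plain-Java check (the class name is mine) applies the formulas above to an 8-bit sample. Neutral chroma (U = V = 128) must yield a gray pixel with R = G = B = Y:

```java
public class YuvToRgb {
    /** Applies the YUV-to-RGB formulas above (8-bit samples). */
    public static double[] toRgb(int y, int u, int v) {
        double r = y + 1.402 * (v - 128);
        double g = y - 0.344 * (u - 128) - 0.714 * (v - 128);
        double b = y + 1.772 * (u - 128);
        return new double[]{r, g, b};
    }

    public static void main(String[] args) {
        // Neutral chroma: all three channels equal the luma value
        double[] gray = toRgb(200, 128, 128);
        System.out.println(gray[0] + " " + gray[1] + " " + gray[2]); // 200.0 200.0 200.0
    }
}
```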
Explanation of some types and functions

vec3, vec4
Vectors with 3 and 4 float components respectively.

vec4 texture2D(sampler2D sampler, vec2 coord)
Samples the texture at the given coordinates. texture2D(ySampler, tc).r fetches the Y data, texture2D(uSampler, tc).r fetches the U data, and texture2D(vSampler, tc).r fetches the V data.

Initialization in Java code
Create the ByteBuffers holding the Y, U and V texture data according to the image width and height, and select the corresponding transform matrix according to whether the display is mirrored and the rotation angle:
public void init(boolean isMirror, int rotateDegree, int frameWidth, int frameHeight) {
    if (this.frameWidth == frameWidth
            && this.frameHeight == frameHeight
            && this.rotateDegree == rotateDegree
            && this.isMirror == isMirror) {
        return;
    }
    dataInput = false;
    this.frameWidth = frameWidth;
    this.frameHeight = frameHeight;
    this.rotateDegree = rotateDegree;
    this.isMirror = isMirror;
    yArray = new byte[this.frameWidth * this.frameHeight];
    uArray = new byte[this.frameWidth * this.frameHeight / 4];
    vArray = new byte[this.frameWidth * this.frameHeight / 4];
    int yFrameSize = this.frameHeight * this.frameWidth;
    int uvFrameSize = yFrameSize >> 2;
    yBuf = ByteBuffer.allocateDirect(yFrameSize);
    yBuf.order(ByteOrder.nativeOrder()).position(0);
    uBuf = ByteBuffer.allocateDirect(uvFrameSize);
    uBuf.order(ByteOrder.nativeOrder()).position(0);
    vBuf = ByteBuffer.allocateDirect(uvFrameSize);
    vBuf.order(ByteOrder.nativeOrder()).position(0);
    // vertex coordinates
    squareVertices = ByteBuffer
            .allocateDirect(GLUtil.SQUARE_VERTICES.length * FLOAT_SIZE_BYTES)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
    squareVertices.put(GLUtil.SQUARE_VERTICES).position(0);
    // texture coordinates
    if (isMirror) {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.MIRROR_COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_MIRROR_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_MIRROR_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_MIRROR_COORD_VERTICES;
                break;
            default:
                break;
        }
    } else {
        switch (rotateDegree) {
            case 0:
                coordVertice = GLUtil.COORD_VERTICES;
                break;
            case 90:
                coordVertice = GLUtil.ROTATE_90_COORD_VERTICES;
                break;
            case 180:
                coordVertice = GLUtil.ROTATE_180_COORD_VERTICES;
                break;
            case 270:
                coordVertice = GLUtil.ROTATE_270_COORD_VERTICES;
                break;
            default:
                break;
        }
    }
    coordVertices = ByteBuffer.allocateDirect(coordVertice.length * FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asFloatBuffer();
    coordVertices.put(coordVertice).position(0);
}
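As a quick sanity check on the buffer sizes used above: in YUV420 (NV21), the Y plane holds width × height bytes, and the U and V planes each hold a quarter of that. A standalone sketch (the class name is illustrative):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class YuvBufferSizes {
    /** Y plane size in bytes: one byte per pixel. */
    public static int ySize(int width, int height) { return width * height; }

    /** U (or V) plane size in bytes: one byte per 2x2 pixel block. */
    public static int uvSize(int width, int height) { return ySize(width, height) >> 2; }

    public static void main(String[] args) {
        int w = 1088, h = 1088; // the 1:1 preview size from earlier
        ByteBuffer yBuf = ByteBuffer.allocateDirect(ySize(w, h)).order(ByteOrder.nativeOrder());
        ByteBuffer uBuf = ByteBuffer.allocateDirect(uvSize(w, h)).order(ByteOrder.nativeOrder());
        System.out.println(yBuf.capacity()); // 1183744
        System.out.println(uBuf.capacity()); // 295936
    }
}
```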
Initialize the Renderer when the Surface has been created:
private void initRenderer() {
    rendererReady = false;
    createGLProgram();
    // enable texturing
    GLES20.glEnable(GLES20.GL_TEXTURE_2D);
    // create the textures
    createTexture(frameWidth, frameHeight, GLES20.GL_LUMINANCE, yTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, uTexture);
    createTexture(frameWidth / 2, frameHeight / 2, GLES20.GL_LUMINANCE, vTexture);
    rendererReady = true;
}
The createGLProgram method creates the OpenGL program and binds the variables declared in the shader code:
private void createGLProgram() {
    int programHandleMain = GLUtil.createShaderProgram();
    if (programHandleMain != -1) {
        // use the shader program
        GLES20.glUseProgram(programHandleMain);
        // get the vertex shader attribute handles
        int glPosition = GLES20.glGetAttribLocation(programHandleMain, "attr_position");
        int textureCoord = GLES20.glGetAttribLocation(programHandleMain, "attr_tc");
        // get the fragment shader uniform handles
        int ySampler = GLES20.glGetUniformLocation(programHandleMain, "ySampler");
        int uSampler = GLES20.glGetUniformLocation(programHandleMain, "uSampler");
        int vSampler = GLES20.glGetUniformLocation(programHandleMain, "vSampler");
        /**
         * Assign values to the uniforms:
         * GLES20.GL_TEXTURE0 is bound to ySampler
         * GLES20.GL_TEXTURE1 is bound to uSampler
         * GLES20.GL_TEXTURE2 is bound to vSampler
         *
         * In other words, the second argument of glUniform1i is the texture unit index
         */
        GLES20.glUniform1i(ySampler, 0);
        GLES20.glUniform1i(uSampler, 1);
        GLES20.glUniform1i(vSampler, 2);
        GLES20.glEnableVertexAttribArray(glPosition);
        GLES20.glEnableVertexAttribArray(textureCoord);
        // set the vertex shader data
        squareVertices.position(0);
        GLES20.glVertexAttribPointer(glPosition, GLUtil.COUNT_PER_SQUARE_VERTICE, GLES20.GL_FLOAT, false, 8, squareVertices);
        coordVertices.position(0);
        GLES20.glVertexAttribPointer(textureCoord, GLUtil.COUNT_PER_COORD_VERTICES, GLES20.GL_FLOAT, false, 8, coordVertices);
    }
}
The createTexture method creates a texture from the given width, height and format:
private void createTexture(int width, int height, int format, int[] textureId) {
    // create the texture
    GLES20.glGenTextures(1, textureId, 0);
    // bind the texture
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
    /**
     * {@link GLES20#GL_TEXTURE_WRAP_S} is the wrap mode in the horizontal direction
     * {@link GLES20#GL_TEXTURE_WRAP_T} is the wrap mode in the vertical direction
     *
     * {@link GLES20#GL_REPEAT}: repeat
     * {@link GLES20#GL_MIRRORED_REPEAT}: mirrored repeat
     * {@link GLES20#GL_CLAMP_TO_EDGE}: clamp to the edge
     *
     * For example, using {@link GLES20#GL_REPEAT}:
     *
     *     squareVertices      coordVertices
     *     -1.0f, -1.0f,       1.0f, 1.0f,
     *      1.0f, -1.0f,       1.0f, 0.0f,    -> same as the TextureView preview
     *     -1.0f,  1.0f,       0.0f, 1.0f,
     *      1.0f,  1.0f        0.0f, 0.0f
     *
     *     squareVertices      coordVertices
     *     -1.0f, -1.0f,       2.0f, 2.0f,
     *      1.0f, -1.0f,       2.0f, 0.0f,    -> compared to the TextureView preview, split into 4 identical tiles (bottom-left, bottom-right, top-left, top-right)
     *     -1.0f,  1.0f,       0.0f, 2.0f,
     *      1.0f,  1.0f        0.0f, 0.0f
     */
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
    /**
     * {@link GLES20#GL_TEXTURE_MIN_FILTER} applies when the rendered texture is smaller than the loaded one
     * {@link GLES20#GL_TEXTURE_MAG_FILTER} applies when the rendered texture is larger than the loaded one
     *
     * {@link GLES20#GL_NEAREST}: use the color of the single closest texel as the pixel color
     * {@link GLES20#GL_LINEAR}: take the closest texels and blend them with a weighted average to get the pixel color
     */
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, format, width, height, 0, format, GLES20.GL_UNSIGNED_BYTE, null);
}
Crop the frame data and pass it in when the data source delivers a frame:
@Override
public void onPreview(final byte[] nv21, Camera camera) {
    // crop the specified image region
    ImageUtil.cropNV21(nv21, this.squareNV21, previewSize.width, previewSize.height, cropRect);
    // refresh the GLSurfaceView
    roundCameraGLSurfaceView.refreshFrameNV21(this.squareNV21);
}
NV21 cropping code:
/**
 * Crop NV21 data
 *
 * @param originNV21 the original NV21 data
 * @param cropNV21   the cropped NV21 result; its memory must be allocated in advance
 * @param width      width of the original data
 * @param height     height of the original data
 * @param left       left edge of the region to crop from the original data
 * @param top        top edge of the region to crop from the original data
 * @param right      right edge of the region to crop from the original data
 * @param bottom     bottom edge of the region to crop from the original data
 */
public static void cropNV21(byte[] originNV21, byte[] cropNV21, int width, int height, int left, int top, int right, int bottom) {
    int halfWidth = width / 2;
    int cropImageWidth = right - left;
    int cropImageHeight = bottom - top;
    // top-left of the Y data in the original
    int originalYLineStart = top * width;
    int targetYIndex = 0;
    // top-left of the UV data in the original
    int originalUVLineStart = width * height + top * halfWidth;
    // start of the UV data in the target
    int targetUVIndex = cropImageWidth * cropImageHeight;
    for (int i = top; i < bottom; i++) {
        System.arraycopy(originNV21, originalYLineStart + left, cropNV21, targetYIndex, cropImageWidth);
        originalYLineStart += width;
        targetYIndex += cropImageWidth;
        if ((i & 1) == 0) {
            System.arraycopy(originNV21, originalUVLineStart + left, cropNV21, targetUVIndex, cropImageWidth);
            originalUVLineStart += width;
            targetUVIndex += cropImageWidth;
        }
    }
}
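To see the index arithmetic at work, the crop can be exercised on a tiny synthetic 4x4 frame. The sketch below restates the same cropNV21 logic so it runs standalone; note that left and top should be even so the crop stays aligned with the 2x2-subsampled VU pairs:

```java
import java.util.Arrays;

public class Nv21CropDemo {
    // Same logic as the cropNV21 method above, restated so this example compiles on its own.
    public static void cropNV21(byte[] originNV21, byte[] cropNV21, int width, int height,
                                int left, int top, int right, int bottom) {
        int halfWidth = width / 2;
        int cropImageWidth = right - left;
        int cropImageHeight = bottom - top;
        int originalYLineStart = top * width;
        int targetYIndex = 0;
        int originalUVLineStart = width * height + top * halfWidth;
        int targetUVIndex = cropImageWidth * cropImageHeight;
        for (int i = top; i < bottom; i++) {
            System.arraycopy(originNV21, originalYLineStart + left, cropNV21, targetYIndex, cropImageWidth);
            originalYLineStart += width;
            targetYIndex += cropImageWidth;
            if ((i & 1) == 0) {
                System.arraycopy(originNV21, originalUVLineStart + left, cropNV21, targetUVIndex, cropImageWidth);
                originalUVLineStart += width;
                targetUVIndex += cropImageWidth;
            }
        }
    }

    public static void main(String[] args) {
        // 4x4 NV21 frame: 16 Y bytes (values 0..15), then 8 VU bytes (values 116..123)
        byte[] origin = new byte[24];
        for (int i = 0; i < 16; i++) origin[i] = (byte) i;
        for (int i = 16; i < 24; i++) origin[i] = (byte) (100 + i);
        // crop the top-left 2x2 region: 4 Y bytes + 2 VU bytes
        byte[] crop = new byte[6];
        cropNV21(origin, crop, 4, 4, 0, 0, 2, 2);
        System.out.println(Arrays.toString(crop)); // [0, 1, 4, 5, 116, 117]
    }
}
```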
Pass the data to the GLSurfaceView and refresh the frame:
/**
 * Pass in NV21 data and refresh the frame
 *
 * @param data NV21 data
 */
public void refreshFrameNV21(byte[] data) {
    if (rendererReady) {
        yBuf.clear();
        uBuf.clear();
        vBuf.clear();
        putNV21(data, frameWidth, frameHeight);
        dataInput = true;
        requestRender();
    }
}
The putNV21 method extracts the Y, U and V data from the NV21 frame separately:
/**
 * Extract the Y, U and V components from NV21 data
 *
 * @param src    NV21 frame data
 * @param width  width
 * @param height height
 */
private void putNV21(byte[] src, int width, int height) {
    int ySize = width * height;
    int frameSize = ySize * 3 / 2;
    // extract the Y component
    System.arraycopy(src, 0, yArray, 0, ySize);
    int k = 0;
    // extract the U and V components
    int index = ySize;
    while (index < frameSize) {
        vArray[k] = src[index++];
        uArray[k++] = src[index++];
    }
    yBuf.put(yArray).position(0);
    uBuf.put(uArray).position(0);
    vBuf.put(vArray).position(0);
}
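The read order in that loop matters: NV21 stores the chroma plane as interleaved VU pairs, so V is read first and U second. A standalone check of the split (the class name is mine):

```java
public class Nv21PlaneSplit {
    /** Splits an NV21 buffer into separate Y, U and V arrays, as the putNV21 loop above does. */
    public static byte[][] split(byte[] src, int width, int height) {
        int ySize = width * height;
        int frameSize = ySize * 3 / 2;
        byte[] y = new byte[ySize];
        byte[] u = new byte[ySize / 4];
        byte[] v = new byte[ySize / 4];
        System.arraycopy(src, 0, y, 0, ySize);
        int k = 0;
        int index = ySize;
        while (index < frameSize) {
            v[k] = src[index++];   // V comes first in NV21
            u[k++] = src[index++]; // then U
        }
        return new byte[][]{y, u, v};
    }

    public static void main(String[] args) {
        // 2x2 frame: Y = {1,2,3,4}, then one VU pair {9, 7}
        byte[] nv21 = {1, 2, 3, 4, 9, 7};
        byte[][] planes = split(nv21, 2, 2);
        System.out.println(planes[2][0]); // 9  (V)
        System.out.println(planes[1][0]); // 7  (U)
    }
}
```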
After requestRender is executed, the onDrawFrame callback fires; there we bind the data for the three textures and draw:
@Override
public void onDrawFrame(GL10 gl) {
    // activate, bind and upload data for each texture in turn
    if (dataInput) {
        // y
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth,
                frameHeight,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                yBuf);
        // u
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth >> 1,
                frameHeight >> 1,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                uBuf);
        // v
        GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTexture[0]);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D,
                0,
                0,
                0,
                frameWidth >> 1,
                frameHeight >> 1,
                GLES20.GL_LUMINANCE,
                GLES20.GL_UNSIGNED_BYTE,
                vBuf);
        // draw once the data is bound
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
This completes the drawing.
Sometimes the requirement goes beyond a plain circular preview: we may also need to draw a border around the camera preview.
The idea is the same: dynamically modify the border values and redraw.
The relevant code in the custom border View is as follows:
@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    if (paint == null) {
        paint = new Paint();
        paint.setStyle(Paint.Style.STROKE);
        paint.setAntiAlias(true);
        SweepGradient sweepGradient = new SweepGradient(((float) getWidth() / 2), ((float) getHeight() / 2),
                new int[]{Color.GREEN, Color.CYAN, Color.BLUE, Color.CYAN, Color.GREEN}, null);
        paint.setShader(sweepGradient);
    }
    drawBorder(canvas, 6);
}

private void drawBorder(Canvas canvas, int rectThickness) {
    if (canvas == null) {
        return;
    }
    paint.setStrokeWidth(rectThickness);
    Path drawPath = new Path();
    drawPath.addRoundRect(new RectF(0, 0, getWidth(), getHeight()), radius, radius, Path.Direction.CW);
    canvas.drawPath(drawPath, paint);
}

public void turnRound() {
    invalidate();
}

public void setRadius(int radius) {
    this.radius = radius;
}