Android Development: A Face Detection API Tutorial (Source Code Included)


Through two main APIs, Android provides a way to run face detection directly on a bitmap. The two APIs are android.media.FaceDetector and android.media.FaceDetector.Face, and both are part of the official Android API. This tutorial, adapted from the Android Developer site, introduces these APIs, and the sample code used in it is available for download.
 
Face detection means locating the position and size of every face in an image or a video frame. It is an important step in face recognition systems, and it can also be used on its own, for example in video surveillance. With digital media now everywhere, face detection can also help us quickly pick out the pictures that contain faces from huge collections of images. In current digital cameras, face detection is what drives "face-priority" autofocus, arguably the most important advance in photographic technology in the twenty years since automatic exposure and autofocus were invented. In consumer cameras the vast majority of photos have people as their subject, so exposure and focus need to be set with the person as the reference.

Building an Android Activity for face detection

You can build this on top of an ordinary Android Activity. We extend the base class ImageView into MyImageView, and the bitmap containing the faces to be detected must be in the 565 format, otherwise the API will not work. Each detected face carries a confidence measure; the threshold for it is defined by android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD.
The most important work is implemented in setFace(): it instantiates a FaceDetector object, calls findFaces(), stores the results in faces, and hands each face's midpoint over to MyImageView. The code is as follows:

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.os.Bundle;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

public class TutorialOnFaceDetect1 extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 1;
    private static String TAG = "TutorialOnFaceDetect";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo and convert it to the RGB_565 format required by FaceDetector
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);

        // perform face detection and set the feature points
        setFace();

        mIV.invalidate();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF midpoint = new PointF();
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detected any faces
        if (count > 0) {
            fpx = new int[count];
            fpy = new int[count];

            for (int i = 0; i < count; i++) {
                try {
                    // the midpoint of a face is roughly the point between the eyes
                    faces[i].getMidPoint(midpoint);

                    fpx[i] = (int) midpoint.x;
                    fpy[i] = (int) midpoint.y;
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count, 0);
    }
}
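The code above keeps every face that findFaces() reports. The confidence measure mentioned earlier (android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD) is not applied automatically by the API; if you want to honour it, you have to compare each face's confidence() against the threshold yourself. A minimal sketch, assuming you are happy to compact the faces array in place (the helper class and method name are illustrative assumptions, not part of the original tutorial):

import android.media.FaceDetector;

public final class FaceFilter {
    private FaceFilter() {}

    // Keep only faces whose confidence meets CONFIDENCE_THRESHOLD; confident
    // faces are compacted to indices 0..result-1 and the new count is returned.
    public static int keepConfidentFaces(FaceDetector.Face[] faces, int count) {
        int kept = 0;
        for (int i = 0; i < count; i++) {
            FaceDetector.Face f = faces[i];
            if (f != null && f.confidence() >= FaceDetector.Face.CONFIDENCE_THRESHOLD) {
                faces[kept++] = f;
            }
        }
        return kept;
    }
}

In setFace() you could then call count = FaceFilter.keepConfidentFaces(faces, count); right after findFaces() and before building the point arrays.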


Next we add setDisplayPoints() to MyImageView; it is used to draw a marker over each detected face. Figure 1 shows such a marker centered on a detected face.
// set up detected face features for display
public void setDisplayPoints(int[] xx, int[] yy, int total, int style) {
    mDisplayStyle = style;
    mPX = null;
    mPY = null;

    if (xx != null && yy != null && total > 0) {
        mPX = new int[total];
        mPY = new int[total];

        for (int i = 0; i < total; i++) {
            // copy the point coordinates into the view's own arrays
            mPX[i] = xx[i];
            mPY[i] = yy[i];
        }
    }
}
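The tutorial only reproduces setDisplayPoints(); the rest of MyImageView is not shown in this excerpt. Below is a minimal sketch of what the class might look like, assuming the mPX, mPY, and mDisplayStyle fields used above, a plain red circle as the marker, and that the bitmap is displayed at its native size so bitmap coordinates map directly to view coordinates. The paint setup, marker radii, and the meaning given to the style values are assumptions, not the original author's code:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.widget.ImageView;

public class MyImageView extends ImageView {
    private int[] mPX = null;
    private int[] mPY = null;
    private int mDisplayStyle = 0;
    private final Paint mPaint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public MyImageView(Context context) {
        super(context);
        mPaint.setStyle(Paint.Style.STROKE);
        mPaint.setColor(Color.RED);
        mPaint.setStrokeWidth(3);
    }

    // setDisplayPoints() as shown above also lives in this class.

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);  // draw the bitmap first
        if (mPX == null || mPY == null) {
            return;
        }
        // style 0 = face midpoints, style 1 = eye centers (an assumed convention)
        float radius = (mDisplayStyle == 1) ? 10f : 20f;
        for (int i = 0; i < mPX.length; i++) {
            canvas.drawCircle(mPX[i], mPY[i], radius, mPaint);
        }
    }
}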


Detecting multiple faces

FaceDetector lets you set an upper limit on the number of faces it will detect. For example, to detect at most 10 faces:

private static final int MAX_FACES = 10;
圖2展現檢測到多張人臉的狀況。
Locating the centers of the eyes

Android face detection also returns other useful information for each face, such as eyesDistance, pose, and confidence. We can use eyesDistance, together with the face's midpoint, to locate the center of each eye.

In the code below, setFace() is called from doLengthyCalc(), which runs the detection on a background thread. Figure 3 shows the located eye centers.
import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

public class TutorialOnFaceDetect extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 10;
    private static String TAG = "TutorialOnFaceDetect";
    private static boolean DEBUG = false;

    protected static final int GUIUPDATE_SETFACE = 999;
    protected Handler mHandler = new Handler() {
        // @Override
        public void handleMessage(Message msg) {
            mIV.invalidate();

            super.handleMessage(msg);
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo and convert it to the RGB_565 format required by FaceDetector
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);
        mIV.invalidate();

        // perform face detection in setFace() on a background thread
        doLengthyCalc();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF eyescenter = new PointF();
        float eyesdist = 0.0f;
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detected any faces
        if (count > 0) {
            fpx = new int[count * 2];
            fpy = new int[count * 2];

            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(eyescenter);
                    eyesdist = faces[i].eyesDistance();

                    // set up left eye location
                    fpx[2 * i] = (int) (eyescenter.x - eyesdist / 2);
                    fpy[2 * i] = (int) eyescenter.y;

                    // set up right eye location
                    fpx[2 * i + 1] = (int) (eyescenter.x + eyesdist / 2);
                    fpy[2 * i + 1] = (int) eyescenter.y;

                    if (DEBUG) {
                        Log.e(TAG, "setFace(): face " + i + ": confidence = " + faces[i].confidence()
                                + ", eyes distance = " + faces[i].eyesDistance()
                                + ", pose = (" + faces[i].pose(FaceDetector.Face.EULER_X) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Y) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Z) + ")"
                                + ", eyes midpoint = (" + eyescenter.x + "," + eyescenter.y + ")");
                    }
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count * 2, 1);
    }

    private void doLengthyCalc() {
        Thread t = new Thread() {
            Message m = new Message();

            public void run() {
                try {
                    setFace();
                    m.what = TutorialOnFaceDetect.GUIUPDATE_SETFACE;
                    TutorialOnFaceDetect.this.mHandler.sendMessage(m);
                } catch (Exception e) {
                    Log.e(TAG, "doLengthyCalc(): " + e.toString());
                }
            }
        };

        t.start();
    }
}
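Note the design choice in this version: findFaces() is CPU-intensive, so setFace() is executed on a plain background Thread started by doLengthyCalc(). When detection finishes, the GUIUPDATE_SETFACE message is sent to mHandler, so that mIV.invalidate() runs on the UI thread rather than on the worker thread.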