This guide walks you through creating a real-time face recognition Android app using Java, CameraX, and ML Kit (Google’s on-device machine learning SDK). No server needed—everything runs locally!
1. Prerequisites
- Android Studio (latest version)
- Android device/emulator (API 21+)
- Basic Java/Android knowledge
2. Setup Dependencies
Add to app/build.gradle:
```gradle
dependencies {
    // CameraX
    implementation "androidx.camera:camera-core:1.3.0"
    implementation "androidx.camera:camera-camera2:1.3.0"
    implementation "androidx.camera:camera-lifecycle:1.3.0"
    implementation "androidx.camera:camera-view:1.3.0"

    // ML Kit Face Detection (on-device)
    implementation "com.google.mlkit:face-detection:16.1.5"

    // Optional: drawable-building utilities (not required for the custom overlay below)
    implementation "com.github.duanhong169:drawabletoolbox:1.0.7"
}
```
Sync Gradle.
3. AndroidManifest.xml Permissions
```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
```
4. MainActivity.java
Step 1: Initialize CameraX + Face Detector
```java
import android.annotation.SuppressLint;
import android.media.Image;
import android.os.Bundle;
import android.util.Log;
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.camera.core.*;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.camera.view.PreviewView;
import androidx.core.content.ContextCompat;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.mlkit.vision.common.InputImage;
import com.google.mlkit.vision.face.Face;
import com.google.mlkit.vision.face.FaceDetection;
import com.google.mlkit.vision.face.FaceDetector;
import com.google.mlkit.vision.face.FaceDetectorOptions;
import java.util.concurrent.ExecutionException;

public class MainActivity extends AppCompatActivity {

    private ListenableFuture<ProcessCameraProvider> cameraProviderFuture;
    private PreviewView previewView;
    private FaceDetector detector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        previewView = findViewById(R.id.previewView);

        // High-accuracy face detection settings
        FaceDetectorOptions options = new FaceDetectorOptions.Builder()
                .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_ACCURATE)
                .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
                .enableTracking() // needed so getTrackingId() returns an ID below
                .build();
        detector = FaceDetection.getClient(options);

        startCamera();
    }
    private void startCamera() {
        cameraProviderFuture = ProcessCameraProvider.getInstance(this);
        cameraProviderFuture.addListener(() -> {
            try {
                ProcessCameraProvider provider = cameraProviderFuture.get();
                bindPreview(provider);
            } catch (ExecutionException | InterruptedException e) {
                e.printStackTrace();
            }
        }, ContextCompat.getMainExecutor(this));
    }
```
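Note that the manifest entry alone is not enough: on Android 6.0+ the CAMERA permission must also be granted at runtime, or the preview stays black. Here is a minimal sketch using the classic permission callback (the request code 10 is an arbitrary value chosen for this example); it replaces the unconditional startCamera() call in onCreate():

```java
// Additional imports needed for the permission check
// import android.Manifest;
// import android.content.pm.PackageManager;
// import androidx.core.app.ActivityCompat;

// In onCreate(), instead of calling startCamera() directly:
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        == PackageManager.PERMISSION_GRANTED) {
    startCamera();
} else {
    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, 10);
}

// And in MainActivity, start the camera once the user grants the permission:
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                        @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == 10 && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        startCamera();
    }
}
```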
Step 2: Bind Camera Preview + Face Analysis
```java
    private void bindPreview(@NonNull ProcessCameraProvider provider) {
        Preview preview = new Preview.Builder().build();

        CameraSelector selector = new CameraSelector.Builder()
                .requireLensFacing(CameraSelector.LENS_FACING_FRONT)
                .build();

        ImageAnalysis analysis = new ImageAnalysis.Builder()
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build();

        analysis.setAnalyzer(ContextCompat.getMainExecutor(this), imageProxy -> {
            @SuppressLint("UnsafeOptInUsageError")
            Image mediaImage = imageProxy.getImage();
            if (mediaImage != null) {
                InputImage image = InputImage.fromMediaImage(
                        mediaImage, imageProxy.getImageInfo().getRotationDegrees());
                detector.process(image)
                        .addOnSuccessListener(faces -> {
                            for (Face face : faces) {
                                // Overlay drawing is wired up in Sections 5 and 6
                                Log.d("FaceRecognition", "Face detected! ID: " + face.getTrackingId());
                            }
                        })
                        .addOnFailureListener(e -> Log.e("FaceRecognition", "Detection failed", e))
                        .addOnCompleteListener(task -> imageProxy.close());
            } else {
                imageProxy.close(); // always release the frame, or analysis stalls
            }
        });

        provider.unbindAll();
        provider.bindToLifecycle(this, selector, preview, analysis);
        preview.setSurfaceProvider(previewView.getSurfaceProvider());
    }
```
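For simplicity the analyzer above runs on the main executor. A common refinement is to give ImageAnalysis its own single-threaded executor and to release it, together with the detector, when the activity is destroyed. A minimal sketch follows; the analysisExecutor field name is an addition chosen for this example:

```java
// Additional imports
// import java.util.concurrent.ExecutorService;
// import java.util.concurrent.Executors;

// Extra MainActivity field: a dedicated thread for frame analysis
private final ExecutorService analysisExecutor = Executors.newSingleThreadExecutor();

// In bindPreview(), pass it instead of the main executor:
// analysis.setAnalyzer(analysisExecutor, imageProxy -> { ... });

@Override
protected void onDestroy() {
    super.onDestroy();
    analysisExecutor.shutdown(); // stop the analysis thread
    detector.close();            // free ML Kit resources
}
```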
5. Face Contour Drawing (Custom View)
Create FaceContourView.java:
```java
import android.content.Context;
import android.graphics.*;
import android.util.AttributeSet;
import android.view.View;
import com.google.mlkit.vision.face.Face;
import java.util.List;

public class FaceContourView extends View {

    private final Paint paint;
    private List<Face> faces;

    public FaceContourView(Context context, AttributeSet attrs) {
        super(context, attrs);
        paint = new Paint();
        paint.setColor(Color.GREEN);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(5f);
    }

    public void setFaces(List<Face> faces) {
        this.faces = faces;
        invalidate(); // trigger a redraw with the latest results
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (faces != null) {
            for (Face face : faces) {
                // Draws each face's bounding box; for full contours, iterate face.getAllContours()
                RectF bounds = new RectF(face.getBoundingBox());
                canvas.drawRect(bounds, paint);
            }
        }
    }
}
```
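One caveat: ML Kit reports bounding boxes in the coordinate space of the analyzed camera frame, not of the on-screen view, so boxes drawn as-is will usually be misaligned. Below is a simplified mapping sketch; the setImageSourceInfo helper and its fields are additions invented for this example, it assumes the frame dimensions already account for rotation, and it ignores the cropping PreviewView applies with its default scale type.

```java
// Hypothetical additions to FaceContourView: remember the analyzed frame size
private int frameWidth = 1;
private int frameHeight = 1;
private boolean mirror = true; // the front-camera preview is mirrored

public void setImageSourceInfo(int width, int height, boolean isFrontCamera) {
    this.frameWidth = width;
    this.frameHeight = height;
    this.mirror = isFrontCamera;
}

// Map a box from frame coordinates into this view's coordinates before drawing it
private RectF toViewCoordinates(Rect box) {
    float scaleX = getWidth() / (float) frameWidth;
    float scaleY = getHeight() / (float) frameHeight;
    RectF mapped = new RectF(box.left * scaleX, box.top * scaleY,
            box.right * scaleX, box.bottom * scaleY);
    if (mirror) { // flip horizontally so boxes line up with the mirrored preview
        float mirroredLeft = getWidth() - mapped.right;
        float mirroredRight = getWidth() - mapped.left;
        mapped.left = mirroredLeft;
        mapped.right = mirroredRight;
    }
    return mapped;
}
```

In onDraw(), you would then draw toViewCoordinates(face.getBoundingBox()) instead of the raw bounding box, and call setImageSourceInfo(...) from the analyzer once the frame size is known.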
Add to activity_main.xml:
```xml
<com.yourpackage.FaceContourView
    android:id="@+id/faceOverlay"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```
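For reference, here is one possible complete activity_main.xml that stacks the overlay on top of the camera preview. The original snippet only shows the overlay element, so the FrameLayout wrapper and the PreviewView entry are assumptions; replace com.yourpackage with your app's actual package name.

```xml
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Camera preview referenced as R.id.previewView in MainActivity -->
    <androidx.camera.view.PreviewView
        android:id="@+id/previewView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <!-- Transparent overlay drawn on top of the preview -->
    <com.yourpackage.FaceContourView
        android:id="@+id/faceOverlay"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
```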
6. Update MainActivity for Drawing
```java
// Add to MainActivity fields
private FaceContourView faceOverlay;

// In onCreate()
faceOverlay = findViewById(R.id.faceOverlay);

// Inside the onSuccessListener in bindPreview()
faceOverlay.setFaces(faces);
```
7. Run the App
- Connect an Android device (or use emulator with camera support).
- Click Run in Android Studio.
- Grant camera permissions when prompted.
8. Enhancements
- Face Recognition (Not Just Detection):
  - Use TensorFlow Lite with a pre-trained model (e.g., FaceNet).
  - Compare face embeddings against a database of known faces (see the sketch after this list).
- Save Recognized Faces:
  - Store facial features (embeddings) in a Room database.
- Add User Feedback:
  - Show names when a known face is detected.
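To turn detection into recognition, embeddings produced by a model such as FaceNet are usually compared with a distance metric. Below is a minimal cosine-similarity sketch in plain Java; the 0.8 threshold and the embedding variable names are illustrative assumptions, not values prescribed by any particular model.

```java
// Cosine similarity between two face embeddings; values closer to 1.0 mean more similar
public static float cosineSimilarity(float[] a, float[] b) {
    float dot = 0f, normA = 0f, normB = 0f;
    for (int i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return (float) (dot / (Math.sqrt(normA) * Math.sqrt(normB) + 1e-10));
}

// Example decision: treat two embeddings as the same person above a chosen threshold
// boolean samePerson = cosineSimilarity(storedEmbedding, liveEmbedding) > 0.8f;
```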
9. Key Takeaways
✅ ML Kit for real-time, on-device face detection (no internet needed).
✅ CameraX for modern camera handling.
✅ Custom view to draw face bounding boxes.

