Senior · 5 min · March 28, 2026

Biometric Auth — Why BIOMETRIC_WEAK Accepts a Printed Photo

BIOMETRIC_WEAK passes canAuthenticate() without liveness detection.

Naren · Founder
Plain-English first. Then code. Then the interview question.
 ● Production Incident 🔎 Debug Guide
Quick Answer
  • Biometric authentication = matching a physical trait against an enrolled template. On-device via FIDO2/WebAuthn stores templates in hardware (Secure Enclave/TEE). Server-side matching = you own the breach risk.
  • FAR (False Acceptance Rate) vs FRR (False Rejection Rate) are inversely coupled. Lower FAR raises FRR. At 4M users, 0.1% FRR = 4,000 daily failed logins = $40K/day call centre cost.
  • Android BIOMETRIC_STRONG (fingerprint, Face ID depth map) vs BIOMETRIC_WEAK (camera face unlock). WEAK = spoofable with a photo. Never use for finance.
  • Production trap: KeyPermanentlyInvalidatedException when user adds/removes a fingerprint. Unhandled → app crash or infinite loop. Handle it by revoking the server-side public key and re-enrolling.
  • FIDO2 phishing resistance: credential bound to origin (bank.example.com). Phishing site (b4nk.example.com) gets no credential. AitM attacks fail at protocol level.
  • Biggest mistake: setting userVerification: 'preferred' instead of 'required' in WebAuthn. On devices without biometrics, browser skips verification, UV flag = 0, server accepts — attacker needs only device PIN, not fingerprint.
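The userVerification trap in the last bullet is detectable server-side. Per the WebAuthn spec, the authenticator data returned with every assertion carries a flags byte at offset 32 (after the 32-byte rpIdHash), where bit 0 is UP (user present) and bit 2 is UV (user verified). A minimal Python sketch of the check (the byte strings below are simulated, not real authenticator output):

```python
# Authenticator data layout (WebAuthn): rpIdHash (32 bytes) | flags (1) | signCount (4) | ...
FLAG_UP = 0x01  # user present
FLAG_UV = 0x04  # user verified (biometric, or PIN with actual verification)

def require_user_verification(auth_data: bytes) -> bool:
    """Reject assertions where the authenticator skipped user verification."""
    if len(auth_data) < 37:
        raise ValueError("authenticator data too short")
    flags = auth_data[32]
    return bool(flags & FLAG_UP) and bool(flags & FLAG_UV)

# Simulated authenticator data: 32-byte rpIdHash + flags byte + 4-byte sign count
rp_id_hash = b"\x00" * 32
verified   = rp_id_hash + bytes([FLAG_UP | FLAG_UV]) + b"\x00\x00\x00\x01"
unverified = rp_id_hash + bytes([FLAG_UP]) + b"\x00\x00\x00\x01"

assert require_user_verification(verified) is True
assert require_user_verification(unverified) is False  # UV flag = 0: reject
```

If your relying party enforces this check, a `userVerification: 'preferred'` misconfiguration on the client fails closed instead of silently downgrading to presence-only.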
Plain-English First

Think of your password as a key you carry around — anyone who steals the key can open the lock. Biometric authentication flips this: instead of checking the key, the lock checks your hand shape, your face geometry, or your voice pattern. The catch? You can't reissue your fingerprint after a breach the way you reissue a stolen key. The 'lock' also never gets a perfect read — it's always making a probabilistic judgment call, not an exact match. That one detail — probability, not certainty — is the root cause of almost every biometric security incident you'll read about.

A major Southeast Asian bank's mobile app failed a PCI-DSS audit in 2022 because their biometric layer was storing raw fingerprint images in a local SQLite database on the device — unencrypted, backed up to iCloud by default. Every successful login was also silently backing up the user's biometric template to a consumer cloud account the security team had zero control over. That's not a hypothetical. That's what happens when teams treat biometrics as a UX feature instead of a cryptographic identity primitive.

The problem biometric authentication solves is real and unsolved by passwords alone: humans are catastrophically bad at secret management. They reuse passwords, write them on sticky notes, and surrender them to the first convincing phishing email they get. Hardware tokens help, but they get lost. Biometrics are different — they bind authentication to something you physically are, which dramatically raises the cost of remote credential theft. That matters right now because credential stuffing attacks have become industrialised. Buying 50 million username/password pairs costs less than a decent dinner.

By the end of this article you'll be able to design a biometric authentication flow that doesn't leak templates, explain the FAR/FRR trade-off to a product manager without putting them to sleep, identify the four most common ways biometric systems get defeated in production, and make an informed architecture call between on-device matching and server-side matching. You'll also know exactly when to tell your CTO that biometrics alone aren't enough.

The Four Biometric Types and Their Real-World Attack Surfaces

Every biometric modality makes a different bet on the uniqueness and permanence of a physical trait. Understanding that bet is the only way to reason about the attack surface you're accepting.

Fingerprint recognition is the most widely deployed modality because the sensors are cheap, fast, and well-understood. The matching algorithm extracts minutiae points — ridge endings and bifurcations — and compares them against an enrolled template. The core vulnerability isn't the algorithm; it's liveness detection. A 2019 study from Cisco Talos showed that high-resolution fingerprint photos lifted from a wine glass could defeat capacitive sensors on most mid-range Android devices using a $500 mould-and-cast workflow. If your threat model includes targeted physical attacks on high-value accounts, fingerprint alone is a bad bet.

Facial recognition splits into two very different things people often conflate: 2D face matching (a photo comparison, basically) and 3D structured light or time-of-flight depth mapping like Apple Face ID. The 2D variant is trivially defeated by a photograph. Don't ship it for anything that matters. The 3D variant is genuinely hard to spoof — Face ID's published false acceptance rate is 1 in 1,000,000 — but it requires expensive dedicated hardware and fails non-trivially in bright outdoor light and at extreme angles.

Voice recognition and iris scanning round out the common deployment options. Iris is extremely accurate (false acceptance rates around 1 in 1.2 million in controlled conditions) but requires dedicated near-infrared hardware and degrades badly with contact lenses and certain eye conditions. Voice recognition is the weakest of the four in 2024 — modern voice synthesis models can clone a voice from 30 seconds of audio. If you're designing a phone-based authentication flow, voice biometrics should be treated as a convenience factor only, never as a primary security control.

BiometricModalityComparison.systemdesign
// io.thecodeforge — System Design tutorial
// Biometric Modality Selection: Architecture Decision Record
// Scenario: Multi-channel fintech app selecting auth modality per channel

// ─────────────────────────────────────────────────────────────────────
// SYSTEM CONTEXT
// ─────────────────────────────────────────────────────────────────────
// Channels:    Mobile (iOS + Android), Web portal, IVR phone banking
// Users:       ~4M retail banking customers
// Threat model: Remote credential stuffing, stolen device, SIM swap
//               (NOT nation-state targeted physical attacks)

// ─────────────────────────────────────────────────────────────────────
// MODALITY DECISION TREE — per channel
// ─────────────────────────────────────────────────────────────────────

CHANNEL: Mobile (iOS 14+, Android 9+)
├── Primary:   On-device fingerprint OR Face (delegated to OS biometric API)
│              WHY: We never see the raw biometric — OS handles enrollment
│              and matching inside the Secure Enclave / TEE. We only get
│              a signed cryptographic assertion: "user authenticated".
│              This eliminates template storage liability entirely.
│
├── Hardware:  Apple Secure Enclave (A7+), Android StrongBox / TEE
│              REQUIREMENT: Confirm hardware-backed key storage at
│              enrollment time. Reject software-only fallbacks.
│              Android API: KeyInfo.isInsideSecureHardware() == true
│              iOS API:     SecAccessControlCreateWithFlags + .biometryAny
│
├── Fallback:  Device PIN (NOT SMS OTP — defeats SIM swap protection)
│
└── REJECT:    Raw image/template capture from custom SDK sensors.
               Legal exposure: BIPA (Illinois), GDPR Art.9 (special category).
               Operational exposure: You now own a biometric data breach.

CHANNEL: Web Portal (desktop browser)
├── Primary:   FIDO2 / WebAuthn with platform authenticator
│              WHY: Browser biometric APIs (navigator.credentials) delegate
│              to the OS — same Secure Enclave path as mobile.
│              Template never leaves the device. Ever.
│
├── Fallback:  FIDO2 hardware key (YubiKey) — for power users / ops staff
│
└── REJECT:    Browser-based face capture via WebRTC + cloud matching.
               Latency is 800ms–2s round-trip. Liveness detection requires
               server-side ML infra you have to maintain and retrain.
               Not worth it when WebAuthn gives you better security for free.

CHANNEL: IVR Phone Banking
├── Primary:   NONE. Do not use voice biometrics as a security control.
│              WHY: Voice synthesis (ElevenLabs, RVC, VALL-E) clones voice
│              from publicly available audio. LinkedIn videos. Earnings calls.
│              Customer service recordings. The attack cost is ~$0.
│
├── Alternative: Step-up to mobile push notification with biometric
│               confirmation on the registered device. Forces the attacker
│               to control both the phone call AND the enrolled device.
│
└── If voice biometrics are mandated by business: treat as a single factor
    in a 2FA flow — never as the sole gate. Document the risk acceptance.

// ─────────────────────────────────────────────────────────────────────
// FAR / FRR OPERATING POINTS — what the product team needs to understand
// ─────────────────────────────────────────────────────────────────────

FAR  = False Acceptance Rate  — impostors incorrectly granted access
FRR  = False Rejection Rate   — legitimate users incorrectly denied

// These are inversely coupled. Lowering FAR raises FRR and vice versa.
// The operating point is a BUSINESS DECISION, not a technical one.

Modality          Typical FAR         Typical FRR     Notes
──────────────────────────────────────────────────────────────────────
Fingerprint       0.001% – 0.1%       0.1% – 1%       Degrades: wet/dry fingers
Face 3D (Face ID) 0.0001%             ~12%           Degrades: masks, sunglasses
Face 2D           1% – 5%             0.5%            NEVER use for finance
Iris              0.00008%            0.3%            Hardware cost is prohibitive
Voice             1% – 10%            1% – 5%         Synthetic audio attack: ~100% FAR

// At 4M users, even a 0.1% FRR means 4,000 locked-out customers per day.
// Your call centre cost per failed auth interaction: ~$8–$12.
// That's $32,000–$48,000/day in hidden costs from a threshold decision.
// Run this number in front of your product manager before finalising thresholds.
Output
Architecture Decision Record evaluated.
Mobile channel: Fingerprint/Face via OS API → hardware-backed key assertion
Web channel: FIDO2/WebAuthn platform authenticator
IVR channel: Voice biometrics REJECTED — step-up to mobile push 2FA
FAR/FRR operating point requires business sign-off.
At 4M users + 0.1% FRR → 4,000 failed auths/day → ~$40,000/day call centre exposure.
Recommend starting at vendor default threshold and A/B testing tighter values.
Template storage liability: ZERO (all matching delegated to device OS).
GDPR Art.9 / BIPA exposure: ZERO (no biometric data leaves the device).
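The cost figures in the Output panel above reduce to one line of arithmetic. A quick Python sketch using the article's assumptions (one login per user per day, $8–$12 cost per call-centre interaction):

```python
def frr_cost_per_day(users: int, logins_per_user_per_day: float,
                     frr: float, cost_per_call: float) -> tuple[float, float]:
    """Expected failed auths per day, and the resulting call-centre cost."""
    failed = users * logins_per_user_per_day * frr
    return failed, failed * cost_per_call

# 4M users, 0.1% FRR, $8 and $12 per support interaction
failed, low = frr_cost_per_day(4_000_000, 1.0, 0.001, 8.0)
_, high = frr_cost_per_day(4_000_000, 1.0, 0.001, 12.0)

assert failed == 4000.0                    # locked-out customers per day
assert (low, high) == (32_000.0, 48_000.0) # daily hidden cost range
```

Running this with your own login frequency and support-cost numbers is the fastest way to turn a threshold debate with product into a dollars conversation.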
Android's isInsideSecureHardware() Lie
On some Android OEM builds (seen this on a major Chinese manufacturer's flagship in 2022), KeyInfo.isInsideSecureHardware() returns true even when StrongBox is unavailable and the key is stored in a software-emulated TEE. Always cross-check with KeyInfo.getSecurityLevel() == KeyProperties.SECURITY_LEVEL_STRONGBOX — not just the boolean. If StrongBox isn't available, decide upfront whether TEE-only is acceptable for your threat model, and make that a documented risk decision, not an accidental one.
Production Insight
On a major Chinese OEM's flagship device, isInsideSecureHardware() returned true, but getSecurityLevel() returned SECURITY_LEVEL_SOFTWARE. The device had no TEE at all.
The team had been accepting "hardware" keys from this device for 18 months.
Rule: Always check the security level, not just the boolean. Reject keys with SECURITY_LEVEL_SOFTWARE for any sensitive auth flow.
Key Takeaway
Fingerprint is cheap but spoofable. 3D Face ID is strong but requires hardware. 2D face is a photo comparison — never use for finance.
Voice biometrics are broken (AI voice cloning). Iris is accurate but expensive.
Rule: On-device matching via FIDO2/WebAuthn eliminates template storage liability entirely.

How Biometric Matching Actually Works: Templates, Thresholds, and the Liveness Problem

Most engineers treat the biometric sensor as a black box that returns true or false. That mental model will burn you in production. The actual pipeline has five stages, each with its own failure mode.

Capture → Feature Extraction → Template Generation → Matching → Decision. The sensor captures a raw signal — pixels, capacitance grid, acoustic waveform. The feature extractor converts that signal into a compact mathematical representation called a template — for fingerprints, that's typically a set of (x, y, angle) tuples for minutiae points, stored as a vector of 400–1,000 bytes. The matcher computes a similarity score between the live template and the enrolled template. The decision module applies a threshold to that score.
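To make the matcher stage concrete, here is a deliberately simplified Python sketch of minutiae scoring: each enrolled (x, y, angle) tuple counts as matched if some live minutia falls within a distance and angle tolerance. The sample points, tolerances, and threshold are invented for illustration; real matchers first align the two point sets (rotation and translation) and use far more robust pairing.

```python
import math

def minutiae_similarity(enrolled, live, dist_tol=10.0, angle_tol=15.0):
    """Toy score: fraction of enrolled minutiae with a live minutia inside
    both a distance and an angle tolerance."""
    matched = 0
    for ex, ey, ea in enrolled:
        for lx, ly, la in live:
            d_angle = abs(ea - la) % 360
            d_angle = min(d_angle, 360 - d_angle)  # wrap-around angle distance
            if math.hypot(ex - lx, ey - ly) <= dist_tol and d_angle <= angle_tol:
                matched += 1
                break
    return matched / len(enrolled)

# Invented sample templates: (x, y, ridge angle in degrees)
enrolled   = [(10, 20, 45), (50, 60, 90), (80, 30, 180)]
live_good  = [(12, 21, 44), (49, 62, 95), (81, 29, 178)]  # same finger, noisy
live_spoof = [(200, 200, 0)]                              # unrelated artefact

THRESHOLD = 0.66  # the decision module: tune against your FAR/FRR targets
assert minutiae_similarity(enrolled, live_good) >= THRESHOLD
assert minutiae_similarity(enrolled, live_spoof) < THRESHOLD
```

Notice the matcher returns a score, not a verdict; the verdict only exists relative to a threshold, which is exactly where the politics of the next paragraph come in.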

The threshold is where everything gets political. Security teams want a low FAR — they want to minimise impostors getting through. Product teams want a low FRR — they want to minimise frustrated legitimate users calling support. These goals are mathematically opposed. You can't improve both simultaneously with the same algorithm. The Equal Error Rate (EER) is the operating point where FAR equals FRR, and it's used as a single-number benchmark, but you almost never want to operate at EER in production. You tune based on the cost of each error type in your specific context.
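The EER mechanics can be shown in a few lines of Python. The genuine and impostor score lists below are made up; the point is the procedure: sweep the threshold, compute FAR and FRR at each point, and find where they cross.

```python
def far_frr(genuine, impostor, threshold):
    """FAR: impostor scores at/above threshold; FRR: genuine scores below it."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

genuine  = [0.79, 0.84, 0.88, 0.90, 0.91, 0.95]  # same-person comparison scores
impostor = [0.15, 0.20, 0.35, 0.42, 0.55, 0.81]  # different-person scores

# Sweep thresholds 0.00..1.00; the EER sits where FAR and FRR are closest.
diff, eer_threshold, eer = min(
    (abs(far - frr), t, max(far, frr))
    for t in (i / 100 for i in range(101))
    for far, frr in [far_frr(genuine, impostor, t)]
)

assert eer_threshold == 0.80      # here FAR = FRR = 1/6
assert abs(eer - 1 / 6) < 1e-12
```

In production you would shift the operating point away from this crossing: to the right of it (fewer impostors, more rejected users) for a banking transfer, to the left for a low-stakes unlock.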

Liveness detection is the layer that gets skipped in proof-of-concepts and costs you in production. Without it, a static artefact — a photo, a mould, a replay attack — gets the same similarity score as a live person. PAD (Presentation Attack Detection) is a separate subsystem, and its quality varies enormously across vendors. When you're evaluating a biometric SDK, the ISO/IEC 30107-3 PAD compliance level is the number you ask for, not the marketing FAR figure.

BiometricMatchingPipeline.systemdesign
// io.thecodeforge — System Design tutorial
// Biometric Matching Pipeline: Sequence + Data Flow
// Scenario: Mobile banking app, on-device matching via Android BiometricPrompt

// ─────────────────────────────────────────────────────────────────────
// SEQUENCE: Enrollment (one-time, at account setup)
// ─────────────────────────────────────────────────────────────────────

User → App: "Enable biometric login"

App → OS BiometricManager:
    CHECK canAuthenticate(BIOMETRIC_STRONG)
    // BIOMETRIC_STRONG requires hardware-backed credential
    // BIOMETRIC_WEAK allows face unlock via camera (lower assurance)
    // NEVER use DEVICE_CREDENTIAL alone for financial flows

    IF result == BIOMETRIC_ERROR_NO_HARDWARE:
        // Device has no biometric hardware. Offer PIN fallback.
        // Log telemetry — useful for device support decisions.
        ABORT enrollment, present PIN setup

    IF result == BIOMETRIC_ERROR_NONE_ENROLLED:
        // Hardware exists but user hasn't enrolled a fingerprint/face.
        // Direct to system Settings — you CANNOT enroll on their behalf.
        LAUNCH Intent(Settings.ACTION_BIOMETRIC_ENROLL)

App → Android Keystore:
    // Generate an asymmetric key pair BOUND to biometric authentication.
    // This is the critical step. The private key is:
    //   - Stored in hardware-backed Keystore (TEE or StrongBox)
    //   - ONLY usable after successful biometric auth in the same session
    //   - Never exportable — cannot be read out of the hardware, ever

    KeyPairGenerator.getInstance("EC", "AndroidKeyStore")
    KeyGenParameterSpec.Builder("biometric_auth_key", PURPOSE_SIGN)
        .setAlgorithmParameterSpec(ECGenParameterSpec("secp256r1"))
        .setUserAuthenticationRequired(true)          // BOUND to biometric
        .setUserAuthenticationParameters(
            timeout = 0,                              // 0 = require fresh auth
            type = AUTH_BIOMETRIC_STRONG              // NOT weak face unlock
        )
        .setInvalidatedByBiometricEnrollment(true)   // KEY IS DELETED if new
                                                     // fingerprint is added.
                                                     // Prevents adding attacker
                                                     // fingerprint = instant access.

App → Backend API:
    POST /biometric/enroll
    PAYLOAD: { userId, publicKey (PEM), deviceId, keyAttestation }
    // keyAttestation is a certificate chain from the hardware proving the key
    // genuinely lives in hardware. Verify this server-side — DO NOT skip it.
    // An unattested key could be a software key on a rooted device.

Backend:
    VERIFY keyAttestation certificate chain against Google's root CA
    // Google publishes root certificates for hardware attestation.
    // If attestation fails: reject enrollment, flag account for review.
    STORE { userId, publicKey, deviceId, enrolledAt }
    // You are storing a PUBLIC KEY. Not a fingerprint. Not a template.
    // A breach of this table leaks nothing biometric.

// ─────────────────────────────────────────────────────────────────────
// SEQUENCE: Authentication (every login)
// ─────────────────────────────────────────────────────────────────────

User → App: "Log in with fingerprint"

App → Backend:
    GET /biometric/challenge
    RESPONSE: { challenge: "<32-byte cryptographically random nonce>" }
    // The challenge MUST be server-generated and single-use.
    // Client-generated challenges = replay attack surface.
    // Store challenge server-side with a 60-second TTL.

App → OS BiometricPrompt:
    // Show system fingerprint dialog.
    // Pass a CryptoObject wrapping a Signature initialised with private key.
    BiometricPrompt.authenticate(
        CryptoObject(signature),
        cancellationSignal,
        executor,
        authCallback
    )

OS → Secure Hardware:
    BIOMETRIC MATCHING HAPPENS HERE — entirely inside TEE/Secure Enclave
    The app NEVER sees the fingerprint image or template.
    The hardware returns: SUCCESS or FAILURE
    On SUCCESS: unlocks the private key for use within this session

OS → App (on success):
    // The Signature object is now usable — private key is unlocked.
    signature.update(challengeBytes)  // Sign the server's nonce
    signedChallenge = signature.sign() // Produces EC signature

App → Backend:
    POST /biometric/verify
    PAYLOAD: { userId, deviceId, signedChallenge, challengeId }

Backend:
    RETRIEVE challenge by challengeId — verify it's < 60 seconds old
    MARK challenge as consumed — prevents replay within TTL window
    RETRIEVE publicKey for userId + deviceId
    VERIFY ECDSA signature over challenge bytes using stored publicKey
    // Standard ECDSA verify. No biometric data ever reaches the server.
    // If signature is valid: the user's finger was on the enrolled device.
    // That's the cryptographic guarantee. Nothing more, nothing less.

    IF valid:
        ISSUE short-lived JWT (15min) + refresh token (rotated on each use)
        LOG auth event: { userId, deviceId, timestamp, ipAddress, geoHash }

// ─────────────────────────────────────────────────────────────────────
// WHAT HAPPENS WHEN SOMEONE ADDS A NEW FINGERPRINT TO THE DEVICE
// ─────────────────────────────────────────────────────────────────────

// setInvalidatedByBiometricEnrollment(true) means:
// → Android deletes biometric_auth_key from Keystore automatically
// → Next login attempt: KeyPermanentlyInvalidatedException thrown
// → App detects this, prompts: "Your biometric login was reset.
//   Please log in with your password to re-enrol."
// → Requires password re-authentication before new biometric enrolment
//
// WHY THIS MATTERS: Without this, an attacker who has brief physical
// access to an unlocked phone can add THEIR fingerprint to the device
// and immediately gain access to the banking app with their own finger.
// This flag closes that attack vector.
Output
ENROLLMENT FLOW:
canAuthenticate(BIOMETRIC_STRONG) → SUCCESS
KeyPairGenerator → EC key generated in StrongBox hardware
keyAttestation verified against Google hardware attestation root CA
Backend stores: { userId, publicKeyPEM, deviceId, enrolledAt }
Biometric data stored: NONE (all in device hardware)
AUTHENTICATION FLOW:
Challenge issued: 32-byte nonce, TTL=60s, single-use
BiometricPrompt shown → fingerprint matched inside TEE
Private key unlocked → ECDSA signature over challenge computed
Backend: challenge consumed, signature verified ✓
JWT issued: exp=15min | refresh token rotated
SECURITY PROPERTIES:
Template breach risk: NONE (never leaves device hardware)
Replay attack surface: NONE (single-use challenge, 60s TTL)
New-fingerprint attack: BLOCKED (key invalidated on enrol change)
Rooted device / soft key: BLOCKED (hardware attestation required)
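The server side of the challenge flow above (generate, 60-second TTL, single-use consumption) fits in a few lines. A Python sketch with an in-memory dict; a production deployment would back this with Redis or similar, key challenges to a user, and perform the ECDSA verification step, which is omitted here:

```python
import secrets
import time

class ChallengeStore:
    """Single-use login challenges with a short TTL (in-memory sketch)."""

    def __init__(self, ttl_seconds: float = 60.0):
        self.ttl = ttl_seconds
        self._pending: dict[str, float] = {}  # challenge hex -> issue time

    def issue(self) -> str:
        # Server-generated, cryptographically random 32-byte nonce.
        challenge = secrets.token_hex(32)
        self._pending[challenge] = time.monotonic()
        return challenge

    def consume(self, challenge: str) -> bool:
        """True at most once per challenge, and only inside the TTL window."""
        issued = self._pending.pop(challenge, None)  # pop enforces single-use
        return issued is not None and (time.monotonic() - issued) <= self.ttl

store = ChallengeStore(ttl_seconds=60.0)
c = store.issue()
assert store.consume(c) is True       # first use within TTL: accepted
assert store.consume(c) is False      # replay of same challenge: rejected
assert store.consume("forged") is False  # unknown challenge: rejected
```

Only after `consume()` returns true would the backend verify the ECDSA signature over the challenge bytes with the stored public key; doing the signature check first would let an attacker burn CPU with junk challenges.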
Never Skip Hardware Attestation Verification
I've reviewed three fintech codebases where the backend stored the public key from enrollment without verifying the keyAttestation certificate chain. On a rooted Android device, you can generate an EC key in software, fake the enrollment POST, and the backend happily stores it. Now your 'biometric' login is just an EC signature from a key sitting in a file on a rooted phone — there's no biometric involved at all. Always verify attestation against Google's hardware attestation root (available at android.googleapis.com/attestation/status). Reject soft-backed keys for any flow that matters.
Production Insight
A rooted Pixel 3 bypassed attestation using a Magisk module that spoofed the attestation certificate.
The backend's verification checked only that a certificate was present, not that it was valid against Google's root.
Fix: implement full chain verification, including revocation checking via Google's online endpoint.
Rule: Hardware attestation is a chain of trust. Verify every link, not just the leaf certificate.
Key Takeaway
Biometric pipeline: Capture → Feature Extraction → Template → Matching → Threshold.
Threshold balances FAR vs FRR — business decision, not technical.
Liveness detection (PAD) is mandatory. Without it, a photo defeats face unlock.

Where Biometric Systems Break Down in Production

The failure modes that actually show up in incident post-mortems aren't the ones in the threat model documents. They're operational, edge-case, and demographic.

Template aging is the slow-burn failure nobody plans for. Fingerprints change. Aging, manual labour, chemotherapy, eczema, and significant weight changes all degrade match quality over time. I've seen a healthcare company's nurse workforce hit a 12% FRR spike — meaning roughly 1 in 8 nurses couldn't authenticate at the medication dispensing terminal — because their templates were enrolled 18 months prior and nobody had built a re-enrollment workflow. The fix isn't a better algorithm. It's building periodic re-enrollment prompts into your UX from day one.

Biometric fallback paths are where most security properties go to die. Users who can't authenticate biometrically fall back to PIN, password, or SMS OTP. If that fallback is weaker than the primary path, attackers just target the fallback. I've seen apps with Face ID that fell back to a 4-digit PIN — making the entire biometric layer security theatre. Your fallback must be designed as a primary security control, not an afterthought.

Privacy regulations create operational complexity that engineers routinely underestimate. BIPA in Illinois requires written consent and a published retention policy before collecting biometric identifiers. GDPR Article 9 classifies biometric data as special category data requiring explicit legal basis. If you're building server-side biometric matching — even as a backend service your mobile app calls — you need legal sign-off in every jurisdiction where your users live before you ship. The on-device matching architecture I showed in the previous section isn't just a security choice; it's also how you avoid being BIPA's next headline.

BiometricFailureModeMitigation.systemdesign
// io.thecodeforge — System Design tutorial
// Biometric Failure Mode Catalogue: Production Incident Patterns
// Scenario: High-availability fintech app, 4M users, 99.9% auth SLA

// ─────────────────────────────────────────────────────────────────────
// FAILURE MODE 1: KeyPermanentlyInvalidatedException
// ─────────────────────────────────────────────────────────────────────

SYMPTOM:
    User taps "Login with fingerprint"
    App throws: KeyPermanentlyInvalidatedException
    User sees: generic error screen or app crash (if unhandled)

CAUSE:
    A) User added/removed a fingerprint on their device
    B) User changed/removed screen lock (depending on key config)
    C) Device was enrolled with setInvalidatedByBiometricEnrollment(true)
       — which is correct behaviour, not a bug

HANDLING:
    catch (KeyPermanentlyInvalidatedException e) {
        // Do NOT show a generic error. The user isn't broken.
        // Their security configuration changed. That's expected.

        // 1. Delete the invalidated key from local Keystore
        keyStore.deleteEntry("biometric_auth_key")

        // 2. Revoke the associated public key on the server
        //    POST /biometric/revoke { userId, deviceId }
        //    This prevents the old public key being used in a confused deputy attack
        apiClient.revokeBiometricKey(userId, deviceId)

        // 3. Present clear UX: "Your biometric login needs to be reset"
        //    Require password authentication before re-enrollment
        //    This is intentional friction — it's the security checkpoint
        navigator.navigate(Route.BiometricReenrollment)
    }

// ─────────────────────────────────────────────────────────────────────
// FAILURE MODE 2: Biometric lockout after failed attempts
// ─────────────────────────────────────────────────────────────────────

SYMPTOM:
    After 5 failed fingerprint attempts, Android locks biometric auth.
    App receives: BiometricPrompt.ERROR_LOCKOUT (error code 7)
    After extended failures: ERROR_LOCKOUT_PERMANENT (error code 9)
    User is stuck — biometric is disabled until device PIN is entered.

CAUSE:
    OS-enforced anti-brute-force protection. This is correct.
    But apps that don't handle it gracefully strand the user.

HANDLING:
    onAuthenticationError(errorCode, errString) {
        when (errorCode) {
            BIOMETRIC_ERROR_LOCKOUT -> {
                // Temporary lockout (30 seconds typically)
                // Guide user to verify with device PIN/password
                // This unlocks biometric after successful PIN entry
                showMessage("Too many attempts. Verify with your PIN to continue.")
                // Offer explicit "Use PIN instead" button — don't just show error
            }
            BIOMETRIC_ERROR_LOCKOUT_PERMANENT -> {
                // Requires device PIN. Cannot be unlocked programmatically.
                // Offer password-based app login as parallel path
                showMessage("Biometric locked. Log in with your password.")
                // Log this event — spike in LOCKOUT_PERMANENT is an
                // indicator of a targeted brute-force campaign on physical devices
            }
            BIOMETRIC_ERROR_NO_BIOMETRICS -> {
                // User removed all enrolled biometrics since last session
                // Treat same as KeyPermanentlyInvalidatedException path
            }
        }
    }

// ─────────────────────────────────────────────────────────────────────
// FAILURE MODE 3: Template aging / demographic degradation
// ─────────────────────────────────────────────────────────────────────

SYMPTOM:
    FRR rises gradually over months. Support tickets:
    "My fingerprint doesn't work anymore" — user hasn't changed anything.
    Affects specific cohorts disproportionately: manual workers,
    elderly users, users with skin conditions.

CAUSE:
    Enrolled template no longer matches aged/changed biometric closely enough
    to exceed the matching threshold. This is a physics problem, not a bug.

MITIGATION:
    // Monitor per-cohort FRR in your auth analytics pipeline
    // Alert threshold: FRR rising above 2% in any user segment

    // Build proactive re-enrollment:
    IF (daysSinceEnrollment > 365) AND (recentAuthSuccessRate < 0.95):
        PROMPT user on next successful login:
        "Update your biometric for better reliability" (not security framing)
        // Users accept re-enrollment when framed as convenience, not security

    // On-device matching note: you can't query match scores directly from
    // Android BiometricPrompt — it's binary success/failure.
    // Infer quality from FRR trends in your server-side auth event log.

// ─────────────────────────────────────────────────────────────────────
// FAILURE MODE 4: Weak fallback path undermining biometric security
// ─────────────────────────────────────────────────────────────────────

SYMPTOM (SECURITY):
    Biometric auth is bypassed entirely by attackers who target the fallback.
    App has Face ID → fallback is SMS OTP → SIM swap = full account takeover.
    The biometric layer added zero net security.

ANTI-PATTERN:
    Biometric → SMS OTP fallback       (SIM swap defeats it)
    Biometric → 4-digit PIN fallback   (brute-forceable offline if device stolen)
    Biometric → Security question      (social engineering / data breach defeats it)

CORRECT PATTERN:
    Biometric → Device PIN/Password fallback (OS-enforced, hardware-encrypted)
    WHY: If an attacker has the device and the PIN, they already have physical
         possession. The threat model at that point is device loss, not remote
         compromise. Device-level encryption handles that — not your app.

    For step-up actions (transfers > $5,000, change of registered device):
    NEVER accept biometric OR fallback alone.
    REQUIRE: biometric + separate OTP to registered email or authenticator app.
    Make fallback paths trigger step-up re-verification, not bypass it.
Output
FAILURE MODE CATALOGUE — Mitigation Status
KeyPermanentlyInvalidatedException → HANDLED: key revoked, re-enroll flow
BiometricPrompt.ERROR_LOCKOUT → HANDLED: PIN fallback presented
BiometricPrompt.ERROR_LOCKOUT_PERM → HANDLED: password login offered, event logged
Template aging / FRR drift → MITIGATED: re-enrollment prompt at 365d
Weak fallback (SMS OTP) → BLOCKED: device PIN only as fallback
Step-up bypass via fallback → BLOCKED: high-value actions require 2FA regardless
AUTH ANALYTICS ALERT RULES:
FRR > 2% in any cohort → PagerDuty: P2 (likely template aging)
ERROR_LOCKOUT_PERMANENT spike → PagerDuty: P1 (possible brute-force campaign)
Attestation failures > 0.1% → PagerDuty: P1 (rooted device activity)
Auth Event Logging as Your FRR Sensor
Since Android BiometricPrompt doesn't expose match scores, you can't directly measure FRR per user. Instead, log every auth attempt with outcome, userId, deviceId, and timestamp on your server. A user who succeeds once after three app-level cancellations (which often happen after silent biometric failures the user dismissed) is showing you a soft FRR signal. Segment this by account age, device model, and geography — you'll find your template aging hotspots before they become support ticket tsunamis.
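That cohort segmentation is straightforward to compute from the server-side auth event log. A Python sketch with invented cohort names, using the 2% alert threshold from the rules above:

```python
from collections import defaultdict

def soft_frr_by_cohort(events):
    """events: iterable of (cohort, success) auth attempts from the server log.
    Soft FRR = failed attempts / total attempts, per cohort."""
    totals, fails = defaultdict(int), defaultdict(int)
    for cohort, success in events:
        totals[cohort] += 1
        if not success:
            fails[cohort] += 1
    return {c: fails[c] / totals[c] for c in totals}

# Invented log slice: cohort keyed by account age (segment however fits your data)
log = [("nurses_18mo", False), ("nurses_18mo", False), ("nurses_18mo", True),
       ("new_users", True), ("new_users", True), ("new_users", True)]

rates = soft_frr_by_cohort(log)
alerts = sorted(c for c, r in rates.items() if r > 0.02)  # 2% alert rule

assert rates["nurses_18mo"] == 2 / 3
assert alerts == ["nurses_18mo"]   # only the aging-template cohort fires
```

In practice you would key cohorts by enrollment age, device model, and geography, and feed the alert list into the same pager rules as the Output panel above.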
Production Insight
A healthcare client had 12% FRR among nurses after 18 months. The issue wasn't device quality — fingerprints degrade with frequent handwashing and age.
The fix wasn't algorithm tuning. It was building a re-enrollment prompt at 12 months.
Rule: Monitor FRR per cohort. Build automatic re-enrollment nudges. Frame as 'improve reliability' not 're-secure your account'.
Key Takeaway
Template aging is inevitable. Build periodic re-enrollment into your UX.
Fallback paths must be as strong as the primary biometric. SMS OTP as fallback = SIM swap defeats Face ID.
On-device matching = no GDPR/BIPA template storage liability. Server-side matching = you own the breach risk.
● Production incident · POST-MORTEM · severity: high

The Face ID That Accepted a Screenshot

Symptom
Android users reported unauthorised transactions. The app's authentication logs showed successful biometric logins at times when users were asleep. No device rooting detected. The attacker did have physical access to the device — but never needed the user's actual face. The attack vector was simpler: the user's social media photos were enough.
Assumption
The team assumed Android's face authentication was as secure as Apple's Face ID. They didn't know Android has two face authentication APIs: BIOMETRIC_STRONG (requires hardware-backed depth sensor, similar to Face ID) and BIOMETRIC_WEAK (camera-only, spoofable with a photo). They used the easier-to-implement weak API.
Root cause
The Android BiometricPrompt configuration used BIOMETRIC_WEAK instead of BIOMETRIC_STRONG. The canAuthenticate() check passed on devices without dedicated face detection hardware. Camera-based face unlock performs no liveness detection — a printed photo held in front of the camera triggers the same authentication event as a real face. The app also did not verify the authentication result type in the BiometricPrompt callback: it accepted both AUTHENTICATION_RESULT_TYPE_BIOMETRIC and AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL (PIN/password fallback) without distinction, so an attacker with the device PIN could bypass face unlock entirely. Worse, the public key enrollment skipped hardware attestation verification. A rooted device could generate a software key, claim it was hardware-backed, and the server stored it. The entire biometric security property collapsed to "is there a private key somewhere on this device?"
Fix
1. Changed Android canAuthenticate() to check for BIOMETRIC_STRONG exclusively. Devices without hardware-backed biometrics (most cheap Android phones with camera-only face unlock) no longer see the biometric option.
2. Added setAllowedAuthenticators(BIOMETRIC_STRONG | DEVICE_CREDENTIAL) — device PIN allowed only as an explicit fallback, never presented as a biometric alternative.
3. Verified authentication result type: onAuthenticationSucceeded now checks result.getAuthenticationType() == AUTHENTICATION_RESULT_TYPE_BIOMETRIC and rejects AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL for biometric-only paths.
4. Enforced hardware attestation verification server-side: parse attestationObject, verify the certificate chain to Google's/Apple's root CA, check the KeyDescription extension for TRUSTED_ENVIRONMENT or STRONG_BOX. Reject any key without this.
5. For iOS, enforced the keychain access-control flag .biometryCurrentSet, not .biometryAny: .biometryCurrentSet invalidates the key when the enrolled biometric set changes, whereas .biometryAny keeps working even after an attacker who knows the passcode enrols their own fingerprint.
6. Added server-side logging of authentication type (biometric vs device credential) for fraud detection.
Key lesson
  • BIOMETRIC_STRONG vs BIOMETRIC_WEAK on Android is the difference between depth-sensor and camera-only. Use STRONG for finance.
  • Always verify authentication type in BiometricPrompt callback. Distinguish between fingerprint/face and PIN/password fallback.
  • Hardware attestation is non-negotiable. Verify the attestationObject certificate chain server-side. Reject software-backed keys.
  • User education: inform users that camera-only face unlock is not secure for financial apps. Better: don't offer it at all.
  • Test with a printed photo before shipping to production. If it works, your biometric auth is broken.
Production Debug Guide — Symptom → Action mapping for common biometric failure modes in production mobile apps. 5 entries.
Symptom · 01
Android BiometricPrompt always returns ERROR_NO_BIOMETRICS even when user has fingerprint enrolled
Fix
Check canAuthenticate() flags. Are you using BIOMETRIC_STRONG but device has only BIOMETRIC_WEAK (camera face unlock)? Many mid-range Android phones lack hardware-backed sensors. Fall back to device PIN gracefully.
Symptom · 02
FIDO2 authentication fails with 'signCount <= storedSignCount' error on second device
Fix
This is expected for synced passkeys (iCloud Keychain, Google Password Manager). The spec sets signCount = 0 for synced credentials. Implement: if receivedSignCount == 0, skip the clone check. Only trigger the alarm when receivedSignCount > 0 AND receivedSignCount <= the stored count.
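That branch logic fits in one pure function. A sketch — class and method names are mine, not from any WebAuthn library:

```java
public class SignCountCheck {
    public enum Verdict { OK, POSSIBLE_CLONE }

    /**
     * FIDO2 clone-detection rule with the synced-passkey carve-out:
     * signCount == 0 is spec-legal for synced credentials, so the
     * backwards-counter comparison only runs when the authenticator
     * actually maintains a counter.
     */
    public static Verdict check(long receivedSignCount, long storedSignCount) {
        if (receivedSignCount == 0) {
            return Verdict.OK; // synced passkey: counter not maintained
        }
        return (receivedSignCount <= storedSignCount)
                ? Verdict.POSSIBLE_CLONE // counter repeated or went backwards
                : Verdict.OK;            // persist receivedSignCount on success
    }
}
```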
Symptom · 03
Users can't re-enrol after adding a new fingerprint — KeyPermanentlyInvalidatedException
Fix
This is correct behaviour — the key was deleted when the biometric set changed. Your app must catch this, revoke the server-side public key, and prompt for password re-authentication before allowing re-enrolment. Don't show a generic error.
Symptom · 04
Server logs show BIOMETRIC_WEAK authentications from iOS devices
Fix
BiometricPrompt is an Android API — iOS has no WEAK/STRONG split, and Face ID/Touch ID are always hardware-backed. If your server tags iOS logins as BIOMETRIC_WEAK, the label is coming from your own client code. Check the LocalAuthentication policy: .deviceOwnerAuthentication allows passcode fallback, .deviceOwnerAuthenticationWithBiometrics does not. Ensure you're not inadvertently logging a passcode success as biometric.
Symptom · 05
WebAuthn registration works, but authentication fails with 'Invalid state'
Fix
Check if userVerification: 'required' is set consistently in both registration and authentication options. If registration used 'preferred' but authentication uses 'required', the authenticator may not have stored user verification data. Match them exactly.
★ Biometric Debug Cheat Sheet — Fast diagnostics for biometric authentication failures in production mobile apps.
Android biometric not working — ERROR_NO_BIOMETRICS
Immediate action
Check which authenticator type you're requiring
Commands
androidx.biometric.BiometricManager.from(context).canAuthenticate(BIOMETRIC_STRONG)
adb logcat | grep -i 'biometric'
Fix now
Change to BIOMETRIC_STRONG | DEVICE_CREDENTIAL for fallback, or handle gracefully with PIN-only fallback.
FIDO2 signCount clone detection false positive on second device
Immediate action
Check if credential is synced via cloud (iCloud/Google)
Commands
grep -n 'signCount' WebAuthnRegistrationResponse.java
grep -n 'BackupState' AuthenticatorData.java
Fix now
Implement: if receivedSignCount == 0, skip clone detection. This is spec-compliant for synced passkeys.
Biometric prompt shows, but onAuthenticationSucceeded called instantly without fingerprint
Immediate action
Check if you're accepting AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL results
Commands
BiometricPrompt.AuthenticationResult.getAuthenticationType()
Log.d(TAG, "auth type: " + result.getAuthenticationType())
Fix now
Filter out device-credential results for biometric-only flows: if (result.getAuthenticationType() != AUTHENTICATION_RESULT_TYPE_BIOMETRIC) { reject(); }
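The gate itself can be modelled as plain Java. The two constants below are local stand-ins mirroring androidx BiometricPrompt's AUTHENTICATION_RESULT_TYPE_* values — treat the exact integers as an assumption and use the real constants in app code:

```java
public class AuthTypeGate {
    // Illustrative stand-ins for BiometricPrompt's result-type constants.
    public static final int AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL = 1;
    public static final int AUTHENTICATION_RESULT_TYPE_BIOMETRIC = 2;

    /**
     * Biometric-only flows must reject anything that is not a genuine
     * biometric match — including a successful PIN/password fallback.
     */
    public static boolean acceptForBiometricOnlyFlow(int authenticationType) {
        return authenticationType == AUTHENTICATION_RESULT_TYPE_BIOMETRIC;
    }
}
```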
WebAuthn userVerification: 'preferred' not working as expected
Immediate action
Check that 'preferred' is not silently dropping UV requirement
Commands
grep -n 'userVerification' WebAuthnRequestOptions.java
Check authenticatorData.flags for UV bit (bit 2)
Fix now
Change to userVerification: 'required' for any financial auth flow. 'preferred' is not sufficient.
Hardware attestation failing on Pixel 6+
Immediate action
Check if you're parsing the attestation certificate chain correctly
Commands
openssl x509 -in attestation_cert.pem -text -noout
grep -n '1.3.6.1.4.1.11129.2.1.17' attestation_extension.txt
Fix now
Verify the KeyDescription extension OID. Android 12+ uses different attestation format. Use a well-maintained library (webauthn4j, py_webauthn), don't parse manually.
On-Device vs Server-Side Biometric Matching
| Attribute | On-Device Matching (FIDO2/TEE) | Server-Side Matching (Cloud Biometric API) |
| --- | --- | --- |
| Template storage location | Device hardware (TEE/Secure Enclave) — never leaves | Vendor cloud or your own database — you own the breach risk |
| GDPR/BIPA compliance complexity | Low — no biometric data transmitted or stored server-side | High — explicit legal basis, DPA, retention policy required per jurisdiction |
| FAR/FRR tuning control | None — fixed by OS vendor (Apple/Google) | Full control — tune threshold per user cohort |
| Phishing resistance | Native (FIDO2 origin binding) | None — depends entirely on transport security |
| Offline capability | Full — matching is local | None — requires network round-trip |
| Infrastructure cost | Zero matching infra — outsourced to device OS | Significant — GPU inference, model serving, latency SLA |
| Liveness detection quality | High (Apple Face ID: structured light depth map) | Varies — depends on SDK vendor and model version |
| Cross-device portability | Passkeys: yes (iCloud/Google sync). Bound keys: no | Yes — template enrolled once, match from any device |
| Cloning / template theft impact | Private key in hardware — cryptographically non-exportable | Template leak enables synthetic recreation of biometric |
| Vendor lock-in | Low — FIDO2 is open standard | High — proprietary template formats, API contracts |
| Suitable for regulated finance | Yes — preferred architecture | Only if server-side matching is legally mandated (rare) |

Key takeaways

1
Biometrics don't authenticate a person
they authenticate that a specific physical trait was present on a specific enrolled device. That distinction matters enormously for threat modelling: remote credential theft becomes much harder, but physical device compromise becomes the new attack surface.
2
The threshold decision (FAR vs FRR) is a business decision disguised as a technical one
at 4M users, a 0.1% FRR increase translates to 4,000 failed logins per day and potentially $40K/day in call centre costs. Run the numbers before product tells you to 'just make it stricter'.
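The arithmetic behind that figure, as a sketch. The $10-per-call cost is an assumption implied by the article's $40K/day number, and the model assumes every user attempts a login daily and every failure becomes a call — a worst case:

```java
public class FrrCost {
    /** Failed logins per day for a given active-user count and FRR. */
    public static long failedLoginsPerDay(long dailyActiveUsers, double frr) {
        return Math.round(dailyActiveUsers * frr);
    }

    /** Daily support cost if every failed login becomes a call. Worst case. */
    public static long dailyCostUsd(long failedLogins, long costPerCallUsd) {
        return failedLogins * costPerCallUsd;
    }
}
```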
3
Use FIDO2/WebAuthn with on-device matching for any new system
you get phishing resistance, zero template storage liability, and GDPR/BIPA compliance essentially for free. Server-side biometric matching is only worth the operational and legal complexity if you have a specific cross-device use case that genuinely cannot be served any other way.
4
The counterintuitive truth that separates engineers who've shipped biometrics from those who've only read about them
the biometric match itself is rarely the weakest link. The fallback path, the recovery flow, and the enrollment security gate are almost always where the real vulnerabilities live.

Common mistakes to avoid

5 patterns
×

Using BIOMETRIC_WEAK instead of BIOMETRIC_STRONG for financial auth on Android

Symptom
Users authenticate successfully with camera-based face unlock (no depth sensor), which can be spoofed with a printed photo. Attacker who has a photo of the user's face can unlock the app.
Fix
Always pass BiometricManager.Authenticators.BIOMETRIC_STRONG to canAuthenticate() and KeyGenParameterSpec.setUserAuthenticationParameters(). Verify BiometricPrompt returns AUTHENTICATION_RESULT_TYPE_BIOMETRIC, not AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL.
×

Storing the biometric enrollment public key without verifying hardware attestation

Symptom
Rooted devices can enroll with a software-generated key, bypassing all hardware security guarantees. The server accepts the key and treats it as hardware-backed biometric.
Fix
Parse the attestationObject, walk the certificate chain to Google's root CA (or Apple's), and check KeyDescription extension OID 1.3.6.1.4.1.11129.2.1.17 confirms KEY_MASTER_SECURITY_LEVEL == TRUSTED_ENVIRONMENT or STRONG_BOX. Reject registrations that fail this check.
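The accept/reject decision at the end of that chain walk reduces to pure logic. The numeric levels below mirror Keymaster's SecurityLevel values (0 = Software, 1 = TrustedEnvironment, 2 = StrongBox) — treat the exact integers as an assumption and take them from a parsing library (webauthn4j, py_webauthn) in real code:

```java
public class AttestationGate {
    // Mirrors Keymaster SecurityLevel values (assumed; verify via your parser).
    public static final int SECURITY_LEVEL_SOFTWARE = 0;
    public static final int SECURITY_LEVEL_TRUSTED_ENVIRONMENT = 1;
    public static final int SECURITY_LEVEL_STRONGBOX = 2;

    /**
     * A registration is accepted only if the certificate chain verified to
     * the vendor root AND the KeyDescription extension reports hardware
     * backing. Software-level keys (rooted devices) are rejected.
     */
    public static boolean acceptEnrollment(boolean chainVerifiedToVendorRoot,
                                           int keymasterSecurityLevel) {
        if (!chainVerifiedToVendorRoot) return false;
        return keymasterSecurityLevel == SECURITY_LEVEL_TRUSTED_ENVIRONMENT
            || keymasterSecurityLevel == SECURITY_LEVEL_STRONGBOX;
    }
}
```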
×

Setting userVerification: 'preferred' instead of 'required' in WebAuthn

Symptom
On a device with no biometric enrolled, the browser silently skips user verification, sets the UV flag to 0, and the server accepts the assertion as a valid biometric auth. Attacker needs only the device password, not a fingerprint.
Fix
Always set userVerification: 'required' in PublicKeyCredentialRequestOptions. Verify the UV bit (bit 2) in authenticatorData.flags is set to 1 server-side. Throw an AuthenticationException if UV is not set, regardless of what the client claims.
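The server-side UV check is one line of bit arithmetic over the flags byte of authenticatorData: per the WebAuthn spec, bit 0 is UP (user present) and bit 2 is UV (user verified), so the UV mask is 0x04. A minimal sketch:

```java
public class UvFlagCheck {
    // WebAuthn authenticatorData flags byte:
    //   bit 0 (0x01) = UP, user present
    //   bit 2 (0x04) = UV, user verified
    private static final int UV_MASK = 0x04;

    /** True only if the authenticator asserts the user was actually verified. */
    public static boolean userVerified(byte flags) {
        return (flags & UV_MASK) != 0;
    }
}
```

If this returns false on a flow that demanded `userVerification: 'required'`, reject the assertion regardless of what the client claims.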
×

Not handling KeyPermanentlyInvalidatedException — app crash or infinite loop

Symptom
User adds or removes a fingerprint. Next login attempt crashes with uncaught exception, or app enters infinite re-enrollment loop without password re-authentication.
Fix
Catch KeyPermanentlyInvalidatedException, revoke the public key server-side, require password re-authentication, then allow fresh enrollment. Never auto-re-enrol without password verification.
×

Using SMS OTP as biometric fallback — SIM swap attack

Symptom
App has Face ID enabled. Fallback is SMS OTP. Attacker performs SIM swap (social engineering mobile carrier), resets password, receives OTP, and gains full account access. Biometric layer contributed zero security.
Fix
Fallback must be device PIN/password, not SMS OTP. Device PIN is hardware-encrypted on the device; attacker needs physical possession AND the PIN to bypass biometrics. For high-value actions, require biometric + separate OTP to authenticator app.
INTERVIEW PREP · PRACTICE MODE

Interview Questions on This Topic

Q01 · SENIOR
Walk me through what happens cryptographically when a FIDO2 credential i...
Q02 · SENIOR
You're designing authentication for a mobile banking app with 5M users —...
Q03 · SENIOR
A user reports they can no longer authenticate with their fingerprint af...
Q01 of 03 · SENIOR

Walk me through what happens cryptographically when a FIDO2 credential is used on two different devices simultaneously — specifically, how does the signCount mechanism detect cloning, and what are its limitations when the authenticator uses synced passkeys via iCloud Keychain?

ANSWER
During FIDO2 authentication, the authenticator increments a monotonic counter (signCount) stored in its hardware and includes it in the signed authenticatorData. The server stores the last known signCount. On each authentication, if the received signCount ≤ stored signCount, the server detects a potential clone — the private key was used on two devices without the counter increasing properly. For synced passkeys (iCloud Keychain, Google Password Manager), the FIDO2 spec explicitly allows signCount = 0, because the counter cannot be maintained consistently across multiple cloud-synced devices. The limitation is that clone detection becomes impossible — a cloned credential will also present signCount = 0, indistinguishable from a legitimate sync. Production implementation: if receivedSignCount == 0, skip the clone check entirely — this is spec-compliant. Only trigger the cloning alarm when signCount > 0 AND the received value is ≤ the stored value. This means synced passkeys lose clone detection. For high-security applications, steer toward device-bound credentials: require attestation and inspect the BE/BS (backup eligible / backup state) flags in authenticatorData — no WebAuthn option outright forbids syncing, so hints like residentKey: 'discouraged' only reduce its likelihood.
FAQ · 4 QUESTIONS

Frequently Asked Questions

01
Can biometric data be stolen in a data breach?
02
What's the difference between biometric authentication and biometric identification?
03
How do I handle biometric authentication on Android devices that don't have a fingerprint sensor?
04
What happens to FIDO2 passkeys when the signCount mechanism breaks for synced credentials — and how do you handle it without locking users out?