
VeraSnap: Building a Cryptographic Evidence Capture App for Android with Kotlin, CameraX, and Hardware-Backed Security

In an era of deepfakes and AI-generated imagery, proving that a photo was actually taken at a specific moment in time has become a genuinely hard problem. Self-signed timestamps are trivially forgeable. Metadata can be stripped or modified. Even blockchain-based solutions often rely on the very self-attestation they claim to replace.

We've been building VeraSnap for Android—an implementation of the Content Provenance Protocol (CPP) that takes a fundamentally different approach. Instead of trusting the device or the app developer, we require independent third-party verification through RFC 3161 Time Stamp Authorities. Every capture gets a cryptographic proof that can be verified offline, years later, without depending on any server being online.

This post walks through the key technical decisions we made, the Android-specific challenges we solved, and code examples you can adapt for your own projects.

The Tech Stack

Before diving into specifics, here's what we're working with:

// build.gradle.kts
android {
    compileSdk = 34
    defaultConfig {
        minSdk = 26  // Android 8.0+
        targetSdk = 34
    }
}

dependencies {
    // Core
    implementation("androidx.core:core-ktx:1.12.0")
    implementation("androidx.lifecycle:lifecycle-runtime-ktx:2.7.0")

    // Compose + Material 3
    implementation(platform("androidx.compose:compose-bom:2024.01.00"))
    implementation("androidx.compose.material3:material3")
    implementation("androidx.navigation:navigation-compose:2.7.6")

    // CameraX
    implementation("androidx.camera:camera-core:1.3.1")
    implementation("androidx.camera:camera-camera2:1.3.1")
    implementation("androidx.camera:camera-lifecycle:1.3.1")
    implementation("androidx.camera:camera-view:1.3.1")

    // Room + Hilt
    implementation("androidx.room:room-ktx:2.6.1")
    implementation("com.google.dagger:hilt-android:2.50")

    // Biometric
    implementation("androidx.biometric:biometric:1.1.0")

    // Crypto (ASN.1 parsing for TSA responses)
    implementation("org.bouncycastle:bcprov-jdk18on:1.77")
    implementation("org.bouncycastle:bcpkix-jdk18on:1.77")
}

The architecture follows MVVM with Clean Architecture principles. Hilt handles dependency injection. Kotlin Coroutines and Flow manage async operations. Room persists events locally.
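
As a rough illustration of that wiring, here is a minimal Hilt module sketch. The module name and the OkHttp provider are assumptions for this post, not VeraSnap's actual module; VeraSnapDatabase and EventDao are defined later in this article:

@Module
@InstallIn(SingletonComponent::class)
object AppModule {

    @Provides
    @Singleton
    fun provideDatabase(@ApplicationContext context: Context): VeraSnapDatabase =
        Room.databaseBuilder(context, VeraSnapDatabase::class.java, VeraSnapDatabase.DATABASE_NAME)
            .build()

    @Provides
    fun provideEventDao(db: VeraSnapDatabase): EventDao = db.eventDao()

    @Provides
    @Singleton
    fun provideOkHttpClient(): OkHttpClient = OkHttpClient.Builder()
        .callTimeout(30, TimeUnit.SECONDS)
        .build()
}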

Why Not Just Sign With a Device Key?

The naive approach to media provenance looks something like this:

// DON'T DO THIS - self-attestation is meaningless
fun captureWithSelfAttestation(imageBytes: ByteArray): Proof {
    val timestamp = System.currentTimeMillis()  // Trust me, bro
    val hash = sha256(imageBytes)
    val signature = sign(hash + timestamp)
    return Proof(hash, timestamp, signature)
}

The problem? Anyone controlling the device can set any timestamp they want. The signature only proves that someone with access to the key signed something. It says nothing about when that signing actually occurred.

CPP solves this by requiring an external Time Stamp Authority (TSA) to countersign every proof. The TSA is an independent third party—not the app developer, not the device manufacturer, not the user. When they countersign your hash, they're saying "this hash existed at exactly this moment according to our certified clock."

The CPP Event Model

Every capture in VeraSnap produces a CPP Event—a JSON structure containing everything needed for independent verification:

@Serializable
data class CPPEvent(
    @SerialName("EventID")
    val eventId: String,  // UUIDv7 (timestamp-sortable)

    @SerialName("ChainID") 
    val chainId: String,

    @SerialName("EventType")
    val eventType: String,  // "INGEST", "EXPORT", "TOMBSTONE"

    @SerialName("Timestamp")
    val timestamp: String,  // ISO 8601

    @SerialName("PrevHash")
    val prevHash: String,  // Links to previous event

    @SerialName("Asset")
    val asset: Asset,

    @SerialName("CaptureContext")
    val captureContext: CaptureContext,

    @SerialName("EventHash")
    val eventHash: String,  // SHA-256 of canonical JSON

    @SerialName("Signature")
    val signature: String   // ES256 (ECDSA P-256)
)

@Serializable
data class Asset(
    @SerialName("ContentHash")
    val contentHash: String,  // sha256:hexstring

    @SerialName("MediaType")
    val mediaType: String,    // image/jpeg, video/mp4

    @SerialName("Resolution")
    val resolution: Resolution,

    @SerialName("SizeBytes")
    val sizeBytes: Long
)

@Serializable
data class CaptureContext(
    @SerialName("DeviceInfo")
    val deviceInfo: DeviceInfo,

    @SerialName("Location")
    val location: Location?,  // Privacy: hashed, opt-in only

    @SerialName("SecurityModule")
    val securityModule: SecurityModule,

    @SerialName("DepthAnalysis")
    val depthAnalysis: DepthAnalysis?,

    @SerialName("KeyAttestation")
    val keyAttestation: KeyAttestation
)

The PrevHash field creates a hash chain—each event links to the previous one. If someone tries to delete an event from the middle of the chain, the verification breaks. This is the "Completeness Invariant" that makes CPP different from edit-history tracking.
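
Verifying the chain is just a walk over consecutive events. A minimal sketch, assuming events are ordered oldest first and the chain starts from a known genesis hash:

fun verifyChain(events: List<CPPEvent>, genesisHash: String): Boolean {
    var expectedPrev = genesisHash
    for (event in events) {
        // A deleted, inserted, or reordered event breaks the link right here
        if (event.prevHash != expectedPrev) return false
        expectedPrev = event.eventHash
    }
    return true
}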

Hardware-Backed Key Storage

Android offers multiple levels of key security. We want to use the strongest available on each device:

@Singleton
class KeystoreManager @Inject constructor(
    @ApplicationContext private val context: Context
) {
    companion object {
        const val KEY_ALIAS = "verasnap_signing_key"
    }

    private val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }

    fun getOrCreateKeyPair(): KeyPair {
        if (!keyStore.containsAlias(KEY_ALIAS)) {
            generateKeyPair()
        }

        val privateKey = keyStore.getKey(KEY_ALIAS, null) as PrivateKey
        val publicKey = keyStore.getCertificate(KEY_ALIAS).publicKey
        return KeyPair(publicKey, privateKey)
    }

    private fun generateKeyPair() {
        // StrongBoxUnavailableException is thrown when the key is generated,
        // not when the builder flag is set, so try StrongBox first and retry in the TEE.
        val tryStrongBox = Build.VERSION.SDK_INT >= Build.VERSION_CODES.P
        try {
            generateKeyPair(useStrongBox = tryStrongBox)
        } catch (e: StrongBoxUnavailableException) {
            generateKeyPair(useStrongBox = false)  // Fall back to TEE
        }
    }

    private fun generateKeyPair(useStrongBox: Boolean) {
        val parameterSpec = KeyGenParameterSpec.Builder(
            KEY_ALIAS,
            KeyProperties.PURPOSE_SIGN or KeyProperties.PURPOSE_VERIFY
        ).apply {
            setDigests(KeyProperties.DIGEST_SHA256)
            setAlgorithmParameterSpec(ECGenParameterSpec("secp256r1"))

            // Require user authentication for attested capture
            setUserAuthenticationRequired(false)  // Set true for Gold conformance

            // Use StrongBox if available (Titan M, Knox, etc.)
            if (useStrongBox && Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
                setIsStrongBoxBacked(true)
            }

            // Enable key attestation
            setAttestationChallenge(generateChallenge())
        }.build()

        val keyPairGenerator = KeyPairGenerator.getInstance(
            KeyProperties.KEY_ALGORITHM_EC,
            "AndroidKeyStore"
        )
        keyPairGenerator.initialize(parameterSpec)
        keyPairGenerator.generateKeyPair()
    }

    fun getSecurityModule(): SecurityModule {
        val factory = KeyFactory.getInstance(
            keyStore.getKey(KEY_ALIAS, null).algorithm,
            "AndroidKeyStore"
        )
        val keyInfo = factory.getKeySpec(
            keyStore.getKey(KEY_ALIAS, null),
            KeyInfo::class.java
        )

        val securityLevel = when {
            Build.VERSION.SDK_INT >= Build.VERSION_CODES.S -> {
                when (keyInfo.securityLevel) {
                    KeyProperties.SECURITY_LEVEL_STRONGBOX -> "StrongBox"
                    KeyProperties.SECURITY_LEVEL_TRUSTED_ENVIRONMENT -> "TEE"
                    else -> "Software"
                }
            }
            keyInfo.isInsideSecureHardware -> "TEE"
            else -> "Software"
        }

        return SecurityModule(
            type = securityLevel,
            version = Build.VERSION.SECURITY_PATCH,
            manufacturer = detectSecurityChip()
        )
    }

    private fun detectSecurityChip(): String = when {
        Build.MANUFACTURER.equals("Google", ignoreCase = true) -> "Titan M"
        Build.MANUFACTURER.equals("Samsung", ignoreCase = true) -> "Knox"
        else -> Build.MANUFACTURER
    }

    private fun generateChallenge(): ByteArray {
        return ByteArray(32).also { SecureRandom().nextBytes(it) }
    }
}

The hierarchy of security is: StrongBox (dedicated hardware) > TEE (isolated processor area) > Software. We detect what's available and use the best option. The proof metadata includes this information so verifiers know the assurance level.

Computing Deterministic Hashes

CPP requires deterministic hashing—the same event must always produce the same hash. This is trickier than it sounds because JSON serialization order isn't guaranteed. We use RFC 8785 (JSON Canonicalization Scheme):

@Singleton
class CryptoService @Inject constructor(
    @ApplicationContext private val context: Context,
    private val keystoreManager: KeystoreManager
) {
    private val json = Json {
        encodeDefaults = true
        // kotlinx.serialization preserves declaration order
        // which gives us deterministic output
    }

    fun computeEventHash(event: CPPEvent): String {
        // Create a copy without EventHash and Signature (those get computed)
        val hashableEvent = event.copy(
            eventHash = "",
            signature = ""
        )

        // Serialize to canonical JSON
        val canonicalJson = json.encodeToString(hashableEvent)

        // SHA-256
        val digest = MessageDigest.getInstance("SHA-256")
        val hashBytes = digest.digest(canonicalJson.toByteArray(Charsets.UTF_8))

        return "sha256:" + hashBytes.toHexString()
    }

    fun computeContentHash(bytes: ByteArray): String {
        val digest = MessageDigest.getInstance("SHA-256")
        return "sha256:" + digest.digest(bytes).toHexString()
    }

    suspend fun sign(data: ByteArray): String = withContext(Dispatchers.Default) {
        val privateKey = keystoreManager.getOrCreateKeyPair().private

        val signature = Signature.getInstance("SHA256withECDSA").apply {
            initSign(privateKey)
            update(data)
        }

        Base64.encodeToString(signature.sign(), Base64.NO_WRAP)
    }

    private fun ByteArray.toHexString(): String = 
        joinToString("") { "%02x".format(it) }
}

Notice we're using kotlinx.serialization which preserves field declaration order. This is critical—different serialization orders would produce different hashes even for identical data.
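
If you need byte-for-byte compatibility with other RFC 8785 implementations (which sort object members by key), one option is a key-sorting pass over kotlinx.serialization's JSON tree. The canonicalize helper below is a sketch, not VeraSnap's actual canonicalizer, and it only handles member ordering, not JCS number formatting:

fun canonicalize(element: JsonElement): JsonElement = when (element) {
    is JsonObject -> JsonObject(
        element.entries
            .sortedBy { it.key }  // JCS orders object members by key
            .associate { (key, value) -> key to canonicalize(value) }
    )
    is JsonArray -> JsonArray(element.map { canonicalize(it) })
    else -> element  // primitives pass through unchanged
}

// Usage: Json.encodeToString(canonicalize(Json.encodeToJsonElement(hashableEvent)))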

CameraX Integration

CameraX abstracts away the fragmented Camera2 API mess. Here's our capture service:

@Singleton
class CameraService @Inject constructor(
    @ApplicationContext private val application: Context
) {
    private var cameraProvider: ProcessCameraProvider? = null
    private var imageCapture: ImageCapture? = null
    private var videoCapture: VideoCapture<Recorder>? = null

    private val _cameraState = MutableStateFlow<CameraState>(CameraState.Initializing)
    val cameraState: StateFlow<CameraState> = _cameraState.asStateFlow()

    sealed class CameraState {
        object Initializing : CameraState()
        object Ready : CameraState()
        data class Error(val message: String) : CameraState()
        object Recording : CameraState()
    }

    suspend fun initialize(
        lifecycleOwner: LifecycleOwner,
        previewView: PreviewView
    ) {
        cameraProvider = ProcessCameraProvider.getInstance(application).await()

        val preview = Preview.Builder()
            .setTargetAspectRatio(AspectRatio.RATIO_4_3)
            .build()
            .also { it.setSurfaceProvider(previewView.surfaceProvider) }

        imageCapture = ImageCapture.Builder()
            .setTargetAspectRatio(AspectRatio.RATIO_4_3)
            .setCaptureMode(ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY)
            .build()

        val recorder = Recorder.Builder()
            .setQualitySelector(QualitySelector.from(Quality.HD))
            .build()
        videoCapture = VideoCapture.withOutput(recorder)

        val cameraSelector = CameraSelector.Builder()
            .requireLensFacing(CameraSelector.LENS_FACING_BACK)
            .build()

        try {
            cameraProvider?.unbindAll()
            cameraProvider?.bindToLifecycle(
                lifecycleOwner,
                cameraSelector,
                preview,
                imageCapture,
                videoCapture
            )
            _cameraState.value = CameraState.Ready
        } catch (e: Exception) {
            _cameraState.value = CameraState.Error(e.message ?: "Camera init failed")
        }
    }

    suspend fun capturePhoto(): Result<CapturedMedia> = runCatching {
        val imageCapture = imageCapture 
            ?: throw IllegalStateException("Camera not initialized")

        val photoFile = createTempFile()
        val outputOptions = ImageCapture.OutputFileOptions.Builder(photoFile).build()

        suspendCancellableCoroutine { continuation ->
            imageCapture.takePicture(
                outputOptions,
                ContextCompat.getMainExecutor(application),
                object : ImageCapture.OnImageSavedCallback {
                    override fun onImageSaved(output: ImageCapture.OutputFileResults) {
                        continuation.resume(
                            CapturedMedia(
                                file = photoFile,
                                mediaType = "image/jpeg",
                                timestamp = Instant.now()
                            )
                        )
                    }

                    override fun onError(exception: ImageCaptureException) {
                        continuation.resumeWithException(exception)
                    }
                }
            )
        }
    }

    private fun createTempFile(): File {
        val timestamp = SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(Date())
        val storageDir = application.getExternalFilesDir(Environment.DIRECTORY_PICTURES)
        return File.createTempFile("VERASNAP_${timestamp}_", ".jpg", storageDir)
    }
}

data class CapturedMedia(
    val file: File,
    val mediaType: String,
    val timestamp: Instant
)
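
For context, here is roughly how this service gets wired into a Compose screen. The composable below is a sketch for this post, not VeraSnap's actual UI, and it assumes the camera permission has already been granted:

@Composable
fun CaptureScreen(cameraService: CameraService) {
    val context = LocalContext.current
    val lifecycleOwner = LocalLifecycleOwner.current
    val previewView = remember { PreviewView(context) }
    val scope = rememberCoroutineScope()

    // Bind the preview and capture use cases once the composable enters composition
    LaunchedEffect(Unit) {
        cameraService.initialize(lifecycleOwner, previewView)
    }

    Column {
        AndroidView(factory = { previewView }, modifier = Modifier.weight(1f))
        Button(onClick = {
            scope.launch {
                cameraService.capturePhoto()
                    .onSuccess { media -> Log.d("Capture", "Saved ${media.file.name}") }
                    .onFailure { e -> Log.e("Capture", "Capture failed", e) }
            }
        }) {
            Text("Take Photo")
        }
    }
}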

RFC 3161 Timestamping

The TSA integration is where the magic happens. We send a hash, they return a signed timestamp token:

@Singleton
class TSAService @Inject constructor(
    private val okHttpClient: OkHttpClient
) {
    companion object {
        const val DEFAULT_TSA_URL = "https://rfc3161.ai.moda"
    }

    suspend fun timestamp(
        hash: ByteArray,
        tsaUrl: String = DEFAULT_TSA_URL
    ): Result<TSAResponse> = withContext(Dispatchers.IO) {
        runCatching {
            // Build RFC 3161 TimeStampReq
            val request = buildTimestampRequest(hash)

            // Send to TSA
            val response = okHttpClient.newCall(
                Request.Builder()
                    .url(tsaUrl)
                    .post(request.toRequestBody("application/timestamp-query".toMediaType()))
                    .build()
            ).execute()

            if (!response.isSuccessful) {
                throw IOException("TSA returned ${response.code}")
            }

            // Parse RFC 3161 TimeStampResp
            parseTimestampResponse(response.body!!.bytes())
        }
    }

    private fun buildTimestampRequest(hash: ByteArray): ByteArray {
        // ASN.1 structure for TimeStampReq
        val messageImprint = MessageImprint(
            AlgorithmIdentifier(NISTObjectIdentifiers.id_sha256),
            hash
        )

        val tsReq = TimeStampReq(
            messageImprint,
            null,                                         // reqPolicy
            ASN1Integer(BigInteger(64, SecureRandom())),  // nonce
            ASN1Boolean.TRUE,                             // certReq: ask the TSA to include its certificate
            null                                          // extensions
        )

        return tsReq.encoded
    }

    private fun parseTimestampResponse(bytes: ByteArray): TSAResponse {
        // bcpkix's TSP API handles the ASN.1 plumbing for us
        val tsResp = TimeStampResponse(bytes)

        val status = tsResp.status  // 0 = granted, 1 = grantedWithMods
        if (status != 0 && status != 1) {
            throw TSAException("TSA rejected request: status=$status")
        }

        val token = tsResp.timeStampToken
            ?: throw TSAException("TSA response contained no timestamp token")
        val tstInfo = token.timeStampInfo

        return TSAResponse(
            token = Base64.encodeToString(token.encoded, Base64.NO_WRAP),
            timestamp = tstInfo.genTime.toInstant(),
            serialNumber = tstInfo.serialNumber,
            tsaName = tstInfo.tsa?.toString()
        )
    }
}

data class TSAResponse(
    val token: String,      // Base64-encoded DER
    val timestamp: Instant,
    val serialNumber: BigInteger,
    val tsaName: String?
)

class TSAException(message: String) : Exception(message)

The key insight: we're sending a hash of the content, not the content itself. The TSA never sees your photos. They just certify "this hash existed at this time."
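
Putting the crypto and TSA services together for a single event looks roughly like this. A sketch: anchorSingleEvent is a hypothetical helper, and hexToByteArray is the same extension used elsewhere in this post:

suspend fun anchorSingleEvent(event: CPPEvent, tsaService: TSAService): TSAResponse {
    // Only the 32-byte digest leaves the device; the photo itself never does
    val hashBytes = event.eventHash.removePrefix("sha256:").hexToByteArray()
    return tsaService.timestamp(hashBytes).getOrThrow()
}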

Merkle Tree Batching

For high-volume captures, getting a TSA timestamp for every single photo gets expensive. Merkle trees let us batch multiple captures under a single timestamp:

object MerkleTreeBuilder {

    fun build(leaves: List<String>): MerkleTree {
        require(leaves.isNotEmpty()) { "Cannot build tree from empty leaves" }

        // Sort leaves for determinism
        val sortedLeaves = leaves.sorted()

        // Compute leaf hashes: LeafHash = SHA256(EventHash)
        val leafHashes = sortedLeaves.map { leaf ->
            sha256(leaf.removePrefix("sha256:").hexToByteArray())
        }

        // Pad to power of 2 by duplicating last leaf
        val paddedLeaves = padToPowerOfTwo(leafHashes)

        // Build tree bottom-up
        val tree = mutableListOf<List<ByteArray>>()
        tree.add(paddedLeaves)

        var currentLevel = paddedLeaves
        while (currentLevel.size > 1) {
            val nextLevel = currentLevel.chunked(2).map { pair ->
                sha256(pair[0] + pair[1])  // Pairing: SHA256(Left || Right)
            }
            tree.add(nextLevel)
            currentLevel = nextLevel
        }

        return MerkleTree(
            root = "sha256:" + currentLevel[0].toHexString(),
            leaves = sortedLeaves,
            levels = tree
        )
    }

    fun generateProof(tree: MerkleTree, leafHash: String): MerkleProof {
        val leafIndex = tree.leaves.indexOf(leafHash)
        require(leafIndex >= 0) { "Leaf not found in tree" }

        val proof = mutableListOf<ProofElement>()
        var index = leafIndex

        for (level in 0 until tree.levels.size - 1) {
            val siblingIndex = if (index % 2 == 0) index + 1 else index - 1
            val sibling = tree.levels[level].getOrNull(siblingIndex) 
                ?: tree.levels[level][index]  // Handle padding

            proof.add(ProofElement(
                hash = "sha256:" + sibling.toHexString(),
                position = if (index % 2 == 0) "right" else "left"
            ))

            index /= 2
        }

        return MerkleProof(
            leafHash = leafHash,
            leafIndex = leafIndex,
            proof = proof,
            root = tree.root
        )
    }

    fun verifyProof(
        leafHash: String,
        proof: MerkleProof,
        expectedRoot: String
    ): Boolean {
        var hash = sha256(leafHash.removePrefix("sha256:").hexToByteArray())

        for (element in proof.proof) {
            val siblingHash = element.hash.removePrefix("sha256:").hexToByteArray()
            hash = if (element.position == "left") {
                sha256(siblingHash + hash)
            } else {
                sha256(hash + siblingHash)
            }
        }

        return "sha256:" + hash.toHexString() == expectedRoot
    }

    private fun sha256(data: ByteArray): ByteArray {
        return MessageDigest.getInstance("SHA-256").digest(data)
    }

    private fun padToPowerOfTwo(leaves: List<ByteArray>): List<ByteArray> {
        var size = 1
        while (size < leaves.size) size *= 2
        return leaves + List(size - leaves.size) { leaves.last() }
    }
}

data class MerkleTree(
    val root: String,
    val leaves: List<String>,
    val levels: List<List<ByteArray>>
)

data class MerkleProof(
    val leafHash: String,
    val leafIndex: Int,
    val proof: List<ProofElement>,
    val root: String
)

data class ProofElement(
    val hash: String,
    val position: String  // "left" or "right"
)

With this, we can batch 1000 captures, timestamp the single Merkle root, and still verify any individual capture independently.
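
The batching flow then reduces to one tree build and one TSA round trip. A sketch, assuming the event hashes of the pending captures are already loaded in memory:

suspend fun anchorBatch(
    eventHashes: List<String>,
    tsaService: TSAService
): Pair<MerkleTree, TSAResponse> {
    val tree = MerkleTreeBuilder.build(eventHashes)

    // One timestamp covers every leaf in the tree
    val rootBytes = tree.root.removePrefix("sha256:").hexToByteArray()
    val tsa = tsaService.timestamp(rootBytes).getOrThrow()

    return tree to tsa
}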

Depth Analysis for Screen Detection

One emerging attack vector: photographing another screen to create "authentic" captures of manipulated content. Depth sensors can help detect this:

class ScreenDetector @Inject constructor() {

    data class DepthAnalysisResult(
        val isLikelyScreen: Boolean,
        val confidence: Float,
        val depthVariance: Float,
        val flatSurfaceRatio: Float,
        val sensorType: String
    )

    fun analyze(
        depthData: FloatArray,
        width: Int,
        height: Int,
        sensorType: String
    ): DepthAnalysisResult {
        // Calculate depth statistics
        val validDepths = depthData.filter { it > 0 && it.isFinite() }
        if (validDepths.isEmpty()) {
            return DepthAnalysisResult(
                isLikelyScreen = false,
                confidence = 0f,
                depthVariance = Float.NaN,
                flatSurfaceRatio = 0f,
                sensorType = sensorType
            )
        }

        val mean = validDepths.average().toFloat()
        val variance = validDepths.map { (it - mean).pow(2) }.average().toFloat()
        val stdDev = sqrt(variance)

        // Screens have very low depth variance (everything at same distance)
        val flatThreshold = 0.05f  // 5cm
        val flatPixels = validDepths.count { abs(it - mean) < flatThreshold }
        val flatRatio = flatPixels.toFloat() / validDepths.size

        // Screen detection heuristics
        val isLikelyScreen = stdDev < 0.1f && flatRatio > 0.85f

        // Confidence based on how extreme the flatness is
        val confidence = when {
            flatRatio > 0.95f && stdDev < 0.05f -> 0.95f
            flatRatio > 0.90f && stdDev < 0.08f -> 0.80f
            flatRatio > 0.85f && stdDev < 0.10f -> 0.65f
            else -> 0.3f
        }

        return DepthAnalysisResult(
            isLikelyScreen = isLikelyScreen,
            confidence = if (isLikelyScreen) confidence else 1f - confidence,
            depthVariance = variance,
            flatSurfaceRatio = flatRatio,
            sensorType = sensorType
        )
    }
}

The depth analysis result goes into the proof metadata. It's informational—we don't block captures of screens, but verifiers can see the analysis.
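
A quick sanity check of the heuristic with synthetic data (a toy example, not production code): a nearly flat depth map at about 0.5 m should trip the screen heuristic.

val detector = ScreenDetector()

// Every pixel sits within ~1 mm of 0.5 m, i.e. a flat surface like a display
val flatDepthMap = FloatArray(320 * 240) { 0.50f + (it % 3) * 0.001f }

val result = detector.analyze(flatDepthMap, width = 320, height = 240, sensorType = "TOF")
println("likelyScreen=${result.isLikelyScreen}, confidence=${result.confidence}")
// Expected: likelyScreen=true with high confidence, since variance is tiny and flatRatio is ~1.0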

Biometric Human Presence Verification

For "Gold" conformance level, we require biometric authentication to prove a human was present:

@Singleton
class BiometricAuthService @Inject constructor(
    @ApplicationContext private val context: Context
) {
    private val biometricManager = BiometricManager.from(context)

    fun canAuthenticate(): BiometricCapability {
        return when (biometricManager.canAuthenticate(
            BiometricManager.Authenticators.BIOMETRIC_STRONG
        )) {
            BiometricManager.BIOMETRIC_SUCCESS -> BiometricCapability.Available
            BiometricManager.BIOMETRIC_ERROR_NO_HARDWARE -> BiometricCapability.NoHardware
            BiometricManager.BIOMETRIC_ERROR_HW_UNAVAILABLE -> BiometricCapability.Unavailable
            BiometricManager.BIOMETRIC_ERROR_NONE_ENROLLED -> BiometricCapability.NotEnrolled
            else -> BiometricCapability.Unknown
        }
    }

    suspend fun authenticate(
        activity: FragmentActivity,
        title: String,
        subtitle: String
    ): Result<BiometricResult> = suspendCancellableCoroutine { continuation ->

        val promptInfo = BiometricPrompt.PromptInfo.Builder()
            .setTitle(title)
            .setSubtitle(subtitle)
            .setAllowedAuthenticators(BiometricManager.Authenticators.BIOMETRIC_STRONG)
            .setNegativeButtonText(context.getString(android.R.string.cancel))
            .build()

        val callback = object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                val authenticationType = when (result.authenticationType) {
                    BiometricPrompt.AUTHENTICATION_RESULT_TYPE_BIOMETRIC -> "biometric"
                    BiometricPrompt.AUTHENTICATION_RESULT_TYPE_DEVICE_CREDENTIAL -> "device_credential"
                    else -> "unknown"
                }
                continuation.resume(Result.success(BiometricResult(
                    success = true,
                    authenticationType = authenticationType,
                    timestamp = Instant.now()
                )))
            }

            override fun onAuthenticationError(errorCode: Int, errString: CharSequence) {
                continuation.resume(Result.failure(
                    BiometricException(errorCode, errString.toString())
                ))
            }

            override fun onAuthenticationFailed() {
                // Don't fail yet—user can retry
            }
        }

        val prompt = BiometricPrompt(activity, ContextCompat.getMainExecutor(context), callback)
        prompt.authenticate(promptInfo)

        continuation.invokeOnCancellation { prompt.cancelAuthentication() }
    }
}

sealed class BiometricCapability {
    object Available : BiometricCapability()
    object NoHardware : BiometricCapability()
    object Unavailable : BiometricCapability()
    object NotEnrolled : BiometricCapability()
    object Unknown : BiometricCapability()
}

data class BiometricResult(
    val success: Boolean,
    val authenticationType: String,
    val timestamp: Instant
)

class BiometricException(val errorCode: Int, message: String) : Exception(message)

We only accept BIOMETRIC_STRONG authentication—fingerprint or face recognition with hardware-backed anti-spoofing. Weak biometrics or PIN fallback don't qualify for human presence verification.
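
A typical call site checks capability first and treats anything other than a successful strong-biometric result as "not verified". This helper is a sketch for illustration, not part of the app:

suspend fun verifyHumanPresence(
    activity: FragmentActivity,
    auth: BiometricAuthService
): Boolean {
    if (auth.canAuthenticate() != BiometricCapability.Available) return false

    return auth.authenticate(
        activity = activity,
        title = "Attested Capture",
        subtitle = "Verify your presence"
    ).map { it.success }.getOrDefault(false)
}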

Room Database Schema

All events persist locally with Room:

@Database(
    entities = [
        EventEntity::class,
        CaseEntity::class,
        AnchorEntity::class,
        TombstoneEntity::class
    ],
    version = 1,
    exportSchema = true
)
@TypeConverters(Converters::class)
abstract class VeraSnapDatabase : RoomDatabase() {
    abstract fun eventDao(): EventDao
    abstract fun caseDao(): CaseDao
    abstract fun anchorDao(): AnchorDao

    companion object {
        const val DATABASE_NAME = "verasnap_db"
    }
}

@Entity(tableName = "events")
data class EventEntity(
    @PrimaryKey
    @ColumnInfo(name = "event_id")
    val eventId: String,

    @ColumnInfo(name = "chain_id")
    val chainId: String,

    @ColumnInfo(name = "case_id")
    val caseId: String,

    @ColumnInfo(name = "event_json")
    val eventJson: String,  // Full CPPEvent serialized

    @ColumnInfo(name = "timestamp")
    val timestamp: Long,

    @ColumnInfo(name = "asset_type")
    val assetType: String,

    @ColumnInfo(name = "asset_path")
    val assetPath: String,

    @ColumnInfo(name = "thumbnail_path")
    val thumbnailPath: String?,

    @ColumnInfo(name = "is_anchored")
    val isAnchored: Boolean = false,

    @ColumnInfo(name = "anchor_id")
    val anchorId: String?
)

@Entity(tableName = "anchors")
data class AnchorEntity(
    @PrimaryKey
    @ColumnInfo(name = "anchor_id")
    val anchorId: String,

    @ColumnInfo(name = "merkle_root")
    val merkleRoot: String,

    @ColumnInfo(name = "tsa_response")
    val tsaResponse: String,  // Base64 DER token

    @ColumnInfo(name = "timestamp")
    val timestamp: Long,

    @ColumnInfo(name = "event_count")
    val eventCount: Int
)

@Dao
interface EventDao {
    @Query("SELECT * FROM events WHERE case_id = :caseId ORDER BY timestamp DESC")
    fun getEventsForCase(caseId: String): Flow<List<EventEntity>>

    @Query("SELECT * FROM events WHERE is_anchored = 0")
    suspend fun getUnanchoredEvents(): List<EventEntity>

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    suspend fun insert(event: EventEntity)

    @Query("UPDATE events SET is_anchored = 1, anchor_id = :anchorId WHERE event_id IN (:eventIds)")
    suspend fun markAnchored(eventIds: List<String>, anchorId: String)
}
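
Tying this back to the Merkle batching above, a finished anchor could be recorded like this. The helper is hypothetical; persisting the AnchorEntity itself is left out because AnchorDao isn't shown here:

suspend fun recordAnchor(
    tree: MerkleTree,
    tsa: TSAResponse,
    anchoredEventIds: List<String>,
    eventDao: EventDao
): AnchorEntity {
    val anchor = AnchorEntity(
        anchorId = UUID.randomUUID().toString(),
        merkleRoot = tree.root,
        tsaResponse = tsa.token,                  // Base64 DER token from the TSA
        timestamp = tsa.timestamp.toEpochMilli(),
        eventCount = anchoredEventIds.size
    )

    // anchorDao.insert(anchor) would go here once the AnchorDao is defined
    eventDao.markAnchored(anchoredEventIds, anchor.anchorId)
    return anchor
}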

Localization

VeraSnap ships with 10 languages. Android's resource system handles most of it:

<!-- values/strings.xml (English - default) -->
<resources>
    <string name="app_name">VeraSnap</string>
    <string name="capture_photo">Take Photo</string>
    <string name="capture_video">Record Video</string>
    <string name="gallery_title">Proof Gallery</string>
    <string name="gallery_empty">No proofs yet</string>
    <string name="conformance_bronze">Bronze</string>
    <string name="conformance_silver">Silver</string>
    <string name="conformance_gold">Gold</string>
    <string name="verify_success">Verification Successful</string>
    <string name="verify_failed">Verification Failed</string>
    <string name="depth_screen_detected">Screen Detected</string>
    <string name="depth_real_object">Real Object</string>
</resources>

<!-- values-ja/strings.xml (Japanese) -->
<resources>
    <string name="app_name">VeraSnap</string>
    <string name="capture_photo">写真を撮る</string>
    <string name="capture_video">動画を録画</string>
    <string name="gallery_title">証明ギャラリー</string>
    <string name="gallery_empty">証明がありません</string>
    <string name="conformance_bronze">ブロンズ</string>
    <string name="conformance_silver">シルバー</string>
    <string name="conformance_gold">ゴールド</string>
    <string name="verify_success">検証成功</string>
    <string name="verify_failed">検証失敗</string>
    <string name="depth_screen_detected">スクリーン検出</string>
    <string name="depth_real_object">実物</string>
</resources>

<!-- values-ar/strings.xml (Arabic - RTL) -->
<resources>
    <string name="app_name">VeraSnap</string>
    <string name="capture_photo">التقاط صورة</string>
    <string name="capture_video">تسجيل فيديو</string>
    <string name="gallery_title">معرض الإثباتات</string>
    <!-- ... -->
</resources>

For RTL languages like Arabic, Compose handles layout mirroring automatically when you use start/end instead of left/right.
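
In practice that just means writing paddings and alignments in terms of start/end. A small sketch (the VerificationBadge name and icon choice are arbitrary; verify_success comes from the strings above):

@Composable
fun VerificationBadge() {
    Row(verticalAlignment = Alignment.CenterVertically) {
        Icon(Icons.Default.Check, contentDescription = null)
        Spacer(Modifier.width(8.dp))
        // start/end padding flips automatically under RTL locales such as Arabic
        Text(
            text = stringResource(R.string.verify_success),
            modifier = Modifier.padding(start = 8.dp)
        )
    }
}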

Putting It All Together: The Capture Use Case

Here's how all the pieces combine in a single capture flow:

class CapturePhotoUseCase @Inject constructor(
    private val cameraService: CameraService,
    private val cryptoService: CryptoService,
    private val tsaService: TSAService,
    private val locationService: LocationService,
    private val depthService: DepthSensorService,
    private val biometricService: BiometricAuthService,
    private val eventRepository: EventRepository,
    private val keystoreManager: KeystoreManager
) {
    suspend operator fun invoke(
        caseId: String,
        attestedCapture: Boolean = false,
        activity: FragmentActivity? = null
    ): Result<CPPEvent> = runCatching {

        // 1. Biometric auth if attested capture requested
        if (attestedCapture) {
            requireNotNull(activity) { "Activity required for biometric auth" }
            biometricService.authenticate(
                activity,
                "Attested Capture",
                "Verify your presence"
            ).getOrThrow()
        }

        // 2. Capture photo
        val capturedMedia = cameraService.capturePhoto().getOrThrow()

        // 3. Compute content hash
        val contentBytes = capturedMedia.file.readBytes()
        val contentHash = cryptoService.computeContentHash(contentBytes)

        // 4. Gather context
        val location = locationService.getCurrentLocation()
        val depthAnalysis = depthService.analyze()
        val securityModule = keystoreManager.getSecurityModule()

        // 5. Build CPP Event
        val eventId = UUIDv7.generate()
        val chainId = eventRepository.getChainId(caseId)
        val prevHash = eventRepository.getLatestEventHash(chainId) ?: GENESIS_HASH

        val event = CPPEvent(
            eventId = eventId,
            chainId = chainId,
            eventType = "INGEST",
            timestamp = Instant.now().toString(),
            prevHash = prevHash,
            asset = Asset(
                contentHash = contentHash,
                mediaType = capturedMedia.mediaType,
                resolution = Resolution(/* from EXIF */),
                sizeBytes = contentBytes.size.toLong()
            ),
            captureContext = CaptureContext(
                deviceInfo = DeviceInfo.current(),
                location = location?.let { Location(
                    coordinateHash = cryptoService.computeContentHash(
                        "${it.latitude},${it.longitude}".toByteArray()
                    )
                )},
                securityModule = securityModule,
                depthAnalysis = depthAnalysis,
                keyAttestation = keystoreManager.getKeyAttestation()
            ),
            eventHash = "",  // Computed next
            signature = ""   // Computed after hash
        )

        // 6. Compute event hash and sign
        val eventHash = cryptoService.computeEventHash(event)
        val hashBytes = eventHash.removePrefix("sha256:").hexToByteArray()
        val signature = cryptoService.sign(hashBytes)

        val signedEvent = event.copy(
            eventHash = eventHash,
            signature = signature
        )

        // 7. Persist
        eventRepository.save(signedEvent, capturedMedia.file.absolutePath)

        signedEvent
    }

    companion object {
        const val GENESIS_HASH = "sha256:0000000000000000000000000000000000000000000000000000000000000000"
    }
}

Testing

Unit tests verify the crypto primitives work deterministically:

class CryptoServiceTest {

    @Test
    fun `computeEventHash produces deterministic hash`() = runTest {
        val event = createTestEvent()
        val cryptoService = CryptoService(context, keystoreManager)

        val hash1 = cryptoService.computeEventHash(event)
        val hash2 = cryptoService.computeEventHash(event)

        assertEquals(hash1, hash2)
        assertTrue(hash1.startsWith("sha256:"))
        assertEquals(71, hash1.length)  // "sha256:" (7) + 64 hex chars
    }

    @Test
    fun `MerkleTree produces correct proofs`() {
        val leaves = listOf(
            "sha256:abc123...",
            "sha256:def456...",
            "sha256:ghi789..."
        )

        val tree = MerkleTreeBuilder.build(leaves)

        leaves.forEach { leaf ->
            val proof = MerkleTreeBuilder.generateProof(tree, leaf)
            assertTrue(MerkleTreeBuilder.verifyProof(leaf, proof, tree.root))
        }
    }
}

Integration tests verify the full capture pipeline:

@HiltAndroidTest
class CaptureIntegrationTest {

    @get:Rule
    val hiltRule = HiltAndroidRule(this)

    @Inject
    lateinit var captureUseCase: CapturePhotoUseCase

    @Inject
    lateinit var eventRepository: EventRepository

    @Before
    fun setup() {
        hiltRule.inject()
    }

    @Test
    fun capturePhoto_createsValidCPPEvent() = runTest {
        val result = captureUseCase("test-case-id", attestedCapture = false)

        assertTrue(result.isSuccess)
        val event = result.getOrNull()!!

        // Verify structure
        assertTrue(event.eventId.isNotEmpty())
        assertTrue(event.eventHash.startsWith("sha256:"))
        assertTrue(event.signature.isNotEmpty())

        // Verify persistence
        val savedEvent = eventRepository.getEvent(event.eventId)
        assertNotNull(savedEvent)
        assertEquals(event.eventHash, savedEvent.eventHash)
    }
}

What's Next

VeraSnap for Android is currently in development with a 16-week timeline:

  • Phase 1 (4 weeks): Core capture, Room DB, Keystore integration
  • Phase 2 (3 weeks): Jetpack Compose UI, navigation
  • Phase 3 (2 weeks): Depth sensors, screen detection
  • Phase 4 (2 weeks): TSA integration, Merkle trees
  • Phase 5 (2 weeks): Billing, localization
  • Phase 6 (3 weeks): Testing, Google Play submission

Resources

The Content Provenance Protocol is open source.

If you're building provenance systems and want to ensure cross-platform compatibility, the CPP test vectors are a good starting point.


Questions? Comments? Drop them below or find us on GitHub. We're particularly interested in hearing from developers working on media authenticity, cryptographic protocols, or Android security.


VeraSnap is developed by VeritasChain Co., Ltd. The Content Provenance Protocol is maintained by the VeritasChain Standards Organization under CC BY 4.0.
