If your decentralized identity system forces users to publicly prove their credentials every time they authenticate, you've built an immutable record of everything they do and everywhere they go.
Decentralized identity (DID) has become a Web3 buzzword: finally, we’re promised, users can control their own digital identity. With self-sovereign identity (SSI), you control your credentials, no third party owns your profile, and you reveal only what’s needed. In practice, though, the way most verifiable credential systems are built today leaves a serious gap: every time you use your credentials, you leave a breadcrumb trail, right on-chain, that anyone can follow.
Here’s why that happens, and how to build DIDs that actually deliver the privacy they promise.
The Privacy Paradox: Decentralized Doesn’t Mean Private
Decentralized IDs let you prove things like your degree, your club membership, or your age directly to dApps and services, without “logging in with Google.” Credentials are issued, signed, and verified using cryptography. The catch? Each authentication event, if stored or referenced on a public chain, exposes when, where, and sometimes even why you used a credential (a short sketch after the list below shows exactly what gets exposed).
- Credential revelation patterns: If you prove your age at several places, observers may infer your routine.
- Cross-application tracking: If the same proof mechanism is used everywhere, “anonymous” usage isn’t so anonymous anymore.
- Credential linkage: Your separate proofs become linked, building a profile that follows you everywhere, which is exactly what a “decentralized” identity was supposed to avoid.
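To make the leak concrete, here is a rough TypeScript sketch. The shapes are simplified stand-ins for the W3C VC data model, not a real library’s API; the point is what a verifier ends up publishing per check, even when the claim itself stays private.

```typescript
import { createHash } from "crypto";

// Simplified, hypothetical shapes: real systems follow the W3C VC data model,
// but the field names here are illustrative only.
interface VerifiableCredential {
  issuer: string;                // DID of the issuer, e.g. "did:example:university"
  subject: string;               // DID of the holder
  claim: Record<string, string>; // the actual attested data
  signature: string;             // issuer's signature over the claim (stubbed)
}

// The kind of record many systems publish (or reference on-chain) per check.
interface PublicVerificationRecord {
  credentialDigest: string;      // stable fingerprint of the presented credential
  verifier: string;              // which dApp or service checked it
  timestamp: number;             // when it was checked
}

const degreeCredential: VerifiableCredential = {
  issuer: "did:example:university",
  subject: "did:example:alice",
  claim: { degree: "BSc Computer Science" },
  signature: "0xstubbed-signature",
};

// The claim content stays private, but digest + verifier + timestamp is
// already enough metadata to reconstruct when and where Alice authenticated.
function recordVerification(
  vc: VerifiableCredential,
  verifier: string
): PublicVerificationRecord {
  const credentialDigest = createHash("sha256")
    .update(JSON.stringify(vc))
    .digest("hex");
  return { credentialDigest, verifier, timestamp: Date.now() };
}

console.log(recordVerification(degreeCredential, "did:example:job-board"));
```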
The Problem With Verifiable Credentials (As They’re Often Built)
- Verifiable credentials are great for eliminating central gatekeepers, but their usage is often recorded or referenced on public ledgers for anti-fraud, audit, or discoverability purposes.
- Proof reuse means patterns emerge even if the content is private.
- DID registries, meant to help confirm authenticity, end up acting as open books of your interactions unless privacy is deeply designed into the process; the sketch below shows how easily those records can be mined.
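Here is the observer’s side, again purely hypothetical: nothing below breaks any cryptography, it just groups publicly visible verification records by digest.

```typescript
// Hypothetical observer-side analysis of records scraped from a public
// ledger or DID registry. Grouping by digest turns "anonymous" events into
// per-holder activity timelines.
interface VerificationEvent {
  credentialDigest: string; // the fingerprint the verifier published
  verifier: string;
  timestamp: number;
}

function buildActivityProfiles(
  events: VerificationEvent[]
): Map<string, VerificationEvent[]> {
  const profiles = new Map<string, VerificationEvent[]>();
  for (const event of events) {
    const trail = profiles.get(event.credentialDigest) ?? [];
    trail.push(event);
    profiles.set(event.credentialDigest, trail);
  }
  return profiles; // one entry per credential digest = one per holder
}
```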
Regulators are taking notice. GDPR, for example, demands data minimization and the “right to be forgotten.” Logging every credential use or verification event on an immutable public chain is, well, the opposite of both.
How Confidential Computing Fills the Gap
Enter confidential computing: Trusted Execution Environments (TEEs), cryptographic techniques, and local proofs let you prove who you are (or what you hold) without leaving a trail for everyone else.
With a privacy-first DID solution:
- Verification happens inside a TEE, so no one learns more than necessary, not even the verifier.
- Selective disclosure lets you reveal the minimum (e.g., “over 18” instead of your exact birthdate); see the sketch after this list.
- No global ledger for every proof; instead, private attestation systems confirm validity.
- Credentials are issued, managed, and checked using encrypted smart contracts, never exposing the who, when, or where, just proving “this person is eligible.”
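As a rough sketch of what selective disclosure looks like in code (the function and shapes below are hypothetical, meant to run inside a TEE or confidential contract rather than being any particular framework’s API):

```typescript
// Hypothetical selective disclosure: the full credential is only visible
// inside the confidential environment; the caller gets a minimal, attested claim.
interface BirthCredential {
  subject: string;    // holder's DID
  birthDate: string;  // ISO date, e.g. "2001-05-14"
  issuerSignature: string;
}

interface AgeAttestation {
  subject: string;
  overEighteen: boolean;
  attestationSignature: string; // produced by the enclave's attestation key
}

// Imagine this running inside a TEE (behind a confidential contract or an
// off-chain enclave): the birthdate never leaves it.
function proveAgeOver18(credential: BirthCredential, now: Date): AgeAttestation {
  // A real system would verify issuerSignature here before deriving anything.
  const birth = new Date(credential.birthDate);
  const eighteenthBirthday = new Date(birth);
  eighteenthBirthday.setFullYear(birth.getFullYear() + 18);
  return {
    subject: credential.subject,
    overEighteen: now >= eighteenthBirthday,
    attestationSignature: "0xstubbed-enclave-signature",
  };
}

// The verifier only ever sees { subject, overEighteen, attestationSignature }.
```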
Building Real-World SSI with Oasis
- Plurality Network: Using Oasis’s ROFL (Runtime Offchain Logic) framework, Plurality enables private reputation and identity that works across apps, without forming one giant spiderweb of user activity.
- Sapphire Confidential Smart Contracts: Store and verify credentials with strong access controls, using enclaves to ensure even the contract operators can’t see your history (a minimal client sketch follows this list).
- Privacy-Preserving Verifiable Credentials: Combine standard W3C VC protocols with features like one-time proofs, hidden credential revocation, and non-linkable responses.
- Oasis Tutorials & Docs: Guides to help devs actually build this (https://oasis.net/sapphire, https://docs.oasis.io/).
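A minimal client-side sketch of that pattern, assuming ethers v5 and the @oasisprotocol/sapphire-paratime package. The wrapper helper has shifted between package versions, and the contract address, ABI, and method name here are hypothetical, so treat this as a shape rather than copy-paste code and check the docs for current endpoints and APIs.

```typescript
import * as sapphire from "@oasisprotocol/sapphire-paratime";
import { ethers } from "ethers"; // ethers v5 assumed

// Hypothetical eligibility registry deployed as a Sapphire confidential contract.
const REGISTRY_ABI = [
  "function isEligible(address subject, bytes32 claimType) view returns (bool)",
];

async function checkEligibility(subject: string): Promise<boolean> {
  // wrap() encrypts calldata end to end, so the query (who, which claim)
  // is only visible inside the contract's enclave, not on the public chain.
  // RPC endpoint shown for illustration; see https://docs.oasis.io/ for current ones.
  const provider = sapphire.wrap(
    new ethers.providers.JsonRpcProvider("https://sapphire.oasis.io")
  );
  const registry = new ethers.Contract(
    "0xYourConfidentialRegistry", // hypothetical deployment address
    REGISTRY_ABI,
    provider
  );
  return registry.isEligible(subject, ethers.utils.id("over18"));
}
```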
Steps for Developers
- Start with the problem: Map out every place your DID system leaks usage data, even metadata.
- Research TEEs and encrypted contracts: Learn how to run identity proofing logic inside confidential environments.
- Design for minimal disclosure: Ask whether your app really needs to know “who,” or whether “what claim” is enough; see the sketch after this list.
- Test with privacy stakeholders: Engage with auditors, activists, and your users; ask them which information flows would break their trust.
- Join the Oasis privacy community: https://forum.oasis.io/
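For example (a hypothetical illustration, not any specific framework’s interface), compare an identity-based check with a claim-based one:

```typescript
// Leaky design: the service receives a whole profile just to gate a feature.
interface IdentityBasedCheck {
  authorize(userId: string, fullProfile: { name: string; birthDate: string }): boolean;
}

// Minimal-disclosure design: the service receives only an attested claim,
// produced elsewhere (e.g. inside a TEE) from the underlying credential.
interface ClaimBasedCheck {
  authorize(claim: { type: "over18" | "member"; attestation: string }): boolean;
}

// A verifier implementing ClaimBasedCheck can gate access without ever being
// able to correlate the request with a specific person or their history.
const gate: ClaimBasedCheck = {
  authorize: (claim) => claim.attestation.length > 0, // stub: verify the attestation signature in practice
};
```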
Decentralized identity is about self-sovereignty and privacy. Without deep privacy, “decentralized” ID just becomes another shadowy tracker, only this time with a blockchain address. Use confidential computing and privacy-preserving protocols to make digital identity secure, not just decentralized.
Self-sovereign identity isn’t about being visible everywhere, but being in control anywhere.

Top comments (3)
Good points here. A lot of DID systems today end up leaking metadata even if the credentials themselves are private. Without confidential verification, every proof basically becomes a tracking event. Using TEEs or encrypted smart contracts to check credentials privately makes the whole model actually match the “self-sovereign” idea. Projects like Plurality show this is doable in practice, not just theory.
Really strong piece, thanks for highlighting the often-overlooked privacy blind spot in SSI: just being decentralised doesn’t guarantee anonymity. The walk-through of how classic DID/VC systems can still leak who, when, where via metadata is especially spot-on.
The take-aways around using confidential computing (TEEs, private enclaves) to perform verifications, issue credentials and check claims all without exposing interaction logs or proof-metadata are very practical.
For developers building in this space: designing for minimal disclosure and shifting verification off-chain (or into secure compute) is no longer “optional”; it’s necessary if you take user privacy seriously.
DID has been one of the most important web3 use cases, but it has been highly underutilized so far, imo. First, DID can bring new life to the oft-sidelined NFT space as we can make the most of the soul-bound token concept. Then, it can make the most of user sovereignty and privacy by securing confidential solutions that Oasis Sapphire provides. Finally, by embracing ROFL, it can make the computations off-chain while storing sensitive data privately on-chain, thereby decreasing the load of information and processing. The applicability of the DID solution, with or without AI integration, in real-world utility for healthcare data, financial data, insurance, copyright, intellectual property, etc., makes this the next holy grail we should focus on in the crypto space.