On February 21, 2025, Bybit lost approximately $1.5 billion in ETH and staked ETH tokens — the single largest cryptocurrency theft in history, nearly triple the previous record. The attack wasn't a smart contract bug. It wasn't a flash loan exploit. It was something far more insidious: a supply chain compromise of Safe{Wallet}'s frontend that turned legitimate multisig signers into unwitting accomplices.
Let's break down exactly how this happened, why existing security measures failed, and what every protocol team needs to learn from it.
The Attack Chain: Four Stages of Devastation
Stage 1: Compromising the Developer Machine
The Lazarus Group (North Korea's state-sponsored hacking unit, confirmed by the FBI) gained access to a Safe{Wallet} developer's workstation. Through this foothold, they obtained AWS credentials that gave them write access to Safe's S3-hosted frontend assets.
This is the part that should terrify every crypto team: the initial compromise wasn't at Bybit at all. It was at their trusted infrastructure provider.
Stage 2: Targeted JavaScript Injection
The attackers modified specific JavaScript bundles on Safe{Wallet}'s CDN — files like _app-52c9031bfa03da47.js. But here's the clever part: the malicious code was surgically targeted. It only activated for Bybit's multisig wallet addresses. Every other Safe{Wallet} user saw completely normal behavior.
The injected code intercepted three critical functions:
// Pseudocode of the malicious injection (reconstructed, simplified):

// 1. Intercept transaction creation and swap the payload -- but only
//    for the targeted wallet, so every other user sees normal behavior
function executeTransaction(safeTx) {
  if (isTargetWallet(safeTx.safe)) {
    originalTxData = deepCopy(safeTx);  // keep the real tx for display
    safeTx.to = ATTACKER_CONTRACT;
    safeTx.operation = 1;               // delegatecall instead of call
    safeTx.data = encodeSwapMasterCopy(MALICIOUS_IMPL);
  }
  return sign(safeTx);
}

// 2. Render the *original* transaction in the UI, not the tampered one
function displayTransaction(safeTx) {
  return renderUI(originalTxData || safeTx);
}

// 3. After signing, restore the original tx data in the UI while the
//    malicious signature goes out on the wire
function postSign(signedTx) {
  signedTx.displayData = originalTxData;
  return signedTx;
}
The UI showed a completely normal cold-to-hot wallet transfer. The signers saw the correct destination address, correct amounts, and a legitimate Safe URL. Under the hood, they were signing something entirely different.
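The countermeasure this defeats is conceptually simple: independently recompute the hash of what you are about to sign and compare it to what the UI claims. A toy sketch of that check -- using SHA-256 over a JSON serialization as a stand-in for the real EIP-712 safeTxHash, with illustrative field values:

```python
import hashlib
import json

def tx_hash(tx: dict) -> str:
    # Stand-in for the EIP-712 safeTxHash: hash over a deterministic
    # serialization of the transaction fields
    return hashlib.sha256(json.dumps(tx, sort_keys=True).encode()).hexdigest()

# What the compromised UI displayed vs. what was actually signed
# (addresses are placeholders)
displayed = {"to": "0xHotWallet", "value": 30_000, "operation": 0}
actually_signed = {"to": "0xAttackerContract", "value": 0, "operation": 1}

# Any tampering with any field changes the hash
assert tx_hash(displayed) != tx_hash(actually_signed)
```

A signer who computed the hash of the displayed transaction on a second, independent machine and compared it against the digest shown on the hardware wallet would have seen the mismatch before approving.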
Stage 3: The delegatecall Weapon
This is where smart contract architecture becomes a weapon. Bybit's cold wallet used Safe's standard proxy pattern:
┌──────────────────┐     ┌───────────────────────┐
│  Proxy Contract  │────▶│ Implementation (Safe) │
│  (Bybit Wallet)  │     │  masterCopy @ slot 0  │
│ Holds all funds  │     │   All wallet logic    │
└──────────────────┘     └───────────────────────┘
The key insight: Safe's execTransaction supports both call (operation=0) and delegatecall (operation=1). When you delegatecall a contract, that contract's code executes in the caller's storage context.
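The call/delegatecall distinction is the crux of the exploit, so it is worth making concrete. Here is a toy Python model of the two opcodes -- not the EVM, just the storage-context rule, with illustrative names:

```python
# Toy model of CALL vs DELEGATECALL. The only difference: which
# contract's storage the callee's code runs against.

class Contract:
    def __init__(self, code=None):
        self.storage = {}   # slot -> value
        self.code = code    # a function taking a storage dict

def call(caller, callee):
    # CALL (operation=0): callee's code runs on callee's own storage
    callee.code(callee.storage)

def delegatecall(caller, callee):
    # DELEGATECALL (operation=1): callee's code runs on the CALLER's storage
    callee.code(caller.storage)

def overwrite_master_copy(storage):
    storage[0] = "ATTACKER_IMPL"   # the sstore(0, ...) from the attack

proxy = Contract()                 # the Safe proxy; slot 0 = masterCopy
proxy.storage[0] = "SAFE_IMPL"
malicious = Contract(code=overwrite_master_copy)

call(proxy, malicious)
print(proxy.storage[0])            # SAFE_IMPL -- a plain call is harmless

delegatecall(proxy, malicious)
print(proxy.storage[0])            # ATTACKER_IMPL -- the proxy is hijacked
```

With `call`, the malicious contract can only scribble on its own storage. With `delegatecall`, the same one-line function rewrites the proxy's slot 0.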
The tampered transaction changed operation from 0 to 1 and pointed to a pre-deployed attacker contract:
// Attacker's contract (reconstructed, simplified)
contract MaliciousImpl {
    // Placeholder; in the real attack this was the pre-deployed drainer
    address internal constant ATTACKER_NEW_IMPL = address(0);

    // Executed via delegatecall from Bybit's proxy, so this sstore
    // writes into the PROXY's storage -- and slot 0 is the masterCopy
    function attack() external {
        address impl = ATTACKER_NEW_IMPL;
        assembly {
            sstore(0, impl)
        }
    }
}

// Attacker's new implementation: once installed as masterCopy, it can
// move everything the proxy holds (access control omitted for brevity)
contract DrainWallet {
    function sweepETH(address payable to) external {
        to.transfer(address(this).balance);
    }

    function sweepERC20(address token, address to) external {
        IERC20(token).transfer(to, IERC20(token).balanceOf(address(this)));
    }
}
One transaction. The masterCopy was overwritten. The attacker now controlled all logic for Bybit's cold wallet.
Stage 4: The Drain
With full control, the attacker transferred:
- 401,346 ETH (~$1.12B)
- 90,375 stETH (~$253M)
- 15,000 cmETH (~$42M)
- 8,000 mETH (~$22M)
Total: approximately $1.46 billion.
Why Every Security Layer Failed
1. Multisig Didn't Help
Bybit used a multi-signature wallet — multiple people had to approve the transaction. But multisig protects against key compromise, not UI compromise. When every signer sees the same manipulated UI, they all approve the same malicious payload.
The fundamental problem: Multisig assumes signers can independently verify what they're signing. When the verification layer itself is compromised, the signatures become meaningless.
2. Hardware Wallets Didn't Help
The signers likely used hardware wallets (Ledger devices). But hardware wallets display raw transaction data — hex-encoded calldata for complex contract interactions:
// What the hardware wallet shows:
To: 0x1234...abcd
Data: 0x6a761202000000000000000000000000...
Value: 0 ETH
// What a human can understand from this: nothing useful
No signer can decode a delegatecall to an arbitrary contract by reading hex on a tiny screen. They trusted the Safe UI. The UI lied.
3. Safe{Wallet} Didn't Have SRI
Subresource Integrity (SRI) verifies fetched resources against expected cryptographic hashes:
<script src="app.js"
integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K..."
crossorigin="anonymous">
</script>
Safe{Wallet} did not implement SRI on these bundles. With SRI in place -- and assuming the HTML document carrying the hashes was served from infrastructure the attacker could not also rewrite -- the browser would have refused to execute the tampered JavaScript, stopping the attack at Stage 2.
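The integrity value itself is nothing exotic: it is the base64-encoded digest of the exact bytes the browser fetches, prefixed with the hash algorithm. A minimal sketch of how one would be generated at build time:

```python
import base64
import hashlib

def sri_hash(content: bytes, alg: str = "sha384") -> str:
    # SRI value = "<alg>-" + base64(digest of the exact file bytes)
    digest = hashlib.new(alg, content).digest()
    return f"{alg}-{base64.b64encode(digest).decode()}"

js = b"console.log('app loaded');"
print(sri_hash(js))

# A single injected byte changes the digest, so the browser would
# refuse to execute the tampered bundle
tampered = js + b"// injected"
assert sri_hash(tampered) != sri_hash(js)
```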
4. No Transaction Simulation
Modern tools like Tenderly or Blowfish can simulate transactions before signing, showing exactly what state changes will occur. A mandatory simulation step would have revealed the masterCopy storage change.
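Whatever simulator is used (Tenderly, a local fork, etc.), the output includes the storage slots a transaction writes, so a pre-signing policy can mechanically reject anything that touches the proxy's masterCopy slot. The diff format below is illustrative, not any vendor's actual schema, and the wallet address is a placeholder:

```python
# Policy sketch: refuse to sign any transaction whose simulated state
# diff writes slot 0 (masterCopy) of the wallet proxy.

MASTER_COPY_SLOT = "0x" + "00" * 32
WALLET = "0x000000000000000000000000000000000000beef"  # placeholder address

def is_safe_to_sign(storage_writes: dict) -> bool:
    # storage_writes: {contract_address: {slot: new_value}} (illustrative shape)
    return MASTER_COPY_SLOT not in storage_writes.get(WALLET, {})

# What simulating the tampered Bybit transaction would have surfaced:
writes = {WALLET: {MASTER_COPY_SLOT: "0x...attacker implementation..."}}
assert not is_safe_to_sign(writes)    # blocked: masterCopy would change

# An ordinary cold-to-hot transfer touches no proxy storage at all
assert is_safe_to_sign({WALLET: {}})
```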
Concrete Mitigations for Your Protocol
1. Implement SRI on All Frontend Assets
<script src="app.js" integrity="sha384-[hash]"></script>
This is table stakes, with one caveat: SRI only helps if the HTML carrying the hashes is itself beyond the attacker's reach, and every lazily loaded chunk needs a pinned hash too. It raises the bar from "modify one file in a bucket" to "compromise the document root itself."
2. Independent Transaction Verification
Never trust a single UI. Before signing any high-value transaction:
# Decode what you're actually signing
cast calldata-decode "execTransaction(address,uint256,bytes,uint8,...)" 0x6a761202...
# Check if operation=1 (delegatecall) — red flag
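The same check works without any tooling at all, because the fields you care about sit at fixed offsets in the ABI head. A dependency-free decoder for Safe's execTransaction calldata (selector 0x6a761202); the calldata below is fabricated for illustration:

```python
# Read the ABI head words of execTransaction calldata directly:
# word 0 = to, word 1 = value, word 3 = operation (0=call, 1=delegatecall).

def decode_exec_transaction(calldata_hex: str) -> dict:
    raw = bytes.fromhex(calldata_hex.removeprefix("0x"))
    if raw[:4].hex() != "6a761202":
        raise ValueError("not execTransaction calldata")
    word = lambda i: raw[4 + 32 * i : 4 + 32 * (i + 1)]
    return {
        "to": "0x" + word(0)[-20:].hex(),
        "value": int.from_bytes(word(1), "big"),
        "operation": int.from_bytes(word(3), "big"),
    }

# Fabricated calldata with operation = 1, as in the tampered transaction:
head = [
    bytes(12) + bytes.fromhex("11" * 20),  # to (padded address)
    (0).to_bytes(32, "big"),               # value
    (320).to_bytes(32, "big"),             # offset to data bytes
    (1).to_bytes(32, "big"),               # operation = delegatecall!
] + [bytes(32)] * 6                        # remaining params, zeroed here
calldata = "0x" + (bytes.fromhex("6a761202") + b"".join(head)).hex()

tx = decode_exec_transaction(calldata)
if tx["operation"] == 1:
    print("RED FLAG: delegatecall to", tx["to"])
```

Fifteen lines of stdlib Python on an independent machine would have flagged the Bybit transaction.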
3. Restrict delegatecall at the Guard Level
// A Safe transaction guard that blocks delegatecall (enable via setGuard)
contract NoDelegatecallGuard is Guard {
    function checkTransaction(
        address, uint256, bytes memory,
        Enum.Operation operation,
        uint256, uint256, uint256, address, address payable, bytes memory, address
    ) external view override {
        require(operation == Enum.Operation.Call, "delegatecall blocked");
    }

    // The Guard interface also requires this hook; no-op here
    function checkAfterExecution(bytes32, bool) external view override {}
}
4. Time-Locked High-Value Transfers
| Threshold | Delay |
|---|---|
| > $10M | 24-hour timelock |
| > $100M | 72-hour timelock + board notification |
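The policy table above is trivial to express in code, which is where it belongs so signing tooling can enforce it automatically (the thresholds are the illustrative ones from the table, not a recommendation):

```python
# Map a transfer's USD value to a mandatory delay before execution.

def required_delay_hours(usd_value: float) -> int:
    if usd_value > 100_000_000:   # > $100M: 72h timelock + board notification
        return 72
    if usd_value > 10_000_000:    # > $10M: 24h timelock
        return 24
    return 0                      # below threshold: no delay

assert required_delay_hours(5_000_000) == 0
assert required_delay_hours(50_000_000) == 24
assert required_delay_hours(1_460_000_000) == 72   # the Bybit drain's size
```

A 72-hour window on a $1.46B movement would have given Bybit's team, on-chain monitors, and the wider community three days to notice that the cold wallet's masterCopy had changed.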
5. Air-Gapped Transaction Construction
For cold wallet operations, construct and inspect transactions on an air-gapped machine with CLI tooling such as Safe's safe-cli rather than a browser UI. The session below is illustrative; safe-cli is interactive and its command syntax varies by version:
# Open an interactive safe-cli session against your own node
safe-cli <checksummed-safe-address> <node-url>
# Then, inside the session:
> send_ether <destination-address> <amount-in-wei>
The Bigger Picture: Supply Chain Is the New Attack Surface
The Bybit hack represents a paradigm shift. We've spent years hardening smart contracts — formal verification, fuzzing, audits, bug bounties. But Lazarus didn't attack the contracts. They attacked the human-computer interface.
This mirrors a broader trend:
- BONKfun (2026): Domain hijack → wallet drainer
- Step Finance (2026): Executive device compromise → key extraction
- Bybit (2025): Frontend supply chain → UI manipulation
The attack surface has moved from Solidity/Rust code to CDN infrastructure, developer workstations, frontend dependencies, and the gap between what users see and what they sign.
Smart contract audits are necessary but no longer sufficient. The industry needs to treat frontend security, operational security, and supply chain integrity with the same rigor we apply to smart contract code.
Follow @ohmygod for more DeFi security research and vulnerability analysis.