Sign a Nostr event in 60 lines of Python using coincurve — no nostr-sdk, no nbxplorer, no rust toolchain
Every Nostr tutorial I read while wiring up my own agent's publisher loop wanted me to install some variant of nostr-sdk, python-nostr, nostr-tools, or a Rust crate wrapped in maturin. A few of those packages had half-broken BIP340 paths. A couple pinned to cryptography libraries that fight with pip install on Windows. The best of them — nostr-sdk — is genuinely great, but it is 40 MB of transitive dependencies for what is, at the end of the day, one SHA-256, one Schnorr signature, and a websocket send().
So I did what a lot of devs do when the ecosystem feels heavier than the problem: I opened the NIP-01 spec and wrote it myself. The whole publisher — keypair derivation, canonical event serialization, BIP340 signing, and fan-out to seven relays — is 60 lines of Python and two dependencies you already have if you do any Bitcoin or Solana work: coincurve and websocket-client.
This post is the full working code, the exact traps I hit (coincurve's API is not what the docs front page suggests), and the reason Nostr's BIP340 choice actually makes this simpler than the average "sign a JWT" example you'll see on the average SaaS blog.
The wider context — an autonomous signal engine that publishes its own research to X, Mastodon, and Nostr — lives at cipher-starter and cipher-x402. The full publisher script this article is extracted from is in github.com/cryptomotifs/cipher-x402.
Why BIP340 Schnorr and not ECDSA
The Bitcoin ecosystem shipped BIP340 in 2020 as the standard Schnorr signature scheme for secp256k1. Nostr adopted it verbatim in NIP-01: a Nostr pubkey is an x-only 32-byte secp256k1 point, and a Nostr sig is a 64-byte BIP340 Schnorr signature over the SHA-256 of a canonical serialization of the event.
The practical consequence for a Python dev is: you do not want cryptography, you do not want ecdsa, you do not want pyca. Those all implement ECDSA, not BIP340 Schnorr, and BIP340 has non-trivial differences (deterministic nonce via auxiliary randomness, x-only keys with even-y normalization, tagged-hash domain separation). You want a library that wraps libsecp256k1 directly, because libsecp256k1 has a hardened, constant-time BIP340 implementation shipped by the Bitcoin Core team.
That library, in Python, is coincurve. As of 21.0.0 (October 2025 release) it exposes PrivateKey.sign_schnorr for signing and an x-only PublicKeyXOnly class with a verify method — both thin wrappers over secp256k1_schnorrsig_sign32 / secp256k1_schnorrsig_verify in libsecp256k1.
```shell
pip install coincurve==21.0.0 websocket-client==1.8.0
```
That is the entire dependency surface. No Rust toolchain. No maturin. On Windows it installs a prebuilt wheel; on Linux it compiles libsecp256k1 in under ten seconds.
Generate or load a keypair
Nostr private keys are 32 random bytes. Public keys are 32 bytes — the x-coordinate of the secp256k1 point, with the y-coordinate implicitly treated as even. coincurve.PrivateKey gives you the full point; call .public_key.format(compressed=True)[1:] to drop the 0x02/0x03 parity prefix and keep only the 32 x-bytes. If the prefix is 0x03 (odd y), BIP340 says the signer must operate on the negated, even-y key — luckily coincurve.PrivateKey.sign_schnorr handles that normalization internally, so you never touch it. The pubkey you publish is just those 32 x-bytes.
```python
# nostr_keys.py
import os

from coincurve import PrivateKey


def new_keypair() -> tuple[str, str]:
    """Return (privkey_hex, pubkey_hex_xonly) — both 64-char lowercase hex."""
    sk = PrivateKey(os.urandom(32))
    sk_hex = sk.secret.hex()
    # x-only pubkey: strip the 0x02/0x03 byte from the compressed form
    pk_xonly = sk.public_key.format(compressed=True)[1:]
    return sk_hex, pk_xonly.hex()


def load_keypair(privkey_hex: str) -> tuple[PrivateKey, str]:
    sk = PrivateKey(bytes.fromhex(privkey_hex.strip()))
    pk_xonly = sk.public_key.format(compressed=True)[1:]
    return sk, pk_xonly.hex()
```
The trap I hit here: I originally did sk.public_key.format(compressed=False)[1:33] thinking "uncompressed minus the 0x04 prefix, first 32 bytes". That slice is actually fine — it yields the same x-coordinate bytes as the compressed form, because the y-parity lives only in the prefix byte and BIP340 is x-only on the verification side too. The mistakes that do bite are publishing the full 33-byte compressed key (a 66-char hex pubkey that relays reject) and hand-negating the private key when the prefix is 0x03 — sign_schnorr already normalizes to even-y internally. Derive the published pubkey from the compressed form and strip byte 0: it is one slice, and it matches the normalization libsecp256k1 itself performs during signing.
Canonical serialization — the one rule everyone gets wrong
NIP-01 specifies that the event id is the SHA-256 of a UTF-8 JSON array with a very specific shape:
[0, <pubkey_hex>, <created_at>, <kind>, <tags>, <content>]
- No whitespace between elements.
- The only escape sequences allowed are the seven mandatory ones: \n, \", \\, \r, \t, \b, \f.
- Everything else — including non-ASCII — goes through as raw UTF-8 bytes, not \uXXXX.
Python's json.dumps by default does the opposite: it escapes all non-ASCII to \uXXXX. If you serialize with ensure_ascii=True (the default) and the content contains emoji, your event id is wrong and every relay silently drops the message.
```python
import json


def serialize(pubkey_hex: str, created_at: int, kind: int, tags: list, content: str) -> bytes:
    payload = [0, pubkey_hex, created_at, kind, tags, content]
    return json.dumps(
        payload,
        ensure_ascii=False,
        separators=(",", ":"),
        sort_keys=False,
    ).encode("utf-8")
```
The separators=(",", ":") kills the spaces that json.dumps inserts by default. The ensure_ascii=False is the load-bearing flag — without it, your 🚀 becomes \ud83d\ude80 in the serialized form and the relay's own SHA-256 over the received JSON no longer matches the id you sent.
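To make the failure mode concrete, serialize the same payload both ways and compare — a self-contained sketch:

```python
import hashlib
import json

payload = [0, "ab" * 32, 1700000000, 1, [], "gm 🚀"]

# Canonical NIP-01 form: raw UTF-8, no whitespace
canonical = json.dumps(payload, ensure_ascii=False, separators=(",", ":")).encode("utf-8")
# Python's default: same data, but the emoji is escaped to \uXXXX surrogates
escaped = json.dumps(payload, separators=(",", ":")).encode("utf-8")

assert "🚀".encode("utf-8") in canonical   # raw bytes survive
assert b"\\ud83d\\ude80" in escaped        # default output escapes them

# Different bytes -> different SHA-256 -> wrong event id
assert hashlib.sha256(canonical).hexdigest() != hashlib.sha256(escaped).hexdigest()
```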
Sign the event
With the canonical bytes in hand, the id is just SHA-256 of those bytes, and the signature is BIP340 Schnorr over the same bytes.
```python
import hashlib
import time


def build_event(sk, pubkey_hex: str, kind: int, content: str, tags=None) -> dict:
    tags = tags or []
    created_at = int(time.time())
    serialized = serialize(pubkey_hex, created_at, kind, tags, content)
    event_id = hashlib.sha256(serialized).digest()
    sig = sk.sign_schnorr(event_id)  # 64 bytes, fresh aux randomness internally
    return {
        "id": event_id.hex(),
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        "sig": sig.hex(),
    }
```
That is it. coincurve.PrivateKey.sign_schnorr(msg32) takes exactly 32 bytes (the hash, not the preimage), pulls 32 bytes of auxiliary randomness from the OS, and returns 64 bytes of BIP340 signature. The auxiliary randomness is part of the spec — BIP340 signatures are not purely deterministic, they are "synthetic nonce" — but libsecp256k1 handles that under the hood.
Publish to relays over websockets
Nostr relays speak a tiny JSON-over-WS protocol. To publish, you open a websocket, send ["EVENT", <event>], and wait for an ["OK", <event_id>, <accepted>, <message>] frame. Most relays ack within a second; some will close the connection without acking if they disagree with your event (rate limit, spam filter, proof-of-work requirement). You do not want your publisher to hang on a dead relay, so set a small timeout and fire-and-forget the rest.
```python
import json

import websocket

RELAYS = [
    "wss://relay.damus.io",
    "wss://nos.lol",
    "wss://nostr.mom",
    "wss://nostr-pub.wellorder.net",
    "wss://relay.primal.net",
    "wss://offchain.pub",
    "wss://relay.snort.social",
]


def publish(event: dict, timeout: float = 4.0) -> list[tuple[str, bool]]:
    results = []
    payload = json.dumps(["EVENT", event])
    for url in RELAYS:
        try:
            ws = websocket.create_connection(url, timeout=timeout)
            ws.send(payload)
            # Optional: read one frame to confirm OK, but do not block the loop on it
            try:
                resp = ws.recv()
                ok = '"OK"' in resp and event["id"] in resp and "true" in resp.lower()
            except Exception:
                ok = True  # no ack before the timeout; assume the relay accepted it
            ws.close()
            results.append((url, ok))
        except Exception:
            results.append((url, False))
    return results
```
In production I run the loop in a thread pool so the seven relays go in parallel — shaves ~20 seconds when one relay is slow. A sequential version is fine for hourly cron.
The full 60-line publisher
Putting it all together, this is what the autonomous agent actually ships to production:
```python
# nostr_publish.py — full working publisher in ~60 lines
import hashlib
import json
import os
import time

import websocket
from coincurve import PrivateKey

RELAYS = [
    "wss://relay.damus.io", "wss://nos.lol", "wss://nostr.mom",
    "wss://nostr-pub.wellorder.net", "wss://relay.primal.net",
    "wss://offchain.pub", "wss://relay.snort.social",
]


def load(privkey_hex):
    sk = PrivateKey(bytes.fromhex(privkey_hex.strip()))
    pk = sk.public_key.format(compressed=True)[1:].hex()
    return sk, pk


def serialize(pk, ts, kind, tags, content):
    return json.dumps(
        [0, pk, ts, kind, tags, content],
        ensure_ascii=False,
        separators=(",", ":"),
    ).encode("utf-8")


def build(sk, pk, content, kind=1, tags=None):
    tags = tags or []
    ts = int(time.time())
    msg = serialize(pk, ts, kind, tags, content)
    eid = hashlib.sha256(msg).digest()
    sig = sk.sign_schnorr(eid)
    return {
        "id": eid.hex(), "pubkey": pk, "created_at": ts,
        "kind": kind, "tags": tags, "content": content, "sig": sig.hex(),
    }


def publish(event, relays=RELAYS, timeout=4.0):
    payload = json.dumps(["EVENT", event])
    ok_count = 0
    for url in relays:
        try:
            ws = websocket.create_connection(url, timeout=timeout)
            ws.send(payload)
            try:
                resp = ws.recv()
                if '"OK"' in resp and event["id"] in resp:
                    ok_count += 1
            except Exception:
                ok_count += 1
            ws.close()
        except Exception:
            pass
    return ok_count


if __name__ == "__main__":
    sk_hex = open(os.environ["NOSTR_KEY_PATH"]).read().strip()
    sk, pk = load(sk_hex)
    evt = build(sk, pk, "hello from 60 lines of python")
    n = publish(evt)
    print(f"event_id={evt['id']} relays_ok={n}")
```
That is the whole thing. No Rust. No 40 MB dependency tree. One secp256k1 operation, one SHA-256, one websocket send.
Gotchas worth memorizing
The x-only derivation. Derive your published pubkey as public_key.format(compressed=True)[1:] — strip the parity byte and publish exactly 32 bytes. Do not publish the full 33-byte compressed key (a 66-char pubkey field that relays reject), and do not negate the key yourself for odd-y points; libsecp256k1 normalizes to even-y internally during signing.
ensure_ascii=False is mandatory. If you ever post content with emoji, accented characters, CJK, or the rupee symbol, the default json.dumps output will hash to the wrong id and every relay will reject the event with "bad signature" — even though your signature is perfectly valid. It is the event id that is wrong. Spend ten seconds adding the flag.
Relay timeouts. Some relays are chatty (Damus, Primal, Snort all reliably ack in <500 ms). Some are quiet. Some have stealth-banned specific pubkeys and will accept the socket but never ack. Do not put a 30-second read timeout in your loop — 3-5 seconds is plenty, and "no ack" is fine because other relays will carry the event.
Rate limits and PoW. Damus drops you after ~10 events per 10 seconds per IP — spread posts across relays or back off. A handful of relays (not in the list above) require NIP-13 proof-of-work; add a nonce tag mined until the id has N leading zero bits. For the seven relays listed above, no PoW is required.
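For completeness, NIP-13 mining is short: append a ["nonce", <nonce>, <target>] tag, recompute the id, and repeat until it has the required leading zero bits. A sketch with a hypothetical mine_pow helper — it inlines the canonical serialization rather than reusing serialize, so it is self-contained:

```python
import hashlib
import json
import time


def leading_zero_bits(h: bytes) -> int:
    """Count leading zero bits of a hash."""
    bits = 0
    for byte in h:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zeros before the first set bit
        break
    return bits


def mine_pow(pk: str, kind: int, content: str, target: int, tags=None):
    """Grind the NIP-13 nonce tag until the id has >= target leading zero bits."""
    tags = list(tags or [])
    ts = int(time.time())
    nonce = 0
    while True:
        candidate = tags + [["nonce", str(nonce), str(target)]]
        body = json.dumps(
            [0, pk, ts, kind, candidate, content],
            ensure_ascii=False, separators=(",", ":"),
        ).encode("utf-8")
        eid = hashlib.sha256(body).digest()
        if leading_zero_bits(eid) >= target:
            return eid, candidate, ts
        nonce += 1
```

Sign the returned id with sign_schnorr as usual, and publish the event with the exact created_at and tags the miner returned — both are committed by the id.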
Verify your own signature locally
If a relay rejects your event for "bad sig", the debugging path is to verify the signature locally against your own pubkey before you even open the websocket. With coincurve's x-only key class that is a couple of lines:

```python
from coincurve import PublicKeyXOnly

PublicKeyXOnly(bytes.fromhex(event["pubkey"])).verify(
    bytes.fromhex(event["sig"]),
    bytes.fromhex(event["id"]),
)
```
If that returns True, the event is structurally correct and any relay rejection is a policy decision (rate limit, PoW, pubkey ban), not a cryptographic one. If it returns False, your serialization is wrong — almost certainly the ensure_ascii=False flag or a whitespace issue in separators.
Why this matters for anyone running autonomous agents
The CIPHER signal engine I am building — see cipher-x402 and the open playbook at cipher-starter — posts its own research to Nostr on a schedule. Nostr is a good fit for autonomous publishers specifically because there is no signup, no email verification, no CAPTCHA, no TOS. You generate a keypair, you publish. It is what email would have been if email had been invented by someone who hated spam as much as developers do.
If you are wiring up your own autonomous poster and you hit the same "which Nostr library is not abandoned this week" problem I hit, copy the 60 lines above. They work today on Python 3.11+ on Windows, macOS, and Linux, with nothing but coincurve and websocket-client. They will still work in five years because BIP340 is not changing and NIP-01 is a frozen spec.
Shipping an autonomous publisher on top of a properly specced protocol feels like what web development is supposed to feel like. No SDK, no platform, no 15% cut — just a hash, a signature, and a socket.
Sai (cryptomotifs) builds autonomous signal engines. The full publisher is open-source at github.com/cryptomotifs/cipher-x402. The broader solo-dev playbook is at cipher-starter and the live product at cipher-x402.vercel.app.