Listeners of the Threshold Cryptography Bootcamp were (t)asked to reflect on the first four sessions we've had to date. I guess most takes on the task will be an (LLM-made) synopsis of the lectures, and I don't want to join that chorus: there's already quite a number of threshold crypto guides and I can only produce a worse one. Instead, let me think out loud and compare the threshold approach with naive multisig. I mean, the nature of the lectures advertises the former quite heavily, and as a natural (for a cryptography engineer) skeptical push-back I constantly tried to think how things could be done (and often are done currently) without thresholds.
By multisig I mean just counting the signatures on something, which can obviously solve the very same problem: require $t$ out of $n$ sign-offs before going ahead. My current conclusion (probably wrong, as I'm only just getting a feel for things) is that thresholds are really superior in two aspects:
- they're much cheaper, in the sense that they take both the size and the verification cost of a single signature; and
- they're deterministic which enables a lot of interesting directions to explore.
They have their problems too. The committee (the $n$ out of which we need $t$ sigs) is one way or another gated and needs to go through a fancy procedure to keep the determinism when it changes, and there's not much we know how to do right now to prevent a member (i.e. $1$ out of $n$) from colluding or just (silently) leaking/losing their key, which leads us to a strong or weak but inevitable assumption of some honesty in the committee (which mostly explains its gated nature/requirement). That was the summary & conclusions; below I'll try to unpack this with lengthier thoughts on these points.
A lot of emphasis was put on the BLS signature scheme during the lecture as a corner-stone of threshold cryptography. If I get this correctly, it's because the sigs are homomorphic (though not fully; for that they'd need this property over multiplication as well). Okay, we can sum a batch of sigs and the result is a valid (aggregated) sig on the thing. Sounds useful for multisig as well!
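To make that (additive) homomorphism concrete, here's the textbook same-message picture in my own notation (and sweeping rogue-key defenses under the rug): each signer signs the hash-to-curve point $H(m)$ with their secret scalar $\mathrm{sk}_i$, the aggregate is the plain sum, and one pairing check verifies it against the sum of public keys:

$$
\sigma_i = \mathrm{sk}_i \cdot H(m), \qquad
\sigma_{\mathrm{agg}} = \sum_{i=1}^{k} \sigma_i, \qquad
e(\sigma_{\mathrm{agg}},\, g_2) \stackrel{?}{=} e\Big(H(m),\, \sum_{i=1}^{k} \mathrm{pk}_i\Big), \quad \mathrm{pk}_i = \mathrm{sk}_i \cdot g_2 .
$$

So the aggregate has the size of one signature and verifies like one signature - which is exactly the "cheaper" point from the list above.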
I feel that the $t$ out of $n$ problem is hairy either at the initial end or at the final end; that's the insight, or way of looking at the problem, of this write-up. The multisig approach tends to be hairy at the finish, and the threshold approach tends to be hairy at the start (but there's good news on that below - spoiler: #standardization). Btw, that's the reason multisig is "naive" here: it's easy to start with just the primitives you already have and count the sigs until there are $t$ of them. But the further we go with it, the more challenges we get to solve: remembering the sigs (or the fact of verification), managing committee membership, verification time or parallelization and the trust in that process. All these problems are solvable, and multisig itself isn't naive when done in a sophisticated way, but anyway the solutions to those problems live at the end of the thing, closer to where verification happens; and the size is something that can't be worked around without losing verifiability.
Let's briefly go over the main moving parts to better appreciate the benefit(s) of a threshold approach.
I just came up with the term committee key, but I feel it's descriptive! Also, the diagram source is available at the end of the page.
[! A fun insight I got]
All the VRF stuff and everything else is just something a committee signs in the same way. Just give them, instead of the generator, a specific value which represents something you need (a timestamp, a block height, some data - it will all be hashed to the curve anyway) and sign it in the same fashion --- the resulting point is that deterministic piece of data you will be building upon.
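In formulas (my notation; the lectures may set this up differently): with shares $\mathrm{sk}_i$ of the committee key $\mathrm{sk} = f(0)$, any $t$ members sign the point $H(x)$ for the value $x$ you care about, and the Lagrange-combined result is the same point no matter which $t$ of them showed up:

$$
\sigma_i = \mathrm{sk}_i \cdot H(x), \qquad
\sigma = \sum_{i \in S} \lambda^S_i \, \sigma_i = \mathrm{sk} \cdot H(x), \qquad
\lambda^S_i = \prod_{j \in S,\; j \neq i} \frac{j}{j - i}, \quad |S| = t .
$$

That single point $\sigma$ is the deterministic output to build upon - hash it for randomness, use it as a trigger for auto-decryption, and so on.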
A single public key sounds like a miracle, and it's a huge enabler indeed! But what if the committee needs to change (and here come all the other challenges mentioned earlier for multisig)? The answer is that we need to preserve the polynomial which defines the key through all the manipulations. And it's at least as hairy as the other approach, but now it has to be dealt with before we get to signing anything --- that's why I present it as the other end of the process. We need to run quite complex protocols (which are still advancing rapidly; see distributed key generation - it's the main rabbit hole here) before getting to sign things, but then we have a nice, clean, and deterministic system in our hands. Why is determinism such a great deal here? It enables all the things with "auto-": auto-reveal, auto-decryption, auto-confirmation, and so on. Because while multisig is busy with its hairy stuff, the threshold approach has one stage less at the finish and doesn't need additional interaction thanks to using that known public key defined by the polynomial (multisig never knows which members come up with signatures at the end of the day, so another action is needed after sign-off).
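The "preserve the polynomial" part is easiest for me to internalize with a toy Shamir sketch (my own illustration in plain Python, nothing to do with dcipher's actual code): the committee key is the constant term $f(0)$, each member holds an evaluation $f(i)$, and any $t$ evaluations recover $f(0)$ by Lagrange interpolation. In the real protocol nobody ever reconstructs the key itself - the same coefficients get applied to partial signatures instead - but the toy shows the shape of the thing:

```python
# Toy Shamir sharing over a prime field (illustration only; not production code
# and not the bootcamp's actual protocol).
import random

P = 2**127 - 1  # a Mersenne prime standing in for the group's scalar field order

def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Hide `secret` as f(0) of a random degree-(t-1) polynomial; share i is (i, f(i))."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def recover_secret(shares: list[tuple[int, int]]) -> int:
    """Lagrange-interpolate f(0) from any t (or more) shares."""
    acc = 0
    for xi, yi in shares:
        lam = 1
        for xj, _ in shares:
            if xj != xi:
                lam = lam * xj % P * pow(xj - xi, -1, P) % P
        acc = (acc + yi * lam) % P
    return acc

committee_key = random.randrange(P)
shares = make_shares(committee_key, t=3, n=5)
assert recover_secret(shares[:3]) == committee_key  # any 3 of the 5 suffice
assert recover_secret(shares[2:]) == committee_key  # a different 3 work too
```

Changing the committee then amounts to re-sharing: handing out a fresh polynomial with the same $f(0)$ without anyone ever seeing $f(0)$, which is exactly the DKG rabbit hole mentioned above.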
I nearly abused the word "hairy" today, but let's end this on a major note. What's appealing to me is that dcipher (personally I think tcipher would be a cuter name) does the #standardization effort, so that we could have a nice and clean abstraction at the initial end of the threshold approach as well. I guess it's a process, and not all standards catch up immediately, especially when the domain's pace is high (👋 https://zkproof.org/), but I feel that all these efforts are not in vain and help us even when the next effort overcomes the current wave. Let's "stand on the shoulders of giants!"
```mermaid
graph LR
    subgraph Threshold_t["any $t$ out of $n$ keys are sufficient"]
        A[Personal Key 1] -->|Contains| P[Polynomial of Degree $t-1$]
        B[Personal Key 2] -->|Contains| P
        C[Personal Key t] -->|Contains| P
    end
    D[Personal Key t+1] -->|Contains| P
    E[Personal Key n] -->|Contains| P
    P -->|Defines| CK[Committee Key]
    CK -->|Signs| G[the Generator]
    G -->|Produces| PK[Deterministic Public Key]
```