You're reading the public-facing archive of the Category Theory Zulip server.
A paper I wrote with Anne Broadbent is finally out on arXiv, have a look: https://arxiv.org/abs/2105.05949
Let me talk about the work slightly less formally here instead of copy-pasting the abstract. When I first started learning about cryptography, I was surprised to learn that it is not compositional: if one composes provably secure protocols, the end result needs a new security proof, or might in fact be insecure. To some extent this is unavoidable, but one approach is to define security of a protocol in such a way that composition preserves it. Cryptographers have many frameworks for this, e.g. universal composability and its variants. Most of these rely on the real-world/ideal-world paradigm (aka the simulation paradigm), where roughly speaking a real protocol achieves some ideal functionality if any vulnerabilities in the protocol are already present in the ideal functionality. More formally, you'd say that for any attack on the real protocol there is an attack on the ideal functionality so that the end results are the same (or "look close enough for efficient adversaries", if working with computational security). However, the existing approaches are seen as more difficult to work with than doing game-based security proofs (which don't necessarily compose), mostly because they set up very detailed machine models for interactive, probabilistic computation. Given that the same informal simulation paradigm underlies most of the approaches to composable security, we thought that there's something more abstract going on behind it. The result is the (n+1)th approach to composable cryptography explained in the paper.
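In symbols, the simulation paradigm sketched above looks roughly like this (notation illustrative, chosen here for the chat rather than lifted verbatim from the paper):

```latex
% A protocol \pi transforms resource R into ideal resource S.
% Correctness for honest parties:
\pi R = S
% Security: for every attack \mathcal{A} on the real protocol there is a
% simulator \sigma acting on the ideal resource with the same end result,
\forall \mathcal{A} \;\exists \sigma :\quad \mathcal{A}(\pi R) = \sigma(S)
% (with equality relaxed to computational indistinguishability \approx
% when working with computational rather than information-theoretic security).
```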
The viewpoint taken is that cryptography is a resource theory in the sense of Coecke-Fritz-Spekkens, where the resources are various functionalities (e.g. an authenticated channel or an ideal functionality for bit commitment), and the resource transformations are protocols that securely transform such resources to others (for instance, the one-time pad can be seen as transforming an authenticated channel and a key to a secure channel). This is very close to constructive cryptography in spirit, and is not that far from its relative abstract cryptography: it's just that we prefer using category theory as an "algebraic theory of systems" to rolling one's own. More detailed discussion of the motivation/related work/results can be found in the paper.
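As a toy illustration of the one-time pad transformation just mentioned, here is a small Python sketch (the names and the 3-bit message space are my own choices, not from the paper): encrypting with a uniformly random shared key makes the ciphertext on the authenticated channel uniform, independent of the message, so Eve's view can be simulated without the message.

```python
import itertools
from collections import Counter

# One-time pad over 3-bit strings, viewed as a resource transformation:
# (authenticated channel, shared uniform key) -> secure channel.
# Illustrative sketch only; not code from the paper.

N = 3  # message length in bits

def xor(a, b):
    return tuple(x ^ y for x, y in zip(a, b))

def keys():
    return list(itertools.product((0, 1), repeat=N))

def ciphertext_distribution(message):
    # What Eve sees on the authenticated channel when the key is uniform.
    counts = Counter(xor(message, k) for k in keys())
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

# Correctness: Bob recovers the message (XOR with the same key twice).
m, k = (1, 0, 1), (0, 1, 1)
assert xor(xor(m, k), k) == m

# Security: the ciphertext distribution is uniform whatever the message is,
# so a simulator with no access to the message can reproduce Eve's view.
d0 = ciphertext_distribution((0, 0, 0))
d1 = ciphertext_distribution((1, 0, 1))
assert d0 == d1
```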
it is definitely a nice paper in CT, but if you want to reach people from the subject area, it would help if you spell out a security proof in the paper. or at least a model of something concrete. i very much agree (and in fact know) that categorical abstraction tools are very helpful in this area, to zoom in and out of the flood of structure, and sometimes crack a path even before dispelling the community prejudice (so you have to figure out how to tell the story...) but many of the approaches that you discuss were discussed, accepted, and not just widely cited, but also really tried out in applications in their time, and then the cryptosystems and protocols proven secure got broken, time and again. so this is not an area where applications are just claimed. the successful tools deployed for proving security have been successful because they turned out to be useful for building attacks. so people nowadays tend to read papers in theoretical crypto starting from examples.
Yeah, our plan with Anne was to first lay the theoretical groundwork and submit to a venue that's more amenable to that, and then later do some more applied things for cryptographers (which ultimately blends into doing "composable security": most of the papers proving some explicit protocol composably secure pay some lip service to a framework for it in the intro, but the real work seems to be rather framework-independent). I'm not sure how one would write a paper that fits conference page limits and does interesting things for both sides at once. That said, as you say, there's still quite a gap going from papers proving security for more or less explicit protocols to actually implementing them (and having the implementations be secure). Like QKD was shown to be composably secure quite a while ago, but there's still tons of work analyzing aspects of implementations that get closer to the real world.
@Martti Karvonen Thanks for the nice paper. I believe the urge to use category theory for modular crypto proofs is very natural.
We've made a formalization of the SSP framework which is similar to constructive crypto
https://github.com/SSProve/ssprove
Is my understanding correct that the resource theory you are using only captures probabilistic computations, but not stateful computations, and that shared state is dealt with by shared resources?
BTW, you may also be interested in the formalization of CC in Isabelle.
Thanks @Bas Spitters for bringing this formalization to my attention - I knew that people do formally verified cryptography, but I didn't know that there's also formally verified work on simulation-based security/constructive cryptography.
As far as your question goes, I think one of the main benefits of going categorical is that the background theory is agnostic about the computational model (as long as it results in a (symmetric) monoidal category) - in particular, I'd expect it to handle stateful computations as well as quantum computations or even relativistic scenarios like here: https://iopscience.iop.org/article/10.1088/1367-2630/ab0e3b/meta
Of course, one then needs to build/choose some model of computation (which we don't really address) and most security proofs probably need to go into details of the model, so one can't do everything at the model-agnostic level.
@dusko I'm actually curious about this passage:
but many of the approaches that you discuss were discussed, accepted, and not just widely cited, but also really tried out in applications in their time, and then the cryptosystems and protocols proven secure got broken, time and again
I've often heard/read the complaint that composable security definitions are too strong in that many/most important functionalities are impossible to achieve composably (at least without setup), whereas this passage sounds like the opposite: composable security proofs are too weak in that they don't guarantee real-world security. What are some examples of the latter? Of course there is a gap between provable security and real-security (e.g. due to side-channel attacks) - to what extent is this gap more prominent in composable security proofs as opposed to e.g. game-based security proofs (still done by theorists on paper)?
Hi @Martti Karvonen I had briefly looked at a similar approach before, but it seems that the Kleisli category of state+probability is not an SMC, as state is not a commutative monad. It's possible that I overlooked something, but it looks like this does not directly fit in your framework.
Please correct me if I'm wrong.
So there is a whole can of worms in choosing a model (mostly unaddressed in the paper), but I've understood that people in game semantics can build categorical models for interactive asynchronous probabilistic/quantum computation, and those should work (although one might need to quotient a bicategory to get a 1-category). Similarly, I'd hope that one could extract categories from the fairly detailed models of computation in UC, IITM, GNUC or other variants, but I haven't checked carefully. That said, if the Kleisli category of state+probability is not an SMC then it doesn't fit as-is.
The computational models for crypto are quite well understood in systems such as easycrypt and cryptHOL. I don't see why we'd need game semantics for that.
In fact, cryptHOL explicitly passes state, so their model is the Kleisli category for the Giry monad, which is an SMC.
How would the one-time-pad (OTP) example from CC look in your framework?
That's good to hear, so the model selection issue seems to be an issue just for quantum computing.
I'm not sure if the one-time pad example from CC (I really like that example!) changes substantially. I could imagine that in a parallel universe the same paper was written with the OTP-example as-is and then a different formalization of the background theory after the example (i.e. instead of system algebras, one moves to categories). The main differences in discussing that example (at the level of formality of that paper) seem to be syntactic: one might denote "plugging converters to ports" with a more categorical notation and the pictures (e.g. fig 3) would be oriented consistently so as to result in string diagrams, rather than having open ports in arbitrary directions.
I guess one difference is that in our setup, a protocol prescribes an action for all parties, so in the honest case we ask Eve to ignore the input apart from its length, and then the correctness equation would say that the end result equals SEC; the case of a dishonest Eve amounts to the same equation as currently in the paper, with the same simulator.
The computational models for crypto are quite well understood in systems such as easycrypt and cryptHOL
Btw, could you point me to some references that would help me understand these models? I can obviously start from the beginning, but if there's some paper that would get me up to speed faster I'd be happy to hear
But having a precise syntax/semantics is precisely what makes the categorical approach worthwhile, so for instruction purposes it would be good to work out the OTP example.
Easycrypt uses an imperative pWhile language (which does not directly fit your framework because of state).
https://link.springer.com/chapter/10.1007/978-3-319-10082-1_6
Here's the cryptHOL paper: https://eprint.iacr.org/2017/753.pdf
Here's the OTP example in CC in CryptHOL: https://ieeexplore.ieee.org/document/8823694
@Bas Spitters Just to clarify the earlier comments on needing an SMC: the particular construction of n-partite resource transformations and the related attack models requires that you start from an SMC (in fact, I think braiding is enough, but one should check). If you have a monoidal category where your protocols live and you're able to express directly the relevant attack models (satisfying the axioms), no symmetry is needed to get a monoidal category of secure protocols. If you just have a plain category (i.e. no monoidal structure), I don't see why one would expect to get a monoidal category of secure protocols out of it. However, if you're ok with just getting a plain category of secure protocols, that should work provided you can spell out the attack model (now dropping the axioms referring to the monoidal structure). That said, I always thought that the reason people cared about the simulation paradigm was parallel composition, as I've understood that getting secure sequential composition is often doable with game-based security definitions. If everything in sight is premonoidal, I'm not sure what happens - I can imagine it working after numerous checks or failing pretty fast.
I would find it easier to think about this if one concrete (known/existing) model were worked out. E.g. the ones in cryptHOL and easycrypt are known to capture many existing protocols.
Martti Karvonen said:
dusko I'm actually curious about this passage:
but many of the approaches that you discuss were discussed, accepted, and not just widely cited, but also really tried out in applications in their time, and then the cryptosystems and protocols proven secure got broken, time and again
I've often heard/read the complaint that composable security definitions are too strong in that many/most important functionalities are impossible to achieve composably (at least without setup), whereas this passage sounds like the opposite: composable security proofs are too weak in that they don't guarantee real-world security. What are some examples of the latter? Of course there is a gap between provable security and real-security (e.g. due to side-channel attacks) - to what extent is this gap more prominent in composable security proofs as opposed to e.g. game-based security proofs (still done by theorists on paper)?
a model can be too strong for applications and too weak for guarantees at the same time. but here it was something else. the complaint that universal composability, simulatability etc had too strong requirements refers to the fact that the requirements that there are no key cycles, or in simulatability that there is no XOR, hash, whatnot --- preclude most applications. the fact that the protocols proven secure in one of the composable schemas get broken does not mean that the proofs are too weak. it means that they are too rigid for refinements. CT could solve that, but not by magic. one memorable example is that MQV and ECMQV were proven secure in the universal composability model, only to fall to an actual attacker who used an eager key server to obtain a certificate during the session. the requirements must specify that the lower bound of the certificate authority must be above the session timeout. several of the protocols proposed for IPSec, and actually i think accepted in the early RFCs, were also proven secure, and every single implementation got broken by someone, but the attacks could not be modeled because the model used could not be refined. then the whole schema had to be abandoned, like 4 years of work, and IKE was proposed... you will notice that all of the compositional paradigms date back to the late 90s and into the 2000s, and are not in use any more. you could read about that in papers by michael backes from the late 2000s. he was a very prominent researcher in the area, and he had a series of papers with examples and nogo statements.
thanks for the examples of failures in practice!
at least on the quantum crypto side, composability is alive and well (although they mostly use abstract/constructive cryptography). for instance, https://arxiv.org/abs/1904.06320 and https://arxiv.org/abs/2011.12704 come to mind as two recent(ish) works doing things composably
Hi @dusko , I have a hard time keeping up with all the Zulips, so please tag me. Sorry to have dropped off this conversation.
Compositional proofs are alive and well, especially due to their need in actual computer formalizations of cryptographic protocols. See e.g. the references above.
@Martti Karvonen I still don't understand how your work connects with the simple and standard model where cryptographic systems are modeled by probabilistic stateful computations.
E.g. Constructive crypto has been developed in this model in cryptHOL, the paper I linked to above.
Just tagging @Bas Spitters to ensure you see my reply.
You mentioned the one-time pad at some point - I gave a talk at the Structure meets Power workshop going through the example, and it turns out that the pictures correspond to having a Hopf algebra with an integral in your category. The slides are here https://www.cst.cam.ac.uk/files/smp-2021-slides-karvonen.pdf and the talk is here https://www.youtube.com/watch?v=GDOtMRayfo0 . However, given the time constraints I didn't have the time to properly explain how exactly this fits into the framework. This will be remedied in an updated version of the paper (not yet on arxiv).
For OTP you can just start from FinStoch (and do some constructions on top), whereas for general (classical) crypto you need a more expressive category. I'm pretty sure that the converters discussed in the formalization of constructive crypto would also work for my purposes without changing things all that much - after all, we're both after the same simulation paradigm. That said, I don't have a writeup of that at the moment.
Hopf algebra with an integral
That's cute :-)
I'm probably missing something here. FinStoch as a semantics for a programming language seems quite limited. Practically (Easycrypt, FCF, CryptHOL) it is quite well established what is needed for a language to specify cryptographic protocols.
How would one model e.g. pWhile in FinStoch? Is that part of the "constructions on top"?
I'm slightly worried that by limiting the language, one may also limit the adversary.
I agree that FinStoch is very limited, and the constructions on top are mostly there to deal with the n-partite nature. Roughly: you build a category where your states are n-partite maps and protocols then consist of each party doing "a comb" around their ports. This doesn't result in a good place to do semantics for a programming language - it's just enough to compute the main fact relevant for the security of OTP: for a randomly chosen secret key, the ciphertext is random, so the adversary can't do anything they couldn't already do given a random text. For general-purpose work I'd indeed have to move to a more expressive category, and I believe the converters as defined in the paper you linked earlier do a reasonable job.
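To make that FinStoch computation concrete, here is a small numerical sketch (my own illustration; the encoding of 2-bit strings as integers and the matrix conventions are assumptions, not from the paper): morphisms are column-stochastic matrices, composition is matrix product, and composing XOR with a uniformly random key yields the uniform distribution whatever the message was.

```python
import numpy as np

# FinStoch-style check for OTP security (illustrative sketch only):
# objects are finite sets, morphisms are column-stochastic matrices.

n = 4  # 2-bit strings, encoded as integers 0..3

def deterministic(f, dom, cod):
    # Embed a function dom -> cod as a 0/1 column-stochastic matrix.
    M = np.zeros((cod, dom))
    for x in range(dom):
        M[f(x), x] = 1.0
    return M

uniform_key = np.full((n, 1), 1.0 / n)  # state: uniform distribution on keys

# XOR : messages x keys -> ciphertexts; pairs encoded as m*n + k.
xor = deterministic(lambda mk: (mk // n) ^ (mk % n), n * n, n)

# For each fixed message m, Eve's view is the composite xor ∘ (m ⊗ uniform_key).
for m in range(n):
    point = np.zeros((n, 1))
    point[m, 0] = 1.0
    joint = np.kron(point, uniform_key)      # m ⊗ key, a distribution on pairs
    ciphertext = xor @ joint                 # apply the composite morphism
    assert np.allclose(ciphertext, 1.0 / n)  # uniform, independent of m
```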
Bas Spitters said:
Hi dusko , I have a hard time keeping up with all the Zulips, so please tag me. Sorry to have dropped off this conversation.
Compositional proofs are alive and well, especially due to their need in actual computer formalizations of cryptographic protocols. See e.g. the references above.
Martti Karvonen I still don't understand how your work connects with the simple and standard model where cryptographic systems are modeled by probabilistic stateful computations.
E.g. Constructive crypto has been developed in this model in cryptHOL, the paper I linked to above.
hi @Bas Spitters,
i am not pushing my work. i am talking about the basic fact that you can compose a secure protocol with another secure protocol and get an attack on both, in the sense that the security requirements satisfied by both will be disproved in their interaction. there is a conjecture that for every protocol, given with a security claim, you can design another protocol satisfying the same security claim but such that when you compose them, the security claim is not satisfied. (search "chosen protocol attack".) that fact is what distinguishes protocol design from software design. software design is based on composition and refinement, because software dependability requirements are first order properties. security properties are not first order properties. (search "hyperproperties".) all of protocol theory started from gus simmons noticing that you can compose secure ciphers into a protocol (TMN i think) and the ciphers remain secure, but the protocol gets broken by composing two runs. so the lack of composability was sort of the starting point of security research, not a contribution of my work. i mentioned my work only because we seemed to have been the first who tried to characterize some conditions under which you can securely compose things. just like the genomes: they compose under strict conditions, but in general not :)
Bas Spitters said:
Hi Martti Karvonen I had briefly looked at a similar approach before, but it seems that the Kleisli category of state+probability is not an SMC, as state is not a commutative monad. It's possible that I overlooked something, but it looks like this does not directly fit in your framework.
Please correct me if I'm wrong.
@Bas Spitters isn't state lax monoidal for +, and similarly for Giry? Using + as the monoidal structure seems to be what they do for constructive crypto in cryptHOL in the paper you linked, so I think this should fit the framework just fine after all.
Hi @Martti Karvonen State gives a symmetric premonoidal category of stateful programs. See e.g. the work by @Sam Staton
https://www.cs.ox.ac.uk/people/samuel.staton/papers/popl13.pdf
I'm not entirely sure what you mean by +.
Indeed, cryptHOL does not use state in the base category, but uses resumption monads to pass state around. Do you claim that's related to what you are doing?
That would be interesting.
@Bas Spitters I meant taking coproducts as giving the monoidal structure on our base category rather than cartesian products. That's clearly the monoidal structure at play in https://ieeexplore.ieee.org/document/8823694 that you mentioned earlier, and seems to play well both with the distribution and state monads.
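For concreteness, here is a toy sketch (my own illustration, not code from cryptHOL or either paper) of Kleisli arrows for the finite-distribution monad with the coproduct as the tensor: f + g acts on a tagged disjoint union, and Kleisli composition distributes over it.

```python
from fractions import Fraction

# Kleisli arrows X -> Dist(Y) for the finite-distribution monad, represented
# as functions returning {outcome: probability} dicts. Illustrative sketch.

def kleisli_compose(f, g):
    # (g after f): x |-> sum over y of f(x)[y] * g(y)[z]
    def h(x):
        out = {}
        for y, p in f(x).items():
            for z, q in g(y).items():
                out[z] = out.get(z, Fraction(0)) + p * q
        return out
    return h

def coproduct(f, g):
    # f + g acts on tagged elements ('L', x) / ('R', x), keeping the tag,
    # so the monoidal product is disjoint union of state spaces.
    def h(tagged):
        side, x = tagged
        d = f(x) if side == 'L' else g(x)
        return {(side, y): p for y, p in d.items()}
    return h

# Two toy Kleisli arrows on {0, 1}.
flip = lambda x: {0: Fraction(1, 2), 1: Fraction(1, 2)}  # forget input, coin flip
neg = lambda x: {1 - x: Fraction(1)}                     # deterministic negation

# Bifunctoriality on sample inputs:
# (neg + flip) ∘ (neg + neg) = (neg ∘ neg) + (flip ∘ neg)
lhs = kleisli_compose(coproduct(neg, neg), coproduct(neg, flip))
rhs = coproduct(kleisli_compose(neg, neg), kleisli_compose(neg, flip))
assert lhs(('L', 0)) == rhs(('L', 0))
assert lhs(('R', 1)) == rhs(('R', 1))
```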
That could work. Did you check that it does form a monoidal category? And if so, how does your approach connect to the use of resumption monads?