We've just put out this paper: Compositional modelling of immune response and virus transmission dynamics https://arxiv.org/abs/2111.02510
It's at the extreme end of applied CT in the sense that it's built on stochastic graph rewriting, and we know (and emphasise) that the approach is sound because of general abstract nonsense. It's nice to have that in the background, but more than that, it means we can formulate a pretty complicated model in simple terms and don't have to worry about writing in the assembly language of differential equations, with giant sums and unmanageable numbers of dimensions. But we don't actually do any CT in the paper.
We start by asking: what is the simplest biologically reasonable process that could produce some data about Ct values from PCR tests of health workers with COVID? The simple model reproduces two interesting things: heterogeneity in viral load, which has been observed before in the statistics and leads to what is called "over-dispersion", where a minority of people are far more infectious than most; and shifts in the expected distribution of viral load depending on whether an epidemic is rising, falling or stationary.
One interesting thing is the extent to which multi-disciplinary collaboration is really necessary to do something like this. A few of us are computer scientists, and I've been working on epidemics for a while, so I know a little about that. But before doing this I knew nothing at all about the immune system, so the coauthors include people who do epidemic models and people who know immunology. That's important to keep it on the rails.
There are also a few ideas at the end, mentioned obliquely, about things that I don't think we know how to do properly yet in Open Systems. One is the question of what to do with rates when you compose stochastic rewriting systems like these rule-based models or Petri nets; here that is basically done by hand. Another is a possible answer, which might be completely wrong, about how we could do something similar to what happens in Open Games with the utility functions. And finally: could we figure out a way to compose models that have both games and nets or rules? That would be very powerful.
Also, don't even ask how long it took to get TikZ and pgfplots to do that...
Cool! It'll be great to hear more about this in our little "categorical epidemiological modeling" group the week after next. I'll read the paper ahead of time. I'm curious what you actually mean by "what to do with rates" when you compose Petri nets with rates. I can imagine this having an elegant sort of obvious answer that I've already written about, or being a big mess, depending on how you intend to compose those Petri nets (with rates).
(I distinguish Petri nets with rates from "plain" Petri nets - the latter being popular in computer science.)
I'd like to know what this elegant sort of obvious answer that you've written about is!
Well, it's this:
A reaction network is just an equivalent way of presenting the data in a Petri net with rates. This paper actually uses Petri nets with rates, but we say "reaction networks" since chemists prefer that terminology.
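To make the equivalence concrete, here's a minimal sketch of the same toy SIR model written both ways. This is just my own illustration (the dict layout is made up for this example, not taken from the paper or any particular library):

```python
# Toy SIR model written two ways (illustrative representation only).

# As a reaction network: each reaction has a rate constant, input species, output species.
reactions = [
    {"rate": 0.3, "inputs": {"S": 1, "I": 1}, "outputs": {"I": 2}},  # infection: S + I -> 2I
    {"rate": 0.1, "inputs": {"I": 1}, "outputs": {"R": 1}},          # recovery:  I -> R
]

# As a Petri net with rates: places = species, transitions = reactions,
# with the same multiset arcs and the same rate constants.
petri_net = {
    "places": ["S", "I", "R"],
    "transitions": {
        "infection": {"in": {"S": 1, "I": 1}, "out": {"I": 2}, "rate": 0.3},
        "recovery":  {"in": {"I": 1}, "out": {"R": 1}, "rate": 0.1},
    },
}
# The two descriptions carry exactly the same data, just organised differently.
```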
The form of composition here may or may not be general enough for your purposes. It's what Evan Patterson is using in his software:
We've been talking about it a bunch in our seminar... not that this helps you much!
There are lots of videos of me explaining this compositional framework for reaction networks, which may be easier than the paper. Here's one:
The elegant obvious solution is elegant and obvious because it tackles a fairly unambitious problem - so if you're thinking about something difficult, it's probably something else.
Aha, yes, I remember discussion of that in the meeting before the one that I missed because GWB changed the time when the time changes. But I thought where that has most recently landed is basically carrying along a tuple of rates from each of the reactions/transitions being composed together. That's nice because it doesn't lose any information. But the rate for the composite reaction needs to be a number, probably some function of the things in that tuple. So to actually perform the composition and get an instance of something that I can simulate, I need to supply extra information. Is that right?
I don't completely understand the term "composite reaction" - mainly because in my setup, when you compose Petri nets with rates you don't compose reactions.
Petri nets with rates have "transitions" (= reactions) and "places" (= species). When you compose two Petri nets with rates in my setup, you identify some places in one with some places in the other, and that's all.
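Roughly, in code (again just an illustrative sketch in the same toy representation as above, not the actual implementation in Evan's software), that gluing looks like this:

```python
def compose(net1, net2, identify):
    """Glue net2 onto net1 by identifying places of net2 (keys of `identify`)
    with places of net1 (values). Transitions and their rates are untouched;
    only places get merged. Assumes transition names are disjoint."""
    rename = lambda p: identify.get(p, p)

    places = sorted(set(net1["places"]) | {rename(p) for p in net2["places"]})

    transitions = dict(net1["transitions"])
    for name, t in net2["transitions"].items():
        transitions[name] = {
            "in":  {rename(p): n for p, n in t["in"].items()},
            "out": {rename(p): n for p, n in t["out"].items()},
            "rate": t["rate"],
        }
    return {"places": places, "transitions": transitions}

# Example: glue an S/I net to an I/R net along the shared infected place.
si = {"places": ["S", "I"],
      "transitions": {"infection": {"in": {"S": 1, "I": 1}, "out": {"I": 2}, "rate": 0.3}}}
ir = {"places": ["I2", "R"],
      "transitions": {"recovery": {"in": {"I2": 1}, "out": {"R": 1}, "rate": 0.1}}}
sir = compose(si, ir, identify={"I2": "I"})
```

Note that nothing happens to the rates here: each transition keeps the rate it came with, which is why this form of composition dodges the "what number do I give the composite reaction" question.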
Sorry, I was mixing this up with the type of composition of nets using a pullback that we were talking about a few weeks ago.
I tend to think of these things as more reaction-shaped because it's easier to think of the graph rewriting rules that way. With rules you can sort of build a graph corresponding to a Petri net out of the possible sequences of rule applications, but that makes my head hurt.
It ought to be useful to think about that though, despite the headache, because it does say something about causality, in the sense that things towards the input of a Petri net "cause" things towards the output. The same goes for the corresponding dependency graph for rules.
Maybe I'll understand this a bit better when you give your talk, esp. if I ask about it.