You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
The category is a Markov category. In particular, it is symmetric monoidal.
But is it closed?
I think it is not, but I'm not sure. My clue is the currying bijection: if Stoch were monoidal closed, then taking the monoidal unit as the first argument of Stoch(A ⊗ X, Y) ≅ Stoch(A, [X, Y]), we would have a natural bijection between the stochastic maps from X to Y and the probability distributions on the (yet to be defined) internal hom object [X, Y]. The latter, I guess, should be related to some space of functions from X to Y. But I don't see how to naturally map a stochastic map to a probability distribution on a function space (of course this is not an argument).
I can only offer some help from the applied probability end of things, which I do know. By [[stochastic map]] I think you mean a conditional probability of a transition, p(y | x), and not a function with an explicit noise term ε, like y = f(x, ε). (If it is the latter, check out Itô calculus for how to work with these.)
Are you familiar with the Chapman-Kolmogorov equation? It is an integral over intermediate states, using the law of total probability: p(x3 | x1) = ∫ p(x3 | x2) p(x2 | x1) dx2. The Markov chain assumption allows us to go from a probability over every possible path/history (i.e. a probability distribution on path space) to a probability over the underlying state space, by explicitly saying that your history is encoded in your current position.
Indeed I see things very much like the Chapman-Kolmogorov equation on the nLab page for Markov Category. It looks like, instead of a density p(y | x), they use the sigma-algebra directly and you have a kernel k(x, S) for measurable sets S, which makes a certain amount of intuitive sense -- given where you are, what chance do you have of ending up in some measurable set S next turn. Whether some version of CK satisfies the naturality conditions, I do not know.
Which is to say: given a Markov-Chain-Assumption-like rule that maps distributions over paths down to kernels on the underlying space, does that help you?
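As a small sketch of the Chapman-Kolmogorov composition above, here is a toy finite-distribution monad in Haskell (weighted lists, not any real probability library; the two-state weather chain and its transition numbers are invented for illustration). Kleisli composition of kernels is exactly the CK sum:

```haskell
-- Toy finite-distribution type (an illustrative assumption, not a library).
newtype P a = P { runP :: [(a, Double)] }

-- A stochastic map between finite spaces, i.e. a Markov kernel.
type Kernel a b = a -> P b

-- Kleisli composition is precisely the Chapman-Kolmogorov sum:
--   p(z | x) = sum over y of p(z | y) * p(y | x)
compose :: Kernel b c -> Kernel a b -> Kernel a c
compose k2 k1 x = P [ (z, wy * wz)
                    | (y, wy) <- runP (k1 x)
                    , (z, wz) <- runP (k2 y) ]

-- The k(x, S) view: probability of landing in a "measurable set" S,
-- here simply a predicate on a finite space.
prob :: (b -> Bool) -> P b -> Double
prob s (P xs) = sum [ w | (x, w) <- xs, s x ]

-- A hypothetical two-state weather chain.
data W = Sunny | Rainy deriving (Eq, Show)

step :: Kernel W W
step Sunny = P [(Sunny, 0.9), (Rainy, 0.1)]
step Rainy = P [(Sunny, 0.5), (Rainy, 0.5)]

-- Two-step transition probability Sunny -> Sunny:
-- 0.9 * 0.9 + 0.1 * 0.5 = 0.86, computed by the CK sum above.
main :: IO ()
main = print (abs (prob (== Sunny) (compose step step Sunny) - 0.86) < 1e-9)
```

Running it prints True, confirming the two-step probability agrees with the hand-computed CK sum.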
Hope that helps; I too need to read up before Paolo's talk!
Stoch is not closed, and the obstruction goes the way you suspected: there are more distributions on the function space than stochastic maps from X to Y. As an example, take X = Y = 2, the two-point space. Then distributions on the set of functions 2 -> 2 (which we can identify with 2 x 2) form a 3-dimensional simplex, whereas stochastic maps 2 -> 2 are pairs of distributions on 2, a product of two 1-simplices. The former is bigger than the latter: there are distributions on 2 x 2 which are not the product of distributions on 2.
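The counting argument can be checked concretely. A sketch, assuming the identification of functions 2 -> 2 with pairs in 2 x 2: a 2 x 2 probability table is a product of two distributions on 2 exactly when its determinant vanishes, and a perfectly correlated pair fails that test (all numbers below are made up for illustration):

```haskell
-- A distribution on 2 x 2 as a table (pHH, pHT, pTH, pTT).
type Dist4 = (Double, Double, Double, Double)

-- A 2 x 2 nonnegative table has rank <= 1 (i.e. is a product of its
-- marginals) exactly when pHH * pTT == pHT * pTH.
isProduct :: Dist4 -> Bool
isProduct (phh, pht, pth, ptt) = abs (phh * ptt - pht * pth) < 1e-12

-- A perfectly correlated pair: a distribution on 2 x 2 that no
-- stochastic map 2 -> 2 can induce.
correlated :: Dist4
correlated = (0.5, 0, 0, 0.5)

-- A genuine product of the distributions (0.3, 0.7) and (0.6, 0.4).
productDist :: Dist4
productDist = (0.3 * 0.6, 0.3 * 0.4, 0.7 * 0.6, 0.7 * 0.4)

main :: IO ()
main = print (not (isProduct correlated) && isProduct productDist)
```

This prints True: the correlated table is not a product, while the genuine product passes, witnessing the dimension gap between the 3-simplex and the square.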
Matteo is correct, and that's the minimal counterexample.
Here is further intuition on that.
There are two ways to interpret the expression "random function": a function whose output at each input is random, or a random choice of an entire function. In Haskell, if P is a probability monad, one would write the first type as X -> P Y (i.e. a Kleisli morphism), and the second one as P (X -> Y) (i.e. the monad applied to the internal hom).
Now there is a difference between the two: a Kleisli morphism X -> P Y only specifies a distribution on outputs separately for each input, while an element of P (X -> Y) is a joint distribution over whole functions, which can encode correlations between the outputs at different inputs.
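This difference can be made concrete with the same toy finite-distribution type as before (an assumption for illustration, not a library): there is a canonical map from P (X -> Y) to X -> P Y, but it is not injective, so the two types genuinely differ.

```haskell
-- Toy finite-distribution type (an illustrative assumption, not a library).
newtype P a = P { runP :: [(a, Double)] }

pmap :: (a -> b) -> P a -> P b
pmap f (P xs) = P [ (f x, w) | (x, w) <- xs ]

-- The canonical map P (x -> y) -> (x -> P y): sample a function, apply it.
toKleisli :: P (x -> y) -> (x -> P y)
toKleisli pf x = pmap ($ x) pf

-- Two *different* distributions on functions Bool -> Bool:
-- one perfectly correlated across inputs, one independent.
correlated, independent :: P (Bool -> Bool)
correlated  = P [(id, 0.5), (not, 0.5)]
independent = P [(const True, 0.25), (const False, 0.25), (id, 0.25), (not, 0.25)]

probTrue :: P Bool -> Double
probTrue (P xs) = sum [ w | (True, w) <- xs ]

-- Both collapse to the same Kleisli arrow: uniform on Bool at every input.
main :: IO ()
main = print ( [ probTrue (toKleisli correlated  b) | b <- [False, True] ]
            == [ probTrue (toKleisli independent b) | b <- [False, True] ] )
```

This prints True: two distinct elements of P (Bool -> Bool) induce the same arrow Bool -> P Bool, because the Kleisli side forgets the correlation information.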
In general, one can only hope that the subcategory of deterministic morphisms is Cartesian closed. For the case of Stoch, even this is not true. But there are related categories, such as quasi-Borel spaces, as well as categories of particular topological spaces, where it is true.
Notice also that, in order to talk about the type P (X -> Y), where X -> Y is the space of deterministic morphisms, it is sufficient that the subcategory of deterministic morphisms be Cartesian closed, so this is not a great limitation.
In practice, types such as P(X -> Y) appear, for example, when we try to fit functions to data (nonparametrically).
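As a very finite stand-in for that use case, here is a toy sketch of fitting a distribution of type P (Bool -> Bool) to data by Bayesian updating over a finite hypothesis space (the hypotheses, the 0.9/0.1 likelihood, and the data points are all invented for illustration; real nonparametric fitting would use an infinite function space):

```haskell
-- Toy finite-distribution type (an illustrative assumption, not a library).
newtype P a = P { runP :: [(a, Double)] }

normalize :: P a -> P a
normalize (P xs) = P [ (x, w / total) | (x, w) <- xs ]
  where total = sum (map snd xs)

-- Condition a prior over functions on one observed pair (x, y),
-- with a made-up noise model: likelihood 0.9 for a correct prediction.
observe :: Eq y => (x, y) -> P (x -> y) -> P (x -> y)
observe (x, y) (P fs) = normalize (P [ (f, w * like f) | (f, w) <- fs ])
  where like f = if f x == y then 0.9 else 0.1

-- Uniform prior over the four functions Bool -> Bool.
prior :: P (Bool -> Bool)
prior = P [(id, 0.25), (not, 0.25), (const True, 0.25), (const False, 0.25)]

-- Posterior after seeing data consistent with the identity function.
posterior :: P (Bool -> Bool)
posterior = foldr observe prior [(True, True), (False, False)]

-- The first hypothesis (id) should now dominate, with weight 0.81.
main :: IO ()
main = print (snd (head (runP posterior)) > 0.8)
</imports>
```

This prints True: after two consistent observations, most of the posterior mass of the P (Bool -> Bool) value sits on the identity function.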
Great, thank you all for the answers!