You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
John Baez said:
Eugenio Moggi was trying to create a decorated cospan category of "open Moore machines".
I promised @Sophie Libkind I'd try to find out more about this. Here's what Moggi just told me:
Dear John, I do not have specific pointers to papers on Moore and Mealy machines,
they were considered in the context of "sequential circuits" and the
difference between the two machines
- in a Mealy machine the output at time t depends on the state and the input at time t
- in a Moore machine the output at time t depends only on the state at time t
in both machines the state at time t+1 depends on the state and the input at time t.

Mealy machines subsume "combinational systems" that do not have state, where the output at time t depends only on the input at time t.

I attach a set of slides by Manfred Broy on "model based system engineering" that provide a broader context. In his slides a system has an "interface" to its context, but what is the system and what is the context is a matter of choice.

On slide 20 the interface is structured into input and output channels, and one can introduce the notions of weak and strong causality between inputs and outputs, which capture (at the level of system behavior) the difference between Mealy and Moore machines.

Composition of several components is introduced from slide 35; we could reduce it to:
- "sequential composition" of two components C1 and C2, where SOME outputs of C1 are connected to some compatible inputs of C2 (this subsumes parallel composition, when NO connections are established)
- "feedback", when some outputs of a component C are connected to some inputs of the same component C

If we want the composition operations to preserve determinism, i.e., that the outputs of a system are a function of its inputs, then the feedback operation is problematic: it does not preserve determinism, but it does preserve determinism + strong causality. Therefore, the feedback operation is well defined on Moore machines, but not on Mealy machines. On the other hand, the identity function, where the output (stream) is the input (stream), is not computed by a Moore machine.

Cheers, Eugenio
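Moggi's two definitions are easy to see side by side in code. This is a minimal sketch of my own (not from the thread): `Moore` and `Mealy` are hypothetical classes whose `run` maps an input stream to an output stream, following the time-indexing in the email above.

```python
# Sketch (not from the thread): minimal Moore and Mealy machines over
# discrete time, following Moggi's description above.

class Moore:
    """Output at time t depends only on the state at time t."""
    def __init__(self, step, out, state):
        self.step, self.out, self.state = step, out, state
    def run(self, inputs):
        outputs = []
        for x in inputs:
            outputs.append(self.out(self.state))   # output fixed before input acts
            self.state = self.step(self.state, x)  # state(t+1) = step(state(t), input(t))
        return outputs

class Mealy:
    """Output at time t depends on the state AND the input at time t."""
    def __init__(self, step, out, state):
        self.step, self.out, self.state = step, out, state
    def run(self, inputs):
        outputs = []
        for x in inputs:
            outputs.append(self.out(self.state, x))
            self.state = self.step(self.state, x)
        return outputs

# A Mealy machine computes the identity on streams; a Moore machine cannot,
# since its output at time t is fixed before the input at time t arrives.
ident = Mealy(step=lambda s, x: s, out=lambda s, x: x, state=None)
print(ident.run([1, 2, 3]))  # [1, 2, 3]

# The closest a Moore machine gets is delaying the stream by one step.
delay = Moore(step=lambda s, x: x, out=lambda s: s, state=0)
print(delay.run([1, 2, 3]))  # [0, 1, 2]
```

The one-step delay at the end is exactly Moggi's closing remark: the identity on streams is not computed by any Moore machine.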
I'd love to see Manfred Broy's slides on "model based system engineering". Thanks.
Here they are:
Not sure this is quite on topic, but I just came across an interesting paper I hadn't seen before. It does have some discussion of open automata: Sheaf Semantics for Concurrent Interacting Objects - Goguen https://pdfs.semanticscholar.org/569a/8c66bac1e8b034b26c77494bbe1dd4e8c88e.pdf
Thanks so much, John!
Therefore, the feedback operation is well defined on Moore machines, but not on Mealy machines. On the other hand, the identity function, where the output (stream) is the input (stream), is not computed by a Moore machine.
Neat. This reminds me of something David Spivak once told me: there easily exists a wiring diagram algebra of Moore machines but not of Mealy machines precisely because of feedback. I think wiring diagrams give a nice presentation of the sequential and feedback compositions Moggi described. I wonder if sequential/feedback composition is how Moggi expected open Moore machines to compose. To me, it feels quite different than the type of composition that comes out of decorated cospans because in sequential/feedback composition inputs and outputs are distinguished and play different roles. I tend to think of the composition that arises from decorated cospan as resource sharing. In the case of Moore machines, resource sharing might be synchronizing two Moore machines by pulling back their states. I'm curious if the "modes" that arise on slide 57 are more like this type of composition.
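The feedback asymmetry Sophie and Moggi describe can be made concrete. Here is a hypothetical illustration of mine (not from the thread), reusing the informal step/output description above: for a Moore machine, the fed-back input at time t is already determined by the state, so no fixed-point equation arises.

```python
# Sketch (mine, not from the thread): why feedback is well defined for
# Moore machines but problematic for Mealy machines.

def moore_feedback(step, out, state, n):
    """Feed a Moore machine's output back into its own input for n steps.
    The output at time t depends only on state(t), so the fed-back input
    is determined before the state update: no fixed point is needed."""
    trace = []
    for _ in range(n):
        y = out(state)              # determined by the state alone
        trace.append(y)
        state = step(state, y)      # the fed-back input is y itself
    return trace

# Example: output the state, feed it back, and accumulate.
print(moore_feedback(step=lambda s, y: s + y + 1, out=lambda s: s,
                     state=0, n=4))  # [0, 1, 3, 7]

# For a Mealy machine, feedback would demand an x with x == out(state, x)
# at every step: out = lambda s, x: x + 1 has NO solution, while
# out = lambda s, x: x has MANY. Determinism is lost, matching Moggi's remark.
```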
I'm not sure if this is 100% relevant, but "epsilon transducers" are a kind of "minimal" open(?) machine in the stochastic context. (At least open in the sense that they have inputs and outputs - I'm not sure if the term means something different here.)
Nix Barnett and Jim Crutchfield's paper "Computational Mechanics of Input-Output Processes: Structured transformations and the ε-transducer"
https://arxiv.org/pdf/1412.2690.pdf
They're minimal in the sense of having the smallest state space needed to generate a given behaviour. (I wonder if they can be defined by a universal property in some category. It kind of seems like they should.)
I've read a little about ε-machines before but not ε-transducers. In general, though, I find Crutchfield's philosophy on the structure of computation really interesting! From a brief skim it looks like ε-transducers are Mealy machines, meaning the output can depend on the input.
I wonder if the ε-machine of an automaton is something like the terminal simulator that is the identity on outputs. I'm curious about how ε-machines play with composition, either of the sequential variety or of the resource sharing variety.
I haven't looked in detail at how much fancier things get for ε-transducers, but I think that for ε-machines, Section II, Equation (6) of this paper https://arxiv.org/pdf/0708.1580.pdf gives a pretty clear picture that there is some universal property in play.
In their setup, you have a stochastic process $\overleftrightarrow{X}$, a joint distribution over a bi-infinite sequence of random variables $\ldots X_{-1} X_0 X_1 \ldots$ corresponding to pasts $\overleftarrow{X}$ and futures $\overrightarrow{X}$. The causal states/ε-machine come from the equivalence relation on past trajectories: $\overleftarrow{x} \sim \overleftarrow{x}'$ iff $P(\overrightarrow{X} \mid \overleftarrow{x}) = P(\overrightarrow{X} \mid \overleftarrow{x}')$, namely if two histories give rise to the same future conditionals. Then, they say that for any other 'prescient rival,' meaning any other compression of histories that doesn't lose any mutual information with respect to futures, the ε-machine has the smallest entropy/statistical complexity. That last bit sounds a lot like: given any other competitor, there is this map.
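The causal-state equivalence is easy to compute by brute force on a small example. This is a sketch of my own, not from the paper: I use the standard "golden mean" process (binary, no two 1s in a row), model it as a small hidden Markov model, and group finite pasts whose conditional next-symbol distributions coincide.

```python
# Sketch (mine, not from the paper): causal states of the golden mean
# process, found by grouping finite pasts whose conditional next-symbol
# distributions coincide.
from itertools import product
from fractions import Fraction

# Hidden states: A (last symbol was 0) emits 0 or 1 with prob 1/2;
# B (last symbol was 1) must emit 0.  T[x][i][j] = P(emit x, go i -> j).
half = Fraction(1, 2)
T = {0: [[half, 0], [1, 0]],   # emitting 0 always lands in A
     1: [[0, half], [0, 0]]}   # only A can emit 1, moving to B
pi = [Fraction(2, 3), Fraction(1, 3)]  # stationary distribution over (A, B)

def filter_state(past):
    """Unnormalized posterior over hidden states given an observed past."""
    v = pi[:]
    for x in past:
        v = [sum(v[i] * T[x][i][j] for i in range(2)) for j in range(2)]
    return v

def next_dist(past):
    """P(next symbol | past), or None if the past has probability zero."""
    v = filter_state(past)
    z = sum(v)
    if z == 0:
        return None
    return tuple(sorted(
        (x, sum(v[i] * T[x][i][j] for i in range(2) for j in range(2)) / z)
        for x in (0, 1)))

# Group all length-4 pasts by their conditional future distribution.
groups = {}
for past in product((0, 1), repeat=4):
    d = next_dist(list(past))
    if d is not None:
        groups.setdefault(d, []).append(past)

# Exactly two causal states emerge: pasts ending in 0 (next symbol is a
# fair coin) and pasts ending in 1 (next symbol is forced to be 0).
print(len(groups))  # 2
for d, pasts in groups.items():
    print(d, len(pasts))
```

Here the equivalence classes are exactly the states of the usual two-state ε-machine for this process, which is the kind of "smallest prescient compression" the paper's Equation (6) is about.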
Yea, I'd be keen on playing around with some simple examples of how composition and epsiloning get along.
Maybe there's a 'versal' property at work. That's like a universal property but without requiring any uniqueness in the map from the versal object to its competitors.
I'm wanting to work on categories of open machines, maybe with Sophie and anyone else who wants to join in. In my usual style I'd like to start by figuring out what categories we're talking about here, exactly.
I would love to join in!
Philip Zucker said:
Not sure this is quite on topic, but I just came across an interesting paper I hadn't seen before. It does have some discussion on open automata . Sheaf Semantics for Concurrent Interacting Objects - Goguen https://pdfs.semanticscholar.org/569a/8c66bac1e8b034b26c77494bbe1dd4e8c88e.pdf
Cool find. Had not seen this Goguen paper before!
What's an open machine? Is it just a machine with inputs and outputs (which can be composed into bigger machines) or is it something else?
What is the difference between an open machine and an open dynamical system? I say none.
Until they've been defined, you're right.
Yea, it seems like there are a number of perspectives floating about for how to define 'open machines,' cospans, operads of wiring diagrams, and more. Then, there are even more knobs to turn within each of those, {structured, decorated}, colored wires, no passing wires for the operad stuff. I'm not super keen on fleshing out all the differences as much as picking a particular question and trying to answer it in one.
Dynamical system defined
Arnold and Avez (1968), as quoted in Perspectives of Nonlinear Dynamics, Vol. 1, E. Atlee Jackson, p. 51.
Let $M$ be a measure preserving manifold, $\mu$ a measure on $M$ defined by a continuous positive density, $f^t$ a one-parameter group of measure-preserving diffeomorphisms. The collection $(M, \mu, f^t)$ is called a classical dynamical system.
I guess we should say that we are considering physically realizable machines, otherwise we will end up focusing on oracles.
That's one of many definitions of dynamical system. As Blake said, there are lots of knobs to turn. Here are three:
1) What's the space of states? Here it's a "measure preserving manifold". That doesn't actually mean anything, so let's correct it: it's a manifold with a Borel measure on it.
2) What is time? Here it's the real numbers.
3) How does time evolution act on states? Here it acts by a one-parameter group of measure-preserving diffeomorphisms.
You can see a definition that tries to cover many choices of knob settings here:
Each of the more interesting choices of knob settings has given rise to a large body of theorems.
For example, sitting by my bed I have a whole book of amazing theorems about this case:
1) What's the space of states? It's the unit interval [0,1].
2) What is time? It's the natural numbers.
3) How does time evolution act on states? It acts by a semigroup of continuous maps.
So, everything is summarized by a continuous map $f \colon [0,1] \to [0,1]$. We can get periodic points, chaos, etc.
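As a quick illustration of this knob setting (my own example, not from the book John mentions): the logistic map at parameter 4 is a continuous map of [0,1] iterated in discrete time, with fixed points and sensitive dependence on initial conditions.

```python
# Illustration (mine): a continuous map f: [0,1] -> [0,1] iterated in
# discrete time.  The logistic map at r = 4 is the classic chaotic example.
def f(x, r=4.0):
    return r * x * (1.0 - x)

def iterate(f, x, n):
    """The discrete-time semigroup action: apply f to x, n times."""
    for _ in range(n):
        x = f(x)
    return x

# Fixed (periodic) point: f(3/4) = 4 * (3/4) * (1/4) = 3/4.
print(iterate(f, 0.75, 10))  # 0.75

# Sensitive dependence: two starting points 1e-9 apart diverge quickly,
# since nearby errors roughly double each step.
a, b = 0.2, 0.2 + 1e-9
print(abs(iterate(f, a, 40) - iterate(f, b, 40)))  # no longer of order 1e-9
```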
@John Baez could you please clarify " semigroup of continuous maps." By continuous maps do you mean flows?
I mean that for each natural number $n$ there is a continuous map $f^n \colon [0,1] \to [0,1]$ and that $f^{m+n} = f^m \circ f^n$.
The word "flow" shows up only when time is parametrized by real numbers:
https://en.wikipedia.org/wiki/Flow_(mathematics)#Formal_definition
That's a very general definition of "flow", but usually people add extra conditions, like the ones they mention but also continuity or smoothness with respect to the time parameter $t$.
@John Baez, do you like working with Julia? I'm writing code to support this "machine."
1) Isn't another way to say point 1 that the Lyapunov exponents tend to be imaginary, i.e., that the Lyapunov multipliers lie on the unit circle? I know about a positive greatest value leading to chaos and a negative greatest value leading to stability.
2) I work with the dynamics of arithmetic, so I'm happy for discrete time. I believe I can prove the unity of all maps and flows. I realize how unlikely that sounds. Oh well, I'm happy to consider a flaw in my proof.
3) I work with the Taylor series of $f$.
The first derivative results in the well known Linearization Theorem.
Let and with
The second derivative
The third derivative