Can standard finite-dimensional linear algebra techniques (particularly vectors and matrices) be generalised to work in arbitrary symmetric monoidal categories? In particular, do they generalise to Rel? See the link below for some more background.
I think this should be a simple question, but I came across an odd coherence problem when working out the details, so I wanted to check that I'm not on a wild goose chase. https://math.stackexchange.com/questions/3608344/linear-algebra-in-rel
James Wood said:
Can standard finite-dimensional linear algebra techniques (particularly vectors and matrices) be generalised to work in arbitrary symmetric monoidal categories? In particular, do they generalise to Rel? See the link below for some more background.
I think this should be a simple question, but I came across an odd coherence problem when working out the details, so I wanted to check that I'm not on a wild goose chase. https://math.stackexchange.com/questions/3608344/linear-algebra-in-rel
Yes. Actually, when you look at stuff like graphical linear algebra, the underlying category is exactly linear relations.
FD-Vect and Rel are extraordinarily similar as categories. Indeed, Rel exhibits many "quantum-like" properties that make it similar to FD-Vect (for instance, you have a Frobenius structure on every object, you have a dagger, you have cups/caps, the monoidal product is not a cartesian product etc). These similarities have been extensively studied, from different points of view.
I'd always taken the underlying category there to be (Set-)FDVectₖ, with its biproduct. I guess this makes it a good place to look at generalising, but I'd still need to actually construct the biproduct for Rel-FDVectₖ. Thanks for the tip.
there where? in Rel?
Note: for these purposes, I don't care about the similarities between FDVect and Rel (though they of course exist). Rel-FDVect is potentially quite different to Set-FDVect, and that's what I'm trying to find out.
Well, the monoidal product in Rel is the cartesian product of Set, but that's not a cartesian product in Rel :slight_smile:
“there” being Graphical Linear Algebra
Oh. I'm quite sure Pawel's graphical linear algebra is over the category of linear relations :slight_smile:
Do you have a reference for that?
Quite recently, along with Fabio Zanasi and @Robin Piedeleu , they also started to look into additive relations, that turn out to be very interesting for many reasons
https://graphicallinearalgebra.net/2015/12/26/27-linear-relations/
IIUC, though, linear relations are still too Set-ish for me. I want to work with rings in Rel, rather than rings in Set, and I doubt this lines up with linear relations.
But yeah, this seems like a good refinement of the question: how far can we generalise Graphical Linear Algebra?
Graphical linear algebra seems to study something different than what you're asking: it starts with a semiring R (implicitly in Set) and gives a graphical language for R-linear relations, i.e., relations closed under the relevant operations of the semiring. This delineates a subcategory of Rel in which you can do R-linear algebra.
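(Concretely: an R-linear relation from Rᵐ to Rⁿ is a sub-semimodule of Rᵐ × Rⁿ, i.e. a subset closed under pointwise addition and the R-action.)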
I suppose you could start with a semiring object in Rel itself but you would lose some of the axioms (namely the ones encoding the fact that the semiring addition and scalar action are maps, that is, are single valued and total, which guarantee that certain diagrams slide past each other nicely).
And, now that I think of it, these axioms allow you to represent matrices in this setting so I'm not sure how you would be doing linear algebra without them. Although, the operations of a Rel semiring object would be lax in the sense of Carboni and Walters' Cartesian bicategories. Perhaps there is still something to say after all...
Do you have an example of such a diagram? I came across my problems when trying to prove associativity of matrix multiplication, which seemed to require single-valuedness. I couldn't think of a counterexample, largely because I can't think of any interesting examples of Rel-semirings.
My guess at the moment would be that vectors still work nicely, but closure is more complicated.
Matrices in graphical linear algebra are represented by diagrams made up of scalars, copying and adding, all going in the same direction (you can find an intro here: https://graphicallinearalgebra.net/2015/06/09/matrices-diagrammatically/). And this is a summary of all the axioms for GLA over a field: https://graphicallinearalgebra.files.wordpress.com/2015/11/ih.gif.
The following law would, for example, fail in general for a semiring (or just monoid) object in Rel:
[image: the bialgebra law relating adding and copying, Δ ∘ ∇ = (∇ ⊗ ∇) ∘ (id ⊗ σ ⊗ id) ∘ (Δ ⊗ Δ)]
As you say, it would be needed to prove associativity of matrix composition, as you would need to push the layer of additions followed by copy past each other to obtain a layer of copy followed by additions instead.
I think this should be part of the definition of a semiring in Rel. The definition contains an extra structure: the comonoid that does copying and discarding, and this structure obeys some homomorphism equations like these.
If you force a monoid in Rel to satisfy these equalities, then it should be a monoid in Set too---they just say that the monoid operation is total and single valued.
Robin Piedeleu said:
If you force a monoid in Rel to satisfy these equalities, then it should be a monoid in Set too---they just say that the monoid operation is total and single valued.
Yes, this is one of the most useful results in Cartesian Bicategories I, if I recall correctly
Even for an otherwise arbitrary comonoid?
It's clear to see for the copy/delete comonoid, but not in general.
Additionally, I don't expect the comonoid to be natural, just that it respects the additive structure (and maybe the multiplicative structure, but I'm not sure).
Yes, basically for a relation f, being a homomorphism wrt copy means that it's single-valued, while being a homomorphism wrt delete means it's total. So if you have both then you have a function.
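In symbols (a sketch, writing Δ for copy and ε for delete): a relation f : A → B is single-valued iff it commutes with copying, and total iff it commutes with deleting:
```math
\Delta_B \circ f = (f \otimes f) \circ \Delta_A, \qquad \varepsilon_B \circ f = \varepsilon_A
```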
Right, so when I don't assume that it's copy/delete, I still have a chance of getting interesting (non-Set) stuff.
In fact, by symmetry it must be possible, unless the laws involving multiplication screw things up. Take the addition to be cocopy/codelete, and the comonoid to be the opposite of some non-trivial addition operator from Set. Then this doesn't really have a Set analogue.
That summary of axioms is useful, though. I guess the point is that GLA works in an arbitrary symmetric monoidal category in which all those laws hold.
How important are Copyingᵒᵖ and Addingᵒᵖ?
(And the backwards scalars?)
James Wood said:
Right, so when I don't assume that it's copy/delete, I still have a chance of getting interesting (non-Set) stuff.
But it is copy/delete if it acts as a homomorphism for any other morphism composable with it!
James Wood said:
How important are Copyingᵒᵖ and Addingᵒᵖ?
Very important since in Rel you don't have a real direction of morphisms, and things can always be flipped!
Fabrizio Genovese said:
James Wood said:
Right, so when I don't assume that it's copy/delete, I still have a chance of getting interesting (non-Set) stuff.
But it is copy/delete if it acts as a homomorphism for any other morphism composable with it!
Right, this was the result I was thinking of, but it's easy enough to stay clear of.
You may be interested in the "Linear Algebra of Programming" https://hackage.haskell.org/package/laop-0.1.0.0
Maybe also my blog post? But it kind of sounds like you're shooting for something more formal http://www.philipzucker.com/linear-relation-algebra-of-circuits-with-hmatrix/
As far as I understand, this is replacing FDVect by Rel, whereas I want to replace Set by Rel in the definition of FDVect.
What do you mean, "replace Set by Rel in the definition of FDVect"? Can you define the category you're hinting at here?
It might be something I know...
In Set, you can state what a ring is, and take modules over such a ring to get the usual notion of module. When you take the full subcategory of finite-dimensional modules, it is well behaved in various ways that I'm interested in. Instead of stating what a ring is in Set, I want a notion of ring in Rel, over which I can take modules and focus on the finite-dimensional ones (finite-dimensional in the sense that elements are finite Rel-⊗s of ring elements, similar to how in Set the elements are finite Set-×s of ring elements; this should correspond to something like the FDVect biproduct). I know that the notion of ring in Rel will contain the extra structure of a commutative comonoid (necessarily trivial in Set), and this comonoid will form a bialgebra with the addition monoid.
You can define a ring object in any category with finite products. The category Rel has finite products. So there's that. That may not be what you want, but it's probably worth figuring out what it amounts to.
I suspect that instead you may want to use the "tensor product" monoidal structure on Rel and define a ring in Rel to be a monoid with respect to this monoidal structure.
For example, an ordinary ring is a monoid in (Ab, ⊗).
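Spelled out, for reference (this is just the standard definition): a monoid in a monoidal category (C, ⊗, I) is an object M with morphisms μ : M ⊗ M → M and η : I → M satisfying, up to the associator and unitors,
```math
\mu \circ (\mu \otimes \mathrm{id}) = \mu \circ (\mathrm{id} \otimes \mu), \qquad \mu \circ (\eta \otimes \mathrm{id}) = \mathrm{id} = \mu \circ (\mathrm{id} \otimes \eta)
```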
Feel free to ask me what the hell I mean by all this!
I'm kinda working from the bottom up, and know that ultimately I want to be talking about predicates on tuples of things, so it makes sense to me to just go straight for the Rel tensor product.
And I have some monoids in (Rel, ⊗) lying around which are going to come in handy eventually.
But yeah, I'm with you so far.
(Interesting question whether I can state Rel-rings as monoids in a category of Rel-abelian groups, but it's going to be difficult to prove that the latter is monoidal)
Okay, so I suggest that a monoid in (Rel, ⊗) might be the best idea of a "ring in Rel".
But I don't think they involve "Rel-abelian groups". In fact I have no idea how to define an abelian group in (Rel, ⊗); I can only do it in a category with finite products, because the group axioms involve duplicating and deleting variables.
The closest thing to a group in a general symmetric monoidal category is a "Hopf monoid".
For this, I assume a commutative comonoid structure, just like with Hopf algebras.
Okay, it sounds like your "Rel-abelian group" might be a commutative and cocommutative Hopf monoid in (Rel, ⊗).
That's the usual way to try to mimic the concept of abelian group in a symmetric monoidal category.
Now you say it, yeah, that sounds right.
I do imagine that the category of these things is monoidal (in Rel, I imagine you can still form integers and lists and stuff), so that gives a candidate definition of a Rel-ring.
Ultimately, I'd want to work it out in simpler terms, because I don't like working with these explicitly constructed tensor products so much.
The category of Hopf monoids in a symmetric monoidal category with finite colimits is always itself symmetric monoidal.
Nice :+1:
To show this, you copy the way people usually define the tensor product of Hopf algebras in (Vect, ⊗).
Though, as an aside, is this commutative Hopf monoids? Because Hopf monoids in Set are just groups, which don't form a monoidal category IIRC.
Or rather, they don't have a tensor product.
Umm, let me see: Hopf monoids in (Set, ×) are just groups... yes, that sounds right. I could be wrong, but I bet their tensor product will work out to be just the cartesian product of groups.
In fact maybe the stuff about "finite colimits" was unnecessary. To form a tensor product of Hopf monoids M and N in a symmetric monoidal category (C, ⊗) you don't need to mod out by any relations. You just form M ⊗ N and show this gets a Hopf monoid structure from those of M and N, right?
In the case of (Set, ×), this is just showing that given groups G and H, G × H gets a group structure.
For Abelian groups, the tensor product G ⊗ H is something quite complicated (a big quotient of the free group over ×-pairs of the underlying sets). But yeah, I guess that it forms an Abelian group again comes down to the fact that the individual pairs do.
Hence I don't like working with it. :stuck_out_tongue:
But it's good for design.
I'm saying if you've got Hopf monoids in Set coming from abelian groups G and H, their tensor product - the usual tensor product of Hopf monoids - corresponds to the abelian group G × H, not the abelian group G ⊗ H.
So yeah, you may not want to use the tensor product of Hopf monoids!
Yeah, the criterion for all of this is that I'm really thinking in terms of an arbitrary symmetric monoidal category (C, ⊗) (which may have extra structure/properties as long as both Set and Rel have them), and when I specialise this to C = Set and ⊗ = ×, I get back normal linear algebra.
Right.
As a mildly interesting data point, I've worked a lot of this through in (Poset, ×) before (though I bet people take this for granted all the time).
So what do you get there?
Normal linear algebra, but everything is monotonic with respect to the order in the way you'd expect.
Even multiplication...?
I was only working with semirings, semimodules, &c, so I didn't even have to consider negation being contravariant.
I didn't really do it as “in Poset”; I just assumed that every basic operation was monotonic, and got out that every derived operation is monotonic too. Maybe linear algebra in Poset is slightly different; I'm not sure.
After all, working in semirings doesn't preclude negative elements. Maybe semirings in Poset do, though.
Then it becomes interesting to ask whether there even are non-trivial rings in Poset.
Right, if + and × are monotonic, but - is contravariantly monotonic, then x ≤ y gives both -y ≤ -x and (-1)×x ≤ (-1)×y, so -x ≤ -y. So -x = -y, and therefore x = y. If - is covariantly monotonic, then there's probably some other silly derivation you can do. I can't think of any non-trivial models, anyway.
It could be better to look at rigs, which are like rings but where addition gives just a commutative monoid, not an abelian group. For example ℕ is a rig.
I do a lot of work with rigs.
Ah, I actually want rigs. I just say rings because mathematicians usually find them more familiar.
Also rig = semiring in my dialect.
Okay. If you use rigs you don't need to worry about negation being order-reversing.
I say "rig" in part because there's an annoying disanalogy
group : semigroup ?? ring : semiring
I'm one of these **** who doesn't believe in semigroups. :wink:
The people I speak to (and audiences I write for) typically want a reminder of the definition, so I don't need a precise word.
Also, if not “semimodule”, what word do you use there? I guess just “module” works, in a sense.
obligatory nlab link https://ncatlab.org/nlab/show/matrix+calculus
haha re: rig rather than ring—i spent a pretty long time thinking that Rel on finite sets was equivalent to FinVect over F₂ before realizing that it was actually over the boolean rig 2 (classically) :sweat_smile:
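(concretely: over F₂ you have 1 + 1 = 0, so two parallel witnesses cancel in a matrix product, whereas over the boolean rig 1 ∨ 1 = 1, which is what relation composition actually does)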
OH lmao i misread what this was about as being, like, interpreting stuff from linear algebra into Rel instead of FinVect, rather than objects internal to Rel instead of Set—sorry, link is not so relevant then i think
James Wood said:
Also, if not “semimodule”, what word do you use there? I guess just “module” works, in a sense.
I speak of "modules" of rigs, but I probably would define them if I used that notion in a paper, since it's not really standard yet.
sarahzrf said:
haha re: rig rather than ring—i spent a pretty long time thinking that Rel on finite sets was equivalent to FinVect over F₂ before realizing that it was actually over the boolean rig 2 (classically) :sweat_smile:
Yes, that's funny. But it's not a bad confusion to go through, because turning a Boolean algebra into a Boolean ring and vice versa uses exactly the fact that we can define the rig operations in ({0,1}, ∨, ∧) in terms of the ring operations in F₂ and vice versa.
(Maybe this is obvious to you by now, but I remember being excited by it when I first learned about Boolean rings.)
It's sort of weird how the fact that there are exactly 2 rig structures on a 2-element set, up to isomorphism, has such cosmic implications.
sarahzrf said:
obligatory nlab link https://ncatlab.org/nlab/show/matrix+calculus
Don't worry about misreading; the way I stated the problem must have been a little unclear.
I think this link is still helpful. In particular, it points out that if I can show that Rel-Vect has biproducts, then I'm well on my way. What I don't understand from this page, though, is how it talks about encoding morphisms without talking about closure. I guess these are, like, metatheoretic/external matrices, and I'd still need to show that these structures can be internalised and that they play well with the tensor product.
I would say they are just ordinary/non-internal matrices. You don't want to go around internalizing all of math into Rel, it's just too alien.
I don't think I mean “internalise” too formally here (but maybe I do), but I mean to say that I want matrices to be the witness of the monoidal closed structure of Rel-FDVect, which requires these matrices to live inside Rel, rather than the ambient Set in which that article is written.
I suppose “internal Hom” is why I think of “internalise”.
Another thought: maybe a nice representation of external homs is actually enough for what I'm doing, but I have a strong feeling that internal homs will look exactly the same, given that the objects of (Set, ×) and (Rel, ⊗) are the same.
yeah, they are external matrices
in general, any morphism A + B + C → X × Y × Z × W in any category can be broken down into a 4×3 matrix
because a morphism out of A + B + C is the same as a triple of morphisms out of the summands, and a morphism into X × Y × Z × W is the same as a quadruple of morphisms into the factors, so a morphism from the former to the latter is the same as a morphism A → X, A → Y, ... B → Z, ... C → W, and you can handily put those into a matrix
then if you have a morphism X + Y + Z + W → J × K that's a 2×4 matrix, and if there's a canonical identification of X × Y × Z × W with X + Y + Z + W, then you can compose them, and it'll be given by matrix multiplication
where the sums of morphisms are meaningful because finite biproducts gives a canonical enrichment over cmon
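so composition comes out as the usual matrix formula (a sketch, with indices as above, and Σ the addition coming from the CMon enrichment):
```math
(N \circ M)_{kj} = \sum_{i} N_{ki} \circ M_{ij}
```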
in FinVect, a basis is exactly the equipment of an object as a biproduct of copies of the base field k, so morphisms between spaces with bases can be broken into matrices of endomorphisms of k, and the endomorphisms of k in FinVect correspond precisely to scalars—hence, linear algebra :sparkles:
Cool, this is a really nice way to break it down. Thanks!
btw, if you're gonna be working a bunch with Rel
a couple of things which you might already know but which i'll share just because ive found them really useful & if you dont already know then theyre good to keep in mind:
Rel is equivalent to the Kleisli category of the covariant powerset monad on Set
covariant powerset monad is powerset on objects and has unit x ↦ {x} and multiplication ⋃ : P(P(A)) → P(A).
a Rel morphism A → B corresponds to the Set morphism A → P(B) which sends each element of A to the set of everything it's related to
in line with this, u can think of a relation R : A → B as a "non-deterministic and potentially partial function A → B"—"a R b" means "R(a) can be b", and then this works out because we have "(R ∘ S)(a) can be b" iff "R(S(a)) can be b for some S(a)"
i've often found relation composition a lot easier to think about in these terms personally :)
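a minimal Haskell sketch of that last point (using Data.Set for finite powersets; the names here are made up):
```haskell
import qualified Data.Set as Set

-- A relation A -> B as a Kleisli arrow of the powerset monad:
-- each a is sent to the set of everything it's related to.
type Rel a b = a -> Set.Set b

-- Kleisli composition is exactly relation composition:
-- a relates to c via (r `andThen` s) iff a R b and b S c for some b.
andThen :: Ord c => Rel a b -> Rel b c -> Rel a c
andThen r s = \a -> Set.unions [ s b | b <- Set.toList (r a) ]
```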
sarahzrf said:
in general, any morphism A + B + C → X × Y × Z × W in any category can be broken down into a 4×3 matrix, because a morphism out of A + B + C is the same as a triple of morphisms out of the summands, and a morphism into X × Y × Z × W is the same as a quadruple of morphisms into the factors, so a morphism from the former to the latter is the same as a morphism A → X, A → Y, ... B → Z, ... C → W, and you can handily put those into a matrix.
I didn't see you reach the punchline here, so I'll give it: in Rel products are the same as coproducts, so this simplifies! We can call the morphism you wrote down A + B + C → X + Y + Z + W, and now these morphisms are easier to compose.
You did mention that in Vect products are the same as coproducts.
What's going on is that linear maps between vector spaces are linear maps between free modules over a field, while relations are linear maps between free modules over a rig, namely the boolean rig {0,1}.
In any category of free modules over a rig, finite products are the same as finite coproducts... so we're doing matrix algebra.
A relation is a matrix of booleans.
of truth values :weary:
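a quick sketch of that in Haskell, just to make the rig explicit (hypothetical naming; composition of relations = matrix multiplication over (Bool, ∨, ∧)):
```haskell
import Data.List (transpose)

-- A relation between finite sets as a matrix over the boolean rig:
-- entry (i, j) says whether the i-th element is related to the j-th.
type BMat = [[Bool]]

-- Matrix multiplication with (||) as addition and (&&) as multiplication;
-- this is exactly composition of the corresponding relations.
bCompose :: BMat -> BMat -> BMat
bCompose m n = [ [ or (zipWith (&&) row col) | col <- transpose n ] | row <- m ]
```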
actually, do the relations between infinite sets correspond to linear things?
sarahzrf said:
Rel is equivalent to the Kleisli category of the covariant powerset monad on Set
I remember rediscovering this one myself. There's also some relationship between (Set-)FDVect and the Kleisli category of the monad of formal linear combinations (the one that gives rise to free vector spaces from a basis).
there is indeed!
that's the monad of the adjunction between Set and FinVect
that adjunction is monadic, i think? actually i'm not 100% sure...
but if it is, that means that FinVect is equivalent to the eilenberg-moore category, of which the kleisli category is canonically equivalent to a subcategory
well, actually the kleisli category is initial among adjunctions regardless, so even if the adjunction isn't monadic, you still have a unique functor to FinVect factoring the adjunctions
also nvm re: the relations between infinite sets question—i realized the issue i was thinking of was—if A is infinite, then it no longer corresponds to the Ω-module Ω^A, because that's no longer the free one on A, exactly like w/ free real vector spaces on infinite bases
Aah, that's how EM categories come about...
perhaps!
the adjunctions between Set and most kinds of category of "algebraic-y object" are monadic
then i had second thoughts about blindly saying so for FinVect since, idk, scaling by R sounds sorta infinitary, so i'd want to stop and think about it... but for example, the adjunction thru Mon or Grp is monadic
maybe i should learn the monadicity theorem :thinking:
I've checked that it's monadic before, and it's largely for the same reasons as why the covariant powerset functor is monadic.
You can think of these free vector spaces as weighted finite multisets.
wait, which monadic adjunction is covariant powerset part of?
Ah, sorry, just that it is a monad.
oh! thats not what i meant
an adjunction is monadic if it is equivalent to the adjunction thru the EM category of its monad
Aah, okay, I have no idea, then. :silence:
idea is like: you get a monad (and, of course, a comonad) from any adjunction, but that's in general lossy—there are most likely many adjunctions that produce the same monad
but forming the kleisli category of the monad and forming the eilenberg-moore category of the monad are the two canonical ways of, like, "re-slicing" any monad into an adjunction
and in fact the kleisli adjunction is initial: it factors thru every other adjunction producing the monad, uniquely
and the EM adjunction is terminal: every other adjunction producing the monad factors thru it, uniquely
so a monadic adjunction is one which was already terminal among the adjunctions that had the same monad
although phrasing it in terms of "terminal" is maybe less informative
but here's a nice example you can extract from this:
so the free monoid on a set S is lists of elements from S under concatenation. therefore, composing that with the forgetful functor to get the monad on Set gives you just... the list monad
but i said just earlier that the adjunction thru Mon was monadic
meaning that it was equivalent to the EM adjunction for its monad, the list monad
and, dropping the fact that the adjunctions themselves are equivalent, that means that Mon is equivalent to the EM category for the list monad!
a monoid is a list monad algebra
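in Haskell terms (a sketch): the algebra structure a Monoid induces on lists is just mconcat, and the algebra laws are the monoid laws in disguise:
```haskell
-- Every Monoid m gives an algebra of the list monad: evaluate a
-- formal product (a list) to an actual product in m.
alg :: Monoid m => [m] -> m
alg = mconcat

-- The algebra laws (sketch):
--   alg . pure   == id               -- alg [x] = x
--   alg . concat == alg . map alg    -- flatten-then-evaluate = evaluate inside first
```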
Aah, pretty nice, and with the simple example it makes sense.
more broadly, there's sort of a notion that you can encode a theory of some kind, or structure of some kind, as a monad which sends an object to the object of formal constructions over it in the theory
and then an algebra for such a monad is a morphism which gives interpretation to the formal constructions in a way which is consistent with the laws, so a model of the theory
then, if you squint at any adjunction, you can wonder whether maybe it's doing something like... "the left adjoint makes a free object out of all of the formal constructions, and the right adjoint forgets the structure", in which case the adjunction's monad should summarize the formal constructions in question
and then you should be able to recover everything as the algebras
that situation is when the adjunction is monadic
one example of an adjunction that is not monadic is Disc ⊣ U, where Disc : Set → Top forms the discrete space and U : Top → Set forgets the topology (its monad is the identity monad on Set, whose EM category is Set itself, not Top)
sarahzrf said:
that's the monad of the adjunction between Set and FinVect
There's no adjunction between Set and FinVect... well, at least, not one where the right adjoint sends a finite-dimensional vector space to its underlying set... well, at least not if the field you're working over is infinite!
There's an adjunction between Set and Vect where the right adjoint sends a vector space to its underlying set.
sorry, yeah, i meant Vect
or rather, i meant FinVect, realized later that actually it needed to be Vect, and forgot to correct myself
One reason we should all be working with finite fields is that then you get an adjunction between FinSet and FinVect and never need to think about infinity. :slight_smile:
Reid Barton said:
I would say they are just ordinary/non-internal matrices. You don't want to go around internalizing all of math into Rel, it's just too alien.
Why not? Alien things can be very interesting! Of course internalizing all of math would take quite a while... so you have to judiciously choose things that seem likely to have a nice payoff.