Category Theory
Zulip Server
Archive

You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.


Stream: theory: applied category theory

Topic: the big picture


view this post on Zulip Daniel Geisler (Apr 30 2020 at 22:04):

Now that I have a bit of an idea of what open dynamical systems are about, I'm seeing @John Baez's recent work in a new context. Understanding the subtleties of reaction networks and open dynamical systems would be useful in taking on climate change and other ecological challenges.

view this post on Zulip Fabrizio Genovese (Apr 30 2020 at 22:13):

Yes, I think that was John's goal since the very beginning, studying the behavior of networks to try fixing the world :slight_smile:

view this post on Zulip John Baez (Apr 30 2020 at 22:25):

Yes, exactly. I made some progress on the first part.

view this post on Zulip Joe Moeller (Apr 30 2020 at 23:25):

I think the only sense in which we can say that ACT is a "new field" is by emphasizing that we're now open to approaching sciences other than cs. Largely what I've seen is more cs. It's fine and good to include cs; it's been very fruitful, but it's also imperative that we attempt to do good with all this.

view this post on Zulip Fabrizio Genovese (Apr 30 2020 at 23:27):

In part I agree, but I'd like to see more CT applied to CS as well. Even in CS categorical techniques remain quite marginal, unfortunately

view this post on Zulip Fabrizio Genovese (Apr 30 2020 at 23:27):

Also, I think you are mainly thinking about functional programming, but there are entire fields of CS that are very "hic sunt leones" for us :slight_smile:

view this post on Zulip Joe Moeller (Apr 30 2020 at 23:29):

Yes, I was trying to carefully not suggest that we do less cs, simply that we do more ecology etc. :slight_smile:

view this post on Zulip Fabrizio Genovese (Apr 30 2020 at 23:32):

Ecology would indeed be nice. But obviously the further you move away from math, the less well-behaved things become and the better the tools you need to say something useful; that makes venturing into other fields (e.g. biology, or even sociology) more ambitious. As an old proverb says, "the more you climb the ladder, the more your ass is shown" :P

view this post on Zulip Joe Moeller (Apr 30 2020 at 23:36):

Right. And ultimately, the solution to e.g. climate change will be far more social than scientific.

view this post on Zulip Todd Trimble (May 01 2020 at 00:07):

Social but especially economic. Large corporations have to see the "green" in both senses of the word. It could happen!

view this post on Zulip John Baez (May 01 2020 at 00:37):

In the first ACT conference, ACT2018, I tried to exclude computer science, to give other applications of category theory room to grow. But in my talk there, I pointed out that every branch of science and engineering uses computers now, so the right route may be to blend computer science with category theory and then apply the resulting blend to everything else.

view this post on Zulip Jules Hedges (May 01 2020 at 10:37):

John Baez said:

In the first ACT conference, ACT2018, I tried to exclude computer science, to give other applications of category theory room to grow. But in my talk there, I pointed out that every branch of science and engineering uses computers now, so the right route may be to blend computer science with category theory and then apply the resulting blend to everything else.

You're talking about scientific computing, which is hardly at all related to computer science. Scientific computing is mostly done in departments like physics, chemistry, biology, engineering, economics, ........ And I think that the world changing stuff for us will be to target scientific computing, and bypass computer science

view this post on Zulip Morgan Rogers (he/him) (May 01 2020 at 11:00):

I would have thought there's plenty of computer science being put into practice under the bonnet of scientific computing. I suppose you mean that in order to achieve a meaningful impact we should be targeting work at physicists, chemists etc. rather than computer scientists?

view this post on Zulip Jules Hedges (May 01 2020 at 11:03):

Yep. Remember that most scientific computing is done in Fortran, which was state of the art in computer science about 70 years ago

view this post on Zulip Reid Barton (May 01 2020 at 11:05):

I thought it was done using proprietary vendor-specific GPU libraries...

view this post on Zulip Morgan Rogers (he/him) (May 01 2020 at 11:16):

eg Matlab

view this post on Zulip Nathanael Arkor (May 01 2020 at 11:45):

I think both approaches are still prevalent.

view this post on Zulip Nathaniel Virgo (May 01 2020 at 14:30):

Python is pretty important as well these days for scientific computing.

view this post on Zulip Evan Patterson (May 01 2020 at 18:14):

Yeah, MATLAB still has its adherents among the old-timers but times are changing. More and more scientific computing is being done using open source software, like Python, R, or Julia. I figure that in another generation or two MATLAB will be basically dead, unless it finds a way to reinvent itself.

view this post on Zulip John Baez (May 02 2020 at 02:04):

I don't think you can really "target scientific computing and bypass computer science" and do something wonderful without creating some new kind of computer science.

view this post on Zulip Bob Haugen (Jun 19 2020 at 21:12):

John Baez said:

In the first ACT conference, ACT2018, I tried to exclude computer science, to give other applications of category theory room to grow. But in my talk there, I pointed out that every branch of science and engineering uses computers now, so the right route may be to blend computer science with category theory and then apply the resulting blend to everything else.

How would you handle manufacturing supply chains? They do use a lot of computers, but you can also analyze them in terms of resource flows, or input-process-output models, or Petri nets.

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:08):

According to the arbitrary academic boundaries we have been given, this is operations research and not really anything to do with computer science

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:09):

In a better world we'd have the proper cross-discipline collaborations you need to properly study something like that

view this post on Zulip Bob Haugen (Jun 19 2020 at 22:18):

Jules Hedges said:

According to the arbitrary academic boundaries we have been given, this is operations research and not really anything to do with computer science

Can you apply category theory to operations research? Or, why not?

(P.S. At least it would get you away from computer science for a bit...)

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:19):

This is something I'm thinking about. Some things work out, but anything to do with optimisation hasn't really worked out for me so far

view this post on Zulip Bob Haugen (Jun 19 2020 at 22:24):

Jules Hedges said:

...anything to do with optimisation hasn't really worked out for me so far

Not me either. Optimization in manufacturing and supply chains is often way too flaky for practical application. Back in the day, an optimization software company that I know piloted their product in a manufacturing plant I know and got thrown out because every day the optimizing scheduler would reschedule everything. Simpler rules of thumb like sequence synchronization were almost as good and did not drive people crazy.

view this post on Zulip Bob Haugen (Jun 19 2020 at 22:34):

These folks, https://www.organicvalley.coop/about-us/organic-food-co-op/, who are 5 miles from me, use some linear-programming-related software for coordinating milk supply chains, which are insanely complex, but it's not as fiddly and they seem to be able to make it work.

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:41):

Huhhh, why is it always milk? I started thinking a bunch about linear programming for supply chains exactly because my colleague Viktor Winschel was working with someone doing milk supply chain optimisation in Germany

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:42):

It's clearly a setting where a very tiny improvement would have a massive impact

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:45):

I built a categorical setting for describing optimisation problems compositionally (using decorated cospans) and I was pretty pleased with myself. Then I realised that argmax isn't functorial - even for linear optimisation - and then pretty much gave up

view this post on Zulip Jules Hedges (Jun 19 2020 at 22:46):

It's just a fact, the solution of an optimisation problem doesn't arise from solutions to parts of the problem. That seems to be something that's true independently of any particular way of formalising it

view this post on Zulip Bob Haugen (Jun 19 2020 at 23:42):

Jules Hedges said:

Huhhh, why is it always milk? I started thinking a bunch about linear programming for supply chains exactly because my colleague Viktor Winschel was working with someone doing milk supply chain optimisation in Germany

Some of the complications in milk supply chains: for one thing, milk is perishable. For another, the supply needs to be picked up at lots of dairy farms and then transported to processors in compatible locations, which have limited capacity and working hours. And then the demand is limited and variable and has different prices (e.g. organic milk vs non-organic milk, which is cheaper but absorbs some of the over-supply).

So it's an interesting problem and the solutions are worth a lot of money.
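
A minimal sketch of the kind of linear program involved, with invented numbers (two farms whose perishable supply must all be shipped, two processors with capacity limits, transport costs to minimise); scipy is assumed to be available:

    import numpy as np
    from scipy.optimize import linprog

    # Invented toy data: 2 farms, 2 processing plants.
    supply = [60, 40]            # milk available at each farm (tonnes/day), all must be shipped
    capacity = [70, 50]          # processing capacity at each plant (tonnes/day)
    cost = np.array([[2.0, 3.5],     # transport cost per tonne, farm i -> plant j
                     [4.0, 1.5]])

    # Variables x_ij = tonnes shipped from farm i to plant j, flattened row-major.
    c = cost.flatten()

    # Perishability: each farm's milk must all go somewhere: sum_j x_ij = supply_i
    A_eq = np.array([[1, 1, 0, 0],
                     [0, 0, 1, 1]])
    # Capacity: each plant takes at most its capacity: sum_i x_ij <= capacity_j
    A_ub = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1]])

    res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply, bounds=(0, None))
    print(res.x.reshape(2, 2), res.fun)   # optimal shipments and total cost (180.0 here)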

view this post on Zulip John Baez (Jun 20 2020 at 01:54):

I don't know enough about manufacturing supply chains to say anything profound about them. I will however mention that PERT charts, widely used in project management, can be understood using enriched copresheaves.

view this post on Zulip John Baez (Jun 20 2020 at 03:31):

This math is connected to tropical algebra, and people use a lot of tropical algebra in railway timetable optimization - in fact there was a conference on that in Birmingham.
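
A tiny illustration of the tropical (max-plus) viewpoint, with an invented four-task PERT chart: replacing (+, ×) with (max, +), iterated matrix "multiplication" computes longest paths, i.e. the critical path:

    import numpy as np

    NEG = -np.inf    # the tropical "zero"

    def maxplus_matmul(A, B):
        # (A (x) B)[i, j] = max_k (A[i, k] + B[k, j]) -- the max-plus product
        n, p = A.shape[0], B.shape[1]
        C = np.full((n, p), NEG)
        for i in range(n):
            for j in range(p):
                C[i, j] = np.max(A[i, :] + B[:, j])
        return C

    # Invented PERT chart on 4 tasks: durations on edges 0->1 (3), 0->2 (2), 1->3 (4), 2->3 (6).
    W = np.full((4, 4), NEG)
    np.fill_diagonal(W, 0.0)                  # staying put costs nothing
    W[0, 1], W[0, 2], W[1, 3], W[2, 3] = 3, 2, 4, 6

    L = W
    for _ in range(3):                        # iterate enough times to cover all paths
        L = maxplus_matmul(L, W)
    print(L[0, 3])                            # 8.0: critical path 0 -> 2 -> 3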

view this post on Zulip John Baez (Jun 20 2020 at 03:35):

My work on network models could also be seen as touching on operations research, though maybe it'd be called something else. I wrote 9 blog articles on this here:

view this post on Zulip Morgan Rogers (he/him) (Jun 20 2020 at 09:39):

Jules Hedges said:

It's just a fact, the solution of an optimisation problem doesn't arise from solutions to parts of the problem. That seems to be something that's true independently of any particular way of formalising it

I find this to be a counter-intuitive statement. If I imagine a classical "shortest path on a manifold" optimization problem, any optimal solution has got to be locally optimal, which is to say geodesic, so in this sense it is composed of optimal parts of smaller problems. If I do this in a discrete graph setting instead, I can see how gluing in an edge could discontinuously change the global optimal path, but on the other hand there are some invariants such as cohomology that detect the possibility of that change (if I glue in an edge that reduces the distance between two points, I must have introduced a new loop into the graph, which is detected by a change in the first cohomology group, for example). Cohomology is functorial, and for finitely presented problems it's also relatively well-behaved. Also, for the simplest composition where graphs equipped with a source and target vertex are glued source-to-target, the shortest route in the composite graph is again constructed from the optimal solutions to the individual graphs...
So where do the obstacles to compositionality that you're alluding to come from?

view this post on Zulip (=_=) (Jun 20 2020 at 10:14):

[Mod] Morgan Rogers said:

If I imagine a classical "shortest path on a manifold" optimization problem, any optimal solution has got to be locally optimal, which is to say geodesic, so in this sense it is composed of optimal parts of smaller problems.

This is, more or less, a description of the hill climbing algorithm, which is known to get stuck on local optima. It is true that any optimal solution to the shortest path problem has to be locally optimal, but coming up with the trajectory for that shortest path is not intuitive.

For a classic example, consider the shortest path for an intercontinental flight, say, London to Sydney. The path is locally geodesic, but on a flat map, the resulting path looks like a parabola. If you try to piece together the shortest path using connecting flights, say London->Dubai->Sydney, you'd not arrive at the actual globally optimal solution. The obstruction is the curvature of the Earth, or the curvature of the manifold in general.
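
A quick numeric check of this, using rough airport coordinates and a spherical-Earth (haversine) distance; the numbers are only approximate:

    from math import radians, sin, cos, asin, sqrt

    def haversine(p, q, R=6371.0):
        # great-circle distance in km between (lat, lon) points on a sphere of radius R
        lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * R * asin(sqrt(a))

    london = (51.47, -0.46)     # Heathrow, roughly
    dubai = (25.25, 55.36)
    sydney = (-33.95, 151.18)

    direct = haversine(london, sydney)
    via_dubai = haversine(london, dubai) + haversine(dubai, sydney)
    print(round(direct), round(via_dubai))   # roughly 17000 vs 17500 km: the stopover route is longer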

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:35):

Right.... I tried to lead other people down the route of thinking about obstructions to compositionality like this (I gave up on doing it myself, since I can't get my head around cohomology at all)

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:39):

In my particular case, I had a setup where composing morphisms/diagrams corresponded to pointwise addition of objective functions, plus variable sharing. (I wrote about it somewhere on here, I can dig out that thread if you're interested). In the case that f and g have disjoint domains then the optimum of f + g is indeed the optimum of f together with the optimum of g. But as soon as they share an input, the optima of f + g can be completely disjoint from the optima of f and g individually

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:40):

E.g. f(x) = -x^2 (optimum x = 0) and g(x) = -(x - 1)^2 (optimum x = 1); the sum has optimum x = 1/2
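
The same numbers, checked mechanically on a coarse grid (a throwaway sketch, not the categorical setup itself):

    import numpy as np

    xs = np.arange(-200, 301) / 100.0    # grid from -2.00 to 3.00 in steps of 0.01
    f = -xs ** 2                         # optimum at x = 0
    g = -(xs - 1) ** 2                   # optimum at x = 1

    argmax = lambda h: xs[np.argmax(h)]
    print(argmax(f), argmax(g), argmax(f + g))   # 0.0 1.0 0.5 -- the optimum of f+g is neither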

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:41):

And part of the reason I tried to use + for composition is that I know max and + play nicely with each other

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:41):

But not that nicely

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:46):

argmax is just horrible all round, it's not even continuous (but if you restrict it to linear functions I think it's something like upper-semicontinuous, which is ok)
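
A one-liner showing the discontinuity, for a hypothetical family f_t(x) = t*x on [0, 1]: the objective varies continuously in t, but the maximiser jumps from 0 to 1 as t crosses 0 (at t = 0 every point is a maximiser, which is the correspondence/hemicontinuity situation):

    import numpy as np

    xs = np.linspace(0.0, 1.0, 101)
    for t in (-0.01, 0.0, 0.01):
        print(t, xs[np.argmax(t * xs)])   # argmax of f_t(x) = t*x over [0, 1]
    # -0.01 -> 0.0, 0.0 -> 0.0 (ties broken at the first grid point), 0.01 -> 1.0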

view this post on Zulip Jules Hedges (Jun 20 2020 at 10:53):

So one of the things I'm thinking about at the moment is to come up with a compositional structure on linear programs which is (1) useful for specifying linear programs in practice, and (2) makes solving functorial in some reasonable sense

view this post on Zulip Bob Haugen (Jun 20 2020 at 11:20):

Lots of interesting readings! I also want to give credit to "Seven Sketches in Compositionality" by @Brendan Fong and @David Spivak for first tipping me off that category theory would be useful for composing supply chains and other economic networks. Also introduced us to string diagrams, which we have adopted as a visual language. https://github.com/valueflows/valueflows/issues/507

view this post on Zulip (=_=) (Jun 20 2020 at 12:31):

Jules Hedges said:

In my particular case, I had a setup where composing morphisms/diagrams corresponded to pointwise addition of objective functions, plus variable sharing. (I wrote about it somewhere on here, I can dig out that thread if you're interested).

For anyone who's interested, the topic was #practice: applied ct > measuring non-compositionality and the comment where this was described is here.

view this post on Zulip (=_=) (Jun 20 2020 at 12:33):

Jules Hedges said:

Right.... I tried to lead other people down the route of thinking about obstructions to compositionality like this (I gave up on doing it myself, since I can't get my head around cohomology at all)

Right... don't feel so bad. This is why mainstream economics is terrible as well: they can't get their head around cohomology.

view this post on Zulip (=_=) (Jun 20 2020 at 12:47):

Jules Hedges said:

So one of the things I'm thinking about at the moment is to come up with a compositional structure on linear programs which is (1) useful for specifying linear programs in practice, and (2) makes solving functorial in some reasonable sense

So what they did in mainstream economics was to restrict everything to linear (and maybe quadratic) programs. For example, they impose the "law" of demand that market demand curves are always downward-sloping (i.e. monotonically decreasing), which immediately excludes any polynomial functions of degree >2 from being a market demand function, to ensure that there is a unique equilibrium (the point at which demand intersects supply).

However, this doesn't work even when you have rational utility-maximising consumers, by the SMD theorem, which basically says any polynomial function can arise as a market demand function. Which is why I said "impose": the "law" has to be by fiat, because mathematical reality will not comply with the mainstream economic theory.

But mainstream economics plods on regardless, because the SMD theorem implies the non-uniqueness of market equilibrium, which means that economic policy becomes debatable, and ain't nobody got time for that apparently. This is despite the fact that the theorem was proven by three mainstream mathematical economists. :shrug:
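
A toy numeric illustration of why the monotonicity assumption buys uniqueness (both demand curves are invented; supply is simply q = p, and equilibria are the prices where demand meets supply):

    import numpy as np

    # Downward-sloping demand:   D1(p) = 2 - p
    # Non-monotone cubic demand: D2(p) = 1.5 - 3.6 p + 3.6 p^2 - 0.8 p^3
    # Equilibria are the real roots of D(p) - p in the price range [0, 3].

    def equilibria(coeffs_high_to_low):
        roots = np.roots(coeffs_high_to_low)
        return sorted(round(r.real, 6) for r in roots
                      if abs(r.imag) < 1e-9 and 0 <= r.real <= 3)

    print(equilibria([-2.0, 2.0]))             # D1(p) - p = 2 - 2p -> [1.0]: unique equilibrium
    print(equilibria([-0.8, 3.6, -4.6, 1.5]))  # D2(p) - p          -> [0.5, 1.5, 2.5]: three equilibria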

view this post on Zulip (=_=) (Jun 20 2020 at 13:57):

Re. this tweet by Fabrizio: well... it kinda needs more work than that.

@_julesh_ The most reasonable starting point then would be "complexity as cohomology".

- Fabrizio Romano Genovese (@fabgenovese)

view this post on Zulip (=_=) (Jun 20 2020 at 14:23):

To start with, cohomology groups H^• := ker(d_•) / im(d_•) are usually defined as quotients of classes of things, and these classes tend to come up as kernels and images of some operators d_i : C_i → C_{i+1} (which square to zero, i.e. d_{i+1} ∘ d_i = 0) connecting some sequence (a chain complex) of nice objects C_i (usually abelian groups or modules).
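
For a graph this is very concrete. A minimal sketch on a small invented graph, where the only operator is d_0 : (functions on vertices) -> (functions on edges), (d_0 f)(u -> v) = f(v) - f(u); then dim H^0 counts connected components and dim H^1 counts independent cycles:

    import numpy as np

    # Invented graph: vertices 0..3, a triangle on {0, 1, 2} plus a pendant edge 2 -> 3.
    edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
    V, E = 4, len(edges)

    # d_0 : C^0 -> C^1, (d_0 f)(u, v) = f(v) - f(u)
    d0 = np.zeros((E, V))
    for i, (u, v) in enumerate(edges):
        d0[i, u], d0[i, v] = -1.0, 1.0

    rank = np.linalg.matrix_rank(d0)
    dim_H0 = V - rank    # ker d_0: locally constant functions, one dimension per component
    dim_H1 = E - rank    # coker d_0: one dimension per independent cycle
    print(dim_H0, dim_H1)   # 1 1 -- connected, with a single cycle (the triangle)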

view this post on Zulip (=_=) (Jun 20 2020 at 14:24):

It's not immediately clear to me how you'd define these things in complexity theory, since all you have are complexity classes, which are classes of problems that can be solved using mechanisms with some fixed property, but no maps that can serve as the d_i's.

view this post on Zulip (=_=) (Jun 20 2020 at 14:24):

What I think could be helpful is to try not to squeeze complexity theory into homological algebra, but to see how to define ∞-categorical objects that would encode the homotopy-theoretic information in complexity classes. John Baez has talked about how homological algebra represents a loss of information here and here.

view this post on Zulip (=_=) (Jun 20 2020 at 14:25):

I think a hint of why this would be useful is how Jules has found that there's an obstruction to compositionality in the form of laxness. This suggests the presence of homotopical information that should be investigated further.

view this post on Zulip David Tanzer (Jun 20 2020 at 19:37):

Found this reference, which gives a crystal clear introduction:

Abstract. We provide an introduction to graph homology and cohomology, giving the definitions of four homology and cohomology theories of graphs, and proving some results on the relation between the combinatorial and topological properties of graphs and their homology. The only required background is undergraduate algebra.

view this post on Zulip Morgan Rogers (he/him) (Jun 20 2020 at 21:28):

Jules Hedges said:

An example where functoriality fails is reachability in open graphs, where F(f) is the pairs of boundary vertices (x, y) where y is reachable from x in f (say, directed graphs for now). Composition is by gluing boundary vertices. Then you can make an example where z is reachable from x in f;g, but there is no boundary y with y reachable from x and z reachable from y. Clue: you need to loop back and cross the boundary 3 times

Ah this example was what I was looking for, I didn't realise this had already been discussed. So it's not even a matter of "conditional optimization" (optimizing on each piece conditionally on passing through the intermediate set of vertices at a particular vertex, then optimizing over the boundary); we can have... squiggles! And things can be arbitrarily bad on this front, with many many squiggles.
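
A concrete instance of the quoted example, spelled out in a throwaway script (vertex names invented; the path in the composite crosses the shared boundary three times):

    # Open graphs f and g, glued along the shared boundary {a, b, c}.
    boundary = {"a", "b", "c"}
    f_edges = {("x", "a"), ("b", "c")}    # f: from left boundary {x} to right boundary {a, b, c}
    g_edges = {("a", "b"), ("c", "z")}    # g: from left boundary {a, b, c} to right boundary {z}

    def reachable(edges, src, dst):
        # plain depth-first search
        seen, stack = set(), [src]
        while stack:
            v = stack.pop()
            if v == dst:
                return True
            if v in seen:
                continue
            seen.add(v)
            stack.extend(w for (u, w) in edges if u == v)
        return False

    # In the composite f;g (the union of the edge sets), z IS reachable from x:
    print(reachable(f_edges | g_edges, "x", "z"))     # True, via x -> a -> b -> c -> z

    # But composing the boundary reachability relations F(f) and F(g) misses it:
    print(any(reachable(f_edges, "x", y) and reachable(g_edges, y, "z")
              for y in boundary))                     # False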

view this post on Zulip John Baez (Jun 20 2020 at 21:48):

Rongmin Lu said:

Jules Hedges said:

Right.... I tried to lead other people down the route of thinking about obstructions to compositionality like this (I gave up on doing it myself, since I can't get my head around cohomology at all)

Right... don't feel so bad. This is why mainstream economics is terrible as well: they can't get their head around cohomology.

I feel I understand cohomology very well, but I'm not sure how it's supposed to help with anything like economics. I can vaguely imagine it might, but I sure don't see how in any sort of detail.

view this post on Zulip Morgan Rogers (he/him) (Jun 20 2020 at 21:50):

There is a way of turning this around so that we can still understand the new shorter (i.e. finite) path as arising from a loop which might be identified by cohomology, or some higher categorical gadgets, if Rongmin's link to John's comments is pursued. First observe that we can transform this into a finitary problem, by adding a long path to each open graph from the single source vertex to the disconnected target vertices (and the dual on the other side), long enough that the squiggle obtained by gluing the vertices gives a shorter path. The disconnected version is a limit as the length of these artificial paths tends to infinity. But irrespective of the lengths of the added paths, by drawing them in you can see the loops which are formed by the gluing, and it's by "taking the other route around them" that the new optimal route is found.
I don't know if that's clear or helpful, but my personal conclusion is that when an optimization problem comes from a cobordism category, there is some hope of cohomology guiding the solution.
How or whether this is relevant to the addition of functions example, I don't know!

view this post on Zulip (=_=) (Jun 20 2020 at 22:58):

John Baez said:

Rongmin Lu said:

Jules Hedges said:

Right.... I tried to lead other people down the route of thinking about obstructions to compositionality like this (I gave up on doing it myself, since I can't get my head around cohomology at all)

Right... don't feel so bad. This is why mainstream economics is terrible as well: they can't get their head around cohomology.

I feel I understand cohomology very well, but I'm not sure how it's supposed to help with anything like economics. I can vaguely imagine it might, but I sure don't see how in any sort of detail.

Sorry, this was probably a bit too sleek. When I said "cohomology", I'm referring to nonlinearity, i.e. "obstructions" to linearity.

view this post on Zulip (=_=) (Jun 20 2020 at 23:17):

As I've explained here, the mainstream school of economics, neoclassical economics or "marginalism" (because they're always talking about "marginal" stuff, so that they can use first-year calculus), enforces linearisation so that their theory can work.
This is despite, and due to, the fact that they have proven the SMD theorem, which says that market demand functions are almost always nonlinear even if you assume that the market participants are all rational utility-maximising agents, which guarantees that each of the individual demand functions is linear.

view this post on Zulip (=_=) (Jun 20 2020 at 23:21):

Mainstream economics also assumes away the existence of time, because their methods are static by nature as they don't consider dynamical systems at all. That is something that Keynes had already observed:

But this long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task, if in tempestuous seasons they can only tell us, that when the storm is long past, the ocean is flat again.

It is on this basis that Keynes talks about "uncertainty" in his theory, rather than the more easily quantifiable "risk" that people talk about today.

view this post on Zulip (=_=) (Jun 20 2020 at 23:26):

Ole Peters has this nice talk where he expounds on the importance of considering time in economics, because many economic processes are non-ergodic in nature. He does something called "ergodicity economics", which seems to me to be putting Keynes' theory of uncertainty on a more rigorous basis. And in that talk, he claims he can publish his work in the most prestigious journals -- with the notable exception of mainstream economics journals, because they have been explicitly rejecting, for decades now, papers that violate the "linearisation" I talked about above.
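
A minimal simulation of the kind of non-ergodicity involved, using the standard multiplicative coin-toss illustration often used in ergodicity economics (heads multiplies wealth by 1.5, tails by 0.6): the ensemble average grows each round, but a typical single trajectory shrinks.

    import numpy as np

    rng = np.random.default_rng(0)
    T, N = 50, 100_000    # time steps per trajectory, number of parallel trajectories

    # Each step multiplies wealth by 1.5 (heads) or 0.6 (tails) with equal probability.
    factors = rng.choice([1.5, 0.6], size=(N, T))
    wealth = factors.prod(axis=1)

    print(wealth.mean())                    # ensemble average: about 1.05**50 ~ 11.5 (noisy)
    print(np.median(wealth))                # typical trajectory: about 0.07 -- most players lose
    print(np.exp(np.log(factors).mean()))   # per-step time-average growth: sqrt(1.5*0.6) ~ 0.95 < 1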

view this post on Zulip (=_=) (Jun 20 2020 at 23:33):

I'm not sure I understand the intuition behind this tweet of Jules Hedges. You can have functors that are not easy to compute. The higher homotopy group functors π_n are an example: see Chapter 4 of Hatcher's textbook, Algebraic Topology, for a proof of their functoriality. However, computing the higher homotopy groups of S^n is still a very difficult problem.

Underrated idea: complexity classes as barriers to compositionality. I'm no expert on complexity theory, but I quite often do informal reasoning like "if this functor existed then that problem would be easy to solve, but it's hard to solve, therefore the functor can't exist"

- julesh (@_julesh_)

view this post on Zulip (=_=) (Jun 20 2020 at 23:35):

Jules has elaborated a bit on his intuition here, but it's still a bit baffling to me without seeing all the details.

@LucaAmb The one I thought about the most is Nash equilibria of open games. If they were described by a functor in the most obvious way then you would be able to compute Nash equilibria in linear time in the complexity of the game. In reality it's an exponential time problem

- julesh (@_julesh_)

view this post on Zulip Fabrizio Genovese (Jun 20 2020 at 23:52):

Rongmin Lu said:

Re. this tweet by Fabrizio: well... it kinda needs more work than that.

I think that for your own mental sanity it's worth spelling this out: At least for me, Twitter is NOT a faithful representation of what I'm thinking, researching on, or even caring about. 50% of my tweets are trolling, 45% are just my brain farting and 5% is something that may MAYBE become an actual insight. I am pretty sure this applies to the majority of people out there tweeting.
As for my own mental sanity instead, I'm quite depressed, at the moment I feel like my work and life are worthless, and I am sure there are many other people like me out there. For MANY of us just losing a bit of time on Twitter is one of the few ways we have to relieve a bit of fucking stress in a world that is far too suffocating. If I have to think twice about what I am tweeting because everything may get scrutinized and quoted on this platform like a factual statement then you are literally doing me more harm than good. So please take a tweet for what it is (a fucking tweet) and not necessarily as a serious research proposition. Now I'll go burning out elsewhere, cheers.

view this post on Zulip John Baez (Jun 21 2020 at 00:22):

Come on, Rongmin. Fabrizio said he's in a really bad mood, feeling his work is worthless. Be nice.

view this post on Zulip (=_=) (Jun 21 2020 at 00:22):

Fabrizio Genovese said:

As for my own mental sanity instead, I'm quite depressed, at the moment I feel like my work and life are worthless, and I am sure there are many other people like me out there.

I'm sorry to hear that and I'm sure you have your reasons for thinking that. It may help if you can examine those reasons critically, because from my perspective, it's not clear to me why that's the case. People seem to love what Statebox is doing, at least on here, and I'm not aware of any dissenting opinions. If it's a business issue, there are answers out there, since there are many other people who are (or have been) startup founders. You just have to ask.

view this post on Zulip (=_=) (Jun 21 2020 at 00:22):

John Baez said:

Come on, Rongmin. Fabrizio said he's in a really bad mood, feeling his work is worthless. Be nice.

I'm trying, but I'm responding to a comment that started with "I think that for your own sanity it's worth spelling out".

view this post on Zulip (=_=) (Jun 21 2020 at 00:30):

Fabrizio Genovese said:

So please take a tweet for what it is (a fucking tweet) and not necessarily as a serious research proposition.

But serious research propositions can come out of thinking about "fucking tweets", or indeed any non-serious thing. I mean, the whole idea of "a group is just a category with one object" started out as a joke that was taken seriously with serious consequences for CT.

Note that what I said about your tweet was it "needs more work": I did NOT say it was a bad idea. I think it's moving in the right direction. What it needs is more refinement, but that's what research propositions are: they need more work, otherwise they'd just be factual statements.

view this post on Zulip Jules Hedges (Jun 21 2020 at 10:31):

The way I see it, "complexity as cohomology" is at least plausibly a very good idea, but it might take 100 years to work out the details

view this post on Zulip Jules Hedges (Jun 21 2020 at 10:32):

In the sense of being a general theory of "no, you can't do that"

view this post on Zulip Jules Hedges (Jun 21 2020 at 10:32):

Complexity theory is, in a sense, exactly the study of obstructions to the existence of algorithms

view this post on Zulip (=_=) (Jun 22 2020 at 14:34):

Jules Hedges said:

The way I see it, "complexity as cohomology" is at least plausibly a very good idea, but it might take 100 years to work out the details

My hunch is it probably needs more homotopy type theory. Also, there may not be enough information on the homological algebra level: that's why I mentioned John's description of homological algebra as a kind of "linearisation" of higher objects.

view this post on Zulip (=_=) (Jun 22 2020 at 14:38):

Jules Hedges said:

Complexity theory is, in a sense, exactly the study of obstructions to the existence of algorithms

Exactly. There used to be something called obstruction theory, which yielded cohomological invariants. So if by "cohomology", you mean a theory of obstructions, then "complexity as cohomology" makes sense.

I think it may be worthwhile to study the old obstruction theory papers for ideas as to how to make things more concrete. What they were doing back then would probably have seemed really ambitious to them: trying to whip something so hopelessly vague into a formal mathematical structure. I mean, category theory was developed to formalise the heuristic and seemingly hand-wavy notion of what it meant for some transformation to be "natural"!