You're reading the public-facing archive of the Category Theory Zulip server.
To join the server you need an invite. Anybody can get an invite by contacting Matteo Capucci at name dot surname at gmail dot com.
For all things related to this archive refer to the same person.
Hello all! This is the thread of discussion for the talk of Micah Halter, Evan Patterson, Andrew Baas and James Fairbanks, "Compositional Scientific Computing with Catlab and SemanticModels".
Date and time: Wednesday July 8, 16:00 UTC.
Zoom meeting: https://mit.zoom.us/j/7055345747
YouTube live stream: https://www.youtube.com/watch?v=jCOi0kCYCeU&list=PLCOXjXDLt3pZDHGYOIqtg1m1lLOURjl1Q
We start again in 4 minutes!
I think it's worth talking more about the similarity between Catlab and DisCoPy. People have Opinions about which language features are better than others and can argue in circles forever. Maybe it's a good idea to have multiple implementations of the same thing for that reason. But I worry about large scale duplicated effort between them
Is anyone implementing category theory in a statically typed language? I'm surprised that so many category theorists are using languages with dynamic types, which seem to me to be the opposite of the way category theory teaches us to think.
Yes, Statebox have a big library in Idris called idris-ct
Jules beat me to it: https://github.com/statebox/idris-ct
Yes, open games are currently prototyped in Haskell, but I'm considering porting to Julia for various reasons
can you do coends in Julia?
No, I don't think the question even really makes sense, but you can for sure fake it enough to work
In haskell, the typesystem statically prevents you from accessing the bound type in an invalid way. In any other language you can make it private and just trust the programmer not to do anything nasty
I'm with Mike. I wish Julia had dependent types! Otherwise, it's such a nice language. A proper type theory would make it perfect.
Hi Jules and Toby, let's talk more about open games in Catlab, I think we need to do more with optics in general.
James Fairbanks said:
Hi Jules and Toby, let's talk more about open games in Catlab, I think we need to do more with optics in general.
yes I'd love to think about this!
This twitter thread has a bunch of discussion about the mechanics of open games in dynamic languages: https://twitter.com/_julesh_/status/1247115077079228422
So here's a recurring question. My open games tool (https://github.com/jules-hedges/open-games-hs) is written for Haskell, but obviously I don't expect my target users to pick up Haskell. Ideally it would target something like Python. I have no idea how easy or hard that would be, but I'd guess hard
- julesh (@_julesh_)
Mike Shulman said:
Is anyone implementing category theory in a statically typed language? I'm surprised that so many category theorists are using languages with dynamic types, which seem to me to be the opposite of the way category theory teaches us to think.
As others have mentioned, doing CT in a statically typed language is the standard approach, and there is a long tradition of that in the Haskell community, for example. In principle, I'm not opposed at all to that approach, but where I come from (statistics and traditional applied math generally), the need to support scientific and numerical computation in a practical way is paramount. Right now, Julia seems like the best compromise for a scientific computing language with reasonable programming language features.
@Evan Patterson, what do you think are Python's main disadvantages compared to Julia?
For us the most important thing is reliability instead. This is why we choose to implement our libraries using formally verified languages. Idris is a good compromise here because it has dependent types, but it's really aimed at industry-strength production
Right, if formal verification is your prime directive, then Julia is not going to help you much. So I think a lot depends on what you're trying to accomplish. There is definitely room for multiple approaches.
It's not my prime directive, it's our customer's :grinning:
Right, but Python and Julia seem very similar to each other. They are both dynamic and both used for heavy numerics
Right, there are two separate threads going on here, one about Julia vs static/dependently typed languages and one about Julia vs Python. Let me try to give my perspective on the second.
Evan Patterson said:
As others have mentioned, doing CT in a statically typed language is the standard approach, and there is a long tradition of that in the Haskell community, for example. In principle, I'm not opposed at all to that approach, but where I come from (statistics and traditional applied math generally), the need to support scientific and numerical computation in a practical way is paramount. Right now, Julia seems like the best compromise for a scientific computing language with reasonable programming language features.
Is there some reason that "supporting scientific and numerical computation in a practical way" leads to the use of dynamic types? A priori they don't seem related to me; in fact it seems on a quick reading that part of what makes Julia performant is the ability of its compiler to optimize away the dynamic types!
As Jules says, both Python and Julia are commonly used for numerical computing. Python has going for it that it is easy to use and is popular in the (huge) ML community, so it has lots of mindshare. But Python suffers badly from what is sometimes called the "two language problem": it is too slow to directly support intensive computing, and so numerical code and data structures are written in languages like C, C++, and Fortran and wrapped for use in vectorized computing in Python. Julia was created to solve this problem, to have an easy-to-use language that is fast enough that your inner loops don't have to be written in C. Moreover, in ACT, a lot of the computing is also over discrete structures like graphs, for which farming out to other languages for speed becomes even more tedious.
Julia also has a number of language features, not possessed by Python, that are surprisingly well matched to ACT. One is its system for multiple dispatch, so you can have a single generic function compose that works across any category. Good support for Unicode operators allows nice syntactic sugar for these functions, which matches the standard mathematical notation. Julia also has support for Lisp-style metaprogramming, which we use extensively. For example, the system for GATs (generalized algebraic theories) in Catlab effectively implements a simple form of dependent typing in Julia using metaprogramming.
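To illustrate the multiple-dispatch point, here is a minimal sketch with hypothetical types (`FinFunction`, `FinRelation`) — not Catlab's actual API — showing one generic compose with per-category implementations, plus a Unicode infix alias:

```julia
# Hypothetical types, for illustration only (not Catlab's actual API).
struct FinFunction
    map::Vector{Int}        # f(i) = map[i]
end

struct FinRelation
    pairs::Set{Tuple{Int,Int}}
end

# Composition of functions, in diagrammatic order: (f ⋅ g)(i) = g(f(i))
compose(f::FinFunction, g::FinFunction) = FinFunction([g.map[i] for i in f.map])

# Relational composition: (a, c) whenever (a, b) in R and (b, c) in S
compose(R::FinRelation, S::FinRelation) = FinRelation(Set(
    (a, c) for (a, b) in R.pairs for (b2, c) in S.pairs if b == b2))

# Unicode sugar matching the mathematical notation
⋅(f, g) = compose(f, g)

h = FinFunction([2, 3, 1]) ⋅ FinFunction([1, 1, 2])
h.map   # [1, 2, 1]
```

The same call site dispatches to whichever implementation matches the argument types, which is what lets generic categorical code read like the math.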
Mike Shulman said:
Is there some reason that "supporting scientific and numerical computation in a practical way" leads to the use of dynamic types? A priori they don't seem related to me; in fact it seems on a quick reading that part of what makes Julia performant is the ability of its compiler to optimize away the dynamic types!
I totally agree. There is no reason that this should necessarily lead to dynamic types and is in some ways opposed to it. Historically scientists and engineers have resisted types, but probably that's just because the classic languages with static types, like C, are really unpleasant.
Scientific computing currently seems to be converging not to dynamic typing but towards "gradual typing," where types are optional but recommended, in part because people have gradually realized that types are actually good for avoiding programming errors. Types in Julia are like this. In fact, if you look at the Catlab codebase, you'll see that basically every function is annotated with types, because we like to write in that style.
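For illustration (a hypothetical snippet, not taken from Catlab itself), the same function can be written with or without annotations:

```julia
# Unannotated: accepts anything supporting sum and length
average(xs) = sum(xs) / length(xs)

# Annotated, in the style used throughout the Catlab codebase:
function average_typed(xs::Vector{Float64})::Float64
    sum(xs) / length(xs)
end

average([1, 2, 3])              # 2.0
average_typed([1.0, 2.0, 3.0])  # 2.0
# average_typed([1, 2, 3]) raises a MethodError: the annotation is
# enforced, but only when the call is made, not at definition time.
```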
My understanding of "gradual typing" is that when the optional types are given, they should be static, i.e. checked at compile time. But my brief reading and experiments with Julia suggest that even when type annotations are given they are only checked at runtime. Is that correct?
I would second all of Evan's points. We find that multiple dispatch really lets you write code that looks like the math, and we heavily exploit the Lisp-style macros to create DSLs embedded in Julia. Dynamic types are really popular in scientific computing because the users are scientists first and programmers second. You have to really think about types a lot in statically typed languages, whereas in dynamic languages you can just write code without thinking about types too much. Julia gives you the best of both worlds because of gradual typing.
julia has a complicated relationship with "compile time" and "run time"
because it uses a just in time compiler, there is more than one compile time during the execution of the program. so yes the gradual types are static at compile time. There is just more than one compile time per execution
this is really important for us, because things like CSet{C} can be generated by runtime values of C.
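In plain Julia terms, this works because type parameters can be computed from runtime values. A hedged sketch with a hypothetical `Labeled` type (not Catlab's actual CSet machinery):

```julia
# Hypothetical example of the general mechanism behind things like
# CSet{C}: a runtime value becomes a type parameter.
struct Labeled{name}
    value::Float64
end

# An ordinary runtime Symbol is turned into a concrete type:
make_type(name::Symbol) = Labeled{name}

T = make_type(:temperature)   # the type Labeled{:temperature}
t = T(98.6)                   # an instance of that generated type
```

Because types are first-class values, code compiled against `Labeled{:temperature}` gets specialized just like code against any statically known type, even though the parameter originated at runtime.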
Yes, everything I've seen about Julia seems pretty great except for the dynamicness of its typing. (Well, and maybe the 1-based array indexing.)
I thought mathematicians liked 1-based indexing? :slight_smile:
Maybe you're thinking of those people who call themselves mathematicians but don't admit 0 as a natural number?
I think I see what you mean about the JIT. But in practice it seems to me that this is less useful than true static typing. Consider the following example:
function foo(x::Int)
x
end
function bar(y::String)
foo(y)
end
It seems that this doesn't produce an error until bar is called. Maybe that's technically "compile time" because I guess the code isn't compiled until it's used, but it seems to me that it means that instead of being able to write my program, compile it, and be sure as soon as it compiles that I didn't make any type errors, I need to carefully write tests that cover all possible execution paths in order to be sure that "all the code is compiled" and therefore typechecked. That doesn't feel like static typing to me.
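To spell out the behavior being described, here is a self-contained version of the same foo/bar example, with the error captured at the call site:

```julia
function foo(x::Int)
    x
end

function bar(y::String)
    foo(y)   # ill-typed, but accepted at definition time
end

# Both definitions load without complaint; the MethodError surfaces
# only when bar is actually invoked:
err = try
    bar("hello")
catch e
    e
end
err isa MethodError   # true
```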
Ok let's take the flame war about natural numbers elsewhere, it is definitely off-topic here. (-:O
Sorry.
yeah, it definitely is a compromise relative to statically typed languages for your purposes. But in scientific computing, if you want to run at the largest scales, aka "the petaflop club", your options are C/C++, Fortran, and Julia: https://juliacomputing.com/case-studies/celeste.html
is there currently a Julia library comparable to pytorch/tensorflow/etc in capability?
Given the huge amount of money Google has spent on TF, there is not an equally powerful library in Julia. You can use TF from Julia with TensorFlow.jl, which links against the standard TF tools, and you can use a pure Julia implementation of DNNs with Flux.jl, which is better for doing DNN research because it is pure Julia with no dependence on black-box C++ libs.
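For concreteness, a minimal Flux.jl sketch (hedged: this reflects the API as of mid-2020, and Flux's interface has changed across versions; it assumes Flux.jl is installed):

```julia
using Flux  # assumes the Flux.jl package is installed

# A small multilayer perceptron, pure Julia end to end
model = Chain(Dense(4, 8, relu), Dense(8, 2), softmax)

x = rand(Float32, 4)
y = model(x)      # 2-element vector of probabilities summing to 1
```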
Maybe at ACT 2021 we will have a Catlab implementation of Para(Euc) or @Toby Smithe's optics of learning framework
Given that Julia does know how to compile and to check types when it compiles, couldn't there be some flag one could pass to it instructing it to compile (and hence typecheck) everything rather than waiting for JIT? (I suppose probably this wouldn't catch everything since with parametrized types etc. there might always be some code that has to be compiled JIT, but it would be better than nothing.)
People are working on that for embedded platforms.
James Fairbanks said:
Maybe at ACT 2021 we will have a Catlab implementation of Para(Euc) or Toby Smithe's optics of learning framework
It would be great if we can make that happen!
You want to hit the AACT jitsi and chat?
https://meet.jit.si/ACT2020-AppliedApplied
Yes, but right now I'm due to have supper..! (It's evening in Oxford now) Let's arrange something :-)
James Fairbanks said:
People are working on that for embedded platforms.
Yes, it's a practical problem for certain kinds of deployments. Work in progress on this is happening at PackageCompiler.jl.
Building off what @James Fairbanks presented in our talk this afternoon, I am going to drop some links here to working examples of using Catlab and other AlgebraicJulia packages to rapidly develop different types of models, specifically using open Petri nets and decorated cospans. AlgebraicPetri.jl is the package that connects Catlab, as a categorical interface, to the Julia Petri net simulation library Petri.jl. A few basic examples can be found in the AlgebraicPetri.jl documentation:
And lastly, here are a couple links to the repositories for Catlab.jl and AlgebraicPetri.jl
Here's the video!
https://www.youtube.com/watch?v=Q5BzzkNDpPE&list=PLCOXjXDLt3pYot9VNdLlZqGajHyZUywdI