cfbolz changed the topic of #pypy to: #pypy PyPy, the flexible snake https://pypy.org | IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end and https://libera.irclog.whitequark.org/pypy | the pypy angle is to shrug and copy the implementation of CPython as closely as possible, and to stay out of design decisions
mjacob has quit [Ping timeout: 252 seconds]
mjacob has joined #pypy
mjacob has quit [Ping timeout: 252 seconds]
mjacob has joined #pypy
mjacob has quit [Ping timeout: 252 seconds]
mjacob has joined #pypy
jcea has quit [Ping timeout: 248 seconds]
itamarst has quit [Quit: Connection closed for inactivity]
itamarst has joined #pypy
jcea has joined #pypy
<korvo> https://github.com/statusfailed/catgrad This is quite cool: gradient descent defined in terms of a type signature using hypergraphs and semirings, and homomorphically compiled to multiple backends using the standard compiling-to-categories lore.
<korvo> I might add an RPython backend once they mature a bit more and ossify some of the interfaces; it would emit code similar to the Python 3 backend, but using some sort of RPython matrix-multiplication library.
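(For context: RPython has no numpy, so the "RPython matrix-multiplication library" mentioned above does not exist yet. The following is a hypothetical sketch, not part of the conversation, of the kind of plain, monomorphic matrix helper such an emitted backend might target; all names are illustrative.)

    class Matrix(object):
        """Dense row-major matrix of floats, written in the restricted
        style RPython accepts: homogeneous lists and explicit loops."""

        def __init__(self, rows, cols, data):
            assert len(data) == rows * cols
            self.rows = rows
            self.cols = cols
            self.data = data  # flat list of floats, row-major

        def matmul(self, other):
            # naive triple loop; enough for small compile-time-known shapes
            assert self.cols == other.rows
            out = [0.0] * (self.rows * other.cols)
            for i in range(self.rows):
                for j in range(other.cols):
                    acc = 0.0
                    for k in range(self.cols):
                        acc += self.data[i * self.cols + k] * other.data[k * other.cols + j]
                    out[i * other.cols + j] = acc
            return Matrix(self.rows, other.cols, out)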
[Arfrever] has quit [Ping timeout: 248 seconds]
<korvo> I don't want this for techbro BS, but for the humbler task of learning the parameters to some temperature sensors. I know what the sensors are like at compile time, so I should be able to compile a simple gradient-descent system based on the sensor layout.
<korvo> And then I won't have to do the cross algorithm for gradient-free optimizing, which is not especially efficient in high dimensions.
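(A minimal plain-Python sketch of the kind of gradient-descent fit described above, assuming a hypothetical linear calibration model temp = a * reading + b and made-up example data; the actual sensor model and readings are not given in the log.)

    def fit_sensor(readings, temps, steps=5000, lr=0.05):
        # fit a, b by gradient descent on mean squared error
        a, b = 1.0, 0.0
        n = len(readings)
        for _ in range(steps):
            grad_a = 0.0
            grad_b = 0.0
            for r, t in zip(readings, temps):
                err = (a * r + b) - t
                grad_a += 2.0 * err * r / n
                grad_b += 2.0 * err / n
            a -= lr * grad_a
            b -= lr * grad_b
        return a, b

    # example: noisy readings from a sensor whose true calibration is roughly a=0.5, b=10
    readings = [0.0, 1.0, 2.0, 3.0, 4.0]
    temps = [10.1, 10.4, 11.1, 11.4, 12.1]
    print(fit_sensor(readings, temps))  # prints approximately (0.5, 10.0)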
<korvo> ...Okay, I *am* thinking about building a Gödel machine, but it would just be for automatically proving Metamath theorems and maybe automatically improving speedruns.
[Arfrever] has joined #pypy
[Arfrever] has quit [Ping timeout: 268 seconds]
[Arfrever] has joined #pypy
[Arfrever] has quit [Ping timeout: 246 seconds]
[Arfrever] has joined #pypy