alice has quit [Remote host closed the connection]
alice has joined #pypy
pbsds has quit [Quit: Ping timeout (120 seconds)]
pbsds has joined #pypy
<fijal>
phlebas: I personally think it's better that it's *not* compatible with python
<fijal>
cfbolz: FYI I have been thinking quite a lot about something like mojo. I would not say I ever thought about it in any way that would make it a reality, but the topic of "python is really awkward for AI development" has popped into my mind quite a bit
<cfbolz>
Yes, but here is one thing that bothers me
<cfbolz>
They write
<cfbolz>
That in the long term they hope cpython could be written in mojo
<fijal>
oh I'm sure there are many things that bother me :-)
<cfbolz>
That's kind of cool
<cfbolz>
But why do they use C++ themselves
<fijal>
like, not really having an open source roadmap is very much a problem
<fijal>
without looking one bit, I would imagine for the sake of bootstrapping?
<cfbolz>
I don't think they plan to change that, and if they did they would write about it
<fijal>
yeah I have no idea
<fijal>
it's really hard to have the slightest clue based on docs only, without the ability to play with or see the source code
<ctismer>
cfbolz: arigato: NoGil Python with PySide runs mandelbrot_nogil.py 7.7 times faster on a 10-core M1 machine.
<ctismer>
That would be a nice option for PyPy, which already runs the computation 10 times faster :)
<mattip>
nice but there are some hard problems to solve around reusing JITted code in multiple threads
<mattip>
without even talking about reproducing nogil in rpython/pypy
<ctismer>
mattip: sure, I once asked about that already and know it is not trivial. I just wanted to show the huge effect, as a motivation
<ctismer>
But multiprocessing can be used as well anyway and does not need any of that, right?
<mattip>
the overhead is higher
<ctismer>
yes, but with Mandelbrot I split the problem into stripes which can be computed in parallel, and I only need to know when each stripe is ready and collect the tiles. The whole computation takes a long time, so multiprocessing is probably fine.
<ctismer>
actually easier, because NoGil needed extra care to make sure that you don't share certain objects, which makes it slow again.
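The stripe scheme described above can be sketched as follows. This is a minimal, illustrative example, not the actual benchmark script: the function names, the coordinate mapping, and the stripe count are all assumptions.

```python
# Sketch of the stripe decomposition: the image is cut into horizontal
# stripes that worker processes compute independently, and the parent
# only collects the finished tiles. All names here are illustrative.
from multiprocessing import Pool

def mandel_point(cr, ci, max_iter=100):
    """Iteration count until z escapes |z| > 2 (max_iter if it never does)."""
    zr = zi = 0.0
    for n in range(max_iter):
        if zr * zr + zi * zi > 4.0:
            return n
        zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
    return max_iter

def mandel_stripe(args):
    """Compute one horizontal stripe covering rows [y0, y1)."""
    y0, y1, width, height, max_iter = args
    return [
        [mandel_point(3.5 * x / width - 2.5, 2.0 * y / height - 1.0, max_iter)
         for x in range(width)]
        for y in range(y0, y1)
    ]

def render(width, height, stripes=4, max_iter=100):
    """Split the image into stripes, compute them in parallel, reassemble."""
    step = (height + stripes - 1) // stripes
    jobs = [(y, min(y + step, height), width, height, max_iter)
            for y in range(0, height, step)]
    with Pool(processes=stripes) as pool:
        return [row for tile in pool.map(mandel_stripe, jobs) for row in tile]

if __name__ == "__main__":
    image = render(64, 48)
    print(len(image), len(image[0]))
```

Because each stripe is pure computation on its own rows, no objects are shared between workers, which is exactly why multiprocessing works here without the extra care nogil sharing requires.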
<ctismer>
mattip: would the Python 3.12 feature (one GIL per interpreter) be easier to add to PyPy?
<ctismer>
and would that provide a similar speed-up?
lehmrob has joined #pypy
jcea has joined #pypy
lehmrob has quit [Ping timeout: 260 seconds]
<mattip>
ctismer: that will be at least 2 years off, and I estimate will be difficult to add to PyPy
<mattip>
both because of coding challenges and because we really don't have a lot of developer bandwidth
<ctismer>
mattip: Yes, you should get another funding round to achieve all that and get more developers. It is well worth having PyPy everywhere, because it really saves a lot of energy. You should get some big potent sponsors (hi Elon)
jevinskie[m] has quit [Quit: Bridge terminating on SIGTERM]
jryans has quit [Quit: Bridge terminating on SIGTERM]
jean-paul[m] has quit [Quit: Bridge terminating on SIGTERM]
marmoute has quit [Quit: Bridge terminating on SIGTERM]
ronny has quit [Quit: Bridge terminating on SIGTERM]
audgirka[m] has quit [Quit: Bridge terminating on SIGTERM]
ronny has joined #pypy
<cfbolz>
that was research funding though
jean-paul[m] has joined #pypy
marmoute has joined #pypy
audgirka[m] has joined #pypy
jevinskie[m] has joined #pypy
jryans has joined #pypy
<ronny>
Holger may also be aware of some mechanisms
<larstiq>
dealing with all of that funding stuff is almost an entire job by itself
<ctismer>
larstiq: Sure it is. You need an extra person for that. But at this advanced project stage, it might IMHO be much easier to get funding than it was in 2005. Yes, asking Holger makes sense, too
otisolsen70 has joined #pypy
[Arfrever] has quit [Ping timeout: 240 seconds]
[Arfrever] has joined #pypy
greedom has joined #pypy
greedom has quit [Read error: No route to host]
marvin_ has quit [Remote host closed the connection]
marvin_ has joined #pypy
marvin_ has quit [Remote host closed the connection]
marvin_ has joined #pypy
[Arfrever] has quit [Ping timeout: 240 seconds]
marvin_ has quit [Read error: No route to host]
marvin has joined #pypy
[Arfrever] has joined #pypy
otisolsen70 has quit [Read error: Connection reset by peer]
nanonyme has joined #pypy
<nanonyme>
arigato, hey, have you ever considered yanking cffi version '2' (sic)? By version sorting, it is currently the latest version of cffi.
<nanonyme>
Or is it yanked... this is actually pretty weird :)
Dejan has quit [Quit: Leaving]
<nanonyme>
Ah, wild, this is just a version parse error with packaging
<nanonyme>
Looks like it considers cffi-1.0.2-2.tar.gz version == 2
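The ambiguity nanonyme hit can be reproduced with a few lines of plain Python. This is an illustrative sketch of how sdist filenames get split, not pip's or packaging's actual code: the dash is both the name/version separator and part of the legacy version string, and splitting at the last dash takes the trailing "2" as the version.

```python
# Minimal illustration of why "cffi-1.0.2-2.tar.gz" parses oddly:
# naive filename parsing splits the stem at the *last* dash, so the
# legacy build-number suffix "-2" is taken as the whole version.

def split_sdist_filename(filename):
    """Split an sdist filename into (name, version) the naive way."""
    for ext in (".tar.gz", ".zip"):
        if filename.endswith(ext):
            stem = filename[: -len(ext)]
            break
    else:
        raise ValueError(f"not an sdist filename: {filename}")
    name, _, version = stem.rpartition("-")
    return name, version

print(split_sdist_filename("cffi-1.0.2-2.tar.gz"))  # ('cffi-1.0.2', '2')
print(split_sdist_filename("cffi-1.0.2.tar.gz"))    # ('cffi', '1.0.2')
```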
<nanonyme>
Sorry for disturbing :D
<nanonyme>
cffi project might get hit by this once pip updates packaging though so maybe worth yanking regardless
<mattip>
nanonyme: if there is not already an open issue at github.com/pypa/packaging, then maybe open one?
<nanonyme>
mattip, I think the problem is that cffi package violates version spec
<nanonyme>
That said, maybe pip will internally handle this somehow... the packaging project dropped support for non-standards-compliant versions
<mattip>
ahh, right, the dash in 1.0.2-2 is broken
<mattip>
I wonder how many other packages on pypi have broken identifiers
<nanonyme>
I'm afraid so. Yanking sort of helps: afaik it keeps the data on pypi so pinned requirements keep working, but the yanked version can then be easily ignored while resolving versions
<mattip>
no, maybe not. dash is permitted for non-normalized version syntaxes
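mattip's point can be checked directly: PEP 440 treats the dash as a non-canonical spelling of an implicit post-release, so "1.0.2-2" is legal input that normalizes to "1.0.2.post2". This sketch uses the third-party `packaging` library.

```python
# PEP 440 implicit post-release: "X.Y-N" is accepted as non-canonical
# input and normalized to "X.Y.postN". The filename trouble comes from
# the same dash doubling as the name/version separator in sdist names.
from packaging.version import Version

v = Version("1.0.2-2")
print(v)                            # 1.0.2.post2
print(v == Version("1.0.2.post2"))  # True
```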