cfbolz changed the topic of #pypy to: #pypy PyPy, the flexible snake https://pypy.org | IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end and https://libera.irclog.whitequark.org/pypy | hacking on TLS is fun, way more fun than arguing over petty shit, turns out
mattip has quit [Ping timeout: 256 seconds]
mattip has joined #pypy
slisnake has quit [Ping timeout: 256 seconds]
slisnake has joined #pypy
Atque has joined #pypy
Corbin has joined #pypy
_whitelogger has joined #pypy
otisolsen70 has joined #pypy
otisolsen70 has quit [Remote host closed the connection]
otisolsen70 has joined #pypy
dmalcolm has quit [Remote host closed the connection]
otisolsen70 has quit [*.net *.split]
Corbin has quit [*.net *.split]
marvin has quit [*.net *.split]
danchr has quit [*.net *.split]
Atque has quit [*.net *.split]
arigato has quit [*.net *.split]
catern has quit [*.net *.split]
luckydonald has quit [*.net *.split]
Hodgestar has quit [*.net *.split]
mgorny has quit [*.net *.split]
ronan has quit [*.net *.split]
yizawa has quit [*.net *.split]
idnar has quit [*.net *.split]
commandoline has quit [*.net *.split]
mattip has quit [*.net *.split]
nanonyme has quit [*.net *.split]
ammar2 has quit [*.net *.split]
habnabit_ has quit [*.net *.split]
tazle has quit [*.net *.split]
indyZ has quit [*.net *.split]
marmoute has quit [*.net *.split]
alicetries has quit [*.net *.split]
the_rat has quit [*.net *.split]
epony has quit [*.net *.split]
jacob22 has quit [*.net *.split]
eamanu has quit [*.net *.split]
shodan45 has quit [*.net *.split]
Lightsword has quit [*.net *.split]
graingert has quit [*.net *.split]
Ninpo has quit [*.net *.split]
agronholm has quit [*.net *.split]
dustinm has quit [*.net *.split]
nimaje has quit [*.net *.split]
mjacob has quit [*.net *.split]
tumbleweed has quit [*.net *.split]
fijal has quit [*.net *.split]
cfbolz has quit [*.net *.split]
ctismer_ has quit [*.net *.split]
phlebas has quit [*.net *.split]
jerith has quit [*.net *.split]
pjenvey has quit [*.net *.split]
luckydonald has joined #pypy
dmalcolm_ has joined #pypy
habnabit_ has joined #pypy
otisolsen70 has joined #pypy
nanonyme has joined #pypy
Atque has joined #pypy
epony has joined #pypy
mattip has joined #pypy
Corbin has joined #pypy
catern has joined #pypy
arigato has joined #pypy
commandoline has joined #pypy
idnar has joined #pypy
yizawa has joined #pypy
ronan has joined #pypy
mgorny has joined #pypy
marmoute has joined #pypy
indyZ has joined #pypy
the_rat has joined #pypy
tazle has joined #pypy
alicetries has joined #pypy
cfbolz has joined #pypy
Hodgestar has joined #pypy
nimaje has joined #pypy
graingert has joined #pypy
Lightsword has joined #pypy
tumbleweed has joined #pypy
shodan45 has joined #pypy
eamanu has joined #pypy
jacob22 has joined #pypy
jerith has joined #pypy
phlebas has joined #pypy
fijal has joined #pypy
ctismer_ has joined #pypy
marvin has joined #pypy
danchr has joined #pypy
pjenvey has joined #pypy
mjacob has joined #pypy
dustinm has joined #pypy
Ninpo has joined #pypy
agronholm has joined #pypy
epony has quit [Max SendQ exceeded]
epony has joined #pypy
ammar2 has joined #pypy
slav0nic has joined #pypy
<fijal> cfbolz: ok, so I have an interesting problem, I think
<fijal> I have stuff like this - https://foss.heptapod.net/pypy/pypy/-/issues/3591
<fijal> and it turns out it's mostly a warmup problem, I think
<fijal> but you need to run it for quite a bit
<fijal> one of the issues is that I can't really tell what % of time is spent warming up as all the "waiting" will show as "normal execution"
<fijal> which also muddies how much I can play with the buffers and such
<cfbolz> fijal: ah, right
<cfbolz> fijal: so you would like to profile using "user time" somehow?
<fijal> profile too, but also the PYPYLOG (that's the only way I get % of GC and JIT)
<fijal> so I can't see if having memoryview objects is actually problematic for the GC or not
<cfbolz> right
<fijal> in general it probably works badly - I would think
<fijal> but I can't prove it
<fijal> (and the experiments I did without network connection made pypy ~2x faster than cpy which is ok)
<cfbolz> fijal: so either you hack pypylog to use user time somehow
<cfbolz> or can you have it all locally somehow?
<fijal> so essentially the mess is as follows - you have a socket, that socket is wrapped in ssl then that is wrapped with io.BufferedSomething
<fijal> and the BufferedSomething requires fixed memory address, which is done using old generation
<fijal> for almost no real reason
<cfbolz> fijal: ok, but I think it would be a good idea to have a local test setup, to get rid of some of the randomness
<fijal> yeah, I've tried it with local nginx
<cfbolz> (didn't we already find out how to do that in november? I have a vague memory)
<fijal> that was something else, now I need a real HTTP server I think
<fijal> (the layers in httplib/client.py add up)
<cfbolz> ok
<cfbolz> but then the measurement error should be better, no?
<cfbolz> or is even local waiting a real source of unclarity
<fijal> normal-execution 81.96%
<fijal> gc-minor 4.71%
<fijal> gc-collect-step 5.18%
<fijal> jit-tracing 3.17%
<fijal> yeah
<fijal> so that's a bit bad I think?
<fijal> there is pretty much no state that would require major gc
<cfbolz> so the gc-collect-step is a bit suspicious
<fijal> (it's doing small requests all the time)
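The percentages fijal pasted come from a PYPYLOG dump. A hedged sketch of how such shares can be computed, assuming hex timestamps and nested `{section` / `section}` markers as in the translation log excerpts later in this log (the function name and the exact accounting are made up here; the real log also has untracked "normal-execution" time):

```python
import re
from collections import defaultdict

OPEN = re.compile(r"\[([0-9a-f]+)\] \{(.+)")
CLOSE = re.compile(r"\[([0-9a-f]+)\] (.+)\}")

def section_percentages(lines):
    """Rough per-section time shares from a PYPYLOG dump.

    Assumption: timestamps are hex counters and sections nest
    strictly; time spent in a child is not re-counted for the parent.
    """
    totals = defaultdict(int)
    stack = []  # [name, start_ts, time_spent_in_children]
    start_ts = end_ts = None
    for line in lines:
        m = OPEN.match(line)
        if m:
            ts = int(m.group(1), 16)
            if start_ts is None:
                start_ts = ts
            stack.append([m.group(2), ts, 0])
            continue
        m = CLOSE.match(line)
        if m and stack:
            ts = int(m.group(1), 16)
            name, start, child_time = stack.pop()
            totals[name] += ts - start - child_time
            if stack:
                stack[-1][2] += ts - start
            end_ts = ts
    total = (end_ts - start_ts) if start_ts is not None else 1
    return {name: 100.0 * t / total for name, t in totals.items()}
```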
<fijal> well, we know where it comes from - it's the buffer that requires copying to old GC memory to get the address
<fijal> do you have any clue how to do it better?
<fijal> pinning would work *in this particular case*, but we still need to have information saying "the address is not valid for very long here"
<cfbolz> fijal: can you do some very temporary hack to pin and see whether that helps?
<fijal> cfbolz: what happens if we pin stuff, fail to unpin it and then it dies?
<fijal> do we have support for that?
<cfbolz> fijal: good question, I think not
<cfbolz> because pin really means "C has a pointer, we can't track it"
<fijal> how is this related?
<cfbolz> so we can't know when it dies
<cfbolz> because even if there is no GC reference, the C code might still hold on
<fijal> ah no, that's definitely unsupported
<fijal> I would imagine
<fijal> but maybe it's supported
<cfbolz> it's likely a bug
<cfbolz> if there is no GC reference we *cannot* unpin
<cfbolz> so that's bad
<fijal> I think it's sane to say that we should hold onto the GC reference for as long as C needs one
<cfbolz> right
<cfbolz> but still, I think in general you really must unpin
<fijal> yes
<fijal> pffff, this is such a mess
<fijal> cfbolz: we can't *quite* pin what we want to pin, because this is a resizable array
<fijal> so we want to pin ll_items, but if we make a new one, we want to move the pin flag too
<cfbolz> ouch, complicated
<cfbolz> fijal: which array is that?
<fijal> it's the resizable char array that you use for buffered io I think?
<fijal> that's a very good question actually, I don't know, there are so many layers
<fijal> the one in SocketIO?
<fijal> cfbolz: whatever socket.makefile creates
<cfbolz> Right
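For reference, the layering under discussion is observable from plain Python: `socket.makefile()` returns an `io.BufferedReader` whose `.raw` is the thin `socket.SocketIO` wrapper (a `RawIOBase` subclass):

```python
import io
import socket

# socket.makefile() builds BufferedReader -> SocketIO -> socket
# (pure-Python socket.py in CPython; the same layering applies on PyPy)
a, b = socket.socketpair()
f = a.makefile("rb")           # buffered reader over the socket
raw = f.raw                    # the SocketIO raw-IO adapter
assert isinstance(f, io.BufferedReader)
assert isinstance(raw, io.RawIOBase)
assert type(raw).__name__ == "SocketIO"
f.close()
a.close()
b.close()
```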
* cfbolz afk for a few min
danchr has quit [Remote host closed the connection]
<cfbolz> back
danchr_ has joined #pypy
<cfbolz> or is the commented out code a remnant of testing?
<mattip> I am not sure :). I think something is still off around utf8, since there is a newly-failing lib-python test
lritter has joined #pypy
<fijal> grumble
<fijal> cfbolz: does not *obviously* help
<fijal> but I don't know if I can run it for long enough, I just removed the copying of the array, but I'm not sure if that's even the right idea
<fijal> (I mean no, I know it's not the right idea)
<fijal> ok, but the concept makes no sense, grumble
<fijal> we have a socket, that requires writing to some memory somewhere
<fijal> so we allocate the memory as a resizable array, in GC memory, then each time someone writes there, copy it to old memory
<cfbolz> mattip: no, that one is fixed on this night's run
<cfbolz> But yes, I know of at least one remaining problem of Utf8 mode
<fijal> hlp
<cfbolz> fijal: we copy it to old memory, but it's *different* memory every time, right?
<fijal> well, we don't copy same memory twice, no
<fijal> but it still makes no sense
<fijal> if we grow the list, we allocate new one (probably in the nursery), copy contents and poof, copy the contents again to old
<cfbolz> fijal: right
<fijal> at the very least the very conservative overallocation strategy is dumb
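A hypothetical plain-Python sketch of the double copy fijal is describing (names invented here; the real code is RPython inside PyPy's io/socket implementation, and the "old memory" copy stands in for copying to non-moving GC memory so C gets a stable address):

```python
class ResizableBuffer:
    """Sketch of a resizable GC array handed to C on every write."""

    def __init__(self):
        self.items = bytearray()      # movable, GC-managed storage

    def append(self, data):
        # copy #1: growing may reallocate and copy the existing
        # contents (the "conservative overallocation" being discussed)
        self.items.extend(data)

    def raw_address_for_c(self):
        # copy #2: handing the buffer to C copies it *again* into
        # non-moving ("old generation") memory, because a moving GC
        # cannot promise the array stays at one address
        return bytes(self.items)
```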
<cfbolz> fijal: that's all a property of the html implementation? Or still ssl sockets are the problem?
<cfbolz> http I mean
<fijal> it's socket.makefile
<fijal> so I'm not 100% sure whose responsibility that is, but probably "whoever wraps sockets in io.*"
<fijal> which seems to be http, but presumably also others
<cfbolz> fijal: ah, makefile, yes, that seems dangerous all in all
<cfbolz> io has various careful shortcuts for *exactly* files
<fijal> why is it done this way?
<fijal> cfbolz: so is the answer "don't use makefile"?
<fijal> I mean, that's an ok answer, I think
<cfbolz> fijal: I don't know, it's hard to say that all libraries need to stop
<cfbolz> fijal: maybe io needs some careful socket adaptations
<fijal> it already uses a thing called SocketIO
<fijal> so we know if it's used for sockets or not
<cfbolz> fijal: ok
<cfbolz> fijal: so that's a place that can maybe be tuned
<fijal> ok
<cfbolz> Would still be interesting whether it's a special problem with ssl sockets, or a general one
<fijal> I think a general one
<cfbolz> fijal: cool, that means we can probably somehow get a smaller reproducer with less annoying setup
<fijal> cfbolz: I'm happy with just hammering local nginx (and I can do it on both http and https)
<cfbolz> fijal: ok
<cfbolz> fijal: did you try perf or something else that gives you an rpython C function profile?
<cfbolz> (that's what I did for other io problems)
<fijal> no, I did not
<fijal> cfbolz: quick perf record shows mostly GC
<fijal> extreeeeemly efficient way to share a bunch of text
<cfbolz> :-)
<cfbolz> fijal: there are also hacks to find rpython functions that allocate the most
<cfbolz> Also, who makes all the objects with finalizers
<fijal> sockets?
<cfbolz> fijal: if it's just sockets, fine
<cfbolz> Not much you can do
<cfbolz> Do sockets unregister themselves when they are closed?
<fijal> from finalizers? no, how would you do that?
<cfbolz> fijal: there is an api for that nowadays
<cfbolz> And sockets uses it correctly so good
<fijal> ok, so someone doesn't, where is the API?
<fijal> I can try to start there
<cfbolz> self.may_unregister_rpython_finalizer(space)
<cfbolz> No no, it's all good, as long as the socket is closed
<cfbolz> fijal: where is socketio defined?
<fijal> in socket.py I think
<cfbolz> Ah, so it's a pure python class, ok
<cfbolz> No way this is fast :-(
<fijal> cfbolz: it's not a pure python class, it uses rawiobase extensively, I think
<fijal> seems like a very thin wrapper around RawIOBase, but indeed no way it's fast
<fijal> cfbolz: ok, so can I move this to RPython and try to be smarter about buffers?
<cfbolz> fijal: I don't quite know, cpython has it in pure python too, right?
<cfbolz> so it would be weird if we need to write more rpython than they have C code
<fijal> meh
<fijal> it also clearly does not care about performance, I think
<cfbolz> fijal: ok, but the whole problem is that cpython is faster, right?
<fijal> cfbolz: not quite, I *think* at the end of the day it's slower
<cfbolz> ah
<cfbolz> ok
<fijal> but it takes a long time to warm up and for the amount of pure python code involved, it's not much slower
<cfbolz> right
<fijal> (and I think our worst latency is still worse)
otisolsen70 has quit [Quit: Leaving]
Darth has joined #pypy
<Darth> Hi, is this Pypy irc chat?
<Corbin> Yep. What's up?
<Darth> I have been trying to compile Pypy 3 for my Raspberry Pi 3b+ and 4B to no avail, as Pypy seems to be consuming over 2.6 GB of RAM during translation and 900 MB during the build process, despite the expected 2 GB of RAM needed on a 32-bit OS
<Darth> Can you advise how I should build Pypy3
<Corbin> Heh, funny coincidence. I ran into a similar issue last night. I think that cross-compilation is a reasonable approach, but your target distro might not have good cross-compiling support.
<cfbolz> pypy is really bad at cross-compiling
<cfbolz> Darth: do the prebuilt arm64 binaries not work?
<Darth> Ah I see...
<Darth> I have yet to test it but I doubt it will work since Raspbian runs on Armv7/Armhf which is 32 bit
<larstiq_> that section also has some advice for trying to manage still
<larstiq_> "More precisely, translation on 32-bit takes at this point 2.7 GB if PyPy is used and 2.9 GB if CPython is used. There are two workarounds: ..."
<Darth> Actually, I was monitoring RAM usage and compilation used over 3.5 GB out of the available 3.6 GB of RAM, which led to an error and the build process aborting
<Corbin> cfbolz: How much of it is environmental problems? Would QEMU be better than cross-compiling?
<Darth> To add on, I am using pypy 2 provided by apt to compile pypy 3 from the source tar ball but the 2nd memory saving measure might be needed as I am not seeing the Ram usage being kept under 3 Gb
<Darth> I heard using QEMU and SB2 to compile Pypy 3 for Raspberry pi is a thing but I have no experience on how to accomplish that...
<mattip> there are two hints I can give
<Darth> Oh? Please do advice
<mattip> one is to use separate source and compile steps
<mattip> and the other is to use environment variables to limit RAM
<mattip> let me look up the exact steps (I did this for the conda-forge build which is done on RAM-restricted machines)
<Darth> As in separate the stages for translation and compilation procedure?
<Darth> Thanks that will be great
<mattip> export PYPY_GC_MAX_DELTA=400MB
<mattip> pypy2 ../../rpython/bin/rpython --no-compile --shared -Ojit targetpypystandalone.py
<mattip> #then cd into the /tmp/usession-XXX/testing_1 directory
<mattip> make
<mattip> cd -
<mattip> cp /tmp/usession-XXX/testing_1/pypy3-c .
<mattip> cp /tmp/usession-XXX/testing_1/libpypy3-c.so .
<mattip> that should do it
<larstiq_> seems similar to the instructions in https://www.pypy.org/download_advanced.html#building-from-source
<Darth> I see and thanks for sharing mattip
<mattip> larstiq_: yeah
<mattip> ahh, right, don't forget to build the cffi c-extension modules like _ssl
<mattip> Mark Shannon asked for comments on PEP 669 https://www.python.org/dev/peps/pep-0669/
<mattip> proposing sys.monitoring for python3.11 instead of sys.settrace, sys.setprofile
<Darth> Would I need to add any specific flags pertaining to Arm and such? After some extensive searching on Stackoverflow and Github, the command I ended up using was: pypy2 pypy3.7-v7.3.7-src/rpython/bin/rpython --opt=jit --platform=arm --gcrootfinder=shadowstack --jit-backend=arm --no-shared
<Darth> /home/pi/bot/pypy3_3/pypy3.7-v7.3.7-src/pypy/goal/targetpypystandalone.py
<mattip> I would start with no flags other than --opt=jit (also written -Ojit )
<Darth> I actually got rid of the --platform=arm and --jit-backend=arm flags as they were reporting a "no SB2 env variable found" error when I executed it
<mattip> comments on PEP 669 can be made to the thread at https://discuss.python.org/t/pep-669-low-impact-monitoring-for-cpython/13018
<Darth> mattip: I had already done what you said initially, but I will have to resort to the alternative method due to the RAM issue
<Corbin> mattip: That's an interesting trick. I'll have to try that with Nix. Thanks.
<mattip> Darth: even with PYPY_GC_MAX_DELTA=400MB the translation runs out of memory?
<mattip> are you translating pypy3 (which translates without micronumpy by default)?
<mattip> or default (pypy2 - which has micronumpy turned on) - in which case try turning it off with --withoutmod-micronumpy
<Darth> I will be trying that shortly as I have to reboot my Raspberry Pi 4 after the memory overran with an error and it's in a unusable (crashed) state
<Darth> To clarify, I am trying to compile Pypy 3 from source but using Pypy 2 to build it, default pypy2 from apt to build pypy3 from source essentially
<Darth> mattip: Do I set the environment variable to 400MB and run pypy2 ../../rpython/bin/rpython --no-compile --shared -Ojit targetpypystandalone.py? I assume --no-compile exists to separate the stages
<mattip> correct ( --shared is the default)
<Corbin> mattip: With --no-compile, is there a reliable way to get the build directory, or do I have to guess by looking at paths?
<Corbin> If I'm allowed to just *set* the build directory, I can work with that.
<mattip> yes, you can look at rpython/tools/udir for the way it works out the build directory
<mattip> using PYPY_USESSION_BASENAME and PYPY_USESSION_DIR
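A hedged sketch of how those two variables determine the usession path, modelled on the `/tmp/usession-release-pypy3.7-v7.3.7-9/testing_1` path seen below (the exact naming logic lives in rpython/tool/udir.py and may differ between PyPy versions; `usession_dir` is a name invented here):

```python
import os
import tempfile

def usession_dir(basename, seq=0):
    # Sketch: the build directory defaults to the system temp dir and
    # a "usession-<basename>-<n>" name, both overridable via env vars
    # (assumption: real udir.py also handles per-user suffixes etc.)
    base = os.environ.get("PYPY_USESSION_DIR", tempfile.gettempdir())
    name = os.environ.get("PYPY_USESSION_BASENAME", basename)
    return os.path.join(base, "usession-%s-%d" % (name, seq))
```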
xcm_ has quit [Remote host closed the connection]
<Darth> Alas, my previous run resulted in a failure while referring to my logs
<Darth> [translation:info] written: /tmp/usession-release-pypy3.7-v7.3.7-9/testing_1/testing_1.c
<Darth> [91cbd] translation-task}
<Darth> [translation:info] Compiling c source...
<Darth> [91cbd] {translation-task
<Darth> starting compile_c
<Darth> [platform:execute] make -j 3 in /tmp/usession-release-pypy3.7-v7.3.7-9/testing_1
<Darth> Killed
<Darth> real    114m31.401s
<Darth> user    99m3.320s
<Darth> sys     0m29.634s
<Darth> Traceback (most recent call last):
<Darth>   File "/home/pi/bot/pypy3_3/pypy3.7-v7.3.7-src/rpython/tool/runsubprocess.py", line 70, in <module>
<Darth>     sys.stdout.write('%r\n' % (results,))
<Darth> IOError: [Errno 32] Broken pipe: '<fdopen>'
xcm_ has joined #pypy
<mattip> uhh, --no-compile was supposed to not run the make
<mattip> so you can cd into /tmp/usession-release-pypy3.7-v7.3.7-9/testing_1
<mattip> and rerun make
greedom has joined #pypy
greedom has quit [Remote host closed the connection]
greedom has joined #pypy
<mattip> any hints how I can get the annotator to mark SomeUnicodeString as notnull?
<mattip> ztranslation is failing on windows in module/posix, the problem seems to be that
<mattip> rposix._as_unicode0 calls FileEncoder.as_unicode() and then checks the result
<cfbolz> mattip: not None?
<cfbolz> or not containing a \0
<mattip> as_unicode() does raise if there is a \x00, but the annotator is not seeing it
<mattip> sorry, no_nul
<mattip> adding "assert '\x00' not in result" before returning from as_unicode() does not help
<cfbolz> mattip: who still uses SomeUnicodeString, btw?
<mattip> yeah, we should get rid of that ...
<cfbolz> mattip: I thought the assert should do it :-(
<mattip> ok, I will ignore it, the correct thing to do is to remove realunicode_w
Darth has quit [Quit: Ping timeout (120 seconds)]
Darth has joined #pypy
lritter has quit [Ping timeout: 256 seconds]
greedom has quit [Remote host closed the connection]
greedom has joined #pypy
greedom has quit [Remote host closed the connection]
Darth has quit [Quit: Ping timeout (120 seconds)]
Darth has joined #pypy
<Darth> mattip: It seems like even after setting PYPY_GC_MAX_DELTA=400MB, I am still unable to get Pypy3 to compile, as the memory use far exceeds the 3.6 GB of RAM my system has to offer
<Darth> I might need to go with the --no-compile option you mentioned earlier and separate the translation and build process
<Darth> How do I proceed with the make step after running the Pypy3 translation with --no-compile? I am still quite a bit confused about what to do after that
<mattip> you cd into the directory you defined, and then run
<mattip> make
<mattip> it will compile the code, then you need to manually copy out the created artifacts
<mattip> pypy3-c and libpypy3-c.so
<mattip> back into the source tree pypy/goal
<mattip> then you build the cffi c-extensions by "cd lib_pypy"
<mattip> and ../pypy/goal/pypy3-c pypy_tools/build_cffi_imports.py
<mattip> now you have an in-place pypy you can play with
<mattip> if you want you can package it into a tarball via python pypy/tools/release/package.py
<mattip> and unzip the tarball into some convenient directory
Atque has quit [Quit: ...]
<Darth> Alright, so by "directory you defined" you mean the pwd and not the /tmp directory, right?
<Darth> I assume I would be able to call Pypy3 directly by typing out its name in the command line after packaging and unzipping the tarball in my directory of choice?
<Darth> mattip: By C extensions, would that include Pycryptodomex, future imports and such? (Sorry for the many questions)
<Darth> What's really weird is that on my 2nd attempt to rerun the compilation process with the same (--compile) command, the console said Killed after the make -j 3 build command. However, the build process continued to run with 3 cc processes running in the background
<Darth> Now I have no idea if this is a failed run and if I would have to attempt using the --no-compile option
<Darth> Edit: Confirmed failed run as the console errored out, ignore the above comment
<Darth> I will attempt building without compiling first in this case now
<mattip> you can restart the make as many times as necessary to complete the compilation, it will pick up where it left off
Atque has joined #pypy
<mattip> "directory you defined" is the one with the Makefile, somewhere in /tmp, by using PYPY_USESSION_BASENAME and PYPY_USESSION_DIR
<mattip> c-extensions are the ones in the top of lib_pypy/pypy_tools/build_cffi_imports.py: _ssl, sqlite3 and more
infernix has quit [Ping timeout: 268 seconds]
infernix has joined #pypy
slav0nic has quit [Ping timeout: 240 seconds]
Darth has quit [Ping timeout: 256 seconds]