<cfbolz>
No no, it's all good, as long as the socket is closed
<cfbolz>
fijal: where is socketio defined?
<fijal>
in socket.py I think
<cfbolz>
Ah, so it's a pure python class, ok
<cfbolz>
No way this is fast :-(
<fijal>
cfbolz: it's not a pure python class, it uses RawIOBase extensively, I think
<fijal>
seems like a very thin wrapper around RawIOBase, but indeed no way it's fast
<fijal>
cfbolz: ok, so can I move this to RPython and try to be smarter about buffers?
<cfbolz>
fijal: I don't quite know, cpython has it in pure python too, right?
<cfbolz>
so it would be weird if we need to write more rpython than they have C code
<fijal>
meh
<fijal>
it also clearly does not care about performance, I think
<cfbolz>
fijal: ok, but the whole problem is that cpython is faster, right?
<fijal>
cfbolz: not quite, I *think* at the end of the day it's slower
<cfbolz>
ah
<cfbolz>
ok
<fijal>
but it takes a long time to warm up and for the amount of pure python code involved, it's not much slower
<cfbolz>
right
<fijal>
(and I think our worst latency is still worse)
otisolsen70 has quit [Quit: Leaving]
Darth has joined #pypy
<Darth>
Hi, is this the PyPy IRC chat?
<Corbin>
Yep. What's up?
<Darth>
I have been trying to compile PyPy3 for my Raspberry Pi 3B+ and 4B to no avail, as PyPy seems to consume over 2.6 GB of RAM during translation and 900 MB during the build process, despite the expected 2 GB of RAM needed for a 32-bit OS
<Darth>
Can you advise how I should build PyPy3?
<Corbin>
Heh, funny coincidence. I ran into a similar issue last night. I think that cross-compilation is a reasonable approach, but your target distro might not have good cross-compiling support.
<cfbolz>
pypy is really bad at cross-compiling
<cfbolz>
Darth: do the prebuilt arm64 binaries not work?
<Darth>
I have yet to test them, but I doubt they will work, since Raspbian runs on ARMv7/armhf, which is 32-bit
<larstiq_>
that section also has some advice for trying to manage regardless
<larstiq_>
"More precisely, translation on 32-bit takes at this point 2.7 GB if PyPy is used and 2.9 GB if CPython is used. There are two workarounds: ..."
<Darth>
Actually, I was monitoring RAM usage, and compilation used over 3.5 GB of the available 3.6 GB of RAM, which led to an error and the build being aborted
<Corbin>
cfbolz: How much of it is environmental problems? Would QEMU be better than cross-compiling?
<Darth>
To add on, I am using the pypy2 provided by apt to compile PyPy3 from the source tarball, but the 2nd memory-saving measure might be needed, as I am not seeing the RAM usage kept under 3 GB
<Darth>
I heard using QEMU and SB2 to compile PyPy3 for the Raspberry Pi is a thing, but I have no experience with how to accomplish that...
<mattip>
there are two hints I can give
<Darth>
Oh? Please do advise
<mattip>
one is to use separate source and compile steps
<mattip>
and the other is to use environment variables to limit RAM
<mattip>
let me look up the exact steps (I did this for the conda-forge build which is done on RAM-restricted machines)
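A minimal sketch of those two hints combined, assuming the 400MB GC delta and the pypy/goal layout that come up later in this log:

    # hint 2: cap how much extra memory the GC lets the translator allocate
    export PYPY_GC_MAX_DELTA=400MB
    # hint 1: translate only (--no-compile); the generated Makefile is run separately
    cd pypy3.7-v7.3.7-src/pypy/goal
    pypy2 ../../rpython/bin/rpython --no-compile -Ojit targetpypystandalone.py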
<Darth>
As in, separate the translation and compilation stages?
<mattip>
proposing sys.monitoring for python3.11 instead of sys.settrace, sys.setprofile
<Darth>
Would I need to add any specific flags pertaining to ARM and such? The command I ended up with after some extensive searching on Stack Overflow and GitHub is: pypy2 pypy3.7-v7.3.7-src/rpython/bin/rpython --opt=jit --platform=arm --gcrootfinder=shadowstack --jit-backend=arm --no-shared
<Darth>
mattip: I had already done what you said initially, but I will have to resort to the alternative method due to the RAM issue
<Corbin>
mattip: That's an interesting trick. I'll have to try that with Nix. Thanks.
<mattip>
Darth: even with PYPY_GC_MAX_DELTA=400MB the translation runs out of memory?
<mattip>
are you translating pypy3 (which translates without micronumpy by default)?
<mattip>
or default (pypy2 - which has micronumpy turned on) - in which case try turning it off with --withoutmod-micronumpy
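A hedged example of the flag placement, assuming the same invocation used elsewhere in this log (target-specific options go after the target file):

    # illustrative: a default/pypy2 translation with micronumpy disabled
    pypy2 ../../rpython/bin/rpython -Ojit targetpypystandalone.py --withoutmod-micronumpy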
<Darth>
I will be trying that shortly, as I have to reboot my Raspberry Pi 4 after the memory overran with an error and it's in an unusable (crashed) state
<Darth>
To clarify, I am trying to compile PyPy3 from source but using PyPy2 to build it - essentially, the default pypy2 from apt building pypy3 from the source tarball
<Darth>
mattip: Do I set the environment variable to 400MB and run pypy2 ../../rpython/bin/rpython --no-compile --shared -Ojit targetpypystandalone.py? I assume --no-compile exists to separate the stages
<mattip>
correct ( --shared is the default)
<Corbin>
mattip: With --no-compile, is there a reliable way to get the build directory, or do I have to guess by looking at paths?
<Corbin>
If I'm allowed to just *set* the build directory, I can work with that.
<mattip>
yes, you can look at rpython/tools/udir for the way it works out the build directory
<mattip>
using PYPY_USESSION_BASENAME and PYPY_USESSION_DIR
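A sketch of how those two variables pin the build directory; the values are illustrative, and the usession naming follows rpython/tools/udir:

    export PYPY_USESSION_DIR=/home/pi/build      # parent directory, defaults to /tmp
    export PYPY_USESSION_BASENAME=pypy3-rpi      # tag embedded in the directory name
    # translation output then lands somewhere like:
    #   /home/pi/build/usession-pypy3-rpi-0/testing_1/Makefile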
xcm_ has quit [Remote host closed the connection]
<Darth>
Alas, referring to my logs, my previous run resulted in a failure
<mattip>
uhh, --no-compile was supposed to not run the make
<mattip>
so you can cd into /tmp/usession-release-pypy3.7-v7.3.7-9/testing_1
<mattip>
and rerun make
greedom has joined #pypy
greedom has quit [Remote host closed the connection]
greedom has joined #pypy
<mattip>
any hints how I can get the annotator to mark SomeUnicodeString as notnull?
<mattip>
ztranslation is failing on windows in module/posix, the problem seems to be that
<mattip>
rposix._as_unicode0 calls FileEncoder.as_unicode() and then checks the result
<cfbolz>
mattip: not None?
<cfbolz>
or not containing a \0
<mattip>
as_unicode() does raise if there is a \x00, but the annotator is not seeing it
<mattip>
sorry, no_nul
<mattip>
adding "assert '\x00' not in result" before returning from as_unicode() does not help
<cfbolz>
mattip: who still uses SomeUnicodeString, btw?
<mattip>
yeah, we should get rid of that ...
<cfbolz>
mattip: I thought the assert should do it :-(
<mattip>
ok, I will ignore it, the correct thing to do is to remove realunicode_w
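A hedged sketch of an alternative to the plain assert, not tried above: rpython.rlib.rstring provides assert_str0, which checks for an embedded NUL and re-annotates its argument as no_nul:

    from rpython.rlib.rstring import assert_str0

    def as_unicode0(u):
        # raises on an embedded NUL, and the annotator sees the
        # result as SomeUnicodeString(no_nul=True)
        return assert_str0(u)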
Darth has quit [Quit: Ping timeout (120 seconds)]
Darth has joined #pypy
lritter has quit [Ping timeout: 256 seconds]
greedom has quit [Remote host closed the connection]
greedom has joined #pypy
greedom has quit [Remote host closed the connection]
Darth has quit [Quit: Ping timeout (120 seconds)]
Darth has joined #pypy
<Darth>
mattip: It seems that even with PYPY_GC_MAX_DELTA=400MB set, I am still unable to get PyPy3 to compile, as memory usage far exceeds the 3.6 GB of RAM my system has onboard
<Darth>
I might need to proceed with the --no-compile option you mentioned earlier and separate the translation and build processes
<Darth>
How do I proceed with the make step after running the PyPy3 translation with --no-compile? I am still quite confused about what to do after that
<mattip>
you cd into the directory you defined, and then run
<mattip>
make
<mattip>
it will compile the code, then you need to manually copy out the created artifacts
<mattip>
pypy3-c and libpypy3-c.so
<mattip>
back into the source tree pypy/goal
<mattip>
then you build the cffi c-extensions by "cd lib_pypy"
<mattip>
and ../pypy/goal/pypy3-c pypy_tools/build_cffi_imports.py
<mattip>
now you have an in-place pypy you can play with
<mattip>
if you want you can package it into a tarball via python pypy/tools/release/package.py
<mattip>
and unzip the tarball into some convenient directory
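Putting those steps together as one hedged shell sketch; the usession and source-tree paths are illustrative, borrowed from examples earlier in the log:

    # 1. finish the C compilation in the translation directory
    cd /tmp/usession-release-pypy3.7-v7.3.7-9/testing_1
    make
    # 2. copy the artifacts back into the source tree
    cp pypy3-c libpypy3-c.so ~/pypy3.7-v7.3.7-src/pypy/goal/
    # 3. build the cffi c-extension modules in-place
    cd ~/pypy3.7-v7.3.7-src/lib_pypy
    ../pypy/goal/pypy3-c pypy_tools/build_cffi_imports.py
    # 4. optionally package everything into a tarball
    cd ..
    python pypy/tools/release/package.py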
Atque has quit [Quit: ...]
<Darth>
Alright, so by "directory you defined" you mean my working directory and not the /tmp directory, right?
<Darth>
I assume I would be able to call Pypy3 directly by typing out its name in the command line after packaging and unzipping the tarball in my directory of choice?
<Darth>
mattip: By C extensions, would that include packages like Pycryptodomex, future and such? (Sorry for the many questions)
<Darth>
What's really weird is that on my 2nd attempt to rerun the compilation with the same (--compile) command, the console said "Killed" after the make -j 3 step. However, the build process continued to run with 3 cc processes in the background
<Darth>
Now I have no idea if this is a failed run and if I would have to attempt using the --no-compile option
<Darth>
Edit: confirmed failed run, as the console errored out; ignore the above comment
<Darth>
I will attempt building without compiling first in this case now
<mattip>
you can restart the make as many times as necessary to complete the compilation, it will pick up where it left off
Atque has joined #pypy
<mattip>
"directory you defined" is the one with the Makefile, somewhere in /tmp, by using PYPY_USESSION_BASENAME and PYPY_USESSION_DIR
<mattip>
c-extensions are the ones listed at the top of lib_pypy/pypy_tools/build_cffi_imports.py: _ssl, sqlite3 and more