<ronan>
mattip: can you have a look at it? I'm still fairly confused by all the different ways of including these headers
<steve_s>
I still wonder if the public C APIs should offer HPy versions, which should also be used by the already-ported internal code -- passing the context around instead of using some global state. The legacy Python C API versions can just fetch the HPy context and delegate to the new HPy variant.
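(The delegation pattern steve_s describes could look roughly like the sketch below. All names here -- `HPyContext`, `npy_get_context`, `internal_add_hpy`, `internal_add_legacy` -- are illustrative stand-ins, not the real HPy or NumPy API: the legacy entry point keeps its old signature and simply fetches the context before delegating to the context-taking variant.)

```c
/* Hypothetical stand-in for the HPy context type. */
typedef struct { const char *name; } HPyContext;

static HPyContext global_ctx = { "process-wide context" };

/* Legacy code path: fetch the (currently global) context on demand. */
static HPyContext *npy_get_context(void)
{
    return &global_ctx;
}

/* Ported internal code takes the context explicitly. */
static int internal_add_hpy(HPyContext *ctx, int a, int b)
{
    (void)ctx;  /* real code would use ctx for upcalls */
    return a + b;
}

/* The legacy C-API entry point just fetches the context and delegates. */
int internal_add_legacy(int a, int b)
{
    return internal_add_hpy(npy_get_context(), a, b);
}
```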
<mattip>
ronan: rather than use a static function in the header, I think it should be part of the numpy c api
<mattip>
so you add /*NUMPY_API */ to the comment before the function definition in the C file, and a parser adds it to the struct-of-functions
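(For readers unfamiliar with the convention: NumPy's build step scans C sources for functions whose preceding comment starts with `/*NUMPY_API` and collects them into the exported function table. The function below is a hypothetical example of the annotation style only; the name and signature are made up for illustration.)

```c
/*NUMPY_API
 * NpyHPy_GetContext: hypothetical accessor that the parser would pick up
 * and add to the struct-of-functions (the exported C-API table).
 */
void *NpyHPy_GetContext(void)
{
    /* placeholder object standing in for the real context */
    static int hpy_context_placeholder;
    return &hpy_context_placeholder;
}
```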
<mattip>
maybe instead of a capsule, you can use a PyContextVar
<mattip>
see for instance PyDataMem_GetHandler in numpy/core/src/multiarray/alloc.c
<ronan>
mattip: It can't be in PyArray_API (which actually defines the ABI, right?) because it's used in static inline functions defined in a header that's required by __multiarray_api.h
<Hodgestar>
steve_s: On the previous call I thought we all agreed that we liked the trampoline method best, and that maybe it is a useful thing long term. It's also quite onerous, but that won't hurt things like pybind11, where the hard parts probably only need to be written a few times.
<ronan>
I don't really understand ContextVars, TBH, but I don't see the point of using them since there's only one numpy module per process
<vstinner>
ContextVars have a different value per thread, if I recall correctly
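(The "different value per thread" behaviour vstinner recalls is roughly what C11 thread-local storage gives you; a minimal sketch, using plain `_Thread_local` rather than anything from the Python or HPy API, with made-up names:)

```c
#include <pthread.h>

/* Each thread sees its own independent copy of this variable. */
static _Thread_local int context_id = 0;

static void *worker(void *out)
{
    context_id = 42;              /* visible only inside this thread */
    *(int *)out = context_id;
    return NULL;
}

int run_demo(void)
{
    int seen_in_worker = 0;
    pthread_t t;

    context_id = 1;               /* the main thread's own value */
    pthread_create(&t, NULL, worker, &seen_in_worker);
    pthread_join(t, NULL);

    /* the worker's assignment did not touch the main thread's copy */
    return context_id * 100 + seen_in_worker;
}
```

(Note that Python's `ContextVar` is richer than plain thread-locals -- contexts can also be copied and entered explicitly -- but per-thread independence is the part relevant to the discussion here.)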
<mattip>
is the HPy context one-per-process or one-per-thread?
<antocuni>
it depends on the implementation
<antocuni>
so far:
<antocuni>
for the CPython ABI, it's a per-process global
<antocuni>
for the universal ABI on CPython, it's a per-process singleton, but each module stores its own reference to it
<antocuni>
for PyPy, same as universal ABI on CPython IIRC
<antocuni>
in practice, at the moment you can have two contexts around: the normal one and the debug-mode one
<vstinner>
and then CPython subinterpreters enter the room >_<
<Hodgestar>
We should probably give some thought to the semantics of the trampolines (if we add them) and what to do in edge cases. E.g. "HPy_Trampoline(ctx, f)" could mean "give me a function pointer that will call f in an appropriate context for the same interpreter that ctx was from", and we probably need a way to return an error if the interpreter is no longer running.
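(Hodgestar's proposed semantics could be sketched as below. This is not the real HPy API: `HPyContext`, `HPy_Trampoline`, and the single global trampoline slot are all illustrative simplifications -- a real implementation would need per-trampoline storage and real interpreter re-entry -- but it shows the contract: you get back a plain function pointer bound to the originating interpreter, or NULL if that interpreter is gone.)

```c
#include <stddef.h>

typedef struct { int running; } HPyContext;          /* stand-in type */
typedef int (*hpy_func)(HPyContext *ctx, int arg);
typedef int (*plain_func)(int arg);

/* One global slot stands in for real per-trampoline storage. */
static HPyContext *bound_ctx;
static hpy_func bound_f;

static int trampoline_entry(int arg)
{
    /* a real trampoline would re-enter the interpreter ctx came from */
    return bound_f(bound_ctx, arg);
}

/* Returns NULL when the interpreter is no longer running, as the
   message above suggests an error path is needed. */
plain_func HPy_Trampoline(HPyContext *ctx, hpy_func f)
{
    if (!ctx->running)
        return NULL;
    bound_ctx = ctx;
    bound_f = f;
    return trampoline_entry;
}
```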
<mattip>
ronan: I see. Those changes are problematic since callers of those static functions assume they can be called without the GIL
<mattip>
ronan: with that, the changeset to add npy_get_context() seems appropriate now that I look at it in the context of the whole branch
<phlebas>
We should keep the possibility open that the HPy context is one per downcall scope. When running on GraalVM with the LLVM interpreter (the "managed mode"), this might help us tremendously; at the very least, what GraalVM wants is one context per thread. This makes upcalls on the JVM much faster, since there is some thread-specific state that needs to be handled.
<antocuni>
I agree
<Hodgestar>
+1 from me too.
<Hodgestar>
Right now my view of the context is "it's the context the interpreter has given the function to run with". Everything else is not part of the contract at the moment.