
I know that the "object/method calling API" is an open question but in the meantime is there any reliable (but maybe ugly) way for me to call an HPyDef_METH in a "Universal" (non-CPython) context?
I'm looking for something like this:
extern int PyWasm_Call_PyMethod(void *meth_handle, void *args, void *kwargs) {
    HPyMeth *method = (HPyMeth *)meth_handle;
    PyWasmDebug("Calling func %s with %p and %p", method->name, args, kwargs);
    HPy rc = method->impl(args, kwargs);
}
It's unclear to me whether the thing in HPyMeth->impl has a uniform calling convention or the heterogeneous one exposed by the objects themselves.
For example:
static HPy do_nothing_impl(HPyContext ctx, HPy self)
Versus
static HPy add_ints_impl(HPyContext ctx, HPy self, HPy *args, HPy_ssize_t nargs)
Do I have to look at the signature and call each one with the "right" calling convention?
Or is there a particular CPython library that I could pull in to do the calling without pulling in the whole interpreter (which would defeat the purpose of HPy!)?
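To make the question concrete, here's the kind of per-signature dispatch I imagine being forced to write on the host side. This is a pure-Python sketch with ctypes; Sig, call_impl and the prototypes are my invented stand-ins, not HPy's real ABI:

```python
import ctypes
from enum import IntEnum

class Sig(IntEnum):
    NOARGS = 1
    O = 2

# Each calling convention needs its own ctypes prototype before an opaque
# function pointer can be invoked; handles/contexts are modeled as integers.
NOARGS_FN = ctypes.CFUNCTYPE(ctypes.c_size_t, ctypes.c_size_t, ctypes.c_size_t)
O_FN = ctypes.CFUNCTYPE(ctypes.c_size_t, ctypes.c_size_t,
                        ctypes.c_size_t, ctypes.c_size_t)

def call_impl(sig, raw_ptr, ctx, self_h, args=()):
    # Look at the signature and cast the raw pointer to the matching prototype.
    if sig == Sig.NOARGS:
        return NOARGS_FN(raw_ptr)(ctx, self_h)
    elif sig == Sig.O:
        return O_FN(raw_ptr)(ctx, self_h, args[0])
    raise NotImplementedError(sig)
```

(The dispatch-by-signature shape is exactly what I'd prefer not to hand-write for every convention.)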
Paul

Hi Paul,
As you say, some of these things are still being figured out. I'm attempting to answer below, but feel free to describe more about what you are implementing and what HPy additions or changes would make things better from that perspective.
On Wed, Dec 23, 2020 at 2:30 AM Paul Prescod <paul@prescod.net> wrote:
extern int PyWasm_Call_PyMethod(void *meth_handle, void *args, void *kwargs){
All the "void *"s in the signature make it a bit unclear to me what you're expecting meth_handle, args and kwargs to be. Are they each an HPy handle?
If they are all HPy handles, perhaps you want HPy_Call which I have implemented in https://github.com/hpyproject/hpy/pull/147 although it isn't merged yet.
HPyMeth *method = (HPyMeth *)meth_handle;
This line doesn't make sense to me because an HPy instance cannot be cast to an HPyMeth. If you really do have an HPyMeth instance, then maybe you can look at ctx_CallRealFunctionFromTrampoline in:
https://github.com/hpyproject/hpy/blob/master/hpy/universal/src/ctx_meth.c#L...
That is the CPython implementation of ctx->ctx_CallRealFunctionFromTrampoline. If you are implementing your own context (HPyContext), you'll probably want to write your own implementation. If you're not implementing your own context, I would probably suggest not writing code that uses HPyMeth directly.
It's unclear to me whether the thing in HPyMeth->impl has a uniform calling convention or the heterogeneous one exposed by the objects themselves.
The signatures of the impl functions vary according to the signature in HPyMeth->signature.
There is discussion of how to make calling the implementations more efficient by bypassing the argument parsing logic in https://github.com/hpyproject/hpy/issues/129 but I think that might be a bit separate from what you're asking here.
Hopefully this is useful. Feel free to reply with more questions.
Yours sincerely, Simon Cross

Thanks for the pointers Simon.
Yes I am implementing a context, where the host is a WASM container and the extension is compiled into a WASM module.
Just to step back a bit: it's been a long time since I programmed against the C API and now I'm doing it in the unfamiliar context of HPy, so terminology is confusing me a bit.
It certainly doesn't help that so many method signatures have void * and HPy types, although I understand why that is! Might it make sense to have different "kinds" of HPy pointer which hint at what interface the object behind the handle must implement (analogous to Python type hints)?
For example: HPy HPy_Call(HPyContext ctx, HPyCallable callable, HPySequence args, HPyMapping kw); In this case, have I guessed correctly that any sequence works for args, or does it have to be an actual tuple? Same for the mapping.
One terminology thing that's confusing me is "Method" versus "Function". It seems like they are used essentially interchangeably at the C level whereas they have clear distinctions at the Python level. Is this something worth cleaning up in HPy? For me it is slightly confusing that lines like this seem to use Meth and Func interchangeably (unless I'm misunderstanding something!):
HPyDef_METH(call, "call", call_impl, HPyFunc_VARARGS)
Part of why I hadn't looked much at the ctx_CallRealFunctionFromTrampoline stuff is that I thought maybe it referred to "C functions" used to implement an HPy context or extension rather than "Python functions" in a module.
I see what you mean about HPyMeth * not being compatible with HPy and I think that's just a typo/thinko in my pseudo-code.
That is the CPython implementation of
Yeah, that's what I was afraid of!
I'm confused though: the link you sent me to is in the "universal" directory. Is it really CPython-specific?
Part of what's confusing me is that CPython's implementation of PyObject_Call revolves around the tp->call pointer and I don't see much or any reference to that in HPy. It's unclear to me how the HPy_Call here:
https://github.com/hpyproject/hpy/pull/147/files#diff-3db9d2f8315cd47cac1b24...
relates to ctx->ctx_CallRealFunctionFromTrampoline.
Basically I can't put the pseudo-code for ctx_Call together in my head. ctx_Call should invoke tp->call, but HPyDefs don't have a tp->call.
(another terminological weirdness you inherited from CPython is HPy_CallObject. What makes it more "related" to objects than HPy_Call?)
Sorry for the long and diverse email!
On Wed, Dec 23, 2020 at 3:36 AM Simon Cross <hodgestar@gmail.com> wrote:

On Wed, Dec 23, 2020 at 6:24 PM Paul Prescod <paul@prescod.net> wrote:
Yes I am implementing a context, where the host is a WASM container and the extension is compiled into a WASM module.
Could you perhaps share a link to the code you have so far? I don't promise to read all of it, but it might help avoid me trying to guess the details of what you are building.
My view is that we are actually trying to avoid such fine-grained typing at the C-level. Handles are intended to be opaque and building a C analogue of Python ABCs on top of them seems to go against that.
At the C level they are more similar -- it is only how they are called that is different, not how they are defined.
I'm confused though: the link you sent me to is in the "universal" directory. Is it really CPython-specific?
This confused me too initially! What is in hpy/universal is really the *CPython* implementation of the universal ABI. PyPy and GraalPython (and maybe now you?) have their own implementations.
I have a plan to reorganize the package structure to make this confusion go away, but it's on the todo list behind some other pieces of work.
Part of what's confusing me is that CPython's implementation of PyObject_Call revolves around the tp->call pointer and I don't see much or any reference to that in HPy. It's unclear to me how the HPy_Call here:
...
HPy_Call and ctx_CallRealFunctionFromTrampoline are really completely separate things -- which is probably why trying to put them together is confusing.
HPy_Call is a means to call a function passed as an HPy handle. Typically it will be used in the code of a C extension.
ctx_CallRealFunctionFromTrampoline is part of how the Python interpreter calls methods or functions implemented in a C extension using HPy. Typically it is only called behind the scenes and C extension writers will not use it directly (although since it's part of the ctx, they could). It's needed because the HPy functions, e.g., f_impl, don't have the same signatures as methods or functions implemented using the old C API.
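As a rough pure-Python analogue (invented names; the real trampolines are generated C code): a trampoline presents the calling convention the interpreter expects and forwards to the signature-specific impl behind the scenes:

```python
def make_trampoline(ctx, impl, sig):
    """Adapt a signature-specific impl to a uniform Python callable.

    A toy model of what the generated trampolines do: the interpreter only
    ever sees `trampoline`, which has the call shape it expects, while the
    signature-specific impl is invoked underneath.
    """
    if sig == "NOARGS":
        def trampoline(self_h):
            return impl(ctx, self_h)
    elif sig == "O":
        def trampoline(self_h, arg):
            return impl(ctx, self_h, arg)
    else:
        raise NotImplementedError(sig)
    return trampoline
```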
(another terminological weirdness you inherited from CPython is HPy_CallObject. What makes it more "related" to objects than HPy_Call?)
I have no idea why the CPython API method is called PyObject_CallObject. :)
In general in HPy we are constantly faced with the question of whether we should make some part of the old C API better. We are trying to make only changes that are really necessary, and to keep many of the "irrelevant" warts in order to make the lives of people who port their extensions to HPy easier.
Sorry for the long and diverse email!
Not a problem at all!
Yours sincerely, Simon Cross

Happy holidays HPy folks!
My work is very messy so far, but it's here:
https://github.com/hpyproject/hpy/compare/master...prescod:feature/wasm-prot...
Let me try and summarize it though.
HPy Universal modules are DLLs which are compatible with any Python version that implements HPy.
But imagine that there were a pure-Python wrapper for HPy built on ctypes. Then, if you implement ctypes, you would get HPy "for free".
Now go a step farther: we compile the binaries to WASM. So CPU architecture and operating system ALSO do not matter. Any binary can work in any Python implementation with access to a WASM runtime. Also, you only need a small subset of ctypes' capabilities (perhaps just "struct"?).
That's what I'm trying to prototype.
So for example, to support the reflection of types from the extension into the host environment, I define this ctypes structure:
class HPyType_Spec(Structure):
    _fields_ = [
        ("name", c_int32),
        ("basicsize", c_int32),
        ("itemsize", c_int32),
        ("flags", c_uint32),
        ("legacy_slots", c_void_p),
        ("defines", c_void_p),
    ]
And then I write code like this
def PyUUType_FromSpec(runtime_context, ctx: int, spec: voidptr, params: voidptr) -> int:
    sizeof_spec = ctypes.sizeof(HPyType_Spec)
    struct_view = runtime_context.Ptr(spec).deref_to_view()[0:sizeof_spec]
    struct_obj = HPyType_Spec.from_buffer_copy(struct_view)
    nameptr = runtime_context.Ptr(struct_obj.name)
    name = nameptr.deref_to_str()
    return runtime_context.new_handle(type(name, (), {}))
So this creates a new type in the host which reflects the name of the PyType that was defined in C. (Now I'm wondering if I can make CFFI work for me in this context.)
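Stripped of the WASM memory reads, the host-side effect is just the three-argument type() call:

```python
# What the reflection boils down to on the host: build a fresh Python class
# whose name was read out of the extension's linear memory.
name = "Point"                 # stands in for nameptr.deref_to_str()
NewType = type(name, (), {})   # no bases, no methods yet
instance = NewType()
```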
The next step is to make the methods on this type callable. That's where I'm not sure whether I should try to reuse the trampoline code you pointed me at or just start reimplementing it in Python.
I'm not asking that the handle "types" express the internal structure of the objects but rather the operations that can reasonably be done on them. Just like Python type annotations in pure-Python code. If an object legitimately MUST be a sequence or MUST be a tuple then you need to express that *somehow*, whether it be documentation or type signatures. And just as in Python we are tending towards type signatures instead of documentation, I think that's true here too.
Yeah, so my question is whether I need to implement ctx_CallRealFunctionFromTrampoline from scratch or is the implementation intended to be reused? Would PyPy or Jython 3 use the one in the HPy repo or would they have to implement their own?
Or could there be some function that simplifies this for interpreter creators? It seems like there are more than 20 calling conventions in the HPyFunc_Signature enum.
Couldn't a helper function implement all of the method argument "parsing" and call convention stuff for all host interpreters?
In the meantime I suppose I'll just start implementing my own arg call trampoline thingee.
I run into a lot of build problems whenever I include universal/src/ctx_meth.c so that has made it hard for me to experiment with it. I don't build with setup.py so I guess I'm missing something.
$ /usr/local/Cellar/llvm@9/9.0.1_2/bin/clang -I ./hpy/devel/include/ -I /usr/local/Cellar/python@3.8 /3.8.6_2/Frameworks/Python.framework/Versions/3.8/include/python3.8/ hpy/universal/src/ctx_meth.c
In file included from hpy/universal/src/ctx_meth.c:2:
In file included from hpy/universal/src/ctx_meth.h:2:
hpy/universal/src/api.h:6:9: warning: 'HPyAPI_STORAGE' macro redefined [-Wmacro-redefined]
#define HPyAPI_STORAGE _HPy_HIDDEN
        ^
./hpy/devel/include/cpython/hpy.h:17:9: note: previous definition is here
#define HPyAPI_STORAGE __attribute__((unused)) static inline
        ^
In file included from hpy/universal/src/ctx_meth.c:3:
hpy/universal/src/handles.h:90:5: error: expected ')'
HPy _py2h(PyObject *);

I seem to be making headway like this:
def call(self, objself, args, kwargs):
    if objself is None:
        objself = self.runtime.new_handle(None)
    if objself:
        assert self.runtime.has_handle(objself)  # should be a handle
    if self.signature == HPyFunc_Signature.HPyFunc_NOARGS.value:
        assert not args and not kwargs
        res = self.runtime.instance.exports.PyUU_Call_HPyFunc_NOARGS(
            self.extension_context, self.impl, objself
        )
        return self.runtime.resolve_handle(res)
    elif self.signature == HPyFunc_Signature.HPyFunc_O.value:
        assert not kwargs
        assert len(args) == 1
        arg = self.runtime.new_handle(args[0])
        res = self.runtime.instance.exports.PyUU_Call_HPyFunc_O(
            self.extension_context, self.impl, objself, arg
        )
        return self.runtime.resolve_handle(res)
    elif ...:
        # these will be harder, but doable
        ...
On Fri, Dec 25, 2020 at 8:39 PM Paul Prescod <paul@prescod.net> wrote:

Hello Paul, sorry for the late response. Maybe you have already solved your problems in the meantime, but let me try to sort out some of the confusions. Generally speaking, it is probably a good idea to look at the PyPy implementation of HPy, which is way cleaner than the CPython one.
Calling HPyMeth: as you discovered, HPyMeths have ->impl, which is a function pointer to a generic C function, and ->signature, which describes the signature of impl. So yes, if you want to call it you need to handle all the cases. However, at the moment the signature of an HPyMeth is supposed to be only _VARARGS, _KEYWORDS, _NOARGS or _O. If you try to build an HPyMeth with a different signature you will probably get assertion errors or abort()s during the execution of your program.
ctx_CallRealFunctionFromTrampoline: forget about this, you should not call it at all. In theory, an HPy-compliant python implementation should be able to call HPyMeth directly, but in practice CPython obviously can't. So, in order to be able to implement hpy.universal for CPython, we generate CPython-only trampolines when we compile the extension (trampolines which are and should be completely ignored by all the other implementations); CallRealFunctionFromTrampoline is used in the body of these autogenerated trampolines and it is implemented only by the CPython version of hpy.universal.
ciao, Anto
On Wed, Dec 23, 2020 at 1:30 AM Paul Prescod <paul@prescod.net> wrote:

On Mon, Dec 28, 2020 at 10:04 AM Antonio Cuni <anto.cuni@gmail.com> wrote:
Thanks Antonio. Where can I find the PyPy implementation of HPy? Maybe I can reuse parts of it.
Calling HPyMeth: as you discovered, HPyMeths have a ->impl, which is a
Okay great, I did implement those 4 signatures.
Paul Prescod

Okay, I found the hpy/pypy branch. My design and its had already converged in a lot of ways.
@functions.add()
def HPyUnicode_FromString(runtime_context, ctx: int, utf8: int) -> handle:
    data = runtime_context.decode(utf8)
    return runtime_context.new_handle(data)
Versus:
@API.func("HPy HPyUnicode_FromString(HPyContext ctx, const char *utf8)")
def HPyUnicode_FromString(space, ctx, utf8):
    w_obj = _maybe_utf8_to_w(space, utf8)
    return handles.new(space, w_obj)
But on the other hand, I (for example) create new types like this:
newtype = type(type_struct.name_as_str, bases, defines)

And you create them like this:

w_type = W_HPyTypeObject(space, name, bases_w or [space.w_object], dict_w, basicsize)
On Mon, Dec 28, 2020 at 8:05 PM Paul Prescod <paul@prescod.net> wrote:

In PyPy, if I understand correctly, the interpreted RPython "runtime" is injecting things into the JIT-based "Python runtime" which is separate and called "space".
In my thing, the HPy host code runs in the same memory management "space" as the full Python interpreter, and it's the extension which is in a different memory management "space".
So that's why your HPyLong_AsLong looks like this:

@API.func("long HPyLong_AsLong(HPyContext ctx, HPy h)", error_value=API.cast("long", -1))
def HPyLong_AsLong(space, ctx, h):
    w_long = handles.deref(space, h)
    return space.int_w(space.int(w_long))
and mine looks like this:

@functions.add()
def HPyLong_AsLong(runtime_context, ctx: int, num_handle: handle) -> int:
    number = runtime_context.resolve_handle(num_handle)
    return number
I'm just dereferencing a Python number object. You're extracting it from a different memory context.
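For reference, here is roughly what I mean by new_handle/resolve_handle (a simplified sketch; my actual runtime_context also has to deal with WASM memory):

```python
class HandleTable:
    """Minimal handle manager: maps small integers to host Python objects."""

    def __init__(self):
        self._objects = {}
        self._next = 1  # 0 stays reserved as a NULL handle

    def new_handle(self, obj):
        h = self._next
        self._next += 1
        self._objects[h] = obj
        return h

    def resolve_handle(self, h):
        return self._objects[h]

    def has_handle(self, h):
        return h in self._objects

    def close_handle(self, h):
        del self._objects[h]
```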
I could easily add a "space" abstraction to my system to get closer to sharing code but where it gets hairy is lltype, rffi, rgc.
It would be cool if any Python implementation could share SOME of the host-side semantics but the PyPy code may have too many PyPy quirks like rffi and rgc baked in.
It's an interesting thought experiment though.
What if GraalPython, IronPython, and HPy-Wasm could implement "space", "ffi" and "gc" interfaces and get functions like HPyType_FromSpec for free from the HPy project?
Is it a crazy idea? Could it work?
I have an equivalent for e.g. lltype.malloc, lltype.nullptr, etc.
But the lltype interface is hard-coded as being imported from rpython.rtyper.lltypesystem.
With some form of dependency injection, I could provide my own. And GraalPython could supply ITS own, and so forth. And tons of code could be reused, IMO.
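A sketch of the kind of injection I mean (all names invented): shared host-side code would be written against a small backend interface instead of importing lltype directly:

```python
class LLBackend:
    """Hypothetical low-level backend interface each implementation supplies."""
    def malloc(self, size): ...
    def nullptr(self): ...

class WasmBackend(LLBackend):
    def malloc(self, size):
        return bytearray(size)  # host-managed buffer standing in for raw memory
    def nullptr(self):
        return None

def shared_alloc_example(backend: LLBackend, size):
    # Shared HPy host code would receive `backend` by injection rather than
    # importing rpython.rtyper.lltypesystem.lltype itself.
    return backend.malloc(size)
```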
On Mon, Dec 28, 2020 at 10:01 PM Paul Prescod <paul@prescod.net> wrote:

Hello Paul,
On Tue, Dec 29, 2020 at 7:31 AM Paul Prescod <paul@prescod.net> wrote:
not really. Forget about the JIT for now, it's not important for this discussion.
PyPy is a Python interpreter written in RPython. RPython itself is a low-level, statically typed language which is translated to C and then compiled by gcc into an executable. JIT apart, PyPy's interpreter is fairly standard: it has a bytecode compiler, a bytecode eval loop, etc. In PyPy jargon, we call code written in RPython "interp-level code", and code written in "real" Python "app-level code". The app-level code is interpreted by the interpreter written at interp level (that's where the names come from). (Note: the interp-level code can be compiled to C or be executed directly by a host Python interpreter, thus having a double interpretation. This is very useful for tests, but for the sake of this discussion you can forget about it and think of RPython as a normal compiled language.) So, from this point of view, RPython : PyPy = C : CPython = Java : GraalPython = Rust : RustPython. The code inside pypy/module/_hpy_universal/ is interp-level RPython.
In my thing, the HPy host code runs in the same memory management "space"
as the full Python interpreter and its the extension which is in a different memory management "space".
You can think of the "space" as a namespace (it's not exactly this, but it's a good enough approximation). So, space.add is completely equivalent to CPython's PyObject_Add, space.sub to PyObject_Subtract, and so on.
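A toy illustration (my invented class, not PyPy's actual machinery): a space is essentially a bundle of interpreter-level operations:

```python
import operator

class ToySpace:
    """Toy object space: each method is one interpreter-level operation,
    analogous to one of CPython's PyObject_* functions."""

    def add(self, w_a, w_b):   # cf. PyObject_Add
        return operator.add(w_a, w_b)

    def sub(self, w_a, w_b):   # cf. PyObject_Subtract
        return operator.sub(w_a, w_b)

    def int_w(self, w_obj):    # unwrap an app-level value to a machine int
        return int(w_obj)
```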
The code looks similar superficially because this is a very simple function, but as you noticed it can quickly become more complicated. This happens because the code in _hpy_universal is very tied to the PyPy's Python interpreter (it's an extension module written specifically for it, after all), and also because it's meant to be compiled at the level of C, that's why you see things like lltype.malloc.
What you do is completely different: if I understand correctly (see below) you are trying to implement an HPy universal importer in pure Python, at app level. Your thing runs on any Python interpreter; PyPy's code has no chance of doing that.
Personally, I don't see any advantage in doing that. You will end up sharing only very simple and uninteresting stuff, and have to diverge for the more important ones.
Let me also answer here to your other email "WASM <--> HPy", to avoid spreading the conversation over too many threads. If I understand it correctly, you are writing a pure-Python module which allows you to import HPy WASM-compiled extensions into any Python interpreter. The concept itself is very cool and, if written well, could be a good reference implementation for HPy's semantics. Apart from that, I fail to understand if/how/why it is useful in the real world: implementing it in pure Python introduces a lot of overhead, so extensions will be horribly slow.
If the goal is to import HPy extensions in the browser, then you must already have a Python interpreter running in the browser, and this Python interpreter should provide its own low-level implementation of HPy instead of using this pure-Python backend. If the goal is to demonstrate that this is doable, then I agree it's very cool :)
A few other random comments:
- I see that you wrote it by modifying the HPy repo. This should not be necessary, and such a project should live in a separate repo, IMHO.
- I see that you modified/added some files in hpy/devel/include. This is "wrong": hpy/devel contains the .h files which are necessary to compile universal extensions. If you need to provide a special header in order to compile for WASM, then they are no longer really universal. I admit I didn't look in detail though, so I might be wrong.
- You should use cffi instead of ctypes. I can't think of any good reason to prefer ctypes over cffi apart from the fact that it is in the stdlib, and that's not a good enough reason to use it :)

Hi Paul,
As you say, some of these things are still being figured out. I'm attempting to answer below, but feel free to describe more about what you are implementing and what HPy additions or changes would make things better from that perspective.
On Wed, Dec 23, 2020 at 2:30 AM Paul Prescod <paul@prescod.net> wrote:
extern int PyWasm_Call_PyMethod(void *meth_handle, void *args, void *kwargs){
All the "void *"s in the signature make it a bit unclear to me what you're expecting meth_handle, args and kwargs to be. Are they each an HPy handle?
If they are all HPy handles, perhaps you want HPy_Call which I have implemented in https://github.com/hpyproject/hpy/pull/147 although it isn't merged yet.
HPyMeth *method = (HPyMeth *)meth_handle;
This line doesn't make sense to me because an HPy instance cannot be cased into a HPyMeth. If you really do have an HPyMeth instance, then maybe you can look at ctx_CallRealFunctionFromTrampoline in:
https://github.com/hpyproject/hpy/blob/master/hpy/universal/src/ctx_meth.c#L...
That is the CPython implementation of ctx->ctx_CallRealFunctionFromTrampoline. If you are implementing your own context (HPyContext), you'll probably want to write your own implementation. If you're not implementing your own context, I would probably suggest not writing code that uses HPyMeth directly.
It's unclear to me whether the thing in HPyMeth->impl has a uniform calling convention or the heterogeneous one exposed by the objects themselves.
The signatures of the impl functions vary according to the signature in HPyMeth->signature.
There is discussion of how to make calling the implementations more efficient by bypassing the argument parsing logic in https://github.com/hpyproject/hpy/issues/129 but I think that might a be bit separate from what you're asking here.
Hopefully this is useful. Feel free to reply with more questions.
Yours sincerely, Simon Cross

Thanks for the pointers Simon.
Yes I am implementing a context, where the host is a WASM container and the extension is compiled into a WASM module.
Just to step back a bit: it's been a long time since I programmed against the C API and now I'm doing it in the unfamiliar context of HPy, so terminology is confusing me a bit.
It certainly doesn't help that so many method signatures have void * and HPy types, although I understand why that is! Might it make sense to have different "kinds" of HPy pointer which hint at what interface the object behind the handle must implement (analogous to Python type hints)?
For example: HPy HPy_Call(HPyContext ctx, HPyCallable callable, HPySequence args, HPyMapping kw); For example in this case: have I have guessed correctly that any sequence works for args or does it have to be an actual tuple? Same for mapping.
One terminology thing that's confusing me is "Method" versus "Function". It seems like they are used essentially interchangeably at the C level whereas they have clear distinctions at the Python level. Is this something worth cleaning up in HPy? For me it is slightly confusing that lines like this seem to use Meth and Func interchangeably (unless I'm misunderstanding something!):
HPyDef_METH(call, "call", call_impl, HPyFunc_VARARGS) Part of why I hadn't looked much at the ctx_CallRealFunctionFromTrampoline stuff is that I thought maybe it referred to "C functions" used to implement an HPy context or extension rather than "Python functions" in a module.
I see what you mean about HPyMeth * not being compatible with HPy and I think that's just a typo/thinko in my pseudo-code.
That is the CPython implementation of
Yeah, that's what I was afraid of!
I'm confused though: the link you sent me to is in the "universal" directory. Is it really CPython-specific?
Part of what's confusing me is that CPython's implementation of PyObject_Call revolves around the tp->call pointer and I don't see much or any reference to that in HPy. It's unclear to me how the HPy_Call here:
https://github.com/hpyproject/hpy/pull/147/files#diff-3db9d2f8315cd47cac1b24...
Relates to the ctx->ctx_CallRealFunctionFromTrampoline
Basically I can't put the pseudo-code for ctx_Call together in my head. ctx_Call should invoke tp->call but HPyDefs don't have a t->call.
(another terminological weirdness you inherited from CPython is HPy_CallObject. What makes it more "related" to objects than HPy_Call?)
Sorry for the long and diverse email!
On Wed, Dec 23, 2020 at 3:36 AM Simon Cross <hodgestar@gmail.com> wrote:

On Wed, Dec 23, 2020 at 6:24 PM Paul Prescod <paul@prescod.net> wrote:
Yes I am implementing a context, where the host is a WASM container and the extension is compiled into a WASM module.
Could you perhaps share a link to the code you have so far? I don't promise to read all of it, but it might help avoid me trying to guess the details of what you are building.
My view is that we are actually trying to avoid such fine-grained typing at the C-level. Handles are intended to be opaque and building a C analogue of Python ABCs on top of them seems to go against that.
At the C level they are more similar -- it is only how they are called that is different, not how they are defined.
I'm confused though: the link you sent me to is in the "universal" directory. Is it really CPython- specific?
This confused me too initially! What is in hpy/universal is really the *CPython* implementation of the universal ABI. PyPy and GraalPython (and maybe now you?) have their own implementations.
I have a plan to reorganize the package structure to make this confusion go away, but it's on the todo list behind some other pieces of work.
Part of what's confusing me is that CPython's implementation of PyObject_Call revolves around the tp->call pointer and I don't see much or any reference to that in HPy. It's unclear to me how the HPy_Call here:
...
HPy_Call and ctx_CallRealFunctionFromTrampoline are really completely separate things -- which is probably why trying to put them together is confusing.
HPy_Call is a means to call a function passed as an HPy handle. Typically it will be used in the code of a C extension.
ctx_CallRealFunctionFromTrampoline is part of how the Python interpreter calls methods or functions implemented in a C extension using HPy. Typically it is only called behind the scenes and C extension writers will not use it directly (although since it's part of the ctx, they could). It's needed because the HPy functions, e.g., f_impl, don't have the same signatures as methods or functions implemented using the old C API.
(another terminological weirdness you inherited from CPython is HPy_CallObject. What makes it more "related" to objects than HPy_Call?)
I have no idea why the CPython API method is called PyObject_CallObject. :)
In general in HPy we are constantly faced with the question of whether we should make some part of the old C API better. We are trying to make only changes that are really necessary, and to keep many of the "irrelevant" warts in order to make the lives of people who port their extensions to HPy easier.
Sorry for the long and diverse email!
Not a problem at all!
Yours sincerely, Simon Cross

Happy holidays Hpy folks!
My work is very messy so far, but its here:
https://github.com/hpyproject/hpy/compare/master...prescod:feature/wasm-prot...
Let me try and summarize it though.
HPy Universal modules are DLLs which are compatible with any Python version that implements HPy.
But Imagine that there were a pure-Python wrapper for HPy built on ctypes. So if you implement CTypes, you would get HPy "for free".
Now go a step farther: we implement the binaries in WASM. So CPU architecture and operating system ALSO does not matter. Any binary can work in any Python implementation with access to a WASM runtime. Also, you only a small subset of ctypes' capabilities (perhaps just "struct"?).
That's what I'm trying to prototype.
So for example, to support the reflection of types from the extension into the host environment, I define this Ctypes structure:
class HPyType_Spec(Structure): _fields_ = [ ("name", c_int32), ("basicsize", c_int32), ("itemsize", c_int32), ("flags", c_uint32), ("legacy_slots", c_void_p), ("defines", c_void_p), ]
And then I write code like this
def PyUUType_FromSpec(runtime_context, ctx: int, spec: voidptr, params: voidptr) -> int: sizeof_spec = ctypes.sizeof(HPyType_Spec) struct_view = runtime_context.Ptr(spec).deref_to_view()[0:sizeof_spec] struct_obj = HPyType_Spec.from_buffer_copy(struct_view) nameptr = runtime_context.Ptr(struct_obj.name) name = nameptr.deref_to_str()
return runtime_context.new_handle(type(name, (), {}))
So this creates a new type in the host which reflects the name of the PyType that was defined in C. (now I'm wondering if I can make CFFI work for me on this context).
The next step is to make the methods on this type callable. That's where I'm not sure whether I should try to reuse the trampoline code you pointed me at or just start reimplementing it in Python.
I'm not asking that the handle "types" express the internal structure of the objects but rather the operations that can reasonably be done on them. Just like Python type annotations in pure-Python code. If an object legitimately MUST be a sequence or MUST be a tuple then you need to express that *somehow*, whether it be documentation or type signatures. And just as in Python we are tending towards type signatures instead of documentation, I think that's true here too.
Yeah, so my question is whether I need to implement ctx_CallRealFunctionFromTrampoline from scratch or is the implementation intended to be reused? Would PyPy or Jython 3 use the one in the HPy repo or would they have to implement their own?
Or could there be some function that simplifies this for interpreter creators? It seems like there are more than 20 calling conventions in the HPyFunc_Signature enum.
Couldn't a helper function implement all of the method argument "parsing" and call convention stuff for all host interpreters?
In the meantime I suppose I'll just start implementing my own arg call trampoline thingee.
I run into a lot of build problems whenever I include universal/src/ctx_meth.c so that has made it hard for me to experiment with it. I don't build with setup.py so I guess I'm missing something.
$ /usr/local/Cellar/llvm@9/9.0.1_2/bin/clang -I ./hpy/devel/include/ -I /usr/local/Cellar/python@3.8/3.8.6_2/Frameworks/Python.framework/Versions/3.8/include/python3.8/ hpy/universal/src/ctx_meth.c
In file included from hpy/universal/src/ctx_meth.c:2:
In file included from hpy/universal/src/ctx_meth.h:2:
hpy/universal/src/api.h:6:9: warning: 'HPyAPI_STORAGE' macro redefined [-Wmacro-redefined]
#define HPyAPI_STORAGE _HPy_HIDDEN
        ^
./hpy/devel/include/cpython/hpy.h:17:9: note: previous definition is here
#define HPyAPI_STORAGE __attribute__((unused)) static inline
        ^
In file included from hpy/universal/src/ctx_meth.c:3:
hpy/universal/src/handles.h:90:5: error: expected ')'
HPy _py2h(PyObject *);

I seem to be making headway like this:
def call(self, objself, args, kwargs):
    if objself is None:
        objself = self.runtime.new_handle(None)
    if objself:
        assert self.runtime.has_handle(objself)  # should be a handle
    if self.signature == HPyFunc_Signature.HPyFunc_NOARGS.value:
        assert not args and not kwargs
        res = self.runtime.instance.exports.PyUU_Call_HPyFunc_NOARGS(
            self.extension_context, self.impl, objself
        )
        return self.runtime.resolve_handle(res)
    elif self.signature == HPyFunc_Signature.HPyFunc_O.value:
        assert not kwargs
        assert len(args) == 1
        arg = self.runtime.new_handle(args[0])
        res = self.runtime.instance.exports.PyUU_Call_HPyFunc_O(
            self.extension_context, self.impl, objself, arg
        )
        return self.runtime.resolve_handle(res)
    elif ...:  # these will be harder, but doable
        ...
On Fri, Dec 25, 2020 at 8:39 PM Paul Prescod <paul@prescod.net> wrote:

Hello Paul, sorry for the late response. Maybe you have already solved your problems in the meantime, but let me try to sort out some of the confusions. Generally speaking, it is probably a good idea to look at the PyPy implementation of HPy, which is way cleaner than the CPython one.
Calling HPyMeth: as you discovered, HPyMeths have an ->impl, which is a function pointer to a generic C function, and a ->signature, which describes the signature of impl. So yes, if you want to call it you need to handle all the cases. However, at the moment the signature of an HPyMeth is supposed to be only _VARARGS, _KEYWORDS, _NOARGS or _O. If you try to build an HPyMeth with a different signature, you will probably get assertion errors or abort()s during the execution of your program.
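The four cases can be sketched in pure Python along these lines. The names here are illustrative only (the real enum is HPyFunc_Signature and impl is a C function pointer, not a Python callable); the point is just that the host dispatches on the declared signature because impl has no single uniform prototype:

```python
from enum import Enum

# Hypothetical mirror of the four signature kinds mentioned above;
# the real HPyFunc_Signature enum defines many more values.
class Sig(Enum):
    NOARGS = 1
    O = 2
    VARARGS = 3
    KEYWORDS = 4

def call_impl(sig, impl, ctx, self_h, args=(), kwargs=None):
    """Dispatch on the declared signature, as a host must when it
    cannot call ->impl through one uniform calling convention."""
    if sig is Sig.NOARGS:
        assert not args and not kwargs
        return impl(ctx, self_h)
    if sig is Sig.O:
        assert len(args) == 1 and not kwargs
        return impl(ctx, self_h, args[0])
    if sig is Sig.VARARGS:
        assert not kwargs
        return impl(ctx, self_h, list(args), len(args))
    if sig is Sig.KEYWORDS:
        return impl(ctx, self_h, list(args), len(args), kwargs or {})
    raise NotImplementedError(sig)
```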
ctx_CallRealFunctionFromTrampoline: forget about this, you should not call it at all. In theory, an HPy-compliant python implementation should be able to call HPyMeth directly, but in practice CPython obviously can't. So, in order to be able to implement hpy.universal for CPython, we generate CPython-only trampolines when we compile the extension (trampolines which are and should be completely ignored by all the other implementations); CallRealFunctionFromTrampoline is used in the body of these autogenerated trampolines and it is implemented only by the CPython version of hpy.universal.
ciao, Anto
On Wed, Dec 23, 2020 at 1:30 AM Paul Prescod <paul@prescod.net> wrote:

On Mon, Dec 28, 2020 at 10:04 AM Antonio Cuni <anto.cuni@gmail.com> wrote:
Thanks Antonio. Where can I find the PyPy implementation of HPy? Maybe I can reuse parts of it.
Calling HPyMeth: as you discovered, HPyMeths have a ->impl, which is a
Okay great, I did implement those 4 signatures.
Paul Prescod

Okay, I found the hpy/pypy branch. A lot of my design and its design had already converged.
@functions.add()
def HPyUnicode_FromString(runtime_context, ctx: int, utf8: int) -> handle:
    data = runtime_context.decode(utf8)
    return runtime_context.new_handle(data)
Versus:
@API.func("HPy HPyUnicode_FromString(HPyContext ctx, const char *utf8)")
def HPyUnicode_FromString(space, ctx, utf8):
    w_obj = _maybe_utf8_to_w(space, utf8)
    return handles.new(space, w_obj)
But on the other hand, I (for example) create new types like this:
newtype = type(type_struct.name_as_str, bases, defines)

And you create them like this:

w_type = W_HPyTypeObject(
    space, name, bases_w or [space.w_object], dict_w, basicsize)
On Mon, Dec 28, 2020 at 8:05 PM Paul Prescod <paul@prescod.net> wrote:

In PyPy, if I understand correctly, the interpreted RPython "runtime" is injecting things into the JIT-based "Python runtime" which is separate and called "space".
In my thing, the HPy host code runs in the same memory management "space" as the full Python interpreter, and it's the extension which is in a different memory management "space".
So that's why your HPyLong_AsLong looks like this:

@API.func("long HPyLong_AsLong(HPyContext ctx, HPy h)",
          error_value=API.cast("long", -1))
def HPyLong_AsLong(space, ctx, h):
    w_long = handles.deref(space, h)
    return space.int_w(space.int(w_long))
and mine looks like this
@functions.add()
def HPyLong_AsLong(runtime_context, ctx: int, num_handle: handle) -> int:
    number = runtime_context.resolve_handle(num_handle)
    return number
I'm just dereferencing a Python number object. You're extracting it from a different memory context.
I could easily add a "space" abstraction to my system to get closer to sharing code but where it gets hairy is lltype, rffi, rgc.
It would be cool if any Python implementation could share SOME of the host-side semantics but the PyPy code may have too many PyPy quirks like rffi and rgc baked in.
It's an interesting thought experiment though.
What if GraalPython, IronPython, and HPy-Wasm could implement "space", "ffi" and "gc" interfaces and get functions like HPyType_FromSpec for free from the HPy project?
Is it a crazy idea? Could it work?
I have an equivalent for e.g. lltype.malloc, lltype.nullptr, etc.
But the lltype interface is hard-coded as being imported from rpython.rtyper.lltypesystem.
With some form of dependency injection, I could provide my own. And GraalPython could supply ITS own, and so forth. And tons of code could be reused, IMO.
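A hypothetical sketch of what such injectable interfaces might look like (none of these class or function names exist in HPy today; this is purely the thought experiment above, assuming a minimal "space" interface each host would implement):

```python
# Generic HPy-side logic depends only on an abstract Space interface;
# each host (PyPy, GraalPython, a WASM loader, ...) injects its own.
class Space:
    def new_type(self, name, bases, ns):
        raise NotImplementedError

class PurePythonSpace(Space):
    """A host backed directly by the running Python interpreter."""
    def new_type(self, name, bases, ns):
        return type(name, bases or (object,), ns)

def generic_type_fromspec(space, name, bases=(), ns=None):
    # The shared, host-independent part of an HPyType_FromSpec:
    # decode the spec (elided here), then ask the injected space
    # to materialize the type.
    return space.new_type(name, tuple(bases), dict(ns or {}))
```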
On Mon, Dec 28, 2020 at 10:01 PM Paul Prescod <paul@prescod.net> wrote:

Hello Paul,
On Tue, Dec 29, 2020 at 7:31 AM Paul Prescod <paul@prescod.net> wrote:
not really. Forget about the JIT for now, it's not important for this discussion.
PyPy is a Python interpreter written in RPython. RPython itself is a low-level, statically typed language which is translated to C and then compiled by gcc into an executable. JIT apart, PyPy's interpreter is fairly standard: it has a bytecode compiler, a bytecode eval loop, etc.

In PyPy jargon, we call the code written in RPython "interp-level code", and the code written in "real" Python "app-level code". The app-level code is interpreted by the interpreter written at interp-level (that's where the names come from). (Note: the interp-level code can be compiled to C or be executed directly by a host Python interpreter, thus having a double interpretation. This is very useful for tests, but for the sake of this discussion you can forget about it and think of RPython as a normal compiled language.)

So, from this point of view, RPython : PyPy = C : CPython = Java : GraalPython = Rust : RustPython. The code inside pypy/module/_hpy_universal/ is interp-level RPython.
In my thing, the HPy host code runs in the same memory management "space"
as the full Python interpreter and its the extension which is in a different memory management "space".
You can think of the "space" as a namespace (it's not exactly this, but it's a good enough approximation). So, space.add is completely equivalent to CPython's PyObject_Add, space.sub to PyObject_Subtract, and so on.
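A toy illustration of this correspondence (not PyPy's actual code; the method bodies are deliberately trivial, since in a pure-Python sketch the wrapped objects are just Python objects):

```python
# Minimal "object space" sketch: every operation on app-level objects
# goes through a space method, the interp-level analogue of CPython's
# PyObject_* C API.
class MiniSpace:
    def add(self, w_a, w_b):   # like PyObject_Add
        return w_a + w_b
    def sub(self, w_a, w_b):   # like PyObject_Subtract
        return w_a - w_b
    def int_w(self, w_obj):    # unwrap a wrapped int to a native int
        return int(w_obj)
```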
The code looks similar superficially because this is a very simple function, but as you noticed it can quickly become more complicated. This happens because the code in _hpy_universal is very tied to the PyPy's Python interpreter (it's an extension module written specifically for it, after all), and also because it's meant to be compiled at the level of C, that's why you see things like lltype.malloc.
What you do is completely different: if I understand correctly (see below), you are trying to implement an HPy universal importer in pure Python, at app-level. Your thing runs on any Python interpreter; PyPy's code has no chance of doing that.
Personally, I don't see any advantage in doing that. You will end up sharing only very simple and uninteresting stuff, and have to diverge for the more important ones.
Let me also answer here to your other email "WASM <--> HPy", to avoid spreading the conversation over too many threads. If I understand it correctly, you are writing a pure-python module which allows you to import HPy WASM-compiled extensions into any python interpreter. The concept itself is very cool and if written well could be a good reference implementation for HPy's semantics. Apart from that, I fail to understand if/how/why it is useful in the real world: implementing it in pure python introduces a lot of overhead, so extensions will be horribly slow.
If the goal is to import HPy extensions in the browser, then you must already have a Python interpreter running in the browser, and this Python interpreter should provide its own low-level implementation of HPy instead of using this pure-python backend. If the goal is to demonstrate that this is doable, then I agree it's very cool :)
A few other random comments:
- I see that you wrote it by modifying the HPy repo. This should not be necessary, and such a project should live in a separate repo, IMHO.
- I see that you modified/added some files in hpy/devel/include. This is "wrong": hpy/devel contains the .h files which are necessary to compile universal extensions, and if you need to provide a special header in order to compile for WASM, then they are no longer really universal. I admit I didn't look in detail though, so I might be wrong.
- You should use cffi instead of ctypes. I can't think of any good reason to prefer ctypes over cffi apart from the fact that it is in the stdlib, but that's not a good enough reason to use it :)