if you can add in "assembly-level-like" infusion, where javascript snippets can be substituted directly into the resultant javascript output, then you are on to a winner. the pyjamas compiler uses this technique to directly interface with the DOM model of the browser.
I'm not going to comment on the merits of this (I actually think javascript is a better language to write javascript code in), but if this is exactly what pyjamas is doing, why would we want to do it on pypy?
the benefits of having a generic "assembler" mechanism in a compiler should already be clear, just as the "asm { ... }" statement in gcc shows. so i can safely assume that's not what you're asking. which leaves the implication that it should be pypy that provides direct access to web browsers' DOM model, which is most definitely not the case.

pyjamas comprises three parts:

 1) a stand-alone python-to-javascript compiler, which has an "assembler" directive called JS()
 2) an independent library, DOM.py, which mostly comprises JS() assembler directives
 3) an independent ui "widget" library which uses DOM.py to provide a widget set that is startlingly similar to python-qt4 and python-gtk2.

the js tutorial has been removed (so i cannot look at it), but i remember it used mochikit, somehow, to do the same job as DOM.py. that implies that there was somehow a means to call javascript functions "as if from python" which, i can tell you from experience, makes a bit of a mess of the compiler as well as making a meal of the interface between python and javascript.
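to make the JS() "assembler" directive concrete, here is a minimal sketch of what a DOM.py-style wrapper looks like. note the hedging: under the real pyjamas compiler, JS() is a compile-time marker whose string body is pasted verbatim into the generated javascript; the stub definition of JS below is purely an assumption of mine so that the sketch runs under plain CPython, and `get_attribute` is a hypothetical illustration, not the actual DOM.py source.

```python
def JS(code):
    """Stub of the pyjamas compile-time marker.  The real pyjamas
    compiler replaces the body of the enclosing function with the raw
    javascript string given here; at plain-CPython runtime it is inert."""
    return None

def get_attribute(elem, attr):
    # hypothetical DOM.py-style wrapper: the "function body" is raw
    # javascript, spliced directly into the compiler's js output
    JS("""
    return elem.getAttribute(attr);
    """)
```

under cpython the wrapper is a no-op returning None; only the pyjamas compiler gives the string any meaning, which is exactly the "asm { ... }"-style escape hatch being described.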
however i believe that it would be relatively easy to go from this "good enough" / "relaxed" mode to "full python compliance" - the "strict" mode as i call it.
Exactly how? How do you map real python to javascript without killing the performance gain any js interpreter is going to give you?
ahh, i'm _so_ glad you asked :)

one way to explain it is: i'm looking to replace cpython's PyIntObject, in intobject.h:

    typedef struct {
        PyObject_HEAD
        long ob_ival;
    } PyIntObject;

with this:

    typedef struct {
        PyObject_HEAD
        JSObject *ob_ival;
    } PyIntObject;

and in intobject.c, replace this:

    static PyObject *
    int_div(PyIntObject *x, PyIntObject *y)
    {
        long xi, yi;
        long d, m;
        CONVERT_TO_LONG(x, xi);
        CONVERT_TO_LONG(y, yi);
        switch (i_divmod(xi, yi, &d, &m)) {
        case DIVMOD_OK:
            return PyInt_FromLong(d);
        case DIVMOD_OVERFLOW:
            return PyLong_Type.tp_as_number->nb_divide((PyObject *)x,
                                                       (PyObject *)y);
        default:
            return NULL;
        }
    }

with this:

    static PyObject *
    int_div(PyIntObject *x, PyIntObject *y)
    {
        JSObject *d;
        /* the js object associated with x will have a javascript
           function __div__.  get it, and call it. */
        JSFunction *js_div_fn = get_js_function(x->ob_ival, "__div__");
        d = call_js_function(js_div_fn, x->ob_ival, y->ob_ival);
        return PyInt_FromJSObject(d);
    }

the function "int.__div__" will start off as part of a python class that will be compiled to javascript. in this way, i expect to gain the benefit of the aggressive JIT compilation of e.g. Google V8, _and_ keep interoperability with http://python.org's c-based modules (after a recompile, of course).

i haven't quite worked out how the "coerce" function fits into all this. and also someone kindly warned me about garbage collection, which i don't have a handle on, yet.

in this way, all code that is actually python will end up being compiled to aggressively-JIT-optimised assembler. there will be no "checking" of CPython types, and no "conversion" to/from CPython types, because all that CPython types will become is "boxes" around javascript objects.

this is the approach that the unladen-swallow team will have to take in their "Phase 2". they are _going_ to do this. so i figured i might as well get a handle on it in advance, to see if it's possible, and try to "get on the bandwagon", so to speak.
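the delegation pattern in that c sketch can be mirrored in a few lines of plain python, which may make the "box around a javascript object" idea easier to see. everything here is a hypothetical stand-in chosen for illustration: `JSIntObject` plays the role of a JIT-compiled javascript value, `js_div` plays the role of its compiled `__div__`, and `BoxedInt` plays the role of the modified PyIntObject. the point is that the box never inspects the value; it only fetches the js-side operation and calls it.

```python
class JSIntObject:
    """Stand-in for a JIT-compiled javascript integer value."""
    def __init__(self, value):
        self.value = value

    def js_div(self, other):
        # plays the role of the javascript-compiled int.__div__
        return JSIntObject(self.value // other.value)

class BoxedInt:
    """Plays the role of the modified PyIntObject: ob_ival is a JS handle."""
    def __init__(self, ob_ival):
        self.ob_ival = ob_ival          # JSObject *ob_ival in the c sketch

    def __floordiv__(self, other):
        # mirror of the rewritten int_div(): look up the js-side function
        # on the handle, call it, and re-box the result
        d = self.ob_ival.js_div(other.ob_ival)
        return BoxedInt(d)              # PyInt_FromJSObject(d)

# the c level never converts or checks the payload, it only delegates:
result = BoxedInt(JSIntObject(10)) // BoxedInt(JSIntObject(3))
assert result.ob_ival.value == 3
```

note there is no CONVERT_TO_LONG equivalent anywhere: the "checking" and "conversion" steps of the original int_div have disappeared, which is exactly the claim being made above.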
also, i figured that there must be a way that you could leverage this too. there _has_ to be, in a generic way:

    typedef struct {
        PyObject_HEAD
        void *ob_ival;  /* for DSO-loadable modules to override
                           with something */
    } PyIntObject;

l.
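as a postscript, the generic `void *ob_ival` variant can also be sketched in python: the payload becomes opaque, and a loadable backend supplies both the payload and the operations on it. the names `Backend`, `PlainBackend` and `OpaqueInt` are hypothetical, invented here purely to illustrate the "DSO-loadable modules override this" idea; nothing in cpython or pypy uses them.

```python
class Backend:
    """Interface a loadable module would implement to supply the
    opaque ob_ival payload and the operations on it."""
    def make(self, value):
        raise NotImplementedError
    def div(self, a, b):
        raise NotImplementedError

class PlainBackend(Backend):
    """Default backend: the payload is just a machine integer
    (the 'long ob_ival' case)."""
    def make(self, value):
        return value
    def div(self, a, b):
        return a // b

class OpaqueInt:
    # a DSO-loadable module would replace this at load time, e.g. with
    # a javascript-engine backend whose payload is a JSObject handle
    backend = PlainBackend()

    def __init__(self, value):
        self.ob_ival = self.backend.make(value)     # void *ob_ival

    def __floordiv__(self, other):
        return OpaqueInt(self.backend.div(self.ob_ival, other.ob_ival))
```

swapping the class-level `backend` is the python analogue of a loadable module overriding the void pointer: the int type's code never changes, only the payload and the operations behind it do.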