[pypy-dev] __builtin__ module
bokr at oz.net
Sat Jan 25 00:28:16 CET 2003
At 08:32 2003-01-24 -0500, Scott Fenton wrote:
>On Fri, Jan 24, 2003 at 02:48:42PM +0100, holger krekel wrote:
>> what do you mean by this? "everything" can always be expressed in
>> a python function. It's a matter of time and space so could you
>> be more specific?
>Mostly, stuff that exists as "builtin" syntax, i.e. printf-style formats
>that can be implemented "under the hood" using sprintf, should probably
>be C. "everything" can be expressed in terms of my signature, I just
>wouldn't try it if my life depended on it.
>> Anyway, I would try very hard to express all the builtins in python.
>> And I see Thomas Heller's ctypes approach as a way to make
>> this possible. Before coding anything in C there must be a
>> real *need* to do so.
>I can make chr python either way. It's a question of how "deep" we want
>python to go.
I'm thinking "depth" is a monotonic relationship among nodes along a path
in an acyclic graph, and so far we are talking about two kinds of nodes:
"interpreter level" and "Python level". I am getting the idea that maybe
we should be thinking in terms of meta-levels instead of just two kinds, and
that in general many levels of nodes can coexist concurrently in the tree, all
busily acting as "interpreter level" for their higher-level parents, except for
the root and the leaves. Not sure how useful this is for immediate goals, but I'm
struggling to form a satisfying abstract view of the whole problem ;-)
It's interesting that manipulation of representational elements in one
level implements operations at another level, and manipulation is _defined_
by a pattern of elements that live somewhere too. It's a subtle soup ;-)
BTW, discussion of chr(i) and ord(c) brings up the question of representing
reinterpret_cast at an abstract level. (Whether a C reinterpret_cast correctly
implements chr and ord semantics is a separate question. I just want to talk
about casting a moment, for the light it may shed on type info).
ISTM we need a (python level) type to (meta-) represent untyped bits. I.e., the
8 bits of a char don't carry the type info that makes them char vs uint8.
The hypothetical meta_repr function I mentioned in a previous post does make
use of a Python level object (Bits instance) to represent untyped bits. I.e.,
if c is a character, conceptually: meta_repr(c) => ('str', id(c), Bits(c))
where Bits(c) is a special bit vector at the Python level, but represents the
untyped bits of c at the next meta-level ("interpreter level" here).
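The meta_repr/Bits idea above might be sketched in plain Python like this. Bits and meta_repr are the hypothetical names from this post; representing the bit vector as a string of '0'/'1' characters, and the 8-bit default width, are my own illustrative choices, not anything PyPy actually does:

```python
class Bits:
    """An untyped bit vector, stored here as a string of '0'/'1' characters."""
    def __init__(self, source, width=8):
        if isinstance(source, str) and source and set(source) <= {'0', '1'}:
            self.bits = source                               # already a bit string
        elif isinstance(source, str):
            self.bits = format(ord(source), '0%db' % width)  # bits of a char
        else:
            self.bits = format(source, '0%db' % width)       # bits of a small int

    def __add__(self, other):
        # '+' concatenates bit vectors, as in Bits(0) + Bits(c) below
        return Bits(self.bits + other.bits)

    def __eq__(self, other):
        return isinstance(other, Bits) and self.bits == other.bits

    def __repr__(self):
        return 'Bits(%r)' % self.bits


def meta_repr(obj):
    """Map a Python-level object to a (type-tag, identity, untyped-bits) triple."""
    if isinstance(obj, str) and len(obj) == 1:
        return ('str', id(obj), Bits(obj))
    raise TypeError('only single characters are handled in this sketch')


tag, ident, bits = meta_repr('A')
print(tag, bits)    # prints: str Bits('01000001')
```

The point of the triple is that the Bits instance itself carries no type information; only the 'str' tag alongside it says how those eight bits are to be read.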
An in-place reinterpret_cast from char to uint8 would amount to
('str', id(c), Bits(c)) => ('int', id(c), Bits(c))
(assuming that, in the abstract, 'int' doesn't care about the current number of
bits in its representation). I'm skipping a detail about sign, which is an
interpretation of the msb of Bits(c), so it should probably be written
('str', id(c), Bits(c)) => ('int', id(c), Bits(0)+Bits(c))
where the '+' means bit vectors concatenate. Or you could have a distinct 'uint'
type; that might be necessary for unsigned fixed-width representations.
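A minimal sketch of that in-place cast, using plain bit strings for the Bits part. The triple layout and the prepended zero "sign" bit follow the description above; the function names and the 8-bit char width are illustrative assumptions of mine:

```python
def char_bits(c):
    """The 8 untyped bits of a one-character string."""
    return format(ord(c), '08b')

def reinterpret_char_as_int(triple):
    """In-place reinterpret_cast: keep the identity, retag the same bits,
    prepending a zero msb so they read as a non-negative integer."""
    tag, ident, bits = triple
    assert tag == 'str'
    return ('int', ident, '0' + bits)

c = 'A'
casted = reinterpret_char_as_int(('str', id(c), char_bits(c)))
# Reading the widened bits back as an integer recovers ord(c),
# which is exactly the chr/ord semantics question raised above:
assert int(casted[2], 2) == ord(c)
```

Note that nothing is copied and the id is unchanged; only the type tag (and hence the interpretation of the bits) differs.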
A _conversion_ would virtually allocate existence space by also providing
a new, distinct id and copying the representational Bits (or not copying, if
we can tag the instance as immutable for some uses). Note that you can imagine
corresponding operations at a different level where there is an "actual" space
allocation somewhere in some array (Python level or malloc level ;-), with the
id indicating a location in that array. There, 'int' is perhaps encoded in some
Bits associated with the Bits of the newly allocated int representation, or maybe
implicit in the designation of the space array, if that array is dedicated to a single type.
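One way to picture that lower level: model the space array as a Python list, with the id being simply an index into it. This is purely an illustrative toy of mine, following the conversion-vs-cast distinction above; here the type tag lives in the slot next to the bits rather than being implicit in a per-type array:

```python
heap = []   # the "space array"; each slot holds a (type-tag, bits) pair

def alloc(tag, bits):
    """Allocate a new slot; the index serves as the new distinct id."""
    heap.append((tag, bits))
    return len(heap) - 1

def convert_char_to_int(char_id):
    """A conversion (unlike a cast) gets a fresh id and copies the bits."""
    tag, bits = heap[char_id]
    assert tag == 'str'
    return alloc('int', '0' + bits)    # widened copy at a new location

cid = alloc('str', format(ord('A'), '08b'))
iid = convert_char_to_int(cid)
print(cid != iid, heap[iid])    # prints: True ('int', '001000001')
```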
All this, too, could be represented abstractly at a level before machine language,
so I think a two-meta-level model may be constraining.
If you look at ('str', id(c), Bits(c)) as Python-level code, it is
a Python-level tuple containing a Python-level string, int, and class instance.
The whole thing only has meta-meaning because of how it is interpreted
in a context relating two meta-levels.
I.e., the interpretation itself is expressed at the Python level, but being plain
Python, there is a meta_repr(('str', id(c), Bits(c))) involved in the next
level of what's happening, and so forth, until leaf representations of machine
code are evolved and used, IWT.
Hoping I'm helping factor and clarify concepts rather than tangle and muddy,