[Python-ideas] Python as meta-language
talin at acm.org
Mon Dec 25 09:25:23 CET 2006
Josiah Carlson wrote:
> Talin <talin at acm.org> wrote:
>> One of my long-standing interests is in mini-languages, particularly
>> declarative languages. I'm always looking out for examples where a
>> declarative language is used to represent some idea or concept that is
>> not easily written in an imperative language. Examples are the behaviors
>> of particle systems, kinematic constraints, formalized grammars, logical
>> inferencing systems, query languages, and so on. In other words, you are
>> using a language to describe a complex set of relationships, but you
>> aren't giving specific commands to execute in a specific order.
> If you haven't already seen it, you should check out Logix:
I think I looked at it before, but I'll have another look. From what I
can tell, there are some interesting ideas here, but there are also some
things that are kind of kludged together.
I'm not sure that I would want Python to be quite so malleable as Logix.
The reason for this is that it's hard to reason about a language when
the language is so changeable. Of course, this argument shouldn't be
taken too strongly - I don't mean to condemn programmable syntax in
general, I just don't want to take it too far.
I think what would make sense is to identify some of the most common use
cases in Logix, and see if those specific use cases fit within the
Python model. The example in the docs shows how to define an 'isa'
operator, and I certainly think that Python readability would be
improved by having something similar. (I've argued this before, so I
don't expect to get much traction on this.) But I wouldn't go so far as
to allow you to define brand new operators of arbitrary precedence.
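For what it's worth, an 'isa'-style operator can be approximated in today's Python without touching the compiler, via the well-known "infix hack": wrap a two-argument function in a helper class that overloads `|`. The `Infix` helper and the `isa` name below are purely illustrative, not a proposal for the stdlib:

```python
class Infix:
    """Wrap a two-argument function so it can be spelled a |op| b."""
    def __init__(self, func):
        self.func = func

    def __ror__(self, left):
        # a |op ... -> partially apply the left operand
        return Infix(lambda right: self.func(left, right))

    def __or__(self, right):
        # ...| b -> supply the right operand and call
        return self.func(right)

# illustrative name; just delegates to the built-in isinstance()
isa = Infix(isinstance)

print(3 |isa| int)     # True
print("x" |isa| int)   # False
```

Note that `3 |isa| int` parses as `(3 | isa) | int`, so the hack rides on the existing precedence of `|` — which is exactly the kind of constraint a real compiler-level operator would not have.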
Similarly, I think that it wouldn't hurt to expand the suite of
operators that are built into the compiler. Like most languages,
Python's compiler supports only those operators which are defined for
built-in types. Thus, we have the math operators +, -, * and /, because
we have built-in numeric types which support those basic operations.
However, mathematics defines many different kinds of operators, many of
which operate on other kinds of entities than just scalars. Examples
include cross-product and dot-product. Most languages don't define
operators for cross-product and dot-product, because they don't define
matrices as a built-in type. This gets inconvenient when you are doing
things like, say, 3D graphics programming, where the types that you are
operating on are vectors and matrices, and writing the code using
operators allows for more concise and readable code than having to do
everything using function calls. (There's an inherent readability
advantage IMHO to being able to take something right out of a math
textbook and type it directly as a line of code.)
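A hedged sketch of that point: with operator overloading, vector math can read close to the textbook form even now. The `Vec3` class below is illustrative, and the choice of `*` for dot product and `^` for cross product is an ad-hoc convention for this example, not an established standard — which is precisely the problem dedicated operators would solve:

```python
class Vec3:
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def __mul__(self, other):
        # dot product: a . b
        return self.x * other.x + self.y * other.y + self.z * other.z

    def __xor__(self, other):
        # cross product: a x b
        return Vec3(self.y * other.z - self.z * other.y,
                    self.z * other.x - self.x * other.z,
                    self.x * other.y - self.y * other.x)

    def __repr__(self):
        return f"Vec3({self.x}, {self.y}, {self.z})"

i, j = Vec3(1, 0, 0), Vec3(0, 1, 0)
print(i * j)   # 0 -> orthogonal unit vectors have zero dot product
print(i ^ j)   # Vec3(0, 0, 1) -> i x j = k
```

Compare `(a ^ b) * c` with `dot(cross(a, b), c)`: the former is a near-verbatim transcription of the triple-product formula, the latter is not.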
Of course, anything that's done with an operator can also be done with a
function call, but from a readability standpoint, there are times when
operators make a lot of sense. After all, do you really want to write
'add(multiply(a, 1), 2)' when you could write 'a * 1 + 2'?
Given the ability to overload operators and define their meaning, why
should we limit the set of operators recognized by the compiler to only
those that make sense on built-in types?
I would say that while it certainly does make sense to give operators a
standard *meaning*, there's no reason why operators have to have a
standard *implementation*. In other words - unlike Logix, I want to be
able to look at a given operator and always know what it means, just as
I can look at the symbol '+' and know that it is somehow related to the
concept of 'addition'. The same would be true for any new operators.
In the case of mathematics and other languages that make heavy use of
symbols, there's the additional problem of rendering those symbols into
some combination of ASCII characters. To see what I mean, have a look at
this chart of math symbols as a starting point:
Some of these symbols would be fairly difficult to represent in ASCII,
others would be pretty easy. For example, the characters '=>' could be
used to represent the logical 'implies' symbol.
A less trivial example would be the "::=" operator that is used in many
formal grammars. I'd call this the "becomes" operator - thus, the
expression "a ::= b" might be spoken as "a becomes b" (or is it the
other way around?). A corresponding __becomes__ function would allow the
operator to be implemented for certain types, although I haven't really
worked out what the calling protocol would be.
Such an operator could be used in more than just parser generators,
however; ideally, it ought to be usable as an overloadable assignment
operator, or any situation where you want to express the concept 'a is
defined as b'.
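Since Python has no '::=' token, one way to prototype the idea today is to reuse an existing augmented-assignment operator as a stand-in. The sketch below hijacks '<<=' (i.e. __ilshift__) to play the role of the hypothetical __becomes__ protocol; the Rule class, the operator choice, and the protocol itself are all assumptions for illustration:

```python
class Rule:
    """A grammar rule whose body is filled in by a 'becomes' operation."""
    def __init__(self, name):
        self.name = name
        self.body = None

    def __ilshift__(self, body):
        # stand-in for the hypothetical __becomes__: record the definition
        # and return self so the augmented assignment rebinds the same rule
        self.body = body
        return self

    def __repr__(self):
        return f"{self.name} ::= {self.body!r}"

expr = Rule("expr")
expr <<= ("term", "+", "term")   # reads as: expr ::= term "+" term
print(expr)                      # expr ::= ('term', '+', 'term')
```

This captures the "a is defined as b" reading, but it also shows the limits of the workaround: '<<=' only fires on an existing name, whereas a true '::=' with its own __becomes__ hook could plausibly introduce the definition itself.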