[Python-ideas] New 3.x restriction on number of keyword arguments

Cesare Di Mauro cesare.di.mauro at gmail.com
Sat Oct 23 08:07:48 CEST 2010

2010/10/23 M.-A. Lemburg <mal at egenix.com>

> I wish we could get rid of all the byte shifting and div'ery
> used in the byte compiler - I'm pretty sure that such operations
> are rather slow nowadays compared to working with 16-bit or 32-bit
> integers and dropping the notion of taking the word "byte"
> in byte code literally.

Unfortunately we can't remove such shift & masking operations, even on
non-byte(code) compilers/VMs.

In wpython I handle 16- or 32-bit opcodes (it works on multiples of 16-bit
words), but I have:
- specialized opcodes to call functions and procedures (functions that
discard the result), which handle the most common cases (84-85% on average,
from the stats that I have collected from some projects and the standard
library); I have packed a 4-bit nargs and a 4-bit nkwargs into a single
byte in order to obtain a short (and fast) 16-bit opcode;
- big-endian systems still need to extract and "rotate" the bytes to get
the correct word value(s).
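As a rough illustration of the packing described above (the function names
are hypothetical, not the actual wpython source), the two 4-bit counts can
be combined into one byte and recovered like this:

```python
def pack_counts(na, nk):
    """Pack nargs (na) and nkwargs (nk), each in 0..15, into one byte."""
    assert 0 <= na <= 15 and 0 <= nk <= 15
    return (nk << 4) | na  # high nibble: nkwargs, low nibble: nargs

def unpack_counts(oparg):
    """Recover the two 4-bit counts from the packed byte."""
    return oparg & 0x0F, (oparg >> 4) & 0x0F

# Round-trip: 3 positional args, 2 keyword args -> one byte and back.
na, nk = unpack_counts(pack_counts(3, 2))
```

The nibble order is an assumption here; the point is only that a single
byte suffices for the common call shapes, at the cost of the mask/shift
operations discussed above.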

So, even with word (and longword) representations, these operations are
needed.

The good thing is that they can be handled fairly fast because oparg stays
in one register, and the na and nk variables read (and manipulate) it
independently, so a (common) out-of-order processor can do a good job
scheduling and parallelizing such instructions, leaving only a few final
dependencies (when recombining the partial shift and/or mask results).
Some work can also be done reordering the instructions to improve execution
on in-order processors.
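A minimal sketch of the independent mask/shift decoding described above,
modeled on how CPython (before 3.6) split CALL_FUNCTION's 16-bit oparg
into its two byte-sized counts -- the two extractions share no data
dependency, which is what lets the CPU overlap them:

```python
def decode_call_oparg(oparg):
    """Decode a 16-bit CALL_FUNCTION-style oparg into (na, nk)."""
    na = oparg & 0xFF          # positional argument count (low byte)
    nk = (oparg >> 8) & 0xFF   # keyword argument count (high byte)
    return na, nk

# 0x0203 encodes 3 positional and 2 keyword arguments.
counts = decode_call_oparg(0x0203)
```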

> It may seem strange to have functions, methods or object constructors
> with more than 255 parameters, but as I said: when using code generators,
> the generators don't care whether they use 100 or 300 parameters. Even if
> just 10 parameters are actually used later on. However, the user
> will care a lot if the generators fail due to such limits and then become
> unusable.
> As example, take a database query method that exposes 3-4 parameters
> for each query field. In more complex database schemas that you find
> in e.g. data warehouse applications, it is not uncommon to have
> 100+ query fields or columns in a data table.
> With the current
> limit in function/call argument counts, such a model could not be
> mapped directly to Python. Instead, you'd have to turn to solutions
> based on other data structures that are not automatically checked
> by Python when calling methods/functions.
> --
> Marc-Andre Lemburg
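A hedged sketch of the kind of fallback Marc-Andre alludes to (the names
`ALLOWED_FIELDS` and `query` are illustrative, not from any real library):
instead of generating a function with hundreds of named parameters, the
generator accepts **kwargs and validates field names itself, trading away
Python's automatic argument checking:

```python
# Illustrative schema with 300 query fields.
ALLOWED_FIELDS = {f"field_{i}" for i in range(300)}

def query(**kwargs):
    """Accept any number of query fields, checking names manually."""
    unknown = set(kwargs) - ALLOWED_FIELDS
    if unknown:
        raise TypeError(f"unexpected query fields: {sorted(unknown)}")
    return len(kwargs)  # stand-in for building the actual query

# Calling with 300 keyword arguments via dict unpacking.
n = query(**{f"field_{i}": i for i in range(300)})
```

The cost is exactly what the quoted text notes: typos in field names are
caught by hand-written validation rather than by the interpreter's own
argument machinery.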

I understand the problem, but I don't know if this is the correct solution.
Anyway, now there's at least one solution. :)

