
Cesare Di Mauro wrote:
2010/10/22 M.-A. Lemburg <mal@egenix.com>
Cesare Di Mauro wrote:
I think that having more than 255 arguments in a function call is a very rare case, and a workaround (perhaps passing a tuple/list or a dictionary) seems a better solution than introducing a brand-new opcode to handle it.
It's certainly rare when writing applications by hand, but such limits can be reached with code generators wrapping external resources such as database query rows, spreadsheet rows, sensor data input, etc.
We've had such a limit before (number of lines in a module) and that was raised for the same reason.
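For reference, the tuple/dict workaround mentioned above might look like this (a minimal sketch; `handle_row` and the column names are invented for illustration):

```python
# Instead of emitting handle_row(col0=..., col1=..., ...) with one
# keyword argument per column, a generator can pass a single mapping.
def handle_row(fields):
    # fields is one dict, so the call site uses exactly one argument
    return len(fields)

row = {'col{}'.format(i): i for i in range(500)}
print(handle_row(row))  # one argument at the call site, 500 fields inside
```

This avoids the per-call argument limit, at the cost of losing the automatic argument checking that Marc-Andre mentions further down.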
Changing the current opcode(s) is a very bad idea, since it would slow down the common cases.
I'm sure there are ways to avoid that, e.g. by using EXTENDED_ARG for such cases.
-- Marc-Andre Lemburg eGenix.com
I've patched Python 3.2 alpha 3 with a rough solution that uses EXTENDED_ARG for the CALL_FUNCTION* opcodes, raising the argument and keyword limits to a maximum of 65535 each. I hope that will be enough. :)
Sure, we don't have to raise it to 2**64 :-) Looks like a pretty simple fix, indeed. I wish we could get rid of all the byte shifting and division tricks used in the byte compiler - I'm pretty sure that such operations are rather slow nowadays compared to working with 16-bit or 32-bit integers, and that we could drop the notion of taking the word "byte" in byte code literally.
In ast.c:
ast_for_arguments:

    if (nposargs > 65535 || nkwonlyargs > 65535) {
        ast_error(n, "more than 65535 arguments");
        return NULL;
    }
ast_for_call:

    if (nargs + ngens > 65535 || nkeywords > 65535) {
        ast_error(n, "more than 65535 arguments");
        return NULL;
    }
In compile.c:
opcode_stack_effect:

    #define NARGS(o) (((o) & 0xff) + ((o) >> 8 & 0xff00) + \
                      2*(((o) >> 8 & 0xff) + ((o) >> 16 & 0xff00)))
        case CALL_FUNCTION:
            return -NARGS(oparg);
        case CALL_FUNCTION_VAR:
        case CALL_FUNCTION_KW:
            return -NARGS(oparg)-1;
        case CALL_FUNCTION_VAR_KW:
            return -NARGS(oparg)-2;
    #undef NARGS
    #define NARGS(o) (((o) % 256) + 2*(((o) / 256) % 256))
        case MAKE_FUNCTION:
            return -NARGS(oparg) - ((oparg >> 16) & 0xffff);
        case MAKE_CLOSURE:
            return -1 - NARGS(oparg) - ((oparg >> 16) & 0xffff);
    #undef NARGS
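Transliterating the first NARGS macro into Python shows the intended stack accounting: each positional argument occupies one stack slot and each keyword argument occupies two (name and value). For a 500-positional/500-keyword call, CALL_FUNCTION must therefore pop 1500 items (a quick check, not part of the patch):

```python
def nargs(o):
    # Python transliteration of the C macro: the two low bytes hold the
    # 8-bit counts, the EXTENDED_ARG bytes supply the high parts
    return ((o & 0xff) + (o >> 8 & 0xff00)
            + 2 * ((o >> 8 & 0xff) + (o >> 16 & 0xff00)))

oparg = 0x0101F4F4  # encodes 500 positional + 500 keyword arguments
print(nargs(oparg))  # 1500 stack slots: 500 + 2 * 500
```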
compiler_call_helper:

    int len;
    int code = 0;

    len = asdl_seq_LEN(args) + n;
    n = len & 0xff | (len & 0xff00) << 8;
    VISIT_SEQ(c, expr, args);
    if (keywords) {
        VISIT_SEQ(c, keyword, keywords);
        len = asdl_seq_LEN(keywords);
        n |= (len & 0xff | (len & 0xff00) << 8) << 8;
    }
In ceval.c:
PyEval_EvalFrameEx:

    TARGET_WITH_IMPL(CALL_FUNCTION_VAR, _call_function_var_kw)
    TARGET_WITH_IMPL(CALL_FUNCTION_KW, _call_function_var_kw)
    TARGET(CALL_FUNCTION_VAR_KW)
    _call_function_var_kw:
    {
        int na = oparg & 0xff | oparg >> 8 & 0xff00;
        int nk = (oparg & 0xff00 | oparg >> 8 & 0xff0000) >> 8;
        ...
call_function:

    int na = oparg & 0xff | oparg >> 8 & 0xff00;
    int nk = (oparg & 0xff00 | oparg >> 8 & 0xff0000) >> 8;
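Putting the compiler-side packing and the ceval-side unpacking next to each other (transliterated to Python, not part of the patch) shows the round trip: the low byte of each count stays in the classic 16-bit oparg, and the high bytes land in the EXTENDED_ARG half:

```python
def pack(nargs, nkwargs):
    # compiler_call_helper's packing, transliterated from the C above
    n = nargs & 0xff | (nargs & 0xff00) << 8
    n |= (nkwargs & 0xff | (nkwargs & 0xff00) << 8) << 8
    return n

def unpack(oparg):
    # call_function's unpacking, transliterated from the C above
    na = oparg & 0xff | oparg >> 8 & 0xff00
    nk = (oparg & 0xff00 | oparg >> 8 & 0xff0000) >> 8
    return na, nk

# round-trips for small, boundary, and maximum counts
for na, nk in [(3, 0), (255, 255), (500, 500), (65535, 65535)]:
    assert unpack(pack(na, nk)) == (na, nk)
```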
A quick example:
s = '''def f(*Args, **Keywords):
    print('Got', len(Args), 'arguments and', len(Keywords), 'keywords')

def g():
    f(''' + ', '.join(str(i) for i in range(500)) + ', ' + \
        ', '.join('k{} = {}'.format(i, i) for i in range(500)) + ''')

g()
'''

c = compile(s, '<string>', 'exec')
eval(c)

from dis import dis
dis(g)
The output is:
Got 500 arguments and 500 keywords
  5           0 LOAD_GLOBAL              0 (f)
              3 LOAD_CONST               1 (0)
              6 LOAD_CONST               2 (1)
[...]
           1497 LOAD_CONST             499 (498)
           1500 LOAD_CONST             500 (499)
           1503 LOAD_CONST             501 ('k0')
           1506 LOAD_CONST               1 (0)
           1509 LOAD_CONST             502 ('k1')
           1512 LOAD_CONST               2 (1)
[...]
           4491 LOAD_CONST             999 ('k498')
           4494 LOAD_CONST             499 (498)
           4497 LOAD_CONST            1000 ('k499')
           4500 LOAD_CONST             500 (499)
           4503 EXTENDED_ARG           257
           4506 CALL_FUNCTION     16905460
           4509 POP_TOP
           4510 LOAD_CONST               0 (None)
           4513 RETURN_VALUE
The dis module seems to have some problems displaying the correct extended value, but I have no time right now to check and fix it.
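For what it's worth, the numbers above do decode consistently: EXTENDED_ARG 257 (0x0101) supplies the high 16 bits, and dis prints the combined oparg for CALL_FUNCTION rather than just its low 16 bits (a quick check, transliterating the ceval.c unpacking):

```python
# Combine the EXTENDED_ARG half with CALL_FUNCTION's low 16 bits (0xF4F4)
oparg = (257 << 16) | 0xF4F4
print(oparg)                        # 16905460, the value dis shows

# Same bit-twiddling as the patched call_function in ceval.c
na = oparg & 0xff | oparg >> 8 & 0xff00
nk = (oparg & 0xff00 | oparg >> 8 & 0xff0000) >> 8
print(na, nk)                       # 500 500, matching the example
```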
Anyway, I'm still unconvinced of the need to raise the function def/call limits.
It may seem strange to have functions, methods or object constructors with more than 255 parameters, but as I said: when using code generators, the generators don't care whether they emit 100 or 300 parameters, even if just 10 of those parameters are actually used later on. However, the user will care a lot if the generators fail due to such limits and thus become unusable.

As an example, take a database query method that exposes 3-4 parameters for each query field. In the more complex database schemas you find in e.g. data warehouse applications, it is not uncommon to have 100+ query fields or columns in a data table. With the current limit on function/call argument counts, such a model cannot be mapped directly to Python; instead, you'd have to turn to solutions based on other data structures that are not automatically checked by Python when calling methods/functions.

-- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Oct 22 2010)
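To make the code-generator scenario concrete, here is a minimal, hypothetical sketch: it emits a query function with one keyword parameter per column, so a wide schema directly produces a wide signature (all names invented for illustration):

```python
# Hypothetical generator: one keyword parameter per database column.
columns = ['col{}'.format(i) for i in range(300)]
src = 'def query({}):\n    return locals()\n'.format(
    ', '.join('{}=None'.format(c) for c in columns))

namespace = {}
exec(src, namespace)   # compiling this is where an argument-count limit bites
print(len(namespace['query'](col0=1)))  # 300 parameters in the signature
```

With a 255-argument limit, the `exec` step fails as soon as the schema exceeds 255 columns, even though each individual call site may only pass a handful of keywords.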