[Python-ideas] 'Injecting' objects as function-local constants

Terry Reedy tjreedy at udel.edu
Sat Jun 11 22:09:17 CEST 2011


On 6/11/2011 9:30 AM, Jan Kaliszewski wrote:
> == Use cases ==
>
> A quite common practice is 'injecting' objects into a function as its
> locals, at def-time, using function arguments with default values...
>
> Sometimes to keep state using a mutable container:
>
>      def do_and_remember(val, verbose=False, mem=collections.Counter()):
>          result = do_something(val)
>          mem[val] += 1
>          if verbose:
>              print('Done {} times for {!r}'.format(mem[val], val))
>
> Sometimes, when creating functions dynamically (making use of nested
> scopes), e.g. to keep some individual function features (usable within
> those functions):
>
>      def make_my_callbacks(callback_params):
>          my_callbacks = []
>          for params in callback_params:
>              def fun1(*args, _params=params, **kwargs):
>                  "...do something with args and params..."
>              def fun2(*args, _params=params, **kwargs):
>                  "...do something with args and params..."
>              def fun3(*args, _fun1=fun1, _fun2=fun2, **kwargs):
>                  """...do something with args and with functions fun1, fun2,
>                  for example pass them as callbacks to other functions..."""
>              my_callbacks.append((fun1, fun2, fun3))
>          return my_callbacks
>
> Sometimes simply to optimise critical parts of the code...
>
>      def do_it_quickly(fields, _len=len, _split=str.split,
>                        _sth=something):
>          return [(_len(f), _split(f), _sth(f)) for f in fields]
>
> ...or even for readability -- keeping function-specific constants within
> the function definition:
>
>      def check_value(val,
>                      VAL_REGEX=re.compile('^...$'),
>                      VAL_MAX_LEN=38):
>          return len(val) <= VAL_MAX_LEN and VAL_REGEX.search(val) is not None
>
> In all these cases (and probably some others too) this technique appears
> to be quite useful.
>
>
> == The problem ==
>
> ...is that it is not very elegant. We add arguments which:
> a) mess up function signatures (both in the code and in auto-generated docs);
> b) can be accidentally overridden (especially when a function has an "open"
>     signature with **kwargs).

One problem with trying to 'fix' this is that there can be defaulted args
which are not intended to be overwritten by users but which are intended to
be replaced in recursive calls.
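
For example (a hypothetical sketch -- the Node type with .name and .children
is invented here), a tree printer whose 'indent' argument exists only so that
recursive calls can change it:

    def print_tree(node, indent=0):
        # 'indent' is not meant to be supplied by end users, but every
        # recursive call must be able to pass a new value, so it cannot
        # simply be hidden from the signature
        print(' ' * indent + node.name)
        for child in node.children:
            print_tree(child, indent + 2)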

> == Proposed solutions ==
>
> I see three possibilities:
>
> 1.
> To add a new keyword, e.g. `inject':
>      def do_and_remember(val, verbose=False):
>          inject mem = collections.Counter()
>          ...
> or maybe:
>      def do_and_remember(val, verbose=False):
>          inject collections.Counter() as mem

The body should be all run-time; def-time expressions belong in the header.
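
To illustrate the existing split (a minimal sketch): default expressions in
the header are evaluated once, when the def statement executes, while the
body runs again on every call:

    import collections

    def f(x, mem=collections.Counter()):   # evaluated once, at def-time
        seen = set()                       # evaluated anew on every call
        mem[x] += 1
        return mem[x], seen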

> 2. (which personally I would prefer)
> To add `dummy' (or `hidden') keyword arguments, defined after **kwargs
> (and after bare ** if kwargs are not needed; we already have
> keyword-only arguments after *args or bare *):
>
>      def do_and_remember(val, verbose=False, **, mem=collections.Counter()):
>          ...

I thought of this while reading 'the problem'. It is at least plausible 
to me.
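
For comparison, the nearest thing that is already legal is the keyword-only
section after a bare *, which the proposal would extend with a second, hidden
section after **; for example:

    import collections

    # valid today: 'mem' is keyword-only, but still part of the public
    # signature and still overridable by callers
    def do_and_remember(val, verbose=False, *, mem=collections.Counter()):
        ...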

> do_and_remember(val, False, mem='something') would raise TypeError and
> `mem' should not appear in help() etc. as a function argument.
>
> 3.
> To provide a special decorator, e.g. functools.within:
>      @functools.within(mem=collections.Counter())
>      def do_and_remember(val, verbose=False):
>          ...

The decorator would have to modify the code object as well as the
function object, probably in ways not currently allowed.
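
The closest approximation that works today, without touching the code object,
is to rebuild the function with an augmented globals dict -- the injected
names then resolve as globals, not as true local constants, so this is only
a rough sketch of the intent, not the proposed semantics:

    import collections
    import functools
    import types

    def within(**injected):
        """Approximation of the proposed functools.within."""
        def decorator(func):
            # resolve injected names via a copied globals dict; this does
            # not give the speed of real locals, which is part of the point
            namespace = dict(func.__globals__)
            namespace.update(injected)
            new_func = types.FunctionType(
                func.__code__, namespace, func.__name__,
                func.__defaults__, func.__closure__)
            new_func.__kwdefaults__ = func.__kwdefaults__
            return functools.wraps(func)(new_func)
        return decorator

    @within(mem=collections.Counter())
    def do_and_remember(val, verbose=False):
        mem[val] += 1              # 'mem' is found in the rebuilt globals
        if verbose:
            print('Done {} times for {!r}'.format(mem[val], val))
        return mem[val]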

-- 
Terry Jan Reedy



