Generator vs functools.partial?
research at johnohagan.com
Thu Jun 21 16:49:46 CEST 2012
On 21 Jun 2012 12:19:20 GMT
Steven D'Aprano <steve+comp.lang.python at pearwood.info> wrote:
> On Thu, 21 Jun 2012 21:25:04 +1000, John O'Hagan wrote:
> > Sometimes a function gets called repeatedly with the same expensive
> > argument:
> > def some_func(arg, i):
> >     (do_something with arg and i)
> > same_old_arg = big_calculation()
> Since big_calculation() is only called once, the cost of generating that
> expensive argument is only paid once.
> > for i in lots_of_items:
> > some_func(same_old_arg, i)
> Passing that same_old_arg to some_func is cheap. Once you've paid the
> cost of producing the value in the first place, there is absolutely no
> difference in cost between passing an "expensive" argument and a "cheap"
> one.
> > A simple case like that looks OK, but it can get messy when groups of
> > arguments are passed between functions, especially if the arguments are
> > used by functions called within the functions they are passed to (by
> > other functions!).
> I'm not sure what you're trying to say here. Argument passing is cheap.
> Function call overhead is not quite so cheap, but unless you've carefully
> profiled your code and determined that the cost of function overhead is
> significant, you're better off ignoring it. Likely the actual
> calculation within the function is hundreds or thousands of times more
> expensive than the function call itself.
What I neglected to say was that I'm looking to refactor for clarity and DRY,
to minimise the number of objects that get passed through a chain of functions
before being used, rather than for efficiency. I realise the different examples I
gave (and the more obvious one in Thomas's reply) do more or less the same
thing and that there would probably only be marginal efficiency differences.
I'll try to come up with better examples to show what I mean. Or not: see my
reply to Thomas.
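For what it's worth, here's a minimal sketch of the two approaches from my
subject line, using the placeholder names from my original example (some_func,
big_calculation and same_old_arg stand in for the real code):

```python
from functools import partial

def big_calculation():
    # stands in for the expensive computation, done once
    return "expensive"

def some_func(arg, i):
    # stands in for (do_something with arg and i)
    return (arg, i)

same_old_arg = big_calculation()

# Option 1: bind the expensive argument once with functools.partial,
# so callers no longer have to pass it through the chain:
bound = partial(some_func, same_old_arg)
for i in range(3):
    bound(i)  # equivalent to some_func(same_old_arg, i)

# Option 2: a generator that captures the argument and receives
# each i via send():
def bound_gen(arg):
    result = None
    while True:
        i = yield result
        result = some_func(arg, i)

g = bound_gen(same_old_arg)
next(g)    # prime the generator to the first yield
g.send(7)  # equivalent to some_func(same_old_arg, 7)
```

Both avoid repeating same_old_arg at every call site; the partial version is
the more direct of the two when no state needs to persist between calls.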
> Premature optimization is the root of all evil.