[Tutor] any cons to using a module of functions?

Peter Otten __peter__ at web.de
Fri Feb 3 09:56:14 CET 2012


Che M wrote:

> 
> I have a bunch of functions that do various utility-type tasks in an
> application (such as prettifying date strings, etc.), and they are used in
> many modules.  Much of the time, I have just been lazily copying and
> pasting the functions into whichever modules need them.  I realize that is
> very bad form and I should refactor, and so I am beginning to put these
> functions in their own module so that I can import the module and its
> functions when I need them; they will all be in one and only one
> place.
> 
> My question is about resources.  Let's say I have the module, myUtils.py,
> and I import it into every other module that will need one or more of the
> functions within it.  Is this in any way costly in terms of memory? 
> (since each time I import myUtils.py I import *all* the functions, unlike
> the cut&paste approach, where I just use the functions I need).

I hope by "importing all functions" you mean

import myutils

or

from myutils import foo, bar

The oh-so-convenient

from myutils import *

will sooner or later result in nasty name clashes.
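
A contrived sketch of how that can bite (the module contents here are
invented for illustration):

# myutils.py -- happens to define its own format()
def format(value):
    return "<%s>" % value

# script.py
from myutils import *             # silently pulls format() into this namespace
print(format(1234.5678, ".2f"))   # TypeError: myutils.format shadows the builtin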

> I'm fairly sure this is not at all an issue, but I just want to understand
> why it's not.

After entering the interactive interpreter (Python 2.7) I see

>>> import sys
>>> len(sys.modules)
39
>>> len(sys.builtin_module_names)
20

So there are already forty or sixty modules, depending on how you count; the 
memory and runtime impact of adding one more is likely negligible.
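
Also note that a module is loaded only once per process: the first import 
executes myutils.py and caches the resulting module object in sys.modules, 
and every later import (from any other module) just hands back that cached 
object. A quick check in the interpreter, using os as a stand-in:

>>> import os, sys
>>> sys.modules["os"] is os
True
>>> import os as again      # second import: no re-execution, same object
>>> again is os
True

So importing myutils from ten modules costs essentially the same as 
importing it from one.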

There is an effect on your process, though. If you hammer out a quick and 
dirty script that uses your kitchen-sink myutils.py, then forget about the 
script and need it again a year later, it's likely that myutils has evolved 
in the meantime and your script will no longer work out of the box.
At that point it pays off to have unit tests in place that ensure a stable 
API, and to keep everything under version control.
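
For instance, a minimal test module kept next to myutils.py (the function 
name and expected output below are made up for the sake of the example):

# test_myutils.py -- run with: python -m unittest test_myutils
import unittest
import myutils

class TestPrettyDate(unittest.TestCase):
    def test_iso_date_is_prettified(self):
        # assumes a hypothetical myutils.pretty_date()
        self.assertEqual(myutils.pretty_date("2012-02-03"), "Feb 3, 2012")

if __name__ == "__main__":
    unittest.main()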


