Speed ain't bad
Anders J. Munch
andersjm at inbound.dk
Sat Jan 1 08:20:06 EST 2005
"Bulba!" <bulba at bulba.com> wrote:
>
> One of the posters inspired me to do profiling on my newbie script
> (pasted below). After measurements I have found that the speed
> of Python, at least in the area where my script works, is surprisingly
> high.
Pretty good code for someone who calls himself a newbie.
One line that puzzles me:
> sfile=open(sfpath,'rb')
You never use sfile again.
In any case, you should explicitly close all files that you open. Even
if there's an exception:
    sfile = open(sfpath, 'rb')
    try:
        <stuff to do with the file open>
    finally:
        sfile.close()
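For instance, a little helper that reads the first bytes of a file this
way might look like this (the function name is just for illustration):

```python
def read_first_bytes(path, n=16):
    # Open the file, and guarantee it is closed even if read() raises.
    f = open(path, 'rb')
    try:
        return f.read(n)
    finally:
        f.close()
```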
>
> The only thing I'm missing in this picture is knowledge if my script
> could be further optimised (not that I actually need better
> performance, I'm just curious what possible solutions could be).
>
> Any takers among the experienced guys?
Basically the way to optimise these things is to cut down on anything
that does I/O: use as few calls to os.path.is{dir,file}, os.stat, open
and the like as you can get away with.
One way to do that is caching; e.g. storing names of known directories
in a set (sets.Set()) and checking that set before calling
os.path.isdir. I haven't spotted any obvious opportunities for that
in your script, though.
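Just to sketch what I mean (the helper name is made up, and I'm using
the built-in set here, which behaves like sets.Set() for this purpose):

```python
import os

def ensure_dir(path, _known_dirs=set()):
    # Cache directories we have already seen, so repeated calls for the
    # same path skip the os.path.isdir/os.makedirs I/O entirely.
    if path in _known_dirs:
        return
    if not os.path.isdir(path):
        os.makedirs(path)
    _known_dirs.add(path)
```

The second call for the same directory costs only a set lookup.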
Another way is the strategy of "it's easier to ask forgiveness than to
ask permission".
If you replace:
    if(not os.path.isdir(zfdir)):
        os.makedirs(zfdir)

with:

    try:
        os.makedirs(zfdir)
    except EnvironmentError:
        pass
then not only will your script become a micron more robust, but
assuming zfdir typically does not exist, you will have saved the call
to os.path.isdir.
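If you want to be a bit stricter (this goes beyond what I suggested
above), you can re-raise anything other than "directory already
exists", so that genuine failures such as permission problems are not
silently swallowed:

```python
import errno
import os

def makedirs_eafp(path):
    # EAFP: attempt the creation, and ignore only the EEXIST case.
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise
```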
- Anders