[Python-Dev] Caching directory files in import.c

James C. Ahlstrom jim@interet.com
Fri, 02 Nov 2001 15:22:17 -0500


I have a new version of my zip importing code.  As before,
it reads the file names from zipfiles and records them in
a global dictionary to speed up finding zip imports.
But what about imports from directories?

Looking at the code, I saw that I could do an os.listdir(path)
and record the directory file names in the same dictionary.
Then it would not be necessary to perform a large number of
fopen() calls; the same dictionary lookup would be used instead.
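In Python, the idea might be sketched roughly like this (the names here —
_dir_cache, file_exists_cached, find_module_file — are hypothetical, just to
illustrate the caching scheme, not the actual import.c code):

```python
import os

# Hypothetical sketch: one os.listdir() per directory, cached, so that
# subsequent existence checks are dictionary/set lookups instead of
# repeated fopen() attempts.
_dir_cache = {}  # maps directory path -> set of filenames in it

def file_exists_cached(directory, filename):
    """Return True if filename exists in directory, using a cached listing."""
    try:
        names = _dir_cache[directory]
    except KeyError:
        try:
            names = set(os.listdir(directory))
        except OSError:
            names = set()  # unreadable or missing directory: cache emptiness
        _dir_cache[directory] = names
    return filename in names

def find_module_file(directory, modname, suffixes=(".py", ".pyc")):
    """Try each candidate suffix without touching the disk again."""
    for suffix in suffixes:
        if file_exists_cached(directory, modname + suffix):
            return os.path.join(directory, modname + suffix)
    return None
```

Note that a cache like this goes stale if files are added to the directory
after the first listing, so a real implementation would need some
invalidation strategy.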

Is this a good idea???

It seems it should be faster when a "large" percentage of
files in a directory are imported.  It should be slower
when only one file is imported from a directory with
many names.

I think I remember people discussing this before.  Is
the speedup real, and worth the slight amount of additional
code?

JimA