walking a directory with very many files
BjornSteinarFjeldPettersen at gmail.com
Tue Jun 16 19:39:44 CEST 2009
On Jun 15, 6:56 am, Steven D'Aprano
<ste... at REMOVE.THIS.cybersource.com.au> wrote:
> On Sun, 14 Jun 2009 22:35:50 +0200, Andre Engels wrote:
> > On Sun, Jun 14, 2009 at 6:35 PM, tom<f... at thefsb.org> wrote:
> >> i can traverse a directory using os.listdir() or os.walk(), but if a
> >> directory has a very large number of files, these methods produce very
> >> large objects taking a lot of memory.
> >> in other languages one can avoid generating such an object by walking a
> >> directory as a linked list. for example, in c, perl or php one can use
> >> opendir() and then repeatedly readdir() until getting to the end of the
> >> file list. it seems this could be more efficient in some applications.
> >> is there a way to do this in python? i'm relatively new to the
> >> language. i looked through the documentation and tried googling but
> >> came up empty.
> > What kind of directories are those that just a list of files would
> > result in a "very large" object? I don't think I have ever seen
> > directories with more than a few thousand files...
> You haven't looked very hard :)
> $ pwd
> $ ls | wc -l
> And I periodically delete thumbnails, to prevent the number of files
> growing to hundreds of thousands.
Not proud of this, but...:
[django] www4:~/datakortet/media$ ls bfpbilder|wc -l
all .jpg files between 40 and 250KB with the path stored in a database
Oddly enough, I'm relieved that others have had similar folder sizes
(I've been waiting for this to burst to the top of my list for a while.)