Max files in unix folder from PIL process

David Pratt fairwinds at eastlink.ca
Tue Mar 29 03:39:05 CEST 2005


Hi Jason.  Many thanks for your reply.  This is good to know about ls - 
what did it do? Was it just slow, or did the server or machine die? My 
images will be going into the path of a web server.  This is 
uncharted territory for me, and I don't know whether there will be 
speed and access problems, or how the filesystem copes with this kind 
of volume.

I am definitely planning to split the images into directories by size, 
and that will at least divide the number by a factor of the various 
sizes (but on the higher end this could still be 150 - 175 
thousand images, which is still a pretty big number).  I don't know if 
this will be a problem, or whether there is really anything to worry 
about at all - but it is better to obtain advice from those who have 
been there and done that, or are at least a bit more familiar with 
pushing limits on Unix resources, than to wonder whether it will work.
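One common way to do the splitting described above, beyond grouping by 
size, is a hash-prefix layout: bucket each file into a short 
subdirectory derived from its name, so no single directory grows huge. 
A minimal sketch follows; the function names, the two-hex-digit bucket 
width, and the example paths are illustrative assumptions, not anything 
from this thread:

```python
import hashlib
import os


def subdir_for(filename, width=2):
    """Map a filename to a short hashed subdirectory name.

    With width=2 there are 256 possible buckets, so even 175,000
    images average under 700 files per directory.  (The two-character
    width is an arbitrary illustrative choice.)
    """
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    return digest[:width]


def hashed_path(root, filename):
    """Return root/<hash-prefix>/<filename> for a given image file."""
    return os.path.join(root, subdir_for(filename), filename)
```

At write time you would create the bucket directory if needed (e.g. 
with os.makedirs) and save the image under hashed_path(); lookups are 
cheap because the bucket is recomputable from the filename alone.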

Regards,
David

On Monday, March 28, 2005, at 07:18 PM, Kane wrote:

> I ran into a similar situation with a massive directory of PIL
> generated images (around 10k).  No problems on the filesystem/Python
> side of things, but other tools (most notably 'ls') don't cope very
> well.  As it happens my data has natural groups, so I broke the big
> dir into subdirs to sidestep the problem.
>
