Storing a big amount of path names
Paulo da Silva
p_s_d_a_s_i_l_v_a_ns at netcabo.pt
Fri Feb 12 00:49:52 EST 2016
At 05:02 on 12-02-2016, Chris Angelico wrote:
> On Fri, Feb 12, 2016 at 3:45 PM, Paulo da Silva
> <p_s_d_a_s_i_l_v_a_ns at netcabo.pt> wrote:
>> I think a dict, as MRAB suggested, is needed.
>> At the end of the store process I may delete the dict.
> I'm not 100% sure of what's going on here, but my suspicion is that a
> string that isn't being used is allowed to be flushed from the
You are right. I have tried with a small class and it seems to work.
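The small class mentioned above isn't shown in the thread; a minimal sketch of the idea (my assumption of its shape, not the poster's actual code) is a class that routes every stored path through a dict, so equal path strings share one object and the dedup dict can be dropped once storing is done:

```python
class PathStore:
    """Hypothetical sketch: deduplicate path strings via a dict,
    then discard the dict when the store phase ends."""

    def __init__(self):
        self._seen = {}    # maps each path string to its single shared copy
        self.paths = []    # the stored paths, sharing string objects

    def add(self, path):
        # setdefault returns the already-stored copy when one exists,
        # so repeated path names reference one string object.
        self.paths.append(self._seen.setdefault(path, path))

    def finish(self):
        # After storing, the dict is no longer needed; the strings
        # stay alive because self.paths still references them.
        self._seen = None


store = PathStore()
for p in ["/usr/bin/python", "/usr/bin/python", "/home/user/notes.txt"]:
    store.add(p)
store.finish()
assert store.paths[0] is store.paths[1]  # same object, stored once
```

`sys.intern` achieves a similar sharing effect for the lifetime of the interpreter, but a private dict gives explicit control over when the references are released.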
> How many files, roughly? Do you ever look at the contents of the
> files? Most likely, you'll be dwarfing the files' names with their
> contents. Unless you actually have over two million unique files, each
> one with over a thousand characters in the name, you can't use all
> that 2GB with file names.
It's not only the filenames.
The more memory I have, the more memory-expensive but faster an algorithm
I can implement.
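As a rough check of the memory estimate in the quoted text, `sys.getsizeof` reports the per-object footprint of a string in CPython (the exact overhead is implementation-specific), which lets one gauge how much two million file names would actually cost:

```python
import sys

# A 100-character ASCII file name (illustrative size, not from the thread).
name = "a" * 100
per_string = sys.getsizeof(name)  # character data plus CPython object overhead

# Cost of two million such names, in GiB; far below 2 GB unless the
# names are an order of magnitude longer, as Chris Angelico pointed out.
total_gib = 2_000_000 * per_string / 2**30
print(f"{per_string} bytes per name, ~{total_gib:.2f} GiB for 2M names")
```

On CPython, short ASCII names carry a few dozen bytes of fixed overhead each, so the total stays well under the 2 GB figure for typical path lengths.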
Thank you very much for your nice suggestion which also contributed to
my Python knowledge.
Thank you all who responded.