Memory ?

Shagshag13 shagshag13 at yahoo.fr
Mon Jul 8 09:28:00 EDT 2002


I have a data structure like this :

key_0 -> value_0_? -> value_0_? -> ... value_0_m
key_i -> value_i_? -> value_i_?
...
key_n -> value_n_? -> value_n_? -> value_n_? -> value_n_?

where :

- keys are strings (characters),
- values are floats
- n > 2,500,000
- a value "list" can have any length from 1 to m = ~1,000,000 (the values for key_i could be 1 long, while key_i+1 could be 1,000,000)
- i don't know the exact value of n (it depends on the input data)
- i can guess m

for now i use a dict indexed by key, which gives me access to a queue (a list) containing my values. that's really too slow !!!

with your help, i think i should use a Numeric.array for my values list, but i also wonder if i shouldn't use a matrix (2,500,000 *
1,000,000), even though it would be mostly empty. Which one do you think is best (memory & time -- i do some other calculations per
row) ?
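A quick back-of-envelope check (a sketch, assuming 8-byte double-precision floats; 4-byte floats would halve it) suggests the dense matrix is not feasible at these sizes:

```python
# Rough memory estimate for a dense 2,500,000 x 1,000,000 float matrix.
n = 2500000          # number of keys (rows)
m = 1000000          # maximum values per key (columns)
bytes_per_float = 8  # assumes double precision

dense_bytes = n * m * bytes_per_float
print(dense_bytes)                    # 20000000000000 bytes
print(round(dense_bytes / 2**40, 1))  # ~18.2 TiB
```

So even if most cells are empty, allocating the full matrix is out of the question; a ragged per-key structure is the only realistic option.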

> Function Numeric.concatenate, which you mention, also returns a new copy,
> and thus is similarly free from any worry.

my need for concatenate is just to add a new value (one at a time) at the end of my values list (which could start 1 long and possibly
grow to m).
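Since Numeric.concatenate copies the whole array on every call, appending one value at a time that way costs O(length) per append. A hedged sketch of an alternative, using only the standard-library array module (the dict key 'a' and helper add_value are just illustrative names): grow each value list with append, which is amortized O(1), and convert to a Numeric array once at the end if the calculations need it.

```python
from array import array

# key -> growable array of 4-byte floats ('f'); avoids copying
# the whole list on every insertion, unlike repeated concatenate.
data = {}

def add_value(key, value):
    # create the per-key array lazily, then append in amortized O(1)
    if key not in data:
        data[key] = array('f')
    data[key].append(value)

add_value('a', 1.0)
add_value('a', 2.0)
add_value('b', 3.5)
print(len(data['a']), list(data['b']))  # 2 [3.5]
```

The same pattern works with a plain Python list per key; the array module just stores the floats more compactly (4 bytes each instead of a full Python float object).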

thanks,

s13.




