On 3 October 2012 23:30, Benjamin Jessup <bsj@abzinc.com> wrote:
> I have a group of objects identified by unique (x,y) pairs and I want to find out an object's "neighbors" in a matrix of size 2400 x 2400.
> #############
> #obj#   #   #
> #############
> #   #   #obj#   3 x 3 Example
> #############
> #   #   #   #
> #############
> There is either a neighbor, or a null value. I always know the (x,y) pair to check the neighbors of, so is doing
> >> obj = grid[x][y]  # lists; doesn't scale with num of objects
> or
> >> obj = grid.get((x,y), None)  # dictionary; scales with num of objects
> the fastest? I can't seem to find a conclusion by testing each alone, then in the full environment. Is it that, depending on the number of objects, each has an advantage?
>
> I know the fastest way to retrieve them would be to have the objects store pointers to their neighbors, then use those for retrieval. When large numbers of objects are changing their (x,y) pairs, rebuilding the pointers is too slow.

You really are asking the wrong question.

If most of the data is None, then a sparse matrix is best. Otherwise, lists make the most sense.

*However*, what you really want is... a matrix. Look for a sparse matrix type (http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices) if you need sparse; otherwise, NumPy's matrix should do fine.
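
To make the trade-off concrete, here is a minimal sketch of both layouts (the grid size matches your question; the "obj" placeholder and the coordinates are just for illustration):

    import numpy as np

    SIZE = 2400  # grid dimension from the question

    # Dense layout: memory is fixed at SIZE * SIZE slots no matter how
    # few objects exist. np.empty with dtype=object fills with None.
    dense = np.empty((SIZE, SIZE), dtype=object)
    dense[10, 20] = "obj"
    print(dense[10, 20])   # 'obj'
    print(dense[0, 0])     # None

    # Sparse layout: a dict keyed by (x, y) tuples; memory scales with
    # the number of objects, and absent cells come back as None.
    sparse = {(10, 20): "obj"}
    print(sparse.get((10, 20)))  # 'obj'
    print(sparse.get((0, 0)))    # None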

The thing is this:
"I can't seem to find a conclusion by testing each alone, then in the full environment. Is it that, depending on the number of objects, each has an advantage?"

If you can't tell, don't bother. Dictionaries will have a memory advantage when the matrix is sparse; lists win in the other cases. That's the important part, as look-up is fast for both*. If you need faster than these builtins, use NumPy or SciPy.
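
Either container supports the neighbor check with the same loop. A rough sketch using the dict version (the function name and the 8-cell neighborhood are my assumptions, not something from your post):

    def neighbors(grid, x, y):
        """Yield the objects in the 8 cells around (x, y), skipping empties."""
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue  # skip the center cell itself
                obj = grid.get((x + dx, y + dy))
                if obj is not None:
                    yield obj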

If you really need optimization help, profile and ask the *right* question on this list.

* To actually answer the question:
They both have O(1) look-up time, although dictionaries have a theoretical *worst* case of O(N). Being simpler, it's faster to index a list; however, the list-of-lists version indexes twice (once per dimension), so it's probably about even.
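
If you do want numbers, timeit is the standard tool for a micro-benchmark like this. Something along these lines (the probe coordinate is arbitrary; note that the dict version also pays for building a key tuple on every call):

    import timeit

    setup = (
        "SIZE = 2400\n"
        "lst = [[None] * SIZE for _ in range(SIZE)]\n"  # dense list of lists
        "lst[1200][1200] = 'obj'\n"
        "dct = {(1200, 1200): 'obj'}\n"                 # sparse dict
    )
    # timeit runs each statement 1,000,000 times by default.
    print(timeit.timeit("lst[1200][1200]", setup=setup))
    print(timeit.timeit("dct.get((1200, 1200))", setup=setup))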