Recursing through a website
DanielK at aracnet.com
Mon Nov 20 19:43:30 CET 2000
This is kinda off-topic for this group, but a btree with only 3-6 elements
per node would indicate that either the size of each element is very large
or the node size is very small. A typical buffer on a computer is at least
1k and many times it is 4k or 8k. A btree is best optimized by ensuring that
the minimum occupancy of each node is 1/2 its capacity, so that when a node
falls below half full, it merges with a sibling along with one key
from its parent.
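To make the underflow rule concrete, here is a minimal sketch (the node layout, `MIN_KEYS` value, and function name are my own illustration, not anyone's actual btree code):

```python
# Sketch of the underflow rule described above: when a node drops below
# half occupancy, it merges with a sibling plus the separating key
# pulled down from the parent.  Nodes are modeled as plain key lists.

MIN_KEYS = 2  # assumed minimum occupancy: half of a 4-key node


def merge_on_underflow(node, sibling, separator_key):
    """Return the merged key list if `node` has underflowed, else None."""
    if len(node) >= MIN_KEYS:
        return None  # still at least half full -- no merge needed
    # merged node = underflowed keys + parent's separator + sibling keys
    return sorted(node + [separator_key] + sibling)


print(merge_on_underflow([7], [12, 15], 10))   # [7, 10, 12, 15]
print(merge_on_underflow([1, 2], [5, 6], 4))   # None (no underflow)
```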
I have coded btrees both recursively and non-recursively (albeit in other
languages) and have noticed a marked performance gain with the latter. Does
recursion in Python have a performance 'cost'?
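One way to get a feel for the question yourself: Python makes a genuine function call per level of recursion, so a recursive walk pays call overhead that a loop avoids. A rough timing sketch (the function names and sizes here are arbitrary):

```python
# Compare a recursive and an iterative version of the same trivial
# computation to expose per-call overhead.  Absolute times will vary;
# the recursive version is typically slower.
import sys
import timeit


def depth_recursive(n):
    return 0 if n == 0 else 1 + depth_recursive(n - 1)


def depth_iterative(n):
    d = 0
    while n:
        d += 1
        n -= 1
    return d


N = 500  # stay well under the recursion limit
assert depth_recursive(N) == depth_iterative(N) == N

print("recursion limit:", sys.getrecursionlimit())
print("recursive:", timeit.timeit(lambda: depth_recursive(N), number=1000))
print("iterative:", timeit.timeit(lambda: depth_iterative(N), number=1000))
```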
"Aahz Maruch" <aahz at panix.com> wrote in message
news:8vbp52$f2u$1 at panix2.panix.com...
> In article <qmdS5.273$OC.69644 at typhoon.aracnet.com>,
> Daniel Klein <DanielK at aracnet.com> wrote:
> >Are you saying that Python does not support recursion to more than 6
> >A Btree of order 100 can store a minimum of 1 trillion entries (and a
> >maximum of 65 trillion). At 7 levels, this jumps to 107 trillion and 132
> >million trillion entries.
> Of course not. I was thinking of a "typical" B-Tree with three to six
> elements per node, which would be around fifteen levels deep for
> "billions". Still way under Python's recursion limit, which I've never
> seen at less than a thousand -- but my point was to emphasize that you
> don't want a Real World [tm] algorithm to come anywhere close to the
> recursion limit. A B-Tree (or even a binary tree, really) fits that bill.
> --- Aahz (Copyright 2000 by aahz at pobox.com)
> Androgynous poly kinky vanilla queer het <*>
> Hugs and backrubs -- I break Rule 6
> "as long as we like the same operating system, things are cool." --