[Python-Dev] FW: [Python-Help] Python threads with suncc (Forte 6.1) compiler

Tim Peters tim.one@home.com
Sat, 4 Aug 2001 05:01:07 -0400


[Guido]
> Dunno, but I doubt 64-bit Linux would be so dumb as to declare
> FD_SETSIZE to be 2**16.

A quick search on Google shows it's "the normal case" on 64-bit Solaris,
nothing special under Tru64, and probably allowed on some others.

> ...
> instead of testing for MS_WINDOWS, we could test for a preposterous
> value of FD_SETSIZE.  For example, change all three occurrences of
>
>     #ifdef MS_WINDOWS
>
> into
>
>     #if FD_SETSIZE > 1024

Sure.
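
For concreteness, the #if flavor would look something like the sketch
below -- untested, and the pylist spelling is just from memory of
selectmodule.c, so don't trust the details:

    /* sketch only -- not the actual Modules/selectmodule.c text */
    #include "Python.h"
    #include <sys/select.h>            /* FD_SETSIZE, fd_set */

    typedef struct {                   /* mirrors selectmodule.c's pylist */
        int fd;
        PyObject *obj;
    } pylist;

    static int
    demo(void)
    {
    #if FD_SETSIZE > 1024              /* was:  #ifdef MS_WINDOWS */
        /* 2**16 (or worse) entries times 3 tables is way too much stack,
           so eat 3 malloc/free pairs instead. */
        pylist *rfd2obj = PyMem_New(pylist, FD_SETSIZE + 3);
        pylist *wfd2obj = PyMem_New(pylist, FD_SETSIZE + 3);
        pylist *efd2obj = PyMem_New(pylist, FD_SETSIZE + 3);
        if (rfd2obj == NULL || wfd2obj == NULL || efd2obj == NULL) {
            PyMem_Del(rfd2obj); PyMem_Del(wfd2obj); PyMem_Del(efd2obj);
            return -1;
        }
        /* ... build the fd_sets, call select(), build the result lists ... */
        PyMem_Del(rfd2obj); PyMem_Del(wfd2obj); PyMem_Del(efd2obj);
    #else
        /* a sane FD_SETSIZE (typically 1024):  the stack is fine */
        pylist rfd2obj[FD_SETSIZE + 3];
        pylist wfd2obj[FD_SETSIZE + 3];
        pylist efd2obj[FD_SETSIZE + 3];
        (void)rfd2obj; (void)wfd2obj; (void)efd2obj;
    #endif
        return 0;
    }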

> Looking at the code, a better solution may be to always allocate this
> thing on the heap, and to let list2set allocate it.  The actual number
> of items *used* is the length of the list of file descriptors + 1 (a
> sentinel).  Even if FD_SETSIZE is not preposterous, the array size
> allocated is the theoretical maximum, which is always way larger than
> the size needed.

I assume the current code is just trying to save 3 malloc/free pairs in the
usual case.  Since I'm not a select wizard, I don't know whether the speed
hit would matter (OTOH, if there are few descriptors it's cheap regardless,
and if there are many it's expensive regardless).
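
If somebody does pick that one up, I'd guess list2set ends up looking
roughly like this -- untested, list2set_alloc is a made-up name, the real
list2set has a different signature, and this reuses the pylist typedef
and #includes from the sketch above:

    static pylist *
    list2set_alloc(PyObject *list, fd_set *set)
    {
        Py_ssize_t i, len;
        pylist *fd2obj;

        if (!PyList_Check(list)) {
            PyErr_SetString(PyExc_TypeError, "arguments must be lists");
            return NULL;
        }
        len = PyList_Size(list);
        /* len entries plus one sentinel -- no FD_SETSIZE-sized table */
        fd2obj = PyMem_New(pylist, len + 1);
        if (fd2obj == NULL) {
            PyErr_NoMemory();
            return NULL;
        }
        FD_ZERO(set);
        for (i = 0; i < len; i++) {
            PyObject *o = PyList_GetItem(list, i);     /* borrowed ref */
            int fd = PyObject_AsFileDescriptor(o);     /* int or fileno() */

            if (fd < 0 || fd >= FD_SETSIZE) {
                if (fd >= FD_SETSIZE)
                    PyErr_SetString(PyExc_ValueError,
                                    "filedescriptor out of range in select()");
                PyMem_Del(fd2obj);
                return NULL;
            }
            fd2obj[i].fd = fd;
            fd2obj[i].obj = o;
            FD_SET(fd, set);
        }
        fd2obj[len].fd = -1;                           /* sentinel */
        return fd2obj;
    }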

> (I also fail to understand why 3 extra elements are allocated; the
> algorithm seems to need only one extra.)

I've stared uncomprehendingly at that too.  Anyone?  It discouraged me from
becoming a select wizard a couple of years ago <wink>.

> Yet another approach would (shudder :) be to forget about arrays and
> use a Python dict, since that's the usage: the data structure stores
> the mapping between file descriptors (unique small ints) and objects
> in the argument lists (either Python ints or objects with a fileno()
> method).
>
> Volunteers?  Not me! :)

Ditto.  Someone who stopped at the first suggestion could be finished
already, though <wink>.
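
Still, the dict flavor needn't be much code; an untested sketch (list2dict
is a made-up name, and it leans on the #includes from the first sketch),
mapping each descriptor to the object it came from:

    /* sketch of the dict idea:  {fd as a Python int : object it came from} */
    static PyObject *
    list2dict(PyObject *list)
    {
        PyObject *map = PyDict_New();
        Py_ssize_t i, len;

        if (map == NULL)
            return NULL;
        len = PyList_Size(list);
        if (len < 0)
            goto fail;
        for (i = 0; i < len; i++) {
            PyObject *o = PyList_GetItem(list, i);     /* borrowed ref */
            int fd = PyObject_AsFileDescriptor(o);
            PyObject *key;

            if (fd < 0)
                goto fail;
            key = PyLong_FromLong(fd);
            if (key == NULL || PyDict_SetItem(map, key, o) < 0) {
                Py_XDECREF(key);
                goto fail;
            }
            Py_DECREF(key);
        }
        return map;
      fail:
        Py_DECREF(map);
        return NULL;
    }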

BTW, do we actually need *any* auxiliary data structure here?  Looks to me
like all these giant pylist thingies could be tossed in favor of making
shallow copies of the input list arguments (ifdlist etc.).  The giant pylists
don't *appear* to accomplish anything except cache the result of a
PyObject_AsFileDescriptor() call per object.  But if that's not an expensive
call (doesn't look expensive to me), it's not worth all this trouble no
matter how spelled.
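
Concretely, the output half of that "no cache" world could look like the
following untested sketch (set2list_nocache is a name I just invented, it
borrows the #includes from the first sketch, and listcopy would be a shallow
copy of ifdlist etc made with something like PyList_GetSlice(), so a
perverse fileno() can't yank the list out from under us):

    /* sketch of skipping the fd cache:  re-derive each fd on the way out */
    static PyObject *
    set2list_nocache(PyObject *listcopy, fd_set *set)
    {
        Py_ssize_t i, len = PyList_Size(listcopy);
        PyObject *ready = PyList_New(0);

        if (ready == NULL || len < 0)
            goto fail;
        for (i = 0; i < len; i++) {
            PyObject *o = PyList_GetItem(listcopy, i); /* borrowed ref */
            int fd = PyObject_AsFileDescriptor(o);     /* the "uncached" call */

            if (fd < 0)
                goto fail;
            /* fds were already range-checked when the sets were built */
            if (FD_ISSET(fd, set) && PyList_Append(ready, o) < 0)
                goto fail;
        }
        return ready;
      fail:
        Py_XDECREF(ready);
        return NULL;
    }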