[Pythonmac-SIG] ctypes problem on MacOS X

Perry Smith pedz at easesoftware.com
Mon Dec 11 22:36:28 CET 2006


Does this setup (Python calling C code) have a way to introduce a C
function prototype?

I *think* that on the PPC, to call a function that has a variable number
of arguments of mixed types, the compiler HAS to see the prototype so it
knows to push the arguments on the stack and not pass them in registers.
I'm getting old and I can't remember how the PPC guys solved this
problem.  The MIPS guys blew it.

I think, as best I can recall, a function like:

void f(double d);

passes d in floating point register 0.  A function like:

void g(int i);

passes i in general purpose register 3.  (The next argument takes
register 4, etc.)

But when the compiler sees:

int printf(char *f, ...);

It says "oh I gotta pass everything on the stack".

Try this: write a tiny C program.  Do not include stdio.h.  Do not
include anything.

Just call printf like you are calling it and see what happens.  I bet
it doesn't work.

Then include stdio.h, and I bet it will magically start working.
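
Something like this (untested, typed from memory):

/* test.c -- deliberately no #include, so there is no prototype for
   printf in scope and the compiler has to guess at the call setup */
int main(void)
{
    printf("int=%d  double=%f\n", 1, 3.0);
    return 0;
}

(gcc will warn about the implicit declaration of printf -- that warning
is exactly the point.)  Then add

#include <stdio.h>

at the top and rebuild.  With the real prototype visible, the compiler
knows printf is varargs and should set the call up correctly.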


On Dec 11, 2006, at 3:17 PM, Russell E Owen wrote:

> At 2:39 PM -0600 2006-12-11, Perry Smith wrote:
>> I don't use floats and doubles much.  But the printf man page implies
>> that all floats are actually passed as doubles.  I vaguely recall that
>> to be the case.
>>
>> But I doubt if the python thingy knows that.  Try not passing floats.
>> You might also have problems trying to pass chars and shorts (since
>> they are promoted to ints).
>
> You're right. I should not have passed floats to printf. But the
> example also showed doubles being mishandled. Here is a more extensive
> example:
>
> import ctypes as ct
> libc = ct.CDLL("libc.dylib")
> libc.printf("int=%d  double=%f\n", ct.c_int(1), ct.c_double(3.0))
> libc.printf("int=%d  double=%f  double=%f\n", ct.c_int(1),  
> ct.c_double(3.0), ct.c_double(3.0))
>
> On MacOS X this prints:
> int=1  double=-1.996155
> int=1  double=-1.996140  double=0.000000
>
> On a Linux box, with a suitably modified ct.CDLL statement, this
> produces the more reasonable:
> int=1  double=3.000000
> int=1  double=3.000000  double=3.000000
>
> I am totally mystified by the MacOS X results.
>
> -- Russell
>
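
To illustrate the promotion business I mentioned above -- a minimal
sketch, this time with the prototype in scope:

#include <stdio.h>

int main(void)
{
    float x = 3.0f;
    /* In a varargs call, a float is promoted to double and chars and
       shorts are promoted to int before being passed.  So %f always
       reads a double off the varargs area; there is no way to push a
       genuine float through "...". */
    printf("float-as-double=%f\n", x);
    return 0;
}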

Perry Smith ( pedz at easesoftware.com )
Ease Software, Inc. ( http://www.easesoftware.com )

Low cost SATA Disk Systems for IBM's p5, pSeries, and RS/6000 AIX systems



