[Pythonmac-SIG] ctypes problem on MacOS X

Russell E Owen rowen at cesmail.net
Mon Dec 11 22:17:29 CET 2006

At 2:39 PM -0600 2006-12-11, Perry Smith wrote:
>I don't use floats and doubles much, but the printf man page implies that
>all floats are actually passed as doubles. I vaguely recall that being the
>case, but I doubt the Python side knows it. Try not passing floats. You
>might also have problems passing chars and shorts, since they are promoted
>to ints.
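As a concrete illustration of the promotion rules Perry mentions: in a C
variadic call, float is promoted to double and char/short to int, so on the
ctypes side the promoted types should be passed explicitly. A minimal sketch
(shown with "libc.so.6", the usual glibc name on Linux; under Python 3 the
format string must be a bytes object):

import ctypes as ct

libc = ct.CDLL("libc.so.6")  # "libc.dylib" on MacOS X

# float is promoted to double in varargs calls, so a c_float would put
# only 4 bytes where printf expects 8; pass c_double instead.
n1 = libc.printf(b"double=%f\n", ct.c_double(3.0))

# char and short are promoted to int, so pass c_int.
n2 = libc.printf(b"char as int=%d\n", ct.c_int(ord("A")))

printf returns the number of characters written, which is a handy sanity
check that the arguments were unpacked as expected.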

You're right. I should not have passed floats to printf. But the 
example also showed doubles being mishandled. Here is a more 
extensive example:

import ctypes as ct
libc = ct.CDLL("libc.dylib")
libc.printf("int=%d  double=%f\n", ct.c_int(1), ct.c_double(3.0))
libc.printf("int=%d  double=%f  double=%f\n", ct.c_int(1),
            ct.c_double(3.0), ct.c_double(3.0))

On MacOS X this prints:
int=1  double=-1.996155
int=1  double=-1.996140  double=0.000000

On a linux box, with a suitably modified ct.CDLL statement, this 
produces the more reasonable:
int=1  double=3.000000
int=1  double=3.000000  double=3.000000
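For reference, the "suitably modified ct.CDLL statement" on Linux differs
only in the library name; a minimal sketch, assuming glibc is available
under its usual name "libc.so.6" (and, under Python 3, that the format
string is passed as bytes):

import ctypes as ct

# On Linux the C library is typically loaded as libc.so.6;
# on MacOS X the equivalent is libc.dylib.
libc = ct.CDLL("libc.so.6")

# printf returns the number of characters written.
n = libc.printf(b"int=%d  double=%f\n", ct.c_int(1), ct.c_double(3.0))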

I am totally mystified by the MacOS X results.

-- Russell
