Need help with an array problem.
bellman at lysator.liu.se
Tue Oct 3 10:03:26 CEST 2006
Dennis Lee Bieber <wlfraed at ix.netcom.com> wrote:
> Of course, there is also the confusion between "type cast" and "type
> conversion" -- at least, for me...
> cast taking the bit-pattern of a value in one "type" and
> interpreting the same bit-pattern as if it were a different "type"
> conversion taking the value of a bit-pattern in one "type" and
> converting it to the bit pattern of the equivalent value in another "type"
From where have you learned those definitions? If it's from C,
then you have read the wrong books. A cast in C is a type
conversion, with the same semantics as you write under "conversion"
above. The C standard (ISO/IEC 9899:1999) says:
6.5.4 Cast operators
Preceding an expression by a parenthesized type name converts the
value of the expression to the named type. This construction is
called a cast. A cast that specifies no conversion has no effect
on the type or value of an expression.
("A cast that specifies no conversion" refers to when you cast
from one type to the same type, e.g. '(int)x' if x is of the type
int.)
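As a tiny illustration of that wording (a minimal sketch of my own,
not part of the program below):

    /* A cast converts the *value*, not the bit pattern. */
    #include <stdio.h>

    int main(void)
    {
        double d = 3.7;
        int n = (int)d;     /* value conversion: truncates toward zero */
        printf("%d\n", n);  /* prints 3 */
        return 0;
    }
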
You may also try this program:
    /* Assumption: sizeof(float)==sizeof(int). This is the most
     * common case on modern computers. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        float f = -17.0, fjunk;
        int i = -23, ijunk;

        printf("Cast:    %d %10.6f\n", (int)f, (float)i);
        memcpy(&fjunk, &i, sizeof i);
        memcpy(&ijunk, &f, sizeof f);
        printf("Bitcopy: %d %10.6f\n", ijunk, fjunk);
        return 0;
    }
If a cast had been what you believed, both printf() statements
above would give the same output. Unless your C compiler uses
some really strange floating point representation, they will
print rather different things. The first one must print
Cast: -17 -23.000000
showing very clearly that a cast is a type conversion. The
second printf() will print some seemingly random numbers, showing
that the same bit patterns mean something very different when
interpreted as another type.
What you might have been confused by is dereferencing a cast
pointer. Add the following statement:
printf("Pointer cast: %d %10.6f\n", *(int*)&f, *(float*)&i);
to the program. It should output the same numbers as the
"Bitcopy" printf(). But what is cast here is the *address* of
the variables, not their actual contents. It is the
*dereferencing* of those cast pointers that interprets the bit
patterns in the variables as if they were another type, not the
cast itself.
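For completeness, that statement can be wrapped in a full program of
its own (a minimal sketch; note that this kind of dereferencing
formally violates the aliasing rules of C99 6.5p7, which is one more
reason to prefer the memcpy() version):

    #include <stdio.h>

    int main(void)
    {
        float f = -17.0;
        int i = -23;

        /* Cast the *addresses*, then dereference: the bit patterns
         * are reinterpreted, not converted. */
        printf("Pointer cast: %d %10.6f\n", *(int*)&f, *(float*)&i);
        return 0;
    }

The integer printed is whatever your machine's representation of
-17.0f happens to look like as an int, so the exact output is
platform-dependent.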
Thomas Bellman, Lysator Computer Club, Linköping University, Sweden
"I don't think [that word] means what you ! bellman @ lysator.liu.se
think it means." -- The Princess Bride ! Make Love -- Nicht Wahr!