
On Wed, 23 Sep 2009 02:35:49 am Masklinn wrote:
On 22 Sep 2009, at 17:46 , Steven D'Aprano wrote:
On Wed, 23 Sep 2009 12:25:32 am Masklinn wrote:
On 22 Sep 2009, at 15:16 , Steven D'Aprano wrote:
On Tue, 22 Sep 2009 01:05:41 pm Mathias Panzenböck wrote:
I don't think this is a valid test to determine how a language is typed. Ok, C++ is more or less weakly typed (for other reasons), but I think you could write something similar in C#, too. And C# is strongly typed.
Weak and strong typing are a matter of degree -- there's no definitive test for "weak" vs "strong" typing, only degrees of weakness. The classic test is whether you can add strings and ints together, but of course that's only one possible test out of many.
And it's a pretty bad one to boot: both Java and C# allow adding strings and ints (whether it's `3 + "5"` or `"5" + 3`) (in fact they allow adding *anything* to a string), but the operation is well defined: convert any non-string involved to a string (via `#toString()`/`.ToString()`) and concatenate.
I don't see why you say it's a bad test. To me, it's a good test, and Java and C# pass it.
Uh no, by your criterion they fail it: both Java and C# do add strings and integers without peeping.
What are you talking about? Using Python syntax, we might write `assert "2" + 2 == 4`. This test will FAIL for Python, Pascal and other strongly typed languages, and PASS for weakly typed languages. (For some languages, like Pascal, you can't even compile the test!)
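A minimal Python sketch of that test, showing both that the implicit mixed-type addition is rejected and that the explicit conversions succeed:

```python
# Python refuses to add str and int implicitly: the addition raises
# TypeError instead of silently converting either operand.
try:
    "2" + 2
    passes_weak_typing_test = True
except TypeError:
    passes_weak_typing_test = False

print(passes_weak_typing_test)  # False: Python fails the "2" + 2 test

# The conversion must be made explicit, in either direction:
print(int("2") + 2)   # 4
print("2" + str(2))   # '22'
```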
If your only criterion is that an operation is well-defined, then "weak typing" becomes meaningless...
I don't think I defined any criterion of strong/weak typing.
You conceded that C# adds strings and ints, but disputed that this fact makes C# weakly typed on the basis that (I quote) "the operation is well defined". There's nothing about the definition of weakly typed languages that requires the implicit conversions to be random, arbitrary or badly defined, ONLY that they are implicit, and by that definition, C# is weakly typed with regard to strings+ints.
As far as I'm concerned, and as I've already mentioned in this thread, the whole weak/strong axis is meaningless and laughable.
I agree that treating weak/strong as a binary state is an over-simplification, but a weak/strong axis is perfectly reasonable. Count the number of possible mixed-type operations permitted by the language (for simplicity, limit it to built-in or fundamental types). What percentage of them succeed without explicit casts or conversions? If that percentage is 0%, then the language is entirely strong. If it is 100%, then the language is entirely weak. In practice, languages will likely fall somewhere in the middle rather than at the ends. This is an objective test for degree of type strength.
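The proposed measurement can be sketched in a few lines of Python. The particular sample of types and operations below is my own illustrative choice, not part of the proposal, so the resulting percentage is only indicative:

```python
import itertools
import operator

# An arbitrary sample of built-in types (with one representative value
# each) and binary operations. Which ones to include is a judgement
# call, so the percentage printed is illustrative only.
values = {int: 3, float: 3.0, str: "3", list: [3]}
ops = [operator.add, operator.mul, operator.lt]

# Every ordered pair of *distinct* types, combined with every operation.
mixed = [(a, b, op)
         for (a, b) in itertools.product(values, repeat=2) if a is not b
         for op in ops]

succeeded = 0
for a, b, op in mixed:
    try:
        op(values[a], values[b])  # no explicit cast or conversion
        succeeded += 1
    except TypeError:
        pass

print(f"{succeeded}/{len(mixed)} mixed-type operations succeed "
      f"({100 * succeeded / len(mixed):.0f}%)")
```

On this sample Python lands well away from both ends of the axis: int/float arithmetic and comparisons succeed, as do sequence repetitions like `3 * "3"`, while most other mixed-type operations raise TypeError.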
This heuristic is not arbitrary.
Of course it is.
Now you're just being irrational.
Automatically converting ints to floats is mathematically reasonable, because we consider e.g. 3 and 3.0 to be the same number.
Do we? Given 3/2 and 3.0/2 don't necessarily give the same answer (some languages don't even consider the first operation valid), I'm not sure we do.
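Both points in this exchange can be seen in a short Python 3 snippet. (A hedge for the era of this thread: in Python 2, which was current in 2009, `3 / 2` gave `1`, the floor-division result that `//` gives today; that is the discrepancy being alluded to.)

```python
# Python treats 3 and 3.0 as the same number, and promotes int to
# float implicitly in mixed arithmetic...
assert 3 == 3.0
assert 3 + 0.5 == 3.5

# ...but which *kind* of division you get can still differ by language
# (and, historically, by Python version).
print(3 / 2)    # true division in Python 3: 1.5
print(3 // 2)   # floor division, the Python 2 behaviour of `/`: 1
print(3.0 / 2)  # 1.5 either way
```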
Ask a mathematician, and he'll say that they are the same. The underlying concept of number treats 3 and 3.0 as the same thing (until you get to some extremely hairy mathematics, by which time the mathematician you ask will say that the question doesn't even make sense). In mathematics, the difference between 3 and 3+0/10 is, not surprisingly, 0. -- Steven D'Aprano