[Python-ideas] Improving the expressivity of function annotations

Masklinn masklinn at masklinn.net
Tue Apr 5 08:27:05 CEST 2011


On 2011-04-04, at 21:47 , Michael Foord wrote:
> On 4 April 2011 20:30, Masklinn <masklinn at masklinn.net> wrote:
>> On 2011-04-04, at 21:05 , Michael Foord wrote:
>>> On 4 April 2011 20:04, Masklinn <masklinn at masklinn.net> wrote:
>>>> On 2011-04-04, at 20:47 , Michael Foord wrote:
>>>>> It's a standard term for languages like C# and Java, but if you don't
>>>>> use these languages there is no reason you should know it. Generics is a
>>>>> solution (hack - but still an elegant hack) that allows you to write
>>>>> "generic" containers and functions that can work with any types whilst
>>>>> still being type safe.
>>>> 
>>>> Why do you find generics to be a hack?
>>>> 
>>> Because with a good dynamic type system they are completely unnecessary.
>> 
>> If you go that way, types themselves are unnecessary "and therefore hacks",
>> static or not.
>> 
>> I don't think that makes much sense, though I can see you were probably
>> replying in jest I was interested in the answer.
> 
> I wasn't entirely joking, and no, a dynamic type system doesn't make types
> themselves redundant
You stated that generics (and static types) were unnecessary. As Church demonstrated in 1936, types themselves are unnecessary.

> just the declaration of them all over the place (and
> often multiple times for the same use in languages like Java and C#).
> Generics are a hack within the language syntax to tell the compiler that
> different types *might* be used (and allow you to refer to these types in
> your implementation without knowing what they will be)
That has nothing to do with types themselves, generic or not. Haskell, ML and many other languages have parametric types and generics (potentially in extensions to the core language) as well as type inference (local and global).
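For readers who haven't met parametric types, here is a sketch in modern Python syntax (the `typing` module postdates this thread, so this is purely illustrative): one generic function definition works over any element type, while a checker still tracks which type is in play at each call site.

```python
from typing import List, TypeVar

T = TypeVar("T")  # a type parameter: "some type", fixed per call

def first(items: List[T]) -> T:
    """Return the first element; the result type follows the element type."""
    return items[0]

# The same definition is type-safe for ints and strings alike:
n = first([1, 2, 3])        # a checker infers T = int here
s = first(["a", "b", "c"])  # and T = str here
```

Note that nothing here requires declaring `T` at every use, which is the complaint about Java and C#: the parametricity lives in one place and inference does the rest.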

If you mean that Java's and C#'s generics suck, I have no problem with that, but I still don't see how that ends up yielding "generics are unnecessary". Even when types are inferred, they're still there and still parametric.

> whereas a smarter compiler could deduce this for itself anyway.
I don't think that is possible (for all cases) in a non-total language unless the compiler can solve the halting problem. But I might be mistaken.
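As a hypothetical illustration of why full deduction is hard (this example is mine, not from the thread): in a dynamic language the type of a result can depend on an arbitrary runtime computation, so inferring one precise type in general would require the compiler to evaluate that computation.

```python
def parse_flag(raw: str):
    # The return type depends on the runtime value of `raw`; to deduce a
    # single precise type, a compiler would effectively have to run the
    # condition below, which in general is arbitrary code.
    if raw.isdigit():
        return int(raw)    # this branch returns an int
    return raw.upper()     # this branch returns a str

print(parse_flag("42"))    # 42 (int)
print(parse_flag("no"))    # NO (str)
```

A checker can still assign such a function a union or parametric type, which is the weaker claim; it is deducing *the* exact type for every program that runs into undecidability.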
