Functional programming

Steven D'Aprano steve+comp.lang.python at
Mon Mar 3 18:27:51 CET 2014

On Tue, 04 Mar 2014 02:01:47 +1100, Chris Angelico wrote:

> This is why it's tricky to put rules in based on type inference. The
> programmer's intent isn't in the picture. 

Of course it is. If I assign 23 to variable x, that signals my intent to 
assign an int to x. By Occam's razor, it is reasonable to extrapolate 
that intent to mean "x is an int", rather than "an int, or a list" or "an 
odd int larger than 7 but smaller than 25", or "any int except 13". Type 
inference picks the type which involves the fewest additional 
assumptions. The programmer can always override the type inference by 
explicitly stating the type.

It works really well in practice, because most of the time you don't need 
a lot of type dynamism. Or even any.
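As a sketch of the point (Rust used purely for illustration; the thread's
examples are Pascal and ML/Haskell): the literal tells the compiler the
minimal type, and an explicit annotation overrides it.

```rust
fn main() {
    // Assigning 23 signals intent: with no other constraints, x is
    // inferred as i32, the type involving the fewest extra assumptions.
    let x = 23;

    // The programmer can always override the inference by stating
    // the type explicitly:
    let y: i64 = 23;

    println!("{} {}", x, y); // prints "23 23"
}
```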

Think about the sort of type declarations you have to do in (say) Pascal, 
and consider how stupid the compiler must be:

function add_one(x: integer): integer;
begin
    add_one := x + 1
end;

Given that x is an integer, and that you add 1 (also an integer) to it, 
is it really necessary to tell the compiler that add_one returns an 
integer? What else could the output type be?
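For contrast, here is the same function sketched as a Rust closure (an
illustration only, not from the original post): the compiler works out the
return type by exactly the reasoning described above.

```rust
fn main() {
    // The compiler already knows x is an i32 and that 1 (also an i32)
    // is added to it, so the closure's return type is inferred as i32;
    // no annotation of the result type is required.
    let add_one = |x: i32| x + 1;
    println!("{}", add_one(41)); // prints 42
}
```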

This was state of the art back in 1970, but these days, if the compiler 
cannot *at least* infer the type of the return result of a function given 
the argument types, the static type system is too dumb to bother with. A 
good static type system can even detect some infinite loops at compile 
time.

This is not cutting-edge technology: ML dates back to the 1970s.

> If Python ever acquires that
> kind of restriction ("here's a list that can contain only this type /
> these types of object"), I would hope that it's left up to the
> programmer, not the compiler, to stipulate.

That's not type inference. That's the sort of ancient and annoying 
obligatory type declaration used by ancient languages with primitive 
type systems, like Pascal and C.
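The difference is easy to see in a language with inference (again, a Rust
sketch for illustration): a container's element type is worked out from its
contents, with no Pascal-style stipulation required.

```rust
fn main() {
    // No element-type declaration needed: xs is inferred as Vec<i32>
    // from its contents.
    let xs = vec![1, 2, 3];

    // Operations on the elements type-check against the inferred type.
    let doubled: Vec<i32> = xs.iter().map(|n| n * 2).collect();
    println!("{:?}", doubled); // prints [2, 4, 6]
}
```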

> That's how it is with Pike
> (if you just say "array", it can take anything), and that's the only way
> to be sure the programmer doesn't have to fight the language.

To be sure, any form of static typing is going to restrict what you can 
do. This isn't intended to imply that static typing is better than 
dynamic typing. But if you have static typing, there's *no point* to it 
if the type system cannot detect bugs, and having to declare types is 
like having to calculate your own GOTO addresses. With a good type system 
like ML's or Haskell's, you're not fighting the compiler: *every* type 
error you get is a real, actual bug in your code.
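A small Rust sketch of that claim (illustrative only): the code that
compiles runs correctly, and the rejected line really is a bug.

```rust
fn main() {
    let xs = vec![1, 2, 3];
    let total: i32 = xs.iter().sum();

    // Uncommenting the next line is rejected at compile time; the
    // rejection reflects a genuine bug, not a fight with the compiler:
    // let bad = total + "one"; // error: cannot add `&str` to `i32`

    println!("{}", total); // prints 6
}
```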

Steven D'Aprano
