[Baypiggies] Type checking - argument for it being bad practice?
jason.lai at gmail.com
Fri Oct 8 08:54:10 CEST 2010
As an alternative to an explicit assert, you may consider creating a
decorator, e.g.:
@enforce_type(inputNumber = Number)
That way, it separates the type checking from the main body of code, and
enforce_type could generate an appropriate detailed error message like
"myfunc expected inputNumber to be of type 'Number' but was passed 'str'"
which is nicer than just an assert. You could also easily demote it to a
warning rather than an exception without changing all of your code (e.g.
type enforcement only for debugging), although you might miss it if the
method just stores the argument and the type error doesn't happen until much
later, which is one argument for early typechecking.
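A minimal sketch of such a decorator (the name enforce_type comes from the
discussion above; the implementation here is one hypothetical way to do it,
using inspect.signature to match arguments to parameter names):

```python
import functools
import inspect
from numbers import Number

def enforce_type(**expected):
    # Hypothetical decorator: check named arguments against expected
    # types and raise a detailed TypeError on mismatch.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = inspect.signature(func).bind(*args, **kwargs)
            for name, typ in expected.items():
                if name in bound.arguments and not isinstance(bound.arguments[name], typ):
                    raise TypeError(
                        "%s expected %s to be of type %r but was passed %r"
                        % (func.__name__, name, typ.__name__,
                           type(bound.arguments[name]).__name__))
            return func(*args, **kwargs)
        return wrapper
    return decorator

@enforce_type(inputNumber=Number)
def myfunc(inputNumber):
    return inputNumber + 1
```

To demote errors to warnings, the raise could be swapped for a
warnings.warn call in one place rather than at every call site.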
Python 3.0 supports function annotations, which allows you to annotate
function parameters with arbitrary objects including type objects (although
it doesn't assign any special meaning to them), which would allow you to do
def myfunc(inputNumber: Number):
Note that numbers.Number (which is a superclass of the standard number types
like int and float) is only available in Python 2.6+.
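For example, the annotation is stored on the function object but never
checked by the interpreter itself:

```python
from numbers import Number

def myfunc(inputNumber: Number):
    # The annotation is recorded in myfunc.__annotations__ but the
    # interpreter assigns no special meaning to it.
    return inputNumber * 2

# numbers.Number is a superclass of the standard numeric types:
assert isinstance(3, Number) and isinstance(2.5, Number)
```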
For non-numeric/non-string types though, you have to be more careful -- for
example it's not uncommon for people to pass around dictionary-like objects
that implement the usual x["key"] protocol but aren't actually dictionaries.
And file-like objects are enshrined in Python's API.
Python 2.6+ has the concept of Abstract Base Classes to allow type-checking
to ensure an object implements a particular interface. It seems a little
inconsistent though -- if you implement __iter__ on a random class MyClass
then isinstance(MyClass(), collections.Iterable) returns True. However, if
you try to implement the collections.Mapping ABC, simply implementing the
methods __getitem__, __iter__, and __len__ is insufficient;
isinstance(MyClass(), collections.Mapping) will return False.
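The inconsistency can be seen directly (note: in modern Python 3 these ABCs
live in collections.abc rather than collections):

```python
from collections.abc import Iterable, Mapping

class MyIter:
    def __iter__(self):
        return iter([1, 2, 3])

class MyMap:
    # Implements all three Mapping abstract methods, but is neither a
    # subclass of Mapping nor registered with it.
    def __getitem__(self, key): return key
    def __iter__(self): return iter(())
    def __len__(self): return 0

# Iterable's subclass hook recognizes any class with __iter__ ...
print(isinstance(MyIter(), Iterable))  # True
# ... but Mapping has no such hook, so duck typing is not enough:
print(isinstance(MyMap(), Mapping))    # False
```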
In that case, you have to make MyClass subclass collections.Mapping, or use
collections.Mapping.register(MyClass). This is OK if you control all the
code, since you can just make sure all your dictionary-like objects inherit
from the right ABC, but probably too strict for open-source library code
where most people expect to pass an object that looks like a dictionary and
have the duck typing just work.
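The register() route looks like this (again using collections.abc as in
modern Python 3; one thing to be aware of is that registration only affects
isinstance/issubclass -- it does not add Mapping's mixin methods like get or
keys to the class):

```python
from collections.abc import Mapping

class MyClass:
    def __getitem__(self, key): return key
    def __iter__(self): return iter(())
    def __len__(self): return 0

# Virtual registration: no inheritance, but isinstance now passes.
Mapping.register(MyClass)
print(isinstance(MyClass(), Mapping))  # True
```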
You could potentially work around this in the type-checking decorator: if
the initial type check fails, iterate over the __abstractmethods__ property
of the required class and make sure all the necessary methods exist.
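A sketch of that fallback check (duck_check is a hypothetical helper name):

```python
from collections.abc import Mapping

def duck_check(obj, abc):
    # First try the real isinstance check; if that fails, fall back to
    # verifying that obj has every abstract method the ABC requires.
    if isinstance(obj, abc):
        return True
    return all(hasattr(obj, name) for name in abc.__abstractmethods__)

class DictLike:
    # Looks like a mapping but neither subclasses nor registers with Mapping.
    def __getitem__(self, key): return 42
    def __iter__(self): return iter(())
    def __len__(self): return 0
```

This recovers duck typing for callers who never heard of your ABCs, at the
cost of only checking that the methods exist, not that they behave correctly.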
For your own user-defined classes which don't have a well-defined protocol
like the collection ABCs though, it's probably more trouble than it's worth
to define an explicit interface, in which case you might as well just stick
to duck typing.
On Thu, Oct 7, 2010 at 9:22 PM, Seth Friedman <sfseth at gmail.com> wrote:
> Hi Mark,
> Thanks for your response.
> I can't quite wrap my head around your argument. I don't think one or two
> lines per input is a lot of / too much time gatekeeping input variables.
> Philosophically I think I see your point: I'm trying to impose some rigidity
> in a domain that doesn't like it. But there are lots of other values that
> python brings - it seems extreme to go to a different super-verbose language
> just for type-checking. Not to mention that I don't trust those other
> languages to actually do it for me as well as python's assert(isinstance...)
> style - Java for instance strikes me as a prime candidate for false-negative
> type checking, they've got a lot of subtle type variations.
> Is something like
> assert(isinstance(inputNumber,isActuallyANumberWeExpected)) really that
> cumbersome, aside from my long variable names? The biggest examples I can
> think of are the distinctions between number and string of number, and list
> of numbers, which seem like totally legit "how far does this function need
> to go" question, asserting what is expected seems to illuminate rather than
> I mean, what's wrong with code asserting that what it got passed was what
> was expected? If there's blind faith that stuff will just work beneath the
> scenes, ok, but when i *know* it won't, why not fail there, and explicitly?
> If I write a function that can be called from python code or command line,
> I might get a string of a number or an int, and I will probably forget 6
> months from now what I wrote the code to handle, without some kind of
> prompt. The closer that prompt is to the disconnect, the better.
> On Thu, Oct 7, 2010 at 8:20 PM, Mark Voorhies <mvoorhie at yahoo.com> wrote:
>> On Thursday, October 07, 2010 07:32:18 pm Seth Friedman wrote:
>> > Hi Baypiggies,
>> > I've encountered, a couple of times now, a sort of categorical "dont do
>> > typechecking" attitude on various blog posts and I haven't seen
>> rationale to
>> > back it up. Enough that I'm now curious: what's the deal?
>> > So I put the question to you all, typechecking: good or bad thing? Or
>> > pointless philosophical debate?
>> The lack of type signatures in function declarations is in line with
>> Python's role as a rapid prototyping language -- if you're spending a lot
>> of lines on type-checking a function's arguments, you might be better off
>> in a language that does that work for you (e.g., Java, C++, ...). There is
>> a similar argument for not putting a lot of work into enforcing private
>> variables in Python.
>> That said, I don't have a strong opinion on this, and I've heard the
>> "traits" system
>> in the Enthought Python Distribution recommended as a way to assert more
>> control over the types that a function accepts.