Decorator inside a function? Is there a way?

Ron_Adam radam2 at tampabay.rr.com
Sun Apr 3 16:20:56 EDT 2005


On 3 Apr 2005 11:17:35 -0700, "George Sakkis" <gsakkis at rutgers.edu>
wrote:

>>def define(func):
>>    if not ENABLE_TYPECHECKING:
>>        return lambda func: func
>>    # else decorate func
>
>A small correction: The argument of the decorator is not 'func' but the
>parameter checks you want to enforce. A template for define would be:
>
>def define(inputTypes, outputType):
>    if not ENABLE_TYPECHECKING:
>        return lambda func: func
>    def decorate(func):
>        def typecheckedFunc(*args,**kwds):
>            ##### TYPECHECK *args, **kwds HERE #####
>            r = func(*args,**kwds)
>            ##### TYPECHECK r HERE #####
>            return r
>        return typecheckedFunc
>    return decorate
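
For context, the template gets used as a decorator factory, roughly like
this (the (int, int), int call format is only my guess at how inputTypes
and outputType are meant to be passed, not George's actual convention):

ENABLE_TYPECHECKING = True   # module-level switch the template tests

@define((int, int), int)
def add(x, y):
    return x + y

add(1, 2)   # wrapped by typecheckedFunc; the blank check placeholders just pass it through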

This is the same pattern I used except without the enable/disable at
the top.

The inline type check function also checks the TYPECHECK and TYPESTRICT
flags, which default to True and False respectively, to determine how
strict the type checking should be.  With TYPESTRICT == True, it raises
an error if an argument is not the correct type, even if it has the
exact value.  With TYPESTRICT == False, it tries to convert the object,
then checks the conversion by converting it back to the original type.
If the round trip still compares equal, it returns the converted object
in the specified type.
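
Roughly, that behaviour amounts to something like this (check_arg is
just a placeholder name for the inline check, not my actual test code):

TYPECHECK = True     # master switch: is any checking done at all?
TYPESTRICT = False   # True: exact type required; False: reversible conversion allowed

def check_arg(value, expected_type):
    if not TYPECHECK:
        return value
    if isinstance(value, expected_type):
        return value
    if TYPESTRICT:
        raise TypeError("expected %s, got %s"
                        % (expected_type.__name__, type(value).__name__))
    # Lenient mode: convert, then convert back to verify nothing was lost.
    converted = expected_type(value)
    if type(value)(converted) == value:
        return converted
    raise TypeError("cannot convert %r to %s without losing data"
                    % (value, expected_type.__name__))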

>Depending on how much flexibility you allow in inputTypes, filling in
>the typechecking logic can range from easy to challenging. For example,
>does typechecking have to be applied to all arguments, or do you allow
>non-typechecked arguments? Can it handle *varargs and **kwdargs in the
>original function? An orthogonal extension is to support 'templated
>types' (a la C++), so that you can check if something is 'a dict with
>string keys and lists of integers for values'. I would post my module
>here or the cookbook but at 560 (commented) lines it's a bit long to
>qualify for a recipe :-)
>
>George

Sounds like your version does quite a bit more than my little test
functions. :)  

I question how far type checking should go before you are better off
with a confirmtypes() function that can do a deep type check. And then
how much flexibility should that have? 

My viewpoint is that type checking should be available for the
singleton types, with conversions allowed only if data integrity can be
ensured, i.e. the conversion is reversible and returns an "identical"
result.

def type_convert(a, t):
    # Convert 'a' to type 't' only if the conversion is reversible.
    b = t(a)
    # Round-trip back to the original type to verify nothing was lost.
    aa = type(a)(b)
    if a == aa:
        return b
    else:
        raise TypeError("%r cannot be converted to %s without loss"
                        % (a, t.__name__))
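
On simple values the check behaves like this:

type_convert(3, float)     # returns 3.0 -- 3 -> 3.0 -> 3 round-trips cleanly
type_convert(3.5, int)     # raises TypeError -- 3.5 -> 3 -> 3.0 loses the .5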

In cases where a conversion is wanted but type checking raises an
error, an explicit conversion function or method should be used
instead.
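
For instance, when a lossy conversion really is wanted, spelling it out
keeps the intent visible (just an illustration):

n = int(round(3.7))    # deliberate, explicit lossy conversion: 3.7 -> 4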

For containers and more complex objects, deep type checking should be
available through a general function that can compare an object to a
template of types specific to that object. It's important to use a
template instead of a sample object, because a sample could have been
changed.
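
A minimal sketch of what such a template check might look like (the
confirm_types function and the template format are hypothetical, only
here to make the idea concrete):

def confirm_types(obj, template):
    # A bare type in the template means "an instance of this type".
    if isinstance(template, type):
        return isinstance(obj, template)
    # A one-entry dict template: {key_type: value_template}.
    if isinstance(template, dict):
        if not isinstance(obj, dict):
            return False
        (key_t, value_t), = template.items()
        return all(confirm_types(k, key_t) and confirm_types(v, value_t)
                   for k, v in obj.items())
    # A one-element list or tuple template: every item must match it.
    if isinstance(template, (list, tuple)):
        if not isinstance(obj, type(template)):
            return False
        return all(confirm_types(item, template[0]) for item in obj)
    return False

confirm_types({'a': [1, 2], 'b': []}, {str: [int]})   # True
confirm_types({'a': [1, 'x']}, {str: [int]})          # False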

It's all about protecting the data content with a high degree of
confidence.  In general, 98% of the time the current Python way would
be adequate, but those remaining 2% are important enough to warrant
the additional effort that type checking takes.

On another note, there's the possibility that type checking in Python
source code could make writing a compiler easier.

Another idea is to give a name a type preference, and then overload the
assignment operation to check that preference before rebinding the name
to a new object.  It could probably be done with a second dictionary in
the namespace holding {name: type} pairs.  With that approach you only
need to give key variables a type; they then keep that type preference
until they are assigned a new type or removed from the dictionary.  The
down side is that it could slow things down.
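
In plain Python the idea might look something like this (TypedNamespace
and set_type are purely hypothetical, sketched only to show the
{name: type} lookup before rebinding):

class TypedNamespace(object):
    def __init__(self):
        # The second dictionary: {name: preferred type}.
        object.__setattr__(self, '_type_prefs', {})

    def set_type(self, name, t):
        # Record a type preference for 'name'.
        self._type_prefs[name] = t

    def __setattr__(self, name, value):
        # Consult the preference dictionary before rebinding the name.
        preferred = self._type_prefs.get(name)
        if preferred is not None and not isinstance(value, preferred):
            raise TypeError("%s expects %s, got %s"
                            % (name, preferred.__name__, type(value).__name__))
        object.__setattr__(self, name, value)

ns = TypedNamespace()
ns.set_type('count', int)
ns.count = 3           # allowed
ns.count = 'three'     # raises TypeError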

Cheers,
Ron



