There's a whole matrix of these and I'm wondering why the matrix is
currently sparse rather than implementing them all. Or rather, why we
can't stack them as:
class foo(object):
    @classmethod
    @property
    def bar(cls, ...):
        ...
Essentially the permutations are, I think:
{'unadorned'|abc.abstract}{'normal'|static|class}{method|property|non-callable
attribute}.
concreteness  implicit first arg  type                    name                                                 comments
------------  ------------------  ----------------------  ---------------------------------------------------  -----------
{unadorned}   {unadorned}         method                  def foo():                                           exists now
{unadorned}   {unadorned}         property                @property                                            exists now
{unadorned}   {unadorned}         non-callable attribute  x = 2                                                exists now
{unadorned}   static              method                  @staticmethod                                        exists now
{unadorned}   static              property                @staticproperty                                      proposing
{unadorned}   static              non-callable attribute  {degenerate case}                                    unnecessary
{unadorned}   class               method                  @classmethod                                         exists now
{unadorned}   class               property                @classproperty or @classmethod;@property             proposing
{unadorned}   class               non-callable attribute  {degenerate case}                                    unnecessary
abc.abstract  {unadorned}         method                  @abc.abstractmethod                                  exists now
abc.abstract  {unadorned}         property                @abc.abstractproperty                                exists now
abc.abstract  {unadorned}         non-callable attribute  @abc.abstractattribute or @abc.abstract;@attribute   proposing
abc.abstract  static              method                  @abc.abstractstaticmethod                            exists now
abc.abstract  static              property                @abc.abstractstaticproperty                          proposing
abc.abstract  static              non-callable attribute  {degenerate case}                                    unnecessary
abc.abstract  class               method                  @abc.abstractclassmethod                             exists now
abc.abstract  class               property                @abc.abstractclassproperty                           proposing
abc.abstract  class               non-callable attribute  {degenerate case}                                    unnecessary

({degenerate case} = variables don't have arguments, so the
static/class distinction does not apply, making these combinations
unnecessary.)
I think the meanings of the new ones are pretty straightforward, but in
case they are not...
@staticproperty - like @property only without an implicit first
argument. Allows the property to be called directly from the class
without requiring a throw-away instance.
@classproperty - like @property, only the implicit first argument to the
method is the class. Allows the property to be called directly from the
class without requiring a throw-away instance.
@abc.abstractattribute - a simple, non-callable variable that must be
overridden in subclasses.
@abc.abstractstaticproperty - like @abc.abstractproperty only for
@staticproperty
@abc.abstractclassproperty - like @abc.abstractproperty only for
@classproperty
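To make the proposal concrete, here is a minimal sketch (mine, purely
illustrative, not a reference implementation) of how @classproperty
could be written as a descriptor today; @staticproperty would be
analogous, except that the getter would be called with no arguments:

class classproperty(object):
    """Like property, but the getter receives the class rather
    than an instance as its implicit first argument."""
    def __init__(self, fget):
        self.fget = fget
    def __get__(self, obj, objtype=None):
        if objtype is None:
            objtype = type(obj)
        return self.fget(objtype)

class Circle(object):
    _radius = 2
    @classproperty
    def area(cls):
        return 3.14159 * cls._radius ** 2

print(Circle.area)  # works on the class itself; no instance needed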
--rich
I think it would be a good idea if Python tracebacks could be translated
into languages other than English - and it would set a good example.
For example, using French as my default local language, instead of
>>> 1/0
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ZeroDivisionError: integer division or modulo by zero
I might get something like
>>> 1/0
Suivi d'erreur (appel le plus récent en dernier) :
Fichier "<stdin>", à la ligne 1, dans <module>
ZeroDivisionError: division entière ou modulo par zéro
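Such a translation could even be prototyped today with a custom
excepthook; a minimal sketch (the catalogue and its single entry are
invented for illustration):

import sys
import traceback

CATALOGUE = {
    "Traceback (most recent call last):":
        "Suivi d'erreur (appel le plus récent en dernier) :",
}

def translating_excepthook(exc_type, exc, tb):
    # Render the traceback as usual, then substitute any phrase we
    # have a translation for before printing it.
    for line in traceback.format_exception(exc_type, exc, tb):
        for english, french in CATALOGUE.items():
            line = line.replace(english, french)
        sys.stderr.write(line)

sys.excepthook = translating_excepthook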
André
Here's an updated version of the PEP reflecting my
recent suggestions on how to eliminate 'codef'.
PEP: XXX
Title: Cofunctions
Version: $Revision$
Last-Modified: $Date$
Author: Gregory Ewing <greg.ewing(a)canterbury.ac.nz>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 13-Feb-2009
Python-Version: 3.x
Post-History:
Abstract
========
A syntax is proposed for defining and calling a special type of generator
called a 'cofunction'. It is designed to provide a streamlined way of
writing generator-based coroutines, and allow the early detection of
certain kinds of error that are easily made when writing such code, which
otherwise tend to cause hard-to-diagnose symptoms.
This proposal builds on the 'yield from' mechanism described in PEP 380,
and describes some of the semantics of cofunctions in terms of it. However,
it would be possible to define and implement cofunctions independently of
PEP 380 if so desired.
Specification
=============
Cofunction definitions
----------------------
A cofunction is a special kind of generator, distinguished by the presence
of the keyword ``cocall`` (defined below) at least once in its body. It may
also contain ``yield`` and/or ``yield from`` expressions, which behave as
they do in other generators.
From the outside, the distinguishing feature of a cofunction is that it cannot
be called the same way as an ordinary function. An exception is raised if an
ordinary call to a cofunction is attempted.
Cocalls
-------
Calls from one cofunction to another are made by marking the call with
a new keyword ``cocall``. The expression
::
    cocall f(*args, **kwds)
is evaluated by first checking whether the object ``f`` implements
a ``__cocall__`` method. If it does, the cocall expression is
equivalent to
::
    yield from f.__cocall__(*args, **kwds)
except that the object returned by __cocall__ is expected to be an
iterator, so the step of calling iter() on it is skipped.
If ``f`` does not have a ``__cocall__`` method, or the ``__cocall__``
method returns ``NotImplemented``, then the cocall expression is
treated as an ordinary call, and the ``__call__`` method of ``f``
is invoked.
Objects which implement __cocall__ are expected to return an object
obeying the iterator protocol. Cofunctions respond to __cocall__ the
same way as ordinary generator functions respond to __call__, i.e. by
returning a generator-iterator.
Certain objects that wrap other callable objects, notably bound methods,
will be given __cocall__ implementations that delegate to the underlying
object.
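To illustrate (this example is not part of the specification), the
pattern that ``cocall`` is intended to streamline can be written with
today's generators; all names below are invented::

    def fetch(url):
        # A coroutine written by hand: every call to another
        # coroutine must be spelled out with 'yield from'.
        data = yield from read_socket(url)
        return data

    def read_socket(url):
        yield  # pretend to wait for I/O
        return "payload from %s" % url

    def run(coro):
        # Drive a coroutine to completion and return its value.
        try:
            while True:
                next(coro)
        except StopIteration as e:
            return e.value

    print(run(fetch("http://example.com")))

Under this proposal, the body of ``fetch`` would instead be written as
``data = cocall read_socket(url)``.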
Grammar
-------
The full syntax of a cocall expression is described by the following
grammar lines:
::
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
Note that this syntax allows cocalls to methods and elements of sequences
or mappings to be expressed naturally. For example, the following are valid:
::
    y = cocall self.foo(x)
    y = cocall funcdict[key](x)
    y = cocall a.b.c[i].d(x)
Also note that the final calling parentheses are mandatory, so that for example
the following is invalid syntax:
::
    y = cocall f    # INVALID
New builtins, attributes and C API functions
--------------------------------------------
To facilitate interfacing cofunctions with non-coroutine code, there will
be a built-in function ``costart`` whose definition is equivalent to
::
    def costart(obj, *args, **kwds):
        try:
            m = obj.__cocall__
        except AttributeError:
            result = NotImplemented
        else:
            result = m(*args, **kwds)
        if result is NotImplemented:
            raise TypeError("Object does not support cocall")
        return result
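As an illustration (not part of the specification), any object
providing a suitable ``__cocall__`` method can already be driven
through ``costart`` as defined above, since the definition is pure
Python::

    class Greeter:
        # Illustrative object implementing the proposed protocol.
        def __cocall__(self, name):
            yield  # suspend once, as a cofunction might
            return "hello, %s" % name

    it = costart(Greeter(), "world")
    try:
        while True:
            next(it)
    except StopIteration as e:
        print(e.value)  # hello, world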
There will also be a corresponding C API function
::
    PyObject *PyObject_CoCall(PyObject *obj, PyObject *args, PyObject *kwds)
It is left unspecified for now whether a cofunction is a distinct type
of object or, like a generator function, is simply a specially-marked
function instance. If the latter, a read-only boolean attribute
``__iscofunction__`` should be provided to allow testing whether a given
function object is a cofunction.
Motivation and Rationale
========================
The ``yield from`` syntax is reasonably self-explanatory when used for the
purpose of delegating part of the work of a generator to another function. It
can also be used to good effect in the implementation of generator-based
coroutines, but it reads somewhat awkwardly when used for that purpose, and
tends to obscure the true intent of the code.
Furthermore, using generators as coroutines is somewhat error-prone. If one
forgets to use ``yield from`` when it should have been used, or uses it when it
shouldn't have, the symptoms that result can be extremely obscure and confusing.
Finally, sometimes there is a need for a function to be a coroutine even though
it does not yield anything, and in these cases it is necessary to resort to
kludges such as ``if 0: yield`` to force it to be a generator.
The ``cocall`` construct addresses the first issue by making the syntax
directly reflect the intent, that is, that the function being called forms
part of a coroutine.
The second issue is addressed by making it impossible to mix coroutine and
non-coroutine code in ways that don't make sense. If the rules are violated, an
exception is raised that points out exactly what and where the problem is.
Lastly, the need for dummy yields is eliminated by making it possible for a
cofunction to call both cofunctions and ordinary functions with the same syntax,
so that an ordinary function can be used in place of a cofunction that yields
zero times.
Record of Discussion
====================
An earlier version of this proposal required a special keyword ``codef`` to be
used in place of ``def`` when defining a cofunction, and disallowed calling an
ordinary function using ``cocall``. However, it became evident that these
features were not necessary, and the ``codef`` keyword was dropped in the
interests of minimising the number of new keywords required.
The use of a decorator instead of ``codef`` was also suggested, but the current
proposal makes this unnecessary as well.
It has been questioned whether some combination of decorators and functions
could be used instead of a dedicated ``cocall`` syntax. While this might be
possible, to achieve equivalent error-detecting power it would be necessary
to write cofunction calls as something like
::
    yield from cocall(f)(args)
making them even more verbose and inelegant than an unadorned ``yield from``.
It is also not clear whether it is possible to achieve all of the benefits of
the cocall syntax using this kind of approach.
Prototype Implementation
========================
An implementation of an earlier version of this proposal in the form of patches
to Python 3.1.2 can be found here:
http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/cofunctions.h…
If this version of the proposal is received favourably, the implementation will
be updated to match.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:
Hello,
PEP-3151 < http://www.python.org/dev/peps/pep-3151/ > mentions a
really weird syntax for pattern-matching. I was hoping I could suggest
an alternative that's both more concise, and possible to implement
without doing something drastic like changing existing syntax or
semantics.
The PEP offers the pattern-matching syntax:
> except IOError as e if e.errno == errno.ENOENT: ...
I'd instead suggest something along the lines of
> except io_error(errno.ENOENT): ...
Implementing this is fairly straightforward; I've included an
implementation at the bottom of this email. Raising an exception
becomes `raise io_error(errno.ENOENT)(msg)`. Some notational sugar
could be added (for example, `raise io_error(errno.ENOENT, msg)` could
also be made to work). I'm not fussed about the details.
I personally prefer keeping the errnos as a big part of handling
exceptions; they're common across OSes and involve less research /
memorization for those that are already aware of errno. I guess what
I'd like to see is the same exception hierarchy proposed by the PEP,
but with the addition of allowing errnos to be specified by
pattern-matching, so that errors not covered by the hierarchy, or more
specific than it, can be concisely caught. However, I'm not
really well-versed in the pros and cons of all of this.
Above all, I'd like for the pattern matching alternative to be a bit
more reasonable. It doesn't have to be verbose and it doesn't have to
involve new syntax. Apologies for any mistakes in the code, they are
my own.
Here's the code:
# evil global state or something
error_classes = {}

def io_error(errno_, msg=None):  # or something, you get the idea
    try:
        cls = error_classes[errno_]
    except LookupError:
        class IOErrorWithErrno(IOError):
            errno = errno_
        cls = error_classes[errno_] = IOErrorWithErrno
    return cls

# example of usage
import errno

try:
    raise io_error(errno.ENOENT)("<Error message>")
except io_error(errno.ENOENT):
    print("success")
Thanks for your time!
Devin Jeanpierre
A few decorator recipes that might be worthwhile to add to functools:
A way to assign annotations to functions/classes:
def annotations(**annots):
    def deco(obj):
        if hasattr(obj, '__annotations__'):
            obj.__annotations__.update(annots)
        else:
            obj.__annotations__ = annots
        return obj
    return deco

_NONE = object()

def getannot(obj, key, default=_NONE):
    if hasattr(obj, '__annotations__'):
        if default is _NONE:
            return obj.__annotations__[key]
        else:
            return obj.__annotations__.get(key, default)
    elif default is _NONE:
        raise KeyError(key)
    else:
        return default

def setannot(obj, key, value):
    if hasattr(obj, '__annotations__'):
        obj.__annotations__[key] = value
    else:
        obj.__annotations__ = {key: value}
Usage:
>>> @annotations(foo='bar', egg='spam')
... def foo():
...     pass
...
>>> getannot(foo, 'egg')
'spam'
A way to assign values to classes/functions (not of much use for classes, of course):
def assign(**values):
    def deco(obj):
        for key in values:
            setattr(obj, key, values[key])
        return obj
    return deco
Usage:
>>> @assign(bla='bleh', x=12)
... def foo():
...     pass
...
>>> foo.x
12
Another spin-off from the "[Python-Dev] PyObject_RichCompareBool
identity shortcut" thread:
> I would like to discuss another peculiarity of NaNs:
>
>>>> float('nan') < 0
> False
>>>> float('nan') > 0
> False
>
> This property in my experience causes much more trouble than nan ==
> nan being false. The problem is that common sorting or binary search
> algorithms may degenerate into infinite loops in the presence of nans.
> This may even happen when searching for a finite value in a large
> array that contains a single nan. Errors like this do happen in the
> wild and and after chasing a bug like this programmers tend to avoid
> nans at all costs. Oftentimes this leads to using "magic"
> placeholders such as 1e300 for missing data.
>
> Since py3k has already made None < 0 an error, it may be reasonable
> for float('nan') < 0 to raise an error as well (probably ValueError
> rather than TypeError). This will not make lists with nans sortable
> or searchable using binary search, but will make associated bugs
> easier to find.
>
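The failure mode is easy to demonstrate (my example, not from the
quoted thread): a single NaN silently breaks sorting, and binary
search on the resulting list can then give wrong answers:

import bisect

data = sorted([5.0, float('nan'), 1.0, 3.0, 2.0, 4.0])
print(data)  # not in sorted order: NaN compares False to everything
print(bisect.bisect_left(data, 4.0))  # may point at the wrong slot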
Hello everybody,
My name is Christophe Schlick, from the University of Bordeaux,
France. I've been using Python for years, but it happens that this is
my first post on a Python-related mailing list. For the first time, I
feel that I may have some topic interesting enough for the Python
community, so I would be very happy to get any kind of feeback
(positive or negative) on it. Thanks in advance...
The goal of this post is to propose a new syntax for defining
decorators in Python. I would like to fully explain the rationale and
the benefits of this proposal, as I guess that the discussion might be
more fruitful if all details are written down once for all. However
this has led me to a very long post (a kind of informal PEP) that was
blocked by the moderators. So, following their recommendation, I've
divided the text in three successive parts: (1) the rationale, (2) the
proposal, (3) the proposed implementation and additional
questions/remarks.
For better clarity of the description, I will call the proposed syntax
"new-style decorators" (NSD, for short) and the classic syntax
"old-style decorators" (OSD), following the terms used some years ago
with the (re)definition of classes. By the way, the process of
introducing NSD shares many aspects with the process used for
introducing new-style classes, including the following features:
* No existing syntax is broken: the only thing required to create a
new-style decorator is to decorate itself with a newly-introduced
decorator called... "decorator" (well, this sentence is less recursive
than it might appear on first reading).
* Everything that can be done with OSD is possible with NSD, but NSD
offer additional, more user-friendly features.
* NSD can peacefully live together with OSD in the same code. An NSD
may even decorate an OSD (and vice versa); however, some properties of
the NSD are lost with such a combination.
--------------------------------------------------
1 - Why bother with a new syntax?
To explain what I don't like about the current syntax of decorators,
let me take the example of a basic decorator (called
'old_style_repeat_fix') that simply calls its undecorated function 3
times, and adds some tracing to the standard output. Here is the
code:
#---
def old_style_repeat_fix(func):
    """docstring for decorating function"""
    # @wraps(func)
    def dummy_func_name_never_used(*args, **keys):
        """docstring for decorated function"""
        print "apply 'old_style_repeat_fix' on %r" % func.__name__
        for loop in range(3): func(*args, **keys)
    return dummy_func_name_never_used
#---
Even if such code snippets have become quite usual since the
introduction of decorators in Python 2.4, many people have argued (and
I am obviously one of them) that the decorator syntax is a bit
cumbersome. First, it imposes the use of nested functions, which often
reduces readability by moving the function signature and docstring too
far from the corresponding code. Second, as anonymous lambda
expressions cannot usually be employed for decorating functions, the
programmer has no choice but to create a dummy function name (used
only for one single 'return' statement), which is never a good coding
principle, whatever the programming language. Once you have tried to
teach decorators to a bunch of students, you really understand how
much this syntax adds to the difficulty of grasping the idea.
The situation is even worse when the decorator needs some arguments:
let's create an extended decorator (called 'old_style_repeat_var')
that takes an integer 'n' to control the number of iterations, and
a boolean 'trace' to control the tracing behavior. Here is the code:
#---
def old_style_repeat_var(n=3, trace=True):
    """docstring for decorating function"""
    def dummy_deco_name_never_used(func):
        """docstring never used"""
        # @wraps(func)
        def dummy_func_name_never_used(*args, **keys):
            """docstring for decorated function"""
            if trace:
                print "apply 'old_style_repeat_var' on %r" % func.__name__
            for loop in range(n): func(*args, **keys)
        return dummy_func_name_never_used
    return dummy_deco_name_never_used
#---
This time a two-level function nesting is required and the code needs
two dummy names for these two nested functions. Note that the
docstring of the middle nested function is even totally invisible to
introspection tools. So whether you like nested functions or not,
there is some evidence here that the current syntax is somehow
suboptimal.
Another drawback of OSD is that they do not collaborate gracefully with
introspection and documentation tools. For instance, let's apply our
decorator to a silly 'test' function:
#---
@old_style_repeat_var(n=5)  # 'trace' keeps its default value
def test(first=0, last=0):
    """docstring for undecorated function"""
    print "test: first=%s last=%s" % (first, last)
#---
Now, if we try 'help' on it, we get the following answer:
#---
>>> help(test)
dummy_func_name_never_used(*args, **keys)
    docstring for decorated function
#---
which means that neither the name, nor the docstring, nor the
signature of the 'test' function is correct. Things are a little
better when using the 'wraps' function from the standard 'functools'
module (simply uncomment the line '@wraps(func)' in the code of
'old_style_repeat_var'):
#---
>>> help(test)
test(*args, **keys)
    docstring for undecorated function
#---
'@wraps(func)' copies the name and the docstring from the undecorated
function to the decorated one, in order to provide some useful
information when using 'help'. However, the signature of the function
still comes from the decorated function, not the genuine one. The
reason is that signature copying is not an easy process. The only
solution is to inspect the undecorated function and then use 'exec' to
generate a wrapper with the correct signature. This is basically what
is done in the 'decorator' package (available on PyPI) written by
Michele Simionato. There was a lengthy discussion on python-dev (in
2009, I guess, but I can't find the archive right now) about whether
or not to include this package in the standard library.
As far as I know, there is currently no clear consensus on whether this
is a good idea or not, because there has always been a mixed feeling
in the community about transparent copying from the undecorated to the
decorated function (even about the 'wraps' function): on one hand,
transparent copying is convenient for immediate help, for automatic
documentation and for introspection tools, but on the other hand, it
totally hides the decorating process, which is not always what is
wanted... or needed.
The syntax for NSD presented in this proposal tries to improve this
situation by offering two desirable features, according to the Zen of
Python:
* "flat is better than nested": no nested functions with dummy names
are required, even when parameters are passed to the decorator; only
one single decorating function has to be written by the programmer,
whatever the kind of decorator.
* "explicit is better than implicit": introspection of a decorated
function explicitely reveals the decoration process, and allows one to
get the name/signature/docstring not only for the corresponding
undecorated function, but also for any number of chained decorators
that have been applied on it.
------
to be continued in Part 2...
CS
Returning to the original topic of the post on the comparison of
objects in containers: could we introduce a special method __equals__
on objects, used for comparison in containers (by default, if it is
not defined, the usual method would be used, for compatibility)? This
is much as is done in C#, where an object's special Equals method is
used to compare items in a collection.
-----
Zaur
I should clarify the reason for the possible introduction of a
special method __equals__. Objects in Python are often interpreted as
values, and __eq__, IMHO, is mostly used for comparing objects as
values. Introducing __equals__ would explicitly distinguish comparing
objects as objects (__equals__) from comparing them as values
(__eq__). If __equals__ is not defined, then __eq__ would be used.
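A sketch (mine, purely illustrative) of the proposed lookup rule:

def container_equal(a, b):
    # Containers would prefer __equals__ and fall back to __eq__.
    equals = getattr(type(a), '__equals__', None)
    if equals is not None:
        return equals(a, b)
    return a == b

class Item:
    def __init__(self, value):
        self.value = value
    def __eq__(self, other):
        return self.value == other.value  # compare as values
    def __equals__(self, other):
        return self is other              # compare as objects

x, y = Item(1), Item(1)
print(x == y)                 # True: equal as values
print(container_equal(x, y))  # False: distinct objects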
--
Zaur
Hello, guys.
I posted this idea a few months ago. Here is the revised version.
Goal:
Let _all_ alphanumeric keywords be legal as names for variables, functions and
classes, even the ones that are reserved words now.
Rationale:
1. Python took most of the good English words as reserved tokens, and the
situation gets worse from version to version. I often have a hard time
searching for acceptable synonyms.
2. Because of that, old Python programs cease to work, even if they do not use
any abandoned features. Their only sin is using certain words that later
versions of Python have stolen away.
3. Sometimes one needs to import keywords from some other language, XML being
an example, or to "translate" another programming language into Python in one
way or another. Keyword reservation is a big problem then; it does not allow
one to use the natural Python syntax.
Solution:
Let the parser treat all keywords that come after a dot (".") as regular
identifiers.
For attributes, nothing changes:
> boo.for = 7
For names that are not attributes, only one syntax change is needed: let a dot
precede any identifier.
> .with = 3
Of course, if a keyword is not preceded by a dot, it would be treated as a
reserved word, just like now.
> with = 3 # syntax error
There is only one case where a dot is used as a prefix of an identifier and
that is a relative module import.
> from .boo import foo
My change is consistent with this case.
One benefit would be that converting current programs to work with future
versions would be a matter of a simple grep.
Python is a great language. In my opinion, this change is the one last step to
make it every geeky teenager's wet dream: the language where one can redefine
almost anything. When I work with some problem, I always try to translate it to
Python, solve and translate back. Prohibited identifier names are the main
obstacle.
So, let's set the identifiers free and swallow all the world, making Python the
least common denominator of every computer problem on this planet.
Regards,
Bartosz Tarnowski