I think it would be a good idea if Python tracebacks could be translated
into languages other than English - and it would set a good example.
For example, with French as my default locale, instead of
>>> 1/0
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ZeroDivisionError: integer division or modulo by zero
I might get something like
>>> 1/0
Suivi d'erreur (appel le plus récent en dernier) :
Fichier "<stdin>", à la ligne 1, dans <module>
ZeroDivisionError: division entière ou modulo par zéro
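As a rough illustration of one way an application could already approximate this today (just a sketch, not part of the proposal; the "python-tracebacks" gettext domain is made up), an excepthook can translate the fixed framing strings:

    import gettext
    import sys
    import traceback

    # Sketch only: translate the fixed traceback strings via a hypothetical
    # "python-tracebacks" gettext catalog; the exception messages themselves
    # are left untranslated.
    _ = gettext.translation("python-tracebacks", fallback=True).gettext

    def translated_excepthook(etype, value, tb):
        print(_("Traceback (most recent call last):"), file=sys.stderr)
        for filename, lineno, name, line in traceback.extract_tb(tb):
            print(_('  File "%s", line %d, in %s') % (filename, lineno, name),
                  file=sys.stderr)
            if line:
                print("    " + line, file=sys.stderr)
        print("%s: %s" % (etype.__name__, value), file=sys.stderr)

    sys.excepthook = translated_excepthook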
André
Here's an updated version of the PEP reflecting my
recent suggestions on how to eliminate 'codef'.
PEP: XXX
Title: Cofunctions
Version: $Revision$
Last-Modified: $Date$
Author: Gregory Ewing <greg.ewing(a)canterbury.ac.nz>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 13-Feb-2009
Python-Version: 3.x
Post-History:
Abstract
========
A syntax is proposed for defining and calling a special type of generator
called a 'cofunction'. It is designed to provide a streamlined way of
writing generator-based coroutines, and allow the early detection of
certain kinds of error that are easily made when writing such code, which
otherwise tend to cause hard-to-diagnose symptoms.
This proposal builds on the 'yield from' mechanism described in PEP 380,
and describes some of the semantics of cofunctions in terms of it. However,
it would be possible to define and implement cofunctions independently of
PEP 380 if so desired.
Specification
=============
Cofunction definitions
----------------------
A cofunction is a special kind of generator, distinguished by the presence
of the keyword ``cocall`` (defined below) at least once in its body. It may
also contain ``yield`` and/or ``yield from`` expressions, which behave as
they do in other generators.
From the outside, the distinguishing feature of a cofunction is that it cannot
be called the same way as an ordinary function. An exception is raised if an
ordinary call to a cofunction is attempted.
Cocalls
-------
Calls from one cofunction to another are made by marking the call with
a new keyword ``cocall``. The expression
::
    cocall f(*args, **kwds)
is evaluated by first checking whether the object ``f`` implements
a ``__cocall__`` method. If it does, the cocall expression is
equivalent to
::
    yield from f.__cocall__(*args, **kwds)
except that the object returned by __cocall__ is expected to be an
iterator, so the step of calling iter() on it is skipped.
If ``f`` does not have a ``__cocall__`` method, or the ``__cocall__``
method returns ``NotImplemented``, then the cocall expression is
treated as an ordinary call, and the ``__call__`` method of ``f``
is invoked.
Objects which implement __cocall__ are expected to return an object
obeying the iterator protocol. Cofunctions respond to __cocall__ the
same way as ordinary generator functions respond to __call__, i.e. by
returning a generator-iterator.
Certain objects that wrap other callable objects, notably bound methods,
will be given __cocall__ implementations that delegate to the underlying
object.
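For illustration, a pair of cofunctions under this proposal might look like
the following. This is a sketch of the intended usage only, not executable
with current Python; ``get_data`` and ``fetch`` are made-up names.
::
    def get_data(key):
        # Contains a cocall, so it is a cofunction.
        raw = cocall fetch(key)
        return raw.upper()

    def fetch(key):
        # Also a cofunction; the plain yield suspends the whole chain of
        # cocalls, just as a yield inside a 'yield from' chain would.
        reply = yield ('request', key)
        cocall print('fetched', key)    # ordinary callables may be cocalled too
        return reply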
Grammar
-------
The full syntax of a cocall expression is described by the following
grammar lines:
::
    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME
Note that this syntax allows cocalls to methods and elements of sequences
or mappings to be expressed naturally. For example, the following are valid:
::
    y = cocall self.foo(x)
    y = cocall funcdict[key](x)
    y = cocall a.b.c[i].d(x)
Also note that the final calling parentheses are mandatory, so that for example
the following is invalid syntax:
::
    y = cocall f    # INVALID
New builtins, attributes and C API functions
--------------------------------------------
To facilitate interfacing cofunctions with non-coroutine code, there will
be a built-in function ``costart`` whose definition is equivalent to
::
    def costart(obj, *args, **kwds):
        try:
            m = obj.__cocall__
        except AttributeError:
            result = NotImplemented
        else:
            result = m(*args, **kwds)
        if result is NotImplemented:
            raise TypeError("Object does not support cocall")
        return result
There will also be a corresponding C API function
::
    PyObject *PyObject_CoCall(PyObject *obj, PyObject *args, PyObject *kwds)
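For illustration, a simple non-coroutine driver built on ``costart`` might
look like this. This is a sketch only, assuming the semantics above;
``main_task`` stands for any cofunction, and the handling of yielded
requests is left abstract.
::
    def run(cofunc, *args, **kwds):
        it = costart(cofunc, *args, **kwds)
        value = None
        try:
            while True:
                request = it.send(value)   # resume the coroutine
                value = None               # a real driver would compute a result here
        except StopIteration as e:
            # Under PEP 380, the cofunction's return value travels in the
            # StopIteration raised when it finishes.
            return e.args[0] if e.args else None

    result = run(main_task)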
It is left unspecified for now whether a cofunction is a distinct type
of object or, like a generator function, is simply a specially-marked
function instance. If the latter, a read-only boolean attribute
``__iscofunction__`` should be provided to allow testing whether a given
function object is a cofunction.
Motivation and Rationale
========================
The ``yield from`` syntax is reasonably self-explanatory when used for the
purpose of delegating part of the work of a generator to another function. It
can also be used to good effect in the implementation of generator-based
coroutines, but it reads somewhat awkwardly when used for that purpose, and
tends to obscure the true intent of the code.
Furthermore, using generators as coroutines is somewhat error-prone. If one
forgets to use ``yield from`` when it should have been used, or uses it when it
shouldn't have, the symptoms that result can be extremely obscure and confusing.
Finally, sometimes there is a need for a function to be a coroutine even though
it does not yield anything, and in these cases it is necessary to resort to
kludges such as ``if 0: yield`` to force it to be a generator.
The ``cocall`` construct addresses the first issue by making the syntax directly
reflect the intent, that is, that the function being called forms part of a
coroutine.
The second issue is addressed by making it impossible to mix coroutine and
non-coroutine code in ways that don't make sense. If the rules are violated, an
exception is raised that points out exactly what and where the problem is.
Lastly, the need for dummy yields is eliminated by making it possible for a
cofunction to call both cofunctions and ordinary functions with the same syntax,
so that an ordinary function can be used in place of a cofunction that yields
zero times.
Record of Discussion
====================
An earlier version of this proposal required a special keyword ``codef`` to be
used in place of ``def`` when defining a cofunction, and disallowed calling an
ordinary function using ``cocall``. However, it became evident that these
features were not necessary, and the ``codef`` keyword was dropped in the
interests of minimising the number of new keywords required.
The use of a decorator instead of ``codef`` was also suggested, but the current
proposal makes this unnecessary as well.
It has been questioned whether some combination of decorators and functions
could be used instead of a dedicated ``cocall`` syntax. While this might be
possible, to achieve equivalent error-detecting power it would be necessary
to write cofunction calls as something like
::
    yield from cocall(f)(args)
making them even more verbose and inelegant than an unadorned ``yield from``.
It is also not clear whether it is possible to achieve all of the benefits of
the cocall syntax using this kind of approach.
Prototype Implementation
========================
An implementation of an earlier version of this proposal in the form of patches
to Python 3.1.2 can be found here:
http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/cofunctions.h…
If this version of the proposal is received favourably, the implementation will
be updated to match.
Copyright
=========
This document has been placed in the public domain.
..
Local Variables:
mode: indented-text
indent-tabs-mode: nil
sentence-end-double-space: t
fill-column: 70
coding: utf-8
End:
Hello,
Python 3 has removed callable() under the justification that it's not
very useful and that duck typing (EAFP) should be used instead. However,
it has since been felt by many people that it was an annoying loss;
there are situations where you truly want to know whether something is a
callable without actually calling it (for example when writing
sophisticated decorators, or simply when you want to inform the user
of an API misuse).
The substitute of writing `isinstance(x, collections.Callable)` is
not good: 1) it's wordier, and 2) collections is really not
an intuitive place to look for a Callable ABC.
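For example (register_handler is a made-up function, just to illustrate the
difference in readability):

    import collections

    def register_handler(name, handler):
        # What one has to write today:
        if not isinstance(handler, collections.Callable):
            raise TypeError("handler for %r must be callable" % name)
        # What one could write with the builtin back:
        #     if not callable(handler):
        #         raise TypeError("handler for %r must be callable" % name)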
So, I would advocate bringing back the callable() builtin, which was
easy to use, helpful and semantically sane.
Regards
Antoine.
[Changed subject]
> On 2010-10-25 04:37, Guido van Rossum wrote:
>> This should not require threads.
>>
>> Here's a bare-bones sketch using generators:
[...]
On Mon, Oct 25, 2010 at 3:19 AM, Jacob Holm <jh(a)improva.dk> wrote:
> If you don't care about allowing the funcs to raise StopIteration, this
> can actually be simplified to:
[...]
Indeed, I realized this after posting. :-) I had several other ideas
for improvements, e.g. being able to pass an initial value to the
reduce-like function or even being able to supply a reduce-like
function of one's own.
> More interesting (to me at least) is that this is an excellent example
> of why I would like to see a version of PEP380 where "close" on a
> generator can return a value (AFAICT the version of PEP380 on
> http://www.python.org/dev/peps/pep-0380 is not up-to-date and does not
> mention this possibility, or even link to the heated discussion we had
> on python-ideas around march/april 2009).
Can you dig up the link here?
I recall that discussion but I don't recall a clear conclusion coming
from it -- just heated debate.
Based on my example I have to agree that returning a value from
close() would be nice. There is one small detail, namely how multiple
arguments to StopIteration should be interpreted, but that's not so
important if it's being raised by a return statement.
> Assuming that "close" on a reduce_collector generator instance returns
> the value of the StopIteration raised by the "return" statements, we can
> simplify the code even further:
>
>
> def reduce_collector(func):
>     try:
>         outcome = yield
>     except GeneratorExit:
>         return None
>     while True:
>         try:
>             val = yield
>         except GeneratorExit:
>             return outcome
>         outcome = func(outcome, val)
>
> def parallel_reduce(iterable, funcs):
>     collectors = [reduce_collector(func) for func in funcs]
>     for coll in collectors:
>         next(coll)
>     for val in iterable:
>         for coll in collectors:
>             coll.send(val)
>     return [coll.close() for coll in collectors]
>
>
> Yes, this is only saving a few lines, but I find it *much* more readable...
I totally agree that not having to call throw() and catch whatever it
bounces back is much nicer. (Now I wish there was a way to avoid the
"try..except GeneratorExit" construct in the generator, but I think I
should stop while I'm ahead. :-)
The interesting thing is that I've been dealing with generators used
as coroutines or tasks intensely on and off since July, and I haven't
had a single need for any of the three patterns that this example
happened to demonstrate:
- the need to "prime" the generator in a separate step
- throwing and catching GeneratorExit
- getting a value from close()
(I did have a lot of use for send(), throw(), and extracting a value
from StopIteration.)
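For concreteness, the last of those looks roughly like this today (a generic
sketch, not code from my actual tasks):

    def averager():
        total = n = 0
        while True:
            x = yield
            if x is None:
                # With PEP 380 this would simply be 'return total / n'.
                raise StopIteration(total / n)
            total += x
            n += 1

    avg = averager()
    next(avg)            # prime the generator
    avg.send(10)
    avg.send(20)
    try:
        avg.send(None)
    except StopIteration as e:
        print(e.args[0])   # 15.0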
In my context, generators are used to emulate concurrently running
tasks, and "yield" is always used to mean "block until this piece of
async I/O is complete, and wake me up with the result". This is
similar to the "classic" trampoline code found in PEP 342.
In fact, when I wrote the example for this thread, I fumbled a bit
because the use of generators there is different from how I had been
using them (though it was no doubt thanks to having worked with them
so intensely that I came up with the example quickly).
So, it is clear that generators are extremely versatile, and PEP 380
deserves several good use cases to explain all the API subtleties.
BTW, while I have you, what do you think of Greg's "cofunctions" proposal?
--
--Guido van Rossum (python.org/~guido)
Hello,
This morning I tried to fix an issue for a while before I realized I
had a circular import. This kind of issue is not obvious, because you
get a cryptic error such as an AttributeError, and it can take a while
to figure out what is going on.
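For example (the module names are made up), the classic symptom looks like this:

    # a.py
    import b              # executed before from_a is defined below

    def from_a():
        return "a"

    # b.py
    import a
    value = a.from_a()    # runs while 'a' is still only half-initialised

    # >>> import a
    # AttributeError: 'module' object has no attribute 'from_a'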
I don't know if this was mentioned before, or how hard it would be,
but it would be nice if Python had a specific "CircularImportError"
raised in that case, or something similar.
That would be a fabulous hint for developers.
Cheers
Tarek
--
Tarek Ziadé | http://ziade.org
`issubclass(1, list)` raises an exception, complaining that `1` is not a
class. This is wrong in my opinion; it should just return False.
Use case: I have an object which can be either a list, or a string, or a
callable, or a type. And I want to check whether it's a sub-class of some
base class.
So I don't think I should have to take extra precautions before using
`issubclass`: if my object is not a subclass of the given base class, I
should just get `False`.
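Right now I have to wrap the check myself, something like this (a small
sketch of the workaround, not a proposed API):

    def is_subclass_or_false(obj, base):
        # What issubclass could arguably do by itself:
        return isinstance(obj, type) and issubclass(obj, base)

    print(is_subclass_or_false(list, object))   # True
    print(is_subclass_or_false(1, list))        # False, instead of a TypeError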
Ram.
Let's recall Guido's old Computer Programming for Everybody (CP4E) proposal.
Now that Python is established, it's high time to push Python
into education, especially first-programming-language education. I
think in the modern world that means pre-school.
Today most of the world's children don't learn English
before school, so we need a truly localized Python.
Some might recall a Python derivative demo with unicode variable names
(link anyone?).
I think we ought to go further. For example, consider an imaginary
language, pig latin:
"""This does that""" --> """Thiso acto thato""" # docstrings
__version__ = (1,2,3) --> __versio__ = (1,2,3) # variable names
import time --> importo chrono # standard module names
def foo(): pass --> defo foo(): passo # Python keywords
"foo".upper() --> "foo".uppero() # standard library
raise Xx("undefined") --> raisio Xx("indifinito") # errors
#!/usr/bin/python --> #!/usr/bin/pythono # executable name
#!/usr/bin/python --> #!/usero/binaro/pythono # name and path
Of course there are concerns for many languages:
- each language needs to establish stable translations for keywords,
  basic types, standard modules, methods in standard modules, etc.
- some languages don't natively separate words with spaces
- some languages have different punctuation rules, e.g. a comma for the decimal point
- some languages use different quote characters
- RTL languages spell words RTL yet (some/all?) spell numbers LTR
- hopefully no one has to recreate a 10,000-grouping separator system ;-)
Anyhow, it is not core Python's job to support particular languages.
What is needed is:
- the recognition that this is needed, and
- a base from which a particular localization can evolve
Here is a fun example of what Python might look like in
google-translate simplified Chinese. Blame Google, not me, as I know
very little about this language.
"""This does that""" --> """这是""" # docstrings
__version__ = (1,2,3) --> __版本__ = (1,2,3) # variable names
import time --> 进口 时间 # standard module names
def foo(): pass --> 业 美孚(): 通过 # Python keywords
"foo".upper() --> “富” 上层() # standard library
raise Xx("undefined") --> 提高。二十(“未定义”) # errors
#!/usr/bin/python --> #!/usr/bin/蛇 # executable name
#!/usr/bin/python --> #!/用户/二进制/蛇 # name and path
I'm tracking this here and will update it with the feedback received:
http://pythonic-wisdom.blogspot.com/2010/11/truly-international-python.html
Perhaps you can clarify what exactly you want to do.
I can see at least 2 distinct cases:

1. Multithreaded web server (or even browser)
   * interpreters need separate imports
     - at least pure-Python modules should be loadable and unloadable
     - different versions of the same Python module could be used by
       different interpreters
     - if different versions of C extensions are needed, different
       dynamic loading might be needed
     - btw, is it still the case that C extensions cannot be unloaded?
   * interpreters need separate memory regions,
     so that individual interpreters can be killed quickly

2. Long-running process that executes many small independent user
   scripts (e.g. a phone)
   * memory could be shared as long as cross-references are forbidden;
     the garbage collector hopefully kills circular references after an
     interpreter is terminated
   * a long-running process might want to reload a C extension, ouch
   * a possible workaround would be to execute only one interpreter at a
     time, somehow pickling user script state?
Hi there.
Some months back I mentioned that I had other stuff in store for GC.
Here is an idea for you. This particular idea is just a generalization of a system we've used in EVE for years now: garbage collection callbacks.
The need, originally, was to be able to quantify the time spent in garbage collection runs. Since these runs occur outside the direct control of the application, we needed to have Python tell us about them somehow.
We added gc.register_callback(), a function that adds a callable to an internal list of functions that get called on two occasions:
1) when a garbage collection run is about to start, and
2) when a garbage collection run has finished.
The two cases are distinguished using an integer argument. The callbacks are invoked from gc with gc inhibited from re-entry, so that the callbacks cannot themselves cause another gc run to commence.
What we traditionally use this for is to start and stop a performance timer and other stats.
More recently, though, we have found another very important use for this. When gc finds uncollectable objects, they are put in the gc.garbage list, which then needs to be handled by the application. However, there is no particularly good way for the application to do this, except to check the list periodically.
With the above callback, modules that create uncollectable objects, such as classes with __del__ methods, can register their own callback. At the end of a gc run, they can then walk gc.garbage and take appropriate action for the objects they recognize.
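To make that concrete, usage might look something like this (a sketch
against the proposed API; gc.register_callback and the integer phase values
are hypothetical and don't exist in today's gc module):

    import gc
    import time

    GC_STARTING, GC_FINISHED = 0, 1   # hypothetical phase values
    _start = [0.0]

    def on_gc(phase):
        if phase == GC_STARTING:
            _start[0] = time.time()
        else:
            print("gc run took %.6fs, %d uncollectable objects"
                  % (time.time() - _start[0], len(gc.garbage)))
            # A module could walk gc.garbage here, clean up the
            # uncollectable objects it recognizes, and remove them.

    gc.register_callback(on_gc)   # hypothetical registration call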
So, what do you think? This is a very simple addition to gc, orthogonal to everything and easily implemented. I also think it is very useful.
K