[pypy-dev] RFC: draft idea for making for loops automatically close iterators
hubo
hubo at jiedaibao.com
Mon Oct 24 00:03:19 EDT 2016
Well, the code doesn't appear to be broken; in fact it has been working well for several months with PyPy. It only means the design is broken - there are cases where iterators are not closed correctly in certain circumstances, which leads to unpredictable behavior. That might not be critical for small scripts, but it can be quite critical for services that must keep running for a long time, which is what PyPy is for. Files may not be the most critical problem; the real problem is LOCK - when you use with on a lock, there are cases where it never unlocks.
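To make the lock case concrete, here is a minimal sketch (the names and scenario are illustrative, not taken from real code): a generator holds a lock through a with block, and the caller abandons it early.

    import threading

    lock = threading.Lock()

    def guarded_items(items):
        # The lock is released only when the with block exits, i.e. when the
        # generator is exhausted, explicitly closed, or finalized by the GC.
        with lock:
            for item in items:
                yield item

    def first_item(items):
        # Abandons the generator after one item. CPython's ref-counting
        # finalizes it immediately and releases the lock; on PyPy the
        # generator may stay alive until the next GC cycle, keeping the
        # lock acquired.
        for item in guarded_items(items):
            return item

    print(first_item([1, 2, 3]))  # 1
    print(lock.locked())          # False on CPython; may still be True on PyPy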
As far as I know, quite a lot of software uses generators for network programming, because it is convenient to process callbacks with generators, and it is not unusual to "call" another generator method or recurse into itself. When the connection is suddenly shut down, the connection manager closes the generator - but not the generators called inside it.
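A sketch of that situation with made-up names: the outer generator drives an inner one by hand (the pre-"yield from" style), so close() on the outer raises GeneratorExit at its own yield point but never reaches the inner generator.

    def read_packets(conn):
        # Inner generator: imagine it holds a buffer, a file or a lock.
        try:
            while True:
                yield ("packet from", conn)
        finally:
            print("inner cleanup ran")

    def handle_connection(conn):
        # Outer generator drives the inner one manually.
        inner = read_packets(conn)
        for packet in inner:
            yield packet

    gen = handle_connection("conn-1")
    next(gen)
    gen.close()
    # Only the outer generator was closed. The inner one is left to the GC:
    # CPython's ref-counting runs its finally block right away, while PyPy
    # may not run it until much later.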
Python 3 uses this as the standard programming model of asyncio. It may also suffer, but not as much, because yield from closes the inner iterator automatically: it re-raises the exception (GeneratorExit) inside the sub-generator.
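For contrast, a small sketch of the same scenario using yield from (PEP 380 delegation): close() is forwarded to the sub-generator, so the inner finally clause runs deterministically on any implementation.

    def read_packets(conn):
        try:
            while True:
                yield ("packet from", conn)
        finally:
            print("inner cleanup ran")

    def handle_connection(conn):
        # PEP 380 delegation: close(), throw() and GeneratorExit are
        # forwarded to the sub-generator.
        yield from read_packets(conn)

    gen = handle_connection("conn-1")
    next(gen)
    gen.close()  # prints "inner cleanup ran" on CPython and PyPy alike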
2016-10-24
hubo
From: Steven D'Aprano <steve at pearwood.info>
Date: 2016-10-22 07:13
Subject: Re: [pypy-dev] RFC: draft idea for making for loops automatically close iterators
To: "pypy-dev" <pypy-dev at python.org>
Cc:
On Fri, Oct 21, 2016 at 10:13:45PM +0800, hubo wrote:
> Well I'm really shocked to find out that what I thought was an "automatic
> close" is really the ref-counting GC of CPython, which means that a lot of my
> code breaks in PyPy...
But does it really?
If you've run your code in PyPy, and it obviously, clearly breaks, then
why are you so shocked? You should have already known this. (Unless this
is your first time running your code under PyPy.)
But if your code runs under PyPy, with no crashes, no exceptions, no
failures caused by running out of file descriptors... then you can't
really say your code is broken. What does it matter if your application
doesn't close the files until exit, if you only open three files and the
application never runs for more than two seconds?
I'd like to get a good idea of how often this is an actual problem,
causing scripts and applications to fail when run in PyPy. Actual
failures, not just wasting a file descriptor or three.
> I'm wondering, since a ref-counting GC implementation is not possible for
> PyPy, is it possible to hack on the for loop to make it "try to"
> collect the generator? That may really save a lot of lives.
Saving lives? That's a bit of an exaggeration, isn't it?
There is a big discussion going on over on the Python-Ideas mailing
list, and exaggerated, over-the-top responses aren't going to help this
proposal's case. Already people have said this issue is only a problem
for PyPy, so it's PyPy's problem to fix.
--
Steve
_______________________________________________
pypy-dev mailing list
pypy-dev at python.org
https://mail.python.org/mailman/listinfo/pypy-dev