From greg.ewing at canterbury.ac.nz  Fri May  1 01:24:51 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 01 May 2015 11:24:51 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <55425237.1080700@gmail.com>
References: <554185C2.5080003@gmail.com> <5541F2DB.5000201@canterbury.ac.nz>
 <55425237.1080700@gmail.com>
Message-ID: <5542B9C3.2080203@canterbury.ac.nz>

Yury Selivanov wrote:
> Well, using next() and iter() on coroutines in asyncio
> code is something esoteric.  I can't even imagine
> why you would want to do that.

I'm talking about the fact that existing generator-
based coroutines that aren't decorated with
@coroutine won't be able to call new ones that use
async def.

This means that converting one body of code to the
new style can force changes in other code that
interacts with it.

Maybe this is not considered a problem, but as long
as it's true, I don't think it's accurate to claim
"full backwards compatibility".

-- 
Greg

From yselivanov.ml at gmail.com  Fri May  1 01:40:26 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 30 Apr 2015 19:40:26 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5542B9C3.2080203@canterbury.ac.nz>
References: <554185C2.5080003@gmail.com> <5541F2DB.5000201@canterbury.ac.nz>
 <55425237.1080700@gmail.com> <5542B9C3.2080203@canterbury.ac.nz>
Message-ID: <5542BD6A.3010505@gmail.com>

On 2015-04-30 7:24 PM, Greg Ewing wrote:
> Yury Selivanov wrote:
>> Well, using next() and iter() on coroutines in asyncio
>> code is something esoteric.  I can't even imagine
>> why you would want to do that.
>
> I'm talking about the fact that existing generator-
> based coroutines that aren't decorated with
> @coroutine won't be able to call new ones that use
> async def.


Ah, alright.

You quoted this:

     3. CO_NATIVE_COROUTINE flag. This enables us to
     disable __iter__ and __next__ on native coroutines
     while maintaining full backwards compatibility.

I wrote "full backwards compatibility" for that
particular point #3 -- existing @asyncio.coroutines
will have __iter__ and __next__ working just fine.

Sorry if this was misleading.
>
> This means that converting one body of code to the
> new style can force changes in other code that
> interacts with it.
>
> Maybe this is not considered a problem, but as long
> as it's true, I don't think it's accurate to claim
> "full backwards compatibility".
>

I covered this in point #4.  I also touched on this in
https://www.python.org/dev/peps/pep-0492/#migration-strategy


I'm still waiting for feedback on this from Guido.  If
he decides to go with RuntimeWarnings, then it's 100%
backwards compatible.  If we keep TypeErrors --
then *existing code will work on 3.5*, but something
*might* break during adopting new syntax.  I'll update
the Backwards Compatibility section.


Thanks,
Yury

From guido at python.org  Fri May  1 01:49:21 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 16:49:21 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5542B9C3.2080203@canterbury.ac.nz>
References: <554185C2.5080003@gmail.com> <5541F2DB.5000201@canterbury.ac.nz>
 <55425237.1080700@gmail.com> <5542B9C3.2080203@canterbury.ac.nz>
Message-ID: <CAP7+vJJU6+HhY9M5e8E0sgWM2rzZZ+UWY0G81drrr1V=38dP5A@mail.gmail.com>

On Thu, Apr 30, 2015 at 4:24 PM, Greg Ewing <greg.ewing at canterbury.ac.nz>
wrote:

> Yury Selivanov wrote:
>
>> Well, using next() and iter() on coroutines in asyncio
>> code is something esoteric.  I can't even imagine
>> why you would want to do that.
>>
>
> I'm talking about the fact that existing generator-
> based coroutines that aren't decorated with
> @coroutine won't be able to call new ones that use
> async def.
>
> This means that converting one body of code to the
> new style can force changes in other code that
> interacts with it.
>
> Maybe this is not considered a problem, but as long
> as it's true, I don't think it's accurate to claim
> "full backwards compatibility".
>

Greg, you seem to have an odd notion of "full backwards compatibility". The
term means that old code won't break. It doesn't imply that old and new
code can always seamlessly interact (that would be an impossibly high bar
for almost any change).

That said, interoperability between old code and new code is an area of
interest. But if the only thing that's standing between old code and new
code is the @coroutine decorator, things are looking pretty good -- that
decorator is already strongly required for coroutines intended for use with
the asyncio package, and older versions of the asyncio package also define
that decorator, so if there's old code out there that needs to be able to
call the new coroutines (by whatever name, e.g. async functions :-), adding
the @coroutine decorator to the old code doesn't look like too much of a
burden.

I assume there might be code out there that uses yield-from-based
coroutines but does not use the asyncio package, but I doubt there is much
code like that (I haven't seen much mention of yield-from outside its use
in asyncio). So I think the interop problem is mostly limited to
asyncio-using code that plays loose with the @coroutine decorator
requirement and now wants to work with the new async functions. That's easy
enough to address.
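
Concretely, the fix is one decorator. Here's a sketch using
types.coroutine (the low-level flag-setting decorator; @asyncio.coroutine
layers on top of it), with made-up names:

```python
import types

async def new_style():
    return 42

@types.coroutine              # stand-in for @asyncio.coroutine in this sketch
def old_style():
    # Now legal: the decorator flags this generator as a coroutine,
    # so it may delegate to a native coroutine with yield from.
    result = yield from new_style()
    return result

def run(coro):
    """Drive a coroutine to completion by hand."""
    try:
        coro.send(None)
    except StopIteration as exc:
        return exc.value

print(run(old_style()))       # -> 42
```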

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150430/0676c0d4/attachment.html>

From greg.ewing at canterbury.ac.nz  Fri May  1 02:52:55 2015
From: greg.ewing at canterbury.ac.nz (Greg)
Date: Fri, 01 May 2015 12:52:55 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CAP7+vJLnW=uM+9n7er31uSk0QYgU3tYnEG=8=sZrXZBdiM0Bsw@mail.gmail.com>
References: <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
 <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
 <20150430161541.GG10248@stoneleaf.us>
 <CAP7+vJLnW=uM+9n7er31uSk0QYgU3tYnEG=8=sZrXZBdiM0Bsw@mail.gmail.com>
Message-ID: <5542CE67.8060501@canterbury.ac.nz>

On 1/05/2015 5:38 a.m., Guido van Rossum wrote:
> you can write "not -x" but you can't write "- not x".

That seems just as arbitrary and unintuitive, though.

There are some other unintuitive consequences as well, e.g.
you can write

    not a + b

but it's not immediately obvious that this is parsed as
'not (a + b)' rather than '(not a) + b'.
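
The parse is easy to check with the ast module:

```python
import ast

tree = ast.parse("not a + b", mode="eval")

# The top-level node is the 'not', wrapping the addition:
assert isinstance(tree.body, ast.UnaryOp)
assert isinstance(tree.body.op, ast.Not)
assert isinstance(tree.body.operand, ast.BinOp)   # i.e. not (a + b)
```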

The presence of one arbitrary and unintuitive thing in the
grammar is not by itself a justification for adding another one.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri May  1 03:13:20 2015
From: greg.ewing at canterbury.ac.nz (Greg)
Date: Fri, 01 May 2015 13:13:20 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55426ECA.6040403@gmail.com>
References: <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
 <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
 <20150430161541.GG10248@stoneleaf.us>
 <CAP7+vJLnW=uM+9n7er31uSk0QYgU3tYnEG=8=sZrXZBdiM0Bsw@mail.gmail.com>
 <20150430175634.GH10248@stoneleaf.us> <55426ECA.6040403@gmail.com>
Message-ID: <5542D330.90404@canterbury.ac.nz>

On 1/05/2015 6:04 a.m., Yury Selivanov wrote:

> I still want to see where my current grammar forces you to use
> parens.  See [1], there are no useless parens anywhere.

It's not about requiring or not requiring parens. It's about
making the simplest possible change to the grammar necessary
to achieve the desired goals. Keeping the grammar simple
makes it easy for humans to reason about.

The question is whether syntactically disallowing certain
constructs that are unlikely to be needed is a desirable
enough goal to be worth complicating the grammar. You think
it is, some others of us think it's not.

> FWIW, I'll fix the 'await (await x)' expression to be parsed
> without parens.

I don't particularly care whether 'await -x' or 'await await x'
can be written without parens or not. The point is that the
simplest grammar change necessary to be able to write the
things we *do* want also happens to allow those. I don't see
that as a problem worth worrying about.

-- 
Greg


From jeanpierreda at gmail.com  Fri May  1 03:56:21 2015
From: jeanpierreda at gmail.com (Devin Jeanpierre)
Date: Thu, 30 Apr 2015 18:56:21 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5542D330.90404@canterbury.ac.nz>
References: <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
 <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
 <20150430161541.GG10248@stoneleaf.us>
 <CAP7+vJLnW=uM+9n7er31uSk0QYgU3tYnEG=8=sZrXZBdiM0Bsw@mail.gmail.com>
 <20150430175634.GH10248@stoneleaf.us> <55426ECA.6040403@gmail.com>
 <5542D330.90404@canterbury.ac.nz>
Message-ID: <CABicbJLC7r8-MxNu8DNmriR6wb+Cx2ZrM0mm7YabzR83DRc0Lg@mail.gmail.com>

On Thu, Apr 30, 2015 at 6:13 PM, Greg <greg.ewing at canterbury.ac.nz> wrote:
> It's not about requiring or not requiring parens. It's about
> making the simplest possible change to the grammar necessary
> to achieve the desired goals. Keeping the grammar simple
> makes it easy for humans to reason about.
>
> The question is whether syntactically disallowing certain
> constructs that are unlikely to be needed is a desirable
> enough goal to be worth complicating the grammar. You think
> it is, some others of us think it's not.

+1. It seems weird to add a whole new precedence level when an
existing one works fine. Accidentally negating a future/deferred is
not a significant source of errors, so I don't get why that would be a
justifying example.

-- Devin

From guido at python.org  Fri May  1 05:09:41 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 20:09:41 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CABicbJLC7r8-MxNu8DNmriR6wb+Cx2ZrM0mm7YabzR83DRc0Lg@mail.gmail.com>
References: <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
 <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
 <20150430161541.GG10248@stoneleaf.us>
 <CAP7+vJLnW=uM+9n7er31uSk0QYgU3tYnEG=8=sZrXZBdiM0Bsw@mail.gmail.com>
 <20150430175634.GH10248@stoneleaf.us> <55426ECA.6040403@gmail.com>
 <5542D330.90404@canterbury.ac.nz>
 <CABicbJLC7r8-MxNu8DNmriR6wb+Cx2ZrM0mm7YabzR83DRc0Lg@mail.gmail.com>
Message-ID: <CAP7+vJ+idmZxEy8c3xLvsStnuWo2bAHAkgfq=kt8fuYz8YP_Vg@mail.gmail.com>

On Thu, Apr 30, 2015 at 6:56 PM, Devin Jeanpierre <jeanpierreda at gmail.com>
wrote:

> On Thu, Apr 30, 2015 at 6:13 PM, Greg <greg.ewing at canterbury.ac.nz> wrote:
> > It's not about requiring or not requiring parens. It's about
> > making the simplest possible change to the grammar necessary
> > to achieve the desired goals. Keeping the grammar simple
> > makes it easy for humans to reason about.
> >
> > The question is whether syntactically disallowing certain
> > constructs that are unlikely to be needed is a desirable
> > enough goal to be worth complicating the grammar. You think
> > it is, some others of us think it's not.
>
> +1. It seems weird to add a whole new precedence level when an
> existing one works fine. Accidentally negating a future/deferred is
> not a significant source of errors, so I don't get why that would be a
> justifying example.
>

You can call me weird, but I *like* fine-tuning operator binding rules to
suit my intuition for an operator. 'await' is not arithmetic, so I don't
see why it should be lumped in with '-'. It's not like the proposed grammar
change introducing 'await' is earth-shattering in complexity.

-- 
--Guido van Rossum (python.org/~guido)

From njs at pobox.com  Fri May  1 05:30:05 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 30 Apr 2015 20:30:05 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5541EE2A.1070109@canterbury.ac.nz>
References: <CAP7+vJK6GQ=rJ__KeE=H1cAES3gThdg4okmq+iFFSvE6SushBw@mail.gmail.com>
 <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <CAPJVwBm3E_oY12VbuBOMApSouVuMbAjqkN+sMoi3tSVYwt_Sgw@mail.gmail.com>
 <55416DBF.3020303@gmail.com>
 <CAPJVwBkhHMzxcECaEskU3TqhKPm5ws0JiAd9s_pPtehDtxJbVw@mail.gmail.com>
 <5541EE2A.1070109@canterbury.ac.nz>
Message-ID: <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>

On Apr 30, 2015 1:57 AM, "Greg Ewing" <greg.ewing at canterbury.ac.nz> wrote:
>
> Nathaniel Smith wrote:
>>
>> Even if we put aside our trained intuitions about arithmetic, I think
>> it's correct to say that the way unary minus is parsed is: everything
>> to the right of it that has a tighter precedence gets collected up and
>> parsed as an expression, and then it takes that expression as its
>> argument.
>
>
> Tighter or equal, actually: '--a' is allowed.
>
> This explains why Yury's syntax disallows 'await -f'.
> The 'await' operator requires something after it, but
> there's *nothing* between it and the following '-',
> which binds less tightly.
>
> So it's understandable, but you have to think a bit
> harder.
>
> Why do we have to think harder? I suspect it's because
> the notion of precedence is normally introduced to resolve
> ambiguities. Knowing that infix '*' has higher precedence
> than infix '+' tells us that 'a + b * c' is parsed as
> 'a + (b * c)' and not '(a + b) * c'.
>
> Similarly, knowing that infix '.' has higher precedence
> than prefix '-' tells us that '-a.b' is parsed as
> '-(a.b)' rather than '(-a).b'.
>
> However, giving prefix 'await' higher precedence than
> prefix '-' doesn't serve to resolve any ambiguity.
> '- await f' is parsed as '-(await f)' either way, and
> 'await f + g' is parsed as '(await f) + g' either way.
>
> So when we see 'await -f', we think we already know
> what it means. There is only one possible order for
> the operations, so it doesn't look as though precedence
> comes into it at all, and we don't consider it when
> judging whether it's a valid expression.

The other reason this threw me is that I've recently been spending time
with a shunting yard parser, and in shunting yard parsers unary prefix
operators just work in the expected way (their precedence only affects
their interaction with later binary operators; a chain of unaries is always
allowed). It's just a limitation of the parser generator tech that Python
uses that it can't handle unary operators in the natural fashion. (OTOH it
can handle lots of cases that shunting yard parsers can't -- I'm not
criticizing python's choice of parser.) Once I read the new "documentation
grammar" this became much clearer.

> What's the conclusion from all this? I think it's
> that using precedence purely to disallow certain
> constructs, rather than to resolve ambiguities, leads
> to a grammar with less-than-intuitive characteristics.

The actual effect of making "await" a different precedence is to resolve
the ambiguity in
  await x ** 2

If await acted like -, then this would be
  await (x ** 2)
But with the proposed grammar, it's instead
  (await x) ** 2
Which is probably correct, and produces the IMHO rather nice invariant that
"await" binds more tightly than arithmetic in general (instead of having to
say that it binds more tightly than arithmetic *except* in this one corner
case...).

But then given the limitations of Python's parser plus the desire to
disambiguate the expression above in the given way, it becomes an arguably
regrettable, yet inevitable, consequence that
  await -fut
  await +fut
  await ~fut
become parse errors.

AFAICT these and the ** case are the only expressions where there's any
difference between Yury's proposed grammar and your proposal of treating
await like unary minus.
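
Both claims are easy to verify on a 3.5+ interpreter with the ast module:

```python
import ast

# 'await x ** 2' parses as '(await x) ** 2': the ** node is on top,
# with the await as its left operand.
tree = ast.parse("async def f():\n    return await x ** 2")
ret = tree.body[0].body[0].value
assert isinstance(ret, ast.BinOp) and isinstance(ret.op, ast.Pow)
assert isinstance(ret.left, ast.Await)

# 'await -fut' is rejected outright:
try:
    ast.parse("async def f():\n    await -fut")
except SyntaxError:
    print("SyntaxError, as expected")
```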

-n

From guido at python.org  Fri May  1 05:40:03 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 20:40:03 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
References: <CAP7+vJK6GQ=rJ__KeE=H1cAES3gThdg4okmq+iFFSvE6SushBw@mail.gmail.com>
 <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz>
 <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us>
 <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com>
 <55415F5F.3090101@canterbury.ac.nz>
 <CAPJVwBm3E_oY12VbuBOMApSouVuMbAjqkN+sMoi3tSVYwt_Sgw@mail.gmail.com>
 <55416DBF.3020303@gmail.com>
 <CAPJVwBkhHMzxcECaEskU3TqhKPm5ws0JiAd9s_pPtehDtxJbVw@mail.gmail.com>
 <5541EE2A.1070109@canterbury.ac.nz>
 <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
Message-ID: <CAP7+vJ+drN5ysFWrcAGZ-iEMR7Q+NsZmsfdzC15T7zi_MwD8pw@mail.gmail.com>

On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith <njs at pobox.com> wrote:

> The actual effect of making "await" a different precedence is to resolve
> the ambiguity in
>
>   await x ** 2
>
> If await acted like -, then this would be
>   await (x ** 2)
> But with the proposed grammar, it's instead
>   (await x) ** 2
> Which is probably correct, and produces the IMHO rather nice invariant
> that "await" binds more tightly than arithmetic in general (instead of
> having to say that it binds more tightly than arithmetic *except* in this
> one corner case...)
>
Correct.

> AFAICT these and the ** case are the only expressions where there's any
> difference between Yury's proposed grammar and your proposal of treating
> await like unary minus. But then given the limitations of Python's parser
> plus the desire to disambiguate the expression above in the given way, it
> becomes an arguably regrettable, yet inevitable, consequence that
>   await -fut
>   await +fut
>   await ~fut
> become parse errors.
>
 Why is that regrettable? Do you have a plan for overloading one of those
on Futures? I personally consider it a feature that you can't do that. :-)

-- 
--Guido van Rossum (python.org/~guido)

From njs at pobox.com  Fri May  1 05:57:12 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 30 Apr 2015 20:57:12 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CAP7+vJ+drN5ysFWrcAGZ-iEMR7Q+NsZmsfdzC15T7zi_MwD8pw@mail.gmail.com>
References: <CAP7+vJK6GQ=rJ__KeE=H1cAES3gThdg4okmq+iFFSvE6SushBw@mail.gmail.com>
 <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <CAPJVwBm3E_oY12VbuBOMApSouVuMbAjqkN+sMoi3tSVYwt_Sgw@mail.gmail.com>
 <55416DBF.3020303@gmail.com>
 <CAPJVwBkhHMzxcECaEskU3TqhKPm5ws0JiAd9s_pPtehDtxJbVw@mail.gmail.com>
 <5541EE2A.1070109@canterbury.ac.nz>
 <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
 <CAP7+vJ+drN5ysFWrcAGZ-iEMR7Q+NsZmsfdzC15T7zi_MwD8pw@mail.gmail.com>
Message-ID: <CAPJVwBn-ouipE0cW61Lw_9VNOPDbSqwzdmuC1L-0U-iLegizdQ@mail.gmail.com>

On Apr 30, 2015 8:40 PM, "Guido van Rossum" <guido at python.org> wrote:
>
> On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith <njs at pobox.com> wrote:
>>
>> The actual effect of making "await" a different precedence is to resolve
the ambiguity in
>>
>>   await x ** 2
>>
>> If await acted like -, then this would be
>>   await (x ** 2)
>> But with the proposed grammar, it's instead
>>   (await x) ** 2
>> Which is probably correct, and produces the IMHO rather nice invariant
that "await" binds more tightly than arithmetic in general (instead of
having to say that it binds more tightly than arithmetic *except* in this
one corner case...)
>
> Correct.
>>
>> AFAICT these and the ** case are the only expressions where there's any
difference between Yury's proposed grammar and your proposal of treating
await like unary minus. But then given the limitations of Python's parser
plus the desire to disambiguate the expression above in the given way, it
becomes an arguably regrettable, yet inevitable, consequence that
>>
>>   await -fut
>>   await +fut
>>   await ~fut
>> become parse errors.
>
>  Why is that regrettable? Do you have a plan for overloading one of those
on Futures? I personally consider it a feature that you can't do that. :-)

I didn't say it was regrettable, I said it was arguably regrettable. For
proof, see the last week of python-dev ;-).

(I guess all else being equal it would be nice if unary operators could
stack arbitrarily, since that really is the more natural parse rule IMO and
also if things had worked that way then I would have spent this thread less
confused. But this is a pure argument from elegance. In practice there's
obviously no good reason to write "await -fut" or "-not x", so meh,
whatever.)

-n

From stephen at xemacs.org  Fri May  1 06:14:02 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Fri, 01 May 2015 13:14:02 +0900
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
Message-ID: <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>

Adam Bartoš writes:

 > Unfortunately, it doesn't work. With PYTHONIOENCODING=utf-8, the
 > sys.std* streams are created with utf-8 encoding (which doesn't
 > help on Windows since they still don't use ReadConsoleW and
 > WriteConsoleW to communicate with the terminal) and after changing
 > the sys.std* streams to the fixed ones and setting readline hook,
 > it still doesn't work,

I don't see why you would expect it to work: either your code is
bypassing PYTHONIOENCODING=utf-8 processing, and that variable doesn't
matter, or you're feeding already decoded text *as UTF-8* to your
module which evidently expects something else (UTF-16LE?).

 > so presumably the PyCF_SOURCE_IS_UTF8 is still not set.

I don't think that flag does what you think it does.  AFAICT from
looking at the source, that flag gets unconditionally set in the
execution context for compile, eval, and exec, and it is checked in
the parser when creating an AST node.  So it looks to me like it
asserts that the *internal* representation of the program is UTF-8
*after* transforming the input to an internal representation (doing
charset decoding, removing comments and line continuations, etc).

 > > Regarding your environment, the repeated use of "custom" is a red
 > > flag.  Unless you bundle your whole environment with the code you
 > > distribute, Python can know nothing about that.  In general, Python
 > > doesn't know what encoding it is receiving text in.
 > 
 > Well, the received text comes from sys.stdin and its encoding is
 > known.

How?  You keep asserting this.  *You* know, but how are you passing
that information to *the Python interpreter*?  Guido may have a time
machine, but nobody claims the Python interpreter is telepathic.

 > Ideally, Python would receive the text as a Unicode string object so
 > there would be no problem with encoding

Forget "ideal".  Python 3 was created (among other reasons) to get
closer to that ideal.  But programs in Python 2 are received as str,
which is bytes in an ASCII-compatible encoding, not unicode (unless
otherwise specified by PYTHONIOENCODING or a coding cookie in a source
file, and as far as I know those are the only ways to specify source
encoding).  This specification of "Python program" isn't going to
change in Python 2; that's one of the major unfixable reasons that
Python 2 and Python 3 will be incompatible forever.

 > The custom stdio streams and readline hooks are set at runtime by
 > code in sitecustomize. It does not affect IDLE and it is compatible
 > with IPython. I would like to also set PyCF_SOURCE_IS_UTF8 at
 > runtime from Python e.g. via ctypes. But this may be impossible.

 > Yes. In the latter case, eval has no idea how the bytes given are
 > encoded.

Eval *never* knows how bytes are encoded, not even implicitly.  That's
one of the important reasons why Python 3 was necessary.  I think you
know that, but you don't write like you understand the implications
for your current work, which makes it hard to communicate.



From greg.ewing at canterbury.ac.nz  Fri May  1 07:58:55 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 01 May 2015 17:58:55 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
References: <CAP7+vJK6GQ=rJ__KeE=H1cAES3gThdg4okmq+iFFSvE6SushBw@mail.gmail.com>
 <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <CAPJVwBm3E_oY12VbuBOMApSouVuMbAjqkN+sMoi3tSVYwt_Sgw@mail.gmail.com>
 <55416DBF.3020303@gmail.com>
 <CAPJVwBkhHMzxcECaEskU3TqhKPm5ws0JiAd9s_pPtehDtxJbVw@mail.gmail.com>
 <5541EE2A.1070109@canterbury.ac.nz>
 <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
Message-ID: <5543161F.2090806@canterbury.ac.nz>

Nathaniel Smith wrote:

> If await acted like -, then this would be
>   await (x ** 2)
> But with the proposed grammar, it's instead
>   (await x) ** 2

Ah, I had missed that!

This is a *good* argument for Yury's grammar.
I withdraw my objection now.

-- 
Greg

From drekin at gmail.com  Fri May  1 11:43:09 2015
From: drekin at gmail.com (Adam Bartoš)
Date: Fri, 1 May 2015 11:43:09 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>

On Fri, May 1, 2015 at 6:14 AM, Stephen J. Turnbull <stephen at xemacs.org>
wrote:

> Adam Bartoš writes:
>
>  > Unfortunately, it doesn't work. With PYTHONIOENCODING=utf-8, the
>  > sys.std* streams are created with utf-8 encoding (which doesn't
>  > help on Windows since they still don't use ReadConsoleW and
>  > WriteConsoleW to communicate with the terminal) and after changing
>  > the sys.std* streams to the fixed ones and setting readline hook,
>  > it still doesn't work,
>
> I don't see why you would expect it to work: either your code is
> bypassing PYTHONIOENCODING=utf-8 processing, and that variable doesn't
> matter, or you're feeding already decoded text *as UTF-8* to your
> module which evidently expects something else (UTF-16LE?).
>

I'll describe my picture of the situation, which might be terribly wrong.
On Linux, in a typical situation, we have a UTF-8 terminal,
PYTHONIOENCODING=utf-8, GNU readline is used. When the REPL wants input
from a user the tokenizer calls PyOS_Readline, which calls GNU readline.
The user is prompted >>> , during the input he can use autocompletion and
everything and he enters u'α'. PyOS_Readline returns b"u'\xce\xb1'" (as
char* or something), which is UTF-8 encoded input from the user. The
tokenizer, parser, and evaluator process the input and the result is
u'\u03b1', which is printed as an answer.

In my case I install custom sys.std* objects and a custom readline hook.
Again, the tokenizer calls PyOS_Readline, which calls my readline hook,
which calls sys.stdin.readline(), which returns a Unicode string the user
entered (it was decoded from UTF-16-LE bytes actually). My readline hook
encodes this string to UTF-8 and returns it. So the situation is the same.
The tokenizer gets b"u'\xce\xb1'" as before, but now it results in
u'\xce\xb1'.

Why is the result different? I thought that in the first case
PyCF_SOURCE_IS_UTF8 might have been set. And after your suggestion, I
thought that PYTHONIOENCODING=utf-8 is the thing that also sets
PyCF_SOURCE_IS_UTF8.



>  > so presumably the PyCF_SOURCE_IS_UTF8 is still not set.
>
> I don't think that flag does what you think it does.  AFAICT from
> looking at the source, that flag gets unconditionally set in the
> execution context for compile, eval, and exec, and it is checked in
> the parser when creating an AST node.  So it looks to me like it
> asserts that the *internal* representation of the program is UTF-8
> *after* transforming the input to an internal representation (doing
> charset decoding, removing comments and line continuations, etc).
>

I thought it might do what I want because of the behaviour of eval. I
thought that the PyUnicode_AsUTF8String call in eval just encodes the
passed unicode to UTF-8, so the situation looks like follows:
eval(u"u'\u031b'") -> (b"u'\xce\xb1'", PyCF_SOURCE_IS_UTF8 set) -> u'\u03b1'
eval(u"u'\u031b'".encode('utf-8')) -> (b"u'\xce\xb1'", PyCF_SOURCE_IS_UTF8
not set) -> u'\xce\xb1'
But of course, this picture of mine might be wrong.
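
For contrast, Python 3 pins this down: source bytes passed to eval are
decoded as UTF-8 by default (PEP 3120), so the str and bytes forms agree
there:

```python
# Python 3 behaviour, for contrast: bytes source is decoded as UTF-8.
src_text = "'\u03b1'"                    # the literal 'α' as a str
src_bytes = src_text.encode("utf-8")

assert eval(src_text) == "\u03b1"
assert eval(src_bytes) == "\u03b1"       # same result: no mojibake in Python 3
```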


 > Well, the received text comes from sys.stdin and its encoding is
>  > known.
>
> How?  You keep asserting this.  *You* know, but how are you passing
> that information to *the Python interpreter*?  Guido may have a time
> machine, but nobody claims the Python interpreter is telepathic.
>

I thought that the Python interpreter knows the input comes from sys.stdin
at least to some extent because in pythonrun.c:PyRun_InteractiveOneObject
the encoding for the tokenizer is inferred from sys.stdin.encoding. But
this is actually the case only in Python 3. So I was wrong.


>  > Yes. In the latter case, eval has no idea how the bytes given are
>  > encoded.
>
> Eval *never* knows how bytes are encoded, not even implicitly.  That's
> one of the important reasons why Python 3 was necessary.  I think you
> know that, but you don't write like you understand the implications
> for your current work, which makes it hard to communicate.
>

Yes, eval never knows how bytes are encoded. But I meant it in comparison
with the first case where a Unicode string was passed.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/3533842c/attachment.html>

From steve at pearwood.info  Fri May  1 13:41:13 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 1 May 2015 21:41:13 +1000
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
Message-ID: <20150501114112.GI5663@ando.pearwood.info>

On Wed, Apr 29, 2015 at 06:12:37PM -0700, Guido van Rossum wrote:
> On Wed, Apr 29, 2015 at 5:59 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> > On 30 April 2015 at 10:21, Ethan Furman <ethan at stoneleaf.us> wrote:
> > > From the PEP:
> > >
> > >> Why not a __future__ import
> > >>
> > >> __future__ imports are inconvenient and easy to forget to add.
> > >
> > > That is a horrible rationale for not using an import.  By that logic we
> > > should have everything in built-ins.  ;)
> >
> 
> This response is silly. The point is not against import but against
> __future__. A __future__ import definitely is inconvenient -- few people I
> know could even recite the correct constraints on their placement.

Are you talking about actual Python programmers, or people who dabble 
with the odd Python script now and again? I'm kinda shocked if it's the 
former.

It's not a complex rule: the __future__ import must be the first 
statement of actual executable code in the file, so it can come after 
any encoding cookie, module docstring, comments and blank lines, but 
before any other code. The only part I didn't remember was that you can 
have multiple __future__ imports; I thought they all had to be on one 
line. (Nice to learn something new!)
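The rule can even be checked mechanically; a small sketch:

```python
"""A docstring may precede a __future__ import."""
# So may comments and blank lines.
from __future__ import annotations  # first executable statement: legal

# Any later placement is rejected at compile time:
late = "import math\nfrom __future__ import annotations\n"
try:
    compile(late, "<demo>", "exec")
    late_allowed = True
except SyntaxError:
    late_allowed = False
# late_allowed is False
```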



[...]
> > 'as' went through the "not really a keyword" path, and
> > it's a recipe for complexity in the code generation toolchain and
> > general quirkiness as things behave in unexpected ways.
> >
> 
> I don't recall that -- but it was a really long time ago so I may
> misremember (did we even have __future__ at the time?).

I have a memory of much rejoicing when "as" was made a keyword, and an 
emphatic "we're never going to do that again!" about semi-keywords. I've 
tried searching for the relevant post(s), but cannot find anything. 
Maybe I imagined it?

But I do have Python 2.4 available, when we could write lovely code like 
this:

py> import math as as
py> as
<module 'math' from '/usr/lib/python2.4/lib-dynload/mathmodule.so'>

I'm definitely not looking forward to anything like that again.
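Happily, once "as" became a real keyword (Python 2.6), that spelling
stopped compiling, which any modern interpreter will confirm:

```python
# "import math as as" parsed in Python 2.4 because "as" was only a
# semi-keyword; with "as" fully reserved it is a SyntaxError.
try:
    compile("import math as as", "<demo>", "exec")
    as_is_identifier = True
except SyntaxError:
    as_is_identifier = False
# as_is_identifier is False
```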



-- 
Steve

From steve at pearwood.info  Fri May  1 13:54:48 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 1 May 2015 21:54:48 +1000
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
Message-ID: <20150501115447.GJ5663@ando.pearwood.info>

On Wed, Apr 29, 2015 at 07:31:22PM -0700, Guido van Rossum wrote:

> Ah, but here's the other clever bit: it's only interpreted this way
> *inside* a function declared with 'async def'. Outside such functions,
> 'await' is not a keyword, so that grammar rule doesn't trigger. (Kind of
> similar to the way that the print_function __future__ disables the
> keyword-ness of 'print', except here it's toggled on or off depending on
> whether the nearest surrounding scope is 'async def' or not. The PEP could
> probably be clearer about this; it's all hidden in the Transition Plan
> section.)

You mean we could write code like this?

def await(x):
    ...


if condition:
    async def spam():
        await (eggs or cheese)
else:
    def spam():
        await(eggs or cheese)


I must admit that's kind of cool, but I'm sure I'd regret it.
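That dual spelling only works while "await" is a soft keyword; under the
transition plan that treatment was temporary, and in interpreters where
"await" is fully reserved (3.7 onward) even the plain def fails to
compile:

```python
# Where "await" is a full keyword (Python 3.7+), it cannot be used as
# an ordinary function name at all.
try:
    compile("def await(x): pass", "<demo>", "exec")
    await_usable_as_name = True
except SyntaxError:
    await_usable_as_name = False
# await_usable_as_name is False
```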



-- 
Steve

From p.f.moore at gmail.com  Fri May  1 14:14:51 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 1 May 2015 13:14:51 +0100
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <20150501115447.GJ5663@ando.pearwood.info>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info>
Message-ID: <CACac1F_3a8DacWdoh=5CSBWCST_2HgzyjO96kK8uLYhPKmvc1w@mail.gmail.com>

On 1 May 2015 at 12:54, Steven D'Aprano <steve at pearwood.info> wrote:
> You mean we could write code like this?
>
> def await(x):
>     ...
>
>
> if condition:
>     async def spam():
>         await (eggs or cheese)
> else:
>     def spam():
>         await(eggs or cheese)
>
>
> I must admit that's kind of cool, but I'm sure I'd regret it.

You could, but there are people with buckets of tar and feathers
waiting for you if you do :-)
Paul

From stefan_ml at behnel.de  Fri May  1 14:39:21 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Fri, 01 May 2015 14:39:21 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <554185C2.5080003@gmail.com>
References: <554185C2.5080003@gmail.com>
Message-ID: <mhvs5p$tg8$1@ger.gmane.org>

Yury Selivanov schrieb am 30.04.2015 um 03:30:
> Asynchronous Iterators and "async for"
> --------------------------------------
> 
> An *asynchronous iterable* is able to call asynchronous code in its
> *iter* implementation, and *asynchronous iterator* can call
> asynchronous code in its *next* method.  To support asynchronous
> iteration:
> 
> 1. An object must implement an  ``__aiter__`` method returning an
>    *awaitable* resulting in an *asynchronous iterator object*.
> 
> 2. An *asynchronous iterator object* must implement an ``__anext__``
>    method returning an *awaitable*.
> 
> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
>    exception.

What this section does not explain, AFAICT, nor the section on design
considerations, is why the iterator protocol needs to be duplicated
entirely. Can't we just assume (or even validate) that any 'regular'
iterator returned from "__aiter__()" (as opposed to "__iter__()") properly
obeys the new protocol? Why additionally duplicate "__next__()" and
"StopIteration"?

ISTM that all this really depends on is that "__next__()" returns an
awaitable. Renaming the method doesn't help with that guarantee.
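For concreteness, a minimal sketch of the duplicated protocol under
discussion (written against today's Python, where __aiter__ ended up
returning the iterator directly rather than an awaitable):

```python
import asyncio

class Ticker:
    """Minimal asynchronous iterator implementing the new protocol."""
    def __init__(self, n):
        self.n = n

    def __aiter__(self):
        # Returns the asynchronous iterator object (here: itself).
        return self

    async def __anext__(self):
        if self.n <= 0:
            raise StopAsyncIteration  # the duplicated stop signal
        self.n -= 1
        await asyncio.sleep(0)        # stand-in for real async work
        return self.n

async def collect():
    out = []
    async for i in Ticker(3):  # drives __aiter__/__anext__
        out.append(i)
    return out

result = asyncio.run(collect())  # → [2, 1, 0]
```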

Stefan



From stefan_ml at behnel.de  Fri May  1 14:50:19 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Fri, 01 May 2015 14:50:19 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <554185C2.5080003@gmail.com>
References: <554185C2.5080003@gmail.com>
Message-ID: <mhvsqb$85j$1@ger.gmane.org>

Yury Selivanov schrieb am 30.04.2015 um 03:30:
> 1. Terminology:
> - *native coroutine* term is used for "async def" functions.

When I read "native", I think of native (binary) code. So "native
coroutine" sounds like it's implemented in some compiled low-level
language. That might be the case (certainly in the CPython world), but it's
not related to this PEP nor its set of examples.


> We should discuss how we will name new 'async def' coroutines in
> Python Documentation if the PEP is accepted.

Well, it doesn't hurt to avoid obvious misleading terminology upfront.

Stefan



From guido at python.org  Fri May  1 17:28:07 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 08:28:07 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <mhvs5p$tg8$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
Message-ID: <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>

On Fri, May 1, 2015 at 5:39 AM, Stefan Behnel <stefan_ml at behnel.de> wrote:

> Yury Selivanov schrieb am 30.04.2015 um 03:30:
> > Asynchronous Iterators and "async for"
> > --------------------------------------
> >
> > An *asynchronous iterable* is able to call asynchronous code in its
> > *iter* implementation, and *asynchronous iterator* can call
> > asynchronous code in its *next* method.  To support asynchronous
> > iteration:
> >
> > 1. An object must implement an  ``__aiter__`` method returning an
> >    *awaitable* resulting in an *asynchronous iterator object*.
> >
> > 2. An *asynchronous iterator object* must implement an ``__anext__``
> >    method returning an *awaitable*.
> >
> > 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
> >    exception.
>
> What this section does not explain, AFAICT, nor the section on design
> considerations, is why the iterator protocol needs to be duplicated
> entirely. Can't we just assume (or even validate) that any 'regular'
> iterator returned from "__aiter__()" (as opposed to "__iter__()") properly
> obeys the new protocol? Why additionally duplicate "__next__()" and
> "StopIteration"?
>
> ISTM that all this really depends on is that "__next__()" returns an
> awaitable. Renaming the method doesn't help with that guarantee.


This is an astute observation. I think its flaw (if any) is the situation
where we want a single object to be both a regular iterator and an async
iterator (say when migrating code to the new world). The __next__ method
might want to return a result while __anext__ has to return an awaitable.
The solution to that would be to have __aiter__ return an instance of a
different class than __iter__, but that's not always convenient.

Thus, aware of the choice, I would still prefer a separate __anext__ method.
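That migration situation can be sketched as follows (class name
hypothetical): one object can serve both protocols precisely because the
method names differ:

```python
import asyncio

class Numbers:
    """Hypothetical migrating class: both a regular and an async iterator."""
    def __init__(self):
        self._it = iter([1, 2, 3])

    # Regular protocol: __next__ returns plain values.
    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    # Async protocol: __anext__ must return an awaitable instead, which
    # only works because its name is distinct from __next__.
    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            raise StopAsyncIteration

sync_result = list(Numbers())        # → [1, 2, 3]

async def drain():
    return [x async for x in Numbers()]

async_result = asyncio.run(drain())  # → [1, 2, 3]
```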

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/e23a4a77/attachment.html>

From guido at python.org  Fri May  1 17:31:20 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 08:31:20 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <mhvsqb$85j$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvsqb$85j$1@ger.gmane.org>
Message-ID: <CAP7+vJ+GwN-aYgo0Ud-_ihoejmdTpjy=B0qpR5MZr5SBWhTDog@mail.gmail.com>

On Fri, May 1, 2015 at 5:50 AM, Stefan Behnel <stefan_ml at behnel.de> wrote:

> Yury Selivanov schrieb am 30.04.2015 um 03:30:
> > 1. Terminology:
> > - *native coroutine* term is used for "async def" functions.
>
> When I read "native", I think of native (binary) code. So "native
> coroutine" sounds like it's implemented in some compiled low-level
> language. That might be the case (certainly in the CPython world), but it's
> not related to this PEP nor its set of examples.
>
>
> > We should discuss how we will name new 'async def' coroutines in
> > Python Documentation if the PEP is accepted.
>
> Well, it doesn't hurt to avoid obvious misleading terminology upfront.
>

I think "obvious[ly] misleading" is too strong: nobody is trying to mislead
anybody; we just have different associations with the same word. Given the
feedback I'd call "native coroutine" suboptimal (even though I proposed it
myself) and I am now in favor of using "async function".

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/57176907/attachment.html>

From gjcarneiro at gmail.com  Fri May  1 17:55:22 2015
From: gjcarneiro at gmail.com (Gustavo Carneiro)
Date: Fri, 1 May 2015 16:55:22 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJ+GwN-aYgo0Ud-_ihoejmdTpjy=B0qpR5MZr5SBWhTDog@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvsqb$85j$1@ger.gmane.org>
 <CAP7+vJ+GwN-aYgo0Ud-_ihoejmdTpjy=B0qpR5MZr5SBWhTDog@mail.gmail.com>
Message-ID: <CAO-CpE+nqJu7JkRj1TKfVACwOuF9Wc=xSDx2SRaL5F8ZZup4Yw@mail.gmail.com>

On 1 May 2015 at 16:31, Guido van Rossum <guido at python.org> wrote:

> On Fri, May 1, 2015 at 5:50 AM, Stefan Behnel <stefan_ml at behnel.de> wrote:
>
>> Yury Selivanov schrieb am 30.04.2015 um 03:30:
>> > 1. Terminology:
>> > - *native coroutine* term is used for "async def" functions.
>>
>> When I read "native", I think of native (binary) code. So "native
>> coroutine" sounds like it's implemented in some compiled low-level
>> language. That might be the case (certainly in the CPython world), but
>> it's
>> not related to this PEP nor its set of examples.
>>
>>
>> > We should discuss how we will name new 'async def' coroutines in
>> > Python Documentation if the PEP is accepted.
>>
>> Well, it doesn't hurt to avoid obvious misleading terminology upfront.
>>
>
> I think "obvious[ly] misleading" is too strong, nobody is trying to
> mislead anybody, we just have different associations with the same word.
> Given the feedback I'd call "native coroutine" suboptimal (even though I
> proposed it myself) and I am now in favor of using "async function".
>

But what if you have async methods?  I know, a method is almost a function,
but it still sounds slightly confusing.

IMHO, these are really classical coroutines.  If gevent calls them
coroutines, I don't think asyncio has any less right to call them
coroutines.

You have to ask yourself this: when a new programmer sees mentions of
coroutines, how likely is he to understand what he is dealing with?
What about "async function"? The former has been a well-known concept
for decades, while the latter is something he (or at least I) had
probably never heard of before.

For me, an async function is just as likely to be an API that is
asynchronous in the sense that it takes an extra "callback" parameter to be
called when the asynchronous work is done.

I think coroutine is the name of a concept, not a specific implementation.

Cheers,

-- 
Gustavo J. A. M. Carneiro
Gambit Research
"The universe is always one step beyond logic." -- Frank Herbert
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/6752dae4/attachment.html>

From status at bugs.python.org  Fri May  1 18:14:25 2015
From: status at bugs.python.org (Python tracker)
Date: Fri,  1 May 2015 18:14:25 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150501161425.763F4561C7@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2015-04-24 - 2015-05-01)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    4841 (+27)
  closed 31025 (+25)
  total  35866 (+52)

Open issues with patches: 2254 


Issues opened (39)
==================

#24054: Invalid syntax in inspect_fodder2.py (on Python 2.x)
http://bugs.python.org/issue24054  opened by ddriddle

#24055: unittest package-level set up & tear down module
http://bugs.python.org/issue24055  opened by demian.brecht

#24056: Expose closure & generator status in function repr()
http://bugs.python.org/issue24056  opened by ncoghlan

#24060: Clearify necessities for logging with timestamps
http://bugs.python.org/issue24060  opened by krichter

#24063: Support Mageia and Arch Linux in the platform module
http://bugs.python.org/issue24063  opened by akien

#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064  opened by rhettinger

#24065: Outdated *_RESTRICTED flags in structmember.h
http://bugs.python.org/issue24065  opened by berker.peksag

#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066  opened by kirelagin

#24067: Weakproxy is an instance of collections.Iterator
http://bugs.python.org/issue24067  opened by ereuveni

#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068  opened by wolma

#24069: Option to delete obsolete bytecode files
http://bugs.python.org/issue24069  opened by Sworddragon

#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076  opened by lukasz.langa

#24078: inspect.getsourcelines ignores context and returns wrong line 
http://bugs.python.org/issue24078  opened by siyuan

#24079: xml.etree.ElementTree.Element.text does not conform to the doc
http://bugs.python.org/issue24079  opened by jlaurens

#24080: asyncio.Event.wait() Task was destroyed but it is pending
http://bugs.python.org/issue24080  opened by matt

#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081  opened by encukou

#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082  opened by encukou

#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084  opened by Romuald

#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085  opened by bukzor

#24086: Configparser interpolation is unexpected
http://bugs.python.org/issue24086  opened by tbekolay

#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087  opened by paul.moore

#24088: yield expression confusion
http://bugs.python.org/issue24088  opened by Jim.Jewett

#24089: argparse crashes with AssertionError
http://bugs.python.org/issue24089  opened by spaceone

#24090: Add a "copy variable to clipboard" option to the edit menu
http://bugs.python.org/issue24090  opened by irdb

#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091  opened by pkt

#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092  opened by pkt

#24093: Use after free in Element.remove
http://bugs.python.org/issue24093  opened by pkt

#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094  opened by pkt

#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095  opened by pkt

#24096: Use after free in get_filter
http://bugs.python.org/issue24096  opened by pkt

#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097  opened by pkt

#24098: Multiple use after frees in obj2ast_* methods
http://bugs.python.org/issue24098  opened by pkt

#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099  opened by pkt

#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100  opened by pkt

#24101: Use after free in siftup
http://bugs.python.org/issue24101  opened by pkt

#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102  opened by pkt

#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103  opened by pkt

#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104  opened by pkt

#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105  opened by pkt



Most recent 15 issues with no replies (15)
==========================================

#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105

#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104

#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103

#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102

#24101: Use after free in siftup
http://bugs.python.org/issue24101

#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100

#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099

#24098: Multiple use after frees in obj2ast_* methods
http://bugs.python.org/issue24098

#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097

#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095

#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094

#24093: Use after free in Element.remove
http://bugs.python.org/issue24093

#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092

#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091

#24090: Add a "copy variable to clipboard" option to the edit menu
http://bugs.python.org/issue24090



Most recent 15 issues waiting for review (15)
=============================================

#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087

#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084

#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082

#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081

#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076

#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068

#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066

#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064

#24056: Expose closure & generator status in function repr()
http://bugs.python.org/issue24056

#24054: Invalid syntax in inspect_fodder2.py (on Python 2.x)
http://bugs.python.org/issue24054

#24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys
http://bugs.python.org/issue24053

#24042: Convert os._getfullpathname() and os._isdir() to Argument Clin
http://bugs.python.org/issue24042

#24037: Argument Clinic: add the boolint converter
http://bugs.python.org/issue24037

#24036: GB2312 codec is using a wrong covert table
http://bugs.python.org/issue24036

#24034: Make fails Objects/typeslots.inc
http://bugs.python.org/issue24034



Top 10 most discussed issues (10)
=================================

#24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys
http://bugs.python.org/issue24053  30 msgs

#24067: Weakproxy is an instance of collections.Iterator
http://bugs.python.org/issue24067  12 msgs

#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076  11 msgs

#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085  11 msgs

#23749: asyncio missing wrap_socket
http://bugs.python.org/issue23749  10 msgs

#24018: add a Generator ABC
http://bugs.python.org/issue24018  10 msgs

#24052: sys.exit(code) returns "success" to the OS for some nonzero va
http://bugs.python.org/issue24052   9 msgs

#23955: Add python.ini file for embedded/applocal installs
http://bugs.python.org/issue23955   8 msgs

#17908: Unittest runner needs an option to call gc.collect() after eac
http://bugs.python.org/issue17908   7 msgs

#24043: Implement mac_romanian and mac_croatian encodings
http://bugs.python.org/issue24043   6 msgs



Issues closed (23)
==================

#9246: os.getcwd() hardcodes max path len
http://bugs.python.org/issue9246  closed by haypo

#21354: PyCFunction_New no longer exposed by python DLL breaking bdist
http://bugs.python.org/issue21354  closed by asvetlov

#23058: argparse silently ignores arguments
http://bugs.python.org/issue23058  closed by barry

#23342: run() - unified high-level interface for subprocess
http://bugs.python.org/issue23342  closed by gregory.p.smith

#23356: In argparse docs simplify example about argline
http://bugs.python.org/issue23356  closed by berker.peksag

#23668: Support os.ftruncate on Windows
http://bugs.python.org/issue23668  closed by steve.dower

#23852: Wrong computation of max_fd on OpenBSD
http://bugs.python.org/issue23852  closed by gregory.p.smith

#23910: property_descr_get reuse argument tuple
http://bugs.python.org/issue23910  closed by rhettinger

#23996: _PyGen_FetchStopIterationValue() crashes on unnormalised excep
http://bugs.python.org/issue23996  closed by pitrou

#24035: When Caps Locked, <Shift> + alpha-character still displayed as
http://bugs.python.org/issue24035  closed by principia1687

#24057: trivial typo in datetime.rst: needing a preceding dot
http://bugs.python.org/issue24057  closed by python-dev

#24058: Compiler warning for readline extension
http://bugs.python.org/issue24058  closed by python-dev

#24059: Minor speed and readability improvement to the random module
http://bugs.python.org/issue24059  closed by rhettinger

#24061: Python 2.x breaks with address sanitizer
http://bugs.python.org/issue24061  closed by python-dev

#24062: links to os.stat() in documentation lead to stat module instea
http://bugs.python.org/issue24062  closed by berker.peksag

#24070: Exceptions and arguments disappear when using argparse inside 
http://bugs.python.org/issue24070  closed by benjamin.peterson

#24071: Python 2.7.8, 2.7.9  re.MULTILINE failure
http://bugs.python.org/issue24071  closed by serhiy.storchaka

#24072: xml.etree.ElementTree.Element does not catch text
http://bugs.python.org/issue24072  closed by ned.deily

#24073: sys.stdin.mode can not give the right mode and os.fdopen does 
http://bugs.python.org/issue24073  closed by ned.deily

#24074: string, center, ljust, rjust, width paramter should accept Non
http://bugs.python.org/issue24074  closed by rhettinger

#24075: list.sort() should do quick exit if len(list) <= 1
http://bugs.python.org/issue24075  closed by benjamin.peterson

#24077: man page says -I implies -S. code says -s.
http://bugs.python.org/issue24077  closed by ned.deily

#24083: MSVCCompiler.get_msvc_path() doesn't work on Win x64
http://bugs.python.org/issue24083  closed by lemburg

From guido at python.org  Fri May  1 18:59:35 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 09:59:35 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAO-CpE+nqJu7JkRj1TKfVACwOuF9Wc=xSDx2SRaL5F8ZZup4Yw@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvsqb$85j$1@ger.gmane.org>
 <CAP7+vJ+GwN-aYgo0Ud-_ihoejmdTpjy=B0qpR5MZr5SBWhTDog@mail.gmail.com>
 <CAO-CpE+nqJu7JkRj1TKfVACwOuF9Wc=xSDx2SRaL5F8ZZup4Yw@mail.gmail.com>
Message-ID: <CAP7+vJJKGjHJRo9gycadhnhdsxmNTDw2yLb-uFFNnBESGKfDXg@mail.gmail.com>

On Fri, May 1, 2015 at 8:55 AM, Gustavo Carneiro <gjcarneiro at gmail.com>
wrote:

>
>
>
> On 1 May 2015 at 16:31, Guido van Rossum <guido at python.org> wrote:
>
>> On Fri, May 1, 2015 at 5:50 AM, Stefan Behnel <stefan_ml at behnel.de>
>> wrote:
>>
>>> Yury Selivanov schrieb am 30.04.2015 um 03:30:
>>> > 1. Terminology:
>>> > - *native coroutine* term is used for "async def" functions.
>>>
>>> When I read "native", I think of native (binary) code. So "native
>>> coroutine" sounds like it's implemented in some compiled low-level
>>> language. That might be the case (certainly in the CPython world), but
>>> it's
>>> not related to this PEP nor its set of examples.
>>>
>>>
>>> > We should discuss how we will name new 'async def' coroutines in
>>> > Python Documentation if the PEP is accepted.
>>>
>>> Well, it doesn't hurt to avoid obvious misleading terminology upfront.
>>>
>>
>> I think "obvious[ly] misleading" is too strong, nobody is trying to
>> mislead anybody, we just have different associations with the same word.
>> Given the feedback I'd call "native coroutine" suboptimal (even though I
>> proposed it myself) and I am now in favor of using "async function".
>>
>
> But what if you have async methods?  I know, a method is almost a
> function, but still, sounds slightly confusing.
>
> IMHO, these are really classical coroutines.  If gevent calls them
> coroutines, I don't think asyncio has any less right to call them
> coroutines.
>
> You have to ask yourself this: a new programmer, when he sees mentions of
> coroutines, how likely is he to understand what he is dealing with?  What
> about "async function"?  The former is a well known concept, since decades
> ago, while the latter is something he probably (at least me) never heard of
> before.
>
> For me, an async function is just as likely to be an API that is
> asynchronous in the sense that it takes an extra "callback" parameter to be
> called when the asynchronous work is done.
>
> I think coroutine is the name of a concept, not a specific implementation.
>
> Cheers,
>

Cheers indeed! I agree that the *concept* is best called coroutine -- and
we have used this term ever since PEP 342. But when we're talking specifics
and trying to distinguish e.g. a function declared with 'async def' from a
regular function or from a regular generator function, using 'async
function' sounds right. And 'async method' if it's a method.

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/0c25da00/attachment.html>

From stefan_ml at behnel.de  Fri May  1 19:10:06 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Fri, 01 May 2015 19:10:06 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
Message-ID: <mi0c1e$819$1@ger.gmane.org>

Guido van Rossum schrieb am 01.05.2015 um 17:28:
> On Fri, May 1, 2015 at 5:39 AM, Stefan Behnel wrote:
> 
>> Yury Selivanov schrieb am 30.04.2015 um 03:30:
>>> Asynchronous Iterators and "async for"
>>> --------------------------------------
>>>
>>> An *asynchronous iterable* is able to call asynchronous code in its
>>> *iter* implementation, and *asynchronous iterator* can call
>>> asynchronous code in its *next* method.  To support asynchronous
>>> iteration:
>>>
>>> 1. An object must implement an  ``__aiter__`` method returning an
>>>    *awaitable* resulting in an *asynchronous iterator object*.
>>>
>>> 2. An *asynchronous iterator object* must implement an ``__anext__``
>>>    method returning an *awaitable*.
>>>
>>> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
>>>    exception.
>>
>> What this section does not explain, AFAICT, nor the section on design
>> considerations, is why the iterator protocol needs to be duplicated
>> entirely. Can't we just assume (or even validate) that any 'regular'
>> iterator returned from "__aiter__()" (as opposed to "__iter__()") properly
>> obeys the new protocol? Why additionally duplicate "__next__()" and
>> "StopIteration"?
>>
>> ISTM that all this really depends on is that "__next__()" returns an
>> awaitable. Renaming the method doesn't help with that guarantee.
> 
> 
> This is an astute observation. I think its flaw (if any) is the situation
> where we want a single object to be both a regular iterator and an async
> iterator (say when migrating code to the new world). The __next__ method
> might want to return a result while __anext__ has to return an awaitable.
> The solution to that would be to have __aiter__ return an instance of a
> different class than __iter__, but that's not always convenient.

My personal gut feeling is that this case would best be handled by a
generic wrapper class. Both are well-defined protocols, so I'd expect
people not to change all of their classes, but instead to return a
wrapped object from either __iter__() or __aiter__(), depending on which
they want to optimise for, or which turns out to be easier to wrap.
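A minimal sketch of such a generic wrapper, assuming the protocol quoted
above as it behaves on current Python (where __aiter__ returns the
iterator directly); the class and helper names here are hypothetical:

```python
import asyncio

class AiterFromIter:
    """Hypothetical wrapper: adapts a plain iterator to the
    asynchronous-iterator protocol (__aiter__/__anext__)."""

    def __init__(self, iterable):
        self._it = iter(iterable)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            # The async protocol signals exhaustion with its own exception.
            raise StopAsyncIteration

async def collect(aiterable):
    # 'async for' awaits each __anext__() result until
    # StopAsyncIteration is raised.
    result = []
    async for item in aiterable:
        result.append(item)
    return result

print(asyncio.run(collect(AiterFromIter([1, 2, 3]))))  # [1, 2, 3]
```

With a wrapper like this, a class migrating to the new world could keep
its existing __iter__() and return a wrapped object from __aiter__().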

But that's trying to predict the [Ff]uture, obviously. It just feels like
unnecessary complexity for now. And we already have a type slot for
__next__ ("tp_iternext"), but not for __anext__.

Stefan



From jimjjewett at gmail.com  Fri May  1 20:26:28 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 1 May 2015 14:26:28 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
Message-ID: <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>

On Thu, Apr 30, 2015 at 3:32 PM, Guido van Rossum <guido at python.org> wrote:

(me:)
>> A badly worded attempt to say
>> Normal generator:  yield (as opposed to return) means
>> that the function isn't done, and there may be more
>> things to return later.

>> but an asynchronous (PEP492) coroutine is primarily saying:

>>      "This might take a while, go ahead and do something else
>> meanwhile."

(Yuri:) Correct.

(Guido:)> Actually that's not even wrong. When using generators as
coroutines, PEP 342
> style, "yield" means "I am blocked waiting for a result that the I/O
> multiplexer is eventually going to produce".

So does this mean that yield should NOT be used just to yield control
if a task isn't blocked?  (e.g., if its next step is likely to be
long, or low priority.)  Or even that it wouldn't be considered a
coroutine in the Python sense?

If this is really just about avoiding busy-wait on network IO, then
coroutine is way too broad a term, and I'm uncomfortable restricting a
new keyword (async or await) to what is essentially a Domain Specific
Language.

-jJ

From yselivanov.ml at gmail.com  Fri May  1 20:52:17 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 14:52:17 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <mi0c1e$819$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org>
Message-ID: <5543CB61.2080905@gmail.com>

Stefan,

I don't like the idea of combining __next__ and __anext__.
In this case explicit is better than implicit.  __next__
returning coroutines is a perfectly normal thing for a
normal 'for' loop (it wouldn't do anything with them),
whereas 'async for' will interpret that differently, and
will try to await those coroutines.

Yury

On 2015-05-01 1:10 PM, Stefan Behnel wrote:
> Guido van Rossum schrieb am 01.05.2015 um 17:28:
>> On Fri, May 1, 2015 at 5:39 AM, Stefan Behnel wrote:
>>
>>> Yury Selivanov schrieb am 30.04.2015 um 03:30:
>>>> Asynchronous Iterators and "async for"
>>>> --------------------------------------
>>>>
>>>> An *asynchronous iterable* is able to call asynchronous code in its
>>>> *iter* implementation, and *asynchronous iterator* can call
>>>> asynchronous code in its *next* method.  To support asynchronous
>>>> iteration:
>>>>
>>>> 1. An object must implement an  ``__aiter__`` method returning an
>>>>     *awaitable* resulting in an *asynchronous iterator object*.
>>>>
>>>> 2. An *asynchronous iterator object* must implement an ``__anext__``
>>>>     method returning an *awaitable*.
>>>>
>>>> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
>>>>     exception.
>>> What this section does not explain, AFAICT, nor the section on design
>>> considerations, is why the iterator protocol needs to be duplicated
>>> entirely. Can't we just assume (or even validate) that any 'regular'
>>> iterator returned from "__aiter__()" (as opposed to "__iter__()") properly
>>> obeys to the new protocol? Why additionally duplicate "__next__()" and
>>> "StopIteration"?
>>>
>>> ISTM that all this really depends on is that "__next__()" returns an
>>> awaitable. Renaming the method doesn't help with that guarantee.
>>
>> This is an astute observation. I think its flaw (if any) is the situation
>> where we want a single object to be both a regular iterator and an async
>> iterator (say when migrating code to the new world). The __next__ method
>> might want to return a result while __anext__ has to return an awaitable.
>> The solution to that would be to have __aiter__ return an instance of a
>> different class than __iter__, but that's not always convenient.
> My personal gut feeling is that this case would best be handled by a
> generic wrapper class. Both are well defined protocols, so I don't expect
> people to change all of their classes and instead return a wrapped object
> either from __iter__() or __aiter__(), depending on which they want to
> optimise for, or which will eventually turn out to be easier to wrap.
>
> But that's trying to predict the [Ff]uture, obviously. It just feels like
> unnecessary complexity for now. And we already have a type slot for
> __next__ ("tp_iternext"), but not for __anext__.
>
> Stefan
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com


From guido at python.org  Fri May  1 20:59:40 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 11:59:40 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
Message-ID: <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>

On Fri, May 1, 2015 at 11:26 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

> On Thu, Apr 30, 2015 at 3:32 PM, Guido van Rossum <guido at python.org>
> wrote:
>
> (me:)
> >> A badly worded attempt to say
> >> Normal generator:  yield (as opposed to return) means
> >> that the function isn't done, and there may be more
> >> things to return later.
>
> >> but an asynchronous (PEP492) coroutine is primarily saying:
>
> >>      "This might take a while, go ahead and do something else
> >> meanwhile."
>
> (Yuri:) Correct.
>
> (Guido:)> Actually that's not even wrong. When using generators as
> coroutines, PEP 342
> > style, "yield" means "I am blocked waiting for a result that the I/O
> > multiplexer is eventually going to produce".
>
> So does this mean that yield should NOT be used just to yield control
> if a task isn't blocked?  (e.g., if its next step is likely to be
> long, or low priority.)  Or even that it wouldn't be considered a
> co-routine in the python sense?
>

I'm not sure what you're talking about. Does "next step" refer to something
in the current stack frame or something that you're calling? None of the
current uses of "yield" (the keyword) in Python are good for lowering
priority of something. It's not just the GIL, it's that coroutines (by
whatever name) are still single-threaded. If you have something
long-running CPU-intensive you should probably run it in a background
thread (or process) e.g. using an executor.


> If this is really just about avoiding busy-wait on network IO, then
> coroutine is way too broad a term, and I'm uncomfortable restricting a
> new keyword (async or await) to what is essentially a Domain Specific
> Language.
>

The common use case is network I/O. But it's quite possible to integrate
coroutines with a UI event loop.

-- 
--Guido van Rossum (python.org/~guido)

From stefan_ml at behnel.de  Fri May  1 21:03:27 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Fri, 01 May 2015 21:03:27 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5543CB61.2080905@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
Message-ID: <mi0im0$lo5$1@ger.gmane.org>

Yury Selivanov schrieb am 01.05.2015 um 20:52:
> I don't like the idea of combining __next__ and __anext__.
> In this case explicit is better than implicit.  __next__
> returning coroutines is a perfectly normal thing for a
> normal 'for' loop (it wouldn't to anything with them),
> whereas 'async for' will interpret that differently, and
> will try to await those coroutines.

Sure, but the difference is that one would have called __aiter__() first
and the other __iter__(). Normally, one of the two would not exist, so
using the wrong kind of loop on an object will just fail. However, once we
are past that barrier, we already know that the returned object is supposed
to obey the expected protocol, so it doesn't matter whether we call the
method __next__() or __anext__(), except that the second requires us to
duplicate an existing protocol.

This has nothing to do with implicit vs. explicit.

Stefan



From ethan at stoneleaf.us  Fri May  1 21:19:37 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Fri, 1 May 2015 12:19:37 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <mi0im0$lo5$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
Message-ID: <20150501191937.GB8013@stoneleaf.us>

On 05/01, Stefan Behnel wrote:
> Yury Selivanov schrieb am 01.05.2015 um 20:52:

>> I don't like the idea of combining __next__ and __anext__.
>> In this case explicit is better than implicit.  __next__
>> returning coroutines is a perfectly normal thing for a
>> normal 'for' loop (it wouldn't to anything with them),
>> whereas 'async for' will interpret that differently, and
>> will try to await those coroutines.
> 
> Sure, but the difference is that one would have called __aiter__() first
> and the other __iter__(). Normally, either of the two would not exist, so
> using the wrong loop on an object will just fail. However, after we passed
> that barrier, we already know that the object that was returned is supposed
> to obey to the expected protocol, so it doesn't matter whether we call
> __next__() or name it __anext__(), except that the second requires us to
> duplicate an existing protocol.

If we must have __aiter__, then we may as well also have __anext__; besides
being more consistent, it also allows an object to be both a normal iterator
and an async iterator.

--
~Ethan~

From yselivanov.ml at gmail.com  Fri May  1 21:23:55 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 15:23:55 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <mi0im0$lo5$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
Message-ID: <5543D2CB.7020202@gmail.com>

Let's say it this way: I want to know what I am looking at
when I browse through the code -- an asynchronous iterator,
or a normal iterator.  I want an explicit difference between
these protocols, because they are different.

Moreover, the code below is a perfectly valid, infinite
iterable:

     class SomeIterable:
          def __iter__(self):
              return self
          async def __next__(self):
              return 'spam'

I'm a strong -1 on this idea.

Yury

On 2015-05-01 3:03 PM, Stefan Behnel wrote:
> Yury Selivanov schrieb am 01.05.2015 um 20:52:
>> I don't like the idea of combining __next__ and __anext__.
>> In this case explicit is better than implicit.  __next__
>> returning coroutines is a perfectly normal thing for a
>> normal 'for' loop (it wouldn't to anything with them),
>> whereas 'async for' will interpret that differently, and
>> will try to await those coroutines.
> Sure, but the difference is that one would have called __aiter__() first
> and the other __iter__(). Normally, either of the two would not exist, so
> using the wrong loop on an object will just fail. However, after we passed
> that barrier, we already know that the object that was returned is supposed
> to obey to the expected protocol, so it doesn't matter whether we call
> __next__() or name it __anext__(), except that the second requires us to
> duplicate an existing protocol.
>
> This has nothing to do with implicit vs. explicit.
>
> Stefan
>
>


From ethan at stoneleaf.us  Fri May  1 21:24:04 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Fri, 1 May 2015 12:24:04 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
Message-ID: <20150501192404.GC8013@stoneleaf.us>

On 05/01, Guido van Rossum wrote:
> On Fri, May 1, 2015 at 11:26 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

>> So does this mean that yield should NOT be used just to yield control
>> if a task isn't blocked?  (e.g., if its next step is likely to be
>> long, or low priority.)  Or even that it wouldn't be considered a
>> co-routine in the python sense?
>>
> 
> I'm not sure what you're talking about. Does "next step" refer to something
> in the current stack frame or something that you're calling? None of the
> current uses of "yield" (the keyword) in Python are good for lowering
> priority of something. It's not just the GIL, it's that coroutines (by
> whatever name) are still single-threaded. If you have something
> long-running CPU-intensive you should probably run it in a background
> thread (or process) e.g. using an executor.

So when a generator is used as an iterator, yield and yield from are used
to produce the actual working values...

But when a generator is used as a coroutine, yield (and yield from?) are
used to provide context about when they should be run again?

--
~Ethan~

From yselivanov.ml at gmail.com  Fri May  1 21:24:36 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 15:24:36 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <20150501191937.GB8013@stoneleaf.us>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
Message-ID: <5543D2F4.3060207@gmail.com>

On 2015-05-01 3:19 PM, Ethan Furman wrote:
>> Sure, but the difference is that one would have called __aiter__() first
>> >and the other __iter__(). Normally, either of the two would not exist, so
>> >using the wrong loop on an object will just fail. However, after we passed
>> >that barrier, we already know that the object that was returned is supposed
>> >to obey to the expected protocol, so it doesn't matter whether we call
>> >__next__() or name it __anext__(), except that the second requires us to
>> >duplicate an existing protocol.
> If we must have __aiter__, then we may as well also have __anext__; besides
> being more consistent, it also allows an object to be both a normol iterator
> and an asynch iterator.

And this is a good point too.

Thanks,
Yury

From jimjjewett at gmail.com  Fri May  1 21:48:51 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 1 May 2015 15:48:51 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
Message-ID: <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>

On Fri, May 1, 2015 at 2:59 PM, Guido van Rossum <guido at python.org> wrote:
> On Fri, May 1, 2015 at 11:26 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:
>>
>> On Thu, Apr 30, 2015 at 3:32 PM, Guido van Rossum <guido at python.org>
>> wrote:
>>

>> (Guido:)> Actually that's not even wrong. When using generators as
>> coroutines, PEP 342
>> > style, "yield" means "I am blocked waiting for a result that the I/O
>> > multiplexer is eventually going to produce".

>> So does this mean that yield should NOT be used just to yield control
>> if a task isn't blocked?  (e.g., if its next step is likely to be
>> long, or low priority.)  Or even that it wouldn't be considered a
>> co-routine in the python sense?

> I'm not sure what you're talking about. Does "next step" refer to something
> in the current stack frame or something that you're calling?

The next piece of your algorithm.

> None of the
> current uses of "yield" (the keyword) in Python are good for lowering
> priority of something.

If there are more tasks than executors, yield is a way to release your
current executor and go to the back of the line.  I'm pretty sure I
saw several examples of that style back when coroutines were first
discussed.

-jJ

From ron3200 at gmail.com  Fri May  1 21:49:26 2015
From: ron3200 at gmail.com (Ron Adam)
Date: Fri, 01 May 2015 15:49:26 -0400
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <20150501115447.GJ5663@ando.pearwood.info>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info>
Message-ID: <mi0lc7$1eh$1@ger.gmane.org>



On 05/01/2015 07:54 AM, Steven D'Aprano wrote:
> On Wed, Apr 29, 2015 at 07:31:22PM -0700, Guido van Rossum wrote:
>
>> >Ah, but here's the other clever bit: it's only interpreted this way
>> >*inside*  a function declared with 'async def'. Outside such functions,
>> >'await' is not a keyword, so that grammar rule doesn't trigger. (Kind of
>> >similar to the way that the print_function __future__ disables the
>> >keyword-ness of 'print', except here it's toggled on or off depending on
>> >whether the nearest surrounding scope is 'async def' or not. The PEP could
>> >probably be clearer about this; it's all hidden in the Transition Plan
>> >section.)
> You mean we could write code like this?
>
> def await(x):
>      ...
>
>
> if condition:
>      async def spam():
>          await (eggs or cheese)
> else:
>      def spam():
>          await(eggs or cheese)
>
>
> I must admit that's kind of cool, but I'm sure I'd regret it.


Actually in the above...

    def await(x):
        return x

Then in any 'async def' scope, the 'await' keyword will mask the await()
function.


Are the following correct?


Another useful async function might be...

    async def yielding():
        pass

If a routine is taking a very long time, just inserting "await yielding()" into
the long calculation would let other awaitables run.


If the async loop only has one coroutine (awaitable) in it, then it will be 
just like calling a regular function.  No waiting would occur.


-Ron


From guido at python.org  Fri May  1 21:59:41 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 12:59:41 -0700
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <mi0lc7$1eh$1@ger.gmane.org>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info> <mi0lc7$1eh$1@ger.gmane.org>
Message-ID: <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>

On Fri, May 1, 2015 at 12:49 PM, Ron Adam <ron3200 at gmail.com> wrote:

>
> Another useful async function might be...
>
>    async def yielding():
>        pass
>
> In a routine is taking very long time, just inserting "await yielding()"
> in the long calculation would let other awaitables run.
>
That's really up to the scheduler, and a function like this should be
provided by the event loop or scheduler framework you're using.

>
> If the async loop only has one coroutine (awaitable) in it, then it will
> be just like calling a regular function.  No waiting would occur.
>

-- 
--Guido van Rossum (python.org/~guido)

From yselivanov.ml at gmail.com  Fri May  1 22:06:27 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 16:06:27 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5543D2CB.7020202@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <5543D2CB.7020202@gmail.com>
Message-ID: <5543DCC3.7040907@gmail.com>

On 2015-05-01 3:23 PM, Yury Selivanov wrote:
> Let's say it this way: I want to know what I am looking at
> when I browse through the code -- an asynchronous iterator,
> or a normal iterator.  I want an explicit difference between
> these protocols, because they are different.
>
> Moreover, the below code is a perfectly valid, infinite
> iterable:
>
>     class SomeIterable:
>          def __iter__(self):
>              return self
>          async def __next__(self):
>              return 'spam'
>
> I'm strong -1 on this idea.
>

To further clarify on the example:

     class SomeIterable:
          def __iter__(self):
              return self
          async def __aiter__(self):
              return self
          async def __next__(self):
              print('hello')
              raise StopAsyncIteration


If you pass this to 'async for' you will get
'hello' printed and the loop will be over.

If you pass this to 'for', you will get an
infinite loop, because '__next__' will return a
coroutine object (which would also have to be
awaited, but won't be, because it's a plain 'for'
statement).

This is something that we shouldn't let happen.
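For contrast, here is how a well-formed asynchronous iterator interacts
with the 'async for' loop, with the loop's expansion also written out by
hand. This is only a sketch on current Python (where __aiter__ returns
the iterator directly), and manual_async_for is a hypothetical
hand-written equivalent of the statement:

```python
import asyncio

class Ticker:
    """Minimal async iterator: produces two values, then stops."""

    def __init__(self):
        self._n = 0

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self._n >= 2:
            raise StopAsyncIteration
        self._n += 1
        return self._n

async def manual_async_for(aiterable):
    # Rough desugaring of "async for x in aiterable": each __anext__()
    # result is awaited, and StopAsyncIteration ends the loop.
    it = aiterable.__aiter__()
    out = []
    while True:
        try:
            x = await it.__anext__()
        except StopAsyncIteration:
            break
        out.append(x)
    return out

print(asyncio.run(manual_async_for(Ticker())))  # [1, 2]
```

A plain 'for' never performs the 'await' step in this expansion, which is
exactly why handing it coroutine-returning methods goes wrong.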

Yury

From guido at python.org  Fri May  1 22:08:12 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 13:08:12 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <20150501192404.GC8013@stoneleaf.us>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
 <20150501192404.GC8013@stoneleaf.us>
Message-ID: <CAP7+vJLD5WdJ8HGd6Tnnn-qx6a9fF=mLn-VVJfmTnysGyvb3tg@mail.gmail.com>

On Fri, May 1, 2015 at 12:24 PM, Ethan Furman <ethan at stoneleaf.us> wrote:

> On 05/01, Guido van Rossum wrote:
> > On Fri, May 1, 2015 at 11:26 AM, Jim J. Jewett <jimjjewett at gmail.com>
> wrote:
>
> >> So does this mean that yield should NOT be used just to yield control
> >> if a task isn't blocked?  (e.g., if its next step is likely to be
> >> long, or low priority.)  Or even that it wouldn't be considered a
> >> co-routine in the python sense?
> >>
> >
> > I'm not sure what you're talking about. Does "next step" refer to
> something
> > in the current stack frame or something that you're calling? None of the
> > current uses of "yield" (the keyword) in Python are good for lowering
> > priority of something. It's not just the GIL, it's that coroutines (by
> > whatever name) are still single-threaded. If you have something
> > long-running CPU-intensive you should probably run it in a background
> > thread (or process) e.g. using an executor.
>
> So when a generator is used as an iterator, yield and yield from are used
> to produce the actual working values...
>
> But when a generator is used as a coroutine, yield (and yield from?) are
> used to provide context about when they should be run again?
>

The common thing is that the *argument* to yield provides info to
whoever/whatever is on the other end, and the *return value* from yield
[from] is whatever they returned in response.

When using yield to implement an iterator, there is no return value from
yield -- the other end is the for-loop that calls __next__, and it just
says "give me the next value", and the value passed to yield is that next
value.

When using yield [from] to implement a coroutine the other end is probably
a trampoline or scheduler or multiplexer. The argument to yield [from]
tells the scheduler what you are waiting for. The scheduler resumes the
coroutine when that value is available.

At this point please go read Greg Ewing's tutorial. Seriously.
http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yield_from.html

Note that when using yield from, there is a third player: the coroutine
that contains the "yield from". This is neither the scheduler nor the other
thing; the communication between the scheduler and the other thing passes
transparently *through* this coroutine. When the other thing has a value
for this coroutine, it uses *return* to send it a value. The "other thing"
here is a lower-level coroutine -- it could either itself also use
yield-from and return, or it could be an "I/O primitive" that actually
gives the scheduler a specific instruction (e.g. wait until this socket
becomes readable).
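
A toy version of this three-player arrangement, with all names
hypothetical and the "I/O" entirely pretend, might look like:

```python
def io_primitive():
    # The lower-level generator: its yield hands an instruction to the
    # scheduler, and the scheduler's reply becomes the value of 'yield',
    # which we return to whoever delegated to us.
    reply = yield "wait-for-socket"
    return reply

def coroutine():
    # The middle player: 'yield from' passes the instruction through
    # transparently; the primitive's return value lands in 'data'.
    data = yield from io_primitive()
    return "got " + data

def run(coro):
    # Toy scheduler: answers every instruction with canned data.
    try:
        instruction = next(coro)
        while True:
            assert instruction == "wait-for-socket"
            instruction = coro.send("bytes")
    except StopIteration as e:
        # A coroutine's 'return' surfaces as StopIteration.value.
        return e.value

print(run(coroutine()))  # got bytes
```

The scheduler only ever talks to the primitive's yield; the middle
coroutine never sees the instruction, only the returned result.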

Please do read Greg's tutorial.

-- 
--Guido van Rossum (python.org/~guido)

From guido at python.org  Fri May  1 22:10:01 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 13:10:01 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
 <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>
Message-ID: <CAP7+vJLjX_kU7RwyF42Vw0DN2h9fvLpH9DdgFONm+1gyauh-8g@mail.gmail.com>

On Fri, May 1, 2015 at 12:48 PM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

> If there are more tasks than executors, yield is a way to release your
> current executor and go to the back of the line.  I'm pretty sure I
> saw several examples of that style back when coroutines were first
> discussed.
>

Could you dig up the actual references? It seems rather odd to me to mix
coroutines and threads this way.

-- 
--Guido van Rossum (python.org/~guido)

From solipsis at pitrou.net  Fri May  1 22:22:59 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 1 May 2015 22:22:59 +0200
Subject: [Python-Dev] PEP 492: What is the real goal?
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
 <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>
 <CAP7+vJLjX_kU7RwyF42Vw0DN2h9fvLpH9DdgFONm+1gyauh-8g@mail.gmail.com>
Message-ID: <20150501222259.67790f8a@fsol>

On Fri, 1 May 2015 13:10:01 -0700
Guido van Rossum <guido at python.org> wrote:
> On Fri, May 1, 2015 at 12:48 PM, Jim J. Jewett <jimjjewett at gmail.com> wrote:
> 
> > If there are more tasks than executors, yield is a way to release your
> > current executor and go to the back of the line.  I'm pretty sure I
> > saw several examples of that style back when coroutines were first
> > discussed.
> >
> 
> Could you dig up the actual references? It seems rather odd to me to mix
> coroutines and threads this way.

I think Jim is saying that when you have a non-trivial task running
in the event loop, you can "yield" from time to time to give other
events (e.g. network events or timeouts) a chance to be processed
in a timely fashion.

Of course, that assumes the event loop will somehow prioritize them
over the just-yielded task.

Regards

Antoine.



From yselivanov.ml at gmail.com  Fri May  1 22:27:35 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 16:27:35 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
References: <554185C2.5080003@gmail.com>	<mhvs5p$tg8$1@ger.gmane.org>	<CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>	<mi0c1e$819$1@ger.gmane.org>	<5543CB61.2080905@gmail.com>	<mi0im0$lo5$1@ger.gmane.org>	<20150501191937.GB8013@stoneleaf.us>	<5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
Message-ID: <5543E1B7.6010804@gmail.com>

On 2015-05-01 4:24 PM, Arnaud Delobelle wrote:
> On 1 May 2015 at 20:24, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>> On 2015-05-01 3:19 PM, Ethan Furman wrote:
> [...]
>>> If we must have __aiter__, then we may as well also have __anext__;
>>> besides
>>> being more consistent, it also allows an object to be both a normol
>>> iterator
>>> and an asynch iterator.
>>
>> And this is a good point too.
> I'm not convinced that allowing an object to be both a normal and an
> async iterator is a good thing.  It could be a recipe for confusion.
>


I doubt that it will be a popular thing.  But disallowing this
by merging two different protocols into one isn't a good idea
either.

Yury

From guido at python.org  Fri May  1 22:32:39 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 1 May 2015 13:32:39 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <20150501222259.67790f8a@fsol>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
 <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>
 <CAP7+vJLjX_kU7RwyF42Vw0DN2h9fvLpH9DdgFONm+1gyauh-8g@mail.gmail.com>
 <20150501222259.67790f8a@fsol>
Message-ID: <CAP7+vJJVGvZ5wrK6z7cNqgARruKa1pgixu6VPfQnbpKtf-o+JQ@mail.gmail.com>

On Fri, May 1, 2015 at 1:22 PM, Antoine Pitrou <solipsis at pitrou.net> wrote:

> On Fri, 1 May 2015 13:10:01 -0700
> Guido van Rossum <guido at python.org> wrote:
> > On Fri, May 1, 2015 at 12:48 PM, Jim J. Jewett <jimjjewett at gmail.com>
> wrote:
> >
> > > If there are more tasks than executors, yield is a way to release your
> > > current executor and go to the back of the line.  I'm pretty sure I
> > > saw several examples of that style back when coroutines were first
> > > discussed.
> > >
> >
> > Could you dig up the actual references? It seems rather odd to me to mix
> > coroutines and threads this way.
>
> I think Jim is saying that when you have a non-trivial task running
> in the event loop, you can "yield" from time to time to give a chance
> to other events (e.g. network events or timeouts) to be processed
> timely.
>
> Of course, that assumes the event loop will somehow prioritize them over
> the just yielded task.
>

Yeah, but (unlike some frameworks) when using asyncio you can't just put a
plain "yield" statement in your code. You'd have to do something like
`yield from asyncio.sleep(0)`.
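Under the new syntax being discussed, the same cooperative-yield pattern looks roughly like this (a sketch with made-up task names; `asyncio.sleep(0)` is the real escape hatch Guido mentions):

```python
import asyncio

log = []

async def busy():
    # A non-trivial task that cooperatively yields between chunks of work.
    for i in range(2):
        log.append(('busy', i))
        await asyncio.sleep(0)  # hand control back to the event loop

async def other():
    # A task that would otherwise starve behind 'busy'.
    log.append('other')

async def main():
    await asyncio.gather(busy(), other())

asyncio.run(main())
```

The sleep(0) lets 'other' run in between the two chunks of 'busy'.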

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/11792d2e/attachment-0001.html>

From jimjjewett at gmail.com  Fri May  1 23:23:57 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 1 May 2015 17:23:57 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAP7+vJLjX_kU7RwyF42Vw0DN2h9fvLpH9DdgFONm+1gyauh-8g@mail.gmail.com>
References: <55411D9F.2050400@gmail.com>
 <55427773.ed89340a.5bd4.0711@mx.google.com>
 <CAP7+vJLmMsRj1LCfyqgK5Cv4-fqpjwnmEgev=QfNKJesJEkGJA@mail.gmail.com>
 <CA+OGgf5jSxgza0Zf_1SZ2g9pTWY=c52TuwH7-PHYzA3_pXV8pA@mail.gmail.com>
 <CAP7+vJJOvGFJPnTbNLJDW+WCnTSW7hTYxvVQ8kZEXdEBoS1jYQ@mail.gmail.com>
 <CA+OGgf46Gg+DqMf+_LP1oLJbmtWoHYfq88Hkak_VO-si6ag1cQ@mail.gmail.com>
 <CAP7+vJLjX_kU7RwyF42Vw0DN2h9fvLpH9DdgFONm+1gyauh-8g@mail.gmail.com>
Message-ID: <CA+OGgf7qwGi+BmXnDE_-Lvj_he8G0VTqnNo1hj9Mv1d4x=gYXA@mail.gmail.com>

On Fri, May 1, 2015 at 4:10 PM, Guido van Rossum <guido at python.org> wrote:
> On Fri, May 1, 2015 at 12:48 PM, Jim J. Jewett <jimjjewett at gmail.com> wrote:
>>
>> If there are more tasks than executors, yield is a way to release your
>> current executor and go to the back of the line.  I'm pretty sure I
>> saw several examples of that style back when coroutines were first
>> discussed.

> Could you dig up the actual references? It seems rather odd to me to mix
> coroutines and threads this way.

I can try in a few days, but the primary case (and perhaps the only
one with running code) was for n_executors=1.  They assumed there
would only be a single thread, or at least only one that was really
important to the event loop -- the pattern was often described as an
alternative to relying on threads.

FWIW, Ron Adam's "yielding"  in
https://mail.python.org/pipermail/python-dev/2015-May/139762.html is
in the same spirit.

You replied it would be better if that were done by calling some
method on the scheduling loop, but that isn't any more standard, and
the yielding function is simple enough that it will be reinvented.

-jJ

From jimjjewett at gmail.com  Fri May  1 23:37:11 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 01 May 2015 14:37:11 -0700 (PDT)
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5542820D.1030504@gmail.com>
Message-ID: <5543f207.d3268c0a.1e7b.6dcc@mx.google.com>


On Thu Apr 30 21:27:09 CEST 2015, Yury Selivanov replied:


On 2015-04-30 2:41 PM, Jim J. Jewett wrote:

>> Bad phrasing on my part.  Is there anything that prevents an
>> asynchronous call (or waiting for one) without the "async with"?

>> If so, I'm missing something important.  Either way, I would
>> prefer different wording in the PEP.

> Yes, you can't use 'yield from' in __exit__/__enter__
> in current Python.

I tried it in 3.4, and it worked.

I'm not sure it would ever be sensible, but it didn't raise any
errors, and it did run.

What do you mean by "can't use"?


>>> For coroutines in PEP 492:
>>> __await__ = __anext__ is the same as __call__ = __next__
>>> __await__ = __aiter__ is the same as __call__ = __iter__

>> That tells me that it will be OK sometimes, but will usually
>> be either a mistake or an API problem -- and it explains why.

>> Please put those 3 lines in the PEP.

> There is a line like that:
> https://www.python.org/dev/peps/pep-0492/#await-expression
> Look for "Also, please note..." line.

It was from reading the PEP that the question came up, and I
just reread that section.

Having those 3 explicit lines goes a long way towards explaining
how an asyncio coroutine differs from a regular callable, in a
way that the existing PEP doesn't, at least for me.


>>> This is OK. The point is that you can use 'await log' in
>>> __aenter__.  If you don't need awaits in __aenter__ you can
>>> use them in __aexit__. If you don't need them there too,
>>> then just define a regular context manager.

>> Is it an error to use "async with" on a regular context manager?
>> If so, why?  If it is just that doing so could be misleading,
>> then what about "async with mgr1, mgr2, mgr3" -- is it enough
>> that one of the three might suspend itself?

> 'with' requires an object with __enter__ and __exit__

> 'async with' requires an object with __aenter__ and __aexit__

> You can have an object that implements both interfaces.

I'm still not seeing why with (let alone async with) can't
just run whichever one it finds.  "async with" won't actually let
the BLOCK run until the future is resolved.  So if a context
manager only supplies __enter__ instead of __aenter__, then at most
you've lost a chance to switch tasks while waiting -- and that is no
worse than if the context manager just happened to be really slow.


>>>>> For debugging these kinds of mistakes there is a special debug mode in

>> Is the intent to do anything more than preface execution with:

>> import asyncio.coroutines
>> asyncio.coroutines._DEBUG = True

> This won't work, unfortunately.  You need to set the
> debug flag *before* you import asyncio package (otherwise
> we would have an unavoidable performance cost for debug
> features).  If you enable it after you import asyncio,
> then asyncio itself won't be instrumented.  Please
> see the implementation of asyncio.coroutine for details.

Why does asyncio itself have to be wrapped?  Is that really something
normal developers need to debug, or is it only for developing the
stdlib itself?  If it is only for developing the stdlib, then I
would rather see workarounds like shoving _DEBUG into builtins
when needed, as opposed to adding multiple attributes to sys.


-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From yselivanov.ml at gmail.com  Fri May  1 23:58:26 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 01 May 2015 17:58:26 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5543f207.d3268c0a.1e7b.6dcc@mx.google.com>
References: <5543f207.d3268c0a.1e7b.6dcc@mx.google.com>
Message-ID: <5543F702.3090600@gmail.com>



On 2015-05-01 5:37 PM, Jim J. Jewett wrote:
> On Thu Apr 30 21:27:09 CEST 2015, Yury Selivanov replied:
>
>
> On 2015-04-30 2:41 PM, Jim J. Jewett wrote:
>
>>> Bad phrasing on my part.  Is there anything that prevents an
>>> asynchronous call (or waiting for one) without the "async with"?
>>> If so, I'm missing something important.  Either way, I would
>>> prefer different wording in the PEP.
>> Yes, you can't use 'yield from' in __exit__/__enter__
>> in current Python.
> I tried it in 3.4, and it worked.
>
> I'm not sure it would ever be sensible, but it didn't raise any
> errors, and it did run.
>
> What do you mean by "can't use"?

It probably executed without errors, but it didn't run the
generators.


import asyncio

class Foo:
    def __enter__(self):
        # Calling a generator function only builds a generator object;
        # the body, including print('spam'), never runs.
        yield from asyncio.sleep(0)
        print('spam')
    def __exit__(self, *exc):
        return False

with Foo(): pass # <- 'spam' won't ever be printed.

>
>
>>>> For coroutines in PEP 492:
>>>> __await__ = __anext__ is the same as __call__ = __next__
>>>> __await__ = __aiter__ is the same as __call__ = __iter__
>>> That tells me that it will be OK sometimes, but will usually
>>> be either a mistake or an API problem -- and it explains why.
>>> Please put those 3 lines in the PEP.
>> There is a line like that:
>> https://www.python.org/dev/peps/pep-0492/#await-expression
>> Look for "Also, please note..." line.
> It was from reading the PEP that the question came up, and I
> just reread that section.
>
> Having those 3 explicit lines goes a long way towards explaining
> how an asyncio coroutine differs from a regular callable, in a
> way that the existing PEP doesn't, at least for me.
>
>
>>>> This is OK. The point is that you can use 'await log' in
>>>> __aenter__.  If you don't need awaits in __aenter__ you can
>>>> use them in __aexit__. If you don't need them there too,
>>>> then just define a regular context manager.
>>> Is it an error to use "async with" on a regular context manager?
>>> If so, why?  If it is just that doing so could be misleading,
>>> then what about "async with mgr1, mgr2, mgr3" -- is it enough
>>> that one of the three might suspend itself?
>> 'with' requires an object with __enter__ and __exit__
>> 'async with' requires an object with __aenter__ and __aexit__
>> You can have an object that implements both interfaces.
> I'm still not seeing why with (let alone async with) can't
> just run whichever one it finds.  "async with" won't actually let
> the BLOCK run until the future is resolved.  So if a context
> manager only supplies __enter__ instead of __aenter__, then at most
> you've lost a chance to switch tasks while waiting -- and that is no
> worse than if the context manager just happened to be really slow.

let's say you have a function:

def foo():
    with Ctx(): pass


if Ctx.__enter__ is a generator/coroutine, then
foo becomes a generator/coroutine (otherwise how
(and to what) would you yield from/await on __enter__?).
And then suddenly calling 'foo' doesn't do anything
(it will return you a generator/coroutine object).

This isn't transparent or even remotely
understandable.
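A sketch, with hypothetical names, of what that would mean:

```python
def enter():
    # Pretend this is a Ctx.__enter__ that needs to suspend.
    yield 'waiting'
    return 'resource'

def foo():
    # If 'with' transparently did 'yield from __enter__', foo itself
    # would have to contain that yield, silently turning it into a
    # generator function.
    resource = yield from enter()
    return resource

obj = foo()            # foo's body does not run at all here
first = next(obj)      # someone must now drive the generator by hand
try:
    obj.send(None)
    final = None
except StopIteration as e:
    final = e.value    # only now does foo's body finish
```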


>
>
>>>>>> For debugging these kinds of mistakes there is a special debug mode in
>>> Is the intent to do anything more than preface execution with:
>>> import asyncio.coroutines
>>> asyncio.coroutines._DEBUG = True
>> This won't work, unfortunately.  You need to set the
>> debug flag *before* you import asyncio package (otherwise
>> we would have an unavoidable performance cost for debug
>> features).  If you enable it after you import asyncio,
>> then asyncio itself won't be instrumented.  Please
>> see the implementation of asyncio.coroutine for details.
> Why does asyncio itself have to be wrapped?  Is that really something
> normal developers need to debug, or is it only for developing the
> stdlib itself?  If it is only for developing the stdlib, then I
> would rather see workarounds like shoving _DEBUG into builtins
> when needed, as opposed to adding multiple attributes to sys.


Yes, normal developers need asyncio to be instrumented,
otherwise you won't know what you did wrong when you
called some asyncio code without 'await' for example.
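(For reference, in released Pythons this switch is the PYTHONASYNCIODEBUG environment variable, which likewise has to be set before the event loop is created; a sketch:)

```python
import os
os.environ['PYTHONASYNCIODEBUG'] = '1'  # must be set before the loop exists

import asyncio

loop = asyncio.new_event_loop()
debug = loop.get_debug()  # the loop read the flag at creation time
loop.close()
```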


Yury

From Steve.Dower at microsoft.com  Fri May  1 17:25:18 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Fri, 1 May 2015 15:25:18 +0000
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CACac1F8JBzskNc=J2edcNU-T45edYHCREN8Ze6b-A1pWEABBDw@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <5541EECD.2020903@canterbury.ac.nz>,
 <CACac1F8JBzskNc=J2edcNU-T45edYHCREN8Ze6b-A1pWEABBDw@mail.gmail.com>
Message-ID: <BY1PR03MB14661B487D1F28B2149C5E58F5D50@BY1PR03MB1466.namprd03.prod.outlook.com>

The high-level answer is that, like yield, the function temporarily returns all the way up the stack to the caller who intends to advance the iterator/async function. This is typically the event loop, and the main confusion here comes when the loop is implicit.

If you explicitly define an event loop, it's roughly equivalent to the for loop that is calling next on a generator. For pip, I expect that's what you'd have - a blocking function ("do_work()"?, "execute_plan()"?) that creates a loop and starts its tasks running within it. Each task inside that call will be asynchronous with respect to each other (think about passing generators to zip()) but the loop will block the rest of your code until they're all finished.
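A toy version of that implicit loop, with made-up names, makes the analogy concrete:

```python
log = []

def task(name, steps):
    # Each task does a chunk of work, then suspends to let the others run.
    for i in range(steps):
        log.append((name, i))
        yield

def run_all(tasks):
    # The blocking "event loop": advance each task in turn via next(),
    # much as zip() interleaves its generators, until all are finished.
    pending = list(tasks)
    while pending:
        for t in list(pending):
            try:
                next(t)
            except StopIteration:
                pending.remove(t)

run_all([task('a', 2), task('b', 3)])
```

The tasks interleave with respect to each other, but run_all itself blocks until all of them are done.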

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Paul Moore<mailto:p.f.moore at gmail.com>
Sent: 4/30/2015 2:07
To: Greg Ewing<mailto:greg.ewing at canterbury.ac.nz>
Cc: Python Dev<mailto:python-dev at python.org>
Subject: Re: [Python-Dev] PEP 492 quibble and request

On 30 April 2015 at 09:58, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Ethan Furman wrote:
>>
>> Having gone through the PEP again, I am still no closer to understanding
>> what happens here:
>>
>>   data = await reader.read(8192)
>>
>> What does the flow of control look like at the interpreter level?
>
>
> Are you sure you *really* want to know? For the sake
> of sanity, I recommend ignoring the actual control
> flow and pretending that it's just like
>
>    data = reader.read(8192)
>
> with the reader.read() method somehow able to be
> magically suspended.

Well, if I don't know, I get confused as to where I invoke the event
loop, how my non-async code runs alongside this etc.
Paul
_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/de99b030/attachment.html>

From v+python at g.nevcal.com  Sat May  2 04:55:24 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Fri, 01 May 2015 19:55:24 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJJKGjHJRo9gycadhnhdsxmNTDw2yLb-uFFNnBESGKfDXg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvsqb$85j$1@ger.gmane.org>
 <CAP7+vJ+GwN-aYgo0Ud-_ihoejmdTpjy=B0qpR5MZr5SBWhTDog@mail.gmail.com>
 <CAO-CpE+nqJu7JkRj1TKfVACwOuF9Wc=xSDx2SRaL5F8ZZup4Yw@mail.gmail.com>
 <CAP7+vJJKGjHJRo9gycadhnhdsxmNTDw2yLb-uFFNnBESGKfDXg@mail.gmail.com>
Message-ID: <55443C9C.60107@g.nevcal.com>

On 5/1/2015 9:59 AM, Guido van Rossum wrote:
>
>     I think coroutine is the name of a concept, not a specific
>     implementation.
>
>     Cheers,
>
>  Cheers indeed! I agree that the *concept* is best called coroutine -- 
> and we have used this term ever since PEP 342. But when we're talking 
> specifics and trying to distinguish e.g. a function declared with 
> 'async def' from a regular function or from a regular generator 
> function, using 'async function' sounds right. And 'async method' if 
> it's a method.

Exactly. The async function/method is an implementation technique for a 
specific kind/subset of coroutine functionality.  So the term coroutine, 
qualified by a description of the best usage and limitations of async 
functions, can be used in defining async function, thus appealing to what 
people know or have heard of, vaguely understand, and can read more 
about in the literature.

A glossary entry for coroutine in the docs seems appropriate, which 
could point out the 16? ways to implement coroutine-style 
functionalities in Python, and perhaps make recommendations for 
different types of usage.

OK, not 16 ways, but it is 3 now, or 4?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/36835a37/attachment.html>

From stefan_ml at behnel.de  Sat May  2 06:54:32 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 02 May 2015 06:54:32 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5543CB61.2080905@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
Message-ID: <mi1la8$7t4$1@ger.gmane.org>

Yury Selivanov schrieb am 01.05.2015 um 20:52:
> I don't like the idea of combining __next__ and __anext__.

Ok, fair enough. So, how would you use this new protocol manually then?
Say, I already know that I won't need to await the next item that the
iterator will return. For normal iterators, I could just call next() on it
and continue the for-loop. How would I do it for AIterators?

Stefan



From arnodel at gmail.com  Fri May  1 22:24:47 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Fri, 1 May 2015 21:24:47 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5543D2F4.3060207@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
Message-ID: <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>

On 1 May 2015 at 20:24, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> On 2015-05-01 3:19 PM, Ethan Furman wrote:
[...]
>> If we must have __aiter__, then we may as well also have __anext__;
>> besides
>> being more consistent, it also allows an object to be both a normal
>> iterator
>> and an async iterator.
>
>
> And this is a good point too.

I'm not convinced that allowing an object to be both a normal and an
async iterator is a good thing.  It could be a recipe for confusion.

-- 
Arnaud

From arnodel at gmail.com  Sat May  2 11:15:48 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Sat, 2 May 2015 10:15:48 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5543E1B7.6010804@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
Message-ID: <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>

On 1 May 2015 at 21:27, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> On 2015-05-01 4:24 PM, Arnaud Delobelle wrote:
>>
>> On 1 May 2015 at 20:24, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>>>
>>> On 2015-05-01 3:19 PM, Ethan Furman wrote:
>>
>> [...]
>>>>
>>>> If we must have __aiter__, then we may as well also have __anext__;
>>>> besides
>>>> being more consistent, it also allows an object to be both a normal
>>>> iterator
>>>> and an async iterator.
>>>
>>>
>>> And this is a good point too.
>>
>> I'm not convinced that allowing an object to be both a normal and an
>> async iterator is a good thing.  It could be a recipe for confusion.
>>
>
>
> I doubt that it will be a popular thing.  But disallowing this
> by merging two different protocols in one isn't a good idea
> either.

I haven't been arguing for merging two different protocols.  I'm saying
that allowing an object to be both normal and async iterable is not an
argument for having separate protocols, because it's not a good thing.

Cheers,

-- 
Arnaud

From tritium-list at sdamon.com  Fri May  1 06:01:53 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Fri, 01 May 2015 00:01:53 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <CAPJVwBn-ouipE0cW61Lw_9VNOPDbSqwzdmuC1L-0U-iLegizdQ@mail.gmail.com>
References: <CAP7+vJK6GQ=rJ__KeE=H1cAES3gThdg4okmq+iFFSvE6SushBw@mail.gmail.com>
 <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <CAPJVwBm3E_oY12VbuBOMApSouVuMbAjqkN+sMoi3tSVYwt_Sgw@mail.gmail.com>
 <55416DBF.3020303@gmail.com>
 <CAPJVwBkhHMzxcECaEskU3TqhKPm5ws0JiAd9s_pPtehDtxJbVw@mail.gmail.com>
 <5541EE2A.1070109@canterbury.ac.nz>
 <CAPJVwB=_UbdQBAWE1AYrEDNcDkUSBM2Xsj_eRQhSiLD1dEtTjQ@mail.gmail.com>
 <CAP7+vJ+drN5ysFWrcAGZ-iEMR7Q+NsZmsfdzC15T7zi_MwD8pw@mail.gmail.com>
 <CAPJVwBn-ouipE0cW61Lw_9VNOPDbSqwzdmuC1L-0U-iLegizdQ@mail.gmail.com>
Message-ID: <5542FAB1.1000009@sdamon.com>

Out of curiosity, how much of a breaking change would making unary 
operators stack arbitrarily be?
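For reference, the precedence the PEP settled on can be checked with the ast module once the syntax exists (a sketch):

```python
import ast

# 'await x ** 2' parses as '(await x) ** 2': await binds tighter than **.
tree = ast.parse("async def f(x):\n    return await x ** 2")
ret = tree.body[0].body[0].value
tight = isinstance(ret, ast.BinOp) and isinstance(ret.left, ast.Await)

# ...while 'await -x' does not parse at all under this grammar.
try:
    ast.parse("async def f(x):\n    return await -x")
    rejected = False
except SyntaxError:
    rejected = True
```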


On 4/30/2015 23:57, Nathaniel Smith wrote:
>
> On Apr 30, 2015 8:40 PM, "Guido van Rossum" <guido at python.org 
> <mailto:guido at python.org>> wrote:
> >
> > On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith <njs at pobox.com 
> <mailto:njs at pobox.com>> wrote:
> >>
> >> The actual effect of making "await" a different precedence is to 
> resolve the ambiguity in
> >>
> >>   await x ** 2
> >>
> >> If await acted like -, then this would be
> >>   await (x ** 2)
> >> But with the proposed grammar, it's instead
> >>   (await x) ** 2
> >> Which is probably correct, and produces the IMHO rather nice 
> invariant that "await" binds more tightly than arithmetic in general 
> (instead of having to say that it binds more tightly than arithmetic 
> *except* in this one corner case...)
> >
> > Correct.
> >>
> >> AFAICT these and the ** case are the only expressions where there's 
> any difference between Yury's proposed grammar and your proposal of 
> treating await like unary minus. But then given the limitations of 
> Python's parser plus the desire to disambiguate the expression above 
> in the given way, it becomes an arguably regrettable, yet inevitable, 
> consequence that
> >>
> >>   await -fut
> >>   await +fut
> >>   await ~fut
> >> become parse errors.
> >
> >  Why is that regrettable? Do you have a plan for overloading one of 
> those on Futures? I personally consider it a feature that you can't do 
> that. :-)
>
> I didn't say it was regrettable, I said it was arguably regrettable. 
> For proof, see the last week of python-dev ;-).
>
> (I guess all else being equal it would be nice if unary operators 
> could stack arbitrarily, since that really is the more natural parse 
> rule IMO and also if things had worked that way then I would have 
> spent this thread less confused. But this is a pure argument from 
> elegance. In practice there's obviously no good reason to write "await 
> -fut" or "-not x", so meh, whatever.)
>
> -n
>
>
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/a24d73d0/attachment.html>

From projetmbc at gmail.com  Fri May  1 21:30:35 2015
From: projetmbc at gmail.com (Christophe Bal)
Date: Fri, 1 May 2015 21:30:35 +0200
Subject: [Python-Dev] Sub-claasing pathlib.Path seems impossible
Message-ID: <CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A@mail.gmail.com>

Hello.

In this post
<http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
I have noticed a problem with the following code.

from pathlib import Path
> class PPath(Path):
>     def __init__(self, *args, **kwargs):
>         super().__init__(*args, **kwargs)
>
> test = PPath("dir", "test.txt")
>
>
This gives the following error message.



> Traceback (most recent call last):
>   File "/Users/projetmbc/test.py", line 14, in <module>
>     test = PPath("dir", "test.txt")
>   File "/anaconda/lib/python3.4/pathlib.py", line 907, in __new__
>     self = cls._from_parts(args, init=False)
>   File "/anaconda/lib/python3.4/pathlib.py", line 589, in _from_parts
>     drv, root, parts = self._parse_args(args)
>   File "/anaconda/lib/python3.4/pathlib.py", line 582, in _parse_args
>     return cls._flavour.parse_parts(parts)AttributeError: type object 'PPath' has no attribute '_flavour'
>
>
This breaks the sub-classing from Python point of view. In the post
<http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
I give a hack to sub-class Path but it's a bit Unpythonic.
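One commonly suggested workaround is to derive from the concrete flavour class rather than Path itself, so that the inherited _flavour attribute exists (a sketch, not necessarily the exact hack from the post; a custom __init__ needs extra care):

```python
import pathlib

class PPath(type(pathlib.Path())):
    # type(pathlib.Path()) is PosixPath or WindowsPath, whichever matches
    # the running platform, and those concrete classes carry _flavour.
    pass

test = PPath("dir", "test.txt")
```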

*Christophe BAL*
*Enseignant de mathématiques en Lycée **et développeur Python amateur*
*---*
*French math teacher in a "Lycée" **and **Python **amateur developer*
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150501/53e1438f/attachment.html>

From stephen at xemacs.org  Sat May  2 18:02:54 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Sun, 03 May 2015 01:02:54 +0900
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
Message-ID: <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>

Adam Bartoš writes:

 > I'll describe my picture of the situation, which might be terribly wrong.
 > On Linux, in a typical situation, we have a UTF-8 terminal,
 > PYTHONENIOENCODING=utf-8, GNU readline is used. When the REPL wants input
 > from a user the tokenizer calls PyOS_Readline, which calls GNU readline.
 > The user is prompted >>> , during the input he can use autocompletion and
 > everything and he enters u'α'. PyOS_Readline returns b"u'\xce\xb1'" (as
 > char* or something),

It's char*, according to Parser/myreadline.c.  It is not str in Python
2.

 > which is UTF-8 encoded input from the user.

By default, it's just ASCII-compatible bytes.  I don't know offhand
where, but somehow PYTHONIOENCODING tells Python it's UTF-8 -- that's
how Python knows about it in this situation.

 > The tokenizer, parser, and evaluator process the input and the result is
 > u'\u03b1', which is printed as an answer.
 >
 > In my case I install custom sys.std* objects and a custom readline
 > hook.  Again, the tokenizer calls PyOS_Readline, which calls my
 > readline hook, which calls sys.stdin.readline(),

This is your custom version?

 > which returns an Unicode string a user entered (it was decoded from
 > UTF-16-LE bytes actually). My readline hook encodes this string to
 > UTF-8 and returns it. So the situation is the same.  The tokenizer
 > gets b"u'\xce\xb1'" as before, but now it results in u'\xce\xb1'.
 > 
 > Why is the result different?

The result is different because Python doesn't "learn" that the actual
encoding is UTF-8.  If you have tried setting PYTHONIOENCODING=utf-8
with your setup and that doesn't work, I'm not sure where the
communication is failing.

The only other thing I can think of is to set the encoding
sys.stdin.encoding.  That may be readonly, though (that would explain
why the only way to set the PYTHONIOENCODING is via an environment
variable).  At least you could find out what it is, with and without
PYTHONIOENCODING set to 'utf-8' (or maybe it's 'utf8' or 'UTF-8' --
all work as expected with unicode.encode/str.decode on Mac OS X).

Or it could be unimplemented in your replacement module.

 > I though that in the first case PyCF_SOURCE_IS_UTF8 might have been
 > set. And after your suggestion, I thought that
 > PYTHONIOENCODING=utf-8 is the thing that also sets
 > PyCF_SOURCE_IS_UTF8.

No.  PyCF_SOURCE_IS_UTF8 is set unconditionally in the functions
builtin_{eval,exec,compile}_impl in Python/bltinmodule.c in the cases that
matter AFAICS.  It's not obvious to me under what conditions it might
*not* be set.  It is then consulted in ast.c in PyAST_FromNodeObject,
and nowhere else that I can see.


From ethan at stoneleaf.us  Sat May  2 19:04:16 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Sat, 2 May 2015 10:04:16 -0700
Subject: [Python-Dev] PEP 492 and types.coroutine
Message-ID: <20150502170416.GD8013@stoneleaf.us>

According to https://www.python.org/dev/peps/pep-0492/#id31:

  The [types.coroutine] function applies CO_COROUTINE flag to
  generator-function's code object, making it return a coroutine
  object.

Further down in https://www.python.org/dev/peps/pep-0492/#id32:

   [await] uses the yield from implementation with an extra step of
   validating its argument. await only accepts an awaitable,
   which can be one of:

     ...

     - A generator-based coroutine object returned from a generator
       decorated with types.coroutine().

If I'm understanding this correctly, types.coroutine's only purpose is to
add a flag to a generator object so that await will accept it.

This raises the question of why can't await simply accept a generator
object?  There is no code change to the gen obj itself, there is no
behavior change in the gen obj, it's the exact same byte code, only a
flag is different.

types.coroutine feels a lot like unnecessary boilerplate.
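The flag's effect can be seen by driving the objects by hand (a sketch with made-up names):

```python
import types

@types.coroutine
def bridge():
    # A generator-based coroutine: 'await' accepts it solely because
    # types.coroutine set the relevant flag on its code object.
    value = yield 'suspended'
    return value

async def native():
    return await bridge()

c = native()
first = c.send(None)  # the inner yield propagates out through 'await'
try:
    c.send(99)
    final = None
except StopIteration as e:
    final = e.value   # the value returned by the awaited generator
```

Without the decorator, the `await bridge()` line raises a TypeError.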

--
~Ethan~

From yselivanov.ml at gmail.com  Sat May  2 19:41:18 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sat, 02 May 2015 13:41:18 -0400
Subject: [Python-Dev] PEP 492 and types.coroutine
In-Reply-To: <20150502170416.GD8013@stoneleaf.us>
References: <20150502170416.GD8013@stoneleaf.us>
Message-ID: <55450C3E.8020503@gmail.com>

On 2015-05-02 1:04 PM, Ethan Furman wrote:
> According to https://www.python.org/dev/peps/pep-0492/#id31:
>
>    The [types.coroutine] function applies CO_COROUTINE flag to
>    generator-function's code object, making it return a coroutine
>    object.
>
> Further down in https://www.python.org/dev/peps/pep-0492/#id32:
>
>     [await] uses the yield from implementation with an extra step of
>     validating its argument. await only accepts an awaitable,
>     which can be one of:
>
>       ...
>
>       - A generator-based coroutine object returned from a generator
>         decorated with types.coroutine().
>
> If I'm understanding this correctly, type.coroutine's only purpose is to add
> a flag to a generator object so that await will accept it.
>
> This raises the question of why can't await simply accept a generator
> object?  There is no code change to the gen obj itself, there is no
> behavior change in the gen obj, it's the exact same byte code, only a
> flag is different.
>

Because we don't want 'await' to accept random generators.
It can't do anything meaningful with them; in a world where
all asyncio code is written with the new syntax, passing a
generator to 'await' is just a bug.

'types.coroutine' is something that we need to ease the
transition to the new syntax.
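The rejection is directly observable; a minimal sketch (function names are illustrative):

```python
def plain_gen():
    yield 1

async def demo():
    # awaiting a plain, undecorated generator is rejected with a TypeError
    await plain_gen()
```

Driving `demo()` by hand with `send(None)` raises TypeError at the `await`.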

Yury

From ethan at stoneleaf.us  Sat May  2 20:14:04 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Sat, 2 May 2015 11:14:04 -0700
Subject: [Python-Dev] PEP 492 and types.coroutine
In-Reply-To: <55450C3E.8020503@gmail.com>
References: <20150502170416.GD8013@stoneleaf.us> <55450C3E.8020503@gmail.com>
Message-ID: <20150502181404.GE8013@stoneleaf.us>

On 05/02, Yury Selivanov wrote:
> On 2015-05-02 1:04 PM, Ethan Furman wrote:

>> If I'm understanding this correctly, types.coroutine's only purpose is to add
>> a flag to a generator object so that await will accept it.
>> 
>> This raises the question of why can't await simply accept a generator
>> object?  There is no code change to the gen obj itself, there is no
>> behavior change in the gen obj, it's the exact same byte code, only a
>> flag is different.
> 
> Because we don't want 'await' to accept random generators.
> It can't do anything meaningful with them, in a world where
> all asyncio code is written with new syntax, passing generator
> to 'await' is just a bug.

And yet in current asyncio code, random generators can be accepted, and not
even the current asyncio.coroutine wrapper can guarantee that the generator
is a coroutine in fact.

For that matter, even the new types.coroutine cannot guarantee that the
returned object is a coroutine and not a generator -- so basically it's just
there to tell the compiler, "yeah, I really know what I'm doing, shut up and
do what I asked."

> 'types.coroutine' is something that we need to ease transition
> to the new syntax.

This doesn't make sense -- either the existing generators are correctly
returning coroutines, in which case the decorator adds nothing, or they
are returning non-coroutines, in which case the decorator adds nothing.

So either way, nothing has been added besides a mandatory boiler-plate
requirement.

--
~Ethan~

From drekin at gmail.com  Sat May  2 21:57:45 2015
From: drekin at gmail.com (=?UTF-8?B?QWRhbSBCYXJ0b8Wh?=)
Date: Sat, 2 May 2015 21:57:45 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>

I think I have found out where the problem is. In fact, the encoding of the
interactive input is determined by sys.stdin.encoding, but only in the case
that it is a file object (see
https://hg.python.org/cpython/file/d356e68de236/Parser/tokenizer.c#l890 and
the implementation of tok_stdin_decode). For example, by default on my
system sys.stdin has encoding cp852.

>>> u'á'
u'\xe1' # correct
>>> import sys; sys.stdin = "foo"
>>> u'á'
u'\xa0' # incorrect

Even if sys.stdin contained a file-like object with a proper encoding
attribute, it wouldn't work, since sys.stdin has to be an instance of <type
'file'>. So the question is whether it is possible to make a file instance
in Python that is also customizable, so it may call my code. For a start,
how does one change the value of the encoding attribute of a file object?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150502/7da2b9e0/attachment.html>

From yselivanov.ml at gmail.com  Sat May  2 22:10:12 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sat, 02 May 2015 16:10:12 -0400
Subject: [Python-Dev] PEP 492 and types.coroutine
In-Reply-To: <20150502181404.GE8013@stoneleaf.us>
References: <20150502170416.GD8013@stoneleaf.us> <55450C3E.8020503@gmail.com>
 <20150502181404.GE8013@stoneleaf.us>
Message-ID: <55452F24.1070408@gmail.com>



On 2015-05-02 2:14 PM, Ethan Furman wrote:
> On 05/02, Yury Selivanov wrote:
>> On 2015-05-02 1:04 PM, Ethan Furman wrote:
>>> If I'm understanding this correctly, types.coroutine's only purpose is to add
>>> a flag to a generator object so that await will accept it.
>>>
>>> This raises the question of why can't await simply accept a generator
>>> object?  There is no code change to the gen obj itself, there is no
>>> behavior change in the gen obj, it's the exact same byte code, only a
>>> flag is different.
>> Because we don't want 'await' to accept random generators.
>> It can't do anything meaningful with them, in a world where
>> all asyncio code is written with new syntax, passing generator
>> to 'await' is just a bug.
> And yet in current asyncio code, random generators can be accepted, and not
> even the current asyncio.coroutine wrapper can guarantee that the
> is a coroutine in fact.

This is a flaw in the current Python that we want to fix.
>
> For that matter, even the new types.coroutine cannot guarantee that the
> returned object is a coroutine and not a generator -- so basically it's just
> there to tell the compiler, "yeah, I really know what I'm doing, shut up and
> do what I asked."

Well, why would you use it on some generator that is not
a generator-based coroutine?
>
>> 'types.coroutine' is something that we need to ease transition
>> to the new syntax.
> This doesn't make sense -- either the existing generators are correctly
> returning coroutines, in which case the decorator adds nothing, or they
> are returning non-coroutines, in which case the decorator adds nothing.
>
> So either way, nothing has been added besides a mandatory boiler-plate
> requirement.

It's not nothing; it's backwards compatibility.  Please read
https://www.python.org/dev/peps/pep-0492/#await-expression

@types.coroutine marks a generator function so that the
generators it returns are awaitable.
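To make the mechanics concrete, here is a minimal sketch (function names are illustrative) of how the decorator bridges old-style and new-style code:

```python
import types

@types.coroutine
def legacy_coro():
    # old-style, generator-based coroutine: 'yield' is its suspension point
    result = yield
    return result

async def new_style():
    # legal only because legacy_coro is decorated with @types.coroutine
    return await legacy_coro()
```

Driving `new_style()` by hand with `send()` shows a value travelling in through the old-style yield and back out as the coroutine's return value.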

Yury

From ron3200 at gmail.com  Sat May  2 22:42:27 2015
From: ron3200 at gmail.com (Ron Adam)
Date: Sat, 02 May 2015 16:42:27 -0400
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>	<CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>	<CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>	<CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>	<CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>	<20150501115447.GJ5663@ando.pearwood.info>	<mi0lc7$1eh$1@ger.gmane.org>	<CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
 <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
Message-ID: <554536B3.2000606@gmail.com>

On 05/02/2015 04:18 PM, Arnaud Delobelle wrote:
> On 1 May 2015 at 20:59, Guido van Rossum<guido at python.org>  wrote:
>> >On Fri, May 1, 2015 at 12:49 PM, Ron Adam<ron3200 at gmail.com>  wrote:
>>> >>
>>> >>
>>> >>Another useful async function might be...
>>> >>
>>> >>    async def yielding():
>>> >>        pass
>>> >>
>>> >>If a routine is taking a very long time, just inserting "await yielding()"
>>> >>in the long calculation would let other awaitables run.
>>> >>
>> >That's really up to the scheduler, and a function like this should be
>> >provided by the event loop or scheduler framework you're using.
> Really?  I was under the impression that 'await yielding()' as defined
> above would actually not suspend the coroutine at all, therefore not
> giving any opportunity for the scheduler to resume another coroutine,
> and I thought I understood the PEP well enough.  Does this mean that
> somehow "await x" guarantees that the coroutine will suspend at least
> once?
>
> To me the async def above was the equivalent of the following in the
> 'yield from' world:
>
> def yielding():
>      return
>      yield # Just to make it a generator
>
> Then "yield from yielding()" will not yield at all - which makes its
> name rather misleading!

It was my understanding that yield from also suspends the current thread, 
allowing others to run.  Of course if it's the only thread, it would not.  
But maybe I'm misremembering earlier discussions.  If it doesn't suspend 
the current thread, then you need to put scheduler.sleep() calls throughout 
your co-routines.

I think Guido is saying it could be either.

Cheers,
    Ron





From guido at python.org  Sat May  2 23:09:51 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 2 May 2015 14:09:51 -0700
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info> <mi0lc7$1eh$1@ger.gmane.org>
 <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
 <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
Message-ID: <CAP7+vJJaLsUok3pVjsuAKe8CpHaBr2O3GXOAGmX=HoqPc=t8JA@mail.gmail.com>

On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle <arnodel at gmail.com> wrote:

> On 1 May 2015 at 20:59, Guido van Rossum <guido at python.org> wrote:
> > On Fri, May 1, 2015 at 12:49 PM, Ron Adam <ron3200 at gmail.com> wrote:
> >>
> >>
> >> Another useful async function might be...
> >>
> >>    async def yielding():
> >>        pass
> >>
> >> If a routine is taking a very long time, just inserting "await yielding()"
> >> in the long calculation would let other awaitables run.
> >>
> > That's really up to the scheduler, and a function like this should be
> > provided by the event loop or scheduler framework you're using.
>
> Really?  I was under the impression that 'await yielding()' as defined
> above would actually not suspend the coroutine at all, therefore not
> giving any opportunity for the scheduler to resume another coroutine,
> and I thought I understood the PEP well enough.  Does this mean that
> somehow "await x" guarantees that the coroutine will suspend at least
> once?
>

You're correct. That's why I said it should be left up to the framework --
ultimately what you *do* have to put in such a function has to be
understood by the framework. And that's why in other messages I've used
await asyncio.sleep(0) as an example. Look up its definition.
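The difference is easy to demonstrate by driving the coroutines by hand; a sketch (asyncio.sleep(0) bottoms out in roughly the sleep0 pattern below, but check the asyncio source for the real definition):

```python
import types

async def yielding():
    pass  # no suspension point: awaiting this runs it to completion inline

@types.coroutine
def sleep0():
    yield  # a bare yield is a genuine suspension point

async def outer():
    await yielding()   # does not suspend the coroutine
    await sleep0()     # suspends exactly once, here
    return "done"
```

Stepping `outer()` with `send(None)` runs straight past `yielding()` and stops only at the bare yield inside `sleep0()`.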


> To me the async def above was the equivalent of the following in the
> 'yield from' world:
>
> def yielding():
>     return
>     yield # Just to make it a generator
>
> Then "yield from yielding()" will not yield at all - which makes its
> name rather misleading!
>

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150502/6032f70a/attachment.html>

From ethan at stoneleaf.us  Sat May  2 23:31:37 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Sat, 2 May 2015 14:31:37 -0700
Subject: [Python-Dev] PEP 492 and types.coroutine
In-Reply-To: <55452F24.1070408@gmail.com>
References: <20150502170416.GD8013@stoneleaf.us> <55450C3E.8020503@gmail.com>
 <20150502181404.GE8013@stoneleaf.us> <55452F24.1070408@gmail.com>
Message-ID: <20150502213136.GF8013@stoneleaf.us>

On 05/02, Yury Selivanov wrote:
> On 2015-05-02 2:14 PM, Ethan Furman wrote:
>> On 05/02, Yury Selivanov wrote:
>>> On 2015-05-02 1:04 PM, Ethan Furman wrote:

>> And yet in current asyncio code, random generators can be accepted, and not
>> even the current asyncio.coroutine wrapper can gaurantee that the generator
>> is a coroutine in fact.

> This is a flaw in the current Python that we want to fix.

Your "fix" doesn't fix it.  I can decorate a non-coroutine generator with
type.coroutine and it will still be broken and a bug in my code.


>> For that matter, even the new types.coroutine cannot gaurantee that the
>> returned object is a coroutine and not a generator -- so basically it's just
>> there to tell the compiler, "yeah, I really know what I'm doing, shut up and
>> do what I asked."
> 
> Well, why would you use it on some generator that is not
> a generator-based coroutine?

I wouldn't, that would be a bug; but decorating a wrong type of generator is
still a bug, and types.coroutine has not fixed that bug.

It's worse than mandatory typing because it can't even check that what I have
declared is true.


>> So either way, nothing has been added besides a mandatory boiler-plate
>> requirement.
> 
> It's not nothing; it's backwards compatibility.  Please read
> https://www.python.org/dev/peps/pep-0492/#await-expression

I have read it, more than once.  If you lift the (brand-new) requirement that a
generator must be decorated, then types.coroutine becomes optional (no more
useful, just optional).  It is not a current requirement that coroutine
generators be decorated.

--
~Ethan~

From guido at python.org  Sat May  2 23:37:41 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 2 May 2015 14:37:41 -0700
Subject: [Python-Dev] PEP 492 and types.coroutine
In-Reply-To: <20150502213136.GF8013@stoneleaf.us>
References: <20150502170416.GD8013@stoneleaf.us> <55450C3E.8020503@gmail.com>
 <20150502181404.GE8013@stoneleaf.us> <55452F24.1070408@gmail.com>
 <20150502213136.GF8013@stoneleaf.us>
Message-ID: <CAP7+vJ+bigyE-G7Yj4EgPbWOim75LDbyrTbE-FeDszawmtuHKw@mail.gmail.com>

Ethan, at this point your continued arguing is not doing anybody any
good. Please stop.

On Sat, May 2, 2015 at 2:31 PM, Ethan Furman <ethan at stoneleaf.us> wrote:

> On 05/02, Yury Selivanov wrote:
> > On 2015-05-02 2:14 PM, Ethan Furman wrote:
> >> On 05/02, Yury Selivanov wrote:
> >>> On 2015-05-02 1:04 PM, Ethan Furman wrote:
>
> >> And yet in current asyncio code, random generators can be accepted, and
> not
> >> even the current asyncio.coroutine wrapper can gaurantee that the
> generator
> >> is a coroutine in fact.
>
> > This is a flaw in the current Python that we want to fix.
>
> Your "fix" doesn't fix it.  I can decorate a non-coroutine generator with
> types.coroutine and it will still be broken and a bug in my code.
>
>
> >> For that matter, even the new types.coroutine cannot gaurantee that the
> >> returned object is a coroutine and not a generator -- so basically it's
> just
> >> there to tell the compiler, "yeah, I really know what I'm doing, shut
> up and
> >> do what I asked."
> >
> > Well, why would you use it on some generator that is not
> > a generator-based coroutine?
>
> I wouldn't, that would be a bug; but decorating a wrong type of generator
> is
> still a bug, and types.coroutine has not fixed that bug.
>
> It's worse than mandatory typing because it can't even check that what I
> have
> declared is true.
>
>
> >> So either way, nothing has been added besides a mandatory boiler-plate
> >> requirement.
> >
> > It's not nothing; it's backwards compatibility.  Please read
> > https://www.python.org/dev/peps/pep-0492/#await-expression
>
> I have read it, more than once.  If you lift the (brand-new) requirement
> that a
> generator must be decorated, then types.coroutine becomes optional (no more
> useful, just optional).  It is not a current requirement that coroutine
> generators be decorated.
>
> --
> ~Ethan~
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150502/b33bc238/attachment.html>

From greg.ewing at canterbury.ac.nz  Sun May  3 03:22:44 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 03 May 2015 13:22:44 +1200
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAP7+vJJaLsUok3pVjsuAKe8CpHaBr2O3GXOAGmX=HoqPc=t8JA@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info> <mi0lc7$1eh$1@ger.gmane.org>
 <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
 <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
 <CAP7+vJJaLsUok3pVjsuAKe8CpHaBr2O3GXOAGmX=HoqPc=t8JA@mail.gmail.com>
Message-ID: <55457864.60500@canterbury.ac.nz>

Guido van Rossum wrote:
> On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle <arnodel at gmail.com 
> <mailto:arnodel at gmail.com>> wrote:
> 
>     Does this mean that
>     somehow "await x" guarantees that the coroutine will suspend at least
>     once?

No. First, it's possible for x to finish without yielding.
But even if x yields, there is no guarantee that the
scheduler will run something else -- it might just
resume the same task, even if there is another one that
could run. It's up to the scheduler whether it
implements any kind of "fair" scheduling policy.

-- 
Greg

From stefan_ml at behnel.de  Sun May  3 08:32:02 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sun, 03 May 2015 08:32:02 +0200
Subject: [Python-Dev] ABCs - Re: PEP 492: async/await in Python; version 4
In-Reply-To: <mi1la8$7t4$1@ger.gmane.org>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi1la8$7t4$1@ger.gmane.org>
Message-ID: <mi4fd4$9ak$1@ger.gmane.org>

Stefan Behnel schrieb am 02.05.2015 um 06:54:
> Yury Selivanov schrieb am 01.05.2015 um 20:52:
>> I don't like the idea of combining __next__ and __anext__.
> 
> Ok, fair enough. So, how would you use this new protocol manually then?
> Say, I already know that I won't need to await the next item that the
> iterator will return. For normal iterators, I could just call next() on it
> and continue the for-loop. How would I do it for AIterators?

BTW, I guess that this "AIterator", or rather "AsyncIterator", needs to be
a separate protocol (and ABC) then. Implementing "__aiter__()" and
"__anext__()" seems perfectly reasonable without implementing (or using) a
Coroutine.

That means we also need an "AsyncIterable" as a base class for it.

Would Coroutine then inherit from both Iterator and AsyncIterator? Or
should we try to separate the protocol hierarchy completely? The section on
"Coroutine objects" seems to suggest that inheritance from Iterator is not
intended.

OTOH, I'm not sure if inheriting from AsyncIterator is intended for
Coroutine. The latter might just be a stand-alone ABC with
send/throw/close, after all.

I think that in order to get a better understanding of the protocol(s) that
this PEP proposes, and the terminology that it should use, it would help to
implement these ABCs.

That might even help us to decide if we need new builtins (or helpers)
aiter() and anext() in order to deal with these protocols.
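A rough sketch of what such ABCs could look like (hypothetical at this point; note that this version of the PEP specifies __aiter__ as returning an awaitable, which the sketch follows):

```python
from abc import ABCMeta, abstractmethod

class AsyncIterable(metaclass=ABCMeta):
    @abstractmethod
    async def __aiter__(self):
        ...

class AsyncIterator(AsyncIterable):
    @abstractmethod
    async def __anext__(self):
        ...

    async def __aiter__(self):
        # mirror Iterator: an async iterator is its own async iterable
        return self

# a concrete async iterator, purely for illustration
class Counter(AsyncIterator):
    def __init__(self, n):
        self.i, self.n = 0, n

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        self.i += 1
        return self.i
```

Since none of these coroutines actually suspend, they can be driven to completion with a single send() when experimenting without an event loop.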

Stefan



From arnodel at gmail.com  Sat May  2 22:18:43 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Sat, 2 May 2015 21:18:43 +0100
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info>
 <mi0lc7$1eh$1@ger.gmane.org>
 <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
Message-ID: <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>

On 1 May 2015 at 20:59, Guido van Rossum <guido at python.org> wrote:
> On Fri, May 1, 2015 at 12:49 PM, Ron Adam <ron3200 at gmail.com> wrote:
>>
>>
>> Another useful async function might be...
>>
>>    async def yielding():
>>        pass
>>
>> If a routine is taking a very long time, just inserting "await yielding()"
>> in the long calculation would let other awaitables run.
>>
> That's really up to the scheduler, and a function like this should be
> provided by the event loop or scheduler framework you're using.

Really?  I was under the impression that 'await yielding()' as defined
above would actually not suspend the coroutine at all, therefore not
giving any opportunity for the scheduler to resume another coroutine,
and I thought I understood the PEP well enough.  Does this mean that
somehow "await x" guarantees that the coroutine will suspend at least
once?

To me the async def above was the equivalent of the following in the
'yield from' world:

def yielding():
    return
    yield # Just to make it a generator

Then "yield from yielding()" will not yield at all - which makes its
name rather misleading!

-- 
Arnaud

From rymg19 at gmail.com  Sat May  2 22:30:46 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Sat, 2 May 2015 15:30:46 -0500
Subject: [Python-Dev] Sub-claasing pathlib.Path seems impossible
In-Reply-To: <CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A@mail.gmail.com>
References: <CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A@mail.gmail.com>
Message-ID: <CAO41-mN5f=_iHgkxTfLcJhok_NWAwHLRW3JU9DEXbO7Qw4B9bQ@mail.gmail.com>

http://stackoverflow.com/a/29880095/2097780

My favorite thing about Python is that it's so easy to be evil. ;)


On Fri, May 1, 2015 at 2:30 PM, Christophe Bal <projetmbc at gmail.com> wrote:

> Hello.
>
> In this post
> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
> I have noticed a problem with the following code.
>
> from pathlib import Path
>> class PPath(Path):
>>     def __init__(self, *args, **kwargs):
>>         super().__init__(*args, **kwargs)
>>
>> test = PPath("dir", "test.txt")
>>
>>
> This gives the following error message.
>
>
>
>> Traceback (most recent call last):
>>   File "/Users/projetmbc/test.py", line 14, in <module>
>>     test = PPath("dir", "test.txt")
>>   File "/anaconda/lib/python3.4/pathlib.py", line 907, in __new__
>>     self = cls._from_parts(args, init=False)
>>   File "/anaconda/lib/python3.4/pathlib.py", line 589, in _from_parts
>>     drv, root, parts = self._parse_args(args)
>>   File "/anaconda/lib/python3.4/pathlib.py", line 582, in _parse_args
>>     return cls._flavour.parse_parts(parts)AttributeError: type object 'PPath' has no attribute '_flavour'
>>
>>
> This breaks the sub-classing from Python point of view. In the post
> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
> I give a hack to sub-class Path but it's a bit Unpythonic.
>
> *Christophe BAL*
> *Enseignant de mathématiques en Lycée **et développeur Python amateur*
> *---*
> *French math teacher in a "Lycée" **and **Python **amateur developer*
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com
>
>


-- 
Ryan
[ERROR]: Your autotools build scripts are 200 lines longer than your
program. Something's wrong.
http://kirbyfan64.github.io/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150502/32580ecf/attachment.html>

From steve at pearwood.info  Sun May  3 17:24:40 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Mon, 4 May 2015 01:24:40 +1000
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
Message-ID: <20150503152438.GU5663@ando.pearwood.info>

On Fri, May 01, 2015 at 09:24:47PM +0100, Arnaud Delobelle wrote:

> I'm not convinced that allowing an object to be both a normal and an
> async iterator is a good thing.  It could be a recipe for confusion.

In what way?

I'm thinking that the only confusion would be if you wrote "async for" 
instead of "for", or vice versa, and instead of getting an exception you 
got the (a)synchronous behaviour you didn't want.

But I have no intuition for how likely it is that you could write an 
asynchronous for loop, leave out the async, and still have the code do 
something meaningful.

Other than that, I think it would be fine to have an object be both a 
synchronous and asynchronous iterator. You specify the behaviour you want 
by how you use it. We can already do that, e.g. unittest's assertRaises 
is both a test assertion and a context manager.

Objects can have multiple roles, and it's not usually abused, or 
confusing. I'm not sure that async iterables will be any different.
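For illustration, one object can implement both protocols side by side (a hypothetical sketch, not something from the PEP):

```python
class BothWays:
    """Iterable through either 'for' or 'async for', as the caller chooses."""
    def __init__(self, data):
        self._it = iter(data)

    # synchronous protocol
    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    # asynchronous protocol
    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            raise StopAsyncIteration
```

A plain for loop and an async for loop over the same object would then yield the same items; which behaviour you get depends only on how you iterate.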



-- 
Steve

From guido at python.org  Sun May  3 18:45:26 2015
From: guido at python.org (Guido van Rossum)
Date: Sun, 3 May 2015 09:45:26 -0700
Subject: [Python-Dev] Sub-claasing pathlib.Path seems impossible
In-Reply-To: <CAO41-mN5f=_iHgkxTfLcJhok_NWAwHLRW3JU9DEXbO7Qw4B9bQ@mail.gmail.com>
References: <CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A@mail.gmail.com>
 <CAO41-mN5f=_iHgkxTfLcJhok_NWAwHLRW3JU9DEXbO7Qw4B9bQ@mail.gmail.com>
Message-ID: <CAP7+vJLkrTaVB8Bp34wWTdKX0rA+OPB1Tz+9OJKKWLQwtUPrUQ@mail.gmail.com>

It does sound like subclassing Path should be made easier.
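One known workaround is to derive from the concrete flavour class rather than Path itself, so that _flavour is inherited (a sketch relying on a private attribute, not an official API; the helper method is purely illustrative):

```python
import pathlib

# type(pathlib.Path()) is PosixPath or WindowsPath, which carry _flavour
class PPath(type(pathlib.Path())):
    def describe(self):
        # hypothetical helper just to show the subclass works
        return "name=%s suffix=%s" % (self.name, self.suffix)

test = PPath("dir", "test.txt")
```

The trade-off is that the subclass is tied to the platform it was created on, which is exactly the kind of wart that easier official subclassing support would remove.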

On Sat, May 2, 2015 at 1:30 PM, Ryan Gonzalez <rymg19 at gmail.com> wrote:

> http://stackoverflow.com/a/29880095/2097780
>
> My favorite thing about Python is that it's so easy to be evil. ;)
>
>
> On Fri, May 1, 2015 at 2:30 PM, Christophe Bal <projetmbc at gmail.com>
> wrote:
>
>> Hello.
>>
>> In this post
>> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
>> I have noticed a problem with the following code.
>>
>> from pathlib import Path
>>> class PPath(Path):
>>>     def __init__(self, *args, **kwargs):
>>>         super().__init__(*args, **kwargs)
>>>
>>> test = PPath("dir", "test.txt")
>>>
>>>
>> This gives the following error message.
>>
>>
>>
>>> Traceback (most recent call last):
>>>   File "/Users/projetmbc/test.py", line 14, in <module>
>>>     test = PPath("dir", "test.txt")
>>>   File "/anaconda/lib/python3.4/pathlib.py", line 907, in __new__
>>>     self = cls._from_parts(args, init=False)
>>>   File "/anaconda/lib/python3.4/pathlib.py", line 589, in _from_parts
>>>     drv, root, parts = self._parse_args(args)
>>>   File "/anaconda/lib/python3.4/pathlib.py", line 582, in _parse_args
>>>     return cls._flavour.parse_parts(parts)AttributeError: type object 'PPath' has no attribute '_flavour'
>>>
>>>
>> This breaks the sub-classing from Python point of view. In the post
>> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
>> I give a hack to sub-class Path but it's a bit Unpythonic.
>>
>> *Christophe BAL*
>> *Enseignant de math?matiques en Lyc?e **et d?veloppeur Python amateur*
>> *---*
>> *French math teacher in a "Lyc?e" **and **Python **amateur developer*
>>
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com
>>
>>
>
>
> --
> Ryan
> [ERROR]: Your autotools build scripts are 200 lines longer than your
> program. Something?s wrong.
> http://kirbyfan64.github.io/
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>
>


-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150503/94b3c35f/attachment-0001.html>

From jimjjewett at gmail.com  Mon May  4 19:35:57 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Mon, 04 May 2015 10:35:57 -0700 (PDT)
Subject: [Python-Dev] ABCs - Re: PEP 492: async/await in Python;
	version 4
In-Reply-To: <mi4fd4$9ak$1@ger.gmane.org>
Message-ID: <5547adfd.14858c0a.58c7.13c4@mx.google.com>


On Sun May 3 08:32:02 CEST 2015, Stefan Behnel wrote:

> Ok, fair enough. So, how would you use this new protocol manually then?
> Say, I already know that I won't need to await the next item that the
> iterator will return. For normal iterators, I could just call next() on it
> and continue the for-loop. How would I do it for AIterators?

Call next, then stick the result somewhere it can be waited on.

Or is that syntactically illegal, because of the separation between
sync and async?

The "asych for" seems to assume that you want to do the waiting right
now, at each step.  (At least as far as this thread of the logic goes;
something else might be happening in parallel via other threads of
control.)

> BTW, I guess that this "AIterator", or rather "AsyncIterator", needs to be
> a separate protocol (and ABC) then. Implementing "__aiter__()" and
> "__anext__()" seems perfectly reasonable without implementing (or using) a
> Coroutine.

> That means we also need an "AsyncIterable" as a base class for it.

Agreed.

> That might even help us to decide if we need new builtins (or helpers)
> aiter() and anext() in order to deal with these protocols.

I hope not; they seem more like specialized versions of functions,
such as are found in math or cmath.  Ideally, as much as possible of
this PEP should live in asyncio, rather than appearing globally.

Which reminds me ... *should* the "await" keyword work with any future,
or is it really intentionally restricted to use with a single library
module and 3rd party replacements?

-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From tds333 at gmail.com  Tue May  5 09:54:35 2015
From: tds333 at gmail.com (tds333 at gmail.com)
Date: Tue, 05 May 2015 09:54:35 +0200
Subject: [Python-Dev]  PEP 492: async/await in Python; version 4
In-Reply-To: <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
Message-ID: <5548773B.7050004@gmail.com>

Hi,

I'm still watching progress here and have read all the posts and changes.

Everything has improved, and I know this is a lot of work. Thanks for
doing it.

But I still think this PEP goes too far.

1. Having "native coroutines" with await and __await__ is very good and
useful. It also helps beginners see that coroutines are a different
concept than generators, and it is easy to learn. Easier than explaining
why a generator is, in this case, a coroutine, and so on.

2. I still don't like sprinkling async everywhere. We don't need it at
all for the first step.
We can handle coroutines similarly to generators: when there is an await
in the body, it is a coroutine, the same as with yield. Or, to be more
explicit, marking it with @coroutine is enough. But then it also makes
sense to do the same for generators with @generator. We should be
consistent here.

3. async with is not needed. There are rare use cases for it, and every
one of them can be written with try: except: finally:.
Every async framework has lived without it for years. That was no
problem, because for the seldom need, try: ... was enough.

4. async for can be implemented with a while loop. For me this is even
more explicit and clear. Every time I see async for, I have to work out
what is done in an async manner, and where an implicit await happens.
Also, in most of my use cases it was enough to produce Futures in a
normal loop and yield (await) them.
Even for database interfaces: most of the time they do prefetching under
the hood and I don't have to care at all.

5. For async with/for, a lot of new special methods are needed, all
prefixed only with "a". That is easy to confuse with "a" for abstract,
which is often used to prefix abstract classes. I still think
__async_iter__ and __async_enter__ are better for this. Yes, they are
more to write, but they are not so easy to confuse with the sync
__iter__ or __enter__, and they match the fact that I must use async to
call them.


I am not new to async land; I have done a lot of work with Twisted and
have also looked at Tornado.
I have also tried to teach some people the async world. This is not
easy, and you learn most of it only by doing and using it.

My final conclusion is that we should not rush all of this into the
language. Do it step by step, and also help other Python implementations
to support it. For now, only a very low percentage of users are on
Python 3.4.
And if you look at the statistics on PyPI, you see that most are still
on Python 2.7, using async frameworks like Twisted/Tornado.
Do we make such a fundamental language change only because a few people
need it for a niche in the whole Python universe,
and confuse the rest with new stuff they will never need?

Even the discussion on python-dev suggests that some time is still
needed to finalize all this.

We are forgetting to address the major problems here. How can someone in
a "sync" script use this async stuff easily? How can async and sync code
cooperate, so that we don't need to rewrite the world for async?
How can a normal user access the power of async without rewriting all
his code, so that he can use a simple async request library in his code?
How can a normal user learn and use all of this in an easy way?

And with all of this, we still can't tell them "oh, the async stuff
solves the multiprocessing problem of Python; learn it and switch to
version 3.5". It does not, and it is mostly useful for networking,
nothing more.

Don't get me wrong, I like async programming and I think it is very
useful. But I had to learn that not everyone thinks so, and that most
people only want to solve their problems in an easy way, not get a new
one called "async".


Now I'll shut up, go back to my normal mode, and quietly read. ;-)



Regards,

Wolfgang






From koos.zevenhoven at aalto.fi  Tue May  5 16:15:21 2015
From: koos.zevenhoven at aalto.fi (Koos Zevenhoven)
Date: Tue, 5 May 2015 14:15:21 +0000 (UTC)
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548773B.7050004@gmail.com>
Message-ID: <loom.20150505T133745-273@post.gmane.org>

tds333 at gmail.com writes:

> 
> Hi,
> 
> still watching progress here. Read all posts and changes.
> 
> Everything improved and I know it is a lot of work. Thx for doing this.
> 
> But I still think this PEP goes to far.
> 
> [...]
> 
> We forget to address the major problems here. How can someone in a
> "sync" script use this async stuff easy. How can async and sync stuff
> cooperate and we don't need to rewrite the world for async stuff.
> How can a normal user access the power of async stuff without rewriting
> all his code. So he can use a simple asyc request library in his code.
> How can a normal user learn and use all this in an easy way.
> 
> [...]

Hi Wolfgang,

You may want to see what I just posted on python-ideas. What I wrote about 
is related to several things you mention, and might provide a remedy.

-- Koos




From jimjjewett at gmail.com  Tue May  5 17:30:12 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Tue, 05 May 2015 08:30:12 -0700 (PDT)
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5543F702.3090600@gmail.com>
Message-ID: <5548e204.8f19370a.4457.0a16@mx.google.com>


On Fri May 1 23:58:26 CEST 2015, Yury Selivanov wrote:


>>> Yes, you can't use 'yield from' in __exit__/__enter__
>>> in current Python.

>> What do you mean by "can't use"?

> It probably executed without errors, but it didn't run the
> generators.

True.  But it did return the one created by __enter__, so it
could be bound to a variable and iterated within the block.

There isn't an easy way to run the generator created by __exit__,
and I'm not coming up with any obvious scenarios where it would
be a sensible thing to do (other than using "with" on a context
manager that *does* return a future instead of finishing).

That said, I'm still not seeing why the distinction is so important
that we have to enforce it at a language level, as opposed to letting
the framework do its own enforcement.  (And if the reason is
performance, then make the checks something that can be turned off,
or offer a fully instrumented loop as an alternative for debugging.)

>>>> Is the intent to do anything more than preface execution with:
>>>> import asyncio.coroutines
>>>> asyncio.coroutines._DEBUG = True

>>> If you enable it after you import asyncio,
>>> then asyncio itself won't be instrumented.

>> Why does asyncio itself have to be wrapped?  Is that really something
>> normal developers need to debug, or is it only for developing the
>> stdlib itself?  

> Yes, normal developers need asyncio to be instrumented,
> otherwise you won't know what you did wrong when you
> called some asyncio code without 'await' for example.

I'll trust you that it *does* work that way, but this sure sounds to
me as though the framework isn't ready to be frozen with syntax, and
maybe not even ready for non-provisional stdlib inclusion.

I understand that the disconnected nature of asynchronous tasks makes
them harder to debug.  I heartily agree that the event loop should
offer some sort of debug facility to track this.

But the event loop is supposed to be pluggable.  Saying that this
requires not merely a replacement, or even a replacement before events
are added, but a replacement made before python ever even loads the
default version ...

That seems to be much stronger than sys.settrace -- more like
instrumenting the ceval loop itself.  And that is something that
ordinary developers shouldn't have to do.


-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From oscar.j.benjamin at gmail.com  Tue May  5 18:01:28 2015
From: oscar.j.benjamin at gmail.com (Oscar Benjamin)
Date: Tue, 5 May 2015 17:01:28 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAJ6cK1YPp+w=021E1y_D4sV-DhuEVx+QX716KH8-3C2_X_WZHA@mail.gmail.com>
References: <CAP7+vJ+m=qwo+bkhcp+b_Y35VeY8fq1ReraSMB_baV6-hNd-Bw@mail.gmail.com>
 <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <CACac1F_U7JvLJ=eLh9ou1c--nWVWLhhGTuU0=4zhU+NxKY=Hig@mail.gmail.com>
 <5541261A.9020909@gmail.com>
 <CACac1F95=2pqG5VFR9_5sdR_KOx=OXUp--0Jdkp5CSgYu7k+=A@mail.gmail.com>
 <5541341A.8090204@gmail.com>
 <CAJ6cK1YPp+w=021E1y_D4sV-DhuEVx+QX716KH8-3C2_X_WZHA@mail.gmail.com>
Message-ID: <CAHVvXxRXAn4zQZxji+jR99Zgm+sxM4oSBZxDKa_VCfbwq2Nx_w@mail.gmail.com>

On 30 April 2015 at 09:50, Arnaud Delobelle <arnodel at gmail.com> wrote:
>>
>> I'm flexible about how we name 'async def' functions.  I like
>> to call them "coroutines", because that's what they are, and
>> that's how asyncio calls them.  It's also convenient to use
>> 'coroutine-object' to explain what is the result of calling
>> a coroutine.
>
> I'd like the object created by an 'async def' statement to be called a
> 'coroutine function' and the result of calling it to be called a
> 'coroutine'.

That would be an improvement over the confusing terminology in the PEP
atm. The PEP proposes to name the inspect functions
inspect.iscoroutine() and inspect.iscoroutinefunction(). According to
the PEP iscoroutine() identifies "coroutine objects" and
iscoroutinefunction() identifies "coroutine functions" -- a term which
is not defined in the PEP but presumably means what the PEP calls a
"coroutine" in the glossary.

Calling the async def function an "async function" and the object it
returns a "coroutine" makes for the clearest terminology IMO (provided
the word coroutine is not also used for anything else). It would help
to prevent both experienced and new users from confusing the two
related but necessarily distinct concepts. Clearly distinct
terminology makes it easier to explain/discuss something if nothing
else because it saves repeating definitions all the time.


--
Oscar

From yselivanov.ml at gmail.com  Tue May  5 18:29:44 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 12:29:44 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
Message-ID: <5548EFF8.4060405@gmail.com>

Hi python-dev,


Updated version of the PEP is below.

Quick summary of changes:

1. set_coroutine_wrapper and get_coroutine_wrapper functions
are now thread-specific (like settrace etc).

2. Updated Abstract & Rationale sections.

3. RuntimeWarning is always raised when a coroutine wasn't
awaited on. This is in addition to what 'set_coroutine_wrapper'
will/can do.

4. asyncio.async is renamed to asyncio.ensure_future; it will
be deprecated in 3.5.

5. Uses of async/await in CPython codebase are documented.

6. Other small edits and updates.


Thanks,
Yury



PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <yselivanov at sprymix.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015,
               05-May-2015


Abstract
========

The growth of Internet and general connectivity has triggered the
proportionate need for responsive and scalable code.  This proposal
aims to answer that need by making writing explicitly asynchronous,
concurrent Python code easier and more Pythonic.

It is proposed to make *coroutines* a proper standalone concept in
Python, and introduce new supporting syntax.  The ultimate goal
is to help establish a common, easily approachable, mental
model of asynchronous programming in Python and make it as close to
synchronous programming as possible.

We believe that the changes proposed here will help keep Python
relevant and competitive in a quickly growing area of asynchronous
programming, as many other languages have adopted, or are planning to
adopt, similar features: [2]_, [5]_, [6]_, [7]_, [8]_, [10]_.


Rationale and Goals
===================

Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the ``yield from`` syntax introduced in PEP
380. This approach has a number of shortcomings:

* It is easy to confuse coroutines with regular generators, since they
   share the same syntax; this is especially true for new developers.

* Whether or not a function is a coroutine is determined by the presence
   of ``yield`` or ``yield from`` statements in its *body*, which can
   lead to unobvious errors when such statements appear in or disappear
   from the function body during refactoring.

* Support for asynchronous calls is limited to expressions where
   ``yield`` is allowed syntactically, limiting the usefulness of
   syntactic features, such as ``with`` and ``for`` statements.

This proposal makes coroutines a native Python language feature, and
clearly separates them from generators.  This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library.  This also enables
linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new ``async
with`` statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new ``async for``
statement makes it possible to perform asynchronous calls in iterators.


Specification
=============

This proposal introduces new syntax and semantics to enhance coroutine
support in Python.

This specification presumes knowledge of the implementation of
coroutines in Python (PEP 342 and PEP 380).  Motivation for the syntax
changes proposed here comes from the asyncio framework (PEP 3156) and
the "Cofunctions" proposal (PEP 3152, now rejected in favor of this
specification).

From this point on, this document uses *native coroutine* to refer to
functions declared using the new syntax, *generator-based coroutine* to
refer to coroutines based on generator syntax, and *coroutine* in
contexts where both definitions are applicable.


New Coroutine Declaration Syntax
--------------------------------

The following new syntax is used to declare a *native coroutine*::

     async def read_data(db):
         pass

Key properties of *coroutines*:

* ``async def`` functions are always coroutines, even if they do not
   contain ``await`` expressions.

* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
   expressions in an ``async`` function.

* Internally, two new code object flags were introduced:

   - ``CO_COROUTINE`` is used to enable runtime detection of
     *coroutines* (and migrating existing code).

   - ``CO_NATIVE_COROUTINE`` is used to mark *native coroutines*
     (defined with new syntax.)

   All *native coroutines* have ``CO_COROUTINE``,
   ``CO_NATIVE_COROUTINE``, and ``CO_GENERATOR`` flags set.

* Regular generators, when called, return a *generator object*;
   similarly, coroutines return a *coroutine object*.

* ``StopIteration`` exceptions are not propagated out of coroutines,
   and are replaced with a ``RuntimeError``.  For regular generators
   such behavior requires a future import (see PEP 479).

* When a *coroutine* is garbage collected, a ``RuntimeWarning`` is
   raised if it was never awaited on (see also `Debugging Features`_.)

* See also `Coroutine objects`_ section.
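
The never-awaited warning can be observed without an event loop. A
minimal sketch (the coroutine name ``job`` is invented; prompt
collection of the orphaned coroutine relies on CPython's reference
counting):

```python
import warnings

async def job():                  # hypothetical coroutine
    return 1

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    job()                         # coroutine object created, never awaited
    # CPython drops the orphaned coroutine right away, emitting the warning

assert any(issubclass(w.category, RuntimeWarning) for w in caught)
```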


types.coroutine()
-----------------

A new function ``coroutine(gen)`` is added to the ``types`` module.  It
allows interoperability between existing *generator-based coroutines*
in asyncio and *native coroutines* introduced by this PEP.

The function applies the ``CO_COROUTINE`` flag to the generator
function's code object, making it return a *coroutine object*.

The function can be used as a decorator, since it modifies generator
functions in-place and returns them.

Note that the ``CO_NATIVE_COROUTINE`` flag is not applied by
``types.coroutine()``, to make it possible to distinguish *native
coroutines* defined with the new syntax from *generator-based
coroutines*.
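
A rough interoperability sketch (function names invented): a plain
generator marked with ``types.coroutine()`` can be awaited from a native
coroutine, and the whole chain can be driven manually with ``send()``,
exactly like a generator:

```python
import types

@types.coroutine
def nap():
    yield "tick"              # suspension point, standing in for a Future

async def work():
    await nap()               # awaiting the generator-based coroutine
    return 42

coro = work()
print(coro.send(None))        # runs to the first suspension: prints "tick"
try:
    coro.send(None)           # resume past the suspension; work() finishes
except StopIteration as exc:
    print(exc.value)          # the coroutine's return value: prints 42
```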


Await Expression
----------------

The following new ``await`` expression is used to obtain a result of
coroutine execution::

     async def read_data(db):
         data = await db.fetch('SELECT ...')
         ...

``await``, similarly to ``yield from``, suspends execution of
``read_data`` coroutine until ``db.fetch`` *awaitable* completes and
returns the result data.

It uses the ``yield from`` implementation with an extra step of
validating its argument.  ``await`` only accepts an *awaitable*, which
can be one of:

* A *native coroutine object* returned from a *native coroutine*.

* A *generator-based coroutine object* returned from a generator
   decorated with ``types.coroutine()``.

* An object with an ``__await__`` method returning an iterator.

   Any ``yield from`` chain of calls ends with a ``yield``.  This is a
   fundamental mechanism of how *Futures* are implemented.  Since,
   internally, coroutines are a special kind of generator, every
   ``await`` is suspended by a ``yield`` somewhere down the chain of
   ``await`` calls (please refer to PEP 3156 for a detailed
   explanation.)

   To enable this behavior for coroutines, a new magic method called
   ``__await__`` is added.  In asyncio, for instance, to enable *Future*
   objects in ``await`` statements, the only change is to add
   ``__await__ = __iter__`` line to ``asyncio.Future`` class.

   Objects with ``__await__`` method are called *Future-like* objects in
   the rest of this PEP.

   Also, please note that ``__aiter__`` method (see its definition
   below) cannot be used for this purpose.  It is a different protocol,
   and would be like using ``__iter__`` instead of ``__call__`` for
   regular callables.

   It is a ``TypeError`` if ``__await__`` returns anything but an
   iterator.

* Objects defined with CPython C API with a ``tp_await`` function,
   returning an iterator (similar to ``__await__`` method).

It is a ``SyntaxError`` to use ``await`` outside of an ``async def``
function (like it is a ``SyntaxError`` to use ``yield`` outside of
``def`` function.)

It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.
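
A minimal *Future-like* object can be sketched from this definition
alone (``Result`` is an invented name; a real ``asyncio.Future``
additionally integrates with callbacks and the event loop). Defining
``__await__`` as a generator function satisfies the requirement that it
return an iterator:

```python
class Result:
    def __init__(self, value):
        self._value = value

    def __await__(self):      # must return an iterator
        yield                 # one suspension point, like a pending Future
        return self._value

async def compute():
    return await Result(10) + await Result(32)

coro = compute()
coro.send(None)               # suspends at the first Result
coro.send(None)               # suspends at the second Result
try:
    coro.send(None)           # final resume; compute() returns
except StopIteration as exc:
    print(exc.value)          # prints 42
```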


Updated operator precedence table
'''''''''''''''''''''''''''''''''

The ``await`` keyword is defined as follows::

     power ::=  await ["**" u_expr]
     await ::=  ["await"] primary

where "primary" represents the most tightly bound operations of the
language.  Its syntax is::

     primary ::=  atom | attributeref | subscription | slicing | call

See Python Documentation [12]_ and `Grammar Updates`_ section of this
proposal for details.

The key difference between ``await`` and the ``yield`` and ``yield
from`` operators is that *await expressions* do not require parentheses
around them most of the time.

Also, ``yield from`` allows any expression as its argument, including
expressions like ``yield from a() + b()``, which is parsed as
``yield from (a() + b())`` and is almost always a bug.  In general, the
result of an arithmetic operation is not an *awaitable* object.  To
avoid this kind of mistake, it was decided to make the ``await``
precedence lower than ``[]``, ``()``, and ``.``, but higher than the
``**`` operator.

+------------------------------+-----------------------------------+
| Operator                     | Description                       |
+==============================+===================================+
| ``yield`` ``x``,             | Yield expression                  |
| ``yield from`` ``x``         |                                   |
+------------------------------+-----------------------------------+
| ``lambda``                   | Lambda expression                 |
+------------------------------+-----------------------------------+
| ``if`` -- ``else``           | Conditional expression            |
+------------------------------+-----------------------------------+
| ``or``                       | Boolean OR                        |
+------------------------------+-----------------------------------+
| ``and``                      | Boolean AND                       |
+------------------------------+-----------------------------------+
| ``not`` ``x``                | Boolean NOT                       |
+------------------------------+-----------------------------------+
| ``in``, ``not in``,          | Comparisons, including membership |
| ``is``, ``is not``, ``<``,   | tests and identity tests          |
| ``<=``, ``>``, ``>=``,       |                                   |
| ``!=``, ``==``               |                                   |
+------------------------------+-----------------------------------+
| ``|``                        | Bitwise OR                        |
+------------------------------+-----------------------------------+
| ``^``                        | Bitwise XOR                       |
+------------------------------+-----------------------------------+
| ``&``                        | Bitwise AND                       |
+------------------------------+-----------------------------------+
| ``<<``, ``>>``               | Shifts                            |
+------------------------------+-----------------------------------+
| ``+``, ``-``                 | Addition and subtraction          |
+------------------------------+-----------------------------------+
| ``*``, ``@``, ``/``, ``//``, | Multiplication, matrix            |
| ``%``                        | multiplication, division,         |
|                              | remainder                         |
+------------------------------+-----------------------------------+
| ``+x``, ``-x``, ``~x``       | Positive, negative, bitwise NOT   |
+------------------------------+-----------------------------------+
| ``**``                       | Exponentiation                    |
+------------------------------+-----------------------------------+
| ``await`` ``x``              | Await expression                  |
+------------------------------+-----------------------------------+
| ``x[index]``,                | Subscription, slicing,            |
| ``x[index:index]``,          | call, attribute reference         |
| ``x(arguments...)``,         |                                   |
| ``x.attribute``              |                                   |
+------------------------------+-----------------------------------+
| ``(expressions...)``,        | Binding or tuple display,         |
| ``[expressions...]``,        | list display,                     |
| ``{key: value...}``,         | dictionary display,               |
| ``{expressions...}``         | set display                       |
+------------------------------+-----------------------------------+


Examples of "await" expressions
'''''''''''''''''''''''''''''''

Valid syntax examples:

================================== ==================================
Expression                         Will be parsed as
================================== ==================================
``if await fut: pass``             ``if (await fut): pass``
``if await fut + 1: pass``         ``if (await fut) + 1: pass``
``pair = await fut, 'spam'``       ``pair = (await fut), 'spam'``
``with await fut, open(): pass``   ``with (await fut), open(): pass``
``await foo()['spam'].baz()()``    ``await ( foo()['spam'].baz()() )``
``return await coro()``            ``return ( await coro() )``
``res = await coro() ** 2``        ``res = (await coro()) ** 2``
``func(a1=await coro(), a2=0)``    ``func(a1=(await coro()), a2=0)``
``await foo() + await bar()``      ``(await foo()) + (await bar())``
``-await foo()``                   ``-(await foo())``
================================== ==================================

Invalid syntax examples:

================================== ==================================
Expression                         Should be written as
================================== ==================================
``await await coro()``             ``await (await coro())``
``await -coro()``                  ``await (-coro())``
================================== ==================================


Asynchronous Context Managers and "async with"
----------------------------------------------

An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.

To make this possible, a new protocol for asynchronous context managers
is proposed.  Two new magic methods are added: ``__aenter__`` and
``__aexit__``. Both must return an *awaitable*.

An example of an asynchronous context manager::

     class AsyncContextManager:
         async def __aenter__(self):
             await log('entering context')

         async def __aexit__(self, exc_type, exc, tb):
             await log('exiting context')


New Syntax
''''''''''

A new statement for asynchronous context managers is proposed::

     async with EXPR as VAR:
         BLOCK


which is semantically equivalent to::

     mgr = (EXPR)
     aexit = type(mgr).__aexit__
     aenter = type(mgr).__aenter__(mgr)
     exc = True

     try:
         VAR = await aenter
         BLOCK
     except:
         if not await aexit(mgr, *sys.exc_info()):
             raise
     else:
         await aexit(mgr, None, None, None)


As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.

It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
to use ``async with`` outside of an ``async def`` function.
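
The expansion above can be exercised without an event loop. In this
sketch (class and list names invented) neither method truly suspends, so
a single ``send(None)`` drives the coroutine to completion:

```python
events = []

class AsyncCM:
    async def __aenter__(self):
        events.append("enter")
        return self

    async def __aexit__(self, exc_type, exc, tb):
        events.append("exit")
        return False          # do not suppress exceptions

async def use():
    async with AsyncCM():
        events.append("body")

try:
    use().send(None)          # no suspension points: runs straight through
except StopIteration:
    pass
print(events)                 # prints ['enter', 'body', 'exit']
```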


Example
'''''''

With *asynchronous context managers* it is easy to implement proper
database transaction managers for coroutines::

     async def commit(session, data):
         ...

         async with session.transaction():
             ...
             await session.update(data)
             ...

Code that needs locking also looks lighter::

     async with lock:
         ...

instead of::

     with (yield from lock):
         ...


Asynchronous Iterators and "async for"
--------------------------------------

An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method.  To support asynchronous
iteration:

1. An object must implement an ``__aiter__`` method returning an
    *awaitable* resulting in an *asynchronous iterator object*.

2. An *asynchronous iterator object* must implement an ``__anext__``
    method returning an *awaitable*.

3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
    exception.

An example of asynchronous iterable::

     class AsyncIterable:
         async def __aiter__(self):
             return self

         async def __anext__(self):
             data = await self.fetch_data()
             if data:
                 return data
             else:
                 raise StopAsyncIteration

         async def fetch_data(self):
             ...


New Syntax
''''''''''

A new statement for iterating through asynchronous iterators is
proposed::

     async for TARGET in ITER:
         BLOCK
     else:
         BLOCK2

which is semantically equivalent to::

     iter = (ITER)
     iter = await type(iter).__aiter__(iter)
     running = True
     while running:
         try:
             TARGET = await type(iter).__anext__(iter)
         except StopAsyncIteration:
             running = False
         else:
             BLOCK
     else:
         BLOCK2


It is a ``TypeError`` to pass a regular iterable without an
``__aiter__`` method to ``async for``.  It is a ``SyntaxError`` to use
``async for`` outside of an ``async def`` function.

As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.


Example 1
'''''''''

With the asynchronous iteration protocol it is possible to asynchronously
buffer data during iteration::

     async for data in cursor:
         ...

Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.

The following code illustrates the new asynchronous iteration protocol::

     class Cursor:
         def __init__(self):
             self.buffer = collections.deque()

         def _prefetch(self):
             ...

         async def __aiter__(self):
             return self

         async def __anext__(self):
             if not self.buffer:
                 self.buffer = await self._prefetch()
                 if not self.buffer:
                     raise StopAsyncIteration
             return self.buffer.popleft()

then the ``Cursor`` class can be used as follows::

     async for row in Cursor():
         print(row)

which would be equivalent to the following code::

     i = await Cursor().__aiter__()
     while True:
         try:
             row = await i.__anext__()
         except StopAsyncIteration:
             break
         else:
             print(row)


Example 2
'''''''''

The following is a utility class that transforms a regular iterable to
an asynchronous one.  While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.

::

     class AsyncIteratorWrapper:
         def __init__(self, obj):
             self._it = iter(obj)

         async def __aiter__(self):
             return self

         async def __anext__(self):
             try:
                 value = next(self._it)
             except StopIteration:
                 raise StopAsyncIteration
             return value

     async for letter in AsyncIteratorWrapper("abc"):
         print(letter)


Why StopAsyncIteration?
'''''''''''''''''''''''

Coroutines are still based on generators internally.  So, before PEP
479, there was no fundamental difference between

::

     def g1():
         yield from fut
         return 'spam'

and

::

     def g2():
         yield from fut
         raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``

::

     async def a1():
         await fut
         raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``.  Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.

Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.
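
This behavior can be observed directly; the following sketch mirrors the
``a1`` example above:

```python
import asyncio

async def a1():
    # Per PEP 479 semantics this StopIteration is not interpreted as
    # "iteration finished"; it surfaces as a RuntimeError in the caller.
    raise StopIteration('spam')

wrapped = False
loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(a1())
except RuntimeError:
    wrapped = True
finally:
    loop.close()
```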


Coroutine objects
-----------------

Differences from generators
'''''''''''''''''''''''''''

This section applies only to *native coroutines* with the
``CO_NATIVE_COROUTINE`` flag, i.e. those defined with the new
``async def`` syntax.

**The behavior of existing *generator-based coroutines* in asyncio
remains unchanged.**

Great effort has been made to make sure that coroutines and
generators are treated as distinct concepts:

1. *Native coroutine objects* do not implement ``__iter__`` and
    ``__next__`` methods.  Therefore, they cannot be iterated over or
    passed to ``iter()``, ``list()``, ``tuple()`` and other built-ins.
    They also cannot be used in a ``for..in`` loop.

    An attempt to use ``__iter__`` or ``__next__`` on a *native
    coroutine object* will result in a ``TypeError``.

2. *Plain generators* cannot ``yield from`` *native coroutine objects*:
    doing so will result in a ``TypeError``.

3. *Generator-based coroutines* (which for asyncio code must be
    decorated with ``@asyncio.coroutine``) can ``yield from`` *native
    coroutine objects*.

4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
    return ``False`` for *native coroutine objects* and *native
    coroutine functions*.
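
These rules can be verified directly; the following sketch exercises
points 1, 2 and 4:

```python
import inspect

async def coro():
    return 'spam'

c = coro()

# Point 1: native coroutine objects have no __iter__ / __next__.
try:
    iter(c)
    iterable = True
except TypeError:
    iterable = False

# Point 2: a plain generator cannot ``yield from`` a native coroutine.
def gen():
    yield from c

try:
    list(gen())
    delegated = True
except TypeError:
    delegated = False

# Point 4: native coroutines are not classified as generators.
is_gen = inspect.isgenerator(c)

c.close()   # suppress the "coroutine was never awaited" warning
```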


Coroutine object methods
''''''''''''''''''''''''

Coroutines are based on generators internally, thus they share the
implementation.  Similarly to generator objects, coroutine objects have
``throw()``, ``send()`` and ``close()`` methods. ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines).  See PEP 342, PEP 380,
and Python Documentation [11]_ for details.

The ``throw()`` and ``send()`` methods of coroutine objects are used to
push values and raise errors into *Future-like* objects.
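
As a sketch of this machinery, a coroutine awaiting a minimal
Future-like object can be driven by hand with ``send()`` (the ``Ready``
class below is illustrative, not part of asyncio):

```python
class Ready:
    """Illustrative Future-like object whose __await__ suspends once."""

    def __init__(self, value):
        self.value = value

    def __await__(self):
        yield self              # suspend; the driver receives this object
        return self.value       # becomes the result of ``await``

async def add_one(fut):
    return (await fut) + 1

coro = add_one(Ready(41))
step = coro.send(None)          # run to the first suspension point
try:
    coro.send(None)             # resume; the coroutine finishes
except StopIteration as exc:
    result = exc.value          # the coroutine's return value: 42
```

This is exactly how an event loop drives coroutines: values flow in via
``send()``, and the final result travels in ``StopIteration``.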


Debugging Features
------------------

A common beginner mistake is forgetting to use ``yield from`` on
coroutines::

     @asyncio.coroutine
     def useful():
         asyncio.sleep(1) # this will do nothing without 'yield from'

To debug this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions
with a special object whose destructor logs a warning.  Whenever a
wrapped generator gets garbage collected, a detailed logging message
is generated with information about where exactly the decorated
function was defined, a stack trace of where it was collected, etc.
The wrapper object also provides a convenient ``__repr__`` function
with detailed information about the generator.

The only question is how to enable these debug capabilities.  Since
debug facilities should be a no-op in production mode, the
``@coroutine`` decorator decides whether to wrap based on the
``PYTHONASYNCIODEBUG`` environment variable.  This way it is possible
to run asyncio programs with asyncio's own functions instrumented.
``EventLoop.set_debug``, a different debug facility, has no impact on
the ``@coroutine`` decorator's behavior.

With this proposal, coroutines are a native concept, distinct from
generators.  *In addition* to a ``RuntimeWarning`` being raised on
coroutines that were never awaited, it is proposed to add two new
functions to the ``sys`` module: ``set_coroutine_wrapper`` and
``get_coroutine_wrapper``.  These enable advanced debugging facilities
in asyncio and other frameworks (such as displaying where exactly a
coroutine was created, and a more detailed stack trace of where it was
garbage collected).


New Standard Library Functions
------------------------------

* ``types.coroutine(gen)``.  See `types.coroutine()`_ section for
   details.

* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
   *coroutine object*.

* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
   *coroutine function*.

* ``inspect.isawaitable(obj)`` returns ``True`` if ``obj`` can be used
   in ``await`` expression.  See `Await Expression`_ for details.

* ``sys.set_coroutine_wrapper(wrapper)`` allows intercepting the
   creation of *coroutine objects*.  ``wrapper`` must be either a
   callable that accepts one argument (a *coroutine object*), or
   ``None``.  ``None`` resets the wrapper.  If called twice, the new
   wrapper replaces the previous one.  The function is thread-specific.
   See `Debugging Features`_ for more details.

* ``sys.get_coroutine_wrapper()`` returns the current wrapper object.
   Returns ``None`` if no wrapper was set.  The function is
   thread-specific.  See  `Debugging Features`_ for more details.
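
A short sketch of how the new ``inspect`` functions are expected to
behave:

```python
import inspect

async def coro_func():
    return 1

def plain_func():
    return 1

assert inspect.iscoroutinefunction(coro_func)
assert not inspect.iscoroutinefunction(plain_func)

c = coro_func()
assert inspect.iscoroutine(c)
assert inspect.isawaitable(c)       # coroutine objects are awaitable
assert not inspect.isawaitable(42)

c.close()   # suppress the "coroutine was never awaited" warning
```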


Glossary
========

:Native coroutine:
     A coroutine function declared with ``async def``.  It uses
     ``await`` and ``return value``; see `New Coroutine Declaration
     Syntax`_ for details.

:Native coroutine object:
     Returned from a native coroutine function. See `Await Expression`_
     for details.

:Generator-based coroutine:
     A coroutine based on generator syntax.  The most common examples
     are functions decorated with ``@asyncio.coroutine``.

:Generator-based coroutine object:
     Returned from a generator-based coroutine function.

:Coroutine:
     Either *native coroutine* or *generator-based coroutine*.

:Coroutine object:
     Either *native coroutine object* or *generator-based coroutine
     object*.

:Future-like object:
     An object with an ``__await__`` method, or a C object with
     ``tp_await`` function, returning an iterator.  Can be consumed by
     an ``await`` expression in a coroutine. A coroutine waiting for a
     Future-like object is suspended until the Future-like object's
     ``__await__`` completes, and returns the result.  See `Await
     Expression`_ for details.

:Awaitable:
     A *Future-like* object or a *coroutine object*.  See `Await
     Expression`_ for details.

:Asynchronous context manager:
    An asynchronous context manager has ``__aenter__`` and ``__aexit__``
    methods and can be used with ``async with``.  See `Asynchronous
    Context Managers and "async with"`_ for details.

:Asynchronous iterable:
     An object with an ``__aiter__`` method, which must return an
     *asynchronous iterator* object.  Can be used with ``async for``.
     See `Asynchronous Iterators and "async for"`_ for details.

:Asynchronous iterator:
     An asynchronous iterator has an ``__anext__`` method.  See
     `Asynchronous Iterators and "async for"`_ for details.


List of functions and methods
=============================

================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================

Where:

* "async def func": native coroutine;

* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
   ``__aexit__`` defined with the ``async`` keyword;

* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
   ``__aexit__`` defined without the ``async`` keyword, must return an
   *awaitable*;

* "def __await__": ``__await__`` method to implement *Future-like*
   objects;

* generator: a "regular" generator, a function defined with ``def``
   that contains at least one ``yield`` or ``yield from`` expression.
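
As an illustration of the "def __a*__" row, here is a sketch of an
asynchronous context manager whose methods are plain functions
returning awaitables (the ``PlainCM`` class is purely illustrative):

```python
import asyncio

class PlainCM:
    """``__aenter__``/``__aexit__`` defined without ``async``: per the
    table above, they must *return* awaitables instead of awaiting."""

    def __aenter__(self):
        async def enter():
            return 'entered'
        return enter()                      # an awaitable, not a value

    def __aexit__(self, exc_type, exc, tb):
        async def leave():
            return False
        return leave()

async def use():
    async with PlainCM() as tag:
        return tag
```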


Transition Plan
===============

To avoid backwards compatibility issues with the ``async`` and
``await`` keywords, it was decided to modify ``tokenizer.c`` in such a
way that it:

* recognizes ``async def`` ``NAME`` tokens combination;

* keeps track of regular ``def`` and ``async def`` indented blocks;

* while tokenizing ``async def`` block, it replaces ``'async'``
   ``NAME`` token with ``ASYNC``, and ``'await'`` ``NAME`` token with
   ``AWAIT``;

* while tokenizing ``def`` block, it yields ``'async'`` and ``'await'``
   ``NAME`` tokens as is.

This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.

An example of having "async def" and "async" attribute in one piece of
code::

     class Spam:
         async = 42

     async def ham():
         print(getattr(Spam, 'async'))

     # The coroutine can be executed and will print '42'


Backwards Compatibility
-----------------------

This proposal preserves 100% backwards compatibility.


asyncio
'''''''

The ``asyncio`` module was adapted and tested to work with coroutines
and the new statements.  Backwards compatibility is 100% preserved,
i.e. all existing code will work as-is.

The required changes are mainly:

1. Modify ``@asyncio.coroutine`` decorator to use new
    ``types.coroutine()`` function.

2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.

3. Add ``ensure_future()`` as an alias for ``async()`` function.
    Deprecate ``async()`` function.


asyncio migration strategy
''''''''''''''''''''''''''

Because *plain generators* cannot ``yield from`` *native coroutine
objects* (see `Differences from generators`_ section for more details),
it is advised to make sure that all generator-based coroutines are
decorated with ``@asyncio.coroutine`` *before* starting to use the new
syntax.


async/await in CPython code base
''''''''''''''''''''''''''''''''

There is no use of the ``await`` name in CPython.

``async`` is mostly used by asyncio.  We are addressing this by
renaming the ``async()`` function to ``ensure_future()`` (see the
`asyncio`_ section for details).

Another use of the ``async`` keyword is in
``Lib/xml/dom/xmlbuilder.py``, which defines an ``async = False``
attribute for the ``DocumentLS`` class.  The attribute has no
documentation or tests and is not used anywhere else in CPython.  It
is replaced with a getter that raises a ``DeprecationWarning``,
advising to use the ``async_`` attribute instead.


Grammar Updates
---------------

Grammar changes are fairly minimal::

     decorated: decorators (classdef | funcdef | async_funcdef)
     async_funcdef: ASYNC funcdef

     compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
                     | funcdef | classdef | decorated | async_stmt)

     async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

     power: atom_expr ['**' factor]
     atom_expr: [AWAIT] atom trailer*


Transition Period Shortcomings
------------------------------

There is just one.

Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** as the ``def`` keyword::

     # async and await will always be parsed as variables

     async def outer():                             # 1
         def nested(a=(await fut)):
             pass

     async def foo(): return (await fut)            # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.

To work around these issues, the above examples can easily be
rewritten in a more readable form::

     async def outer():                             # 1
         a_default = await fut
         def nested(a=a_default):
             pass

     async def foo():                               # 2
         return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are
proper keywords.


Deprecation Plans
-----------------

``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords.  Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.


Design Considerations
=====================

PEP 3152
--------

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions").  Some key points:

1. A new keyword ``codef`` to declare a *cofunction*.  A *cofunction*
    is always a generator, even if there are no ``cocall`` expressions
    inside it.  Maps to ``async def`` in this proposal.

2. A new keyword ``cocall`` to call a *cofunction*.  Can only be used
    inside a *cofunction*.  Maps to ``await`` in this proposal (with
    some differences, see below.)

3. It is not possible to call a *cofunction* without a ``cocall``
    keyword.

4. ``cocall`` grammatically requires parentheses after it::

     atom: cocall | <existing alternatives for atom>
     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
     cotrailer: '[' subscriptlist ']' | '.' NAME

5. ``cocall f(*args, **kwds)`` is semantically equivalent to
    ``yield from f.__cocall__(*args, **kwds)``.

Differences from this proposal:

1. There is no equivalent of ``__cocall__`` in this PEP.  In PEP 3152,
    ``__cocall__`` is called and its result is passed to ``yield from``
    in the ``cocall`` expression.  The ``await`` keyword expects an
    *awaitable* object, validates the type, and executes ``yield from``
    on it.  The ``__await__`` method is similar to ``__cocall__``, but
    is only used to define *Future-like* objects.

2. ``await`` is defined in almost the same way as ``yield from`` in the
    grammar (it is later enforced that ``await`` can only be inside
    ``async def``).  It is possible to simply write ``await future``,
    whereas ``cocall`` always requires parentheses.

3. To make asyncio work with PEP 3152 it would be required to modify
    ``@asyncio.coroutine`` decorator to wrap all functions in an object
    with a ``__cocall__`` method, or to implement ``__cocall__`` on
    generators.  To call *cofunctions* from existing generator-based
    coroutines it would be required to use ``costart(cofunc, *args,
    **kwargs)`` built-in.

4. Since it is impossible to call a *cofunction* without a ``cocall``
    keyword, it automatically prevents the common mistake of forgetting
    to use ``yield from`` on generator-based coroutines.  This proposal
    addresses this problem with a different approach, see `Debugging
    Features`_.

5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
    is that if it is decided to implement coroutine-generators --
    coroutines with ``yield`` or ``async yield`` expressions -- we
    wouldn't need a ``cocall`` keyword to call them.  So we would end
    up having ``__cocall__`` and no ``__call__`` for regular
    coroutines, and having ``__call__`` and no ``__cocall__`` for
    coroutine-generators.

6. Requiring parentheses grammatically also introduces a whole lot
    of new problems.

    The following code::

        await fut
        await function_returning_future()
        await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))

    would look like::

        cocall fut()  # or cocall costart(fut)
        cocall (function_returning_future())()
        cocall asyncio.gather(costart(coro1, arg1, arg2),
                              costart(coro2, arg1, arg2))

7. There are no equivalents of ``async for`` and ``async with`` in PEP
    3152.


Coroutine-generators
--------------------

With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions.  To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and ``async
yield from`` would raise a ``StopAsyncIteration`` exception.

While it is possible to implement coroutine-generators, we believe
that they are out of scope of this proposal.  It is an advanced
concept that should be carefully considered and balanced, with
non-trivial changes to the implementation of the current generator
objects.  This is a matter for a separate PEP.


Why "async" and "await" keywords
--------------------------------

async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* there is a proposal to add async/await to ECMAScript 7 [2]_;
   see also the Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* there is a proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).


Why "__aiter__" returns awaitable
---------------------------------

In principle, ``__aiter__`` could be a regular function.  There are
several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
   methods are coroutines, users would often mistakenly define
   ``__aiter__`` as ``async`` anyway;

* there might be a need to run some asynchronous operations in
   ``__aiter__``, for instance to prepare DB queries or do some file
   operations.


Importance of "async" keyword
-----------------------------

While it is possible to just implement the ``await`` expression and
treat all functions with at least one ``await`` as coroutines, this
approach makes API design, code refactoring and long-term support
harder.

Let's pretend that Python only has ``await`` keyword::

     def useful():
         ...
         await log(...)
         ...

     def important():
         await useful()

If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it becomes a regular Python function,
and all code that depends on it, including ``important()``, would be
broken.  To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced.
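
Conversely, with the ``async def`` marker the function's nature is
explicit and survives such refactoring; a sketch:

```python
import inspect

async def useful():
    # Every ``await`` has been refactored away, yet the function is
    # still unambiguously a coroutine function: callers keep working.
    return 'logged'

async def important():
    return await useful()

assert inspect.iscoroutinefunction(useful)
```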


Why "async def"
---------------

For some people bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``.  It is certainly easier to
type.  But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous.  It is also more consistent
with the existing grammar.


Why not "await for" and "await with"
------------------------------------

``async`` is an adjective, and hence it is a better choice for a
*statement qualifier* keyword.  ``await for/with`` would imply that
something is awaiting for a completion of a ``for`` or ``with``
statement.


Why "async def" and not "def async"
-----------------------------------

The ``async`` keyword is a *statement qualifier*.  Good analogies for
it are the "static", "public", and "unsafe" keywords from other
languages.  "async for" is an asynchronous "for" statement, "async
with" is an asynchronous "with" statement, "async def" is an
asynchronous function.

Having "async" after the main statement keyword might introduce some
confusion; e.g. "for async item in iterator" can be read as "for each
asynchronous item in iterator".

Having ``async`` keyword before ``def``, ``with`` and ``for`` also
makes the language grammar simpler.  And "async def" better separates
coroutines from regular functions visually.


Why not a __future__ import
---------------------------

The `Transition Plan`_ section explains how the tokenizer is modified
to treat ``async`` and ``await`` as keywords *only* in ``async def``
blocks.  Hence ``async def`` fills the role that a module-level
compiler declaration like ``from __future__ import async_await`` would
otherwise fill.


Why magic methods start with "a"
--------------------------------

New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal was to use an "async" prefix, so that
``__aiter__`` becomes ``__async_iter__``.  However, to align the new
magic methods with existing ones, such as ``__radd__`` and
``__iadd__``, it was decided to use the shorter version.


Why not reuse existing magic names
----------------------------------

An alternative idea about new asynchronous iterators and context
managers was to reuse existing magic methods, by adding an ``async``
keyword to their declarations::

     class CM:
         async def __enter__(self): # instead of __aenter__
             ...

This approach has the following downsides:

* it would not be possible to create an object that works in both
   ``with`` and ``async with`` statements;

* it would break backwards compatibility, as nothing prohibits
   returning Future-like objects from ``__enter__`` and/or
   ``__exit__`` in Python <= 3.4;

* one of the main points of this proposal is to make native coroutines
   as simple and foolproof as possible, hence the clear separation of
   the protocols.
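
With the separate protocols proposed here, a single object *can*
support both statements; a sketch (the ``DualLock`` class is
illustrative):

```python
import asyncio

class DualLock:
    """Illustrative object usable with both ``with`` and ``async with``,
    because the synchronous and asynchronous protocols are distinct."""

    def __init__(self):
        self.entries = []

    # synchronous protocol
    def __enter__(self):
        self.entries.append('sync')
        return self

    def __exit__(self, exc_type, exc, tb):
        return False

    # asynchronous protocol
    async def __aenter__(self):
        self.entries.append('async')
        return self

    async def __aexit__(self, exc_type, exc, tb):
        return False

lock = DualLock()
with lock:
    pass

async def task():
    async with lock:
        pass
```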


Why not reuse existing "for" and "with" statements
--------------------------------------------------

The vision behind existing generator-based coroutines and this
proposal is to make it easy for users to see where the code might be
suspended.  Making the existing "for" and "with" statements recognize
asynchronous iterators and context managers would inevitably create
implicit suspend points, making it harder to reason about the code.


Comprehensions
--------------

Syntax for asynchronous comprehensions could be provided, but this
construct is outside of the scope of this PEP.


Async lambda functions
----------------------

Syntax for asynchronous lambda functions could be provided, but this
construct is outside of the scope of this PEP.


Performance
===========

Overall Impact
--------------

This proposal introduces no observable performance impact.  Here is
the output of Python's official set of benchmarks [4]_:

::

     python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

     [skipped]

     Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
     Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
     x86_64 i386

     Total CPU cores: 8

     ### etree_iterparse ###
     Min: 0.365359 -> 0.349168: 1.05x faster
     Avg: 0.396924 -> 0.379735: 1.05x faster
     Significant (t=9.71)
     Stddev: 0.01225 -> 0.01277: 1.0423x larger

     The following not significant results are hidden, use -v to show them:
     django_v2, 2to3, etree_generate, etree_parse, etree_process,
     fastpickle, fastunpickle, json_dump_v2, json_load, nbody,
     regex_v8, tornado_http.


Tokenizer modifications
-----------------------

There is no observable slowdown in parsing Python files with the
modified tokenizer: parsing one 12 MB file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same
amount of time.


async/await
-----------

The following micro-benchmark was used to determine the performance
difference between "async" functions and generators::

     import sys
     import time

     def binary(n):
         if n <= 0:
             return 1
         l = yield from binary(n - 1)
         r = yield from binary(n - 1)
         return l + 1 + r

     async def abinary(n):
         if n <= 0:
             return 1
         l = await abinary(n - 1)
         r = await abinary(n - 1)
         return l + 1 + r

     def timeit(gen, depth, repeat):
         t0 = time.time()
         for _ in range(repeat):
             list(gen(depth))
         t1 = time.time()
         print('{}({}) * {}: total {:.3f}s'.format(
             gen.__name__, depth, repeat, t1-t0))

The result is that there is no observable performance difference.
Minimum timing of 3 runs

::

     abinary(19) * 30: total 12.985s
     binary(19) * 30: total 12.953s

Note that depth of 19 means 1,048,575 calls.


Reference Implementation
========================

The reference implementation can be found here: [3]_.

List of high-level changes and new protocols
--------------------------------------------

1. New syntax for defining coroutines: ``async def`` and new ``await``
    keyword.

2. New ``__await__`` method for Future-like objects, and new
    ``tp_await`` slot in ``PyTypeObject``.

3. New syntax for asynchronous context managers: ``async with``, and
    an associated protocol with ``__aenter__`` and ``__aexit__``
    methods.

4. New syntax for asynchronous iteration: ``async for``, an
    associated protocol with ``__aiter__`` and ``__anext__`` methods,
    and a new built-in exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
    ``Await``.

6. New functions: ``sys.set_coroutine_wrapper(callback)``,
    ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
    ``inspect.iscoroutinefunction()``, ``inspect.iscoroutine()``,
    and ``inspect.isawaitable()``.

7. New ``CO_COROUTINE`` and ``CO_NATIVE_COROUTINE`` bit flags for code
    objects.

While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide
users with convenient and unambiguous APIs built on the ``async def``,
``await``, ``async for`` and ``async with`` syntax.


Working example
---------------

All concepts proposed in this PEP are implemented [3]_ and can be
tested.

::

     import asyncio

     async def echo_server():
         print('Serving on localhost:8000')
         await asyncio.start_server(handle_connection,
                                    'localhost', 8000)

     async def handle_connection(reader, writer):
         print('New connection...')

         while True:
             data = await reader.read(8192)

             if not data:
                 break

             print('Sending {:.10}... back'.format(repr(data)))
             writer.write(data)

     loop = asyncio.get_event_loop()
     loop.run_until_complete(echo_server())
     try:
         loop.run_forever()
     finally:
         loop.close()


References
==========

.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine

.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions

.. [3] https://github.com/1st1/cpython/tree/await

.. [4] https://hg.python.org/benchmarks

.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx

.. [6] http://docs.hhvm.com/manual/en/hack.async.php

.. [7] https://www.dartlang.org/articles/await-async/

.. [8] http://docs.scala-lang.org/sips/pending/async.html

.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental

.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)

.. [11] https://docs.python.org/3/reference/expressions.html#generator-iterator-methods

.. [12] https://docs.python.org/3/reference/expressions.html#primaries

Acknowledgments
===============

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.


Copyright
=========

This document has been placed in the public domain.

..
    Local Variables:
    mode: indented-text
    indent-tabs-mode: nil
    sentence-end-double-space: t
    fill-column: 70
    coding: utf-8
    End:


From yselivanov.ml at gmail.com  Tue May  5 18:48:11 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 12:48:11 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <CAHVvXxRXAn4zQZxji+jR99Zgm+sxM4oSBZxDKa_VCfbwq2Nx_w@mail.gmail.com>
References: <CAP7+vJ+m=qwo+bkhcp+b_Y35VeY8fq1ReraSMB_baV6-hNd-Bw@mail.gmail.com>
 <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <CACac1F_U7JvLJ=eLh9ou1c--nWVWLhhGTuU0=4zhU+NxKY=Hig@mail.gmail.com>
 <5541261A.9020909@gmail.com>
 <CACac1F95=2pqG5VFR9_5sdR_KOx=OXUp--0Jdkp5CSgYu7k+=A@mail.gmail.com>
 <5541341A.8090204@gmail.com>
 <CAJ6cK1YPp+w=021E1y_D4sV-DhuEVx+QX716KH8-3C2_X_WZHA@mail.gmail.com>
 <CAHVvXxRXAn4zQZxji+jR99Zgm+sxM4oSBZxDKa_VCfbwq2Nx_w@mail.gmail.com>
Message-ID: <5548F44B.2090905@gmail.com>

Hi Oscar,

I've updated the PEP with some fixes of the terminology:
https://hg.python.org/peps/rev/f156b272f860

I still think that 'coroutine functions' and 'coroutines'
is a better pair than 'async functions' and 'coroutines'.

First, it's similar to existing terminology for generators.
Second, it's less confusing.  With PEP 492, at some point,
using generators to implement coroutines won't be a widespread
practice, so 'async def' functions will be the only
language construct that returns them.

Yury

On 2015-05-05 12:01 PM, Oscar Benjamin wrote:
> On 30 April 2015 at 09:50, Arnaud Delobelle <arnodel at gmail.com> wrote:
>>> I'm flexible about how we name 'async def' functions.  I like
>>> to call them "coroutines", because that's what they are, and
>>> that's how asyncio calls them.  It's also convenient to use
>>> 'coroutine-object' to explain what is the result of calling
>>> a coroutine.
>> I'd like the object created by an 'async def' statement to be called a
>> 'coroutine function' and the result of calling it to be called a
>> 'coroutine'.
> That would be an improvement over the confusing terminology in the PEP
> atm. The PEP proposes to name the inspect functions
> inspect.iscoroutine() and inspect.iscoroutinefunction(). According to
> the PEP iscoroutine() identifies "coroutine objects" and
> iscoroutinefunction() identifies "coroutine functions" -- a term which
> is not defined in the PEP but presumably means what the PEP calls a
> "coroutine" in the glossary.
>
> Calling the async def function an "async function" and the object it
> returns a "coroutine" makes for the clearest terminology IMO (provided
> the word coroutine is not also used for anything else). It would help
> to prevent both experienced and new users from confusing the two
> related but necessarily distinct concepts. Clearly distinct
> terminology makes it easier to explain/discuss something if nothing
> else because it saves repeating definitions all the time.
>
>
> --
> Oscar


From yselivanov.ml at gmail.com  Tue May  5 20:25:00 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 14:25:00 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5548A908.9010206@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com>
Message-ID: <55490AFC.8090907@gmail.com>

Hi Wolfgang,


On 2015-05-05 7:27 AM, Wolfgang wrote:
> Hi,
[..]
> Even the discussion on python-dev suggests there is some time needed
> to finalize all this.

I'd say that:

80% of the recent discussion of the PEP is about terminology.
10% is about whether we should have __future__ import or not.

>
> We forget to address the major problems here. How can someone in a
> "sync" script use this async stuff easy. How can async and sync stuff
> cooperate and we don't need to rewrite the world for async stuff.
> How can a normal user access the power of async stuff without rewriting
> all his code. So he can use a simple async request library in his code.
> How can a normal user learn and use all this in an easy way.

asyncio and twisted answered these questions ;) The answer is
that you have to write async implementations.

gevent has a different answer, but greenlets/Stackless is
something that will never be merged into CPython and other
implementations.

>
> And for all this we still can't tell them "oh the async stuff solves
> the multiprocessing problem of Python learn it and switch to version
> 3.5". It does not and it is only most useful for networking stuff
> nothing more.

"networking stuff", and in particular, web, is a huge
part of current Python usage.  Please don't underestimate
that.


Thanks,
Yury

From guido at python.org  Tue May  5 21:10:01 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 12:10:01 -0700
Subject: [Python-Dev] ABCs - Re: PEP 492: async/await in Python;
	version 4
In-Reply-To: <5547adfd.14858c0a.58c7.13c4@mx.google.com>
References: <mi4fd4$9ak$1@ger.gmane.org>
 <5547adfd.14858c0a.58c7.13c4@mx.google.com>
Message-ID: <CAP7+vJ+E0cGVKLEUMMA2Bj=kurDzvuGDD+SWdDx1sLRpr8xU1Q@mail.gmail.com>

On Mon, May 4, 2015 at 10:35 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

> Which reminds me ... *should* the "await" keyword work with any future,
> or is it really intentionally restricted to use with a single library
> module and 3rd party replacements?
>

You can make any Future type work with await by adding an __await__ method.
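
For illustration, a minimal toy future along these lines (this is not
asyncio's actual Future; the class and names are made up for the sketch):

```python
class SimpleFuture:
    """Toy future: awaiting it suspends once, then produces the result."""

    _PENDING = object()

    def __init__(self):
        self._result = self._PENDING

    def set_result(self, value):
        self._result = value

    def __await__(self):
        # __await__ must return an iterator; writing it as a generator
        # lets 'await' suspend here until the loop resumes us.
        if self._result is self._PENDING:
            yield self
        return self._result


async def wait_for(fut):
    return await fut


# Drive it by hand, standing in for an event loop:
fut = SimpleFuture()
coro = wait_for(fut)
pending = coro.send(None)   # coroutine suspends, handing us the future
fut.set_result(42)
try:
    coro.send(None)         # resume; the await now completes
except StopIteration as exc:
    result = exc.value      # the coroutine's return value
```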

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/6ba6f8f9/attachment.html>

From p.f.moore at gmail.com  Tue May  5 21:14:20 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 20:14:20 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <55490AFC.8090907@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
Message-ID: <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>

On 5 May 2015 at 19:25, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> On 2015-05-05 7:27 AM, Wolfgang wrote:
>> Even the discussion on python-dev suggests there is some time needed
>> to finalize all this.
>
> I'd say that:
>
> 80% of the recent discussion of the PEP is about terminology.
> 10% is about whether we should have __future__ import or not.

But the terminology discussion appears to revolve around people
finding the various concepts involved in asyncio (particularly the new
PEP, but also to an extent the existing implementation) confusing. I
can confirm, having tried to work through the asyncio docs, that the
underlying concepts and how they are explained, are confusing to an
outsider.

That's not to say that everything needs to be beginner-friendly, but
it *does* mean that it's hard for the wider Python community to
meaningfully comment, or evaluate or sanity-check the design. We're
left with a sense of "trust us, it makes sense if you need it,
everyone else can ignore it".

Personally, I feel as if PEP 492 is looking a little premature - maybe
the focus should be on making asyncio more accessible first, and
*then* adding syntax. You can argue that the syntax is needed to help
make async more accessible - but if that's the case then the
terminology debates and confusion are clear evidence that it's not
succeeding in that goal. Of course, that's based on my perception of
one of the goals of the PEP as being "make coroutines and asyncio more
accessible". If the actual goals are different, my conclusion is
invalid.

>> We forget to address the major problems here. How can someone in a
>> "sync" script use this async stuff easy. How can async and sync stuff
>> cooperate and we don't need to rewrite the world for async stuff.
>> How can a normal user access the power of async stuff without rewriting
>> all his code. So he can use a simple async request library in his code.
>> How can a normal user learn and use all this in an easy way.
>
> asyncio and twisted answered these questions ;) The answer is
> that you have to write async implementations.

Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
but that seems to be callback-based rather than coroutine-based?

Many people use requests for their web access. There are good reasons
for this. Are you saying that until someone steps up and writes an
async implementation of requests, I have to make a choice - requests
or asyncio? Unfortunately, I can't see myself choosing asyncio in that
situation. Which again means that asyncio becomes "something that the
average user can't use". Which in turn further entrenches it as a
specialist-only tool.

As another example, in Twisted I could use defer_to_thread to
integrate Oracle database access into a twisted application (that's
what the twisted database stuff did under the hood). Can I do that
with asyncio? Will the syntax in the PEP help, hinder or be irrelevant
to that?

>> And for all this we still can't tell them "oh the async stuff solves
>> the multiprocessing problem of Python learn it and switch to version
>> 3.5". It does not and it is only most useful for networking stuff
>> nothing more.
>
> "networking stuff", and in particular, web, is a huge
> part of current Python usage.  Please don't underestimate
> that.

Without async versions of requests and similar, how much of a chunk of
the networking/web area will asyncio take? (Genuine question, I have
no idea). And how much extra will this PEP add? Those may not be fair
questions (and even if they are fair, the answers are probably
unknowable), but as an outsider, I feel only the costs of the asyncio
implementation (a new library that I don't understand, and now a
relatively large amount of new syntax and special methods I have to
ignore because they don't make sense to me). That's OK, but I think I
am being reasonable to ask for some sense of the level of benefits
others are getting to balance out the costs I incur.

Paul

From guido at python.org  Tue May  5 21:24:12 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 12:24:12 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5548F44B.2090905@gmail.com>
References: <CAP7+vJ+m=qwo+bkhcp+b_Y35VeY8fq1ReraSMB_baV6-hNd-Bw@mail.gmail.com>
 <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <CACac1F_U7JvLJ=eLh9ou1c--nWVWLhhGTuU0=4zhU+NxKY=Hig@mail.gmail.com>
 <5541261A.9020909@gmail.com>
 <CACac1F95=2pqG5VFR9_5sdR_KOx=OXUp--0Jdkp5CSgYu7k+=A@mail.gmail.com>
 <5541341A.8090204@gmail.com>
 <CAJ6cK1YPp+w=021E1y_D4sV-DhuEVx+QX716KH8-3C2_X_WZHA@mail.gmail.com>
 <CAHVvXxRXAn4zQZxji+jR99Zgm+sxM4oSBZxDKa_VCfbwq2Nx_w@mail.gmail.com>
 <5548F44B.2090905@gmail.com>
Message-ID: <CAP7+vJLFL94y=vE-LBMpS4H7k6WSCwZGVMT8dH1r0YYR6b+0mw@mail.gmail.com>

On Tue, May 5, 2015 at 9:48 AM, Yury Selivanov <yselivanov.ml at gmail.com>
wrote:

> Hi Oscar,
>
> I've updated the PEP with some fixes of the terminology:
> https://hg.python.org/peps/rev/f156b272f860
>
> I still think that 'coroutine functions' and 'coroutines'
> is a better pair than 'async functions' and 'coroutines'.
>

Yes. This subtopic is now closed for further debate (please).

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/ef13a896/attachment.html>

From jimjjewett at gmail.com  Tue May  5 21:40:01 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Tue, 05 May 2015 12:40:01 -0700 (PDT)
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <5548EFF8.4060405@gmail.com>
Message-ID: <55491c91.4833370a.5838.38da@mx.google.com>


On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated PEP492.

Where are the following over-simplifications wrong?

(1)  The PEP is intended for use (almost exclusively) with
asynchronous IO and a scheduler such as the asyncio event loop.

(2)  The new syntax is intended to make it easier to recognize when
a task's execution may be interrupted by arbitrary other tasks, and
the interrupted task therefore has to revalidate assumptions about
shared data.

With threads, CPython can always suspend a task between op-codes,
but with a sufficiently comprehensive loop (and sufficiently
cooperative tasks), tasks *should* only be suspended when they
make an explicit request to *wait* for an answer, and these points
*should* be marked syntactically.

(3)  The new constructs explicitly do NOT support any sort of
concurrent execution within a task; they are for use precisely
when otherwise parallel subtasks are being linearized by pausing
and waiting for the results.


Over-simplifications 4-6 assume a world with standardized futures
based on concurrent.futures, where .result either returns the
result or raises the exception (or raises another exception about
timeout or cancellation).

[Note that the actual PEP uses iteration over the results of a new
__await__ magic method, rather than .result on the object itself.
I couldn't tell whether this was for explicit marking, or just for
efficiency in avoiding future creation.]

(4)  "await EXPR" is just syntactic sugar for EXPR.result

except that, by being syntax, it better marks locations where
unrelated tasks might have a chance to change shared data.

[And that, as currently planned, the result of an await isn't
actually the result; it is an iterator of results.]

(5)  "async def" is just syntactic sugar for "def", 

except that, by being syntax, it better marks the signatures of
functions and methods where unrelated tasks might have a chance
to change shared data after execution has already begun.

(5A) As the PEP currently stands, it is also a promise that the
function will NOT produce a generator used as an iterator; if a
generator-iterator needs to wait for something else at some point,
that will need to be done differently.

I derive this limitation from
   "It is a ``SyntaxError`` to have ``yield`` or ``yield from``
    expressions in an ``async`` function."

but I don't understand how this limitation works with things like a
per-line file iterator that might need to wait for the file to
be initially opened.

(6)  async with EXPR as VAR:

would be equivalent to:

    with EXPR as VAR:

except that
  __enter__() would be replaced by next(await __enter__()) # __enter__().result
  __exit__() would be replaced by  next(await __exit__())  # __exit__().result


(7)  async for elem in iter:

would be shorthand for:

    for elem in iter:
        elem = next(await elem) # elem.result



-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From brett at python.org  Tue May  5 21:44:26 2015
From: brett at python.org (Brett Cannon)
Date: Tue, 05 May 2015 19:44:26 +0000
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
Message-ID: <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>

On Tue, May 5, 2015 at 3:14 PM Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 19:25, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> > On 2015-05-05 7:27 AM, Wolfgang wrote:
> >> Even the discussion on python-dev suggests there is some time needed
> >> to finalize all this.
> >
> > I'd say that:
> >
> > 80% of the recent discussion of the PEP is about terminology.
> > 10% is about whether we should have __future__ import or not.
>
> But the terminology discussion appears to revolve around people
> finding the various concepts involved in asyncio (particularly the new
> PEP, but also to an extent the existing implementation) confusing. I
> can confirm, having tried to work through the asyncio docs, that the
> underlying concepts and how they are explained, are confusing to an
> outsider.
>
> That's not to say that everything needs to be beginner-friendly, but
> it *does* mean that it's hard for the wider Python community to
> meaningfully comment, or evaluate or sanity-check the design. We're
> left with a sense of "trust us, it makes sense if you need it,
> everyone else can ignore it".
>

Watch David Beazley's talk from PyCon this year and you can watch him
basically re-implement asyncio on stage in under 45 minutes. It's not as
complicated as it seems when you realize there is an event loop driving
everything (which people have been leaving out of the conversation since it
doesn't tie into the syntax directly).


>
> Personally, I feel as if PEP 492 is looking a little premature - maybe
> the focus should be on making asyncio more accessible first, and
> *then* adding syntax.


I think this ties the concept of adding syntax to Python to make
coroutine-based programming easier too much to asyncio; the latter is just
an implementation of the former. This PEP doesn't require asyncio beyond
the fact that it will be what provides the initial event loop in the stdlib.


> You can argue that the syntax is needed to help
> make async more accessible - but if that's the case then the
> terminology debates and confusion are clear evidence that it's not
> succeeding in that goal.


Perhaps, but arguing about the nitty-gritty details of something doesn't
automatically lead to a clearer understanding of the higher level concept.
Discussing how turning a steering wheel works might help you grasp how
cars turn, but it isn't a requirement for getting "turn the wheel left to
make the car go left".


> Of course, that's based on my perception of
> one of the goals of the PEP as being "make coroutines and asyncio more
> accessible", If the actual goals are different, my conclusion is
> invalid.
>

I think the goal is "make coroutines easier to use" and does not directly
relate to asyncio.


>
> >> We forget to address the major problems here. How can someone in a
> >> "sync" script use this async stuff easy. How can async and sync stuff
> >> cooperate and we don't need to rewrite the world for async stuff.
> >> How can a normal user access the power of async stuff without rewriting
> >> all his code. So he can use a simple async request library in his code.
> >> How can a normal user learn and use all this in an easy way.
> >
> > asyncio and twisted answered these questions ;) The answer is
> > that you have to write async implementations.
>
> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
> but that seems to be callback-based rather than coroutine-based?
>

Yep.


>
> Many people use requests for their web access. There are good reasons
> for this. Are you saying that until someone steps up and writes an
> async implementation of requests, I have to make a choice - requests
> or asyncio?


I believe so; you need something to implement __await__. This is true in
any language that implements co-routines.

Unfortunately, I can't see myself choosing asyncio in that
> situation. Which again means that asyncio becomes "something that the
> average user can't use". Which in turn further entrenches it as a
> specialist-only tool.
>

You forgot to append "... yet" to that statement. Just because something
isn't available out of the box without some effort doesn't mean it will
never happen; otherwise there would be absolutely no Python 3 users out
there.


>
> As another example, in Twisted I could use defer_to_thread to
> integrate Oracle database access into a twisted application (that's
> what the twisted database stuff did under the hood). Can I do that
> with asyncio? Will the syntax in the PEP help, hinder or be irrelevant
> to that?
>
> >> And for all this we still can't tell them "oh the async stuff solves
> >> the multiprocessing problem of Python learn it and switch to version
> >> 3.5". It does not and it is only most useful for networking stuff
> >> nothing more.
> >
> > "networking stuff", and in particular, web, is a huge
> > part of current Python usage.  Please don't underestimate
> > that.
>
> Without async versions of requests and similar, how much of a chunk of
> the networking/web area will asyncio take? (Genuine question, I have
> no idea). And how much extra will this PEP add? Those may not be fair
> questions (and even if they are fair, the answers are probably
> unknowable), but as an outsider, I feel only the costs of the asyncio
> implementation (a new library that I don't understand, and now a
> relatively large amount of new syntax and special methods I have to
> ignore because they don't make sense to me). That's OK, but I think I
> am being reasonable to ask for some sense of the level of benefits
> others are getting to balance out the costs I incur.
>

Co-routine-based asynchronous programming is new to Python, so as a
community we don't have it as something everyone learns over time. If you
don't come from an area that supports it then it will be foreign to you and
not make sense without someone giving you a good tutorial on it. But
considering C#, Dart, and ECMAScript 6 (will) have co-routine support --
and those are just the languages I can name off the top of my head -- using
the exact same keywords suggests to me that it isn't *that* difficult of a
topic to teach people. This is just one of those PEPs where you have to
trust the people with experience in the area are making good design
decisions for those of us who aren't in a position to contribute directly
without more experience in the domain.

-Brett


>
> Paul
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/brett%40python.org
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/b194e329/attachment-0001.html>

From yselivanov.ml at gmail.com  Tue May  5 21:48:36 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 15:48:36 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
References: <554185C2.5080003@gmail.com>	<mhvs5p$tg8$1@ger.gmane.org>	<CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>	<mi0c1e$819$1@ger.gmane.org>	<5543CB61.2080905@gmail.com>	<mi0im0$lo5$1@ger.gmane.org>	<20150501191937.GB8013@stoneleaf.us>	<5543D2F4.3060207@gmail.com>	<CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>	<5543E1B7.6010804@gmail.com>	<CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>	<5548A908.9010206@gmail.com>	<55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
Message-ID: <55491E94.6080009@gmail.com>

Paul,

On 2015-05-05 3:14 PM, Paul Moore wrote:
> On 5 May 2015 at 19:25, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>> On 2015-05-05 7:27 AM, Wolfgang wrote:
>>> Even the discussion on python-dev suggests there is some time needed
>>> to finalize all this.
>> I'd say that:
>>
>> 80% of the recent discussion of the PEP is about terminology.
>> 10% is about whether we should have __future__ import or not.
> But the terminology discussion appears to revolve around people
> finding the various concepts involved in asyncio (particularly the new
> PEP, but also to an extent the existing implementation) confusing. I
> can confirm, having tried to work through the asyncio docs, that the
> underlying concepts and how they are explained, are confusing to an
> outsider.

I agree. We have to improve asyncio docs in this area.

>
> That's not to say that everything needs to be beginner-friendly, but
> it *does* mean that it's hard for the wider Python community to
> meaningfully comment, or evaluate or sanity-check the design. We're
> left with a sense of "trust us, it makes sense if you need it,
> everyone else can ignore it".
>
> Personally, I feel as if PEP 492 is looking a little premature - maybe
> the focus should be on making asyncio more accessible first, and
> *then* adding syntax. You can argue that the syntax is needed to help
> make async more accessible - but if that's the case then the
> terminology debates and confusion are clear evidence that it's not
> succeeding in that goal. Of course, that's based on my perception of
> one of the goals of the PEP as being "make coroutines and asyncio more
> accessible", If the actual goals are different, my conclusion is
> invalid.

Again, PEP 492 is not only for asyncio. *Any* framework can
use it, including Twisted.

As for terminology, I view this discussion differently.  It's
not about the technical details (Python has asymmetric
coroutines, that's it), but rather on how to disambiguate
coroutines implemented with generators and yield-from, from
new 'async def' coroutines.  I can't see any fundamental
problem with the PEP behind such discussions.

>
>>> We forget to address the major problems here. How can someone in a
>>> "sync" script use this async stuff easy. How can async and sync stuff
>>> cooperate and we don't need to rewrite the world for async stuff.
>>> How can a normal user access the power of async stuff without rewriting
>>> all his code. So he can use a simple async request library in his code.
>>> How can a normal user learn and use all this in an easy way.
>> asyncio and twisted answered these questions ;) The answer is
>> that you have to write async implementations.
> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
> but that seems to be callback-based rather than coroutine-based?
>
> Many people use requests for their web access. There are good reasons
> for this. Are you saying that until someone steps up and writes an
> async implementation of requests, I have to make a choice - requests
> or asyncio? Unfortunately, I can't see myself choosing asyncio in that
> situation. Which again means that asyncio becomes "something that the
> average user can't use". Which in turn further entrenches it as a
> specialist-only tool.

There is aiohttp library [1], which provides a client API
similar to requests.

And if you want to write high performance networking server
in python3 you *will* choose asyncio (or gevent/twisted
in python2).

And PEP 492 is aimed to make this whole async stuff more
accessible to an average user.

>
> As another example, in Twisted I could use defer_to_thread to
> integrate Oracle database access into a twisted application (that's
> what the twisted database stuff did under the hood). Can I do that
> with asyncio? Will the syntax in the PEP help, hinder or be irrelevant
> to that?

You can use 'loop.run_in_executor' in asyncio. It returns a
future that you can await on.  You can also provide a nice
facade for your Oracle-database code that provides a nice
API but uses asyncio thread executor behind the scenes.
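
A sketch of that facade, in today's syntax (the blocking function and
the query string are hypothetical stand-ins for a real database driver;
asyncio.run and get_running_loop are post-3.7 conveniences):

```python
import asyncio
import time


def blocking_query(sql):
    """Stands in for a blocking DB-API call (e.g. an Oracle driver)."""
    time.sleep(0.05)
    return [("answer", 42)]


async def fetch(sql):
    loop = asyncio.get_running_loop()
    # run_in_executor schedules the call on the default thread-pool
    # executor and returns a future, so coroutine code can just await it.
    return await loop.run_in_executor(None, blocking_query, sql)


rows = asyncio.run(fetch("SELECT 42"))
```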

>
>>> And for all this we still can't tell them "oh the async stuff solves
>>> the multiprocessing problem of Python learn it and switch to version
>>> 3.5". It does not and it is only most useful for networking stuff
>>> nothing more.
>> "networking stuff", and in particular, web, is a huge
>> part of current Python usage.  Please don't underestimate
>> that.
> Without async versions of requests and similar, how much of a chunk of
> the networking/web area will asyncio take? (Genuine question, I have
> no idea).

There are some things (like websockets) that are hard
to implement correctly in existing frameworks like Django
and Flask.  And these kinds of things are becoming more
and more important.  Languages like Go were designed
specifically to allow writing efficient network servers.
> And how much extra will this PEP add? Those may not be fair
> questions (and even if they are fair, the answers are probably
> unknowable), but as an outsider, I feel only the costs of the asyncio
> implementation (a new library that I don't understand, and now a
> relatively large amount of new syntax and special methods I have to
> ignore because they don't make sense to me). That's OK, but I think I
> am being reasonable to ask for some sense of the level of benefits
> others are getting to balance out the costs I incur.
>
> Paul

It's a chicken-and-egg problem.  Right now, the
coroutines-via-generators approach is cumbersome; it's harder
to write async code than it should be.  That stops innovation
in this area.  Some languages like Go were specifically
designed to make network programming easier, and they now
steal users from Python.

There is no absence of libraries for Go (and it's a new
language!), btw.  Give people the right tools and they will
build what they need.

Yury

[1] https://github.com/KeepSafe/aiohttp

From yselivanov.ml at gmail.com  Tue May  5 22:00:36 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 16:00:36 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55491c91.4833370a.5838.38da@mx.google.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
Message-ID: <55492164.8090906@gmail.com>



On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated PEP492.
>
> Where are the following over-simplifications wrong?
>
> (1)  The PEP is intended for use (almost exclusively) with
> asynchronous IO and a scheduler such as the asyncio event loop.

Yes. You can also use it for UI loops.  Basically, anything
that can call your code asynchronously.

>
> (2)  The new syntax is intended to make it easier to recognize when
> a task's execution may be interrupted by arbitrary other tasks, and
> the interrupted task therefore has to revalidate assumptions about
> shared data.
>
> With threads, CPython can always suspend a task between op-codes,
> but with a sufficiently comprehensive loop (and sufficiently
> cooperative tasks), tasks *should* only be suspended when they
> make an explicit request to *wait* for an answer, and these points
> *should* be marked syntactically.
>
> (3)  The new constructs explicitly do NOT support any sort of
> concurrent execution within a task; they are for use precisely
> when otherwise parallel subtasks are being linearized by pausing
> and waiting for the results.

Yes.
>
>
> Over-simplifications 4-6 assume a world with standardized futures
> based on concurrent.futures, where .result either returns the
> result or raises the exception (or raises another exception about
> timeout or cancellation).
>
> [Note that the actual PEP uses iteration over the results of a new
> __await__ magic method, rather than .result on the object itself.
> I couldn't tell whether this was for explicit marking, or just for
> efficiency in avoiding future creation.]
>
> (4)  "await EXPR" is just syntactic sugar for EXPR.result
>
> except that, by being syntax, it better marks locations where
> unrelated tasks might have a chance to change shared data.
>
> [And that, as currently planned, the result of an await isn't
> actually the result; it is an iterator of results.]

I'm not sure how to comment on (4).  Perhaps I don't
understand some notation that you're using.  If anything,
it's more of a syntactic sugar for 'yield from EXPR'.

>
> (5)  "async def" is just syntactic sugar for "def",
>
> except that, by being syntax, it better marks the signatures of
> functions and methods where unrelated tasks might have a chance
> to change shared data after execution has already begun.

It also sets the "CO_COROUTINE | CO_GENERATOR" flags, which
are very important.
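
A quick way to see the marking (note: in the implementation that
eventually shipped in CPython 3.5, native coroutines carry CO_COROUTINE;
the exact flag combination was still in flux at the time of this thread):

```python
import inspect


async def coro():
    return 1


def plain():
    return 1


# The code-object flag is what lets runtimes tell coroutines apart
# from plain functions and generators.
assert coro.__code__.co_flags & inspect.CO_COROUTINE
assert inspect.iscoroutinefunction(coro)
assert not inspect.iscoroutinefunction(plain)
```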

>
> (5A) As the PEP currently stands, it is also a promise that the
> function will NOT produce a generator used as an iterator; if a
> generator-iterator needs to wait for something else at some point,
> that will need to be done differently.
>
> I derive this limitation from
>     "It is a ``SyntaxError`` to have ``yield`` or ``yield from``
>      expressions in an ``async`` function."
>
> but I don't understand how this limitation works with things like a
> per-line file iterator that might need to wait for the file to
> be initially opened.

A per-line file iterator can be implemented with the
__aiter__/__anext__ protocol.  __aiter__ is a coroutine, so you
can open/start reading your file there.
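
A sketch of such an iterator, in the form the protocol eventually took
(where __aiter__ ended up a plain method returning the iterator and
__anext__ is the coroutine; the in-memory lines stand in for real file
I/O):

```python
import asyncio


class AsyncLines:
    """Asynchronously iterate over lines via __aiter__/__anext__."""

    def __init__(self, lines):
        self._lines = iter(lines)

    def __aiter__(self):
        return self

    async def __anext__(self):
        await asyncio.sleep(0)  # pretend to wait for I/O here
        try:
            return next(self._lines)
        except StopIteration:
            # StopAsyncIteration is how an async iterator signals the end.
            raise StopAsyncIteration


async def collect():
    out = []
    async for line in AsyncLines(["a\n", "b\n"]):
        out.append(line)
    return out


result = asyncio.run(collect())
```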

>
> (6)  async with EXPR as VAR:
>
> would be equivalent to:
>
>      with EXPR as VAR:
>
> except that
>    __enter__() would be replaced by next(await __enter__()) # __enter__().result
>    __exit__() would be replaced by  next(await __exit__())  # __exit__().result

I'm not sure I understand what you mean by
"next(await EXPR)" notation.


Yury

From njs at pobox.com  Tue May  5 22:14:50 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Tue, 5 May 2015 13:14:50 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55491c91.4833370a.5838.38da@mx.google.com>
References: <5548EFF8.4060405@gmail.com>
 <55491c91.4833370a.5838.38da@mx.google.com>
Message-ID: <CAPJVwB=JD6AXt9mPofUMKNn4SbSoWnk+5rH3QFVhYGz0e08GnA@mail.gmail.com>

On May 5, 2015 12:40 PM, "Jim J. Jewett" <jimjjewett at gmail.com> wrote:
>
>
> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated PEP492.
>
> Where are the following over-simplifications wrong?
>
[...snip...]
>
> [Note that the actual PEP uses iteration over the results of a new
> __await__ magic method, rather than .result on the object itself.
> I couldn't tell whether this was for explicit marking, or just for
> efficiency in avoiding future creation.]
>
> (4)  "await EXPR" is just syntactic sugar for EXPR.result
>
> except that, by being syntax, it better marks locations where
> unrelated tasks might have a chance to change shared data.
>
> [And that, as currently planned, the result of an await isn't
> actually the result; it is an iterator of results.]

This is where you're missing a key idea. (And I agree that more high-level
docs are very much needed!) Remember that this is just regular
single-threaded Python code, so just writing EXPR.result cannot possibly
cause the current task to pause and another one to start running, and then
magically switch back somehow when the result does become available.
Imagine trying to implement a .result attribute that does that -- it's
impossible.

Writing 'x = await socket1.read(1)' is actually equivalent to writing a
little loop like:

while True:
    # figure out what we need to happen to make progress
    needed = "data from socket 1"
    # suspend this function,
    # and send the main loop a message telling it what we need
    reply = (yield needed)
    # okay, the main loop woke us up again
    # let's see if they've sent us back what we asked for
    if reply.type == "data from socket 1":
        # got it!
        x = reply.payload
        break
    else:
        # if at first you don't succeed...
        continue

(Now stare at the formal definition of 'yield from' until you see how it
maps onto the above... And if you're wondering why we need a loop, think
about the case where instead of calling socket.read we're calling http.get
or something that requires multiple steps to complete.)

So semantically there actually is no iterator here -- the thing that looks
like an iterator is actually the chatter back and forth between the
lower-level code and the main loop that is orchestrating everything. Then
when that's done, it returns the single result.

-n
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/35eb9065/attachment.html>

From p.f.moore at gmail.com  Tue May  5 22:39:12 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 21:39:12 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
Message-ID: <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>

(Yury gave similar responses, so (a) I'll just respond here, and (b)
it's encouraging that you both responded so quickly with the same
message)

On 5 May 2015 at 20:44, Brett Cannon <brett at python.org> wrote:
>> That's not to say that everything needs to be beginner-friendly, but
>> it *does* mean that it's hard for the wider Python community to
>> meaningfully comment, or evaluate or sanity-check the design. We're
>> left with a sense of "trust us, it makes sense if you need it,
>> everyone else can ignore it".
>
> Watch David Beazley's talk from PyCon this year and you can watch him
> basically re-implement asyncio on stage in under 45 minutes. It's not as
> complicated as it seems when you realize there is an event loop driving
> everything (which people have been leaving out of the conversation since it
> doesn't tie into the syntax directly).

I'll watch that - it should be fun. But I have seen things like that
before, and I've got an idea how to write an event loop. You're right
that it's easy to lose track of the fundamentally simple idea in all
the complex discussions. To me that feels like a peculiar failure of
the abstraction, in that in some circumstances it makes things feel
*more* complex than they are :-)

>> Personally, I feel as if PEP 492 is looking a little premature - maybe
>> the focus should be on making asyncio more accessible first, and
>> *then* adding syntax.
>
> I think this ties the concept of adding syntax to Python to make
> coroutine-based programming easier too much to asyncio; the latter is just
> an implementation of the former. This PEP doesn't require asyncio beyond the
> fact that will be what provides the initial event loop in the stdlib.

It's very hard to separate coroutines from asyncio, because there's no
other example (not even a toy one) to reason about.

It would probably be helpful to have a concrete example of a basic
event loop that did *nothing* but schedule tasks. No IO waiting or
similar, just scheduling. I have a gut feeling that event loops are
more than just asyncio, but without examples to point to it's hard to
keep a focus on that fact. And even harder to isolate "what is an
event loop mechanism" from "what is asyncio specific". For example,
asyncio.BaseEventLoop has a create_connection method. That's
*obviously* not a fundamental aspect of a generic event loop, but
call_soon (presumably) is. Having a documented "basic event loop"
interface would probably help emphasise the idea than event loops
don't have to be asyncio. (Actually, what *is* the minimal event loop
interface that is needed for the various task/future mechanisms to
work, independently of asyncio? And what features of an event loop etc
are needed for the PEP, if it's being used outside of asyncio?)

I guess the other canonical event loop use case is GUI system message
dispatchers.

>> You can argue that the syntax is needed to help
>> make async more accessible - but if that's the case then the
>> terminology debates and confusion are clear evidence that it's not
>> succeeding in that goal.
>
> Perhaps, but arguing about the nitty-gritty details of something doesn't
> automatically lead to a clearer understanding of the higher level concept.
> Discussing how turning a steering wheel works might help you grasp how
> cars turn, but it isn't a requirement for getting "turn the wheel left to
> make the car go left".

Fair point. If only I could avoid driving into walls :-)

>> Of course, that's based on my perception of
>> one of the goals of the PEP as being "make coroutines and asyncio more
>> accessible", If the actual goals are different, my conclusion is
>> invalid.
>
> I think the goal is "make coroutines easier to use" and does not directly
> relate to asyncio.

OK. But in that case, some examples using a non-asyncio toy "just
schedule tasks" event loop might help.

>> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
>> but that seems to be callback-based rather than coroutine-based?
>
> Yep.

... and so you can't use it with async/await?

>> Many people use requests for their web access. There are good reasons
>> for this. Are you saying that until someone steps up and writes an
>> async implementation of requests, I have to make a choice - requests
>> or asyncio?
>
> I believe so; you need something to implement __await__. This is true in any
> language that implements co-routines.
>
>> Unfortunately, I can't see myself choosing asyncio in that
>> situation. Which again means that asyncio becomes "something that the
>> average user can't use". Which in turn further entrenches it as a
>> specialist-only tool.
>
> You forgot to append "... yet" to that statement. Just because something
> isn't available out of the box without some effort to support doesn't mean
> it will never happen, else there would be absolutely no Python 3 users out
> there.

Fair point. Yury mentioned aiohttp, as well. The one difference
between this and Python 2/3, is that here you *have* to have two
separate implementations. There's no equivalent of a "shared source"
async and synchronous implementation of requests. So the typical
"please support Python 3" issue that motivates projects to move
forward doesn't exist in the same way. It's not to say that there
won't be async versions of important libraries, it's just hard to see
how the dynamics will work. I can't see myself raising an issue on
cx_Oracle saying "please add asyncio support", and I don't know who
else I would ask...

> Co-routine-based asynchronous programming is new to Python, so as a
> community we don't have it as something everyone learns over time. If you
> don't come from an area that supports it then it will be foreign to you and
> not make sense without someone giving you a good tutorial on it. But
> considering C#, Dart, and ECMAScript 6 (will) have co-routine support -- and
> those are just the languages I can name off the top of my head -- using the
> exact same keywords suggests to me that it isn't *that* difficult of a topic
> to teach people. This is just one of those PEPs where you have to trust the
> people with experience in the area are making good design decisions for
> those of us who aren't in a position to contribute directly without more
> experience in the domain.

That's also a fair point, and it seems to me that there *is*
reasonably general feeling that the experts can be trusted on the
basic principles. There's also a huge amount of bikeshedding, but
that's pretty much inevitable :-)

But I do think that unless someone does something to offer some
non-asyncio examples of coroutine-based asynchronous programming in
Python, the link in people's minds between async and asyncio will
become more and more entrenched. While asyncio is the only real event
loop implementation, saying "async can be used for things other than
asyncio" is a rather theoretical point.

Is there anyone who feels they could write a stripped down but working
example of a valid Python event loop *without* the asyncio aspects? Or
is that what David Beazley's talk does? (I got the impression from
what you said that he was aiming at async IO rather than just a non-IO
event loop). Can asyncio.Future and asyncio.Task be reused with such
an event loop, or would those need to be reimplemented as well?
Writing your own event loop seems like a plausible exercise. Writing
your own version of the whole
task/future/coroutine/queue/synchronisation mechanisms seems like a
lot to expect. And the event loop policy mechanism says that it works
with loops that implement asyncio.BaseEventLoop (which as noted
includes things like create_connection, etc).

Paul

From p.f.moore at gmail.com  Tue May  5 22:44:32 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 21:44:32 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
Message-ID: <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>

On 5 May 2015 at 21:38, Guido van Rossum <guido at python.org> wrote:
> Jumping in to correct one fact.
>
> On Tue, May 5, 2015 at 12:44 PM, Brett Cannon <brett at python.org> wrote:
>>
>>
>> On Tue, May 5, 2015 at 3:14 PM Paul Moore <p.f.moore at gmail.com> wrote:
>>>
>>> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
>>> but that seems to be callback-based rather than coroutine-based?
>>
>>
>> Yep.
>
>
> The run_in_executor call is not callback-based -- the confusion probably
> stems from the name of the function argument ('callback'). It actually
> returns a Future representing the result (or error) of an operation, where
> the operation is represented by the function argument. So if you have e.g. a
> function
>
>     def factorial(n):
>         return 1 if n <= 0 else n*factorial(n-1)
>
> you can run it in an executor from your async(io) code like this:
>
>     loop = asyncio.get_event_loop()
> >     result = yield from loop.run_in_executor(None, factorial, 100)
>
> (In a PEP 492 coroutine substitute await for yield from.)

Thanks, that's an important correction. Given that, run_in_executor is
the link to blocking calls that I was searching for. And yes, the
"callback" terminology does make this far from obvious, unfortunately.
As does the point at which it's introduced (before futures have been
described) and the fact that it says "this method is a coroutine"
rather than "this method returns a Future"[1].

Paul

[1] I'm still struggling to understand the terminology, so if those
two statements are equivalent, that's not yet obvious to me.

From guido at python.org  Tue May  5 22:38:15 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 13:38:15 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
Message-ID: <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>

Jumping in to correct one fact.

On Tue, May 5, 2015 at 12:44 PM, Brett Cannon <brett at python.org> wrote:

>
> On Tue, May 5, 2015 at 3:14 PM Paul Moore <p.f.moore at gmail.com> wrote:
>
>> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
>> but that seems to be callback-based rather than coroutine-based?
>>
>
> Yep.
>

The run_in_executor call is not callback-based -- the confusion probably
stems from the name of the function argument ('callback'). It actually
returns a Future representing the result (or error) of an operation, where
the operation is represented by the function argument. So if you have e.g.
a function

    def factorial(n):
        return 1 if n <= 0 else n*factorial(n-1)

you can run it in an executor from your async(io) code like this:

    loop = asyncio.get_event_loop()
    result = yield from loop.run_in_executor(None, factorial, 100)

(In a PEP 492 coroutine substitute await for yield from.)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/0be2fb96/attachment.html>

From guido at python.org  Tue May  5 22:55:33 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 13:55:33 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55491c91.4833370a.5838.38da@mx.google.com>
References: <5548EFF8.4060405@gmail.com>
 <55491c91.4833370a.5838.38da@mx.google.com>
Message-ID: <CAP7+vJ+=5bMxaQqn_gz5-2w1gfzvTnsRaKBY37utGGbv86pvfA@mail.gmail.com>

One small clarification:

On Tue, May 5, 2015 at 12:40 PM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

> [...] but I don't understand how this limitation works with things like a
> per-line file iterator that might need to wait for the file to
> be initially opened.
>

Note that PEP 492 makes it syntactically impossible to use a coroutine
function to implement an iterator using yield; this is because the
generator machinery is needed to implement the coroutine machinery.
However, the PEP allows the creation of asynchronous iterators using
classes that implement __aiter__ and __anext__. Any blocking you need to do
can happen in either of those. You just use `async for` to iterate over
such an "asynchronous stream".

(There's an issue with actually implementing an asynchronous stream mapped
to a disk file, because I/O multiplexing primitives like select() don't
actually support waiting for disk files -- but this is an unrelated
problem, and asynchronous streams are useful to handle I/O to/from network
connections, subprocesses (pipes) or local RPC connections. Check out the
streams <https://docs.python.org/3/library/asyncio-stream.html> and
subprocess <https://docs.python.org/3/library/asyncio-subprocess.html>
submodules of the asyncio package. These streams would be great candidates
for adding __aiter__/__anext__ to support async for-loops, so the idiom for
iterating over them can once again closely resemble the idiom for iterating
over regular (synchronous) streams using for-loops.)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/faff9118/attachment-0001.html>

From guido at python.org  Tue May  5 22:57:17 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 13:57:17 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
Message-ID: <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>

On Tue, May 5, 2015 at 1:39 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> It's very hard to separate coroutines from asyncio, because there's no
> other example (not even a toy one) to reason about.
>

What about Greg Ewing's example?
http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yf_current/Examples/Scheduler/scheduler.txt

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/7d89a4bf/attachment.html>

From yselivanov.ml at gmail.com  Tue May  5 22:57:55 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 16:57:55 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
References: <554185C2.5080003@gmail.com>	<mhvs5p$tg8$1@ger.gmane.org>	<CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>	<mi0c1e$819$1@ger.gmane.org>	<5543CB61.2080905@gmail.com>	<mi0im0$lo5$1@ger.gmane.org>	<20150501191937.GB8013@stoneleaf.us>	<5543D2F4.3060207@gmail.com>	<CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>	<5543E1B7.6010804@gmail.com>	<CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>	<5548A908.9010206@gmail.com>	<55490AFC.8090907@gmail.com>	<CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>	<CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
Message-ID: <55492ED3.8000704@gmail.com>

On 2015-05-05 4:39 PM, Paul Moore wrote:
> Is there anyone who feels they could write a stripped down but working
> example of a valid Python event loop*without*  the asyncio aspects? Or
> is that what David Beazley's talk does?
Yes; in David's talk, where he starts to use 'yield from', you
can simply use the new coroutines instead.


Yury

From p.f.moore at gmail.com  Tue May  5 23:01:12 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:01:12 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55492164.8090906@gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
Message-ID: <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>

On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
>>
>> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated PEP492.
>>
>> Where are the following over-simplifications wrong?
>>
>> (1)  The PEP is intended for use (almost exclusively) with
> asynchronous IO and a scheduler such as the asyncio event loop.
>
> Yes. You can also use it for UI loops.  Basically, anything
> that can call your code asynchronously.

Given that the stdlib doesn't provide an example of such a UI loop,
what would a 3rd party module need to implement to provide such a
thing? Can any of the non-IO related parts of asyncio be reused for
the purpose, or must the 3rd party module implement everything from
scratch?

To me, this is an important question, as it cuts directly to the heart
of the impression people have that coroutines and async are "only for
asyncio".

I'd be interested in writing, for instructional purposes, a toy but
complete event loop. But I'm *not* really interested in trying to
reverse engineer the required interface.

Paul

From jimjjewett at gmail.com  Tue May  5 23:09:54 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Tue, 05 May 2015 14:09:54 -0700 (PDT)
Subject: [Python-Dev] PEP 492: Please mention the Event Loop
In-Reply-To: <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
Message-ID: <554931a2.551a370a.0ed9.4c61@mx.google.com>


On Tue May 5 21:44:26 CEST 2015,Brett Cannon wrote:

> It's not as
> complicated as it seems when you realize there is an event loop driving
> everything (which people have been leaving out of the conversation since it
> doesn't tie into the syntax directly).

Another reason people don't realize it is that the PEP goes out
of its way to avoid saying so.

I understand that you (and Yury) don't want to tie the PEP too
tightly to the specific event loop implementation in
asyncio.events.AbstractEventLoop, but ... that particular
conflation isn't really what people are confused about.

"coroutines" often brings up thoughts of independent tasks.  Yury may
well know that "(Python has asymmetric coroutines, that's it)", but
others have posted that this was a surprise -- and the people posting
here have far more python experience than most readers will.

Anyone deeply involved enough to recognize that this PEP is only about

  (1) a particular type of co-routine -- a subset even of prior Python usage

  (2) used for a particular purpose

  (3) coordinated via an external scheduler

will already know that they can substitute other event loops.

Proposed second paragraph of the abstract:

This PEP assumes that the asynchronous tasks are scheduled and
coordinated by an Event Loop similar to that of stdlib module
asyncio.events.AbstractEventLoop.  While the PEP is not tied to
any specific Event Loop implementation, it is relevant only to
the kind of coroutine that uses "yield" as a signal to the scheduler,
indicating that the coroutine will be waiting until an event (such
as IO) is completed.

-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From guido at python.org  Tue May  5 23:12:44 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 14:12:44 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
Message-ID: <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>

On Tue, May 5, 2015 at 1:44 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> [Guido]
> > The run_in_executor call is not callback-based -- the confusion probably
> > stems from the name of the function argument ('callback'). It actually
> > returns a Future representing the result (or error) of an operation,
> where
> > the operation is represented by the function argument. So if you have
> e.g. a
> > function
> >
> >     def factorial(n):
> >         return 1 if n <= 0 else n*factorial(n-1)
> >
> > you can run it in an executor from your async(io) code like this:
> >
> >     loop = asyncio.get_event_loop()
> >     result = yield from loop.run_in_executor(factorial, 100)
> >
> > (In a PEP 492 coroutine substitute await for yield from.)
>
> Thanks, that's an important correction. Given that, run_in_executor is
> the link to blocking calls that I was searching for. And yes, the
> "callback" terminology does make this far from obvious, unfortunately.
> As does the point at which it's introduced (before futures have been
> described) and the fact that it says "this method is a coroutine"
> rather than "this method returns a Future"[1].
>
> Paul
>
> [1] I'm still struggling to understand the terminology, so if those
> two statements are equivalent, that's not yet obvious to me.
>

I apologize for the confusing documentation. We need more help from
qualified tech writers! Writing PEP 3156 was a huge undertaking for me;
after that I was exhausted and did not want to take on writing the end user
documentation as well, so it was left unfinished. :-(

In PEP 3156 (asyncio package) there are really three separate concepts:

- Future, which is a specific class (of which Task is a subclass);

- coroutine, which in this context means a generator object obtained by
calling a generator function decorated with @asyncio.coroutine and
written to conform to the asyncio protocol for coroutines (i.e. don't use
bare yield, only use yield from, and the latter always with either a Future
or a coroutine as argument);

- either of the above, which is actually the most common requirement --
most asyncio functions that support one also support the other, and either
is allowable as the argument to `yield from`.

In the implementation we so often flipped between Future and coroutine that
I imagine sometimes the implementation and docs differ; also, we don't have
a good short name for "either of the above" so we end up using one or the
other as a shorthand.

*Unless* you want to attach callbacks, inspect the result or exception, or
cancel it (all of which require a Future), your code shouldn't be concerned
about the difference -- you should just use `res = yield from func(args)`
and use try/except to catch exceptions if you care. And if you do need a
Future, you can call the function asyncio.async() on it (which in PEP 492
is renamed to ensure_future()).

In the PEP 492 world, these concepts map as follows:

- Future translates to "something with an __await__ method" (and asyncio
Futures are trivially made compliant by defining Future.__await__ as an
alias for Future.__iter__);

- "asyncio coroutine" maps to "PEP 492 coroutine object" (either defined
with `async def` or a generator decorated with @types.coroutine -- note
that @asyncio.coroutine incorporates the latter);

- "either of the above" maps to "awaitable".

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/08304987/attachment.html>

From p.f.moore at gmail.com  Tue May  5 23:16:46 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:16:46 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
 <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>
Message-ID: <CACac1F-yo0G7FDvwpWOoNOaoX1ndJBjehwyH7XsHD7JHDtO7Vg@mail.gmail.com>

On 5 May 2015 at 21:57, Guido van Rossum <guido at python.org> wrote:
> On Tue, May 5, 2015 at 1:39 PM, Paul Moore <p.f.moore at gmail.com> wrote:
>>
>> It's very hard to separate coroutines from asyncio, because there's no
>> other example (not even a toy one) to reason about.
>
> What about Greg Ewing's example?
> http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yf_current/Examples/Scheduler/scheduler.txt

That doesn't cover any of the higher level abstractions like tasks or
futures (at least not by those names or with those interfaces). And I
don't see where the PEP 492 additions would fit in (OK, "replace yield
from with await" is part of it, but I don't see the rest).

We may be talking at cross purposes here. There's a lot of asyncio
that doesn't seem to me to be IO-related. Specifically the future and
task abstractions. I view those as relevant to "coroutine programming
in Python" because they are referenced in any discussion of coroutines
(you yield from a future, for example). If you see them as purely
asyncio related (and not directly usable from outside of an asyncio
context) then that may explain some of my confusion (but at the cost
of reducing the coroutine concept to something pretty trivial in the
absence of a library that independently implements these concepts).

In some ways I wish there had been an "asyncio" library that covered
the areas that are fundamentally about IO multiplexing. And a separate
library (just "async", maybe, although that's now a bad idea as it
clashes with a keyword :-)) that covered generic event loop, task and
synchronisation areas. But that's water under the bridge now.

Paul

From nad at acm.org  Tue May  5 23:18:29 2015
From: nad at acm.org (Ned Deily)
Date: Tue, 05 May 2015 14:18:29 -0700
Subject: [Python-Dev] Sub-claasing pathlib.Path seems impossible
References: <CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A@mail.gmail.com>
 <CAO41-mN5f=_iHgkxTfLcJhok_NWAwHLRW3JU9DEXbO7Qw4B9bQ@mail.gmail.com>
 <CAP7+vJLkrTaVB8Bp34wWTdKX0rA+OPB1Tz+9OJKKWLQwtUPrUQ@mail.gmail.com>
Message-ID: <nad-35091D.14182905052015@news.gmane.org>

In article 
<CAAb4jGkqyLh4EPgs2tntY1RMaDdig4wROJzX9Ek8ZepxTwq9-A at mail.gmail.com>,
 Christophe Bal <projetmbc at gmail.com> wrote:
> In this post
> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
> I have noticed a problem with the following code.
[...]
> This breaks the sub-classing from Python point of view. In the post
> <http://stackoverflow.com/questions/29850801/simple-subclassing-pathlib-path-does-not-work/29854141#29854141>,
> I give a hack to sub-class Path but it's a bit Unpythonic.

In article 
<CAP7+vJLkrTaVB8Bp34wWTdKX0rA+OPB1Tz+9OJKKWLQwtUPrUQ at mail.gmail.com>,
 Guido van Rossum <guido at python.org> wrote:
> It does sound like subclassing Path should be made easier.

Christophe, if you want to pursue this, you should open an issue for it 
on the Python bug tracker, bugs.python.org.  Otherwise, it will likely 
be forgotten here.

-- 
 Ned Deily,
 nad at acm.org


From guido at python.org  Tue May  5 23:25:01 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 14:25:01 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
Message-ID: <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>

On Tue, May 5, 2015 at 2:01 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> > On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
> >>
> >> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated
> PEP492.
> >>
> >> Where are the following over-simplifications wrong?
> >>
> >> (1)  The PEP is intended for use (almost exclusively) with
> >> asynchronous IO and a scheduler such as the asyncio event loop.
> >
> > Yes. You can also use it for UI loops.  Basically, anything
> > that can call your code asynchronously.
>
> Given that the stdlib doesn't provide an example of such a UI loop,
> what would a 3rd party module need to implement to provide such a
> thing? Can any of the non-IO related parts of asyncio be reused for
> the purpose, or must the 3rd party module implement everything from
> scratch?
>
> To me, this is an important question, as it cuts directly to the heart
> of the impression people have that coroutines and async are "only for
> asyncio".
>
> I'd be interested in writing, for instructional purposes, a toy but
> complete event loop. But I'm *not* really interested in trying to
> reverse engineer the required interface.
>

This is a great idea. What kind of application do you have in mind?

I think the main real-life use case for using coroutines with a UI event
loop is newer Windows code. C# (and IIUC VB) has coroutines very much along
the lines of PEP 492, and all code that does any kind of I/O (whether disk
or network) must be written as a coroutine. This requirement is enforced by
the C# compiler: the basic system calls for doing I/O are coroutines, and
in order to get their result you must use an await expression, which in
turn may only be used in a coroutine. Thus all code that may invoke an I/O
call ends up being a coroutine. This is exactly the type of constraint
we're trying to introduce into Python with PEP 492 (except of course we're
not making all I/O primitives coroutines -- that would be madness, we're
going with optional instead).
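Concretely, that constraint looks like this in PEP 492 terms (a minimal sketch; the function names are invented, and asyncio.run() is a convenience wrapper from a later Python than this thread assumes):

```python
import asyncio

async def read_record():
    # stand-in for an I/O primitive; real code would hit disk or network
    await asyncio.sleep(0)
    return "record"

async def handler():
    # `await` is a syntax error outside `async def`, so any function that
    # needs the result of read_record() must itself become a coroutine
    return await read_record()

print(asyncio.run(handler()))
```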

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/f3cb5600/attachment.html>

From p.f.moore at gmail.com  Tue May  5 23:29:39 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:29:39 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
 <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
Message-ID: <CACac1F_JhpCCrnz7+Uxf++AwDscgz5kgyE=u5hhPsLEsgP=1WQ@mail.gmail.com>

On 5 May 2015 at 22:12, Guido van Rossum <guido at python.org> wrote:
> I apologize for the confusing documentation. We need more help from
> qualified tech writers! Writing PEP 3156 was a huge undertaking for me;
> after that I was exhausted and did not want to take on writing the end user
> documentation as well, so it was left unfinished. :-(

Fair enough. When I properly document one of my projects, *then* I'll
think about complaining :-) These things happen.

> In PEP 3156 (asyncio package) there are really three separate concepts:
>
> - Future, which is a specific class (of which Task is a subclass);
>
> - coroutine, by which in this context is meant a generator object obtained
> by calling a generator function decorated with @asyncio.coroutine and
> written to conform to the asyncio protocol for coroutines (i.e. don't use
> bare yield, only use yield from, and the latter always with either a Future
> or a coroutine as argument);
>
> - either of the above, which is actually the most common requirement -- most
> asyncio functions that support one also support the other, and either is
> allowable as the argument to `yield from`.
>
> In the implementation we so often flipped between Future and coroutine that
> I imagine sometimes the implementation and docs differ; also, we don't have
> a good short name for "either of the above" so we end up using one or the
> other as a shorthand.

OK, that makes a lot of sense.

> *Unless* you want to attach callbacks, inspect the result or exception, or
> cancel it (all of which require a Future), your code shouldn't be concerned
> about the difference -- you should just use `res = yield from func(args)`
> and use try/except to catch exceptions if you care. And if you do need a
> Future, you can call the function asyncio.async() on it (which in PEP 492 is
> renamed to ensure_future()).

Again, makes sense. Although there are some bits of example code in
the docs that call asyncio.async() on a coroutine and throw away the
result (for example,
https://docs.python.org/3/library/asyncio-task.html#example-future-with-run-until-complete).
That confuses me. Are you saying that async() modifies its (coroutine)
argument to make it a Future? Rather than wrapping a coroutine in a
Future, which gets returned?

> In the PEP 492 world, these concepts map as follows:
>
> - Future translates to "something with an __await__ method" (and asyncio
> Futures are trivially made compliant by defining Future.__await__ as an
> alias for Future.__iter__);
>
> - "asyncio coroutine" maps to "PEP 492 coroutine object" (either defined
> with `async def` or a generator decorated with @types.coroutine -- note that
> @asyncio.coroutine incorporates the latter);
>
> - "either of the above" maps to "awaitable".

OK. Although "future" is a nicer term than "something with an
__await__ method" and the plethora of flavours of coroutine is not
great. But given that the only term we'll need in common cases is
"awaitable", it's still a net improvement.

So in the PEP 492 world, there's no such thing as a Task outside of
asyncio? Or, to put it another way, a Task is only relevant in an IO
context (unless an alternative event loop library implemented a
similar concept), and we should only be talking in terms of awaitables
and futures (given concurrent.futures and asyncio, I doubt you're
going to be able to stop people using "Future" for the generic term
for "something with an __await__ method" at best, and quite possibly
as equivalent to "awaitable", unfortunately).

Paul

From jimjjewett at gmail.com  Tue May  5 23:31:02 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Tue, 05 May 2015 14:31:02 -0700 (PDT)
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <55491E94.6080009@gmail.com>
Message-ID: <55493696.e017370a.0541.5026@mx.google.com>


Tue May 5 21:48:36 CEST 2015, Yury Selivanov wrote:

> As for terminology, I view this discussion differently.  It's
> not about the technical details (Python has asymmetric
> coroutines, that's it), but rather on how to disambiguate
> coroutines implemented with generators and yield-from, from
> new 'async def' coroutines.

Not just "How?", but "Why?".

Why do they *need* to be disambiguated?

With the benefit of having recently read all that discussion
(as opposed to just the PEP), my answer is ... uh ... that
generators vs "async def" is NOT an important distinction.
What matters (as best I can tell) is:

"something using yield (or yield from) to mark execution context switches"

  vs

"other kinds of callables, including those using yield to make an iterator"


I'm not quite sure that the actual proposal even really separates them
effectively, in part because the terminology keeps suggesting other
distinctions instead.  (The glossary does help; just not enough.)


-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From p.f.moore at gmail.com  Tue May  5 23:31:56 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:31:56 +0100
Subject: [Python-Dev] PEP 492: Please mention the Event Loop
In-Reply-To: <554931a2.551a370a.0ed9.4c61@mx.google.com>
References: <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <554931a2.551a370a.0ed9.4c61@mx.google.com>
Message-ID: <CACac1F80krJ+dTwv4u2czZAcXRCYEp9mXFRcNC5ZjRzgpP79Gg@mail.gmail.com>

On 5 May 2015 at 22:09, Jim J. Jewett <jimjjewett at gmail.com> wrote:
> Proposed second paragraph of the abstract:
>
> This PEP assumes that the asynchronous tasks are scheduled and
> coordinated by an Event Loop similar to that of stdlib module
> asyncio.events.AbstractEventLoop.  While the PEP is not tied to
> any specific Event Loop implementation, it is relevant only to
> the kind of coroutine that uses "yield" as a signal to the scheduler,
> indicating that the coroutine will be waiting until an event (such
> as IO) is completed.

+1. If that's not accurate then by all means correct any mistakes in
it. But assuming it *is* accurate, it would help a lot.
Paul

From yselivanov.ml at gmail.com  Tue May  5 23:38:09 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 17:38:09 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>	<55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
Message-ID: <55493841.9070601@gmail.com>

On 2015-05-05 5:01 PM, Paul Moore wrote:
> On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>> On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
>>> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated PEP492.
>>>
>>> Where are the following over-simplifications wrong?
>>>
>>> (1)  The PEP is intended for use (almost exclusively) with
>>> asynchronous IO and a scheduler such as the asyncio event loop.
>> Yes. You can also use it for UI loops.  Basically, anything
>> that can call your code asynchronously.
> Given that the stdlib doesn't provide an example of such a UI loop,
> what would a 3rd party module need to implement to provide such a
> thing? Can any of the non-IO related parts of asyncio be reused for
> the purpose, or must the 3rd party module implement everything from
> scratch?

The idea is that you integrate processing of UI events to
your event loop of choice.  For instance, Twisted has
integration for QT and other libraries [1].  This way you
can easily combine async network (or OS) calls with your
UI logic to avoid "callback hell".

Quick search for something like that for asyncio revealed
this library: [2].  This small library actually re-implements
relevant low-level parts of the asyncio event loop on top of
QT primitives (another approach).

Yury

[1] http://twistedmatrix.com/trac/wiki/QTReactor
[2] https://github.com/harvimt/quamash#usage -- see first_50

From p.f.moore at gmail.com  Tue May  5 23:40:46 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:40:46 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>
Message-ID: <CACac1F_gKgajcvkkp8JtubSVk=iA2MQUNjM=7vDVBFV+kjmXrQ@mail.gmail.com>

On 5 May 2015 at 22:25, Guido van Rossum <guido at python.org> wrote:
>> I'd be interested in writing, for instructional purposes, a toy but
>> complete event loop. But I'm *not* really interested in trying to
>> reverse engineer the required interface.
>
> This is a great idea. What kind of application do you have in mind?

At this point, *all* I'm thinking of is a toy. So, an implementation
somewhat parallel to asyncio, but where the event loop just passes
control to the next task - so no IO multiplexing. Essentially Greg
Ewing's example up to, but not including, "Waiting for External
Events". And ideally I'd like to think that "Waiting for Resources"
can be omitted in favour of reusing
https://docs.python.org/3/library/asyncio-sync.html and
https://docs.python.org/3/library/asyncio-queue.html. My fear is,
however, that those parts of asyncio aren't reusable for other event
loops, and every event loop implementation has to reinvent those
wheels.

When I say "the required interface" I'm thinking in terms of "what's
needed to allow reuse of the generic parts of asyncio". If nothing of
asyncio is generic in those terms, then the exercise will be futile
(except in the negative sense of confirming that there are no reusable
async components in the stdlib).

Paul

From yselivanov.ml at gmail.com  Tue May  5 23:44:20 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 17:44:20 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <55493696.e017370a.0541.5026@mx.google.com>
References: <55493696.e017370a.0541.5026@mx.google.com>
Message-ID: <554939B4.5010701@gmail.com>



On 2015-05-05 5:31 PM, Jim J. Jewett wrote:
> Tue May 5 21:48:36 CEST 2015, Yury Selivanov wrote:
>
>> >As for terminology, I view this discussion differently.  It's
>> >not about the technical details (Python has asymmetric
>> >coroutines, that's it), but rather on how to disambiguate
>> >coroutines implemented with generators and yield-from, from
>> >new 'async def' coroutines.
> Not just "How?", but "Why?".
>
> Why do they*need*  to be disambiguated?

To clearly show how one interacts with the other, to explain
how backwards compatibility is implemented, and to better
illustrate some additional (and necessary) restrictions
we put on 'async def' coroutines.  Otherwise, the PEP would
be completely unreadable :)


Yury

From guido at python.org  Tue May  5 23:50:23 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 14:50:23 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F_JhpCCrnz7+Uxf++AwDscgz5kgyE=u5hhPsLEsgP=1WQ@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
 <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
 <CACac1F_JhpCCrnz7+Uxf++AwDscgz5kgyE=u5hhPsLEsgP=1WQ@mail.gmail.com>
Message-ID: <CAP7+vJLht=iwjxBuh0qTP_xbGQAY5W6ehS0d+JUgPBzGFzt2zQ@mail.gmail.com>

On Tue, May 5, 2015 at 2:29 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 22:12, Guido van Rossum <guido at python.org> wrote:
> > I apologize for the confusing documentation. We need more help from
> > qualified tech writers! Writing PEP 3156 was a huge undertaking for me;
> > after that I was exhausted and did not want to take on writing the end
> user
> > documentation as well, so it was left unfinished. :-(
>
> Fair enough. When I properly document one of my projects, *then* I'll
> think about complaining :-) These things happen.
>
> > In PEP 3156 (asyncio package) there are really three separate concepts:
> >
> > - Future, which is a specific class (of which Task is a subclass);
> >
> > - coroutine, by which in this context is meant a generator object
> obtained
> > by calling a generator function decorated with @asyncio.coroutine and
> > written to conform to the asyncio protocol for coroutines (i.e. don't use
> > bare yield, only use yield from, and the latter always with either a
> Future
> > or a coroutine as argument);
> >
> > - either of the above, which is actually the most common requirement --
> most
> > asyncio functions that support one also support the other, and either is
> > allowable as the argument to `yield from`.
> >
> > In the implementation we so often flipped between Future and coroutine
> that
> > I imagine sometimes the implementation and docs differ; also, we don't
> have
> > a good short name for "either of the above" so we end up using one or the
> > other as a shorthand.
>
> OK, that makes a lot of sense.
>
> > *Unless* you want to attach callbacks, inspect the result or exception,
> or
> > cancel it (all of which require a Future), your code shouldn't be
> concerned
> > about the difference -- you should just use `res = yield from func(args)`
> > and use try/except to catch exceptions if you care. And if you do need a
> > Future, you can call the function asyncio.async() on it (which in PEP
> 492 is
> > renamed to ensure_future()).
>
> Again, makes sense. Although there are some bits of example code in
> the docs that call asyncio.async() on a coroutine and throw away the
> result (for example,
>
> https://docs.python.org/3/library/asyncio-task.html#example-future-with-run-until-complete
> ).
> That confuses me. Are you saying that async() modifies its (coroutine)
> argument to make it a Future? Rather than wrapping a coroutine in a
> Future, which gets returned?
>

No, it wraps a coroutine (i.e. a generator) in a Task, but leaves a Future
alone. I'm stumped why that example calls async() and then throws the
result away. I suspect it won't work without it (or else Victor wouldn't
have added the call) but the reason seems, um, deep. I think wrapping it in
a Task enters the generator in the event loop's queue of runnables --
otherwise the generator may well be garbage-collected without ever running.

Such complexity doesn't belong in such a simple example though.
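The scheduling effect can be sketched with the ensure_future() name mentioned above (a toy example; the coroutine here is invented for illustration):

```python
import asyncio

async def background():
    await asyncio.sleep(0)
    return "done"

async def main():
    # Wrapping the bare coroutine in a Task enters it into the loop's
    # queue of runnables; an unwrapped coroutine object that nothing
    # awaits would be garbage-collected without ever running.
    task = asyncio.ensure_future(background())
    return await task

print(asyncio.run(main()))
```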


> > In the PEP 492 world, these concepts map as follows:
> >
> > - Future translates to "something with an __await__ method" (and asyncio
> > Futures are trivially made compliant by defining Future.__await__ as an
> > alias for Future.__iter__);
> >
> > - "asyncio coroutine" maps to "PEP 492 coroutine object" (either defined
> > with `async def` or a generator decorated with @types.coroutine -- note
> that
> > @asyncio.coroutine incorporates the latter);
> >
> > - "either of the above" maps to "awaitable".
>
> OK. Although "future" is a nicer term than "something with an
> __await__ method" and the plethora of flavours of coroutine is not
> great. But given that the only term we'll need in common cases is
> "awaitable", it's still a net improvement.
>
> So in the PEP 492 world, there's no such thing as a Task outside of
> asyncio? Or, to put it another way, a Task is only relevant in an IO
> context (unless an alternative event loop library implemented a
> similar concept), and we should only be talking in terms of awaitables
> and futures (given concurrent.futures and asyncio, I doubt you're
> going to be able to stop people using "Future" for the generic term
> for "something with an __await__ method" at best, and quite possibly
> as equivalent to "awaitable", unfortunately).
>

I'm not sure. But it's true that Futures and Tasks in asyncio serve the
purpose of linking the event loop (whose basic functioning is
callback-based) to coroutines (implemented by generators). The basic idea
is that when some I/O completes the event loop will call a callback
function registered for that particular I/O operation; the callback then is
actually a bound method of a Future or Task that causes the latter to be
marked as "complete" (i.e. having a result) which in turn will call other
callbacks (registered with the Future using add_done_callback()); in the
case of a Task (i.e. a special kind of Future that wraps a
generator/coroutine) this will resume the coroutine. (Actually it may
resume an entire stack of coroutines that are blocked waiting for each
other at yield-from; in my spare time I'm working on an explanation of the
machinery underlying yield, yield from and await that will explain this.)

It's likely that you could write a much simpler event loop by assuming just
coroutines (either the kind implemented by generators using yield from or
the PEP 492 kind). The reason asyncio uses callbacks at the lower levels is
the hope of fostering interoperability with Twisted and Tornado (and even
gevent, which also has an event loop at the bottom of everything).
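Such a coroutine-only loop fits in a few lines (a toy round-robin scheduler with no I/O multiplexing; all names here are invented for illustration):

```python
from collections import deque

class Switch:
    """Awaiting this suspends the coroutine, handing control to the loop."""
    def __await__(self):
        yield

class ToyLoop:
    def __init__(self):
        self.ready = deque()
        self.results = []

    def create_task(self, coro):
        self.ready.append(coro)

    def run(self):
        while self.ready:
            coro = self.ready.popleft()
            try:
                coro.send(None)              # resume until next suspension
            except StopIteration as exc:
                self.results.append(exc.value)
            else:
                self.ready.append(coro)      # still running; reschedule

async def worker(name, steps):
    for _ in range(steps):
        await Switch()                       # cooperative context switch
    return name

loop = ToyLoop()
loop.create_task(worker("a", 2))
loop.create_task(worker("b", 1))
loop.run()
print(loop.results)  # the shorter task finishes first
```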

-- 
--Guido van Rossum (python.org/~guido)

From p.f.moore at gmail.com  Tue May  5 23:54:16 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 22:54:16 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55493841.9070601@gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
Message-ID: <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>

On 5 May 2015 at 22:38, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> On 2015-05-05 5:01 PM, Paul Moore wrote:
>>
>> On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>>>
>>> On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
>>>>
>>>> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated
>>>> PEP492.
>>>>
>>>> Where are the following over-simplifications wrong?
>>>>
>>>> (1)  The PEP is intended for use (almost exclusively) with
>>>> asynchronous IO and a scheduler such as the asyncio event loop.
>>>
>>> Yes. You can also use it for UI loops.  Basically, anything
>>> that can call your code asynchronously.
>>
>> Given that the stdlib doesn't provide an example of such a UI loop,
>> what would a 3rd party module need to implement to provide such a
>> thing? Can any of the non-IO related parts of asyncio be reused for
>> the purpose, or must the 3rd party module implement everything from
>> scratch?
>
> The idea is that you integrate processing of UI events to
> your event loop of choice.  For instance, Twisted has
> integration for QT and other libraries [1].  This way you
> can easily combine async network (or OS) calls with your
> UI logic to avoid "callback hell".

We seem to be talking at cross purposes. You say the PEP is *not*
exclusively intended for use with asyncio. You mention UI loops, but
when asked how to implement such a loop, you say that I integrate UI
events into my event loop of choice. But what options do I have for
"my event loop of choice"? Please provide a concrete example that
isn't asyncio. Can I use PEP 492 with Twisted (I doubt it, as Twisted
doesn't use yield from, which is Python 3.x only)? I contend that
there *is* no concrete example that currently exists, so I'm asking
what I'd need to do to write one. You pointed at quamash, but that
seems to be subclassing asyncio, so isn't "something that isn't
asyncio".

Note that I don't have a problem with there being no existing
implementation other than asyncio. I'd just like it if we could be
clear over exactly what we mean when we say "the PEP is not tied to
asyncio". It feels like the truth currently is "you can write your own
async framework that uses the new features introduced by the PEP". I
fully expect that *if* there's a need for async frameworks that aren't
fundamentally IO multiplexors, then it'll get easier to write them
over time (the main problem right now is a lack of good tutorial
examples of how to do so). But at the moment, asyncio seems to be the
only game in town (and I can imagine that it'll always be the main IO
multiplexor, unless existing frameworks like Twisted choose to compete
rather than integrate).

Paul

From njs at pobox.com  Wed May  6 00:01:05 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Tue, 5 May 2015 15:01:05 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
 <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
Message-ID: <CAPJVwB=4shOFRfL6QfmAsqOy1k1WXXY=yqsL_PUReQGToc5=Hw@mail.gmail.com>

On May 5, 2015 2:14 PM, "Guido van Rossum" <guido at python.org> wrote:
>
> In the PEP 492 world, these concepts map as follows:
>
> - Future translates to "something with an __await__ method" (and asyncio
Futures are trivially made compliant by defining Future.__await__ as an
alias for Future.__iter__);
>
> - "asyncio coroutine" maps to "PEP 492 coroutine object" (either defined
with `async def` or a generator decorated with @types.coroutine -- note
that @asyncio.coroutine incorporates the latter);
>
> - "either of the above" maps to "awaitable".

Err, aren't the first and third definitions above identical?

Surely we want to say: an async def function is a convenient shorthand for
creating a custom awaitable (exactly like how generators are a convenient
shorthand for creating custom iterators), and a Future is-an awaitable that
also adds some extra methods.
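That analogy can be made concrete (a sketch; `Ready` and `ready` are invented names):

```python
import asyncio

class Ready:
    """A hand-written awaitable: __await__ must return an iterator."""
    def __init__(self, value):
        self.value = value

    def __await__(self):
        # A generator whose yield is unreachable completes immediately;
        # its return value becomes the result of the await expression.
        if False:
            yield
        return self.value

async def ready(value):
    # The async-def shorthand for the same awaitable behaviour.
    return value

async def main():
    return (await Ready(5)) + (await ready(7))

print(asyncio.run(main()))
```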

-n

From guido at python.org  Wed May  6 00:12:03 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 15:12:03 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAPJVwB=4shOFRfL6QfmAsqOy1k1WXXY=yqsL_PUReQGToc5=Hw@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
 <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
 <CAPJVwB=4shOFRfL6QfmAsqOy1k1WXXY=yqsL_PUReQGToc5=Hw@mail.gmail.com>
Message-ID: <CAP7+vJKdz3f16O=xCT-Ee=hXhkh98DubdXSL=30VstCnkXU4LA@mail.gmail.com>

On Tue, May 5, 2015 at 3:01 PM, Nathaniel Smith <njs at pobox.com> wrote:

On May 5, 2015 2:14 PM, "Guido van Rossum" <guido at python.org> wrote:
> >
> > In the PEP 492 world, these concepts map as follows:
> >
> > - Future translates to "something with an __await__ method" (and asyncio
> Futures are trivially made compliant by defining Future.__await__ as an
> alias for Future.__iter__);
> >
> > - "asyncio coroutine" maps to "PEP 492 coroutine object" (either defined
> with `async def` or a generator decorated with @types.coroutine -- note
> that @asyncio.coroutine incorporates the latter);
> >
> > - "either of the above" maps to "awaitable".
>
> Err, aren't the first and third definitions above identical?
>
> Surely we want to say: an async def function is a convenient shorthand for
> creating a custom awaitable (exactly like how generators are a convenient
> shorthand for creating custom iterators), and a Future is-an awaitable that
> also adds some extra methods.
>

The current PEP 492 proposal does endow the object returned by calling an
async function (let's call it a coroutine object) with an __await__ method.
And there's a good reason for this -- the bytecode generated for await
treats coroutine objects specially, just like the bytecode generated for
yield-from treats generator objects specially. The special behavior they have
in common is the presence of send() and throw() methods, which are used to
allow send() and throw() calls on the outer generator to be passed into the
inner generator with minimal fuss. (This is the reason why "yield from X"
is *not* equivalent to "for x in X: yield x".)
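[Editor's note: Guido's description above can be made concrete with a toy awaitable. This is a minimal sketch, not asyncio's actual Future class; it only illustrates the `__await__`-as-alias-for-`__iter__` trick he mentions, and how a `send()` on the outer coroutine reaches the suspended inner generator. The driver loop at the bottom is a hand-rolled stand-in for an event loop.]

```python
class Future:
    # Minimal sketch of an awaitable Future (NOT asyncio's real one),
    # assuming only the convention described above: __await__ may
    # simply alias an iterator method implemented as a generator.
    def __init__(self):
        self._result = None

    def set_result(self, value):
        self._result = value

    def __iter__(self):
        # Suspend once, handing ourselves to the driver; when the
        # driver resumes us, the stored result becomes the value of
        # the await expression.
        yield self
        return self._result

    __await__ = __iter__  # the alias described above

async def consumer():
    fut = Future()
    value = await fut   # suspends; resumes when the driver send()s
    return value * 2

# A hand-rolled driver standing in for an event loop:
coro = consumer()
fut = coro.send(None)       # runs until the Future is yielded
fut.set_result(21)
try:
    coro.send(None)         # resume; the await expression yields 21
except StopIteration as stop:
    assert stop.value == 42  # consumer returned 21 * 2
```

Because `send()` and `throw()` are forwarded with minimal fuss, the `set_result`/resume dance above works identically no matter how many `await` frames sit between the driver and the Future.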

@Yury: I have a feeling the PEP could use more clarity here -- perhaps the
section "Await Expression" should explain what the interpreter does for
each type of awaitable?

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/3283e2c5/attachment-0001.html>

From yselivanov.ml at gmail.com  Wed May  6 00:25:59 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 18:25:59 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
Message-ID: <55494377.3040608@gmail.com>

Paul,

On 2015-05-05 5:54 PM, Paul Moore wrote:
> On 5 May 2015 at 22:38, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>> On 2015-05-05 5:01 PM, Paul Moore wrote:
>>> On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>>>> On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
>>>>> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated
>>>>> PEP492.
>>>>>
>>>>> Where are the following over-simplifications wrong?
>>>>>
>>>>> (1)  The PEP is intended for use (almost exclusively) with
>>>>> asynchronous IO and a scheduler such as the asyncio event loop.
>>>> Yes. You can also use it for UI loops.  Basically, anything
>>>> that can call your code asynchronously.
>>> Given that the stdlib doesn't provide an example of such a UI loop,
>>> what would a 3rd party module need to implement to provide such a
>>> thing? Can any of the non-IO related parts of asyncio be reused for
>>> the purpose, or must the 3rd party module implement everything from
>>> scratch?
>> The idea is that you integrate processing of UI events to
>> your event loop of choice.  For instance, Twisted has
>> integration for QT and other libraries [1].  This way you
>> can easily combine async network (or OS) calls with your
>> UI logic to avoid "callback hell".
> We seem to be talking at cross purposes. You say the PEP is *not*
> exclusively intended for use with asyncio. You mention UI loops, but
> when asked how to implement such a loop, you say that I integrate UI
> events into my event loop of choice. But what options do I have for
> "my event loop of choice"? Please provide a concrete example that
> isn't asyncio.

Yes, there is no other popular event loop for 3.4 other
than asyncio, that uses coroutines based on generators
(as far as I know).

And yes, the PEP is not exclusively intended for use
with asyncio, but asyncio is the only library that ships
with Python, and is Python 3 ready, so its users will be
the first ones to directly benefit from this proposal.

> Can I use PEP 492 with Twisted (I doubt it, as Twisted
> doesn't use yield from, which is Python 3.x only)? I contend that
> there *is* no concrete example that currently exists, so I'm asking
> what I'd need to do to write one. You pointed at qamash, but that
> seems to be subclassing asyncio, so isn't "something that isn't
> asyncio".

When Twisted is ported to Python 3, I'd be really surprised
if it didn't allow use of the new syntax.  @inlineCallbacks
implements a trampoline to make 'yields' work.  This is a
much slower approach than using 'yield from' (and 'await'
from PEP 492).  Not to mention the 'async with' and 'async
for' features.  (There shouldn't be a problem supporting both
@inlineCallbacks and the PEP 492 approach, if I'm not missing
something.)

>
> Note that I don't have a problem with there being no existing
> implementation other than asyncio. I'd just like it if we could be
> clear over exactly what we mean when we say "the PEP is not tied to
> asyncio".


Well, "the PEP is not tied to asyncio" -- this is correct.
*The new syntax and new protocols know nothing about asyncio*.

asyncio will know about the PEP by implementing new protocols
where required etc (but supporting these new features isn't
in the scope of the PEP).


> It feels like the truth currently is "you can write your own
> async framework that uses the new features introduced by the PEP". I
> fully expect that *if* there's a need for async frameworks that aren't
> fundamentally IO multiplexors, then it'll get easier to write them
> over time (the main problem right now is a lack of good tutorial
> examples of how to do so). But at the moment, asyncio seems to be the
> only game in town (and I can imagine that it'll always be the main IO
> multiplexor, unless existing frameworks like Twisted choose to compete
> rather than integrate).

Agree.  But if the existing frameworks choose to compete,
or someone decides to write something better than asyncio,
they can benefit from PEP 492.


Yury

From guido at python.org  Wed May  6 00:28:07 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 15:28:07 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F_gKgajcvkkp8JtubSVk=iA2MQUNjM=7vDVBFV+kjmXrQ@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>
 <CACac1F_gKgajcvkkp8JtubSVk=iA2MQUNjM=7vDVBFV+kjmXrQ@mail.gmail.com>
Message-ID: <CAP7+vJJndugsKv828c2m=NOuQsYS7evv-HpRqXz1erG=VHm54g@mail.gmail.com>

On Tue, May 5, 2015 at 2:40 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 22:25, Guido van Rossum <guido at python.org> wrote:
>
[Paul:]

> >> I'd be interested in writing, for instructional purposes, a toy but
> >> complete event loop. But I'm *not* really interested in trying to
> >> reverse engineer the required interface.
> >
> > This is a great idea. What kind of application do you have in mind?
>
> At this point, *all* I'm thinking of is a toy. So, an implementation
> somewhat parallel to asyncio, but where the event loop just passes
> control to the next task - so no IO multiplexing. Essentially Greg
> Ewing's example up to, but not including, "Waiting for External
> Events". And ideally I'd like to think that "Waiting for Resources"
> can be omitted in favour of reusing
> https://docs.python.org/3/library/asyncio-sync.html and
> https://docs.python.org/3/library/asyncio-queue.html. My fear is,
> however, that those parts of asyncio aren't reusable for other event
> loops, and every event loop implementation has to reinvent those
> wheels.
>

It was never a goal of asyncio to have parts that were directly reusable by
other event loops without pulling in (almost) all of asyncio. The
interoperability offered by asyncio allows other event loops to implement
the same low-level interface as asyncio, or to build on top of asyncio.
(This is why the event loop uses callbacks and isn't coroutines/generators
all the way down.) Note that asyncio.get_event_loop() may return a loop
implemented by some other framework, and the rest of asyncio will then use
that event loop. This is enabled by the EventLoopPolicy interface.
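[Editor's note: the EventLoopPolicy hook Guido mentions can be sketched roughly as follows. `FrameworkEventLoopPolicy` is a made-up name, and for brevity it just wraps the default loop rather than providing a genuinely alternative implementation; a real framework would return its own loop class from `new_event_loop()`.]

```python
import asyncio

class FrameworkEventLoopPolicy(asyncio.DefaultEventLoopPolicy):
    # Hypothetical third-party policy: a framework would construct
    # and return its own event loop here.  We merely delegate to the
    # default policy to keep the sketch self-contained.
    def new_event_loop(self):
        loop = super().new_event_loop()
        # ... framework-specific setup would go here ...
        return loop

# Install the policy; asyncio.get_event_loop()/new_event_loop() will
# now hand out loops produced by the framework's policy.
asyncio.set_event_loop_policy(FrameworkEventLoopPolicy())
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

async def hello():
    return "hi"

result = loop.run_until_complete(hello())
loop.close()
assert result == "hi"
```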


> When I say "the required interface" I'm thinking in terms of "what's
> needed to allow reuse of the generic parts of asyncio". If nothing of
> asyncio is generic in those terms, then the exercise will be futile
> (except in the negative sense of confirming that there are no reusable
> async components in the stdlib).
>
> Paul
>

What do you hope to learn or teach by creating this toy example? And how do
you define "a complete event loop"?

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/0fca24a0/attachment.html>

From yselivanov.ml at gmail.com  Wed May  6 00:33:37 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 18:33:37 -0400
Subject: [Python-Dev] PEP 492: Please mention the Event Loop
In-Reply-To: <554931a2.551a370a.0ed9.4c61@mx.google.com>
References: <554931a2.551a370a.0ed9.4c61@mx.google.com>
Message-ID: <55494541.2060604@gmail.com>

Jim,

On 2015-05-05 5:09 PM, Jim J. Jewett wrote:
> On Tue May 5 21:44:26 CEST 2015, Brett Cannon wrote:
>
>> It's not as
>> complicated as it seems when you realize there is an event loop driving
>> everything (which people have been leaving out of the conversation since it
>> doesn't tie into the syntax directly).
[..]
> Proposed second paragraph of the abstract:
>
> This PEP assumes that the asynchronous tasks are scheduled and
> coordinated by an Event Loop similar to that of stdlib module
> asyncio.events.AbstractEventLoop.  While the PEP is not tied to
> any specific Event Loop implementation, it is relevant only to
> the kind of coroutine that uses "yield" as a signal to the scheduler,
> indicating that the coroutine will be waiting until an event (such
> as IO) is completed.
>
>

Thank you for this suggestion.  I've added it to the PEP:
https://hg.python.org/peps/rev/7ac132b24f1f


Yury

From p.f.moore at gmail.com  Wed May  6 00:37:02 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 23:37:02 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55494377.3040608@gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com>
Message-ID: <CACac1F8SHvQD_UbY2hzZ6vg-BX4KYKqzSD9cOET8HWBtkMvaQg@mail.gmail.com>

On 5 May 2015 at 23:25, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>> Note that I don't have a problem with there being no existing
>> implementation other than asyncio. I'd just like it if we could be
>> clear over exactly what we mean when we say "the PEP is not tied to
>> asyncio".
>
> Well, "the PEP is not tied to asyncio" -- this is correct.
> *The new syntax and new protocols know nothing about asyncio*.
>
> asyncio will know about the PEP by implementing new protocols
> where required etc (but supporting these new features isn't
> in the scope of the PEP).

Thanks. That's something that may be worth explicitly noting in the
PEP (I don't recall it from when I last looked but that was a while
ago).
Paul

From p.f.moore at gmail.com  Wed May  6 00:52:57 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 5 May 2015 23:52:57 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJJndugsKv828c2m=NOuQsYS7evv-HpRqXz1erG=VHm54g@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>
 <CACac1F_gKgajcvkkp8JtubSVk=iA2MQUNjM=7vDVBFV+kjmXrQ@mail.gmail.com>
 <CAP7+vJJndugsKv828c2m=NOuQsYS7evv-HpRqXz1erG=VHm54g@mail.gmail.com>
Message-ID: <CACac1F9ypZTXwHrLwdbi_X20if_phv3MRoo_w_66W-zxVQPTiw@mail.gmail.com>

On 5 May 2015 at 23:28, Guido van Rossum <guido at python.org> wrote:
>> At this point, *all* I'm thinking of is a toy. So, an implementation
>> somewhat parallel to asyncio, but where the event loop just passes
>> control to the next task - so no IO multiplexing. Essentially Greg
>> Ewing's example up to, but not including, "Waiting for External
>> Events". And ideally I'd like to think that "Waiting for Resources"
>> can be omitted in favour of reusing
>> https://docs.python.org/3/library/asyncio-sync.html and
>> https://docs.python.org/3/library/asyncio-queue.html. My fear is,
>> however, that those parts of asyncio aren't reusable for other event
>> loops, and every event loop implementation has to reinvent those
>> wheels.
>
> It was never a goal of asyncio to have parts that were directly reusable by
> other event loops without pulling in (almost) all of asyncio. The
> interoperability offered by asyncio allows other event loops to implement
> the same low-level interface as asyncio, or to build on top of asyncio.
> (This is why the event loop uses callbacks and isn't coroutines/generators
> all the way down.) Note that asyncio.get_event_loop() may return a loop
> implemented by some other framework, and the rest of asyncio will then use
> that event loop. This is enabled by the EventLoopPolicy interface.

OK, that's an entirely fair comment. It's difficult to tell from the
docs - there's nothing obviously io-related about the task
abstraction, or the synchronisation or queue primitives. But there's
equally no reason to assume that they would work with another
implementation. As I mentioned somewhere else, maybe refactoring the
bits of asyncio that can be reused into an asynclib module would be
useful. But based on what you said, there's no reason to assume that
would be an easy job. And without another event loop implementation,
it's not obvious that there's a justification for doing so.

> What do you hope to learn or teach by creating this toy example? And how do
> you define "a complete event loop"?

Well, one thing I hope to learn, I guess, is what "a complete event
loop" consists of :-)

More broadly, I'd like to get a better feel for what methods are
fundamental to an event loop. IIRC, we had this discussion way back at
the beginning of the asyncio development when I was unclear about why
create_connection had to be an event loop method. In the asyncio
context, it has to be because the event loop needs to know when
connections get created (excuse me probably misremembering the exact
reason from back then). But conversely, it's easy to imagine an event
loop unrelated to socket IO that doesn't have a create_connection
method. On the other hand, an event loop with no call_soon method
seems unlikely. So in essence I'm thinking about what a "sensible
minimum" event loop might be. An event loop ABC, if you like.

And following on from there, what useful abstractions (tasks,
synchronisation and queue primitives) can be built on top of such a
minimal interface. Basically, that's what I'm hoping to learn - what
is fundamental (or at least generally applicable) and what is related
to the purpose of a given implementation.

I've probably got enough from this discussion to try writing up some
code and see where it leads me.
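[Editor's note: a "sensible minimum" loop along the lines sketched above might, under the assumption that call_soon() plus a run-until-empty driver really is enough, look something like this toy. No IO multiplexing; all names (MiniLoop, Task, switch, worker) are invented for illustration.]

```python
import collections
import types

class MiniLoop:
    # Toy minimal event loop: just a FIFO of callbacks and a driver.
    def __init__(self):
        self._ready = collections.deque()

    def call_soon(self, callback, *args):
        self._ready.append((callback, args))

    def run(self):
        # Run until there is nothing left to do.
        while self._ready:
            callback, args = self._ready.popleft()
            callback(*args)

class Task:
    # Steps a coroutine one yield at a time via call_soon callbacks.
    def __init__(self, coro, loop):
        self.coro = coro
        self.loop = loop
        self.result = None
        loop.call_soon(self._step)

    def _step(self):
        try:
            self.coro.send(None)
        except StopIteration as stop:
            self.result = stop.value
            return
        self.loop.call_soon(self._step)  # reschedule after each yield

@types.coroutine
def switch():
    yield  # give other tasks a turn

async def worker(name, log):
    for i in range(2):
        log.append((name, i))
        await switch()
    return name

log = []
loop = MiniLoop()
t1 = Task(worker("a", log), loop)
t2 = Task(worker("b", log), loop)
loop.run()
assert log == [("a", 0), ("b", 0), ("a", 1), ("b", 1)]
assert t1.result == "a" and t2.result == "b"
```

The two workers interleave round-robin purely through call_soon scheduling, which suggests tasks and queues could indeed sit on top of a very small loop interface.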

Paul

PS You mentioned that the callback-based nature of the asyncio event
loop is there to simplify interoperability with callback-based frameworks
like Twisted. I guess the above ignores the possibility of event loops
that *aren't* callback-based. Or maybe it doesn't - that's possibly
another class of methods (callback-focused ones) that maybe can be
separated into their own ABC.

From guido at python.org  Wed May  6 01:05:57 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 16:05:57 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F9ypZTXwHrLwdbi_X20if_phv3MRoo_w_66W-zxVQPTiw@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <CAP7+vJJT=hgYwW1XA7Y6C1uwCv7jy3zeJPzf3diy5kFP5jV5_w@mail.gmail.com>
 <CACac1F_gKgajcvkkp8JtubSVk=iA2MQUNjM=7vDVBFV+kjmXrQ@mail.gmail.com>
 <CAP7+vJJndugsKv828c2m=NOuQsYS7evv-HpRqXz1erG=VHm54g@mail.gmail.com>
 <CACac1F9ypZTXwHrLwdbi_X20if_phv3MRoo_w_66W-zxVQPTiw@mail.gmail.com>
Message-ID: <CAP7+vJLiSay5MWKuVHf4U7Cja92ukMsNyaeVe2wmd8eZWqwtPg@mail.gmail.com>

I wonder if you could look at Tkinter for a very different view of the
world. While there are ways to integrate socket I/O with the Tcl/Tk event
loop, the typical Python app using Tkinter probably moves network I/O (if
it has any) to a separate thread, and ignores the delays of disk-based I/O
(because modern disk I/O is usually faster than the minimal event response
time -- assuming you don't have floppy disks :-).

I'm not entirely sure how you would use coroutines with Tkinter, but I
could imagine that e.g. mouse tracking code such as found in drawing apps
might be written more easily using a loop that uses await (or yield [from])
to get another event rather than as a callback function for the "mouse
move" event.

The mechanics of writing a multiplexer that receives Tkinter events and
uses them to decide which generator/coroutine to wake up might be too much
for your purpose, but it would provide a real-life example of an event loop
that's not built for network I/O.

On Tue, May 5, 2015 at 3:52 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 23:28, Guido van Rossum <guido at python.org> wrote:
> >> At this point, *all* I'm thinking of is a toy. So, an implementation
> >> somewhat parallel to asyncio, but where the event loop just passes
> >> control to the next task - so no IO multiplexing. Essentially Greg
> >> Ewing's example up to, but not including, "Waiting for External
> >> Events". And ideally I'd like to think that "Waiting for Resources"
> >> can be omitted in favour of reusing
> >> https://docs.python.org/3/library/asyncio-sync.html and
> >> https://docs.python.org/3/library/asyncio-queue.html. My fear is,
> >> however, that those parts of asyncio aren't reusable for other event
> >> loops, and every event loop implementation has to reinvent those
> >> wheels.
> >
> > It was never a goal of asyncio to have parts that were directly reusable
> by
> > other event loops without pulling in (almost) all of asyncio. The
> > interoperability offered by asyncio allows other event loops to implement
> > the same low-level interface as asyncio, or to build on top of asyncio.
> > (This is why the event loop uses callbacks and isn't
> coroutines/generators
> > all the way down.) Note that asyncio.get_event_loop() may return a loop
> > implemented by some other framework, and the rest of asyncio will then
> use
> > that event loop. This is enabled by the EventLoopPolicy interface.
>
> OK, that's an entirely fair comment. It's difficult to tell from the
> docs - there's nothing obviously io-related about the task
> abstraction, or the synchronisation or queue primitives. But there's
> equally no reason to assume that they would work with another
> implementation. As I mentioned somewhere else, maybe refactoring the
> bits of asyncio that can be reused into an asynclib module would be
> useful. But based on what you said, there's no reason to assume that
> would be an easy job. And without another event loop implementation,
> it's not obvious that there's a justification for doing so.
>
> > What do you hope to learn or teach by creating this toy example? And how
> do
> > you define "a complete event loop"?
>
> Well, one thing I hope to learn, I guess, is what "a complete event
> loop" consists of :-)
>
> More broadly, I'd like to get a better feel for what methods are
> fundamental to an event loop. IIRC, we had this discussion way back at
> the beginning of the asyncio development when I was unclear about why
> create_connection had to be an event loop method. In the asyncio
> context, it has to be because the event loop needs to know when
> connections get created (excuse me probably misremembering the exact
> reason from back then). But conversely, it's easy to imagine an event
> loop unrelated to socket IO that doesn't have a create_connection
> method. On the other hand, an event loop with no call_soon method
> seems unlikely. So in essence I'm thinking about what a "sensible
> minimum" event loop might be. An event loop ABC, if you like.
>
> And following on from there, what useful abstractions (tasks,
> synchronisation and queue primitives) can be built on top of such a
> minimal interface. Basically, that's what I'm hoping to learn - what
> is fundamental (or at least generally applicable) and what is related
> to the purpose of a given implementation.
>
> I've probably got enough from this discussion to try writing up some
> code and see where it leads me.
>
> Paul
>
> PS You mentioned that the callback-based nature of the asyncio event
> loop is there to simplify interoperability with callback-based frameworks
> like Twisted. I guess the above ignores the possibility of event loops
> that *aren't* callback-based. Or maybe it doesn't - that's possibly
> another class of methods (callback-focused ones) that maybe can be
> separated into their own ABC.
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/23bcf81d/attachment.html>

From guido at python.org  Wed May  6 01:53:17 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 16:53:17 -0700
Subject: [Python-Dev] Accepting PEP 492 (async/await)
Message-ID: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>

Everybody,

In order to save myself a major headache I'm hereby accepting PEP 492.

I've been following Yury's efforts carefully and I am fully confident that
we're doing the right thing here. There is only so much effort we can put
into clarifying terminology and explaining coroutines. Somebody should
write a tutorial. (I started to write one, but I ran out of time after just
describing basic yield.)

I've given Yury clear instructions to focus on how to proceed -- he's to
work with another core dev on getting the implementation ready in time for
beta 1 (scheduled for May 24, but I think the target date should be May 19).

The acceptance is provisional in the PEP 411 sense (stretching its meaning
to apply to language changes). That is, we reserve the right to change the
specification (or even withdraw it, in a worst-case scenario) until 3.6,
although I expect we won't need to do this except for some peripheral
issues (e.g. the backward compatibility flags).

I now plan to go back to PEP 484 (type hints). Fortunately in that case
there's not much *implementation* that will land (just the typing.py
module), but there's still a lot of language in the PEP that needs updating
(check the PEP 484 tracker <https://github.com/ambv/typehinting/issues>).

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/15c8add8/attachment-0001.html>

From guido at python.org  Wed May  6 01:58:40 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 16:58:40 -0700
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
Message-ID: <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>

I totally forgot to publicly congratulate Yury on this PEP. He's put a huge
effort into writing the PEP and the implementation and managing the
discussion, first on python-ideas, later on python-dev. Congrats, Yury! And
thanks for your efforts. Godspeed.

On Tue, May 5, 2015 at 4:53 PM, Guido van Rossum <guido at python.org> wrote:

> Everybody,
>
> In order to save myself a major headache I'm hereby accepting PEP 492.
>
> I've been following Yury's efforts carefully and I am fully confident that
> we're doing the right thing here. There is only so much effort we can put
> into clarifying terminology and explaining coroutines. Somebody should
> write a tutorial. (I started to write one, but I ran out of time after just
> describing basic yield.)
>
> I've given Yury clear instructions to focus on how to proceed -- he's to
> work with another core dev on getting the implementation ready in time for
> beta 1 (scheduled for May 24, but I think the target date should be May 19).
>
> The acceptance is provisional in the PEP 411 sense (stretching its meaning
> to apply to language changes). That is, we reserve the right to change the
> specification (or even withdraw it, in a worst-case scenario) until 3.6,
> although I expect we won't need to do this except for some peripheral
> issues (e.g. the backward compatibility flags).
>
> I now plan to go back to PEP 484 (type hints). Fortunately in that case
> there's not much *implementation* that will land (just the typing.py
> module), but there's still a lot of language in the PEP that needs updating
> (check the PEP 484 tracker <https://github.com/ambv/typehinting/issues>).
>
> --
> --Guido van Rossum (python.org/~guido)
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/df2ad278/attachment.html>

From ethan at stoneleaf.us  Wed May  6 02:04:37 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Tue, 5 May 2015 17:04:37 -0700
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
 <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
Message-ID: <20150506000437.GD1827@stoneleaf.us>

Congratulations, Yury!

--
~Ethan~

From yselivanov.ml at gmail.com  Wed May  6 02:13:07 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 05 May 2015 20:13:07 -0400
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
Message-ID: <55495C93.4060400@gmail.com>

On 2015-05-05 7:53 PM, Guido van Rossum wrote:
> Everybody,
>
> In order to save myself a major headache I'm hereby accepting PEP 492.
>
> I've been following Yury's efforts carefully and I am fully confident that
> we're doing the right thing here. There is only so much effort we can put
> into clarifying terminology and explaining coroutines. Somebody should
> write a tutorial. (I started to write one, but I ran out of time after just
> describing basic yield.)
>
>

Thank you, Guido!
Yury

From victor.stinner at gmail.com  Wed May  6 02:13:40 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 6 May 2015 02:13:40 +0200
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
Message-ID: <CAMpsgwZvA8VtS-8XesAzm9EPN9Kq-W_to54u7rFEaUnLDVd-Gw@mail.gmail.com>

Hi,

2015-05-06 1:53 GMT+02:00 Guido van Rossum <guido at python.org>:
> In order to save myself a major headache I'm hereby accepting PEP 492.

Great! Congrats Yury.

> I've given Yury clear instructions to focus on how to proceed -- he's to
> work with another core dev on getting the implementation ready in time for
> beta 1 (scheduled for May 24, but I think the target date should be May 19).

The implementation takes place at:
https://bugs.python.org/issue24017

Yury works at https://github.com/1st1/cpython in git branches.

I already sent a first review. But I don't feel able to review the
change on the grammar, parser or things like that.

Victor

From tjreedy at udel.edu  Wed May  6 03:03:52 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 05 May 2015 21:03:52 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55494377.3040608@gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com>
Message-ID: <mibp9r$9b6$1@ger.gmane.org>

On 5/5/2015 6:25 PM, Yury Selivanov wrote:

> Yes, there is no other popular event loop for 3.4 other
> than asyncio,

There is the tk(inter) event loop which also ships with CPython, and 
which is commonly used.

> that uses coroutines based on generators

Oh ;-) The Tkinter event loop is callback-based.  AFAIK, so is the asyncio 
event loop, but that is somehow masked by tasks that interface to 
coroutines.  Do you think the 'somehow' could be adapted to work with 
the tkinter loop?
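[Editor's note: one hedged sketch of the "somehow": any loop that offers a tk-style after(ms, callback) primitive can drive a PEP 492 coroutine one step per callback, with no patch to tk needed. `FakeAfterScheduler` below is a stand-in for a real tk widget so the sketch runs without a display, and `next_chunk` is a hypothetical placeholder for waiting on pipe data.]

```python
import types

class FakeAfterScheduler:
    # Stand-in for a tk widget: real tkinter has widget.after(ms,
    # callback) with the same shape.  The fake just queues callbacks.
    def __init__(self):
        self.pending = []

    def after(self, ms, callback):
        self.pending.append(callback)

    def mainloop(self):
        while self.pending:
            self.pending.pop(0)()

def drive(coro, after):
    # Advance the coroutine one step per scheduled callback, so the
    # surrounding callback-based event loop never blocks.
    def step():
        try:
            coro.send(None)
        except StopIteration:
            return  # coroutine finished; it simply disappears
        after(0, step)
    after(0, step)

@types.coroutine
def next_chunk(lines):
    yield  # in real code: suspend until the pipe has data
    return lines.pop(0) if lines else None

written = []

async def show_output(lines):
    # Analogue of the menu handler: consume lines without blocking.
    while True:
        line = await next_chunk(lines)
        if line is None:
            break
        written.append(line)

tk = FakeAfterScheduler()
drive(show_output(["spam", "eggs"]), tk.after)
tk.mainloop()
assert written == ["spam", "eggs"]
```

Getting "data in pipe" translated into one of these callbacks is the remaining (OS-level) piece; the coroutine-driving side needs nothing beyond after().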

What I do not understand is how io events become event loop Event 
instances.  For tk, keyboard and mouse actions seen by the OS become tk 
Events associated with a widget.  Some widgets generate events. User 
code can also generate (pseudo)events.

My specific use case is to be able to run a program in a separate 
process, but display the output in the gui process -- something like 
this (in Idle, for instance).  (Apologies if this misuses the new keywords.)

async def menu_handler():
     ow = OutputWindow(args)  # tk Widget
     proc = subprocess.Popen (or multiprocessing equivalent)
     out = (stdout from process)
     async for line in out:
         ow.write(line)
     finish()

I want the handler to not block event processing, and disappear after 
finishing.  Might 492 make this possible someday?  Or would having 'line 
in pipe' or just 'data in pipe' translated to a tk event likely require 
a patch to tk?

-- 
Terry Jan Reedy


From guido at python.org  Wed May  6 04:59:16 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 5 May 2015 19:59:16 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <mibp9r$9b6$1@ger.gmane.org>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
Message-ID: <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>

For this you should probably use an integration of asyncio (which can do
async subprocess output nicely) with Tkinter. Over in tulip-land there is
a demo of such an integration.

On Tue, May 5, 2015 at 6:03 PM, Terry Reedy <tjreedy at udel.edu> wrote:

> On 5/5/2015 6:25 PM, Yury Selivanov wrote:
>
>  Yes, there is no other popular event loop for 3.4 other
>> than asyncio,
>>
>
> There is the tk(inter) event loop which also ships with CPython, and which
> is commonly used.
>
>  that uses coroutines based on generators
>>
>
> Oh ;-) Tkinter event loop is callback based.  AFAIK, so is the asyncio
> event loop, but that is somehow masked by tasks that interface to
> coroutines.  Do you think the 'somehow' could be adapted to work with the
> tkinter loop?
>
> What I do not understand is how io events become event loop Event
> instances.  For tk, keyboard and mouse actions seen by the OS become tk
> Events associated with a widget.  Some widgets generate events. User code
> can also generate (pseudo)events.
>
> My specific use case is to be able to run a program in a separate process,
> but display the output in the gui process -- something like this (in Idle,
> for instance).  (Apologies if this misuses the new keywords.)
>
> async def menu_handler():
>     ow = OutputWindow(args)  # tk Widget
>     proc = subprocess.Popen (or multiprocessing equivalent)
>     out = (stdout from process)
>     async for line in out:
>         ow.write(line)
>     finish()
>
> I want the handler to not block event processing, and disappear after
> finishing.  Might 492 make this possible someday?  Or would having 'line in
> pipe' or just 'data in pipe' translated to a tk event likely require a
> patch to tk?
>
> --
> Terry Jan Reedy
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/2c6ea021/attachment.html>

From ben at bendarnell.com  Wed May  6 06:05:28 2015
From: ben at bendarnell.com (Ben Darnell)
Date: Tue, 5 May 2015 21:05:28 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <55494377.3040608@gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com>
Message-ID: <CAFkYKJ5=bmWkCP5vPVQ_k=zgr3ZOSNv2QghMqUnngeO5-Jdb6Q@mail.gmail.com>

On Tue, May 5, 2015 at 3:25 PM, Yury Selivanov <yselivanov.ml at gmail.com>
wrote:

>
> Yes, there is no other popular event loop for 3.4 other
> than asyncio, that uses coroutines based on generators
> (as far as I know).
>

Tornado supports Python 3.4 and uses generator-based coroutines. We use
`yield` instead of `yield from` for compatibility with Python 2. I have a
patch to support the new async/await syntax here:
https://github.com/bdarnell/tornado/commit/e3b71c3441e9f87a29a9b112901b7644b5b6edb8

Overall, I like the PEP. I've been reluctant to embrace `yield from` for
Tornado coroutines (Tornado's Futures do not implement `__iter__`) because
I'm worried about confusion between `yield` and `yield from`, but async and
await are explicit enough that that's not really a problem.

My one request would be that there be a type or ABC corresponding to
inspect.isawaitable(). Tornado uses functools.singledispatch to handle
interoperability with other coroutine frameworks, so it would be best if we
could distinguish awaitables from other objects in a way that is compatible
with singledispatch. The patch above simply registers types.GeneratorType
which isn't quite correct.
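For what it's worth, later Python versions did grow such an ABC:
collections.abc.Awaitable, whose __subclasshook__ matches any object with
an __await__ method, and singledispatch understands ABCs. A sketch of the
kind of dispatch described above (the `convert_yielded` name and return
values are purely illustrative):

```python
import functools
from collections.abc import Awaitable

@functools.singledispatch
def convert_yielded(obj):
    # Fallback: not something the coroutine runner knows how to wait on.
    return "plain value"

@convert_yielded.register(Awaitable)
def _(obj):
    # Dispatches via the Awaitable ABC, which virtually matches any
    # object (including native coroutines) defining __await__.
    return "awaitable"

async def coro():
    return 1

c = coro()
result = convert_yielded(c)   # dispatches to the Awaitable branch
c.close()                     # silence the "never awaited" warning
```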

-Ben


>
> And yes, the PEP is not exclusively intended for use
> with asyncio, but asyncio is the only library that ships
> with Python, and is Python 3 ready, so its users will be
> the first ones to directly benefit from this proposal.
>
>  Can I use PEP 492 with Twisted (I doubt it, as Twisted
>> doesn't use yield from, which is Python 3.x only)? I contend that
>> there *is* no concrete example that currently exists, so I'm asking
>> what I'd need to do to write one. You pointed at qamash, but that
>> seems to be subclassing asyncio, so isn't "something that isn't
>> asyncio".
>>
>
> When Twisted is ported to Python 3, I'd be really surprised
> if it doesn't allow use of the new syntax.  @inlineCallbacks
> implements a trampoline to make 'yields' work.  This is a
> much slower approach than using 'yield from' (and 'await'
> from PEP 492).  Not to mention the 'async with' and 'async for'
> features.  (There shouldn't be any problem supporting both
> @inlineCallbacks and PEP 492 approach, if I'm not missing
> something).
>
>
>> Note that I don't have a problem with there being no existing
>> implementation other than asyncio. I'd just like it if we could be
>> clear over exactly what we mean when we say "the PEP is not tied to
>> asyncio".
>>
>
>
> Well, "the PEP is not tied to asyncio" -- this is correct.
> *The new syntax and new protocols know nothing about asyncio*.
>
> asyncio will know about the PEP by implementing new protocols
> where required etc (but supporting these new features isn't
> in the scope of the PEP).
>
>
>  It feels like the truth currently is "you can write your own
>> async framework that uses the new features introduced by the PEP". I
>> fully expect that *if* there's a need for async frameworks that aren't
>> fundamentally IO multiplexors, then it'll get easier to write them
>> over time (the main problem right now is a lack of good tutorial
>> examples of how to do so). But at the moment, asyncio seems to be the
>> only game in town (and I can imagine that it'll always be the main IO
>> multiplexor, unless existing frameworks like Twisted choose to compete
>> rather than integrate).
>>
>
> Agree.  But if the existing frameworks choose to compete,
> or someone decides to write something better than asyncio,
> they can benefit from PEP 492.
>
>
> Yury
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/47fcd4c2/attachment.html>

From greg.ewing at canterbury.ac.nz  Wed May  6 08:46:47 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 06 May 2015 18:46:47 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
Message-ID: <5549B8D7.5000100@canterbury.ac.nz>

Paul Moore wrote:
> It would probably be helpful to have a concrete example of a basic
> event loop that did *nothing* but schedule tasks. No IO waiting or
> similar, just scheduling.

Take a look at the example I developed when working
on the yield-from pep:

http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yf_current/Examples/Scheduler/scheduler.txt

The first version of the event loop presented there
does exactly that, just schedules tasks in a round-
robin fashion. Then I gradually add more features
to it.
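The core of that first version boils down to a few lines. This is a
simplified sketch of the idea rather than the actual code from
scheduler.txt:

```python
from collections import deque

def run(tasks):
    # Round-robin scheduler: advance each task to its next yield,
    # then move on to the next task, until all of them have finished.
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)              # run until the next yield
        except StopIteration:
            continue                # task finished, drop it
        ready.append(task)          # otherwise reschedule it

def counter(name, n, log):
    for i in range(n):
        log.append((name, i))
        yield                       # voluntarily give up control

log = []
run([counter("a", 2, log), counter("b", 3, log)])
# log interleaves the two tasks: a0, b0, a1, b1, b2
```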

> Actually, what *is* the minimal event loop
> interface that is needed for the various task/future mechanisms to
> work, independently of asyncio?

I don't think it's possible to answer that question,
because there isn't a single answer. The minimal
set of features that an event loop needs depends
on what you want to achieve with it.

Even the notion of "just schedules tasks" is
ambiguous. What does "schedule" mean? Does it
just mean round-robin switching between them, or
should they be able to synchronise with each
other in some way? Should it be possible for a
task to suspend itself for an interval of real
world time, or does that come under the heading
of I/O (since you're waiting for an external
event, i.e. the computer's clock reaching some
time)? Etc.

> And what features of an event loop etc
> are needed for the PEP, if it's being used outside of asyncio?)

I don't think *any* particular event loop
features are needed.

You can't pick out any of these features as
being "core". For example, it would be possible
to have an event loop that handled socket I/O
but *didn't* do round-robin scheduling -- it
could just keep on running the same task, even
if it yielded, until it blocked waiting for
an external event. Such a scheduler would
probably be quite adequate for many use cases.

It seems to me that the idea of "generator-based
tasks managed by an event loop" is more of a
design pattern than something you can write a
detailed API specification for.

Another problem with the "core" idea is that
you can't start with an event loop that "just does
scheduling" and then add on other features such
as I/O *from the outside*. There has to be some
point at which everything comes together, which
means choosing something like select() or
poll() or I/O completion queues, and building that
into the heart of your event loop. At that point
it's no longer something with a simple core.

-- 
Greg

From p.f.moore at gmail.com  Wed May  6 10:07:10 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 6 May 2015 09:07:10 +0100
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
 <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
Message-ID: <CACac1F8Dke35+D9oEL++gX5FH2fXU84T=Yw7TnCvq53Za3XtgQ@mail.gmail.com>

On 6 May 2015 at 00:58, Guido van Rossum <guido at python.org> wrote:
> I totally forgot to publicly congratulate Yury on this PEP. He's put a huge
> effort into writing the PEP and the implementation and managing the
> discussion, first on python-ideas, later on python-dev. Congrats, Yury! And
> thanks for your efforts. Godspeed.

Agreed, congratulations! There's been a lot of debate on this PEP, and
Yury has done a great job of responding where needed and keeping
things on track, which can't have been easy. Thanks for all the work.

Paul.

From greg.ewing at canterbury.ac.nz  Wed May  6 10:20:54 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 06 May 2015 20:20:54 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F-yo0G7FDvwpWOoNOaoX1ndJBjehwyH7XsHD7JHDtO7Vg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
 <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>
 <CACac1F-yo0G7FDvwpWOoNOaoX1ndJBjehwyH7XsHD7JHDtO7Vg@mail.gmail.com>
Message-ID: <5549CEE6.5070805@canterbury.ac.nz>

Paul Moore wrote:

>>What about Greg Ewing's example?
>>http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yf_current/Examples/Scheduler/scheduler.txt
> 
> That doesn't cover any of the higher level abstractions like tasks or
> futures (at least not by those names or with those interfaces).

Because a minimal event loop doesn't *need* those.

In my little scheduler, a "task" is nothing more than
a yield-frommable object sitting on a queue of things
to be run. There is no need to wrap it in another
object.

And there's really no need for the concept of a
"future" at all, except maybe at the boundary
between generator-based async code and other things
that are based on callbacks. Even then, a "future"
is really just "an object that can be passed to
yield-from". There is no need for a concrete
Future class, it's just a protocol.

> And I
> don't see where the PEP 492 additions would fit in (OK, "replace yield
> from with await" is part of it, but I don't see the rest).

That's really all there is to it. The rest is
concerned with catching certain kinds of mistakes,
and providing convenient syntax for some patterns
of using 'await'.

> There's a lot of asyncio
> that doesn't seem to me to be IO-related. Specifically the future and
> task abstractions. I view those as relevant to "coroutine programming
> in Python" because they are referenced in any discussion of coroutines
> (you yield from a future, for example).

Only because they've been elevated to prominence
by asyncio and its documentation, which I regard
as unfortunate.

When Guido was designing asyncio, I tried very
hard to dissuade him from giving futures such a
central place in the model. I saw them as an
unnecessary concept that would only clutter up
people's thinking. Seeing all the confusion now,
I'm more convinced than ever that I was right. :-(

> In some ways I wish there had been an "asyncio" library that covered
> the areas that are fundamentally about IO multiplexing. And a separate
> library (just "async", maybe, although that's now a bad idea as it
> clashes with a keyword :-)) that covered generic event loop, task and
> synchronisation areas.

As I said before, I don't think it's really
possible to factor an event loop into those kinds
of parts. You may be able to factor the *API* that
way, but any given implementation has to address
all the parts at once.

-- 
Greg

From andrew.svetlov at gmail.com  Wed May  6 10:22:34 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Wed, 6 May 2015 11:22:34 +0300
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CACac1F8Dke35+D9oEL++gX5FH2fXU84T=Yw7TnCvq53Za3XtgQ@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
 <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
 <CACac1F8Dke35+D9oEL++gX5FH2fXU84T=Yw7TnCvq53Za3XtgQ@mail.gmail.com>
Message-ID: <CAL3CFcUDGpJ=H6EZc2FOr_=b_tafnGxZWRsWbOiKwe_=uSQ5sQ@mail.gmail.com>

Congrats, Yury!

On Wed, May 6, 2015 at 11:07 AM, Paul Moore <p.f.moore at gmail.com> wrote:
> On 6 May 2015 at 00:58, Guido van Rossum <guido at python.org> wrote:
>> I totally forgot to publicly congratulate Yury on this PEP. He's put a huge
>> effort into writing the PEP and the implementation and managing the
>> discussion, first on python-ideas, later on python-dev. Congrats, Yury! And
>> thanks for your efforts. Godspeed.
>
> Agreed, congratulations! There's been a lot of debate on this PEP, and
> Yury has done a great job of responding where needed and keeping
> things on track, which can't have been easy. Thanks for all the work.
>
> Paul.



-- 
Thanks,
Andrew Svetlov

From p.f.moore at gmail.com  Wed May  6 10:27:16 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 6 May 2015 09:27:16 +0100
Subject: [Python-Dev] Minimal async event loop and async utilities (Was: PEP
 492: async/await in Python; version 4)
Message-ID: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>

On 6 May 2015 at 07:46, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Another problem with the "core" idea is that
> you can't start with an event loop that "just does
> scheduling" and then add on other features such
> as I/O *from the outside*. There has to be some
> point at which everything comes together, which
> means choosing something like select() or
> poll() or I/O completion queues, and build that
> into the heart of your event loop. At that point
> it's no longer something with a simple core.

Looking at asyncio.queues, the only features it needs are:

1. asyncio.events.get_event_loop()
2. asyncio.futures.Future - creating a standalone Future
3. asyncio.locks.Event
4. @coroutine

locks.Event in turn only needs the other 3 items. And you can ignore
get_event_loop(), as it's only used to get the default loop; you can
pass in your own.

And asyncio.futures only uses get_event_loop (and _format_callback)
from asyncio.events.

Futures require the loop to support:
1. call_soon
2. call_exception_handler
3. get_debug

So, to some extent (how far is something I'd need to code up a loop to
confirm) you can build the Futures and synchronisation mechanisms with
an event loop that supports only this "minimal interface".
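A quick sketch of that minimal interface (the class name is invented here,
and details like contextvars handling are glossed over), driving a real
asyncio.Future:

```python
import asyncio
from collections import deque

class MiniLoop:
    # The minimal surface asyncio.Future appears to need from a "loop":
    # call_soon, call_exception_handler and get_debug.
    def __init__(self):
        self._ready = deque()

    def call_soon(self, callback, *args, context=None):
        self._ready.append((callback, args))

    def call_exception_handler(self, context):
        raise RuntimeError(context.get("message", "callback failed"))

    def get_debug(self):
        return False

    def run_ready(self):
        # Stand-in for a real run loop: just drain the callback queue.
        while self._ready:
            callback, args = self._ready.popleft()
            callback(*args)

loop = MiniLoop()
fut = asyncio.Future(loop=loop)
results = []
fut.add_done_callback(lambda f: results.append(f.result()))
fut.set_result(42)
loop.run_ready()      # runs the done-callback
```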

Essentially, that's my goal - to allow people who want to write (say)
a Windows GUI event loop, or a Windows event loop based of
WaitForXXXObject, or a Tkinter loop, or whatever, to *not* have to
write their own implementation of synchronisation or future objects.

That may mean lifting the asyncio code and putting it into a separate
library, to make the separation between "asyncio-dependent" and
"general async" clearer. Or if asyncio's provisional status doesn't
last long enough to do that, we may end up with an asyncio
implementation and a separate (possibly 3rd party) "general"
implementation.

Paul.

From tds333+pydev at gmail.com  Wed May  6 10:38:52 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Wed, 06 May 2015 10:38:52 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <55490AFC.8090907@gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
Message-ID: <5549D31C.7020604@gmail.com>

Hi Yury,


On 05.05.2015 20:25, Yury Selivanov wrote:

>>
>> We forget to address the major problems here. How can someone in a
>> "sync" script use this async stuff easy. How can async and sync stuff
>> cooperate and we don't need to rewrite the world for async stuff.
>> How can a normal user access the power of async stuff without rewriting
>> all his code. So he can use a simple asyc request library in his code.
>> How can a normal user learn and use all this in an easy way.
> 
> asyncio and twisted answered these questions ;) The answer is
> that you have to write async implementations.
> 
> gevent has a different answer, but greenlets/stackless is
> something that will never be merged in CPython and other
> implementations.
> 

I think monkeypatching and the gevent way are wrong. And explicit is better
than implicit.

Let me clarify: I meant, how can we make this async stuff more accessible
to the average sync user, sometimes even without writing or knowing how
to write coroutines or other async stuff?

Let me explain it with a deeper example (some of it is specific to Python 2
and Twisted):

I had the same problem for a server application using Twisted: provide
easy interfaces to my users, most of whom are not aware of async stuff.

My solution was to write my own decorator, similar to Twisted's
@inlineCallbacks. On top of it I added one more level to the decorator,
to distinguish whether it was called from the main thread (which also runs
the main loop) or from other threads. This object, also usable as a
decorator, is called "task" and has some utility methods. It returns a
"deferred" (in asyncio this would be a Future).

Resulting in code like:

@task
def myAsyncFunction(webaddress):
  result = yield fetch(webaddress)
  # only to show sleep example
  yield task.sleep(0.1)
  task.Return(result)

Usable in a sync context (extra script thread):

def do():
  content = myAsyncFunction("http://xyz")

or async context:

@task
def ado():
  content = yield myAsyncFunction("http://xyz")


The task decorator has functionality to check whether something is
called from the main thread (-> also a task and async)
or from another thread (-> sync or script).

So this async interface is usable from both worlds. If someone
operates async, he/she need only know the task decorator and when to yield.
If he/she uses it in sync mode, nothing special has to be done.

To allow all this, the server starts the async main loop in the main thread
and executes the script in an extra script thread. The user always has his
own thread, including for RPC stuff. The only way to switch into the main
loop is to decorate a function as @task; every task is a coroutine and is
executed in the main thread (the thread of the main loop).

Benefit of all this:

- Easy to write an async task: it is marked as one, and special stuff belongs
  to the task object. (task.Return is needed because we are in Python 2.)
- The normal user executes his stuff in his own thread and he/she
  can program in sync mode. No problem: it is an extra thread, and the main
  loop does not block.
- A network client or other stuff has to be written only once; most of the
  time this can be a @task in the async world. But this should not concern
  the end user. We don't have to implement everything twice, once for the
  async and once for the sync world. -> Less overhead

This is what I mean when I say we must address the bridging problem between
the worlds.
I think it is the wrong way to divide things into async and sync stuff and
duplicate all networking libraries into sync and async versions.

For me the answer is to write one async network library and use it in both
a sync script and an async main loop, with an easy interface that doesn't
force the user to know "this is an async library, I have to do something
special".
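With asyncio, that bridge can be sketched roughly as follows: run the loop
in a dedicated thread and hand coroutines to it from sync code via
run_coroutine_threadsafe. (The `fetch` coroutine here is a stand-in for a
real async network call, not an existing library function.)

```python
import asyncio
import threading

async def fetch(url):
    # Stand-in for a real async network request.
    await asyncio.sleep(0.01)
    return "content of " + url

# Async world: the event loop runs in its own thread.
loop = asyncio.new_event_loop()
thread = threading.Thread(target=loop.run_forever, daemon=True)
thread.start()

def fetch_sync(url):
    # Sync world: blocks only the calling thread, never the loop.
    future = asyncio.run_coroutine_threadsafe(fetch(url), loop)
    return future.result()

content = fetch_sync("http://xyz")
loop.call_soon_threadsafe(loop.stop)
thread.join()
loop.close()
```

The async implementation is written once; the sync entry point is a thin
blocking wrapper around it.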

And in the future, go one step further and use all this combined with
PyParallel to solve the multiprocessing problem in Python.
(Script in an extra thread, main loop in the main thread, executed and
managed via PyParallel, avoiding the GIL.)
But this is only a vision/dream of mine.


>>
>> And for all this we still can't tell them "oh the async stuff solves
>> the multiprocessing problem of Python learn it and switch to version
>> 3.5". It does not and it is only most useful for networking stuff
>> nothing more.
> 
> "networking stuff", and in particular, web, is a huge
> part of current Python usage.  Please don't underestimate
> that.

I do not. But most want only to use it as a client,
and the main concern for most is "I want to get this web page",
not "I will implement a web client, have to do this async, and get it".


Regards,

Wolfgang


From p.f.moore at gmail.com  Wed May  6 10:52:16 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 6 May 2015 09:52:16 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5549CEE6.5070805@canterbury.ac.nz>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
 <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>
 <CACac1F-yo0G7FDvwpWOoNOaoX1ndJBjehwyH7XsHD7JHDtO7Vg@mail.gmail.com>
 <5549CEE6.5070805@canterbury.ac.nz>
Message-ID: <CACac1F9Uczj4hy=KxmeC4XSiOdDN-1sj+fb9NeZhxn-D5QuDAg@mail.gmail.com>

On 6 May 2015 at 09:20, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> That doesn't cover any of the higher level abstractions like tasks or
>> futures (at least not by those names or with those interfaces).
>
> Because a minimal event loop doesn't *need* those.

It doesn't *need* them, but as abstractions they allow easier building
of reusable higher-level libraries. You can write an event loop with
nothing but coroutines, but to build reusable libraries on top of it,
you need some common interfaces.

> In my little scheduler, a "task" is nothing more than
> a yield-frommable object sitting on a queue of things
> to be run. There is no need to wrap it in another
> object.
>
> And there's really no need for the concept of a
> "future" at all, except maybe at the boundary
> between generator-based async code and other things
> that are based on callbacks. Even then, a "future"
> is really just "an object that can be passed to
> yield-from". There is no need for a concrete
> Future class, it's just a protocol.

Agreed, you don't need a Future class, all you need is to agree what
reusable code is allowed to do with the core objects you are passing
around - that's how duck typing works. The objects *I* can see are
futures (in a PEP 492 world, "awaitables" which may or may not be
equivalent in terms of the operations you'd want to focus on) and the
event loop itself.

In your example, the event loop is implicit (as it's a singleton, you
use global functions rather than methods on the loop object) but
that's a minor detail.

>> And I
>> don't see where the PEP 492 additions would fit in (OK, "replace yield
>> from with await" is part of it, but I don't see the rest).
>
> That's really all there is to it. The rest is
> concerned with catching certain kinds of mistakes,
> and providing convenient syntax for some patterns
> of using 'await'.

So, "things you can wait on" have one operation - "wait for a result".
That's OK. You can create such things as coroutines, which is also
fine. You may want to create such things explicitly (equivalent to
generators vs __iter__) - maybe that's where __aiter__ comes in in PEP
492 and the Future class in asyncio. Again, all fine.

You also need operations like "schedule a thing to run", which is the
event loop "interface". Your sample has the following basic event loop
methods that I can see: run, schedule, unschedule, and
expire_timeslice (that last one may be an implementation detail, but
the other 3 seem pretty basic). PEP 492 has nothing to say on the
event loop side of things (something that became clear to me during
this discussion).

>> There's a lot of asyncio
>> that doesn't seem to me to be IO-related. Specifically the future and
>> task abstractions. I view those as relevant to "coroutine programming
>> in Python" because they are referenced in any discussion of coroutines
>> (you yield from a future, for example).
>
> Only because they've been elevated to prominence
> by asyncio and its documentation, which I regard
> as unfortunate.
>
> When Guido was designing asyncio, I tried very
> hard to dissuade him from giving futures such a
> central place in the model. I saw them as an
> unnecessary concept that would only clutter up
> people's thinking. Seeing all the confusion now,
> I'm more convinced than ever that I was right. :-(

Futures seem to me to be (modulo a few details) what "awaitables" are
in PEP 492. I can't see how you can meaningfully talk about event
loops in a Python context without having *some* term for "things you
wait for". Maybe Future wasn't a good name, and maybe the parallel
with concurrent.futures.Future wasn't helpful (I think both things
were fine, but you may not) but we had to have *some* way of talking
about them, and of constructing standalone awaitables. PEP 492 has
new, and hopefully better, ways, but I think that awaitables *have* to
be central to any model where you wait for things...

By the way, it feels to me like I'm now arguing in favour of PEP 492
with a reasonable understanding of what it "means". Assuming what I
said above isn't complete rubbish, thanks to everyone who's helped me
get to this point of understanding through this thread! (And if I
haven't understood, that's my fault, and still thanks to everyone for
their efforts :-))

Paul

From greg.ewing at canterbury.ac.nz  Wed May  6 11:16:37 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 06 May 2015 21:16:37 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CAP7+vJKdz3f16O=xCT-Ee=hXhkh98DubdXSL=30VstCnkXU4LA@mail.gmail.com>
References: <554185C2.5080003@gmail.com>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CAP7+vJKfNDfVqD8J7EeeUyRAv6jdJBfroo-esZTjBkO2FwnHpQ@mail.gmail.com>
 <CACac1F_wd_gYhZiCb4beEO91RDGREckkWBx2kKrj_MivY9sT4Q@mail.gmail.com>
 <CAP7+vJ+62EHSL61p+dQ1ZOBGsFd6vr4t9sdeehabK7tYRA1FAg@mail.gmail.com>
 <CAPJVwB=4shOFRfL6QfmAsqOy1k1WXXY=yqsL_PUReQGToc5=Hw@mail.gmail.com>
 <CAP7+vJKdz3f16O=xCT-Ee=hXhkh98DubdXSL=30VstCnkXU4LA@mail.gmail.com>
Message-ID: <5549DBF5.901@canterbury.ac.nz>

Guido van Rossum wrote:
> the bytecode generated for 
> await treats coroutine objects special, just like the bytecode generated 
> for yield-from treats generator objects special. The special behavior 
> they have in common is the presence of send() and throw() methods,

I don't think that's quite accurate. Yield-from treats
any object having send() and throw() methods the same
way it treats a generator -- there's nothing special
about the generator *type*. Presumably 'await' is the
same.
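That's easy to check: a plain class that merely implements the protocol
(the class and its counting behaviour are invented for this demo) can sit
on the right-hand side of yield from:

```python
class FakeCoroutine:
    # Not a generator at all -- it just duck-types the protocol
    # that 'yield from' relies on.
    def __init__(self):
        self.count = 0

    def __iter__(self):
        return self          # yield from calls iter() on non-generators

    def __next__(self):
        return self.send(None)

    def send(self, value):
        if self.count >= 2:
            raise StopIteration(99)   # 99 becomes the yield-from result
        self.count += 1
        return self.count

    def throw(self, typ, val=None, tb=None):
        raise typ

def delegate():
    result = yield from FakeCoroutine()
    return result

g = delegate()
received = [g.send(None), g.send(None)]   # the two yielded values
try:
    g.send(None)
except StopIteration as exc:
    final = exc.value                     # value returned via yield from
```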

-- 
Greg

From oscar.j.benjamin at gmail.com  Wed May  6 11:20:50 2015
From: oscar.j.benjamin at gmail.com (Oscar Benjamin)
Date: Wed, 6 May 2015 10:20:50 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5548F44B.2090905@gmail.com>
References: <CAP7+vJ+m=qwo+bkhcp+b_Y35VeY8fq1ReraSMB_baV6-hNd-Bw@mail.gmail.com>
 <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <CACac1F_U7JvLJ=eLh9ou1c--nWVWLhhGTuU0=4zhU+NxKY=Hig@mail.gmail.com>
 <5541261A.9020909@gmail.com>
 <CACac1F95=2pqG5VFR9_5sdR_KOx=OXUp--0Jdkp5CSgYu7k+=A@mail.gmail.com>
 <5541341A.8090204@gmail.com>
 <CAJ6cK1YPp+w=021E1y_D4sV-DhuEVx+QX716KH8-3C2_X_WZHA@mail.gmail.com>
 <CAHVvXxRXAn4zQZxji+jR99Zgm+sxM4oSBZxDKa_VCfbwq2Nx_w@mail.gmail.com>
 <5548F44B.2090905@gmail.com>
Message-ID: <CAHVvXxTL2MsMBygXKW_PVpyvm+CAKeugQA2q-WMR+XA346aCXQ@mail.gmail.com>

On 5 May 2015 at 17:48, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
>
> I've updated the PEP with some fixes of the terminology:
> https://hg.python.org/peps/rev/f156b272f860

Yes that looks better.

> I still think that 'coroutine functions' and 'coroutines'
> is a better pair than 'async functions' and 'coroutines'.

Fair enough. The terminology in the PEP seems consistent now which is
more important than the exact terms used.


--
Oscar

From greg.ewing at canterbury.ac.nz  Wed May  6 13:00:49 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 06 May 2015 23:00:49 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F9Uczj4hy=KxmeC4XSiOdDN-1sj+fb9NeZhxn-D5QuDAg@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
 <CAP7+vJJ31cWu2sewVevYNS+9KGCnV3dH3NaqbgjYU7Gu42p4Zg@mail.gmail.com>
 <CACac1F-yo0G7FDvwpWOoNOaoX1ndJBjehwyH7XsHD7JHDtO7Vg@mail.gmail.com>
 <5549CEE6.5070805@canterbury.ac.nz>
 <CACac1F9Uczj4hy=KxmeC4XSiOdDN-1sj+fb9NeZhxn-D5QuDAg@mail.gmail.com>
Message-ID: <5549F461.9010308@canterbury.ac.nz>

Paul Moore wrote:
> I can't see how you can meaningfully talk about event
> loops in a Python context without having *some* term for "things you
> wait for".

PEP 3152 was my attempt at showing how you could do that.

-- 
Greg

From greg.ewing at canterbury.ac.nz  Wed May  6 13:07:15 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 06 May 2015 23:07:15 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <mibp9r$9b6$1@ger.gmane.org>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
Message-ID: <5549F5E3.5090101@canterbury.ac.nz>

Terry Reedy wrote:

> What I do not understand is how io events become event loop Event 
> instances.

They don't become Events exactly, but you can register
a callback to be called when a file becomes ready for
reading or writing, see:

http://effbot.org/pyfaq/can-i-have-tk-events-handled-while-waiting-for-i-o.htm

That's probably enough of a hook to be able to get
asyncio-style file I/O working on top of Tkinter.

-- 
Greg

From larry at hastings.org  Wed May  6 13:27:20 2015
From: larry at hastings.org (Larry Hastings)
Date: Wed, 06 May 2015 04:27:20 -0700
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
Message-ID: <5549FA98.4090503@hastings.org>

On 05/05/2015 04:53 PM, Guido van Rossum wrote:
> I've given Yury clear instructions to focus on how to proceed -- he's 
> to work with another core dev on getting the implementation ready in 
> time for beta 1 (scheduled for May 24, but I think the target date 
> should be May 19).

Released on Sunday May 24 means it'll be tagged on Saturday May 23. 
Please take care that it be checked in by then.

Your friendly neighborhood release manager,


//arry/

From guido at python.org  Wed May  6 17:46:00 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 6 May 2015 08:46:00 -0700
Subject: [Python-Dev] Minimal async event loop and async utilities (Was:
 PEP 492: async/await in Python; version 4)
In-Reply-To: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
References: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
Message-ID: <CAP7+vJ+KL5NmLJZLuKbDYX8i2V_zZPUNaqXmfacxb7Cj43jqgg@mail.gmail.com>

On Wed, May 6, 2015 at 1:27 AM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 6 May 2015 at 07:46, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> > Another problem with the "core" idea is that
> > you can't start with an event loop that "just does
> > scheduling" and then add on other features such
> > as I/O *from the outside*. There has to be some
> > point at which everything comes together, which
> > means choosing something like select() or
> > poll() or I/O completion queues, and build that
> > into the heart of your event loop. At that point
> > it's no longer something with a simple core.
>
> Looking at asyncio.queues, the only features it needs are:
>
> 1. asyncio.events.get_event_loop()
> 2. asyncio.futures.Future - creating a standalone Future
> 3. asyncio.locks.Event
> 4. @coroutine
>
> locks.Event in turn only needs the other 3 items. And you can ignore
> get_event_loop() as it's only used to get the default loop, you can
> pass in your own.
>
> And asyncio.futures only uses get_event_loop (and _format_callback)
> from asyncio.events.
>
> Futures require the loop to support:
> 1. call_soon
> 2. call_exception_handler
> 3. get_debug
>
> So, to some extent (how far is something I'd need to code up a loop to
> confirm) you can build the Futures and synchronisation mechanisms with
> an event loop that supports only this "minimal interface".
>
> Essentially, that's my goal - to allow people who want to write (say)
> a Windows GUI event loop, or a Windows event loop based of
> WaitForXXXObject, or a Tkinter loop, or whatever, to *not* have to
> write their own implementation of synchronisation or future objects.
>
> That may mean lifting the asyncio code and putting it into a separate
> library, to make the separation between "asyncio-dependent" and
> "general async" clearer. Or if asyncio's provisional status doesn't
> last long enough to do that, we may end up with an asyncio
> implementation and a separate (possibly 3rd party) "general"
> implementation.
>

This is actually a great idea, and I encourage you to go forward with it.
The biggest piece missing from your inventory is probably Task, which is
needed to wrap a Future around a coroutine.

I expect you'll also want to build cancellation into your "base async
framework"; and the primitives to wait for multiple awaitables. The next
step would be some mechanism to implement call_later()/call_at() (but this
needs to be pluggable since for a "real" event loop it needs to be
implemented by the basic I/O selector).

If you can get this working it would be great to include this in the stdlib
as a separate "asynclib" library. The original asyncio library would then
be a specific implementation (using a subclass of asynclib.EventLoop) that
adds I/O, subprocesses, and integrates with the selectors module (or with
IOCP, on Windows).

I don't see any particular hurry to get this in before 3.5; the refactoring
of asyncio can be done later, in a backward compatible way. It would be a
good way to test the architecture of asyncio!
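For readers wondering what a "scheduling only" core might look like, here is a toy sketch (hypothetical names, not asyncio's API): a loop supporting only call_soon, plus a Task-like wrapper that steps a generator-based coroutine one yield at a time.

```python
import collections

class MiniLoop:
    """Toy 'scheduling only' event loop: call_soon and nothing else."""
    def __init__(self):
        self._ready = collections.deque()

    def call_soon(self, callback, *args):
        self._ready.append((callback, args))

    def run(self):
        while self._ready:
            callback, args = self._ready.popleft()
            callback(*args)

class MiniTask:
    """Steps a generator-based coroutine, rescheduling after each yield."""
    def __init__(self, coro, loop):
        self._coro, self._loop = coro, loop
        loop.call_soon(self._step)

    def _step(self):
        try:
            next(self._coro)                 # run to the next yield
        except StopIteration:
            return                           # coroutine finished
        self._loop.call_soon(self._step)     # cooperatively reschedule

def ticker(name, out):
    for i in range(2):
        out.append((name, i))
        yield                                # suspension point

loop, out = MiniLoop(), []
MiniTask(ticker("a", out), loop)
MiniTask(ticker("b", out), loop)
loop.run()
print(out)   # tasks interleave: [('a', 0), ('b', 0), ('a', 1), ('b', 1)]
```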

-- 
--Guido van Rossum (python.org/~guido)

From guido at python.org  Wed May  6 17:57:56 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 6 May 2015 08:57:56 -0700
Subject: [Python-Dev] Ancient use of generators
Message-ID: <CAP7+vJKgcJCcvr0Jr4zOR39+Guv09c4BwGDr5LdaokDf3u-reg@mail.gmail.com>

For those interested in tracking the history of generators and coroutines
in Python, I just found out that PEP 342
<https://www.python.org/dev/peps/pep-0342/> (which introduced
send/throw/close and made "generators as coroutines" a mainstream Python
concept) harks back to PEP 288 <https://www.python.org/dev/peps/pep-0288/>,
which was rejected. PEP 288 also proposed some changes to generators. The
interesting bit though is in the references: there are two links to old
articles by David Mertz that describe using generators in state machines
and other interesting and unconventional applications of generators. All
these well predated PEP 342, so yield was a statement and could not receive
a value from the function calling next() -- communication was through a
shared class instance.

http://gnosis.cx/publish/programming/charming_python_b5.txt
http://gnosis.cx/publish/programming/charming_python_b7.txt
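To see what that looked like in practice: before PEP 342, `yield` was a statement, so feeding data *into* a running generator meant mutating shared state between next() calls. A rough sketch of the old pattern (hypothetical, not taken from the articles above):

```python
class Channel:
    # Shared mailbox: the caller writes here and the coroutine reads it,
    # because pre-PEP-342 'yield' could not receive a value.
    def __init__(self):
        self.value = None

def consumer(chan, out):
    while chan.value is not None:
        out.append(chan.value)
        yield                     # suspend; caller updates chan.value

chan, out = Channel(), []
g = consumer(chan, out)
chan.value = "first"
next(g)                           # runs to the first yield
chan.value = "second"
next(g)
chan.value = None                 # sentinel tells the coroutine to stop
try:
    next(g)
except StopIteration:
    pass
print(out)                        # ['first', 'second']
```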

Enjoy!

-- 
--Guido van Rossum (python.org/~guido)

From yselivanov.ml at gmail.com  Wed May  6 19:30:52 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 06 May 2015 13:30:52 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAFkYKJ5=bmWkCP5vPVQ_k=zgr3ZOSNv2QghMqUnngeO5-Jdb6Q@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com>
 <CAFkYKJ5=bmWkCP5vPVQ_k=zgr3ZOSNv2QghMqUnngeO5-Jdb6Q@mail.gmail.com>
Message-ID: <554A4FCC.4080104@gmail.com>

Hi Ben,

On 2015-05-06 12:05 AM, Ben Darnell wrote:
> On Tue, May 5, 2015 at 3:25 PM, Yury Selivanov <yselivanov.ml at gmail.com>
> wrote:
>
>> Yes, there is no other popular event loop for 3.4 other
>> than asyncio, that uses coroutines based on generators
>> (as far as I know).
>>
> Tornado supports Python 3.4 and uses generator-based coroutines. We use
> `yield` instead of `yield from` for compatibility with Python 2. I have a
> patch to support the new async/await syntax here:
> https://github.com/bdarnell/tornado/commit/e3b71c3441e9f87a29a9b112901b7644b5b6edb8

I don't know how this happened, especially since I've used
Tornado myself!  It's amazing that Tornado will have support
for async/await when 3.5 is out!

>
> Overall, I like the PEP. I've been reluctant to embrace `yield from` for
> Tornado coroutines (Tornado's Futures do not implement `__iter__`) because
> I'm worried about confusion between `yield` and `yield from`, but async and
> await are explicit enough that that's not really a problem.
>
> My one request would be that there be a type or ABC corresponding to
> inspect.isawaitable(). Tornado uses functools.singledispatch to handle
> interoperability with other coroutine frameworks, so it would be best if we
> could distinguish awaitables from other objects in a way that is compatible
> with singledispatch. The patch above simply registers types.GeneratorType
> which isn't quite correct.


Sure. I'll add Awaitable and Coroutine ABCs.

Thanks,
Yury
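A sketch of the singledispatch-based interop Ben describes, once such ABCs exist (hypothetical helper name; the Awaitable ABC shown here landed in Python 3.5 as collections.abc.Awaitable):

```python
from collections.abc import Awaitable
from functools import singledispatch
import types

@singledispatch
def convert_yielded(obj):
    # Fallback: nothing we know how to await
    raise TypeError("cannot await %r" % (obj,))

@convert_yielded.register(types.GeneratorType)
def _convert_generator(gen):
    return gen                      # generator-based coroutine: use as-is

@convert_yielded.register(Awaitable)
def _convert_awaitable(aw):
    return aw.__await__()           # unwrap to an iterator the loop can drive

def legacy_coro():
    yield 1

g = legacy_coro()
assert convert_yielded(g) is g      # dispatched on GeneratorType
```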


From tjreedy at udel.edu  Wed May  6 23:32:11 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 06 May 2015 17:32:11 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
 <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>
Message-ID: <mie18u$vs8$1@ger.gmane.org>

On 5/5/2015 10:59 PM, Guido van Rossum wrote:
> For this you should probably use an integration of asyncio (which can do
> async subprocess output nicely) with Tkinter. Over in tulip-land there
is a demo of such an integration.

After redirection from googlecode tulip, I found
https://github.com/python/asyncio/tree/master/examples
None of the 4 *process*.py examples mention tkinter.

I also found "Create a Tkinter/Tulip integration"
https://github.com/python/asyncio/issues/21
with attachment tk_ayncio.zip
copied (with 'async' replacing 'tulip') to
https://bitbucket.org/haypo/asyncio_staging/src/bb76064d80b0a03bf3f7b13652e595dfe475c7f8/asyncio_tkinter/?at=default

None of the integration files mention subprocess, so I presume you are 
suggesting that I use a modification of one of the example subprocess 
coroutines with the integration framework.

If this works well, might it make sense to consider using an elaboration 
of examples/subprocess_shell.py to replace subprocess socket 
communication with pipe communication?

> On Tue, May 5, 2015 at 6:03 PM, Terry Reedy <tjreedy at udel.edu
> <mailto:tjreedy at udel.edu>> wrote:

>     My specific use case is to be able to run a program in a separate
>     process, but display the output in the gui process -- something like
>     this (in Idle, for instance).  (Apologies if this misuses the new
>     keywords.)
>
>     async def menu_handler()
>          ow = OutputWindow(args)  # tk Widget
>          proc = subprocess.Popen (or multiprocessing equivalent)
>          out = (stdout from process)
>          await for line in out:
>              ow.write(line)
>          finish()
>
>     I want the handler to not block event processing, and disappear
>     after finishing.  Might 492 make this possible someday?  Or would
>     having 'line in pipe' or just 'data in pipe' translated to a tk
>     event likely require a patch to tk?

-- 
Terry Jan Reedy
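Terry's sketch, rewritten with the spelling PEP 492 actually adopted (`async for`, not `await for`) and asyncio's subprocess support, might look roughly like this. OutputWindow is stubbed out, asyncio.run() is a later (3.7+) convenience, and the Tk integration itself is left aside:

```python
import asyncio
import sys

class OutputWindow:
    # Stand-in for the tk widget; a real one would write to a Text widget.
    def write(self, text):
        print(text, end="")

async def menu_handler(ow, *cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE)
    async for line in proc.stdout:      # StreamReader supports 'async for'
        ow.write(line.decode())
    await proc.wait()

asyncio.run(menu_handler(OutputWindow(), sys.executable, "-c", "print('hi')"))
```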


From guido at python.org  Wed May  6 23:39:30 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 6 May 2015 14:39:30 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <mie18u$vs8$1@ger.gmane.org>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
 <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>
 <mie18u$vs8$1@ger.gmane.org>
Message-ID: <CAP7+vJJMDhbZ_RPVAkR7jCT0YiOUnW4TB+pe74c5NZwitBO6TA@mail.gmail.com>

Sorry to send you on such a wild goose chase! I did mean the issue you
found #21). I just updated it with a link to a thread that has more
news:  https://groups.google.com/forum/#!searchin/python-tulip/tkinter/python-tulip/TaSVW-pjWro/hCP6qS4eRnAJ

I wasn't able to verify the version by Luciano Ramalho. (And yes, extending
all this to working with a subprocess is left as an exercise. It's all
pretty academic IMO, given Tkinter's lack of popularity outside IDLE.)

On Wed, May 6, 2015 at 2:32 PM, Terry Reedy <tjreedy at udel.edu> wrote:

> On 5/5/2015 10:59 PM, Guido van Rossum wrote:
>
>> For this you should probably use an integration of asyncio (which can do
>> async subprocess output nicely) with Tkinter. Over in tulip-land there
>> is a demo of such an integration.
>>
>
> After redirection from googlecode tulip, I found
> https://github.com/python/asyncio/tree/master/examples
> None of the 4 *process*.py examples mention tkinter.
>
> I also found "Create a Tkinter/Tulip integration"
> https://github.com/python/asyncio/issues/21
> with attachment tk_ayncio.zip
> copied (with 'async' replacing 'tulip') to
>
> https://bitbucket.org/haypo/asyncio_staging/src/bb76064d80b0a03bf3f7b13652e595dfe475c7f8/asyncio_tkinter/?at=default
>
> None of the integration files mention subprocess, so I presume you are
> suggesting that I use a modification of one of the example subprocess
> coroutines with the integration framework.
>
> If this works well, might it make sense to consider using an elaboration
> of examples/subprocess_shell.py to replace subprocess socket communication
> with pipe communication?
>
>  On Tue, May 5, 2015 at 6:03 PM, Terry Reedy <tjreedy at udel.edu
>> <mailto:tjreedy at udel.edu>> wrote:
>>
>
>      My specific use case is to be able to run a program in a separate
>>     process, but display the output in the gui process -- something like
>>     this (in Idle, for instance).  (Apologies if this misuses the new
>>     keywords.)
>>
>>     async def menu_handler()
>>          ow = OutputWindow(args)  # tk Widget
>>          proc = subprocess.Popen (or multiprocessing equivalent)
>>          out = (stdout from process)
>>          await for line in out:
>>              ow.write(line)
>>          finish()
>>
>>     I want the handler to not block event processing, and disappear
>>     after finishing.  Might 492 make this possible someday?  Or would
>>     having 'line in pipe' or just 'data in pipe' translated to a tk
>>     event likely require a patch to tk?
>>
>
> --
> Terry Jan Reedy
>



-- 
--Guido van Rossum (python.org/~guido)

From ben at bendarnell.com  Thu May  7 01:06:38 2015
From: ben at bendarnell.com (Ben Darnell)
Date: Wed, 6 May 2015 16:06:38 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org>
 <20150501191937.GB8013@stoneleaf.us> <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <5543E1B7.6010804@gmail.com>
 <CAJ6cK1bQ1swiamzQ3BYvxTZO=qDSbmfCNNKNmNN0NWtkZuZihA@mail.gmail.com>
 <5548A908.9010206@gmail.com> <55490AFC.8090907@gmail.com>
 <CACac1F-0d4Y2ta1Z9Di4sX-fMFqj-LDeHuQeoCXGUZc-w-=RVQ@mail.gmail.com>
 <CAP1=2W63bgK5kLLxdhxGJF8BnvCGyvzSh74Dinf9pcKOviMxvg@mail.gmail.com>
 <CACac1F8WmuA8S6UgHU74X+=ZFw7rULRCoR-c90qtkQ-jp_RmoA@mail.gmail.com>
Message-ID: <CAFkYKJ5-mmZBDtR=poKTm45omsfq0s1GfmQMy+TirbUOUniQaQ@mail.gmail.com>

On Tue, May 5, 2015 at 1:39 PM, Paul Moore <p.f.moore at gmail.com> wrote:

>
> It would probably be helpful to have a concrete example of a basic
> event loop that did *nothing* but schedule tasks. No IO waiting or
> similar, just scheduling. I have a gut feeling that event loops are
> more than just asyncio, but without examples to point to it's hard to
> keep a focus on that fact. And even harder to isolate "what is an
> event loop mechanism" from "what is asyncio specific". For example,
> asyncio.BaseEventLoop has a create_connection method. That's
> *obviously* not a fundamental aspect of a generic event loop, But
> call_soon (presumably) is. Having a documented "basic event loop"
> interface would probably help emphasise the idea than event loops
> don't have to be asyncio. (Actually, what *is* the minimal event loop
> interface that is needed for the various task/future mechanisms to
> work, independently of asyncio? And what features of an event loop etc
> are needed for the PEP, if it's being used outside of asyncio?)
>

Twisted has a pretty good taxonomy of event loop methods, in the interfaces
at the bottom of this page:
http://twistedmatrix.com/documents/15.1.0/core/howto/reactor-basics.html
and the comparison matrix at
http://twistedmatrix.com/documents/15.1.0/core/howto/choosing-reactor.html

The asyncio event loops implement most of these (not the exact interfaces,
but the same functionality). Tornado implements FDSet, Time, and part of
Threads in the IOLoop itself, with the rest of the functionality coming
from separate classes. (You may wonder then why Twisted and asyncio put
everything in the core event loop? It's necessary to abstract over certain
platform differences, which is one big reason why Tornado has poor support
for Windows).

-Ben



>
> I guess the other canonical event loop use case is GUI system message
> dispatchers.
>
> >> You can argue that the syntax is needed to help
> >> make async more accessible - but if that's the case then the
> >> terminology debates and confusion are clear evidence that it's not
> >> succeeding in that goal.
> >
> > Perhaps, but arguing about the nitty-gritty details of something doesn't
> > automatically lead to a clearer understanding of the higher level
> concept.
> > Discussing how turning a steering wheel in a car might help you grasp how
> > cars turn, but it isn't a requirement to get "turn the wheel left to make
> > the car go left".
>
> Fair point. If only I could avoid driving into walls :-)
>
> >> Of course, that's based on my perception of
> >> one of the goals of the PEP as being "make coroutines and asyncio more
> >> accessible", If the actual goals are different, my conclusion is
> >> invalid.
> >
> > I think the goal is "make coroutines easier to use" and does not directly
> > relate to asyncio.
>
> OK. But in that case, some examples using a non-asyncio toy "just
> schedule tasks" event loop might help.
>
> >> Well, twisted always had defer_to_thread. Asyncio has run_in_executor,
> >> but that seems to be callback-based rather than coroutine-based?
> >
> > Yep.
>
> ... and so you can't use it with async/await?
>
> >> Many people use requests for their web access. There are good reasons
> >> for this. Are you saying that until someone steps up and writes an
> >> async implementation of requests, I have to make a choice - requests
> >> or asyncio?
> >
> > I believe so; you need something to implement __await__. This is true in
> any
> > language that implements co-routines.
> >
> >> Unfortunately, I can't see myself choosing asyncio in that
> >> situation. Which again means that asyncio becomes "something that the
> >> average user can't use". Which in turn further entrenches it as a
> >> specialist-only tool.
> >
> > You forgot to append "... yet" to that statement. Just because something
> > isn't available out of the box without some effort to support doesn't
> mean
> > it will never happen, else there would be absolutely no Python 3 users
> out
> > there.
>
> Fair point. Yuri mentioned aiohttp, as well. The one difference
> between this and Python 2/3, is that here you *have* to have two
> separate implementations. There's no equivalent of a "shared source"
> async and synchronous implementation of requests. So the typical
> "please support Python 3" issue that motivates projects to move
> forward doesn't exist in the same way. It's not to say that there
> won't be async versions of important libraries, it's just hard to see
> how the dynamics will work. I can't see myself raising an issue on
> cx_Oracle saying "please add asyncio support", and I don't know who
> else I would ask...
>
> > Co-routine-based asynchronous programming is new to Python, so as a
> > community we don't have it as something everyone learns over time. If you
> > don't come from an area that supports it then it will be foreign to you
> and
> > not make sense without someone giving you a good tutorial on it. But
> > considering C#, Dart, and Ecmascript 6 (will) have co-routine support --
> and
> > those are just the languages I can name off the top of my head -- using
> the
> > exact same keywords suggests to me that it isn't *that* difficult of a
> topic
> > to teach people. This is just one of those PEPs where you have to trust
> the
> > people with experience in the area are making good design decisions for
> > those of us who aren't in a position to contribute directly without more
> > experience in the domain.
>
> That's also a fair point, and it seems to me that there *is*
> reasonably general feeling that the experts can be trusted on the
> basic principles. There's also a huge amount of bikeshedding, but
> that's pretty much inevitable :-)
>
> But I do think that unless someone does something to offer some
> non-asyncio examples of coroutine-based asynchronous programming in
> Python, the link in people's minds between async and asyncio will
> become more and more entrenched. While asyncio is the only real event
> loop implementation, saying "async can be used for things other than
> asyncio" is a rather theoretical point.
>
> Is there anyone who feels they could write a stripped down but working
> example of a valid Python event loop *without* the asyncio aspects? Or
> is that what David Beazley's talk does? (I got the impression from
> what you said that he was aiming at async IO rather than just a non-IO
> event loop). Can asyncio.Future and asyncio.Task be reused with such
> an event loop, or would those need to be reimplemented as well?
> Writing your own event loop seems like a plausible exercise. Writing
> your own version of the whole
> task/future/coroutine/queue/synchronisation mechanisms seems like a
> lot to expect. And the event loop policy mechanism says that it works
> with loops that implement asyncio.BaseEventLoop (which as noted
> includes things like create_connection, etc).
>
> Paul

From tjreedy at udel.edu  Thu May  7 03:32:19 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 06 May 2015 21:32:19 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJJMDhbZ_RPVAkR7jCT0YiOUnW4TB+pe74c5NZwitBO6TA@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
 <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>
 <mie18u$vs8$1@ger.gmane.org>
 <CAP7+vJJMDhbZ_RPVAkR7jCT0YiOUnW4TB+pe74c5NZwitBO6TA@mail.gmail.com>
Message-ID: <miefb6$ke5$1@ger.gmane.org>

On 5/6/2015 5:39 PM, Guido van Rossum wrote:
> Sorry to send you on such a wild goose chase! I did mean the issue you
> found (#21). I just updated it with a link to a thread that has more
> news:
> https://groups.google.com/forum/#!searchin/python-tulip/tkinter/python-tulip/TaSVW-pjWro/hCP6qS4eRnAJ
> I wasn't able to verify the version by Luciano Ramalho. (And yes,
> extending all this to working with a subprocess is left as an exercise.
> It's all pretty academic IMO, given Tkinter's lack of popularity outside
> IDLE.)

On Stackoverflow pyside has gotten 40 questions tagged in the last 30 
days, wxpython 70 in the last 30 days, pyqt 114 in 30 days, while 
tkinter has gotten 101 in the last week, which would project to about 
425 in the last 30 days.  So tkinter is being used at least by 
beginners. There have been a few tkinter and python-asyncio questions.

-- 
Terry Jan Reedy


From ncoghlan at gmail.com  Thu May  7 04:30:40 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 7 May 2015 12:30:40 +1000
Subject: [Python-Dev] Clarification of PEP 476 "opting out" section
In-Reply-To: <5541E0E6.6060701@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
Message-ID: <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>

On 30 Apr 2015 5:59 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:

> On 30.04.2015 02:33, Nick Coghlan wrote:
> > Hi folks,
> >
> > This is just a note to highlight the fact that I tweaked the "Opting
> > out" section in PEP 476 based on various discussions I've had over the
> > past few months: https://hg.python.org/peps/rev/dfd96ee9d6a8
> >
> > The notable changes:
> >
> > * the example monkeypatching code handles AttributeError when looking
> > up "ssl._create_unverified_context", in order to accommodate older
> > versions of Python that don't have PEP 476 implemented
> > * new paragraph making it clearer that while the intended use case for
> > the monkeypatching trick is as a workaround to handle environments
> > where you *know* HTTPS certificate verification won't work properly
> > (including explicit references to sitecustomize.py and Standard
> > Operating Environments for Python), there's also a secondary use case
> > in allowing applications to provide a system administrator controlled
> > setting to globally disable certificate verification (hence the change
> > to the example code)
> > * new paragraph making it explicit that even though we've improved
> > Python's default behaviour, particularly security sensitive
> > applications should still provide their own context rather than
> > relying on the defaults
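The monkeypatching snippet those bullet points refer to reads roughly like this (as sketched in PEP 476's "Opting out" section):

```python
import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    # Legacy Python that doesn't verify HTTPS certificates by default
    pass
else:
    # Handle environments where certificate verification can't work
    ssl._create_default_https_context = _create_unverified_https_context
```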
>
> Can we please make the monkeypatch a regular part of Python's
> site.py which can be enabled via an environment variable, say
> export PYTHONHTTPSVERIFY=0.
>
> See http://bugs.python.org/issue23857 for the discussion.
>
> Esp. for Python 2.7.9 the default verification from PEP 476
> is causing problems for admins who want to upgrade their
> Python installation without breaking applications using
> Python. They need an easy and official non-hackish way to
> opt-out from the PEP 476 default on a per application basis.

My current recommendation to the Red Hat Python team is to put
together a draft PEP 394 style informational PEP to define a
documented configuration file based approach that redistributors may
choose to implement in order to provide a smoother transition path for
their end users:
https://bugzilla.redhat.com/show_bug.cgi?id=1173041#c8

The problem I identified with reusing the environment variable based
approach that we used with hash randomisation is that "-E" turns the
upstream default behaviour back on. That was acceptable for hash
randomisation, but would be too limiting here (as the appropriate
value for the HTTPS certificate verification configuration setting
relates primarily to the general state of SSL/TLS certificate
management in the environment where Python is being deployed, moreso
than to the specific networked applications that are being run).

I actually do think it would be good to have such a feature as a
native part of Python 2.7 in order to provide a nicer "revert to the
pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
"there's no easy way to turn HTTPS certificate verification off
globally" state of affairs to Python 3), but I don't currently have
the time available to push for that against the "end users can't be
trusted not to turn certificate verification off when they should be
fixing their certificate management instead" perspective.

Cheers,
Nick.

From ncoghlan at gmail.com  Thu May  7 05:38:43 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 7 May 2015 13:38:43 +1000
Subject: [Python-Dev] Ancient use of generators
In-Reply-To: <CAP7+vJKgcJCcvr0Jr4zOR39+Guv09c4BwGDr5LdaokDf3u-reg@mail.gmail.com>
References: <CAP7+vJKgcJCcvr0Jr4zOR39+Guv09c4BwGDr5LdaokDf3u-reg@mail.gmail.com>
Message-ID: <CADiSq7e=AqReLZoP=vCSxmV2RuGRSt9RtHpV5X2gMDOjfEqbew@mail.gmail.com>

David Beazley's tutorials on these topics are also excellent:

* Generator Tricks for Systems Programmers (PyCon 2008:
http://www.dabeaz.com/generators/)
* A Curious Course on Coroutines and Concurrency (PyCon 2009:
http://www.dabeaz.com/coroutines/)
* Generators: The Final Frontier (PyCon 2014:
http://www.dabeaz.com/finalgenerator/)

The first one focuses on iteration, the second expands to cover PEP
342 and sending values into coroutines, while the last expands to
cover *all* the different ways we use generator suspension points
these days (including in context managers and asynchronous I/O).

The async I/O section is particularly interesting because David
initially uses threads in order to delay getting into the complexities
of event loops, while still illustrating the concept of "waiting for
other things to happen".

(Note: once you get to Part 5 of the last tutorial, you're getting to
stuff that really pushes the boundaries of what generators can do,
using examples from domains that are complex in their own right. The
very last section also contains some wise words around the fact that
we're genuinely pushing out the boundaries of the language's
expressive capabilities, which does carry some non-trivial risks)

I do believe that last tutorial also does a good job of illustrating a
lot of the complexity that PEP 492 is intended to *hide* from
(future!) end users by simplifying the story to "use async & await for
cooperative multitasking, threads & processes for preemptive
multitasking, and yield & yield from for iteration", rather than
having the "cooperative multitasking" case continue to be a particular
way of using yield & yield from as it is in Python 3.4 (and a way of
using "yield" in earlier releases).

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From guido at python.org  Thu May  7 06:13:40 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 6 May 2015 21:13:40 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <miefb6$ke5$1@ger.gmane.org>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <55494377.3040608@gmail.com> <mibp9r$9b6$1@ger.gmane.org>
 <CAP7+vJKoeefmJULXuuv7Gnz2QyX+rmHLzXh_08VB6drXiRgoVg@mail.gmail.com>
 <mie18u$vs8$1@ger.gmane.org>
 <CAP7+vJJMDhbZ_RPVAkR7jCT0YiOUnW4TB+pe74c5NZwitBO6TA@mail.gmail.com>
 <miefb6$ke5$1@ger.gmane.org>
Message-ID: <CAP7+vJ+KqafJt2-PmQJ6ZYEvuvfgv8H7ySvCu4mTtHOeuKNL5w@mail.gmail.com>

Thanks for these stats! (Though I'm sure there's some bias because Tkinter
is in the stdlib and the others aren't. Stdlib status still counts a lot
for many people, pip notwithstanding. (But that's a different thread. :-))

FWIW I tried to get the Tkinter-asyncio demo mentioned in the above thread
to work but I couldn't (then again Tk on OS X is difficult). If someone
could turn this into a useful event loop that blends asyncio and Tkinter
that would be truly awesome!

On Wed, May 6, 2015 at 6:32 PM, Terry Reedy <tjreedy at udel.edu> wrote:

> On 5/6/2015 5:39 PM, Guido van Rossum wrote:
>
>> Sorry to send you on such a wild goose chase! I did mean the issue you
>> found #21). I just updated it with a link to a thread that has more
>> news:
>>
>> https://groups.google.com/forum/#!searchin/python-tulip/tkinter/python-tulip/TaSVW-pjWro/hCP6qS4eRnAJ
>> <
>> https://groups.google.com/forum/#%21searchin/python-tulip/tkinter/python-tulip/TaSVW-pjWro/hCP6qS4eRnAJ
>> >
>> I wasn't able to verify the version by Luciano Ramalho. (And yes,
>> extending all this to working with a subprocess is left as an exercise.
>> It's all pretty academic IMO, given Tkinter's lack of popularity outside
>> IDLE.)
>>
>
> On Stackoverflow pyside has gotten 40 questions tagged in the last 30
> days, wxpython 70 in the last 30 days, pyqt 114 in 30 days, while tkinter
> has gotten 101 in the last week, which would project to about 425 in the
> last 30 days.  So tkinter is being used at least by beginners. There have
> been a few tkinter and python-asyncio questions.
>
>
> --
> Terry Jan Reedy
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150506/299c98ee/attachment.html>

From arnodel at gmail.com  Sun May  3 20:59:03 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Sun, 3 May 2015 19:59:03 +0100
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <20150503152438.GU5663@ando.pearwood.info>
References: <554185C2.5080003@gmail.com> <mhvs5p$tg8$1@ger.gmane.org>
 <CAP7+vJ+VVgrdtXP8zRyv9_oy49Trf8wUdDbJ--OC+yBqqFT1Fw@mail.gmail.com>
 <mi0c1e$819$1@ger.gmane.org> <5543CB61.2080905@gmail.com>
 <mi0im0$lo5$1@ger.gmane.org> <20150501191937.GB8013@stoneleaf.us>
 <5543D2F4.3060207@gmail.com>
 <CAJ6cK1bfBe040TKCSXBLjEnCLTjcydeDWXHt+Rj9VHFOHy+pkA@mail.gmail.com>
 <20150503152438.GU5663@ando.pearwood.info>
Message-ID: <CAJ6cK1Z-u2XM7c21X8WuneJ_1Y1aS=BH=6+90wHsJP1cfde5Kw@mail.gmail.com>

On 3 May 2015 at 16:24, Steven D'Aprano <steve at pearwood.info> wrote:
> On Fri, May 01, 2015 at 09:24:47PM +0100, Arnaud Delobelle wrote:
>
>> I'm not convinced that allowing an object to be both a normal and an
>> async iterator is a good thing.  It could be a recipe for confusion.
>
> In what way?
>
> I'm thinking that the only confusion would be if you wrote "async for"
> instead of "for", or vice versa, and instead of getting an exception you
> got the (a)synchronous behaviour you didn't want.

Yes.  This is the same kind of confusion that this PEP is trying hard
to get rid of in other parts (i.e. the confusion between 'yield' and
'yield from' in current coroutines).

> But I have no intuition for how likely it is that you could write an
> asynchronous for loop, leave out the async, and still have the code do
> something meaningful.

Well, if you've made your object both iterable and 'async iterable' then
it's very likely that you're going to get something meaningful out of
either kind of iteration.  Just not the way you want it if you
mistakenly left out (or added) the 'async' keyword in your loop.

> Other than that, I think it would be fine to have an object be both a
> synchronous and asynchronous iterator. You specify the behaviour you want
> by how you use it. We can already do that, e.g. unittest's assertRaises
> is both a test assertion and a context manager.

The latter is fine, because there is no danger of mistaking one for
the other, unlike iterators and 'async iterators'.

But my argument is simply that there is no good reason to aim for the
ability to have an object conform to both protocols.  So it shouldn't be
a criterion when looking at the merits of a proposal.  I may very well
be wrong but I haven't yet seen a compelling case for an object
implementing both protocols.

Cheers,

-- 
Arnaud

From arnodel at gmail.com  Sun May  3 21:03:09 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Sun, 3 May 2015 20:03:09 +0100
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <55457864.60500@canterbury.ac.nz>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info>
 <mi0lc7$1eh$1@ger.gmane.org>
 <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
 <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
 <CAP7+vJJaLsUok3pVjsuAKe8CpHaBr2O3GXOAGmX=HoqPc=t8JA@mail.gmail.com>
 <55457864.60500@canterbury.ac.nz>
Message-ID: <CAJ6cK1btJYcuxrMoj=vC4uV0=XJyWJ_EP3w0wWEfpvkJd=nV6A@mail.gmail.com>

On 3 May 2015 at 02:22, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>>
>> On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle <arnodel at gmail.com
>> <mailto:arnodel at gmail.com>> wrote:
>>
>>     Does this mean that
>>     somehow "await x" guarantees that the coroutine will suspend at least
>>     once?
>
>
> No. First, it's possible for x to finish without yielding.
> But even if x yields, there is no guarantee that the
> scheduler will run something else -- it might just
> resume the same task, even if there is another one that
> could run. It's up to the scheduler whether it
> implements any kind of "fair" scheduling policy.

That's what I understood but the example ('yielding()') provided by
Ron Adam seemed to imply otherwise, so I wanted to clarify.
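Greg's point can be made concrete with a small hand-driven sketch (hypothetical names, no asyncio involved): if the awaitable never yields, the entire coroutine chain completes on the very first send(None), so the scheduler never gets a chance to run anything else.

```python
class Ready:
    """Hypothetical awaitable that completes without ever suspending."""
    def __await__(self):
        return iter(())  # an already-exhausted iterator: no yields at all

async def inner():
    await Ready()   # does not suspend
    return 42

async def outer():
    return await inner()

c = outer()
try:
    c.send(None)            # the whole chain runs to completion here
    suspended = True
except StopIteration as exc:
    suspended = False
    result = exc.value      # the coroutine's return value
```

Here `suspended` ends up False and `result` is 42: "await x" made no guarantee of suspension.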

-- 
Arnaud

From ryan at ryanhiebert.com  Tue May  5 23:29:44 2015
From: ryan at ryanhiebert.com (Ryan Hiebert)
Date: Tue, 5 May 2015 16:29:44 -0500
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJ+=5bMxaQqn_gz5-2w1gfzvTnsRaKBY37utGGbv86pvfA@mail.gmail.com>
References: <5548EFF8.4060405@gmail.com>
 <55491c91.4833370a.5838.38da@mx.google.com>
 <CAP7+vJ+=5bMxaQqn_gz5-2w1gfzvTnsRaKBY37utGGbv86pvfA@mail.gmail.com>
Message-ID: <72DF8A88-3B2D-4A24-A799-D3134EAAF10E@ryanhiebert.com>

This draft proposal for async generators in ECMAScript 7 may be interesting reading to those who haven't already: https://github.com/jhusain/asyncgenerator <https://github.com/jhusain/asyncgenerator>

This talk also has some good ideas about them, though the interesting stuff about using async generator syntax is all on the last slide, and not really explained: https://www.youtube.com/watch?v=gawmdhCNy-A <https://www.youtube.com/watch?v=gawmdhCNy-A>
> On May 5, 2015, at 3:55 PM, Guido van Rossum <guido at python.org> wrote:
> 
> One small clarification:
> 
> On Tue, May 5, 2015 at 12:40 PM, Jim J. Jewett <jimjjewett at gmail.com <mailto:jimjjewett at gmail.com>> wrote:
> [...] but I don't understand how this limitation works with things like a
> per-line file iterator that might need to wait for the file to
> be initially opened.
> 
>  Note that PEP 492 makes it syntactically impossible to use a coroutine function to implement an iterator using yield; this is because the generator machinery is needed to implement the coroutine machinery. However, the PEP allows the creation of asynchronous iterators using classes that implement __aiter__ and __anext__. Any blocking you need to do can happen in either of those. You just use `async for` to iterate over such an "asynchronous stream".
> 
> (There's an issue with actually implementing an asynchronous stream mapped to a disk file, because I/O multiplexing primitives like select() don't actually support waiting for disk files -- but this is an unrelated problem, and asynchronous streams are useful to handle I/O to/from network connections, subprocesses (pipes) or local RPC connections. Checkout the streams <https://docs.python.org/3/library/asyncio-stream.html> and subprocess <https://docs.python.org/3/library/asyncio-subprocess.html> submodules of the asyncio package. These streams would be great candidates for adding __aiter__/__anext__ to support async for-loops, so the idiom for iterating over them can once again closely resemble the idiom for iterating over regular (synchronous) streams using for-loops.)
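A minimal sketch of the class-based protocol Guido describes (hypothetical names, no real I/O): __aiter__ returns the iterator object itself, and the coroutine __anext__ produces values, raising StopAsyncIteration when the stream is exhausted.

```python
class Countdown:
    """Hypothetical asynchronous iterator; no real I/O, just the protocol."""
    def __init__(self, n):
        self.n = n

    def __aiter__(self):
        return self

    async def __anext__(self):
        # In a real stream, any blocking/awaiting would happen here.
        if self.n <= 0:
            raise StopAsyncIteration
        self.n -= 1
        return self.n

async def consume():
    seen = []
    async for value in Countdown(3):
        seen.append(value)
    return seen

# Drive by hand, as an event loop would; nothing here ever suspends,
# so a single send(None) runs the whole loop to completion.
c = consume()
try:
    c.send(None)
except StopIteration as exc:
    seen = exc.value   # [2, 1, 0]
```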
> 
> -- 
> --Guido van Rossum (python.org/~guido <http://python.org/~guido>)
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/ryan%40ryanhiebert.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/e2c4d707/attachment-0001.html>

From rajiv.kumar at gmail.com  Wed May  6 00:36:53 2015
From: rajiv.kumar at gmail.com (Rajiv Kumar)
Date: Tue, 5 May 2015 15:36:53 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
Message-ID: <CAGWVM=iTosr8xTa4fCJrXXjzmTHOPB7HbYh_xjXhdDRH+mgDvQ@mail.gmail.com>

I wrote a little example[1] that has a bare-bones implementation of Go
style channels via a custom event loop. I used it to translate the prime
sieve example from Go[2] almost directly to Python. The code uses "message
= await channel.receive()" to mimic Go's "message <- channel". Instead of
using "go func()" to fire off a goroutine, I add the PEP492 coroutine to my
simple event loop.

It's not an efficient implementation - really just a proof of concept that
you can use async/await in your own code without any reference to asyncio.
I ended up writing it as I was thinking about what PEP 342 style coroutines
might look like in an async/await world.

In the course of writing this, I did find that it would be useful to have
the PEP document how event loops should advance the coroutines (via
.send(None) for example). It would also be helpful to have the semantics of
how await interacts with different kinds of awaitables documented. I had to
play with Yury's implementation to see what it does if the __await__ just
returns iter([1,2,3]) for example.
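For reference, here is a sketch of the behaviour in question (hypothetical class name): the values produced by the iterator that __await__ returns are passed straight through to whoever calls .send(None), and the result of the await expression is whatever value StopIteration carries — None for a plain exhausted iterator.

```python
class Odd:
    """Hypothetical awaitable whose __await__ hands back a plain iterator."""
    def __await__(self):
        return iter([1, 2, 3])

async def coro():
    return await coro_target()

async def coro_target():
    return await Odd()

c = coro()
yielded = []
try:
    while True:
        # An event loop advances a coroutine exactly like this.
        yielded.append(c.send(None))
except StopIteration as exc:
    final = exc.value

# yielded == [1, 2, 3]; final is None (the exhausted iterator carried no value)
```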

- Rajiv

[1] https://gist.github.com/vrajivk/c505310fb79d412afcd5#file-sieve-py
     https://gist.github.com/vrajivk/c505310fb79d412afcd5#file-channel-py

[2] https://golang.org/doc/play/sieve.go


On Tue, May 5, 2015 at 2:54 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 5 May 2015 at 22:38, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> > On 2015-05-05 5:01 PM, Paul Moore wrote:
> >>
> >> On 5 May 2015 at 21:00, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> >>>
> >>> On 2015-05-05 3:40 PM, Jim J. Jewett wrote:
> >>>>
> >>>> On Tue May 5 18:29:44 CEST 2015, Yury Selivanov posted an updated
> >>>> PEP492.
> >>>>
> >>>> Where are the following over-simplifications wrong?
> >>>>
> >>>> (1)  The PEP is intended for use (almost exclusively) with
> >>>> asychronous IO and a scheduler such as the asynchio event loop.
> >>>
> >>> Yes. You can also use it for UI loops.  Basically, anything
> >>> that can call your code asynchronously.
> >>
> >> Given that the stdlib doesn't provide an example of such a UI loop,
> >> what would a 3rd party module need to implement to provide such a
> >> thing? Can any of the non-IO related parts of asyncio be reused for
> >> the purpose, or must the 3rd party module implement everything from
> >> scratch?
> >
> > The idea is that you integrate processing of UI events to
> > your event loop of choice.  For instance, Twisted has
> > integration for QT and other libraries [1].  This way you
> > can easily combine async network (or OS) calls with your
> > UI logic to avoid "callback hell".
>
> We seem to be talking at cross purposes. You say the PEP is *not*
> exclusively intended for use with asyncio. You mention UI loops, but
> when asked how to implement such a loop, you say that I integrate UI
> events into my event loop of choice. But what options do I have for
> "my event loop of choice"? Please provide a concrete example that
> isn't asyncio. Can I use PEP 492 with Twisted (I doubt it, as Twisted
> doesn't use yield from, which is Python 3.x only)? I contend that
> there *is* no concrete example that currently exists, so I'm asking
> what I'd need to do to write one. You pointed at qamash, but that
> seems to be subclassing asyncio, so isn't "something that isn't
> asyncio".
>
> Note that I don't have a problem with there being no existing
> implementation other than asyncio. I'd just like it if we could be
> clear over exactly what we mean when we say "the PEP is not tied to
> asyncio". It feels like the truth currently is "you can write your own
> async framework that uses the new features introduced by the PEP". I
> fully expect that *if* there's a need for async frameworks that aren't
> fundamentally IO multiplexors, then it'll get easier to write them
> over time (the main problem right now is a lack of good tutorial
> examples of how to do so). But at the moment, asyncio seems to be the
> only game in town (and I can imagine that it'll always be the main IO
> multiplexor, unless existing frameworks like Twisted choose to compete
> rather than integrate).
>
> Paul
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rajiv.kumar%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150505/15801825/attachment-0001.html>

From pmiscml at gmail.com  Wed May  6 12:03:37 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Wed, 6 May 2015 13:03:37 +0300
Subject: [Python-Dev] Minimal async event loop and async utilities (Was:
 PEP 492: async/await in Python; version 4)
In-Reply-To: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
References: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
Message-ID: <20150506130337.3afc095d@x230>

Hello,

On Wed, 6 May 2015 09:27:16 +0100
Paul Moore <p.f.moore at gmail.com> wrote:

> On 6 May 2015 at 07:46, Greg Ewing <greg.ewing at canterbury.ac.nz>
> wrote:
> > Another problem with the "core" idea is that
> > you can't start with an event loop that "just does
> > scheduling" and then add on other features such
> > as I/O *from the outside*. There has to be some
> > point at which everything comes together, which
> > means choosing something like select() or
> > poll() or I/O completion queues, and build that
> > into the heart of your event loop. At that point
> > it's no longer something with a simple core.

[]

> So, to some extent (how far is something I'd need to code up a loop to
> confirm) you can build the Futures and synchronisation mechanisms with
> an event loop that supports only this "minimal interface".
> 
> Essentially, that's my goal - to allow people who want to write (say)
> a Windows GUI event loop, or a Windows event loop based of
> WaitForXXXObject, or a Tkinter loop, or whatever, to *not* have to
> write their own implementation of synchronisation or future objects.
> 
> That may mean lifting the asyncio code and putting it into a separate
> library, to make the separation between "asyncio-dependent" and
> "general async" clearer. Or if asyncio's provisional status doesn't
> last long enough to do that, we may end up with an asyncio
> implementation and a separate (possibly 3rd party) "general"
> implementation.

MicroPython has an alternative implementation of an asyncio subset. It's
structured as a generic scheduler component "uasyncio.core"
https://github.com/micropython/micropython-lib/blob/master/uasyncio.core/uasyncio/core.py
(170 total lines) and "uasyncio" which adds I/O scheduling on top of
it:
https://github.com/micropython/micropython-lib/blob/master/uasyncio/uasyncio/__init__.py

"uasyncio.core" can be used separately, and is intended for usage as
such on e.g. microcontrollers. It's built around the native Python concept
of coroutines (plus callbacks). It doesn't include the concept of futures.
They can be added as an extension built on top, but so far I haven't seen
a need for that, even while developing a web picoframework
for uasyncio (https://github.com/pfalcon/picoweb)
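A toy version of such a futures-free scheduler (hypothetical names, not uasyncio's actual API) fits in a few lines: a deque of coroutines advanced round-robin with .send(None), plus a tiny awaitable that acts as the cooperative yield point.

```python
import collections

class _Yield:
    """Hypothetical yield point: suspends the coroutine exactly once."""
    def __await__(self):
        yield  # hand control back to the scheduler

class MiniLoop:
    """Toy futures-free round-robin scheduler (not uasyncio's real API)."""
    def __init__(self):
        self.ready = collections.deque()

    def create_task(self, coro):
        self.ready.append(coro)

    def run(self):
        while self.ready:
            task = self.ready.popleft()
            try:
                task.send(None)        # advance to the next suspension point
            except StopIteration:
                continue               # coroutine finished; drop it
            self.ready.append(task)    # otherwise, reschedule at the back

async def worker(name, out, n):
    for i in range(n):
        out.append((name, i))
        await _Yield()                 # cooperative reschedule

loop = MiniLoop()
out = []
loop.create_task(worker('a', out, 2))
loop.create_task(worker('b', out, 2))
loop.run()
# out interleaves: [('a', 0), ('b', 0), ('a', 1), ('b', 1)]
```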


-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From gmludo at gmail.com  Wed May  6 14:20:46 2015
From: gmludo at gmail.com (Ludovic Gasc)
Date: Wed, 6 May 2015 14:20:46 +0200
Subject: [Python-Dev] Accepting PEP 492 (async/await)
In-Reply-To: <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
References: <CAP7+vJK4djBhdbjq4+38-ZPU6DcZLy=KS2A0STwoCxwwpPBJ5w@mail.gmail.com>
 <CAP7+vJLOgW5w6UHiCSabxMxmeBJc5A3Qp4jGcRqAN2wYGhmVPw@mail.gmail.com>
Message-ID: <CAON-fpH5Nd3q=XCCQwKB57gSBFZ1J0=g5WqUxfsrgbFgCjQtag@mail.gmail.com>

Thank you Yury for the whole process (PEP + code + handling the debate).

It's the first time I've followed the genesis of a PEP, from the idea to
its acceptance; it was very instructive for me.

--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-05-06 1:58 GMT+02:00 Guido van Rossum <guido at python.org>:

> I totally forgot to publicly congratulate Yury on this PEP. He's put a
> huge effort into writing the PEP and the implementation and managing the
> discussion, first on python-ideas, later on python-dev. Congrats, Yury! And
> thanks for your efforts. Godspeed.
>
> On Tue, May 5, 2015 at 4:53 PM, Guido van Rossum <guido at python.org> wrote:
>
>> Everybody,
>>
>> In order to save myself a major headache I'm hereby accepting PEP 492.
>>
>> I've been following Yury's efforts carefully and I am fully confident
>> that we're doing the right thing here. There is only so much effort we can
>> put into clarifying terminology and explaining coroutines. Somebody should
>> write a tutorial. (I started to write one, but I ran out of time after just
>> describing basic yield.)
>>
>> I've given Yury clear instructions to focus on how to proceed -- he's to
>> work with another core dev on getting the implementation ready in time for
>> beta 1 (scheduled for May 24, but I think the target date should be May 19).
>>
>> The acceptance is provisional in the PEP 411 sense (stretching its
>> meaning to apply to language changes). That is, we reserve the right to
>> change the specification (or even withdraw it, in a worst-case scenario)
>> until 3.6, although I expect we won't need to do this except for some
>> peripheral issues (e.g. the backward compatibility flags).
>>
>> I now plan to go back to PEP 484 (type hints). Fortunately in that case
>> there's not much *implementation* that will land (just the typing.py
>> module), but there's still a lot of language in the PEP that needs updating
>> (check the PEP 484 tracker <https://github.com/ambv/typehinting/issues>).
>>
>> --
>> --Guido van Rossum (python.org/~guido)
>>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/gmludo%40gmail.com
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150506/b0c14d36/attachment.html>

From mertz at gnosis.cx  Wed May  6 22:21:51 2015
From: mertz at gnosis.cx (David Mertz)
Date: Wed, 6 May 2015 15:21:51 -0500
Subject: [Python-Dev] Ancient use of generators
In-Reply-To: <CAP7+vJKgcJCcvr0Jr4zOR39+Guv09c4BwGDr5LdaokDf3u-reg@mail.gmail.com>
References: <CAP7+vJKgcJCcvr0Jr4zOR39+Guv09c4BwGDr5LdaokDf3u-reg@mail.gmail.com>
Message-ID: <CAEbHw4Yt1N4OW19AH5MC-+xOeQzvbF2npECK9MiTU+3r7OfBaw@mail.gmail.com>

I'm glad to see that everything old is new again.  All the stuff being
discussed nowadays, even up through PEP 492, was largely what I was trying
to show in 2002 .... the syntax just got nicer in the intervening 13 years
:-).

On Wed, May 6, 2015 at 10:57 AM, Guido van Rossum <guido at python.org> wrote:

> For those interested in tracking the history of generators and coroutines
> in Python, I just found out that PEP 342
> <https://www.python.org/dev/peps/pep-0342/> (which introduced
> send/throw/close and made "generators as coroutines" a mainstream Python
> concept) harks back to PEP 288 <https://www.python.org/dev/peps/pep-0288/>,
> which was rejected. PEP 288 also proposed some changes to generators. The
> interesting bit though is in the references: there are two links to old
> articles by David Mertz that describe using generators in state machines
> and other interesting and unconventional applications of generators. All
> these well predated PEP 342, so yield was a statement and could not receive
> a value from the function calling next() -- communication was through a
> shared class instance.
>
> http://gnosis.cx/publish/programming/charming_python_b5.txt
> http://gnosis.cx/publish/programming/charming_python_b7.txt
>
> Enjoy!
>
> --
> --Guido van Rossum (python.org/~guido)
>



-- 
Keeping medicines from the bloodstreams of the sick; food
from the bellies of the hungry; books from the hands of the
uneducated; technology from the underdeveloped; and putting
advocates of freedom in prisons.  Intellectual property is
to the 21st century what the slave trade was to the 16th.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150506/d656a0be/attachment.html>

From ron3200 at gmail.com  Thu May  7 15:50:08 2015
From: ron3200 at gmail.com (Ron Adam)
Date: Thu, 07 May 2015 09:50:08 -0400
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <CAJ6cK1btJYcuxrMoj=vC4uV0=XJyWJ_EP3w0wWEfpvkJd=nV6A@mail.gmail.com>
References: <20150430002147.GE10248@stoneleaf.us>
 <CADiSq7f3WrOPZX9omV2=fXq4WWdtmUpLG9OMgyqDDBCC_45n7A@mail.gmail.com>
 <CAP7+vJKbDReb87rwbnc90+ncw=J0=R2hAFR6Xpgtrj-3gSDX3g@mail.gmail.com>
 <CADiSq7cxMDJ32cmrWBHScPMFL22-CMTFu9yoTL1kCoJ_RsWvCg@mail.gmail.com>
 <CAP7+vJJPrvN1XJ83S5PdSwzaJcu3VJvqbHJ20kwJr6=rUbgNMA@mail.gmail.com>
 <20150501115447.GJ5663@ando.pearwood.info> <mi0lc7$1eh$1@ger.gmane.org>
 <CAP7+vJJgEbXJN6RA6U4WkTHr43cbgCAvm+aEnf4rf8Wnb6ZOKQ@mail.gmail.com>
 <CAJ6cK1bhhj87GmLbsagSKOCnYPXRiO8iEsCnnvLTt6rV_ttJZg@mail.gmail.com>
 <CAP7+vJJaLsUok3pVjsuAKe8CpHaBr2O3GXOAGmX=HoqPc=t8JA@mail.gmail.com>
 <55457864.60500@canterbury.ac.nz>
 <CAJ6cK1btJYcuxrMoj=vC4uV0=XJyWJ_EP3w0wWEfpvkJd=nV6A@mail.gmail.com>
Message-ID: <mifqih$sfn$1@ger.gmane.org>



On 05/03/2015 03:03 PM, Arnaud Delobelle wrote:
> On 3 May 2015 at 02:22, Greg Ewing<greg.ewing at canterbury.ac.nz>  wrote:
>> >Guido van Rossum wrote:
>>> >>
>>> >>On Sat, May 2, 2015 at 1:18 PM, Arnaud Delobelle <arnodel at gmail.com
>>> >><mailto:arnodel at gmail.com>> wrote:
>>> >>
>>> >>     Does this mean that
>>> >>     somehow "await x" guarantees that the coroutine will suspend at least
>>> >>     once?
>> >
>> >
>> >No. First, it's possible for x to finish without yielding.
>> >But even if x yields, there is no guarantee that the
>> >scheduler will run something else -- it might just
>> >resume the same task, even if there is another one that
>> >could run. It's up to the scheduler whether it
>> >implements any kind of "fair" scheduling policy.
> That's what I understood but the example ('yielding()') provided by
> Ron Adam seemed to imply otherwise, so I wanted to clarify.


Guido is correct of course.  In examples I've used before with trampolines, 
a co-routine would be yielded back to the event loop, and if there were any 
other co-routines in the event loop they would execute first.  I'm not sure 
if async and await can be used with a trampoline type scheduler.

A scheduler might use a timer or priority based system to schedule 
events.  So yes, it's up to the scheduler, and PEP 492 is intended to be 
flexible as to what scheduler is used.

Cheers,
    Ron

From dimaqq at gmail.com  Thu May  7 16:25:27 2015
From: dimaqq at gmail.com (Dima Tisnek)
Date: Thu, 7 May 2015 16:25:27 +0200
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: <20150430123345.GD5663@ando.pearwood.info>
References: <CAGGBzX+yci3B7BQqwxkV1z1U_NA4VT__YaJ8s5XT2S3DPLUtuA@mail.gmail.com>
 <20150430123345.GD5663@ando.pearwood.info>
Message-ID: <CAGGBzX+0xXk8mPHFs5aKmT1Jvb5hMCfZL1wDa1SU7HTzJNOT4Q@mail.gmail.com>

On 30 April 2015 at 14:33, Steven D'Aprano <steve at pearwood.info> wrote:
> On Thu, Apr 30, 2015 at 01:41:53PM +0200, Dima Tisnek wrote:
>
>> # Syntactic sugar
>> "Beautiful is better than ugly, " thus nice syntax is needed.
>> Current syntax is very mechanical.
>> Syntactic sugar is needed on top of current PEP.
>
> I think the annotation syntax is beautiful. It reminds me of Pascal.

Haha, good one!

>
>
>> # internal vs external
>> @intify
>> def foo() -> int:
>>     b = "42"
>>     return b  # check 1
>> x = foo() // 2  # check 2
>>
>> Does the return type apply to implementation (str) or decorated callable (int)?
>
> I would expect that a static type checker would look at foo, and flag
> this as an error. The annotation says that foo returns an int, but it
> clearly returns a string. That's an obvious error.

Is this per PEP, or just a guess?

I think PEP needs to be explicit about this.

[snipped]
>
>     lambda arg: arg + 1
>
> Obviously arg must be a Number, since it has to support addition with
> ints.

Well, no, it can be any custom type that implements __add__.
Anyhow, the question was about non-trivial lambdas.

>> # local variables
>> Not mentioned in the PEP
>> Non-trivial code could really use these.
>
> Normally local variables will have their type inferred from the
> operations done to them:
>
>     s = arg[1:]  # s has the same type as arg

Good point, should be mentioned in PEP.

Technically, the type can be an empty list, a mixed list, or a custom
return type from an overloaded __getitem__ that accepts slices.

>> # comprehensions
>> [3 * x.data for x in foo if "bar" in x.type]
>> Arguable, perhaps annotation is only needed on `foo` here, but then
>> how complex comprehensions, e.g. below, the intermediate comprehension
>> could use an annotation
>> [xx for y in [...] if ...]
>
> A list comprehension is obviously of type List. If you need to give a
> more specific hint:
>
> result = [expr for x in things if cond(x)]  #type: List[Whatever]
>
> See also the discussion of "cast" in the PEP.
>
> https://www.python.org/dev/peps/pep-0484/#id25

Good point for overall comprehension type.
re: cast, personally I have some reservations about placing `cast()`
in the runtime path.

I'm sorry if I was not clear. My question was how the type of the
ephemeral `x` should be specified.
In other words, can types be specified on anything inside a comprehension?

>> # class attributes
>> s = socket.socket(...)
>> s.type, s.family, s.proto  # int
>> s.fileno  # callable
>> If annotations are only available for methods, it will lead to
>> Java-style explicit getters and setters.
>> Python language and data model prefers properties instead, thus
>> annotations are needed on attributes.
>
> class Thing:
>     a = 42  # can be inferred
>     b = []  # inferred as List[Any]
>     c = []  #type: List[float]

Good point, I suppose comments + stub file may be sufficient.

A stub is better (required?) because it allows specifying the types of
attributes that are not assigned in class scope, but that are expected
to be there as a result of __init__ or because it's a C extension.

An example in PEP would be good.

From dimaqq at gmail.com  Thu May  7 16:28:37 2015
From: dimaqq at gmail.com (Dima Tisnek)
Date: Thu, 7 May 2015 16:28:37 +0200
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: <20150430115411.GN429@tonks>
References: <CAGGBzX+yci3B7BQqwxkV1z1U_NA4VT__YaJ8s5XT2S3DPLUtuA@mail.gmail.com>
 <20150430115411.GN429@tonks>
Message-ID: <CAGGBzX+GAZ46yO30pJNbWCCOmyANbUx=LqNDcmgM6Y0C6+JAUw@mail.gmail.com>

>> # plain data
>> user1 = dict(id=123,  # always int
>>     name="uuu",  # always str
>>     ...)  # other fields possible
>> smth = [42, "xx", ...]
>> (why not namedtuple? b/c extensible, mutable)
>> At least one PHP IDE allows to annotate PDO.
>> Perhaps it's just bad taste in Python? Or is there a valid use-case?
>
> Most (all?) of this is actually mentioned in the PEP:
> https://www.python.org/dev/peps/pep-0484/#type-comments

The question was about mixed containers, e.g.:
x = [12, "Jane"]
y = [13, "John", 99.9]

The first element is always int, the second always str, and the rest is variable.
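For the record, the closest spellings available in the PEP as drafted look something like the following sketch (switching the fixed-shape case to a tuple, since only Tuple supports per-position types; the mixed list with a variable tail degrades to a Union element type and loses the positional guarantee):

```python
from typing import Tuple, List, Union

# Fixed-shape record: a tuple can carry per-position types.
x = (12, "Jane")           # type: Tuple[int, str]

# Mixed list with a variable tail: the element type becomes a Union,
# so the "first is int, second is str" information is lost.
y = [13, "John", 99.9]     # type: List[Union[int, str, float]]
```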

From guido at python.org  Thu May  7 17:39:25 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 7 May 2015 08:39:25 -0700
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: <CAGGBzX+0xXk8mPHFs5aKmT1Jvb5hMCfZL1wDa1SU7HTzJNOT4Q@mail.gmail.com>
References: <CAGGBzX+yci3B7BQqwxkV1z1U_NA4VT__YaJ8s5XT2S3DPLUtuA@mail.gmail.com>
 <20150430123345.GD5663@ando.pearwood.info>
 <CAGGBzX+0xXk8mPHFs5aKmT1Jvb5hMCfZL1wDa1SU7HTzJNOT4Q@mail.gmail.com>
Message-ID: <CAP7+vJJGgpe1px5RYxhrWEwp3eScq19grSRy=ru2BzDAmt9MQg@mail.gmail.com>

On Thu, May 7, 2015 at 7:25 AM, Dima Tisnek <dimaqq at gmail.com> wrote:

> On 30 April 2015 at 14:33, Steven D'Aprano <steve at pearwood.info> wrote:
> > On Thu, Apr 30, 2015 at 01:41:53PM +0200, Dima Tisnek wrote:
> >> # internal vs external
> >> @intify
> >> def foo() -> int:
> >>     b = "42"
> >>     return b  # check 1
> >> x = foo() // 2  # check 2
> >>
> >> Does the return type apply to implementation (str) or decorated
> callable (int)?
> >
> > I would expect that a static type checker would look at foo, and flag
> > this as an error. The annotation says that foo returns an int, but it
> > clearly returns a string. That's an obvious error.
>
> Is this per the PEP, or just a guess?
>
> I think the PEP needs to be explicit about this.
>

The PEP shouldn't have to explain all the rules for type inferencing.
There's a section "What is checked?" that says (amongst other things):

  The body of a checked function is checked for consistency with the
  given annotations.  The annotations are also used to check correctness
  of calls appearing in other checked functions.
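Concretely, for the @intify example upthread, both readings can be seen in a
runnable sketch (the decorator here is a hypothetical reconstruction):

```python
def intify(func):
    # Hypothetical reconstruction of the decorator from the thread:
    # it coerces the wrapped function's result to int.
    def wrapper(*args, **kwargs):
        return int(func(*args, **kwargs))
    return wrapper

@intify
def foo() -> int:
    b = "42"
    return b   # check 1: a checker flags this -- str returned, int declared

x = foo() // 2  # check 2: fine at runtime, the decorated callable returns int
print(x)
```

At runtime the coercion hides the mismatch, which is exactly why the checker
compares the body against the declared annotation.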

> Normally local variables will have their type inferred from the
> > operations done to them:
> >
> >     s = arg[1:]  # s has the same type as arg
>
> Good point, should be mentioned in PEP.
>

Again, what do you want the PEP to say? I am trying to keep the PEP shorter
than the actual code that implements the type checker. :-)


> Technically, type can be empty list, mixed list or custom return type
> for overloaded __getitem__ that accepts slices.
>
> I'm sorry if I was not clear. My question was how should type of
> ephemeral `x` be specified.
> In other words, can types be specified on anything inside a comprehension?
>

That's actually a good question; the PEP shows some examples of # type:
comments in peculiar places, but there's no example using list
comprehensions. Your best bet is to leave it to the type inferencer; if
your comprehension is so complex that you need to put type annotations on
parts of it, you may be better off rewriting it as a regular for-loop, which
offers more options for annotations.
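For instance, a small sketch of that rewrite:

```python
from typing import List

data = ["1", "2", "3"]

# Instead of trying to annotate inside a comprehension, a regular loop
# lets the accumulator carry an ordinary type comment:
result = []  # type: List[int]
for s in data:
    result.append(int(s))

print(result)
```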


> Stub is better (required?) because it allows to specify types of
> attributes that are not assigned in class scope, but that are expected
> to be there as result of __init__ or because it's a C extension.
>

Note that the PEP does not explicitly say whether the information of a stub
might be *merged* with the information gleaned from the source code. The
basic model is that if a stub is present the implementation source code is
not read at all by the type checker (and, conversely, information from
stubs is not available at all at runtime). But it is possible for some type
checker to improve upon this model.


> An example in PEP would be good.
>

Can you give an example that I can edit and put in the PEP?

-- 
--Guido van Rossum (python.org/~guido)

From guido at python.org  Thu May  7 19:42:52 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 7 May 2015 10:42:52 -0700
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAGWVM=iTosr8xTa4fCJrXXjzmTHOPB7HbYh_xjXhdDRH+mgDvQ@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <CAGWVM=iTosr8xTa4fCJrXXjzmTHOPB7HbYh_xjXhdDRH+mgDvQ@mail.gmail.com>
Message-ID: <CAP7+vJJNF0DuLDg4Zr8OaVOYes==L5moU6qUvjSLpFXGGRting@mail.gmail.com>

On Tue, May 5, 2015 at 3:36 PM, Rajiv Kumar <rajiv.kumar at gmail.com> wrote:

> I wrote a little example[1] that has a bare-bones implementation of Go
> style channels via a custom event loop. I used it to translate the prime
> sieve example from Go[2] almost directly to Python. The code uses "message
> = await channel.receive()" to mimic Go's "message <- channel". Instead of
> using "go func()" to fire off a goroutine, I add the PEP492 coroutine to my
> simple event loop.
>

Cool example!

It's not an efficient implementation - really just a proof of concept that
> you can use async/await in your own code without any reference to asyncio.
> I ended up writing it as I was thinking about how PEP 342 style coroutines
> might look like in an async/await world.
>
> In the course of writing this, I did find that it would be useful to have
> the PEP document how event loops should advance the coroutines (via
> .send(None) for example). It would also be helpful to have the semantics of
> how await interacts with different kinds of awaitables documented. I had to
> play with Yury's implementation to see what it does if the __await__ just
> returns iter([1,2,3]) for example.
>

I've found this too. :-) Yury, perhaps you could show a brief example in
the PEP of how to "drive" a coroutine from e.g. main()?
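Something along these lines, perhaps (a bare-bones sketch with no asyncio;
the Awaitable class is invented for illustration):

```python
class Awaitable:
    # An object whose __await__ suspends once, then produces a value.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        yield            # suspend: control returns to the driver
        return self.value

async def main():
    x = await Awaitable(42)
    return x + 1

def drive(coro):
    # The core of any event loop: advance the coroutine with send(None)
    # until it finishes, signalled by StopIteration carrying the result.
    while True:
        try:
            coro.send(None)
        except StopIteration as exc:
            return exc.value

print(drive(main()))  # 43
```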


> - Rajiv
>
> [1] https://gist.github.com/vrajivk/c505310fb79d412afcd5#file-sieve-py
>      https://gist.github.com/vrajivk/c505310fb79d412afcd5#file-channel-py
>
> [2] https://golang.org/doc/play/sieve.go
>

-- 
--Guido van Rossum (python.org/~guido)

From yselivanov.ml at gmail.com  Thu May  7 20:01:48 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 07 May 2015 14:01:48 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 5
In-Reply-To: <CAP7+vJJNF0DuLDg4Zr8OaVOYes==L5moU6qUvjSLpFXGGRting@mail.gmail.com>
References: <55491c91.4833370a.5838.38da@mx.google.com>
 <55492164.8090906@gmail.com>
 <CACac1F-PttdYx4jXKnDkvyybW+q_UVkk79pO6M6Uo+GO+D3SRA@mail.gmail.com>
 <55493841.9070601@gmail.com>
 <CACac1F8AVnqzVxQjRetWhgQtjkzxMDYTKfB47oGFkZFw9BShfw@mail.gmail.com>
 <CAGWVM=iTosr8xTa4fCJrXXjzmTHOPB7HbYh_xjXhdDRH+mgDvQ@mail.gmail.com>
 <CAP7+vJJNF0DuLDg4Zr8OaVOYes==L5moU6qUvjSLpFXGGRting@mail.gmail.com>
Message-ID: <554BA88C.7020003@gmail.com>



On 2015-05-07 1:42 PM, Guido van Rossum wrote:
> On Tue, May 5, 2015 at 3:36 PM, Rajiv Kumar <rajiv.kumar at gmail.com> wrote:
>
>> I wrote a little example[1] that has a bare-bones implementation of Go
>> style channels via a custom event loop. I used it to translate the prime
>> sieve example from Go[2] almost directly to Python. The code uses "message
>> = await channel.receive()" to mimic Go's "message <- channel". Instead of
>> using "go func()" to fire off a goroutine, I add the PEP492 coroutine to my
>> simple event loop.
>>
> Cool example!
>
> It's not an efficient implementation - really just a proof of concept that
>> you can use async/await in your own code without any reference to asyncio.
>> I ended up writing it as I was thinking about how PEP 342 style coroutines
>> might look like in an async/await world.
>>
>> In the course of writing this, I did find that it would be useful to have
>> the PEP document how event loops should advance the coroutines (via
>> .send(None) for example). It would also be helpful to have the semantics of
>> how await interacts with different kinds of awaitables documented. I had to
>> play with Yury's implementation to see what it does if the __await__ just
>> returns iter([1,2,3]) for example.
>>
> I've found this too. :-) Yury, perhaps you could show a brief example in
> the PEP of how to "drive" a coroutine from e.g. main()?

OK, will do!

Thanks,
Yury

From martin at v.loewis.de  Thu May  7 21:23:54 2015
From: martin at v.loewis.de ("Martin v. Löwis")
Date: Thu, 07 May 2015 21:23:54 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
Message-ID: <554BBBCA.9090909@v.loewis.de>

On 02.05.15 at 21:57, Adam Bartoš wrote:
> Even if sys.stdin contained a file-like object with a proper encoding
> attribute, it wouldn't work, since sys.stdin has to be an instance of <type
> 'file'>. So the question is whether it is possible to make a file instance
> in Python that is also customizable, so that it may call my code. To start
> with: how does one change the value of the encoding attribute of a file object?

If, by "in Python", you mean both "in pure Python", and "in Python 2",
then the answer is no. If you can add arbitrary C code, then you might
be able to hack your C library's stdio implementation to delegate fread
calls to your code.

I recommend using Python 3 instead.

Regards,
Martin


From mal at egenix.com  Fri May  8 10:52:20 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 08 May 2015 10:52:20 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var (was: Clarification of
 PEP 476 "opting out" section)
In-Reply-To: <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
Message-ID: <554C7944.2020905@egenix.com>

On 07.05.2015 04:30, Nick Coghlan wrote:
>> Can we please make the monkeypatch a regular part of Python's
>> site.py which can be enabled via an environment variable, say
>> export PYTHONHTTPSVERIFY=0.
>>
>> See http://bugs.python.org/issue23857 for the discussion.
> ...
> I actually do think it would be good to have such a feature as a
> native part of Python 2.7 in order to provide a nicer "revert to the
> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
> "there's no easy way to turn HTTPS certificate verification off
> globally" state of affairs to Python 3), but I don't currently have
> the time available to push for that against the "end users can't be
> trusted not to turn certificate verification off when they should be
> fixing their certificate management instead" perspective.

We're currently working on a new release of eGenix PyRun and this
will include Python 2.7.9.

We do want to add such an env switch to disable the cert verification,
so we would like to know whether we can use PYTHONHTTPSVERIFY for this
or not.

We mainly need this to reenable simple use of self-signed certificates
which 2.7.9 disables.
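For reference, the kind of hook under discussion might look roughly like this
(a sketch only: PYTHONHTTPSVERIFY is the proposed name, not an existing
variable, and the underscore-prefixed ssl attributes are the private hooks
that the PEP 476 opt-out recipe already relies on):

```python
import os
import ssl

def maybe_disable_https_verification(env=os.environ):
    # Honour a hypothetical PYTHONHTTPSVERIFY=0 by installing the
    # unverified context factory (ssl._create_unverified_context exists
    # since 2.7.9 / 3.4.3, per PEP 476).
    if env.get("PYTHONHTTPSVERIFY", "1") == "0":
        ssl._create_default_https_context = ssl._create_unverified_context
        return True
    return False
```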

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 08 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From ncoghlan at gmail.com  Fri May  8 11:36:48 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 8 May 2015 19:36:48 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var (was: Clarification of
 PEP 476 "opting out" section)
In-Reply-To: <554C7944.2020905@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
Message-ID: <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>

On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
>
> On 07.05.2015 04:30, Nick Coghlan wrote:
> >> Can we please make the monkeypatch a regular part of Python's
> >> site.py which can enabled via an environment variable, say
> >> export PYTHONHTTPSVERIFY=0.
> >>
> >> See http://bugs.python.org/issue23857 for the discussion.
> > ...
> > I actually do think it would be good to have such a feature as a
> > native part of Python 2.7 in order to provide a nicer "revert to the
> > pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
> > "there's no easy way to turn HTTPS certificate verification off
> > globally" state of affairs to Python 3), but I don't currently have
> > the time available to push for that against the "end users can't be
> > trusted not to turn certificate verification off when they should be
> > fixing their certificate management instead" perspective.
>
> We're currently working on a new release of eGenix PyRun and this
> will include Python 2.7.9.
>
> We do want to add such an env switch to disable the cert verification,
> so would like to know whether we can use PYTHONHTTPSVERIFY for this
> or not.

That's a slightly misleading quotation of my post, as I'm opposed to the
use of an environment variable for this, due to the fact that using the
"-E" switch will then revert to the upstream default behaviour of verifying
certificates, rather defeating the point of introducing the legacy
infrastructure compatibility feature in the first place.

A new informational PEP akin to PEP 394 that defines a config file location
& contents for downstream redistributors that need a smoother transition
plan for PEP 476 will let us handle this in a consistent way across
redistributors that's also compatible with runtime use of the -E switch.

Cheers,
Nick.

>
> We mainly need this to reenable simple use of self-signed certificates
> which 2.7.9 disables.
>
> --
> Marc-Andre Lemburg
> eGenix.com
>

From mal at egenix.com  Fri May  8 12:13:52 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 08 May 2015 12:13:52 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
Message-ID: <554C8C60.8000603@egenix.com>

On 08.05.2015 11:36, Nick Coghlan wrote:
> On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
>>
>> On 07.05.2015 04:30, Nick Coghlan wrote:
>>>> Can we please make the monkeypatch a regular part of Python's
>>>> site.py which can enabled via an environment variable, say
>>>> export PYTHONHTTPSVERIFY=0.
>>>>
>>>> See http://bugs.python.org/issue23857 for the discussion.
>>> ...
>>> I actually do think it would be good to have such a feature as a
>>> native part of Python 2.7 in order to provide a nicer "revert to the
>>> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
>>> "there's no easy way to turn HTTPS certificate verification off
>>> globally" state of affairs to Python 3), but I don't currently have
>>> the time available to push for that against the "end users can't be
>>> trusted not to turn certificate verification off when they should be
>>> fixing their certificate management instead" perspective.
>>
>> We're currently working on a new release of eGenix PyRun and this
>> will include Python 2.7.9.
>>
>> We do want to add such an env switch to disable the cert verification,
>> so would like to know whether we can use PYTHONHTTPSVERIFY for this
>> or not.
> 
> That's a slightly misleading quotation of my post, as I'm opposed to the
> use of an environment variable for this, due to the fact that using the
> "-E" switch will then revert to the upstream default behaviour of verifying
> certificates, rather defeating the point of introducing the legacy
> infrastructure compatibility feature in the first place.

Oh, sorry. I read your email as implying that you are fine with
the env var approach.

I don't really see the issue with -E, though. It's entirely possible
to internally set PYTHONHTTPSVERIFY=0 when -E is used, to regain
backwards compatibility by default for Python 2.7.

Regarding the config file approach and letting distributions
set their own defaults:

I don't think it's a good idea to have one distribution
default to verifying HTTPS certs via a global config file
and another distribution do the opposite.

Python itself should define the defaults to be used, not
the distributions.

The Python-on-Linux landscape is already too complex due to the
many different ways Python gets installed on those systems.

Not having to deal with this complexity was the main motivation
for us to create eGenix PyRun, since it works the same on
all Linux platforms and doesn't use any of the system
wide installed Python interpreters, settings or packages
(unless you tell it to).

> A new informational PEP akin to PEP 394 that defines a config file location
> & contents for downstream redistributors that need a smoother transition
> plan for PEP 476 will let us handle this in a consistent way across
> redistributors that's also compatible with runtime use of the -E switch.

Regardless of whether a global config file is a good idea or not,
I don't think we can hold up 2.7.10 until a whole new PEP process
has run its course.

> Cheers,
> Nick.
> 
>>
>> We mainly need this to reenable simple use of self-signed certificates
>> which 2.7.9 disables.
>>
>> --
>> Marc-Andre Lemburg
>> eGenix.com
>>
> 

-- 
Marc-Andre Lemburg
eGenix.com


From solipsis at pitrou.net  Fri May  8 13:50:29 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 8 May 2015 13:50:29 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
Message-ID: <20150508135029.11653a1e@fsol>


Using an environment variable to disable a security feature sounds like
an extremely bad idea. Environment variables are hidden state.
Generally you don't know up front which values they will have when
running an executable, and people don't think about inspecting them.
This opens the door to mistakes (or even attacks) where certificate
validation is disabled without the user knowing.

Regards

Antoine.



On Fri, 08 May 2015 12:13:52 +0200
"M.-A. Lemburg" <mal at egenix.com> wrote:
> On 08.05.2015 11:36, Nick Coghlan wrote:
> > On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
> >>
> >> On 07.05.2015 04:30, Nick Coghlan wrote:
> >>>> Can we please make the monkeypatch a regular part of Python's
> >>>> site.py which can enabled via an environment variable, say
> >>>> export PYTHONHTTPSVERIFY=0.
> >>>>
> >>>> See http://bugs.python.org/issue23857 for the discussion.
> >>> ...
> >>> I actually do think it would be good to have such a feature as a
> >>> native part of Python 2.7 in order to provide a nicer "revert to the
> >>> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
> >>> "there's no easy way to turn HTTPS certificate verification off
> >>> globally" state of affairs to Python 3), but I don't currently have
> >>> the time available to push for that against the "end users can't be
> >>> trusted not to turn certificate verification off when they should be
> >>> fixing their certificate management instead" perspective.
> >>
> >> We're currently working on a new release of eGenix PyRun and this
> >> will include Python 2.7.9.
> >>
> >> We do want to add such an env switch to disable the cert verification,
> >> so would like to know whether we can use PYTHONHTTPSVERIFY for this
> >> or not.
> > 
> > That's a slightly misleading quotation of my post, as I'm opposed to the
> > use of an environment variable for this, due to the fact that using the
> > "-E" switch will then revert to the upstream default behaviour of verifying
> > certificates, rather defeating the point of introducing the legacy
> > infrastructure compatibility feature in the first place.
> 
> Oh, sorry. I read your email implying that you are fine with
> the env var approach.
> 
> I don't really see the issue with -E, though. It's well possible
> to internally set PYTHONHTTPSVERIFY=0 when -E is used to regain
> backwards compatibility per default for Python 2.7.
> 
> Regarding the config file approach and letting distributions
> set their own defaults:
> 
> I don't think it's a good idea to have one distribution
> default to verifying HTTPS certs via a global config file
> and another distribution do the opposite.
> 
> Python itself should define the defaults to be used, not
> the distributions.
> 
> The Python Linux distribution is too complex already due to the
> many different ways Python is installed on the systems.
> 
> Not having to deal with this complexity was the main motivation
> for us to create eGenix PyRun, since it works the same on
> all Linux platforms and doesn't use any of the system
> wide installed Python interpreters, settings or packages
> (unless you tell it to).
> 
> > A new informational PEP akin to PEP 394 that defines a config file location
> > & contents for downstream redistributors that need a smoother transition
> > plan for PEP 476 will let us handle this in a consistent way across
> > redistributors that's also compatible with runtime use of the -E switch.
> 
> Regardless of whether a global config file is a good idea or not,
> I don't think we can wait with 2.7.10 until a whole new PEP process
> has run through.
> 
> > Cheers,
> > Nick.
> > 
> >>
> >> We mainly need this to reenable simple use of self-signed certificates
> >> which 2.7.9 disables.
> >>
> >> --
> >> Marc-Andre Lemburg
> >> eGenix.com
> >>
> > 
> 




From status at bugs.python.org  Fri May  8 18:08:25 2015
From: status at bugs.python.org (Python tracker)
Date: Fri,  8 May 2015 18:08:25 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150508160825.4E4C456682@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2015-05-01 - 2015-05-08)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issue counts and deltas:
  open    4838 ( -3)
  closed 31069 (+44)
  total  35907 (+41)

Open issues with patches: 2252 


Issues opened (25)
==================

#24107: Add support for retrieving the certificate chain
http://bugs.python.org/issue24107  opened by Lukasa

#24109: Documentation for difflib uses optparse
http://bugs.python.org/issue24109  opened by idahogray

#24110: zipfile.ZipFile.write() does not accept bytes arcname
http://bugs.python.org/issue24110  opened by july

#24111: Valgrind suppression file should be updated
http://bugs.python.org/issue24111  opened by Antony.Lee

#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114  opened by kees

#24115: PyObject_IsInstance() and PyObject_IsSubclass() can fail
http://bugs.python.org/issue24115  opened by serhiy.storchaka

#24116: --with-pydebug has no effect when the final python binary is c
http://bugs.python.org/issue24116  opened by aleb

#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117  opened by Ma Lin

#24119: Carry comments with the AST
http://bugs.python.org/issue24119  opened by brett.cannon

#24120: pathlib.(r)glob stops on PermissionDenied exception
http://bugs.python.org/issue24120  opened by Gregorio

#24124: Two versions of instructions for installing Python modules
http://bugs.python.org/issue24124  opened by skip.montanaro

#24126: newlines attribute does not get set after calling readline()
http://bugs.python.org/issue24126  opened by arekfu

#24127: Fatal error in launcher: Job information querying failed
http://bugs.python.org/issue24127  opened by gavstar

#24129: Incorrect (misleading) statement in the execution model docume
http://bugs.python.org/issue24129  opened by levkivskyi

#24130: Remove -fno-common compile option from OS X framework builds?
http://bugs.python.org/issue24130  opened by ned.deily

#24131: [configparser] Add section/option delimiter to ExtendedInterpo
http://bugs.python.org/issue24131  opened by giflw

#24132: Direct sub-classing of pathlib.Path
http://bugs.python.org/issue24132  opened by projetmbc

#24136: document PEP 448
http://bugs.python.org/issue24136  opened by benjamin.peterson

#24137: Force not using _default_root in IDLE
http://bugs.python.org/issue24137  opened by serhiy.storchaka

#24138: Speed up range() by caching and modifying long objects
http://bugs.python.org/issue24138  opened by larry

#24139: Use sqlite3 extended error codes
http://bugs.python.org/issue24139  opened by Dima.Tisnek

#24140: In pdb using "until X" doesn't seem to have effect in commands
http://bugs.python.org/issue24140  opened by vyktor

#24142: ConfigParser._read doesn't join multi-line values collected wh
http://bugs.python.org/issue24142  opened by fhoech

#24143: Makefile in tarball don't provide make uninstall target
http://bugs.python.org/issue24143  opened by krichter

#24145: Support |= for parameters in converters
http://bugs.python.org/issue24145  opened by larry



Most recent 15 issues with no replies (15)
==========================================

#24143: Makefile in tarball don't provide make uninstall target
http://bugs.python.org/issue24143

#24140: In pdb using "until X" doesn't seem to have effect in commands
http://bugs.python.org/issue24140

#24137: Force not using _default_root in IDLE
http://bugs.python.org/issue24137

#24136: document PEP 448
http://bugs.python.org/issue24136

#24131: [configparser] Add section/option delimiter to ExtendedInterpo
http://bugs.python.org/issue24131

#24129: Incorrect (misleading) statement in the execution model docume
http://bugs.python.org/issue24129

#24115: PyObject_IsInstance() and PyObject_IsSubclass() can fail
http://bugs.python.org/issue24115

#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114

#24111: Valgrind suppression file should be updated
http://bugs.python.org/issue24111

#24104: Use after free in xmlparser_setevents (2)
http://bugs.python.org/issue24104

#24103: Use after free in xmlparser_setevents (1)
http://bugs.python.org/issue24103

#24097: Use after free in PyObject_GetState
http://bugs.python.org/issue24097

#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087

#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084

#24063: Support Mageia and Arch Linux in the platform module
http://bugs.python.org/issue24063



Most recent 15 issues waiting for review (15)
=============================================

#24145: Support |= for parameters in converters
http://bugs.python.org/issue24145

#24142: ConfigParser._read doesn't join multi-line values collected wh
http://bugs.python.org/issue24142

#24138: Speed up range() by caching and modifying long objects
http://bugs.python.org/issue24138

#24130: Remove -fno-common compile option from OS X framework builds?
http://bugs.python.org/issue24130

#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117

#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114

#24109: Documentation for difflib uses optparse
http://bugs.python.org/issue24109

#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102

#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091

#24087: Documentation doesn't explain the term "coroutine" (PEP 342)
http://bugs.python.org/issue24087

#24084: pstats: sub-millisecond display
http://bugs.python.org/issue24084

#24082: Obsolete note in argument parsing (c-api/arg.rst)
http://bugs.python.org/issue24082

#24076: sum() several times slower on Python 3
http://bugs.python.org/issue24076

#24068: statistics module - incorrect results with boolean input
http://bugs.python.org/issue24068

#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064



Top 10 most discussed issues (10)
=================================

#22906: PEP 479: Change StopIteration handling inside generators
http://bugs.python.org/issue22906  15 msgs

#20179: Derby #10: Convert 50 sites to Argument Clinic across 4 files
http://bugs.python.org/issue20179  13 msgs

#24132: Direct sub-classing of pathlib.Path
http://bugs.python.org/issue24132  11 msgs

#24127: Fatal error in launcher: Job information querying failed
http://bugs.python.org/issue24127   9 msgs

#22881: show median in benchmark results
http://bugs.python.org/issue22881   7 msgs

#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102   6 msgs

#21800: Implement RFC 6855 (IMAP Support for UTF-8) in imaplib.
http://bugs.python.org/issue21800   5 msgs

#23888: Fixing fractional expiry time bug in cookiejar
http://bugs.python.org/issue23888   5 msgs

#24085: large memory overhead when pyc is recompiled
http://bugs.python.org/issue24085   5 msgs

#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117   5 msgs



Issues closed (43)
==================

#2292: Missing *-unpacking generalizations
http://bugs.python.org/issue2292  closed by benjamin.peterson

#20148: Derby: Convert the _sre module to use Argument Clinic
http://bugs.python.org/issue20148  closed by serhiy.storchaka

#20168: Derby: Convert the _tkinter module to use Argument Clinic
http://bugs.python.org/issue20168  closed by serhiy.storchaka

#20274: sqlite module has bad argument parsing code, including undefin
http://bugs.python.org/issue20274  closed by larry

#21520: Erroneous zipfile test failure if the string 'bad' appears in 
http://bugs.python.org/issue21520  closed by larry

#22334: test_tcl.test_split() fails on "x86 FreeBSD 7.2 3.x" buildbot
http://bugs.python.org/issue22334  closed by serhiy.storchaka

#23330: h2py.py regular expression missing
http://bugs.python.org/issue23330  closed by serhiy.storchaka

#23880: Tkinter: getint and getdouble should support Tcl_Obj
http://bugs.python.org/issue23880  closed by serhiy.storchaka

#23911: Move path-based bootstrap code to a separate frozen file.
http://bugs.python.org/issue23911  closed by eric.snow

#23920: Should Clinic have "nullable" or types=NoneType?
http://bugs.python.org/issue23920  closed by larry

#24000: More fixes for the Clinic mapping of converters to format unit
http://bugs.python.org/issue24000  closed by larry

#24001: Clinic: use raw types in types= set
http://bugs.python.org/issue24001  closed by larry

#24051: Argument Clinic no longer works with single optional argument
http://bugs.python.org/issue24051  closed by serhiy.storchaka

#24060: Clearify necessities for logging with timestamps
http://bugs.python.org/issue24060  closed by python-dev

#24066: send_message should take all the addresses in the To: header i
http://bugs.python.org/issue24066  closed by kirelagin

#24081: Obsolete caveat in reload() docs
http://bugs.python.org/issue24081  closed by r.david.murray

#24088: yield expression confusion
http://bugs.python.org/issue24088  closed by gvanrossum

#24089: argparse crashes with AssertionError
http://bugs.python.org/issue24089  closed by ned.deily

#24092: Use after free in Element.extend (2)
http://bugs.python.org/issue24092  closed by serhiy.storchaka

#24093: Use after free in Element.remove
http://bugs.python.org/issue24093  closed by serhiy.storchaka

#24094: Use after free during json encoding (PyType_IsSubtype)
http://bugs.python.org/issue24094  closed by python-dev

#24095: Use after free during json encoding a dict (2)
http://bugs.python.org/issue24095  closed by benjamin.peterson

#24096: Use after free in get_filter
http://bugs.python.org/issue24096  closed by python-dev

#24099: Use after free in siftdown (1)
http://bugs.python.org/issue24099  closed by rhettinger

#24100: Use after free in siftdown (2)
http://bugs.python.org/issue24100  closed by rhettinger

#24101: Use after free in siftup
http://bugs.python.org/issue24101  closed by rhettinger

#24105: Use after free during json encoding a dict (3)
http://bugs.python.org/issue24105  closed by benjamin.peterson

#24106: Messed up indentation makes undesired piece of code being run!
http://bugs.python.org/issue24106  closed by r.david.murray

#24108: fnmatch.translate('*.txt') fails
http://bugs.python.org/issue24108  closed by r.david.murray

#24112: %b does not work, as a binary output formatter
http://bugs.python.org/issue24112  closed by steven.daprano

#24113: shlex constructor unreachable code
http://bugs.python.org/issue24113  closed by rhettinger

#24118: http.client example is no longer valid
http://bugs.python.org/issue24118  closed by python-dev

#24121: collections page doesn't mention that deques are mutable
http://bugs.python.org/issue24121  closed by rhettinger

#24122: Install fails after configure sets the extending/embedding ins
http://bugs.python.org/issue24122  closed by doko

#24123: Python 2.7 Tutorial Conflicting behavior with WeakValueDiction
http://bugs.python.org/issue24123  closed by jessembacon

#24125: Fix for #23865 breaks docutils
http://bugs.python.org/issue24125  closed by serhiy.storchaka

#24128: Documentation links are forwarded to Python 2
http://bugs.python.org/issue24128  closed by r.david.murray

#24133: Add 'composable' decorator to functools (with @ matrix multipl
http://bugs.python.org/issue24133  closed by r.david.murray

#24134: assertRaises can behave differently
http://bugs.python.org/issue24134  closed by serhiy.storchaka

#24135: Policy for altering sys.path
http://bugs.python.org/issue24135  closed by r.david.murray

#24141: Python 3 ships an outdated valgrind suppressison file.
http://bugs.python.org/issue24141  closed by ned.deily

#24144: Docs discourage use of binascii.unhexlify etc.
http://bugs.python.org/issue24144  closed by r.david.murray

#24146: ast.literal_eval doesn't support the Python ternary operator
http://bugs.python.org/issue24146  closed by r.david.murray

From benjamin at python.org  Fri May  8 18:39:40 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Fri, 08 May 2015 12:39:40 -0400
Subject: [Python-Dev] coming soon: 2.7.10
Message-ID: <1431103180.1273453.264534077.4FAD289B@webmail.messagingengine.com>

In the spirit of regular releases, it's time to release 2.7.10. I plan
to cut rc1 this weekend, with the final following in 2 weeks.

I apologize for the short notice; time has crept up on me, and I have
commitments in June that prevent pushing releases into that month.

From ncoghlan at gmail.com  Sat May  9 01:56:25 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 9 May 2015 09:56:25 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <20150508135029.11653a1e@fsol>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com> <20150508135029.11653a1e@fsol>
Message-ID: <CADiSq7dcrz+37CXvM9h+1-H9KjCM7dJ5ZnnL9NFME4QtEucDUQ@mail.gmail.com>

On 8 May 2015 9:52 pm, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>
>
> Using an environment variable to disable a security feature sounds like
> an extremely bad idea. Environment variables are hidden state.
> Generally you don't know up front which values they will have when
> running an executable, and people don't think about inspecting them.
> This opens the door to mistakes (or even attacks) where certificate
> validation is disabled without the user knowing.

Yes, that's also a consideration. A config file lets us bring the full
battery of Linux security tools (most obviously file system permissions,
but also other systems like SELinux & AppArmor) to bear on controlling who
(and what) has permission to change the default security settings.
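
[A config-file gate of the sort described above might look like the
following sketch. The path, section, and option names here are purely
illustrative assumptions, not an actual Python feature; the point is
that a root-owned file, protected by ordinary file permissions and
systems like SELinux or AppArmor, controls the default rather than an
environment variable that any caller can set.]

```python
import configparser  # a Python 2 variant would use the ConfigParser module


def https_verification_enabled(path="/etc/python/cert-verification.cfg"):
    """Illustrative sketch of a config-file gate for HTTPS verification.

    Returns False only if the (hypothetical) config file explicitly
    disables verification; in every other case, including a missing or
    unreadable file, the secure default is kept.
    """
    parser = configparser.ConfigParser()
    if not parser.read(path):
        return True  # no config file present: keep the secure default
    if not parser.has_section("https"):
        return True
    return parser.get("https", "verify", fallback="enable") != "disable"
```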

Cheers,
Nick.

>
> Regards
>
> Antoine.
>
>
>
> On Fri, 08 May 2015 12:13:52 +0200
> "M.-A. Lemburg" <mal at egenix.com> wrote:
> > On 08.05.2015 11:36, Nick Coghlan wrote:
> > > On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
> > >>
> > >> On 07.05.2015 04:30, Nick Coghlan wrote:
> > >>>> Can we please make the monkeypatch a regular part of Python's
> > >>>> site.py which can enabled via an environment variable, say
> > >>>> export PYTHONHTTPSVERIFY=0.
> > >>>>
> > >>>> See http://bugs.python.org/issue23857 for the discussion.
> > >>> ...
> > >>> I actually do think it would be good to have such a feature as a
> > >>> native part of Python 2.7 in order to provide a nicer "revert to the
> > >>> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
> > >>> "there's no easy way to turn HTTPS certificate verification off
> > >>> globally" state of affairs to Python 3), but I don't currently have
> > >>> the time available to push for that against the "end users can't be
> > >>> trusted not to turn certificate verification off when they should be
> > >>> fixing their certificate management instead" perspective.
> > >>
> > >> We're currently working on a new release of eGenix PyRun and this
> > >> will include Python 2.7.9.
> > >>
> > >> We do want to add such an env switch to disable the cert verification,
> > >> so would like to know whether we can use PYTHONHTTPSVERIFY for this
> > >> or not.
> > >
> > > That's a slightly misleading quotation of my post, as I'm opposed to the
> > > use of an environment variable for this, due to the fact that using the
> > > "-E" switch will then revert to the upstream default behaviour of verifying
> > > certificates, rather defeating the point of introducing the legacy
> > > infrastructure compatibility feature in the first place.
> >
> > Oh, sorry. I read your email implying that you are fine with
> > the env var approach.
> >
> > I don't really see the issue with -E, though. It's well possible
> > to internally set PYTHONHTTPSVERIFY=0 when -E is used to regain
> > backwards compatibility per default for Python 2.7.
> >
> > Regarding the config file approach and letting distributions
> > set their own defaults:
> >
> > I don't think it's a good idea to have one distribution
> > default to verifying HTTPS certs via a global config file
> > and another distribution do the opposite.
> >
> > Python itself should define the defaults to be used, not
> > the distributions.
> >
> > The Python Linux distribution is too complex already due to the
> > many different ways Python is installed on the systems.
> >
> > Not having to deal with this complexity was the main motivation
> > for us to create eGenix PyRun, since it works the same on
> > all Linux platforms and doesn't use any of the system
> > wide installed Python interpreters, settings or packages
> > (unless you tell it to).
> >
> > > A new informational PEP akin to PEP 394 that defines a config file location
> > > & contents for downstream redistributors that need a smoother transition
> > > plan for PEP 476 will let us handle this in a consistent way across
> > > redistributors that's also compatible with runtime use of the -E switch.
> >
> > Regardless of whether a global config file is a good idea or not,
> > I don't think we can wait with 2.7.10 until a whole new PEP process
> > has run through.
> >
> > > Cheers,
> > > Nick.
> > >
> > >>
> > >> We mainly need this to reenable simple use of self-signed certificates
> > >> which 2.7.9 disables.
> > >>
> > >> --
> > >> Marc-Andre Lemburg
> > >> eGenix.com
> > >>
> > >> Professional Python Services directly from the Source  (#1, May 08 2015)
> > >>>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
> > >>>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
> > >>>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
> > >> ________________________________________________________________________
> > >>
> > >> ::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::
> > >>
> > >>    eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
> > >>     D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
> > >>            Registered at Amtsgericht Duesseldorf: HRB 46611
> > >>                http://www.egenix.com/company/contact/
> > >
> >
>
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/90063441/attachment.html>

From ncoghlan at gmail.com  Sat May  9 02:29:43 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 9 May 2015 10:29:43 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <554C8C60.8000603@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
Message-ID: <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>

On 8 May 2015 8:14 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
>
> On 08.05.2015 11:36, Nick Coghlan wrote:
> > On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
> >>
> >> On 07.05.2015 04:30, Nick Coghlan wrote:
> >>>> Can we please make the monkeypatch a regular part of Python's
> >>>> site.py which can enabled via an environment variable, say
> >>>> export PYTHONHTTPSVERIFY=0.
> >>>>
> >>>> See http://bugs.python.org/issue23857 for the discussion.
> >>> ...
> >>> I actually do think it would be good to have such a feature as a
> >>> native part of Python 2.7 in order to provide a nicer "revert to the
> >>> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
> >>> "there's no easy way to turn HTTPS certificate verification off
> >>> globally" state of affairs to Python 3), but I don't currently have
> >>> the time available to push for that against the "end users can't be
> >>> trusted not to turn certificate verification off when they should be
> >>> fixing their certificate management instead" perspective.
> >>
> >> We're currently working on a new release of eGenix PyRun and this
> >> will include Python 2.7.9.
> >>
> >> We do want to add such an env switch to disable the cert verification,
> >> so would like to know whether we can use PYTHONHTTPSVERIFY for this
> >> or not.
> >
> > That's a slightly misleading quotation of my post, as I'm opposed to the
> > use of an environment variable for this, due to the fact that using the
> > "-E" switch will then revert to the upstream default behaviour of verifying
> > certificates, rather defeating the point of introducing the legacy
> > infrastructure compatibility feature in the first place.
>
> Oh, sorry. I read your email implying that you are fine with
> the env var approach.
>
> I don't really see the issue with -E, though. It's well possible
> to internally set PYTHONHTTPSVERIFY=0 when -E is used to regain
> backwards compatibility per default for Python 2.7.
>
> Regarding the config file approach and letting distributions
> set their own defaults:
>
> I don't think it's a good idea to have one distribution
> default to verifying HTTPS certs via a global config file
> and another distribution do the opposite.
>
> Python itself should define the defaults to be used, not
> the distributions.

As a result of the discussions around both PEP 466 and 476, I'm now firmly
of the view that it's correct for the upstream Python community to assume
the public internet as its standard operating environment when it comes to
network security settings, and for those of us working for commercial
redistributors to subsequently bear the cost of adapting from that upstream
assumption to the different assumptions that may apply in the context of
organisational intranets.

That's also why I've come around to the view that an informational PEP with
recommendations for redistributors, rather than an actual change to Python
2.7, is the right answer (at least initially) when it comes to providing a
smoother transition plan for PEP 476 - the folks saying "it's a bad idea to
make this easy to turn off" are *right* from the perspective of operating
over the public internet, or with well designed internal SSL/TLS
certificate management; it's just also a *necessary* idea (in my view) to
help CIOs and other infrastructure leads responsibly and effectively manage
the wide range of risks associated with internal infrastructure upgrades.

Regards,
Nick.

>
> The Python Linux distribution is too complex already due to the
> many different ways Python is installed on the systems.
>
> Not having to deal with this complexity was the main motivation
> for us to create eGenix PyRun, since it works the same on
> all Linux platforms and doesn't use any of the system
> wide installed Python interpreters, settings or packages
> (unless you tell it to).
>
> > A new informational PEP akin to PEP 394 that defines a config file location
> > & contents for downstream redistributors that need a smoother transition
> > plan for PEP 476 will let us handle this in a consistent way across
> > redistributors that's also compatible with runtime use of the -E switch.
>
> Regardless of whether a global config file is a good idea or not,
> I don't think we can wait with 2.7.10 until a whole new PEP process
> has run through.
>
> > Cheers,
> > Nick.
> >
> >>
> >> We mainly need this to reenable simple use of self-signed certificates
> >> which 2.7.9 disables.
> >>
> >> --
> >> Marc-Andre Lemburg
> >> eGenix.com
> >>
> >> Professional Python Services directly from the Source  (#1, May 08 2015)
> >>>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
> >>>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
> >>>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
> >>
> >> ________________________________________________________________________
> >>
> >> ::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::
> >>
> >>    eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
> >>     D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
> >>            Registered at Amtsgericht Duesseldorf: HRB 46611
> >>                http://www.egenix.com/company/contact/
> >
>
> --
> Marc-Andre Lemburg
> eGenix.com
>
> Professional Python Services directly from the Source  (#1, May 08 2015)
> >>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
> >>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
> >>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
> ________________________________________________________________________
>
> ::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::
>
>    eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
>     D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
>            Registered at Amtsgericht Duesseldorf: HRB 46611
>                http://www.egenix.com/company/contact/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/0b34c33c/attachment-0001.html>

From drekin at gmail.com  Sat May  9 14:39:46 2015
From: drekin at gmail.com (Adam Bartoš)
Date: Sat, 9 May 2015 14:39:46 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <554BBBCA.9090909@v.loewis.de>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
 <554BBBCA.9090909@v.loewis.de>
Message-ID: <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>

I already have a solution in Python 3 (see
https://github.com/Drekin/win-unicode-console,
https://pypi.python.org/pypi/win_unicode_console), I was just considering
adding support for Python 2 as well. I think I have a working example in
Python 2 using ctypes.

On Thu, May 7, 2015 at 9:23 PM, "Martin v. Löwis" <martin at v.loewis.de>
wrote:

> On 02.05.15 at 21:57, Adam Bartoš wrote:
> > Even if sys.stdin contained a file-like object with proper encoding
> > attribute, it wouldn't work since sys.stdin has to be instance of <type
> > 'file'>. So the question is, whether it is possible to make a file
> instance
> > in Python that is also customizable so it may call my code. For the first
> > thing, how to change the value of encoding attribute of a file object.
>
> If, by "in Python", you mean both "in pure Python", and "in Python 2",
> then the answer is no. If you can add arbitrary C code, then you might
> be able to hack your C library's stdio implementation to delegate fread
> calls to your code.
>
> I recommend to use Python 3 instead.
>
> Regards,
> Martin
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/eb395cff/attachment.html>

From v+python at g.nevcal.com  Sat May  9 19:22:53 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Sat, 09 May 2015 10:22:53 -0700
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
 <554BBBCA.9090909@v.loewis.de>
 <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
Message-ID: <554E426D.40307@g.nevcal.com>

On 5/9/2015 5:39 AM, Adam Bartoš wrote:
> I already have a solution in Python 3 (see 
> https://github.com/Drekin/win-unicode-console, 
> https://pypi.python.org/pypi/win_unicode_console), I was just 
> considering adding support for Python 2 as well. I think I have a 
> working example in Python 2 using ctypes.

Is this going to get released in 3.5, I hope?  Python 3 is pretty 
limited without some solution for Unicode on the console... probably the 
biggest deficiency I have found in Python 3, since its introduction. It 
has great Unicode support for files and processing, which convinced me 
to switch from Perl, and I like so much else about it, that I can hardly 
code in Perl any more (I still support a few Perl programs, but have 
ported most of them to Python).

I wondered if all your recent questions about Py 2 were as a result of 
porting the above to Py 2... I only have one program left that I was 
forced to write in Py 2 because of library dependencies, and I think 
that library is finally being ported to Py 3, whew!  So while I laud 
your efforts, and no doubt they will benefit some folks for a few years 
yet, I hope never to use your Py 2 port myself!
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/8c5b37df/attachment.html>

From mal at egenix.com  Sat May  9 20:13:59 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Sat, 09 May 2015 20:13:59 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
Message-ID: <554E4E67.4040405@egenix.com>

On 09.05.2015 02:29, Nick Coghlan wrote:
> On 8 May 2015 8:14 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
>>
>> On 08.05.2015 11:36, Nick Coghlan wrote:
>>> On 8 May 2015 6:52 pm, "M.-A. Lemburg" <mal at egenix.com> wrote:
>>>>
>>>> On 07.05.2015 04:30, Nick Coghlan wrote:
>>>>>> Can we please make the monkeypatch a regular part of Python's
>>>>>> site.py which can enabled via an environment variable, say
>>>>>> export PYTHONHTTPSVERIFY=0.
>>>>>>
>>>>>> See http://bugs.python.org/issue23857 for the discussion.
>>>>> ...
>>>>> I actually do think it would be good to have such a feature as a
>>>>> native part of Python 2.7 in order to provide a nicer "revert to the
>>>>> pre-PEP-476 behaviour" experience for Python 2.7 users (leaving the
>>>>> "there's no easy way to turn HTTPS certificate verification off
>>>>> globally" state of affairs to Python 3), but I don't currently have
>>>>> the time available to push for that against the "end users can't be
>>>>> trusted not to turn certificate verification off when they should be
>>>>> fixing their certificate management instead" perspective.
>>>>
>>>> We're currently working on a new release of eGenix PyRun and this
>>>> will include Python 2.7.9.
>>>>
>>>> We do want to add such an env switch to disable the cert verification,
>>>> so would like to know whether we can use PYTHONHTTPSVERIFY for this
>>>> or not.
>>>
>>> That's a slightly misleading quotation of my post, as I'm opposed to the
>>> use of an environment variable for this, due to the fact that using the
>>> "-E" switch will then revert to the upstream default behaviour of verifying
>>> certificates, rather defeating the point of introducing the legacy
>>> infrastructure compatibility feature in the first place.
>>
>> Oh, sorry. I read your email implying that you are fine with
>> the env var approach.
>>
>> I don't really see the issue with -E, though. It's well possible
>> to internally set PYTHONHTTPSVERIFY=0 when -E is used to regain
>> backwards compatibility per default for Python 2.7.
>>
>> Regarding the config file approach and letting distributions
>> set their own defaults:
>>
>> I don't think it's a good idea to have one distribution
>> default to verifying HTTPS certs via a global config file
>> and another distribution do the opposite.
>>
>> Python itself should define the defaults to be used, not
>> the distributions.
> 
> As a result of the discussions around both PEP 466 and 476, I'm now firmly
> of the view that it's correct for the upstream Python community to assume
> the public internet as its standard operating environment when it comes to
> network security settings, and for those of us working for commercial
> redistributors to subsequently bear the cost of adapting from that upstream
> assumption to the different assumptions that may apply in the context of
> organisational intranets.
> 
> That's also why I've come around to the view that an informational PEP with
> recommendations for redistributors, rather than an actual change to Python
> 2.7, is the right answer (at least initially) when it comes to providing a
> smoother transition plan for PEP 476 - the folks saying "it's a bad idea to
> make this easy to turn off" are *right* from the perspective of operating
> over the public internet, or with well designed internal SSL/TLS
> certificate management, it's just also a *necessary* idea (in my view) to
> help CIOs and other infrastructure leads responsibly and effectively manage
> the wide range of risks associated with internal infrastructure upgrades.

I don't agree. We've broken the contract that people had with Python 2.7
by introducing a major breakage in a patch level release very far
down the line in 2.7.9, without providing an easy and official
way to opt out that does not involve hacking your installation.

IMO, we should not fall for the somewhat arrogant view that
we know better than all the Python users out there when it
comes to running secure systems.

By providing a way to intentionally switch off the new default,
we do make people aware of the risks and that's good enough,
while still maintaining the contract people rightly expect of
patch level releases of Python.
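
[For concreteness, the opt-out under discussion would amount to only a
few lines in site.py. The sketch below is illustrative only: at this
point in the thread PYTHONHTTPSVERIFY is just a proposed variable name,
not a shipped feature. The `ssl._create_unverified_context` hook it
uses is, however, the private hook that 2.7.9+ actually provides for
restoring the pre-PEP-476 behaviour.]

```python
import os
import ssl


def apply_https_verify_override(environ=os.environ):
    """Sketch of the proposed site.py hook.

    If the (proposed) PYTHONHTTPSVERIFY variable is set to "0", restore
    the pre-PEP-476 behaviour of not verifying HTTPS certificates by
    default. Returns True if verification was disabled.
    """
    if environ.get("PYTHONHTTPSVERIFY", "1") == "0":
        # Swap in the non-verifying context factory shipped in 2.7.9+.
        ssl._create_default_https_context = ssl._create_unverified_context
        return True
    return False
```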

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 09 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From storchaka at gmail.com  Sat May  9 21:01:23 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sat, 09 May 2015 22:01:23 +0300
Subject: [Python-Dev] Free lists
Message-ID: <milli3$4ln$1@ger.gmane.org>

Here are statistics on the types for which PyObject_INIT or 
PyObject_INIT_VAR is called most often (collected while running the 
Python test suite on 32-bit Linux).

type                                count       %   acc.%

builtin_function_or_method      116012007  36.29%  36.29%
method                           52465386  16.41%  52.70%
int                              42828741  13.40%  66.09%
str                              37017098  11.58%  77.67%
generator                        14026583   4.39%  82.06%
list_iterator                     8731329   2.73%  84.79%
bytes                             7217934   2.26%  87.04%
tuple_iterator                    5042563   1.58%  88.62%
float                             4672980   1.46%  90.08%
set                               3319699   1.04%  91.12%
_io.StringIO                      3000369   0.94%  92.06%
str_iterator                      2126838   0.67%  92.73%
list                              2031059   0.64%  93.36%
dict                              1691993   0.53%  93.89%
method-wrapper                    1573139   0.49%  94.38%
function                          1472062   0.46%  94.84%
traceback                         1388278   0.43%  95.28%
tuple                             1132071   0.35%  95.63%
memoryview                        1092173   0.34%  95.97%
cell                              1049496   0.33%  96.30%
managedbuffer                     1036889   0.32%  96.63%
bytearray                          711969   0.22%  96.85%
range_iterator                     496924   0.16%  97.00%
range                              483971   0.15%  97.15%
super                              472447   0.15%  97.30%
map                                449567   0.14%  97.44%
frame                              427320   0.13%  97.58%
set_iterator                       423392   0.13%  97.71%
Leaf                               398705   0.12%  97.83%
symtable                           374412   0.12%  97.95%

Types for which free lists are already used: builtin_function_or_method, 
method, float, tuple, list, dict, and frame. Some free list implementations 
(e.g. for tuple) don't call PyObject_INIT/PyObject_INIT_VAR; that is why 
the numbers for tuple are so low.

Perhaps it is worth adding free lists for other types: int, str, bytes, 
generator, and list and tuple iterators?

Shortened tables for variable-sized objects (those that call PyObject_INIT_VAR):

int                              42828741  13.40%
                              0     425353   0.99%   0.99%
                              1   21399290  49.96%  50.96%
                              2   10496856  24.51%  75.47%
                              3    4873346  11.38%  86.85%
                              4    1021563   2.39%  89.23%
                              5    1246444   2.91%  92.14%
                              6     733676   1.71%  93.85%
                              7     123074   0.29%  94.14%
                              8     139203   0.33%  94.47%
...

bytes                             7217934   2.26%
                              0        842   0.01%   0.01%
                              1     179469   2.49%   2.50%
                              2     473306   6.56%   9.06%
                              3     254968   3.53%  12.59%
                              4    1169164  16.20%  28.79%
                              5      72806   1.01%  29.79%
                              6     128668   1.78%  31.58%
                              7     169694   2.35%  33.93%
                              8     155154   2.15%  36.08%
                              9      67320   0.93%  37.01%
                             10      51703   0.72%  37.73%
                             11      42574   0.59%  38.32%
                             12     108947   1.51%  39.83%
                             13      40812   0.57%  40.39%
                             14     126783   1.76%  42.15%
                             15      37873   0.52%  42.67%
                             16     447482   6.20%  48.87%
                             17     194320   2.69%  51.56%
                             18     251685   3.49%  55.05%
                             19     159435   2.21%  57.26%
                             20     212521   2.94%  60.20%
...
                             31      18751   0.26%  67.32%
                             32     159781   2.21%  69.54%
                             33       8332   0.12%  69.65%
...
                             63      19841   0.27%  79.21%
                             64     144982   2.01%  81.22%
                             65       5216   0.07%  81.29%
...
                            127       1354   0.02%  85.44%
                            128     376539   5.22%  90.66%
                            129      17468   0.24%  90.90%
...
                            255        178   0.00%  92.39%
                            256      11993   0.17%  92.55%
                            257        124   0.00%  92.56%
...


From larry at hastings.org  Sat May  9 21:51:09 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 09 May 2015 12:51:09 -0700
Subject: [Python-Dev] Free lists
In-Reply-To: <milli3$4ln$1@ger.gmane.org>
References: <milli3$4ln$1@ger.gmane.org>
Message-ID: <554E652D.1000104@hastings.org>

On 05/09/2015 12:01 PM, Serhiy Storchaka wrote:
> Here is a statistic for most called PyObject_INIT or PyObject_INIT_VAR 
> for types (collected during running Python tests on 32-bit Linux).

Can you produce these statistics for a 64-bit build?


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/8f0265b4/attachment.html>

From storchaka at gmail.com  Sun May 10 00:06:18 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 10 May 2015 01:06:18 +0300
Subject: [Python-Dev] Free lists
In-Reply-To: <554E652D.1000104@hastings.org>
References: <milli3$4ln$1@ger.gmane.org> <554E652D.1000104@hastings.org>
Message-ID: <mim0cq$2hh$1@ger.gmane.org>

On 09.05.15 22:51, Larry Hastings wrote:
> On 05/09/2015 12:01 PM, Serhiy Storchaka wrote:
>> Here is a statistic for most called PyObject_INIT or PyObject_INIT_VAR
>> for types (collected during running Python tests on 32-bit Linux).
>
> Can you produce these statistics for a 64-bit build?

Sorry, no. All my computers run 32-bit Linux.


From graffatcolmingov at gmail.com  Sun May 10 01:25:49 2015
From: graffatcolmingov at gmail.com (Ian Cordasco)
Date: Sat, 9 May 2015 18:25:49 -0500
Subject: [Python-Dev] Free lists
In-Reply-To: <mim0cq$2hh$1@ger.gmane.org>
References: <milli3$4ln$1@ger.gmane.org> <554E652D.1000104@hastings.org>
 <mim0cq$2hh$1@ger.gmane.org>
Message-ID: <CAN-Kwu2S_Ly1ueWbRSVfQZTYVoAnmX9n054kc62j+UR5zC8rRg@mail.gmail.com>

On May 9, 2015 5:07 PM, "Serhiy Storchaka" <storchaka at gmail.com> wrote:
>
> On 09.05.15 22:51, Larry Hastings wrote:
>>
>> On 05/09/2015 12:01 PM, Serhiy Storchaka wrote:
>>>
>>> Here is a statistic for most called PyObject_INIT or PyObject_INIT_VAR
>>> for types (collected during running Python tests on 32-bit Linux).
>>
>>
>> Can you produce these statistics for a 64-bit build?
>
>
> Sorry, no. All my computers are ran under 32-bit Linux.
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/graffatcolmingov%40gmail.com

Can you share how you gathered them so someone could run them on a 64-bit
build?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150509/4f40418d/attachment.html>

From rosuav at gmail.com  Sun May 10 01:44:13 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Sun, 10 May 2015 09:44:13 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <554E4E67.4040405@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
Message-ID: <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>

On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
> By providing a way to intentionally switch off the new default,
> we do make people aware of the risks and that's good enough,
> while still maintaining the contract people rightly expect of
> patch level releases of Python.

Just as long as it's the sysadmin, and NOT some random attacker over
the internet, who has the power to downgrade security. Environment
variables can be attacked in various ways.

ChrisA

From robertc at robertcollins.net  Sun May 10 05:04:44 2015
From: robertc at robertcollins.net (Robert Collins)
Date: Sun, 10 May 2015 15:04:44 +1200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
Message-ID: <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>

On 10 May 2015 at 11:44, Chris Angelico <rosuav at gmail.com> wrote:
> On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>> By providing a way to intentionally switch off the new default,
>> we do make people aware of the risks and that's good enough,
>> while still maintaining the contract people rightly expect of
>> patch level releases of Python.
>
> Just as long as it's the sysadmin, and NOT some random attacker over
> the internet, who has the power to downgrade security. Environment
> variables can be attacked in various ways.

They can, and the bash fun (Shellshock) was very good evidence of that.

OTOH if someone's environment is at risk, PATH and PYTHONPATH are
already very effective attack vectors.

-Rob

-- 
Robert Collins <rbtcollins at hp.com>
Distinguished Technologist
HP Converged Cloud

From ncoghlan at gmail.com  Sun May 10 06:07:24 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 10 May 2015 14:07:24 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
Message-ID: <CADiSq7eD7B1R0sWUmZu-GRLyzDSEhRgJqUN3AP3u7zGZ3XVCfw@mail.gmail.com>

On 10 May 2015 at 13:04, Robert Collins <robertc at robertcollins.net> wrote:
> On 10 May 2015 at 11:44, Chris Angelico <rosuav at gmail.com> wrote:
>> On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>> By providing a way to intentionally switch off the new default,
>>> we do make people aware of the risks and that's good enough,
>>> while still maintaining the contract people rightly expect of
>>> patch level releases of Python.
>>
>> Just as long as it's the sysadmin, and NOT some random attacker over
>> the internet, who has the power to downgrade security. Environment
>> variables can be attacked in various ways.
>
> They can, and the bash fun was very good evidence of that.
>
> OTOH if someones environment is at risk, PATH and PYTHONPATH are
> already very effective attack vectors.

Right, which is why -E is an important existing hardening technique
for protecting privileged services against local attackers. Isolated
mode in Python 3.4+ is easier to use, but you can get a functional
equivalent in Python 2 using:

* running from a directory under /usr (Program Files on Windows)
rather than your home directory (to protect against sys.path[0] based
attacks)
* running with -E (to protect against PYTHON* environment variable attacks)
* running with -S (to protect against site.py and sitecustomize.py
based attacks)
* running with -s (to protect against hostile packages in the user
site directory)
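The effect of those flags can be verified by spawning a child interpreter 
and inspecting sys.flags; ignore_environment, no_user_site, and no_site 
are the actual sys.flags attribute names for -E, -s, and -S. A small 
sketch:

```python
import subprocess
import sys

# Spawn a child interpreter with the hardening flags listed above and
# report which protections are active via sys.flags.
result = subprocess.run(
    [sys.executable, "-E", "-s", "-S", "-c",
     "import sys; print(sys.flags.ignore_environment,"
     " sys.flags.no_user_site, sys.flags.no_site)"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # -> "1 1 1": -E, -s and -S are all in effect
```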

That's how I came to the conclusion that adding a new environment
variable to turn off a network security hardening feature isn't a good
idea:

* it significantly increases the attack surface area if you're *not*
using -E when running a privileged service
* it doesn't work at all if you *are* using -E when running a privileged service

That was OK when we were dealing with the hash randomisation problem,
mostly because the consequence of that vulnerability was "denial of
service", and the question of whether or not hash randomisation caused
problems came up on an application-by-application basis, rather than
being related to the way an entire network environment was managed.
The question becomes very different when the failure mode we're
talking about is transparent interception of nominally confidential
communication.

Instead, we want a configuration file stored in a protected directory,
such that for an attacker to modify it they *already* need to have
achieved a local privilege escalation, in which case, they can just
attack the system certificate store directly, rather than messing
about downgrading Python's default HTTPS verification settings.
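A rough sketch of what such a lookup could look like, purely as an 
illustration: the file path, section name, and option values below are 
assumptions for this example, not an agreed design.

```python
import configparser
import ssl

# Hypothetical location: a root-owned directory, so modifying the policy
# already requires local privilege escalation.
CONFIG_PATH = "/etc/python/cert-verification.cfg"

def default_https_context(path=CONFIG_PATH):
    """Pick the default HTTPS context based on a protected config file."""
    cfg = configparser.ConfigParser()
    cfg.read(path)  # a missing file is silently ignored
    verify = cfg.get("https", "verify", fallback="platform_default")
    if verify == "disable":
        # Stdlib factory for an unverified context (PEP 476 escape hatch).
        return ssl._create_unverified_context()
    return ssl.create_default_context()
```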

In my case, I don't actually need the *feature itself* in upstream
CPython, but I *would* like to have upstream CPython's blessing of the
design as a recommendation to redistributors that need a capability
like this to meet the needs of their end users. I've been talking
about "someone" putting together a PEP to that effect, so given this
discussion, I'll go ahead and do that myself, with Robert Kuska listed
as co-author (since he came up with the general design I'm advocating
for).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From storchaka at gmail.com  Sun May 10 08:22:36 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 10 May 2015 09:22:36 +0300
Subject: [Python-Dev] Free lists
In-Reply-To: <CAN-Kwu2S_Ly1ueWbRSVfQZTYVoAnmX9n054kc62j+UR5zC8rRg@mail.gmail.com>
References: <milli3$4ln$1@ger.gmane.org> <554E652D.1000104@hastings.org>
 <mim0cq$2hh$1@ger.gmane.org>
 <CAN-Kwu2S_Ly1ueWbRSVfQZTYVoAnmX9n054kc62j+UR5zC8rRg@mail.gmail.com>
Message-ID: <mimtfd$c3u$1@ger.gmane.org>

On 10.05.15 02:25, Ian Cordasco wrote:
> Can you share how you gathered them so someone could run them on a
> 64-bit build?

This is a quick and dirty patch. It generates an 8 GB log file!

patch --merge -p1 <PyObject_INIT_stat.diff
make -s -j2
./python -Wd -m test.regrtest -w -uall 2>PyObject_INIT.log
python3 PyObject_INIT_stat.py <PyObject_INIT.log >PyObject_INIT.stat

Perhaps compiling with COUNT_ALLOCS would produce similar statistics for 
types (but without the per-size statistics) and should be much faster.
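The aggregation step boils down to counting type names in the log. The 
real format is defined by the attached PyObject_INIT_stat.diff and may 
differ; the "&lt;type&gt; &lt;size&gt;" lines below are purely a hypothetical stand-in 
to show the shape of the tally.

```python
from collections import Counter

# Hypothetical log lines; the real format comes from the attached patch.
log_lines = ["int 1", "int 1", "str 4", "int 2"]

counts = Counter(line.split()[0] for line in log_lines)
total = sum(counts.values())
for name, n in counts.most_common():
    print(f"{name:10} {n:8d} {n / total:7.2%}")
```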
-------------- next part --------------
A non-text attachment was scrubbed...
Name: PyObject_INIT_stat.diff
Type: text/x-patch
Size: 952 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/e0d035ad/attachment.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: PyObject_INIT_stat.py
Type: text/x-python
Size: 1090 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/e0d035ad/attachment.py>

From storchaka at gmail.com  Sun May 10 10:23:17 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 10 May 2015 11:23:17 +0300
Subject: [Python-Dev] Free lists
In-Reply-To: <milli3$4ln$1@ger.gmane.org>
References: <milli3$4ln$1@ger.gmane.org>
Message-ID: <min4hn$dmk$1@ger.gmane.org>

Here are comparable statistics collected from tests run with an executable 
built with COUNT_ALLOCS.

type                                count       %   acc.%

tuple                           448855278  29.50%  29.50%
frame                           203515969  13.38%  42.88%
str                             182658237  12.01%  54.89%
builtin_function_or_method      156724634  10.30%  65.19%
int                             106561963   7.00%  72.19%
method                           88269762   5.80%  78.00%
list                             50340630   3.31%  81.31%
slice                            36650028   2.41%  83.71%
dict                             34429310   2.26%  85.98%
generator                        33035375   2.17%  88.15%
bytes                            29230573   1.92%  90.07%
function                         24953392   1.64%  91.71%
list_iterator                    21236155   1.40%  93.11%
tuple_iterator                   16800947   1.10%  94.21%
cell                             16369317   1.08%  95.29%
float                             7079162   0.47%  95.75%
_sre.SRE_Match                    6342612   0.42%  96.17%
set                               5322829   0.35%  96.52%
TokenInfo                         5077251   0.33%  96.85%
code                              3643664   0.24%  97.09%
traceback                         3510709   0.23%  97.32%
memoryview                        2860799   0.19%  97.51%
managedbuffer                     2762975   0.18%  97.69%
method-wrapper                    2590642   0.17%  97.86%
Name                              1681233   0.11%  97.97%
bytearray                         1598429   0.11%  98.08%
_io.StringIO                      1439456   0.09%  98.17%
weakref                           1341485   0.09%  98.26%
super                              911811   0.06%  98.32%
range                              798254   0.05%  98.37%



From drekin at gmail.com  Sun May 10 15:28:25 2015
From: drekin at gmail.com (=?UTF-8?B?QWRhbSBCYXJ0b8Wh?=)
Date: Sun, 10 May 2015 15:28:25 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
 <554BBBCA.9090909@v.loewis.de>
 <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
Message-ID: <CACvLUan6YQoj5=ECckCKZbEjVoxBna_LTt9mVfP2ewm6aOkGwQ@mail.gmail.com>

Glenn Linderman wrote:
> Is this going to get released in 3.5, I hope?  Python 3 is pretty
> limited without some solution for Unicode on the console... probably the
> biggest deficiency I have found in Python 3, since its introduction. It
> has great Unicode support for files and processing, which convinced me
> to switch from Perl, and I like so much else about it, that I can hardly
> code in Perl any more (I still support a few Perl programs, but have
> ported most of them to Python).

I'd love to see it included in 3.5, but I doubt that will happen. For one
thing, it's only two weeks till beta 1, which is feature freeze. And
mainly, my package mostly hacks into the existing Python environment. A
proper implementation would need some changes in Python that someone would
have to make. See for example my proposal
http://bugs.python.org/issue17620#msg234439. I'm not competent to write a
patch myself, and I have also had no feedback on the proposed idea. On the
other hand, using the package is good enough for me, so I didn't bring
further attention to the proposal.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/7fd90d70/attachment.html>

From skip.montanaro at gmail.com  Sun May 10 16:04:11 2015
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Sun, 10 May 2015 09:04:11 -0500
Subject: [Python-Dev] Mac popups running make test
Message-ID: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>

I haven't run the test suite in a while. I am in the midst of running it on
my Mac running Yosemite 10.10.3. Twice now, I've gotten this popup:


[screenshot: see attached Screen Shot 2015-05-10 at 8.55.28 AM.png]
I assume this is testing some server listening on localhost. Is this a new
thing, either with the Python test suite or with Mac OS X? (I'd normally be
hidden behind a NAT firewall, but at the moment I am on a miserable public
connection in a Peet's Coffee, so it takes on slightly more importance...)

I've also seen the Crash Reporter pop up many times, but as far as I could
tell, in all cases the test suite output told me it was expected. Perhaps
tests which listen for network connections should also mention that, at
least on Macs?

Thx,

Skip
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/e1ae810f/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screen Shot 2015-05-10 at 8.55.28 AM.png
Type: image/png
Size: 36209 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/e1ae810f/attachment-0001.png>

From brett at python.org  Sun May 10 16:07:28 2015
From: brett at python.org (Brett Cannon)
Date: Sun, 10 May 2015 14:07:28 +0000
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
Message-ID: <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>

On Sun, May 10, 2015 at 10:04 AM Skip Montanaro <skip.montanaro at gmail.com>
wrote:

> I haven't run the test suite in awhile. I am in the midst of running it on
> my Mac running Yosemite 10.10.3. Twice now, I've gotten this popup:
>
>
> ?
> I assume this is testing some server listening on localhost. Is this a new
> thing, either with the Python test suite or with Mac OS X? (I'd normally be
> hidden behind a NAT firewall, but at the moment I am on a miserable public
> connection in a Peet's Coffee, so it takes on slightly more importance...)
>

It's not new.


>
> I've also seen the Crash Reporter pop up many times, but as far as I could
> tell, in all cases the test suite output told me it was expected. Perhaps
> tests which listen for network connections should also mention that, at
> least on Macs?
>

Wouldn't hurt. Just requires tracking down which test(s) trigger it (might
be more than one, and I don't know if answering that popup applies for the
rest of the test execution or once per test if you use -j).
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/c35b06a0/attachment.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screen Shot 2015-05-10 at 8.55.28 AM.png
Type: image/png
Size: 36209 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/c35b06a0/attachment.png>

From larry at hastings.org  Sun May 10 17:32:02 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 10 May 2015 08:32:02 -0700
Subject: [Python-Dev] Free lists
In-Reply-To: <mimtfd$c3u$1@ger.gmane.org>
References: <milli3$4ln$1@ger.gmane.org> <554E652D.1000104@hastings.org>
 <mim0cq$2hh$1@ger.gmane.org>
 <CAN-Kwu2S_Ly1ueWbRSVfQZTYVoAnmX9n054kc62j+UR5zC8rRg@mail.gmail.com>
 <mimtfd$c3u$1@ger.gmane.org>
Message-ID: <554F79F2.1000309@hastings.org>

On 05/09/2015 11:22 PM, Serhiy Storchaka wrote:
> On 10.05.15 02:25, Ian Cordasco wrote:
>> Can you share how you gathered them so someone could run them on a
>> 64-bit build?
>
> This is quick and dirty patch. It generates 8 GB log file!

I ran it under 64-bit Linux.  Actually it generated a 10GB log file.  It 
was stalled at test_multiprocessing_fork for five hours so I killed it.

What follows are the (apparently) partial results.  I think it makes a 
good case for a one-element freelist for 64-bit builds.


//arry/

--

type                                count       %   acc.%

builtin_function_or_method      131028598  37.25%  37.25%
method                           52062496  14.80%  52.05%
int                              47600237  13.53%  65.59%
str                              43841584  12.46%  78.05%
generator                        14038624   3.99%  82.04%
float                             8617481   2.45%  84.49%
list_iterator                     8214121   2.34%  86.83%
bytes                             7884898   2.24%  89.07%
tuple_iterator                    5172174   1.47%  90.54%
_io.StringIO                      3482733   0.99%  91.53%
set                               3335168   0.95%  92.48%
str_iterator                      2856373   0.81%  93.29%
list                              2245981   0.64%  93.93%
dict                              1682253   0.48%  94.41%
method-wrapper                    1574412   0.45%  94.86%
function                          1475393   0.42%  95.28%
traceback                         1417094   0.40%  95.68%
tuple                             1181899   0.34%  96.01%
memoryview                        1103226   0.31%  96.33%
cell                              1047245   0.30%  96.63%
managedbuffer                     1044764   0.30%  96.92%
bytearray                          714337   0.20%  97.13%
range_iterator                     498240   0.14%  97.27%
range                              485325   0.14%  97.41%
super                              473542   0.13%  97.54%
map                                446608   0.13%  97.67%
frame                              426570   0.12%  97.79%
set_iterator                       424526   0.12%  97.91%
Leaf                               391824   0.11%  98.02%
symtable                           376815   0.11%  98.13%

int                              47600237  13.53%
                              0     294964   0.62%   0.62%
                              1   36135772  75.92%  76.53%
                              2    4504046   9.46%  86.00%
                              3    2109837   4.43%  90.43%
                              4    1277995   2.68%  93.11%
                              5     542775   1.14%  94.25%
                              6     485451   1.02%  95.27%
...

bytes                             7884898   2.24%
                              0        849   0.01%   0.01%
                              1     250357   3.18%   3.19%
                              2     450310   5.71%   8.90%
                              3     259659   3.29%  12.19%
                              4    1157554  14.68%  26.87%
                              5      77493   0.98%  27.85%
                              6     139816   1.77%  29.63%
                              7     165399   2.10%  31.72%
                              8     191821   2.43%  34.16%
                              9      63009   0.80%  34.96%
                             10      48751   0.62%  35.57%
                             11      50505   0.64%  36.22%
                             12      94186   1.19%  37.41%
                             13      33927   0.43%  37.84%
                             14     123546   1.57%  39.41%
                             15      36565   0.46%  39.87%
                             16     447183   5.67%  45.54%
                             17     186609   2.37%  47.91%
                             18    1301737  16.51%  64.42%
...

tuple                             1181899   0.34%
                              0         47   0.00%   0.00%
                              1     120156  10.17%  10.17%
                              2     340983  28.85%  39.02%
                              3      80924   6.85%  45.87%
                              4      78908   6.68%  52.54%
                              5      35502   3.00%  55.55%
                              6     171292  14.49%  70.04%
                              7     136474  11.55%  81.59%
                              8      48435   4.10%  85.69%
...



-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/a1793b1e/attachment-0001.html>

From taleinat at gmail.com  Sun May 10 19:29:41 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sun, 10 May 2015 20:29:41 +0300
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>
Message-ID: <CALWZvp67xpJcRXQ1=v=r8ixsUdE1dSANyoqa7uuKD2=FqgL_Eg@mail.gmail.com>

On Sun, May 10, 2015 at 5:07 PM, Brett Cannon <brett at python.org> wrote:

>
>
> On Sun, May 10, 2015 at 10:04 AM Skip Montanaro <skip.montanaro at gmail.com>
> wrote:
>
>> I haven't run the test suite in awhile. I am in the midst of running it
>> on my Mac running Yosemite 10.10.3. Twice now, I've gotten this popup:
>>
>>
>> ?
>> I assume this is testing some server listening on localhost. Is this a
>> new thing, either with the Python test suite or with Mac OS X? (I'd
>> normally be hidden behind a NAT firewall, but at the moment I am on a
>> miserable public connection in a Peet's Coffee, so it takes on slightly
>> more importance...)
>>
>
> It's not new.
>

Indeed, I've run into this as well.


>
>> I've also seen the Crash Reporter pop up many times, but as far as I
>> could tell, in all cases the test suite output told me it was expected.
>> Perhaps tests which listen for network connections should also mention
>> that, at least on Macs?
>>
>
> Wouldn't hurt. Just requires tracking down which test(s) triggers it
> (might be more than one and I don't know if answering that popup applies
> for the rest of the test execution or once per test if you use -j).
>

If anyone starts working on this, let me know if I can help, e.g. trying
things on my own Mac.

- Tal
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/74b60a1a/attachment.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screen Shot 2015-05-10 at 8.55.28 AM.png
Type: image/png
Size: 36209 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/74b60a1a/attachment.png>

From taleinat at gmail.com  Sun May 10 20:14:41 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sun, 10 May 2015 21:14:41 +0300
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <554F9E70.3010509@willingconsulting.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>
 <CALWZvp67xpJcRXQ1=v=r8ixsUdE1dSANyoqa7uuKD2=FqgL_Eg@mail.gmail.com>
 <554F9E70.3010509@willingconsulting.com>
Message-ID: <CALWZvp4D4C3+z=Xw2USvQTDXsZhQMw9o6zoyZVM2UeECaW2QxA@mail.gmail.com>

On Sun, May 10, 2015 at 9:07 PM, Carol Willing <
willingc at willingconsulting.com> wrote:

>
> On 5/10/15 10:29 AM, Tal Einat wrote:
>
>  On Sun, May 10, 2015 at 5:07 PM, Brett Cannon <brett at python.org> wrote:
>
>>
>>
>> On Sun, May 10, 2015 at 10:04 AM Skip Montanaro <skip.montanaro at gmail.com>
>> wrote:
>>
>>> I haven't run the test suite in awhile. I am in the midst of running it
>>> on my Mac running Yosemite 10.10.3. Twice now, I've gotten this popup:
>>>
>>>
>>> ?
>>>  I assume this is testing some server listening on localhost. Is this a
>>> new thing, either with the Python test suite or with Mac OS X? (I'd
>>> normally be hidden behind a NAT firewall, but at the moment I am on a
>>> miserable public connection in a Peet's Coffee, so it takes on slightly
>>> more importance...)
>>>
>>
>>  It's not new.
>>
>
>  Indeed, I've run into this as well.
>
>
>>
>>>  I've also seen the Crash Reporter pop up many times, but as far as I
>>> could tell, in all cases the test suite output told me it was expected.
>>> Perhaps tests which listen for network connections should also mention
>>> that, at least on Macs?
>>>
>>
>>  Wouldn't hurt. Just requires tracking down which test(s) triggers it
>> (might be more than one and I don't know if answering that popup applies
>> for the rest of the test execution or once per test if you use -j).
>>
>
>  If anyone starts working on this, let me know if I can help, e.g. trying
> things on my own Mac.
>
>    I believe that the message has to do with OS X's sandboxing
> implementation and the setting of the sandbox's entitlement keys. Here's an
> Apple doc:
> https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html#//apple_ref/doc/uid/TP40011195-CH4-SW9
>
> I'm unaware of a way to work around this other than using Apple's code
> signing or adjusting target build settings in Xcode :( If anyone knows a
> good way to work around it or manually set the permission (other than
> clicking the Allow button), I would be interested.
>

I was reading about this a few weeks ago and recall finding a way to
ad-hoc sign the built python executable; see the link below. I haven't
tried this, though, and don't know whether it works with a plain python
executable rather than a proper OS X app. If it does, it would be useful
to add it as a tool and/or mention it in the developer docs.

http://apple.stackexchange.com/a/121010

- Tal Einat
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/8f52efc0/attachment-0001.html>

From larry at hastings.org  Sun May 10 20:28:19 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 10 May 2015 11:28:19 -0700
Subject: [Python-Dev] Is it kosher to use a buffer after release?
Message-ID: <554FA343.7020202@hastings.org>



In Python's argument parsing code (convertsimple in Python/getargs.c), a 
couple of format units* accept "read-only bytes-like objects", aka 
read-only buffer objects.  They call a helper function called 
convertbuffer() which uses the buffer protocol to extract a pointer to 
the memory.

Here's the relevant bit of code:

    static Py_ssize_t
    convertbuffer(PyObject *arg, void **p, char **errmsg)
    {
    Py_buffer view;
    ...

    if (getbuffer(arg, &view, errmsg) < 0)
         return -1;
    count = view.len;
    *p = view.buf;
    PyBuffer_Release(&view);
    return count;
    }


getbuffer() uses the buffer protocol to fill in the "view" buffer. If 
it's successful, "view" is a valid buffer.  We store the pointer to the 
buffer's memory in output parameter p.

THEN WE RELEASE THE BUFFER.

THEN WE RETURN TO THE CALLER.

In case you missed the big helpful capital letters, we are returning a 
pointer given to us by PyObject_GetBuffer(), which we have already 
released by calling PyBuffer_Release().  The buffer protocol 
documentation for bf_releasebuffer makes it sound like this pointer 
could easily be invalid after the release call finishes.

Am I missing something, or is this code relying on an implementation 
detail it shouldn't--namely that you can continue using a pointer to 
some (most? all?) buffer memory even after releasing it?


//arry/

* Specifically: s# y y# z#
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/a28e70dd/attachment.html>

From storchaka at gmail.com  Sun May 10 20:46:16 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 10 May 2015 21:46:16 +0300
Subject: [Python-Dev] Is it kosher to use a buffer after release?
In-Reply-To: <554FA343.7020202@hastings.org>
References: <554FA343.7020202@hastings.org>
Message-ID: <mio91p$fv9$1@ger.gmane.org>

On 10.05.15 21:28, Larry Hastings wrote:
> In Python's argument parsing code (convertsimple in Python/getargs.c), a
> couple of format units* accept "read-only bytes-like objects", aka
> read-only buffer objects.  They call a helper function called
> convertbuffer() which uses the buffer protocol to extract a pointer to
> the memory.
>
> Here's the relevant bit of code:
>
>     static Py_ssize_t
>     convertbuffer(PyObject *arg, void **p, char **errmsg)
>     {
>     Py_buffer view;
>     ...
>
>     if (getbuffer(arg, &view, errmsg) < 0)
>          return -1;
>     count = view.len;
>     *p = view.buf;
>     PyBuffer_Release(&view);
>     return count;
>     }
>
>
> getbuffer() uses the buffer protocol to fill in the "view" buffer. If
> it's successful, "view" is a valid buffer.  We store the pointer to the
> buffer's memory in output parameter p.
>
> THEN WE RELEASE THE BUFFER.
>
> THEN WE RETURN TO THE CALLER.
>
> In case you missed the big helpful capital letters, we are returning a
> pointer given to us by PyObject_GetBuffer(), which we have already
> released by calling PyBuffer_Release().  The buffer protocol
> documentation for bf_releasebuffer makes it sound like this pointer
> could easily be invalid after the release call finishes.
>
> Am I missing something, or is this code relying on an implementation
> detail it shouldn't--namely that you can continue using a pointer to
> some (most? all?) buffer memory even after releasing it?

You are missing the following code:

     if (pb != NULL && pb->bf_releasebuffer != NULL) {
         *errmsg = "read-only bytes-like object";
         return -1;
     }

convertbuffer() is applicable only to types for which 
PyBuffer_Release() is a no-op. That is why there are separate format 
units for read-only buffers and for general buffers, and why the new 
buffer protocol was introduced.
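The same acquire/release lifecycle is visible from pure Python through
memoryview, which wraps the buffer protocol: once the buffer is released,
the exported data must no longer be accessed. (A sketch for illustration
only, not the getargs.c code path itself.)

```python
# Illustrative only: memoryview mirrors the acquire/release lifecycle
# that getargs.c drives via PyObject_GetBuffer()/PyBuffer_Release().
def peek_then_release(obj):
    view = memoryview(obj)     # acquire the buffer
    first = view[0]            # safe: the buffer is still held
    view.release()             # release it, like PyBuffer_Release()
    try:
        view[0]                # any further access is an error
    except ValueError:
        return first, "released"

print(peek_then_release(b"hello"))  # -> (104, 'released')
```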


From dreamingforward at gmail.com  Sun May 10 18:34:52 2015
From: dreamingforward at gmail.com (Mark Rosenblitt-Janssen)
Date: Sun, 10 May 2015 11:34:52 -0500
Subject: [Python-Dev] anomaly
Message-ID: <CAMjeLr--RfnfgWc1hdOcHiEBpq3QwC7SZk-m14wM65JtpbgMLQ@mail.gmail.com>

Here's something that might be wrong in Python (tried on v2.7):

>>> class int(str): pass

>>> int(3)
'3'

Mark

From willingc at willingconsulting.com  Sun May 10 20:07:44 2015
From: willingc at willingconsulting.com (Carol Willing)
Date: Sun, 10 May 2015 11:07:44 -0700
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CALWZvp67xpJcRXQ1=v=r8ixsUdE1dSANyoqa7uuKD2=FqgL_Eg@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>
 <CALWZvp67xpJcRXQ1=v=r8ixsUdE1dSANyoqa7uuKD2=FqgL_Eg@mail.gmail.com>
Message-ID: <554F9E70.3010509@willingconsulting.com>

On 5/10/15 10:29 AM, Tal Einat wrote:
> On Sun, May 10, 2015 at 5:07 PM, Brett Cannon <brett at python.org 
> <mailto:brett at python.org>> wrote:
>
>
>
>     On Sun, May 10, 2015 at 10:04 AM Skip Montanaro
>     <skip.montanaro at gmail.com <mailto:skip.montanaro at gmail.com>> wrote:
>
>         I haven't run the test suite in a while. I am in the midst of
>         running it on my Mac running Yosemite 10.10.3. Twice now, I've
>         gotten this popup:
>
>
>         [screenshot scrubbed]
>         I assume this is testing some server listening on localhost.
>         Is this a new thing, either with the Python test suite or with
>         Mac OS X? (I'd normally be hidden behind a NAT firewall, but
>         at the moment I am on a miserable public connection in a
>         Peet's Coffee, so it takes on slightly more importance...)
>
>
>     It's not new.
>
>
> Indeed, I've run into this as well.
>
>
>         I've also seen the Crash Reporter pop up many times, but as
>         far as I could tell, in all cases the test suite output told
>         me it was expected. Perhaps tests which listen for network
>         connections should also mention that, at least on Macs?
>
>
>     Wouldn't hurt. Just requires tracking down which test(s) triggers
>     it (might be more than one and I don't know if answering that
>     popup applies for the rest of the test execution or once per test
>     if you use -j).
>
>
> If anyone starts working on this, let me know if I can help, e.g. 
> trying things on my own Mac.
>
I believe that the message has to do with OS X's sandboxing 
implementation and the setting of the sandbox's entitlement keys. Here's 
an Apple doc: 
https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html#//apple_ref/doc/uid/TP40011195-CH4-SW9

I'm unaware of a way to work around this other than using Apple's code 
signing or adjusting target build settings in Xcode :( If anyone knows a 
good way to work around it or manually set the permission (other than 
clicking the Allow button), I would be interested.

Warmly,
Carol
-- 
*Carol Willing*
Developer | Willing Consulting
https://willingconsulting.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150510/cbe1586b/attachment-0001.html>

From dreamingforward at gmail.com  Mon May 11 02:14:18 2015
From: dreamingforward at gmail.com (Mark Rosenblitt-Janssen)
Date: Sun, 10 May 2015 19:14:18 -0500
Subject: [Python-Dev] anomaly
In-Reply-To: <CAMjeLr9v3vVHmyThb=uTNTG6XJcWmebZE35=cmFognfsKj6-Sg@mail.gmail.com>
References: <CAMjeLr--RfnfgWc1hdOcHiEBpq3QwC7SZk-m14wM65JtpbgMLQ@mail.gmail.com>
 <CAMjeLr-_jd2wAJJTZRO3+p58hzD+B3UiDyrkhz-Vw-=Lopmrmg@mail.gmail.com>
 <CAMjeLr9v3vVHmyThb=uTNTG6XJcWmebZE35=cmFognfsKj6-Sg@mail.gmail.com>
Message-ID: <CAMjeLr-67ZJNF3N5nWuDNGRWjOY_B41++737TxC4V6DYFH1t5Q@mail.gmail.com>

In case the example given at the start of the thread wasn't
interesting enough, it also works in the other direction:

>>> class str(int):  pass

>>> str('2')
2  #<----- an integer!!!

Mark

From mal at egenix.com  Mon May 11 10:04:25 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 11 May 2015 10:04:25 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
Message-ID: <55506289.1000705@egenix.com>

On 10.05.2015 05:04, Robert Collins wrote:
> On 10 May 2015 at 11:44, Chris Angelico <rosuav at gmail.com> wrote:
>> On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>> By providing a way to intentionally switch off the new default,
>>> we do make people aware of the risks and that's good enough,
>>> while still maintaining the contract people rightly expect of
>>> patch level releases of Python.
>>
>> Just as long as it's the sysadmin, and NOT some random attacker over
>> the internet, who has the power to downgrade security. Environment
>> variables can be attacked in various ways.
> 
> They can, and the bash fun was very good evidence of that.
> 
>> OTOH if someone's environment is at risk, PATH and PYTHONPATH are
> already very effective attack vectors.

If an attacker has access to the process environment, you're doomed
anyway, so that's not really an argument for or against using
environment variables :-)

You'd just need to create a file os.py and point PYTHONPATH at it.
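A harmless sketch of that attack vector (the directory is a throwaway
temp dir; os itself is imported during interpreter startup, so the effect
is easier to demonstrate with a later-imported module such as json):

```python
import os
import subprocess
import sys
import tempfile

def shadow_demo():
    """Show that a PYTHONPATH entry shadows a same-named stdlib module."""
    with tempfile.TemporaryDirectory() as d:
        # Drop a module named like a stdlib module onto PYTHONPATH.
        with open(os.path.join(d, "json.py"), "w") as f:
            f.write('print("shadowed json imported")\n')
        env = dict(os.environ, PYTHONPATH=d)
        # The child's "import json" finds the shadow before the stdlib.
        proc = subprocess.run([sys.executable, "-c", "import json"],
                              capture_output=True, text=True, env=env)
        return proc.stdout.strip()

print(shadow_demo())  # -> shadowed json imported
```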

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From ncoghlan at gmail.com  Mon May 11 10:09:00 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 11 May 2015 18:09:00 +1000
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CACvLUan6YQoj5=ECckCKZbEjVoxBna_LTt9mVfP2ewm6aOkGwQ@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CADiSq7eJPx5s9S009=vcDDx29=t0e8Kc1aYLPrjaS57w505Rog@mail.gmail.com>
 <CACvLUamkLb3PvfjOj-1cGWk7CWUTdboRkWRJKozowh6-1VjHAQ@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
 <554BBBCA.9090909@v.loewis.de>
 <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
 <CACvLUan6YQoj5=ECckCKZbEjVoxBna_LTt9mVfP2ewm6aOkGwQ@mail.gmail.com>
Message-ID: <CADiSq7cbK+BQeBrMqBt3su0CybYsZtH82tXpnG0q_JtHQ2B2Ew@mail.gmail.com>

On 10 May 2015 at 23:28, Adam Bartoš <drekin at gmail.com> wrote:
> Glenn Linderman wrote:
>> Is this going to get released in 3.5, I hope?  Python 3 is pretty
>> limited without some solution for Unicode on the console... probably the
>> biggest deficiency I have found in Python 3, since its introduction. It
>> has great Unicode support for files and processing, which convinced me
>> to switch from Perl, and I like so much else about it, that I can hardly
>> code in Perl any more (I still support a few Perl programs, but have
>> ported most of them to Python).
>
> I'd love to see it included in 3.5, but I doubt that will happen. For one
> thing, it's only two weeks till beta 1, which is feature freeze. And mainly,
> my package is mostly hacking into existing Python environment. A proper
> implementation would need some changes in Python someone would have to do.
> See for example my proposal http://bugs.python.org/issue17620#msg234439. I'm
> not competent to write a patch myself and I have also no feedback to the
> proposed idea. On the other hand, using the package is good enough for me so
> I didn't further bring attention to the proposal.

Right, and while I'm interested in seeing this improved, I'm not
especially familiar with the internal details of our terminal
interaction implementation, and even less so when it comes to the
Windows terminal. Steve Dower's also had his hands full working on the
Windows installer changes, and several of our other Windows folks
aren't C programmers.

PEP 432 (the interpreter startup sequence improvements) will be back
on the agenda for Python 3.6, so the 3.6 time frame seems more
plausible at this point.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From p.f.moore at gmail.com  Mon May 11 10:12:24 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 11 May 2015 09:12:24 +0100
Subject: [Python-Dev] anomaly
In-Reply-To: <CAMjeLr--RfnfgWc1hdOcHiEBpq3QwC7SZk-m14wM65JtpbgMLQ@mail.gmail.com>
References: <CAMjeLr--RfnfgWc1hdOcHiEBpq3QwC7SZk-m14wM65JtpbgMLQ@mail.gmail.com>
Message-ID: <CACac1F_EDsHyDFJR-H0MnBgqpmmfX-441B7c_nS6xk3h976-cw@mail.gmail.com>

On 10 May 2015 at 17:34, Mark Rosenblitt-Janssen
<dreamingforward at gmail.com> wrote:
> Here's something that might be wrong in Python (tried on v2.7):
>
>>>> class int(str): pass
>
>>>> int(3)
> '3'

It's not wrong as such. You're allowed to define your own class that
subclasses a builtin class, and you're allowed to shadow builtin names.
So while this is (obviously) bad practice, it isn't a bug.

For a simpler example:

Python 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600
64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> str
<class 'str'>
>>> str = "Hello"
>>> str
'Hello'
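For completeness, the shadowing only affects the module namespace; the
original object stays reachable through the builtins module (Python 3
sketch):

```python
import builtins

str = "Hello"                    # shadows the builtin name in this module
assert str == "Hello"
assert builtins.str(42) == "42"  # the real builtin is untouched
del str                          # remove the shadow...
assert str is builtins.str       # ...and name lookup falls through again
```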

Paul

From ncoghlan at gmail.com  Mon May 11 11:13:30 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 11 May 2015 19:13:30 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <55506289.1000705@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
Message-ID: <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>

On 11 May 2015 at 18:04, M.-A. Lemburg <mal at egenix.com> wrote:
> On 10.05.2015 05:04, Robert Collins wrote:
>> On 10 May 2015 at 11:44, Chris Angelico <rosuav at gmail.com> wrote:
>>> On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>>> By providing a way to intentionally switch off the new default,
>>>> we do make people aware of the risks and that's good enough,
>>>> while still maintaining the contract people rightly expect of
>>>> patch level releases of Python.
>>>
>>> Just as long as it's the sysadmin, and NOT some random attacker over
>>> the internet, who has the power to downgrade security. Environment
>>> variables can be attacked in various ways.
>>
>> They can, and the bash fun was very good evidence of that.
>>
>> OTOH if someone's environment is at risk, PATH and PYTHONPATH are
>> already very effective attack vectors.
>
> If an attacker has access to the process environment, you're doomed
> anyway, so that's not really an argument for or against using
> environment variables :-)

The core issue lies in managing the "user" vs "administrator"
permissions split. Even for self-administered systems, it's
recommended practice to run *without* administrative permissions
normally, and only elevate when you need them. One of the things
you're watching out for in such cases is that it shouldn't be possible
for an attacker to make a change to the user environment, and have
that propagate to have an effect on a process running with
administrative access. One of the recommended hardening measures
against that kind of attack vector is to *turn off* Python's
environment variable processing when launching Python processes with
administrative access.
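The hardening measure described here is the interpreter's -E switch,
which makes Python ignore all PYTHON* environment variables. A small
sketch of its effect (the directory name is arbitrary):

```python
import os
import subprocess
import sys

def pythonpath_respected(extra_flags=()):
    """Return True if a PYTHONPATH entry appears in the child's sys.path."""
    env = dict(os.environ, PYTHONPATH="/tmp/demo-entry")
    code = "import sys; print('/tmp/demo-entry' in sys.path)"
    proc = subprocess.run([sys.executable, *extra_flags, "-c", code],
                          capture_output=True, text=True, env=env)
    return proc.stdout.strip() == "True"

print(pythonpath_respected())        # True: PYTHONPATH is honoured
print(pythonpath_respected(["-E"]))  # False: -E ignores PYTHON* variables
```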

We didn't care about that in the hash randomisation case, as the
compatibility concern there applied on a per application basis, and
caring about hash order was technically a bug in its own right. By
contrast, in the situation we're worried about for certificate
verification compatibility, the issue is environmental: certificate
management in many private intranets isn't yet to the same standard as
that on the public internet, so administrators may have a valid reason
for defaulting Python back to the old behaviour, and redistributors
may feel obliged to provide an opt-in period prior to switching the
default behaviour to opt-out. Having the new setting be ignored in
Python processes run under a hardened configuration means that an
environment variable based solution won't have the desired effect in
providing that smoother migration path to the more hardened
configuration.

I've now written a draft "recommendations to redistributors" PEP for
Robert's configuration file based design:
https://www.python.org/dev/peps/pep-0493/ (exact file names & config
setting names TBD)
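An illustrative sketch of what such a redistributor configuration file
might look like (the file name, section, and key are placeholders, since
the draft explicitly leaves them TBD):

```
# e.g. /etc/python/cert-verification.cfg  (name illustrative only)
[https]
verify = disable
```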

I wouldn't be opposed to seeing that as an upstream Python 2.7.x
feature, but agreement amongst redistributors on using a file-based
approach is the main outcome I'm looking for.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mal at egenix.com  Mon May 11 11:22:09 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 11 May 2015 11:22:09 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
Message-ID: <555074C1.80909@egenix.com>

On 11.05.2015 11:13, Nick Coghlan wrote:
> On 11 May 2015 at 18:04, M.-A. Lemburg <mal at egenix.com> wrote:
>> On 10.05.2015 05:04, Robert Collins wrote:
>>> On 10 May 2015 at 11:44, Chris Angelico <rosuav at gmail.com> wrote:
>>>> On Sun, May 10, 2015 at 4:13 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>>>> By providing a way to intentionally switch off the new default,
>>>>> we do make people aware of the risks and that's good enough,
>>>>> while still maintaining the contract people rightly expect of
>>>>> patch level releases of Python.
>>>>
>>>> Just as long as it's the sysadmin, and NOT some random attacker over
>>>> the internet, who has the power to downgrade security. Environment
>>>> variables can be attacked in various ways.
>>>
>>> They can, and the bash fun was very good evidence of that.
>>>
>>> OTOH if someone's environment is at risk, PATH and PYTHONPATH are
>>> already very effective attack vectors.
>>
>> If an attacker has access to the process environment, you're doomed
>> anyway, so that's not really an argument for or against using
>> environment variables :-)
> 
> The core issue lies in managing the "user" vs "administrator"
> permissions split. Even for self-administered systems, it's
> recommended practice to run *without* administrative permissions
> normally, and only elevate when you need them. One of the things
> you're watching out for in such cases is that it shouldn't be possible
> for an attacker to make a change to the user environment, and have
> that propagate to have an effect on a process running with
> administrative access. One of the recommended hardening measures
> against that kind of attack vector is to *turn off* Python's
> environment variable processing when launching Python processes with
> administrative access.

The env var would not be read at Python startup time, only
when loading the ssl module, so the -E switch would not have
the effect of disabling it - unlike the hash seed logic, which
is run (and has to be run) at Python startup time.

> We didn't care about that in the hash randomisation case, as the
> compatibility concern there applied on a per application basis, and
> caring about hash order was technically a bug in its own right. By
> contrast, in the situation we're worried about for certificate
> verification compatibility, the issue is environmental: certificate
> management in many private intranets isn't yet to the same standard as
> that on the public internet, so administrators may have a valid reason
> for defaulting Python back to the old behaviour, and redistributors
> may feel obliged to provide an opt-in period prior to switching the
> default behaviour to opt-out. Having the new setting be ignored in
> Python processes run under a hardened configuration means that an
> environment variable based solution won't have the desired effect in
> providing that smoother migration path to the more hardened
> configuration.
> 
> I've now written a draft "recommendations to redistributors" PEP for
> Robert's configuration file based design:
> https://www.python.org/dev/peps/pep-0493/ (exact file names & config
> setting names TBD)

The Fastly cache seems to be having problems again. I only get:
503 Backend is unhealthy - Details: cache-fra1225-FRA 1431335851 2631441948

> I wouldn't be opposed to seeing that as an upstream Python 2.7.x
> feature, but agreement amongst redistributors on using a file-based
> approach is the main outcome I'm looking for.

Can't we have both ?

I don't think that we can wait for a whole PEP process to
run through to fix this regression in 2.7.9.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From ncoghlan at gmail.com  Mon May 11 12:15:27 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 11 May 2015 20:15:27 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <555074C1.80909@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
Message-ID: <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>

On 11 May 2015 at 19:22, M.-A. Lemburg <mal at egenix.com> wrote:
> On 11.05.2015 11:13, Nick Coghlan wrote:
>> I wouldn't be opposed to seeing that as an upstream Python 2.7.x
>> feature, but agreement amongst redistributors on using a file-based
>> approach is the main outcome I'm looking for.
>
> Can't we have both ?

I'd strongly advise against it, as we're deliberately increasing the
attack surface area here by providing a potential path to carry out a
downgrade attack on the HTTPS certificate verification by forcing it
back to the old behaviour.

The main existing environment variable based attack vector would be to
manage to run a process with administrative privileges and
SSL_CERT_DIR and/or SSL_CERT_FILE pointing to a certificate written to
a user or world-writable directory by the attacker. Providing a "don't
verify HTTPS" flag at the interpreter level would let an attacker skip
the first step of writing the certificate file to disk somewhere,
making a system compromise harder to detect. (An especially paranoid
SSL implementation would disallow reading certs from locations that
allow write access to non-administrative users, but I don't believe
OpenSSL is that paranoid)

By contrast, the configuration file shouldn't provide a new attack
vector (or simplify any existing attack vector), as if you have the
permissions needed to modify the config file, you likely also have the
permissions needed to modify the system certificate store, and the
latter is a *far* more interesting attack vector than a downgrade
attack solely against Python.

Thus the environment variable based off switch is neither necessary
(as an administrator controlled configuration file can do the job),
nor sufficient (it can't handle the -E switch), *and* it represents an
increase in the attack surface area relative to a Python
implementation without the capability.

> I don't think that we can wait for a whole PEP process to
> run through

Matrix multiplication made it through the PEP process in 8 days. If we
do this as a redistributor recommendation rather than attempting to
get it into upstream Python 2.7, we could potentially propose you take
on the role of BDFL-Delegate and mark it as Accepted as soon as the
two of us agree on a common approach.

The reason I think that's a reasonable way forward is because we
already know there are folks opposed to making the change upstream. If
the PEP just provides recommendations for redistributors that *do*
decide to provide a "global off switch" to revert to the old
behaviour, then the perspective of the folks opposed to the feature is
respected by the fact that this is a feature some redistributors *may*
choose to add to provide a smoother migration path to more secure
default HTTPS handling, rather than something upstream provides by
default.

I assume the Debian, Ubuntu and Suse folks won't care, as they have
all already decided against backporting the change to their long term
support releases where the compatibility break would pose a problem
(and I can certainly sympathise with that perspective given the
dependency on backporting the PEP 466 SSL changes first, and the work
involved in seeking consensus on a smoother migration path from the
old default to the new one).

It would be nice to hear from ActiveState, Enthought & Continuum
Analytics as well, but if they don't chime in here, I don't see any
particular need to go chasing them explicitly.

>to fix this regression in 2.7.9.

We made the decision when PEP 476 was accepted that this change turned
a silent security failure into a noisy one, rather than being a
regression in its own right. PEP 493 isn't about disagreeing with that
decision, it's about providing a smoother upgrade path in contexts
where letting the security failure remain silent is deemed to be
preferred in the near term.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From donald at stufft.io  Mon May 11 12:23:11 2015
From: donald at stufft.io (Donald Stufft)
Date: Mon, 11 May 2015 06:23:11 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
Message-ID: <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>


> On May 11, 2015, at 6:15 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> On 11 May 2015 at 19:22, M.-A. Lemburg <mal at egenix.com> wrote:
>> On 11.05.2015 11:13, Nick Coghlan wrote:
>>> I wouldn't be opposed to seeing that as an upstream Python 2.7.x
>>> feature, but agreement amongst redistributors on using a file-based
>>> approach is the main outcome I'm looking for.
>> 
>> Can't we have both ?
> 
> I'd strongly advise against it, as we're deliberately increasing the
> attack surface area here by providing a potential path to carry out a
> downgrade attack on the HTTPS certificate verification by forcing it
> back to the old behaviour.
> 
> The main existing environment variable based attack vector would be to
> manage to run a process with administrative privileges and
> SSL_CERT_DIR and/or SSL_CERT_FILE pointing to a certificate written to
> a user or world-writable directory by the attacker. Providing a "don't
> verify HTTPS" flag at the interpreter level would let an attacker skip
> the first step of writing the certificate file to disk somewhere,
> making a system compromise harder to detect. (An especially paranoid
> SSL implementation would disallow reading certs from locations that
> allow write access to non-administrative users, but I don't believe
> OpenSSL is that paranoid)
> 
> By contrast, the configuration file shouldn't provide a new attack
> vector (or simplify any existing attack vector), as if you have the
> permissions needed to modify the config file, you likely also have the
> permissions needed to modify the system certificate store, and the
> latter is a *far* more interesting attack vector than a downgrade
> attack solely against Python.
> 
> Thus the environment variable based off switch is neither necessary
> (as an administrator controlled configuration file can do the job),
> nor sufficient (it can't handle the -E switch), *and* it represents an
> increase in the attack surface area relative to a Python
> implementation without the capability.
> 
>> I don't think that we can wait for a whole PEP process to
>> run through
> 
> Matrix multiplication made it through the PEP process in 8 days. If we
> do this as a redistributor recommendation rather than attempting to
> get it into upstream Python 2.7, we could potentially propose you take
> on the role of BDFL-Delegate and mark it as Accepted as soon as the
> two of us agree on a common approach.
> 
> The reason I think that's a reasonable way forward is because we
> already know there are folks opposed to making the change upstream. If
> the PEP just provides recommendations for redistributors that *do*
> decide to provide a "global off switch" to revert to the old
> behaviour, then the perspective of the folks opposed to the feature is
> respected by the fact that this is a feature some redistributors *may*
> choose to add to provide a smoother migration path to more secure
> default HTTPS handling, rather than something upstream provides by
> default.
> 
> I assume the Debian, Ubuntu and Suse folks won't care, as they have
> all already decided against backporting the change to their long term
> support releases where the compatibility break would pose a problem
> (and I can certainly sympathise with that perspective given the
> dependency on backporting the PEP 466 SSL changes first, and the work
> involved in seeking consensus on a smoother migration path from the
> old default to the new one).
> 
> It would be nice to hear from ActiveState, Enthought & Continuum
> Analytics as well, but if they don't chime in here, I don't see any
> particular need to go chasing them explicitly.
> 
>> to fix this regression in 2.7.9.
> 
> We made the decision when PEP 476 was accepted that this change turned
> a silent security failure into a noisy one, rather than being a
> regression in its own right. PEP 493 isn't about disagreeing with that
> decision, it's about providing a smoother upgrade path in contexts
> where letting the security failure remain silent is deemed to be
> preferred in the near term.
> 


I don't really agree that the decision to disable TLS is an environment one,
it's really a per application decision. This is why I was against having some
sort of global off switch for all of Python because just because one
application needs it turned off doesn't mean you want it turned off for another
Python application. You might have some script that is interacting with a
custom internal server which doesn't have a valid TLS certificate but then you
also have pip* installed which is reaching out to PyPI and downloading code
from the internet. You might want to disable TLS verification for the first but
you almost certainly don't want to disable TLS verification for the second
one.

In this regard I think that environment variables are somewhat better because
they are far easier to set per application instead of in a way that affects
every Python program. Per application is the *right* scope for this setting,
especially in a system where people may or may not realize what is written in
Python and what isn't. I think it's absolutely wrong to give people a footgun
in terms of a switch that turns off all of Python's TLS verification when
for many applications the fact they use Python is simply an implementation
detail.

That being said, since it's not being included in Python core and it's only
some patch that some downstreams are going to apply, I also don't really care
that much because it's not going to affect me, and if it turns out to be a bad
idea and a footgun like I think it is, then the blame can rest on those
downstreams and not us :)

I'm also not a fan of the environment variable either really for a lot of the
reasons you've outlined here.

* Ignoring the fact that pip has (via requests/urllib3) worked around this
  deficiency in Python and isn't going to be affected by this configuration
  switch at all.


---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 801 bytes
Desc: Message signed with OpenPGP using GPGMail
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150511/487da866/attachment.sig>

From solipsis at pitrou.net  Mon May 11 12:39:12 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Mon, 11 May 2015 12:39:12 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
Message-ID: <20150511123912.70816e2e@fsol>



I'm in entire agreement with Donald below.

Regards

Antoine.


On Mon, 11 May 2015 06:23:11 -0400
Donald Stufft <donald at stufft.io> wrote:
> 
> I don't really agree that the decision to disable TLS is an environment one,
> it's really a per application decision. This is why I was against having some
> sort of global off switch for all of Python because just because one
> application needs it turned off doesn't mean you want it turned off for another
> Python application. You might have some script that is interacting with a
> custom internal server which doesn't have a valid TLS certificate but then you
> also have pip* installed which is reaching out to PyPI and downloading code
> from the internet. You might want to disable TLS verification for the first but
> you almost certainly don't want to disable TLS verification for the second
> one.
> 
> In this regard I think that environment variables are somewhat better because
> they are far easier to set per application instead of in a way that affects
> every Python program. Per application is the *right* scope for this setting,
> especially in a system where people may or may not realize what is written in
> Python and what isn't. I think it's absolutely wrong to give people a footgun
> in terms of a switch that turns off all of Python's TLS verification when
> for many applications the fact they use Python is simply an implementation
> detail.
> 
> That being said, since it's not being included in Python core and it's only
> some patch that some downstreams are going to apply, I also don't really care
> that much because it's not going to affect me, and if it turns out to be a bad
> idea and a footgun like I think it is, then the blame can rest on those
> downstreams and not us :)
> 
> I'm also not a fan of the environment variable either really for a lot of the
> reasons you've outlined here.
> 
> * Ignoring the fact that pip has (via requests/urllib3) worked around this
>   deficiency in Python and isn't going to be affected by this configuration
>   switch at all.
> 
> 
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
> 
> 




From ncoghlan at gmail.com  Mon May 11 12:47:00 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 11 May 2015 20:47:00 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
Message-ID: <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>

On 11 May 2015 at 20:23, Donald Stufft <donald at stufft.io> wrote:
> On May 11, 2015, at 6:15 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> We made the decision when PEP 476 was accepted that this change turned
>> a silent security failure into a noisy one, rather than being a
>> regression in its own right. PEP 493 isn't about disagreeing with that
>> decision, it's about providing a smoother upgrade path in contexts
>> where letting the security failure remain silent is deemed to be
>> preferred in the near term.
>
> I don't really agree that the decision to disable TLS is an environment one,
> it's really a per application decision. This is why I was against having some
> sort of global off switch for all of Python because just because one
> application needs it turned off doesn't mean you want it turned off for another
> Python application.

The scenario I'm interested in is the one where it *was* off globally
(i.e. you were already running Python 2.7.8 or earlier) and you want
to manage a global rollout of a new Python version that supports being
configured to verify HTTPS certificates by default, while making the
decision on whether or not to enable HTTPS certificate verification on
a server-by-server basis, rather than having that decision be coupled
directly to the rollout of the updated version of Python.

I agree that the desired end state is where Python 3 is, and where
upstream Python 2.7.9+ is, this is solely about how to facilitate
folks getting from point A to point B without an intervening window of
"I broke the world and now my boss is yelling at me about it" :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mal at egenix.com  Mon May 11 12:58:19 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 11 May 2015 12:58:19 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>	<CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>	<89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
 <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
Message-ID: <55508B4B.7060707@egenix.com>

On 11.05.2015 12:47, Nick Coghlan wrote:
> On 11 May 2015 at 20:23, Donald Stufft <donald at stufft.io> wrote:
>> On May 11, 2015, at 6:15 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>> We made the decision when PEP 476 was accepted that this change turned
>>> a silent security failure into a noisy one, rather than being a
>>> regression in its own right. PEP 493 isn't about disagreeing with that
>>> decision, it's about providing a smoother upgrade path in contexts
>>> where letting the security failure remain silent is deemed to be
>>> preferred in the near term.
>>
>> I don't really agree that the decision to disable TLS is an environment one,
>> it's really a per application decision. This is why I was against having some
>> sort of global off switch for all of Python because just because one
>> application needs it turned off doesn't mean you want it turned off for another
>> Python application.
> 
> The scenario I'm interested in is the one where it *was* off globally
> (i.e. you were already running Python 2.7.8 or earlier) and you want
> to manage a global rollout of a new Python version that supports being
> configured to verify HTTPS certificates by default, while making the
> decision on whether or not to enable HTTPS certificate verification on
> a server-by-server basis, rather than having that decision be coupled
> directly to the rollout of the updated version of Python.

I guess we're approaching this from different perspectives :-)

I'm mostly interested in having a switch that can be set on
a per application basis, not globally.

I think the global default is fine and I'm just looking for a way to have
admins disable it on a case-by-case basis for those applications which
have problems with the new default. Hence the env var approach
- the admin would simply edit the application's startup shell
script, add the env var and that's it.
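
As a sketch, such a per-application wrapper edit could look like the
following. PYTHONHTTPSVERIFY is the variable proposed in this thread, not an
existing upstream feature; the commands only demonstrate that the setting is
scoped to a single process rather than the whole system:

```shell
# Set the proposed variable for one application's invocation only:
PYTHONHTTPSVERIFY=0 python3 -c 'import os; print(os.environ["PYTHONHTTPSVERIFY"])'

# Other Python processes on the same machine are unaffected:
python3 -c 'import os; print(os.environ.get("PYTHONHTTPSVERIFY", "unset"))'
```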

For pip et al. which don't use the ssl module, the admins can simply
continue using older versions for those applications - ones which
don't implement the extra verification. In many cases, this is not
necessary, since production environments typically don't use PyPI
at all: they use a local directory with the needed distribution
files, which is both more secure and reliable.

> I agree that the desired end state is where Python 3 is, and where
> upstream Python 2.7.9+ is, this is solely about how to facilitate
> folks getting from point A to point B without an intervening window of
> "I broke the world and now my boss is yelling at me about it" :)

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From donald at stufft.io  Mon May 11 13:16:58 2015
From: donald at stufft.io (Donald Stufft)
Date: Mon, 11 May 2015 07:16:58 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
 <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
Message-ID: <39FA939A-D71E-4FC4-9AEB-8D9FCDF02CD4@stufft.io>


> On May 11, 2015, at 6:47 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> On 11 May 2015 at 20:23, Donald Stufft <donald at stufft.io> wrote:
>> On May 11, 2015, at 6:15 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>> We made the decision when PEP 476 was accepted that this change turned
>>> a silent security failure into a noisy one, rather than being a
>>> regression in its own right. PEP 493 isn't about disagreeing with that
>>> decision, it's about providing a smoother upgrade path in contexts
>>> where letting the security failure remain silent is deemed to be
>>> preferred in the near term.
>> 
>> I don't really agree that the decision to disable TLS is an environment one,
>> it's really a per application decision. This is why I was against having some
>> sort of global off switch for all of Python because just because one
>> application needs it turned off doesn't mean you want it turned off for another
>> Python application.
> 
> The scenario I'm interested in is the one where it *was* off globally
> (i.e. you were already running Python 2.7.8 or earlier) and you want
> to manage a global rollout of a new Python version that supports being
> configured to verify HTTPS certificates by default, while making the
> decision on whether or not to enable HTTPS certificate verification on
> a server-by-server basis, rather than having that decision be coupled
> directly to the rollout of the updated version of Python.
> 
> I agree that the desired end state is where Python 3 is, and where
> upstream Python 2.7.9+ is, this is solely about how to facilitate
> folks getting from point A to point B without an intervening window of
> "I broke the world and now my boss is yelling at me about it" :)
> 

Oh, another issue that I forgot to mention--

A fair number of people had no idea that Python wasn't validating TLS before
2.7.9/3.4.3; however, as part of the process of changing that in 2.7.9, a lot
of people became aware that Pythons before 2.7.9 didn't validate but that
Python 2.7.9+ does. I worry that if Red Hat (or anyone) ships a Python 2.7.9
that doesn't verify by default then they are going to be shipping something
which defies the expectations of those users who were relying on the fact that
Python 2.7.9+ was supposed to be secure by default now. You're (understandably)
focusing on "I already have my thing running on Python 2.7.8 and I want to
yum update and get 2.7.9 and have things not visibly break", however there is
the other use case of "I'm setting up a new environment, and I installed RHEL
and got 2.7.9, I remembered reading in LWN that 2.7.9 verifies now so I must
be safe". If you *do* provide such a switch, defaulting it to verify and having
people for whom that breaks go in and turn it off is probably the safer
mechanism: the case where 2.7.9 verification breaks things is a visible
change, whereas the case where someone expects 2.7.9 to verify and it doesn't
is not visible, and is easily missed unless they go out of their way to test
against a server with an invalid certificate.

Either way, if there is some sort of global off switch, having that off switch
set to off should raise some kind of warning (like urllib3 does if you use
the unverified HTTPS methods). To be clear, I don't mean that using the built
in ssl module APIs to disable verification should raise a warning, I mean the
hypothetical "make my Python insecurely access HTTPS" configuration file (or
environment variable) that is being proposed.
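
A minimal sketch of that suggestion, using the stdlib warnings machinery.
The hook name is hypothetical (nothing like it exists upstream); urllib3's
analogue is its InsecureRequestWarning:

```python
import warnings


def apply_global_https_override(verify):
    """Hypothetical hook through which a downstream "global off switch"
    could pass its setting; warns loudly whenever it disables verification."""
    if not verify:
        warnings.warn(
            "HTTPS certificate verification has been disabled globally by a "
            "downstream configuration override; connections are vulnerable "
            "to man-in-the-middle attacks.",
            UserWarning,
            stacklevel=2,
        )
    return verify
```

The warning fires once per call site by default, so an administrator scanning
logs can spot that the insecure override is active without the application
having to opt in.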

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 801 bytes
Desc: Message signed with OpenPGP using GPGMail
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150511/914483da/attachment.sig>

From ncoghlan at gmail.com  Mon May 11 16:06:45 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 00:06:45 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <1516155771.12933368.1431346559516.JavaMail.zimbra@redhat.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
 <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
 <39FA939A-D71E-4FC4-9AEB-8D9FCDF02CD4@stufft.io>
 <1516155771.12933368.1431346559516.JavaMail.zimbra@redhat.com>
Message-ID: <CADiSq7fOoKumyOVPkVxB8+3343bV-o08U1_nUbjqe1ZtK5px4g@mail.gmail.com>

On 11 May 2015 10:16 pm, "Robert Kuska" <rkuska at redhat.com> wrote:

> > >
> >
> > Oh, another issue that I forgot to mention--
> >
> > A fair number of people had no idea that Python wasn't validating TLS
> > before 2.7.9/3.4.3; however, as part of the process of changing that in
> > 2.7.9, a lot of people became aware that Pythons before 2.7.9 didn't
> > validate but that Python 2.7.9+ does. I worry that if Red Hat (or anyone)
> > ships a Python 2.7.9 that doesn't verify by default then they are going
> > to be shipping something which defies the expectations of those users who
> > were relying on the fact that Python 2.7.9+ was supposed to be secure by
> > default now. You're (understandably) focusing on "I already have my thing
> > running on Python 2.7.8 and I want to yum update and get 2.7.9 and have
> > things not visibly break",

As Robert noted, it would be a matter of updating to a 2.7.5 with more
patches backported, rather than rebasing to a newer upstream version.

I can make the "do not change the default behaviour relative to the
corresponding upstream version" guidance explicit in the PEP, though.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/5f102069/attachment-0001.html>

From benjamin at python.org  Mon May 11 17:24:12 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Mon, 11 May 2015 11:24:12 -0400
Subject: [Python-Dev] [RELEASE] Python 2.7.10 release candidate 1
Message-ID: <1431357852.505333.265716865.5DD10081@webmail.messagingengine.com>

It is my privilege to announce the first release candidate of 2.7.10,
the next bugfix release in the 2.7 series.

Downloads are at

   https://www.python.org/downloads/release/python-2710rc1/

The full changelog is at

   https://hg.python.org/cpython/raw-file/80ccce248ba2/Misc/NEWS

Please consider testing 2.7.10rc1 with your application and reporting
bugs to

   https://bugs.python.org

Regards,
Benjamin

From mal at egenix.com  Mon May 11 20:49:56 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 11 May 2015 20:49:56 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
Message-ID: <5550F9D4.5050701@egenix.com>

On 11.05.2015 12:15, Nick Coghlan wrote:
> On 11 May 2015 at 19:22, M.-A. Lemburg <mal at egenix.com> wrote:
>> On 11.05.2015 11:13, Nick Coghlan wrote:
>>> I wouldn't be opposed to seeing that as an upstream Python 2.7.x
>>> feature, but agreement amongst redistributors on using a file-based
>>> approach is the main outcome I'm looking for.
>>
>> Can't we have both ?
> 
> I'd strongly advise against it, as we're deliberately increasing the
> attack surface area here by providing a potential path to carry out a
> downgrade attack on the HTTPS certificate verification by forcing it
> back to the old behaviour.
> 
> The main existing environment variable based attack vector would be to
> manage to run a process with administrative privileges and
> SSL_CERT_DIR and/or SSL_CERT_FILE pointing to a certificate written to
> a user or world-writable directory by the attacker. Providing a "don't
> verify HTTPS" flag at the interpreter level would let an attacker skip
> the first step of writing the certificate file to disk somewhere,
> making a system compromise harder to detect. (An especially paranoid
> SSL implementation would disallow reading certs from locations that
> allow write access to non-administrative users, but I don't believe
> OpenSSL is that paranoid)

Correct. OpenSSL will happily read the cert files from anywhere
you point it to.

> By contrast, the configuration file shouldn't provide a new attack
> vector (or simplify any existing attack vector), as if you have the
> permissions needed to modify the config file, you likely also have the
> permissions needed to modify the system certificate store, and the
> latter is a *far* more interesting attack vector than a downgrade
> attack solely against Python.
> 
> Thus the environment variable based off switch is neither necessary
> (as an administrator controlled configuration file can do the job),
> nor sufficient (it can't handle the -E switch), *and* it represents an
> increase in the attack surface area relative to a Python
> implementation without the capability.

Whether or not -E will have an effect on the env var depends
on the implementation. At the moment, -E only has an effect
on the C runtime, while the stdlib happily reads from os.environ
without taking the flag into account.

As proposed, the PYTHONHTTPSVERIFY would only affect the ssl
module and only be checked when loading this module, i.e. not
at Python startup time.
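
A sketch of what that import-time check could look like. The variable name
and semantics are the ones proposed in this thread, not an existing upstream
feature; `ssl._create_unverified_context` is the private hook PEP 476 added
in 2.7.9/3.4.3:

```python
import os
import ssl

# Hypothetical sketch: honour the proposed PYTHONHTTPSVERIFY variable once,
# when the ssl module is first loaded. Setting it to "0" would restore the
# pre-2.7.9 behaviour of not verifying HTTPS certificates by default.
if os.environ.get("PYTHONHTTPSVERIFY", "1") == "0":
    ssl._create_default_https_context = ssl._create_unverified_context
```

Because the check runs only at module load time, later changes to os.environ
have no effect, matching the "only be checked when loading this module"
behaviour described above.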

>> I don't think that we can wait for a whole PEP process to
>> run through
> 
> Matrix multiplication made it through the PEP process in 8 days. If we
> do this as a redistributor recommendation rather than attempting to
> get it into upstream Python 2.7, we could potentially propose you take
> on the role of BDFL-Delegate and mark it as Accepted as soon as the
> two of us agree on a common approach.
> 
> The reason I think that's a reasonable way forward is because we
> already know there are folks opposed to making the change upstream. If
> the PEP just provides recommendations for redistributors that *do*
> decide to provide a "global off switch" to revert to the old
> behaviour, then the perspective of the folks opposed to the feature is
> respected by the fact that this is a feature some redistributors *may*
> choose to add to provide a smoother migration path to more secure
> default HTTPS handling, rather than something upstream provides by
> default.
> 
> I assume the Debian, Ubuntu and Suse folks won't care, as they have
> all already decided against backporting the change to their long term
> support releases where the compatibility break would pose a problem
> (and I can certainly sympathise with that perspective given the
> dependency on backporting the PEP 466 SSL changes first, and the work
> involved in seeking consensus on a smoother migration path from the
> old default to the new one).
> 
> It would be nice to hear from ActiveState, Enthought & Continuum
> Analytics as well, but if they don't chime in here, I don't see any
> particular need to go chasing them explicitly.

I think the approach to only consider a subset of redistributors
as viable targets for such a switch is a bit too narrow.

You are leaving out all the parties which use custom
Python installations to run their applications, e.g.
the Plone and Zope community, the ZenOSS community,
the many Windows applications built on Python, etc.

>> to fix this regression in 2.7.9.
> 
> We made the decision when PEP 476 was accepted that this change turned
> a silent security failure into a noisy one, rather than being a
> regression in its own right. PEP 493 isn't about disagreeing with that
> decision, it's about providing a smoother upgrade path in contexts
> where letting the security failure remain silent is deemed to be
> preferred in the near term.

The change wasn't a regression. The missing downgrade path
is a regression.

Some other comments on PEP 493:

* I don't think we really want to add the overhead of
  having to parse an INI file every time Python starts up.
  Please remember that we removed the parsing of the sysconfig
  data not long ago precisely because we wanted to avoid this
  startup cost.

* I don't see why the attack surface of using an INI file
  somewhere in the system should be smaller than e.g. using
  sitecustomize.py

* If done right, we'd also need a switch to ignore this
  global config file and recommend using it to reduce the
  attack surface (for the same reason you explain in the
  PEP)

* I don't think a global switch is the right way forward.
  Many applications on properly configured systems will
  work fine with the new default. The downgrade option is
  only needed for those cases, where they don't and you
  don't have a good way to fix the application.

* Most applications use some kind of virtualenv
  Python environment to run the code. These are
  typically isolated from the system Python installation
  and so wouldn't want to use a system-wide global INI
  file either.

* The -S switch completely disables importing site.py.
  That's not really a viable solution in the age of
  pip - your local installation wouldn't find the installed
  packages anymore, since these are installed in site-packages/,
  which, again, is set up by site.py.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 11 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From p.f.moore at gmail.com  Mon May 11 21:42:26 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 11 May 2015 20:42:26 +0100
Subject: [Python-Dev] [Python-checkins] cpython (3.4): asyncio: async()
 function is deprecated in favour of ensure_future().
In-Reply-To: <20150511185047.84480.21325@psf.io>
References: <20150511185047.84480.21325@psf.io>
Message-ID: <CACac1F9mpcDqDbQ+w4bvk7QGo4d1Rj+NioDn3M2Ac3Q2qWVaVA@mail.gmail.com>

On 11 May 2015 at 19:50, yury.selivanov <python-checkins at python.org> wrote:
> https://hg.python.org/cpython/rev/b78127eafad7
> changeset:   95956:b78127eafad7
> branch:      3.4
> parent:      95953:a983d63e3321
> user:        Yury Selivanov <yselivanov at sprymix.com>
> date:        Mon May 11 14:48:38 2015 -0400
> summary:
>   asyncio: async() function is deprecated in favour of ensure_future().
>
> files:
>   Lib/asyncio/base_events.py                   |   2 +-
>   Lib/asyncio/tasks.py                         |  27 ++++-
>   Lib/asyncio/windows_events.py                |   2 +-
>   Lib/test/test_asyncio/test_base_events.py    |   6 +-
>   Lib/test/test_asyncio/test_tasks.py          |  48 +++++----
>   Lib/test/test_asyncio/test_windows_events.py |   2 +-
>   Misc/NEWS                                    |   4 +-
>   7 files changed, 57 insertions(+), 34 deletions(-)

Surely this should include a doc change?
Paul
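(For anyone tracking the change in their own code, the migration is just a rename; a minimal sketch using the PEP 492 syntax under discussion -- the function names besides ensure_future() are only illustrative:)

```python
import asyncio

async def work():
    return 42

async def main():
    # Before 3.4.4: task = asyncio.async(work())
    # That spelling clashes with the upcoming async keyword, hence the
    # deprecation. New spelling, same semantics:
    task = asyncio.ensure_future(work())
    return await task

print(asyncio.run(main()))  # prints 42
```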

From yselivanov.ml at gmail.com  Mon May 11 21:47:16 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Mon, 11 May 2015 15:47:16 -0400
Subject: [Python-Dev] [Python-checkins] cpython (3.4): asyncio: async()
 function is deprecated in favour of ensure_future().
In-Reply-To: <CACac1F9mpcDqDbQ+w4bvk7QGo4d1Rj+NioDn3M2Ac3Q2qWVaVA@mail.gmail.com>
References: <20150511185047.84480.21325@psf.io>
 <CACac1F9mpcDqDbQ+w4bvk7QGo4d1Rj+NioDn3M2Ac3Q2qWVaVA@mail.gmail.com>
Message-ID: <55510744.6050703@gmail.com>

Yes, I'm in the process of writing it ;)  (as well as for new 
set_task_factory())

Thanks,
Yury

On 2015-05-11 3:42 PM, Paul Moore wrote:
> On 11 May 2015 at 19:50, yury.selivanov <python-checkins at python.org> wrote:
>> https://hg.python.org/cpython/rev/b78127eafad7
>> changeset:   95956:b78127eafad7
>> branch:      3.4
>> parent:      95953:a983d63e3321
>> user:        Yury Selivanov <yselivanov at sprymix.com>
>> date:        Mon May 11 14:48:38 2015 -0400
>> summary:
>>    asyncio: async() function is deprecated in favour of ensure_future().
>>
>> files:
>>    Lib/asyncio/base_events.py                   |   2 +-
>>    Lib/asyncio/tasks.py                         |  27 ++++-
>>    Lib/asyncio/windows_events.py                |   2 +-
>>    Lib/test/test_asyncio/test_base_events.py    |   6 +-
>>    Lib/test/test_asyncio/test_tasks.py          |  48 +++++----
>>    Lib/test/test_asyncio/test_windows_events.py |   2 +-
>>    Misc/NEWS                                    |   4 +-
>>    7 files changed, 57 insertions(+), 34 deletions(-)
> Surely this should include a doc change?
> Paul
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com


From p.f.moore at gmail.com  Mon May 11 21:52:34 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 11 May 2015 20:52:34 +0100
Subject: [Python-Dev] [Python-checkins] cpython (3.4): asyncio: async()
 function is deprecated in favour of ensure_future().
In-Reply-To: <55510744.6050703@gmail.com>
References: <20150511185047.84480.21325@psf.io>
 <CACac1F9mpcDqDbQ+w4bvk7QGo4d1Rj+NioDn3M2Ac3Q2qWVaVA@mail.gmail.com>
 <55510744.6050703@gmail.com>
Message-ID: <CACac1F-=sQt+ogttHuyAtxH-n=XvdP-sHYRDydyo+dmcVQbzzg@mail.gmail.com>

On 11 May 2015 at 20:47, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> Yes, I'm in the process of writing it ;)  (as well as for new
> set_task_factory())

Cool - sorry for being a nag :-)
Paul

From stefan_ml at behnel.de  Mon May 11 21:53:31 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Mon, 11 May 2015 21:53:31 +0200
Subject: [Python-Dev] Free lists
In-Reply-To: <milli3$4ln$1@ger.gmane.org>
References: <milli3$4ln$1@ger.gmane.org>
Message-ID: <mir1br$27a$1@ger.gmane.org>

Serhiy Storchaka schrieb am 09.05.2015 um 21:01:
> Here is a statistic for most called PyObject_INIT or PyObject_INIT_VAR for
> types (collected during running Python tests on 32-bit Linux).

I'm aware that this includes lots of tests for the Python code in the
stdlib, so these numbers are most likely not too far from what real-world
code would give, but wouldn't it be even better to collect statistics from
a (quick) benchmark suite run? Test suites tend to be very regular, flat
and broad, unlike most production code.

Stefan


From p.f.moore at gmail.com  Mon May 11 22:37:09 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 11 May 2015 21:37:09 +0100
Subject: [Python-Dev] Minimal async event loop and async utilities (Was:
 PEP 492: async/await in Python; version 4)
In-Reply-To: <CAP7+vJ+KL5NmLJZLuKbDYX8i2V_zZPUNaqXmfacxb7Cj43jqgg@mail.gmail.com>
References: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
 <CAP7+vJ+KL5NmLJZLuKbDYX8i2V_zZPUNaqXmfacxb7Cj43jqgg@mail.gmail.com>
Message-ID: <CACac1F9a5eV4O0rjLvGADc1JrXjxdX7PW9g6TPpvgbk+b8vf8Q@mail.gmail.com>

On 6 May 2015 at 16:46, Guido van Rossum <guido at python.org> wrote:
> This is actually a great idea, and I encourage you to go forward with it.
> The biggest piece missing from your inventory is probably Task, which is
> needed to wrap a Future around a coroutine.

OK, I've been doing some work on this. You're right, the asyncio
framework makes Future a key component.

But I'm not 100% sure why Future (and Task) have to be so fundamental.
Ignoring cancellation (see below!) I can build pretty much all of a
basic event loop, plus equivalents of the asyncio locks and queues
modules, without needing the concept of a Future at all. The
create_task function becomes simply a function to add a coroutine to
the ready queue, in this context. I can't return a Task (because I
haven't implemented the Task or Future classes) but I don't actually
know what significant functionality is lost as a result - is there a
reasonably accessible example of where using the return value from
create_task is important anywhere?
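(To make the "create_task just appends a coroutine to the ready queue" idea concrete, here is a toy round-robin sketch -- all the names are mine, not asyncio's, and it deliberately has no Future or Task:)

```python
from collections import deque

class MiniLoop:
    """Toy scheduler: a ready queue of plain generators, nothing more."""
    def __init__(self):
        self.ready = deque()

    def create_task(self, coro):
        # "create_task" here is literally just enqueueing the coroutine.
        self.ready.append(coro)

    def run(self):
        while self.ready:
            coro = self.ready.popleft()
            try:
                coro.send(None)        # advance to the next yield point
            except StopIteration:
                continue               # coroutine finished; drop it
            self.ready.append(coro)    # still running; requeue it

results = []

def ticker(name, n):
    for i in range(n):
        results.append((name, i))
        yield                          # cooperative yield point

loop = MiniLoop()
loop.create_task(ticker("a", 2))
loop.create_task(ticker("b", 1))
loop.run()
print(results)  # [('a', 0), ('b', 0), ('a', 1)]
```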

A slightly more complicated issue is with the run_until_complete
function, which takes a Future, and hence is fundamentally tied to the
Future API. However, it seems to me that a "minimal" implementation
could work by having a run_until_complete() that just took an
awaitable (i.e., anything that you can yield from). Again, is there a
specific reason that you ended up going with run_until_complete taking
a Future rather than just a coroutine? I think (but haven't confirmed
yet by implementing it) that it should be possible to create a
coroutine that acts like a Future, in the sense that you can tell it
from outside (via send()) that it's completed and set its return
value. But this is all theory, and if you have any practical
experience that shows I'm going down a dead end, I'd be glad to know.
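(A rough sketch of what I mean by "a coroutine that acts like a Future" -- purely hypothetical, not tested against asyncio itself:)

```python
def settable_future():
    # Park at a yield until a non-None result is sent in from outside,
    # then "complete" by returning it (i.e. raising StopIteration(value)).
    result = yield
    while result is None:
        result = yield
    return result

fut = settable_future()
next(fut)             # prime the generator up to the first yield
fut.send(None)        # polled while still "pending": stays parked
try:
    fut.send(42)      # set the result from outside
except StopIteration as exc:
    print(exc.value)  # prints 42
```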

I'm not sure how useful this line of attack will be - if the API isn't
compatible with asyncio.BaseEventLoop, it's not very useful in
practice. On the other hand, if I can build a loop without Future or
Task classes, it may indicate that those classes aren't quite as
fundamental as asyncio makes them (which may allow some
simplifications or generalisations).

> I expect you'll also want to build cancellation into your "base async
> framework"; and the primitives to wait for multiple awaitables. The next
> step would be some mechanism to implement call_later()/call_at() (but this
> needs to be pluggable since for a "real" event loop it needs to be
> implemented by the basic I/O selector).

These are where I suspect I'll have the most trouble if I haven't got
a solid understanding of the role of the Future and Task classes (or
alternatively, how to avoid them :-)) So I'm holding off on worrying
about them for now. But certainly they need to be covered. In
particular, call_later/call_at are the only "generic" example of any
form of wait that actually *waits*, rather than returning immediately.
So as you say, implementing them will show how the basic mechanism can
be extended with a "real" selector (whether for I/O, or GUI events, or
whatever).

> If you can get this working it would be great to include this in the stdlib
> as a separate "asynclib" library. The original asyncio library would then be
> a specific implementation (using a subclass of asynclib.EventLoop) that adds
> I/O, subprocesses, and integrates with the selectors module (or with IOCP,
> on Windows).

One thing I've not really considered in the above, is how a
refactoring like this would work. Ignoring the "let's try to remove
the Future class" approach above, my "basic event loop" is mostly just
an alternative implementation of an event loop (or maybe an
alternative policy - I'm not sure I understand the need for policies
yet). So it may simply be a case of ripping coroutines.py, futures.py,
locks.py, log.py, queues.py, and tasks.py out of asyncio and adding a
new equivalent of events.py with my "minimal" loop in it. (So far,
when I've tried to do that I get hit with some form of circular import
problem - I've not worked out why yet, or how asyncio avoids the same
problem).

That in itself would probably be a useful refactoring, splitting out
the IO aspects of asyncio from the event loop / async aspects.

> I don't see any particular hurry to get this in before 3.5; the refactoring
> of asyncio can be done later, in a backward compatible way. It would be a
> good way to test the architecture of asyncio!

Agreed. It's also not at all clear to me how the new async/await
syntax would fit in with this, so that probably needs some time to
settle down. For example, in Python 3.5 would run_until_complete take
an awaitable rather than a Future?

Paul

From v+python at g.nevcal.com  Mon May 11 22:38:29 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Mon, 11 May 2015 13:38:29 -0700
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <CADiSq7cbK+BQeBrMqBt3su0CybYsZtH82tXpnG0q_JtHQ2B2Ew@mail.gmail.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>	<CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>	<CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>	<CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>	<CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>	<87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>	<CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>	<87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>	<CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>	<87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>	<CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>	<554BBBCA.9090909@v.loewis.de>	<CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>	<CACvLUan6YQoj5=ECckCKZbEjVoxBna_LTt9mVfP2ewm6aOkGwQ@mail.gmail.com>
 <CADiSq7cbK+BQeBrMqBt3su0CybYsZtH82tXpnG0q_JtHQ2B2Ew@mail.gmail.com>
Message-ID: <55511345.1020507@g.nevcal.com>

On 5/11/2015 1:09 AM, Nick Coghlan wrote:
> On 10 May 2015 at 23:28, Adam Barto? <drekin at gmail.com> wrote:
>> Glenn Linderman wrote:
>>> Is this going to get released in 3.5, I hope?  Python 3 is pretty
>>> limited without some solution for Unicode on the console... probably the
>>> biggest deficiency I have found in Python 3, since its introduction. It
>>> has great Unicode support for files and processing, which convinced me
>>> to switch from Perl, and I like so much else about it, that I can hardly
>>> code in Perl any more (I still support a few Perl programs, but have
>>> ported most of them to Python).
>> I'd love to see it included in 3.5, but I doubt that will happen. For one
>> thing, it's only two weeks till beta 1, which is feature freeze. And mainly,
>> my package is mostly hacking into existing Python environment. A proper
>> implementation would need some changes in Python someone would have to do.
>> See for example my proposal http://bugs.python.org/issue17620#msg234439. I'm
>> not competent to write a patch myself and I have also no feedback to the
>> proposed idea. On the other hand, using the package is good enough for me so
>> I didn't further bring attention to the proposal.
> Right, and while I'm interested in seeing this improved, I'm not
> especially familiar with the internal details of our terminal
> interaction implementation, and even less so when it comes to the
> Windows terminal. Steve Dower's also had his hands full working on the
> Windows installer changes, and several of our other Windows folks
> aren't C programmers.
>
> PEP 432 (the interpreter startup sequence improvements) will be back
> on the agenda for Python 3.6, so the 3.6 time frame seems more
> plausible at this point.
>
> Cheers,
> Nick.
>
Wow!  Another bug that'll reach a decade in age before being fixed...

From guido at python.org  Tue May 12 00:05:36 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 11 May 2015 15:05:36 -0700
Subject: [Python-Dev] Minimal async event loop and async utilities (Was:
 PEP 492: async/await in Python; version 4)
In-Reply-To: <CACac1F9a5eV4O0rjLvGADc1JrXjxdX7PW9g6TPpvgbk+b8vf8Q@mail.gmail.com>
References: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
 <CAP7+vJ+KL5NmLJZLuKbDYX8i2V_zZPUNaqXmfacxb7Cj43jqgg@mail.gmail.com>
 <CACac1F9a5eV4O0rjLvGADc1JrXjxdX7PW9g6TPpvgbk+b8vf8Q@mail.gmail.com>
Message-ID: <CAP7+vJKt1PLFJ8peP2JEk8oJSUh4g9DPC40hXGPasiO7ZTWQJQ@mail.gmail.com>

On Mon, May 11, 2015 at 1:37 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 6 May 2015 at 16:46, Guido van Rossum <guido at python.org> wrote:
> > This is actually a great idea, and I encourage you to go forward with it.
> > The biggest piece missing from your inventory is probably Task, which is
> > needed to wrap a Future around a coroutine.
>
> OK, I've been doing some work on this. You're right, the asyncio
> framework makes Future a key component.
>
> But I'm not 100% sure why Future (and Task) have to be so fundamental.
> Ignoring cancellation (see below!) I can build pretty much all of a
> basic event loop, plus equivalents of the asyncio locks and queues
> modules, without needing the concept of a Future at all. The
> create_task function becomes simply a function to add a coroutine to
> the ready queue, in this context. I can't return a Task (because I
> haven't implemented the Task or Future classes) but I don't actually
> know what significant functionality is lost as a result - is there a
> reasonably accessible example of where using the return value from
> create_task is important anywhere?
>

In asyncio the Task object is used to wait for the result. Of course if all
you need is to wait for the result you don't need to call create_task() --
so in your situation it's uninteresting. But Task is needed for
cancellation and Future is needed so I/O completion can be implemented
using callback functions.


> A slightly more complicated issue is with the run_until_complete
> function, which takes a Future, and hence is fundamentally tied to the
> Future API. However, it seems to me that a "minimal" implementation
> could work by having a run_until_complete() that just took an
> awaitable (i.e., anything that you can yield from). Again, is there a
> specific reason that you ended up going with run_until_complete taking
> a Future rather than just a coroutine?


Actually it takes a Future *or* a coroutine. (The docs or the arg name may
be confusing.) In asyncio, pretty much everything that takes one takes the
other.


> I think (but haven't confirmed
> yet by implementing it) that it should be possible to create a
> coroutine that acts like a Future, in the sense that you can tell it
> from outside (via send()) that it's completed and set its return
> value. But this is all theory, and if you have any practical
> experience that shows I'm going down a dead end, I'd be glad to know.
>

I don't know -- I never explored that.


> I'm not sure how useful this line of attack will be - if the API isn't
> compatible with asyncio.BaseEventLoop, it's not very useful in
> practice. On the other hand, if I can build a loop without Future or
> Task classes, it may indicate that those classes aren't quite as
> fundamental as asyncio makes them (which may allow some
> simplifications or generalisations).
>

Have you tried to implement waiting for I/O yet?

OTOH you may look at micropython's uasyncio -- IIRC it doesn't have Futures
and it definitely has I/O waiting.


> > I expect you'll also want to build cancellation into your "base async
> > framework"; and the primitives to wait for multiple awaitables. The next
> > step would be some mechanism to implement call_later()/call_at() (but
> this
> > needs to be pluggable since for a "real" event loop it needs to be
> > implemented by the basic I/O selector).
>
> These are where I suspect I'll have the most trouble if I haven't got
> a solid understanding of the role of the Future and Task classes (or
> alternatively, how to avoid them :-)) So I'm holding off on worrying
> about them for now. But certainly they need to be covered. In
> particular, call_later/call_at are the only "generic" example of any
> form of wait that actually *waits*, rather than returning immediately.
> So as you say, implementing them will show how the basic mechanism can
> be extended with a "real" selector (whether for I/O, or GUI events, or
> whatever).
>

Right.


> > If you can get this working it would be great to include this in the
> stdlib
> > as a separate "asynclib" library. The original asyncio library would
> then be
> > a specific implementation (using a subclass of asynclib.EventLoop) that
> adds
> > I/O, subprocesses, and integrates with the selectors module (or with
> IOCP,
> > on Windows).
>
> One thing I've not really considered in the above, is how a
> refactoring like this would work. Ignoring the "let's try to remove
> the Future class" approach above, my "basic event loop" is mostly just
> an alternative implementation of an event loop (or maybe an
> alternative policy - I'm not sure I understand the need for policies
> yet).


A policy is mostly a wrapper around an event loop factory plus state that
records the current event loop.


> So it may simply be a case of ripping coroutines.py, futures.py,
> locks.py, log.py, queues.py, and tasks.py out of asyncio and adding a
> new equivalent of events.py with my "minimal" loop in it. (So far,
> when I've tried to do that I get hit with some form of circular import
> problem - I've not worked out why yet, or how asyncio avoids the same
> problem).
>

That sounds like a surface problem. Keep on debugging. :-)


> That in itself would probably be a useful refactoring, splitting out
> the IO aspects of asyncio from the event loop / async aspects.
>

Well, if you can.


> > I don't see any particular hurry to get this in before 3.5; the
> refactoring
> > of asyncio can be done later, in a backward compatible way. It would be a
> > good way to test the architecture of asyncio!
>
> Agreed. It's also not at all clear to me how the new async/await
> syntax would fit in with this, so that probably needs some time to
> settle down. For example, in Python 3.5 would run_until_complete take
> an awaitable rather than a Future?
>

It doesn't need to change -- it already calls async() on its argument
before doing anything (though with PEP 492 that function will be renamed to
ensure_future()).
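(A quick demonstration that a coroutine and a Future are interchangeable here -- sketched with a current event-loop API, using the PEP 492 syntax:)

```python
import asyncio

async def answer():
    return 42

loop = asyncio.new_event_loop()
try:
    # A bare coroutine: run_until_complete() wraps it in a Task itself.
    r1 = loop.run_until_complete(answer())
    # An explicit Future works identically.
    fut = loop.create_future()
    fut.set_result(99)
    r2 = loop.run_until_complete(fut)
finally:
    loop.close()
print(r1, r2)  # prints 42 99
```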

-- 
--Guido van Rossum (python.org/~guido)

From nad at acm.org  Tue May 12 01:05:58 2015
From: nad at acm.org (Ned Deily)
Date: Mon, 11 May 2015 16:05:58 -0700
Subject: [Python-Dev] Mac popups running make test
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CAP1=2W4cG+pcPJUzNyd_FTEwsFpyOU7HKygRoA1DVwxM-wcBNw@mail.gmail.com>
 <CALWZvp67xpJcRXQ1=v=r8ixsUdE1dSANyoqa7uuKD2=FqgL_Eg@mail.gmail.com>
 <554F9E70.3010509@willingconsulting.com>
 <CALWZvp4D4C3+z=Xw2USvQTDXsZhQMw9o6zoyZVM2UeECaW2QxA@mail.gmail.com>
Message-ID: <nad-7C99D0.16055811052015@news.gmane.org>

In article 
<CALWZvp4D4C3+z=Xw2USvQTDXsZhQMw9o6zoyZVM2UeECaW2QxA at mail.gmail.com>,
 Tal Einat <taleinat at gmail.com> wrote:
> On Sun, May 10, 2015 at 9:07 PM, Carol Willing <
> willingc at willingconsulting.com> wrote:
> > On 5/10/15 10:29 AM, Tal Einat wrote:
> >  On Sun, May 10, 2015 at 5:07 PM, Brett Cannon <brett at python.org> wrote:
> >> On Sun, May 10, 2015 at 10:04 AM Skip Montanaro <skip.montanaro at gmail.com>
> >> wrote:
> >>> I haven't run the test suite in awhile. I am in the midst of running it
> >>> on my Mac running Yosemite 10.10.3. Twice now, I've gotten this popup:
> >>>  I assume this is testing some server listening on localhost. Is this a
> >>> new thing, either with the Python test suite or with Mac OS X? (I'd
> >>> normally be hidden behind a NAT firewall, but at the moment I am on a
> >>> miserable public connection in a Peet's Coffee, so it takes on slightly
> >>> more importance...)
> >>  It's not new.
> >  Indeed, I've run into this as well.
> >>>  I've also seen the Crash Reporter pop up many times, but as far as I
> >>> could tell, in all cases the test suite output told me it was expected.
> >>> Perhaps tests which listen for network connections should also mention
> >>> that, at least on Macs?
> >>  Wouldn't hurt. Just requires tracking down which test(s) triggers it
> >> (might be more than one and I don't know if answering that popup applies
> >> for the rest of the test execution or once per test if you use -j).
> >  If anyone starts working on this, let me know if I can help, e.g. trying
> > things on my own Mac.
> >    I believe that the message has to do with OS X's sandboxing
> > implementation and the setting of the sandbox's entitlement keys. Here's an
> > Apple doc:
> > https://developer.apple.com/library/ios/documentation/Miscellaneous/Referenc
> > e/EntitlementKeyReference/Chapters/EnablingAppSandbox.html#//apple_ref/doc/u
> > id/TP40011195-CH4-SW9
> > I'm unaware of a way to work around this other than using Apple's code
> > signing or adjusting target build settings in XCode :( If anyone knows a
> > good way to workaround or manually set permission (other than clicking the
> > Allow button), I would be interested.
> I was reading about this a few weeks ago and recall finding a way to ad-hoc
> sign the built python executable. Here's a link below. I haven't tried
> this, though, and don't know if it would work with a python executable
> rather than a proper OSX app. If it does work, it would be useful to add
> this as a tool and/or mention it in the developer docs.
> 
> http://apple.stackexchange.com/a/121010

I believe the issue has to do with the OS X application firewall and not 
sandboxing, as vanilla Python on OS X is not sandboxed.  See:

https://support.apple.com/en-us/HT201642

As described there, codesigned applications are automatically authorized 
to accept inbound connections; that's the workaround proposed in the 
apple.stackexchange cite.  But arbitrarily signing development binaries 
after every compile is probably not a good idea.  Another option is to 
configure the firewall but that probably only can be made to work with a 
framework build of Python which launches Python within an app bundle.  
In any case, please open an issue on the bug tracker so we can follow up 
on this.

-- 
 Ned Deily,
 nad at acm.org


From ezio.melotti at gmail.com  Tue May 12 04:24:56 2015
From: ezio.melotti at gmail.com (Ezio Melotti)
Date: Tue, 12 May 2015 05:24:56 +0300
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
Message-ID: <CACBhJdEU9e_6_xy9iH0WL5SFmZVhSNwjPS5AZ82B=CQCcgqbAQ@mail.gmail.com>

On Sun, May 10, 2015 at 5:04 PM, Skip Montanaro
<skip.montanaro at gmail.com> wrote:
>
> ...
> I've also seen the Crash Reporter pop up many times,

I don't know how to get rid of the popup you mentioned, but Windows
had problems with the crash popups for a long time.
Eventually it got fixed:
  https://hg.python.org/cpython/file/default/Lib/test/support/__init__.py#l2202
  http://bugs.python.org/issue11732
  http://bugs.python.org/issue18948
  http://bugs.python.org/issue23314

Perhaps Mac OS has something similar too?
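(The Windows fix linked above is a context manager in test.support; a hedged sketch of how a test uses it -- whether it also silences the OS X reporter is exactly the open question:)

```python
from test.support import SuppressCrashReport

# On Windows this disables the error-reporting dialog for the duration of
# the block; on POSIX it temporarily zeroes the core-dump resource limit.
with SuppressCrashReport():
    pass  # code expected to crash (e.g. in a subprocess) goes here
```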

Best Regards,
Ezio Melotti

>
> but as far as I could tell, in all cases the test suite output told me it was expected. Perhaps tests which listen for network connections should also mention that, at least on Macs?
>
> Thx,
>
> Skip
>

From ncoghlan at gmail.com  Tue May 12 05:03:25 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 13:03:25 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <5550F9D4.5050701@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
Message-ID: <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>

On 12 May 2015 at 04:49, M.-A. Lemburg <mal at egenix.com> wrote:
> On 11.05.2015 12:15, Nick Coghlan wrote:
>> By contrast, the configuration file shouldn't provide a new attack
>> vector (or simplify any existing attack vector), as if you have the
>> permissions needed to modify the config file, you likely also have the
>> permissions needed to modify the system certificate store, and the
>> latter is a *far* more interesting attack vector than a downgrade
>> attack solely against Python.
>>
>> Thus the environment variable based off switch is neither necessary
>> (as an administrator controlled configuration file can do the job),
>> nor sufficient (it can't handle the -E switch), *and* it represents an
>> increase in the attack surface area relative to a Python
>> implementation without the capability.
>
> Whether or not -E will have an effect on the env var depends
> on the implementation. At the moment, -E only has an effect
> on the C runtime, while the stdlib happily reads from os.environ
> without taking the flag into account.

I had an off-list discussion with Christian Heimes about that in
relation to the OpenSSL flags, and he pointed out the reason -E
specifically needs to be a command line switch is because that is the
only way to affect how environment variables are processed during
interpreter startup. Once an application is up and running, further
environment variable sanitisation can be handled at an application
level by whitelisting entries in os.environ and deleting everything
else.

> As proposed, the PYTHONHTTPSVERIFY would only affect the ssl
> module and only be checked when loading this module, i.e. not
> at Python startup time.

Right, the same is true for the configuration file proposal.

>> It would be nice to hear from ActiveState, Enthought & Continuum
>> Analytics as well, but if they don't chime in here, I don't see any
>> particular need to go chasing them explicitly.
>
> I think the approach to only consider a subset of redistributors
> as viable targets for such a switch is a bit too narrow.
>
> You are leaving out all the parties which use custom
> Python installations to run their applications, e.g.
> the Plone and Zope community, the ZenOSS community,
> the many Windows applications built on Python, etc.

No, they already have a solution: monkeypatch (or just plain patch)
the SSL module. That's an upstream supported technique, which is why
it's documented in the PEP.
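(For reference, the documented technique is essentially a two-liner; the hasattr() guard keeps it harmless on older interpreters that predate the verification change:)

```python
import ssl

# The opt-out documented alongside PEP 476: swap the default HTTPS
# context factory for the non-verifying one. This disables certificate
# verification, so treat it strictly as a transitional measure.
if hasattr(ssl, "_create_unverified_context"):
    ssl._create_default_https_context = ssl._create_unverified_context
```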

The problem we (as in Red Hat) ran into was that that technique
*doesn't work* for the case of backporting PEP 476 to Python 2.7.5 as
an opt-in feature.

>>> to fix this regression in 2.7.9.
>>
>> We made the decision when PEP 476 was accepted that this change turned
>> a silent security failure into a noisy one, rather than being a
>> regression in its own right. PEP 493 isn't about disagreeing with that
>> decision, it's about providing a smoother upgrade path in contexts
>> where letting the security failure remain silent is deemed to be
>> preferred in the near term.
>
> The change wasn't a regression. The missing downgrade path
> is a regression.

It's a shame we don't have "-X" options in Python 2, as that would be
a nice hard-to-attack option (although it wouldn't play well with
subprocesses).

> Some other comments on PEP 493:
>
> * I don't think we really want to add the overhead of
>   having to parse an INI file every time Python starts up.
>   Please remember that we removed parsing of the sysconfig
>   data at startup not long ago precisely because we wanted
>   to avoid this startup time.

Compared to the overhead of reading from the system cert database,
reading a config file at ssl module import time should be trivial.

> * I don't see why the attack surface of using an INI file
>   somewhere in the system should be smaller than e.g. using
>   sitecustomize.py

You can put sitecustomize.py in a user directory, and if there's no
system wide sitecustomize, Python will read it automatically (unless
user site directories are turned off).
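(Locating that per-user directory is a one-liner, for anyone who wants to try it:)

```python
import site

# A sitecustomize.py placed in this directory is imported automatically
# at startup, unless user site is disabled (-s or PYTHONNOUSERSITE).
print(site.getusersitepackages())
```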

> * If done properly, we'd also need a switch to ignore this
>   global config file, and recommend using that switch to
>   reduce the attack surface (for the same reason you
>   explain in the PEP)

No, the recommendation there would be to upgrade to a newer version of
Python that doesn't offer this downgrade capability. It's a proposal
for a transition smoothing technique, not a permanent capability (I
did suggest the latter at one point, but the discussion on the issue
tracker persuaded me that was a bad idea, with the increased attack
surface being a key part of that change of heart).

> * I don't think a global switch is the right way forward.
>   Many applications on properly configured systems will
>   work fine with the new default. The downgrade option is
>   only needed for those cases where they don't and you
>   don't have a good way to fix the application.

And techniques like chroots and containers let you do that selectively.

The key thing I'm after is an agreed technique for backporting to
earlier 2.7.x releases that allows PEP 476 to be provided as an opt-in
capability, rather than gating it on folks upgrading to 2.7.9, which
isn't going to happen for *years* in a great many environments (Ubuntu
14.04 LTS, for example, doesn't go end of life until 2019 and ships
2.7.6 + non-intrusive security patches, while RHEL 7 doesn't go end of
life until 2024 and ships 2.7.5 + compatible backports that address
customer problems).

While other Linux vendors have currently decided to leave the HTTPS
problem unfixed in their long term support releases (due to the risk
of causing service failures in customer environments), I'm hoping they
may revisit those decisions if there's a specific technique already
agreed with upstream for backporting the capability in a way that
makes it an opt-in feature that customers can switch on independently
of the inclusion of the feature backport in the system Python.

> * Most applications use some kind of virtualenv
>   Python environment to run the code. These are
>   typically isolated from the system Python installation
>   and so wouldn't want to use a system wide global INI
>   either.

Having a separate configuration setting that controls verification in
virtual environments is a good idea. That could also provide the
per-application opt-out capability that you're after.

> * The -S switch completely disables importing site.py.
>   That's not really a viable solution in the age of
>   pip - your local installation wouldn't find the installed
>   packages anymore, since these are installed in site-packages/
>   which again, is set up by site.py.

Yes, getting an administrative application to the point where -S can
be used means getting it to a point where it has *no* Python
dependencies outside the standard library. It can certainly be done,
but often won't be worth the hassle. As a result, using -s to turn off
the user site directory and -E to turn off PYTHONPATH processing are
the more common sys.path related hardening techniques in Python 2.7.
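The effect of those two switches is visible from inside the interpreter via `sys.flags`; a quick sketch for checking which hardening options a process was started with:

```python
# sys.flags records the hardening switches Nick mentions: no_user_site
# is set by "-s", ignore_environment by "-E". Both are 0 in a plain run.
import sys

print(sys.flags.no_user_site)        # 1 when run as "python -s ..."
print(sys.flags.ignore_environment)  # 1 when run as "python -E ..."
```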

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Tue May 12 05:59:21 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 13:59:21 +1000
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <55511345.1020507@g.nevcal.com>
References: <CACvLUamoX2H9_KWTTP-gX1xorqtvAYo0sXNq3Uvmbgz0S7oTCA@mail.gmail.com>
 <CAMpsgwaSO=gsoLKtzC8HqA6GyBTEdQcNw+2N-L3t5A+HqsZ9dw@mail.gmail.com>
 <CACvLUamzsZ_xYP7_jtBq=HfMh=MEOv7E96pLhkDQ_8-37+heZQ@mail.gmail.com>
 <CAP7+vJKW64ZA65fCsrvNeVA84UD-Czip=JW+cE8fyyQYk6E=_A@mail.gmail.com>
 <CACvLUamm89dK1bpim6LXHv9VrZDCNXqL9SXV_cvXy-QM73mgRw@mail.gmail.com>
 <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUan8qT4MWgWTq8z70ZhvcyqzxFR7QA+t=xr44TA=XwXqaA@mail.gmail.com>
 <87383horxx.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUanOxU3_8Frt5h_Bu0LwcYe-bkNTjo46Na1Shts0EgY-Zg@mail.gmail.com>
 <87sibfnf0x.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CACvLUa=riBbRk2tDKB3+fh0WOd8b6S07XPr=8Fe5TDwg6nVVYA@mail.gmail.com>
 <554BBBCA.9090909@v.loewis.de>
 <CACvLUakvdygYJ757PeD98uCFdnjXXnPZNTZ6Lv0CsCUrLmXHeQ@mail.gmail.com>
 <CACvLUan6YQoj5=ECckCKZbEjVoxBna_LTt9mVfP2ewm6aOkGwQ@mail.gmail.com>
 <CADiSq7cbK+BQeBrMqBt3su0CybYsZtH82tXpnG0q_JtHQ2B2Ew@mail.gmail.com>
 <55511345.1020507@g.nevcal.com>
Message-ID: <CADiSq7cfocZ_Xxi+jeYYf3M5NZ2gO+C65ZtGbT9Fyan3OT2YPw@mail.gmail.com>

On 12 May 2015 at 06:38, Glenn Linderman <v+python at g.nevcal.com> wrote:
> On 5/11/2015 1:09 AM, Nick Coghlan wrote:
> On 10 May 2015 at 23:28, Adam Bartoš <drekin at gmail.com> wrote:
> I'd love to see it included in 3.5, but I doubt that will happen. For one
> thing, it's only two weeks till beta 1, which is feature freeze. And mainly,
> my package mostly hacks into the existing Python environment. A proper
> implementation would need some changes in Python that someone would have to
> make. See for example my proposal http://bugs.python.org/issue17620#msg234439.
> I'm not competent to write a patch myself and I have also had no feedback on
> the proposed idea. On the other hand, using the package is good enough for
> me, so I didn't bring further attention to the proposal.
>
> Right, and while I'm interested in seeing this improved, I'm not
> especially familiar with the internal details of our terminal
> interaction implementation, and even less so when it comes to the
> Windows terminal. Steve Dower's also had his hands full working on the
> Windows installer changes, and several of our other Windows folks
> aren't C programmers.
>
> PEP 432 (the interpreter startup sequence improvements) will be back
> on the agenda for Python 3.6, so the 3.6 time frame seems more
> plausible at this point.
>
> Cheers,
> Nick.
>
> Wow!  Another bug that'll reach a decade in age before being fixed...

Yep, that tends to happen with complex cross-platform bugs & RFEs that
require domain expertise in multiple areas to resolve. It's one of the
areas that operating system vendors are typically best equipped to
handle, but we haven't historically had that kind of major
institutional backing for CPython core development (that *is*
changing, but it's a relatively recent phenomenon).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mal at egenix.com  Tue May 12 09:57:32 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 12 May 2015 09:57:32 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>	<CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>	<5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
Message-ID: <5551B26C.2080800@egenix.com>

On 12.05.2015 05:03, Nick Coghlan wrote:
> On 12 May 2015 at 04:49, M.-A. Lemburg <mal at egenix.com> wrote:
>> On 11.05.2015 12:15, Nick Coghlan wrote:
>>> By contrast, the configuration file shouldn't provide a new attack
>>> vector (or simplify any existing attack vector), as if you have the
>>> permissions needed to modify the config file, you likely also have the
>>> permissions needed to modify the system certificate store, and the
>>> latter is a *far* more interesting attack vector than a downgrade
>>> attack solely against Python.
>>>
>>> Thus the environment variable based off switch is neither necessary
>>> (as an administrator controlled configuration file can do the job),
>>> nor sufficient (it can't handle the -E switch), *and* it represents an
>>> increase in the attack surface area relative to a Python
>>> implementation without the capability.
>>
>> Whether or not -E will have an effect on the env var depends
>> on the implementation. At the moment, -E only has an effect
>> on the C runtime, while the stdlib happily reads from os.environ
>> without taking the flag into account.
> 
> I had an off-list discussion with Christian Heimes about that in
> relation to the OpenSSL flags, and he pointed out the reason -E
> specifically needs to be a command line switch is because that is the
> only way to affect how environment variables are processed during
> interpreter startup. Once an application is up and running, further
> environment variable sanitisation can be handled at an application
> level by whitelisting entries in os.environ and deleting everything
> else.
> 
>> As proposed, the PYTHONHTTPSVERIFY would only affect the ssl
>> module and only be checked when loading this module, i.e. not
>> at Python startup time.
> 
> Right, the same is true for the configuration file proposal.
> 
>>> It would be nice to hear from ActiveState, Enthought & Continuum
>>> Analytics as well, but if they don't chime in here, I don't see any
>>> particular need to go chasing them explicitly.
>>
>> I think the approach to only consider a subset of redistributors
>> as viable targets for such a switch is a bit too narrow.
>>
>> You are leaving out all the parties which use custom
>> Python installations to run their applications, e.g.
>> the Plone and Zope community, the ZenOSS community,
>> the many Windows applications built on Python, etc.
> 
> No, they already have a solution: monkeypatch (or just plain patch)
> the SSL module. That's an upstream supported technique, which is why
> it's documented in the PEP.
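(For reference, the monkeypatch in question is the opt-out documented in PEP 476: swap the default HTTPS context factory for the unverified one. `_create_unverified_context` is private API, so this is deliberately a last-resort shim, not a recommended pattern:)

```python
# PEP 476's documented opt-out: globally replace the default HTTPS
# context factory with the unverified one. This disables certificate
# verification for every stdlib HTTPS client in the process.
import ssl

ssl._create_default_https_context = ssl._create_unverified_context
```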

Again, this is not a proper solution for your friendly
sys admin. Sys admins won't go in and patch Python; they'll
consider it broken if it doesn't provide a configuration
option, and simply stay with 2.7.8.

We've had that discussion already, so I won't go into details
again.

> The problem we (as in Red Hat) ran into was that that technique
> *doesn't work* for the case of backporting PEP 476 to Python 2.7.5 as
> an opt-in feature.

And neither does it work for people deploying Windows apps
built on Python (where the code is usually hidden away
in a ZIP archive or even compiled into a DLL), or people
deploying Plone or ZenOSS.

It's not only Red Hat's customers that are affected :-)

>>>> to fix this regression in 2.7.9.
>>>
>>> We made the decision when PEP 476 was accepted that this change turned
>>> a silent security failure into a noisy one, rather than being a
>>> regression in its own right. PEP 493 isn't about disagreeing with that
>>> decision, it's about providing a smoother upgrade path in contexts
>>> where letting the security failure remain silent is deemed to be
>>> preferred in the near term.
>>
>> The change wasn't a regression. The missing downgrade path
>> is a regression.
> 
> It's a shame we don't have "-X" options in Python 2, as that would be
> a nice hard-to-attack option (although it wouldn't play well with
> subprocesses)
> 
>> Some other comments on PEP 493:
>>
>> * I don't think we really want to add the overhead of
>>   having to parse an INI file every time Python starts up.
>>   Please remember that we removed parsing of the sysconfig
>>   data not long ago because we wanted to avoid this startup
>>   time.
> 
> Compared to the overhead of reading from the system cert database,
> reading a config file at ssl module import time should be trivial.

The cert database is only read when importing the ssl
module, not with each Python startup, so that makes a
big difference.

The INI file would have to be parsed at ssl module load time
as well to work around this.
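To make that concrete: a sketch of what such an import-time check might look like. The path and option names follow the PEP 493 draft (`/etc/python/cert-verification.cfg`, section `[https]`, key `verify`), but treat those details as assumptions rather than shipped behaviour:

```python
# Hypothetical import-time check for an ssl-module configuration file.
# Parsing happens once, when the ssl module is first imported, so it
# adds nothing to interpreter startup for programs that never use ssl.
try:
    import configparser                  # Python 3
except ImportError:
    import ConfigParser as configparser  # Python 2

def https_verification_enabled(path="/etc/python/cert-verification.cfg"):
    """Return False only if the file explicitly disables verification."""
    parser = configparser.ConfigParser()
    if not parser.read(path):            # missing file: keep the secure default
        return True
    try:
        value = parser.get("https", "verify")
    except (configparser.NoSectionError, configparser.NoOptionError):
        return True
    return value.strip().lower() != "disable"
```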

>> * I don't see why the attack surface of using an INI file
>>   somewhere in the system should be smaller than e.g. using
>>   sitecustomize.py
> 
> You can put sitecustomize.py in a user directory, and if there's no
> system wide sitecustomize, Python will read it automatically (unless
> user site directories are turned off).

In a user-based installation (which most applications shipping
their own Python installation are), you can always do this,
provided you can gain the application user's permissions.

For a system-installed Python, you'd always have to run
Python with -E -S -s to avoid this, but then you'd also
lose the system wide site-packages dir, so I'm not sure
whether that's really an option.

>> * If done right, we'd also need a switch to ignore this
>>   global config file and recommend using it to reduce the
>>   attack surface (for the same reason you explain in the
>>   PEP)
> 
> No, the recommendation there would be to upgrade to a newer version of
> Python that doesn't offer this downgrade capability. It's a proposal
> for a transition smoothing technique, not a permanent capability (I
> did suggest the latter at one point, but the discussion on the issue
> tracker persuaded me that was a bad idea, with the increased attack
> surface being a key part of that change of heart).

You mean the config file approach would only exist for
Python 2.7?

>> * I don't think a global switch is the right way forward.
>>   Many applications on properly configured systems will
>>   work fine with the new default. The downgrade option is
>>   only needed for those cases, where they don't and you
>>   don't have a good way to fix the application.
> 
> And techniques like chroots and containers let you do that selectively.

Nick, I lost you there.

Are you suggesting that sys admins who want to upgrade to Python 2.7.9
and need the downgrade option should put the whole application
into a container to work around a design deficiency in the
downgrade option?

> The key thing I'm after is an agreed technique for backporting to
> earlier 2.7.x releases that allows PEP 476 to be provided as an opt-in
> capability, rather than gating it on folks upgrading to 2.7.9, which
> isn't going to happen for *years* in a great many environments (Ubuntu
> 14.04 LTS, for example, doesn't go end of life until 2019 and ships
> 2.7.6 + non-intrusive security patches, while RHEL 7 doesn't go end of
> life until 2024 and ships 2.7.5 + compatible backports that address
> customer problems).
> 
> While other Linux vendors have currently decided to leave the HTTPS
> problem unfixed in their long term support releases (due to the risk
> of causing service failures in customer environments), I'm hoping they
> may revisit those decisions if there's a specific technique already
> agreed with upstream for backporting the capability in a way that
> makes it an opt-in feature that customers can switch on independently
> of the inclusion of the feature backport in the system Python.

Again, you're only seeing Red Hat's customers as the target
for this. Not everyone in this world runs on Red Hat, even
if you'd probably like that :-)

OS distributions can easily patch their system Python
as needed.

The point here is that sys admins should not have to
patch Python to make things work again in case an
application is not prepared for certificate
verification - which is rather likely, since Python
before 2.7.9 doesn't even provide the necessary APIs
to pass certificate locations to urllib or urllib2.

>> * Most applications use some kind of virtualenv
>>   Python environment to run the code. These are
>>   typically isolated from the system Python installation
>>   and so wouldn't want to use a system wide global INI
>>   either.
> 
> Having a separate configuration setting that controls verification in
> virtual environments is a good idea. That could also provide the
> per-application opt-out capability that you're after.

The env var would already enable virtualenv to set this up
automatically, without any local config file.

To cover all the different cases, you will inevitably have to
provide a system wide config file, a user one and a local one,
just like we have for distutils:

https://docs.python.org/2/install/index.html#distutils-configuration-files

>> * The -S switch completely disables importing site.py.
>>   That's not really a viable solution in the age of
>>   pip - your local installation wouldn't find the installed
>>   packages anymore, since these are installed in site-packages/
>>   which again, is set up by site.py.
> 
> Yes, getting an administrative application to the point where -S can
> be used means getting it to a point where it has *no* Python
> dependencies outside the standard library. It can certainly be done,
> but often won't be worth the hassle. As a result, using -s to turn off
> the user site directory and -E to turn off PYTHONPATH processing are
> the more common sys.path related hardening techniques in Python 2.7.

Overall, I find the config file approach too big a hammer for
this simple problem.

If there were more use cases for a Python config file, the added
overhead could pay off, but then we should put more thought into
optimizing the config file load time and probably end up using
a Python module as config file (which provides these optimizations
for free).

In the end, we'd be introducing another sitecustomize.py,
usercustomize.py and perhaps localcustomize.py - only with
fixed locations rather than importing them via sys.path.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 12 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From me at the-compiler.org  Mon May 11 09:40:53 2015
From: me at the-compiler.org (Florian Bruhin)
Date: Mon, 11 May 2015 09:40:53 +0200
Subject: [Python-Dev] anomaly
In-Reply-To: <CAMjeLr-67ZJNF3N5nWuDNGRWjOY_B41++737TxC4V6DYFH1t5Q@mail.gmail.com>
References: <CAMjeLr--RfnfgWc1hdOcHiEBpq3QwC7SZk-m14wM65JtpbgMLQ@mail.gmail.com>
Message-ID: <20150511074053.GG429@tonks>

* Mark Rosenblitt-Janssen <dreamingforward at gmail.com> [2015-05-10 11:34:52 -0500]:
> Here's something that might be wrong in Python (tried on v2.7):
> 
> >>> class int(str): pass
> 
> >>> int(3)
> '3'

What's so odd about this? "class int" is an assignment to "int", i.e.
what you're doing here is basically:

int = str
int(3)  # really str(3)

* Mark Rosenblitt-Janssen <dreamingforward at gmail.com> [2015-05-10 19:14:18 -0500]:
> In case the example given at the start of the thread wasn't
> interesting enough, it also works in the other direction:
> 
> >>> class str(int):  pass
> 
> >>> str('2')
> 2  #<----- an integer!!!

Same thing. You're shadowing the builtin.
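(Florian's point, spelled out: the class statement is just an assignment to a name in the current namespace; the builtin object itself is untouched and stays reachable:)

```python
# "class int(str)" rebinds the name "int" in this namespace only.
class int(str):          # shadows the builtin int here
    pass

shadowed = int(3)        # really str(3)

import builtins          # on Python 2 this module is named __builtin__
original_int = builtins.int

del int                  # remove the shadow; the builtin is visible again
restored = int("3")
```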

Florian

-- 
http://www.the-compiler.org | me at the-compiler.org (Mail/XMPP)
   GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc
         I love long mails! | http://email.is-not-s.ms/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 819 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150511/04d69d80/attachment-0001.sig>

From rkuska at redhat.com  Mon May 11 14:15:59 2015
From: rkuska at redhat.com (Robert Kuska)
Date: Mon, 11 May 2015 08:15:59 -0400 (EDT)
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <39FA939A-D71E-4FC4-9AEB-8D9FCDF02CD4@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <89DA3F9D-F476-4711-BABC-A778E2F3FE06@stufft.io>
 <CADiSq7fAYLCiq_Yk7ikWUWbPiqk_LFzN5+J14w4fsaz+YL-n+g@mail.gmail.com>
 <39FA939A-D71E-4FC4-9AEB-8D9FCDF02CD4@stufft.io>
Message-ID: <1516155771.12933368.1431346559516.JavaMail.zimbra@redhat.com>



----- Original Message -----
> From: "Donald Stufft" <donald at stufft.io>
> To: "Nick Coghlan" <ncoghlan at gmail.com>
> Cc: "python-dev" <python-dev at python.org>, "M.-A. Lemburg" <mal at egenix.com>
> Sent: Monday, May 11, 2015 1:16:58 PM
> Subject: Re: [Python-Dev] PYTHONHTTPSVERIFY env var
> 
> 
> > On May 11, 2015, at 6:47 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> > 
> > On 11 May 2015 at 20:23, Donald Stufft <donald at stufft.io> wrote:
> >> On May 11, 2015, at 6:15 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> >>> We made the decision when PEP 476 was accepted that this change turned
> >>> a silent security failure into a noisy one, rather than being a
> >>> regression in its own right. PEP 493 isn't about disagreeing with that
> >>> decision, it's about providing a smoother upgrade path in contexts
> >>> where letting the security failure remain silent is deemed to be
> >>> preferred in the near term.
> >> 
> >> I don't really agree that the decision to disable TLS is an environment
> >> one,
> >> it's really a per application decision. This is why I was against having
> >> some
> >> sort of global off switch for all of Python because just because one
> >> application needs it turned off doesn't mean you want it turned off for
> >> another
> >> Python application.
> > 
> > The scenario I'm interested in is the one where it *was* off globally
> > (i.e. you were already running Python 2.7.8 or earlier) and you want
> > to manage a global rollout of a new Python version that supports being
> > configured to verify HTTPS certificates by default, while making the
> > decision on whether or not to enable HTTPS certificate verification on
> > a server-by-server basis, rather than having that decision be coupled
> > directly to the rollout of the updated version of Python.
> > 
> > I agree that the desired end state is where Python 3 is, and where
> > upstream Python 2.7.9+ is, this is solely about how to facilitate
> > folks getting from point A to point B without an intervening window of
> > "I broke the world and now my boss is yelling at me about it" :)
> > 
> 
> Oh, another issue that I forgot to mention--
> 
> A fair number of people had no idea that Python wasn't validating TLS before
> 2.7.9/3.4.3, however as part of the process of changing that in 2.7.9 a lot
> of people became aware that Pythons before 2.7.9 didn't validate but that
> Python 2.7.9+ does. I worry that if Red Hat (or anyone) ships a Python 2.7.9
> that doesn't verify by default then they are going to be shipping something
> which defies the expectations of those users who were relying on the fact
> that Python 2.7.9+ was supposed to be secure by default now. You're
> (understandably) focusing on "I already have my thing running on Python
> 2.7.8 and I want to yum update and get 2.7.9 and have things not visibly
> break", however there is the other use case of "I'm setting up a new
> environment, and I installed RHEL and got 2.7.9, I remembered reading in
> LWN that 2.7.9 verifies now so I must be safe". If you *do* provide such a
> switch, defaulting it to verify and having

We (Red Hat) will not update Python to 2.7.9; we ship 2.7.5 and backport
bugfixes/features based on user demand.

> people where that breaks go in and turn it off is probably a safer mechanism,
> since the case where 2.7.9 verification breaks things for people is a visible
> change, whereas the case where someone expects 2.7.9 to verify and it doesn't
> is not a visible change and is easily missed unless they go out of their way
> to test against a server with an invalid certificate.
> 
> Either way, if there is some sort of global off switch, having that off
> switch
> set to off should raise some kind of warning (like urllib3 does if you use
> the unverified HTTPS methods). To be clear, I don't mean that using the built
> in ssl module APIs to disable verification should raise a warning, I mean the
> hypothetical "make my Python insecurely access HTTPS" configuration file (or
> environment variable) that is being proposed.
> 
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
> 
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rkuska%40redhat.com
> 


Regards
Robert Kuska
{rkuska}

From ncoghlan at gmail.com  Tue May 12 10:56:07 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 18:56:07 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <5551B26C.2080800@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
Message-ID: <CADiSq7cE_gJtAdbsaLiwqwaHYm8D1LTHqYCoj_rGk_Kwnc+9mg@mail.gmail.com>

On 12 May 2015 at 17:57, M.-A. Lemburg <mal at egenix.com> wrote:
> The point here is that sys admins should not have to
> patch Python to make things work again in case an
> application is not prepared for certificate
> verification - which is rather likely, since Python
> before 2.7.9 doesn't even provide the necessary APIs
> to pass certificate locations to urllib or urllib2.

You've persuaded me that we should describe *both* configuration
mechanisms in the recommendations PEP (one for a cross-platform
environment variable based approach that also works for embedded
Python instances and user level distributions, one for a configuration
file based approach that only covers *nix system Python
installations), and leave the decision to redistributors as to which
approach makes the most sense for their particular audience.

Neither goes into upstream CPython 2.7, and neither goes into any
version of Python 3.

>> Yes, getting an administrative application to the point where -S can
>> be used means getting it to a point where it has *no* Python
>> dependencies outside the standard library. It can certainly be done,
>> but often won't be worth the hassle. As a result, using -s to turn off
>> the user site directory and -E to turn off PYTHONPATH processing are
>> the more common sys.path related hardening techniques in Python 2.7.
>
> Overall, I find the config file approach too big a hammer for
> this simple problem.

The problem of managing this security issue on behalf of customers
without risking breaking their world is in no way simple - hence why
most redistributors lobbed it into the "too hard" basket instead, and
why we've been discussing possible UX improvements with upstream
on-and-off for a couple of months now :)

> If there were more use cases for a Python config file, the added
> overhead could pay off, but then we should put more thought into
> optimizing the config file load time and probably end up using
> a Python module as config file (which provides these optimizations
> for free).

This is why the proposal is for an SSL specific configuration file,
loaded only when the SSL module is imported.

> In the end, we'd be introducing another sitecustomize.py,
> usercustomize.py and perhaps localcustomize.py - only with
> fixed locations rather than importing them via sys.path.

It won't come to that, as Linux system package managers don't support
any of these - that's what containers are for.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From donald at stufft.io  Tue May 12 12:04:32 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 12 May 2015 06:04:32 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <5551B26C.2080800@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
Message-ID: <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>


> On May 12, 2015, at 3:57 AM, M.-A. Lemburg <mal at egenix.com> wrote:
> 
> In a user based installation (which most applications shipping
> their own Python installation are), you can always do this
> provided you can gain the application user permissions.

Of course, if the application is shipping its own Python then
it has to actually do something to update to 2.7.9, and it can
add its own option to disable TLS verification. I personally
think that the application providing that option is the *right*
way, and all these other things are, at best, just temporary
shims until the applications do that.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 801 bytes
Desc: Message signed with OpenPGP using GPGMail
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/20adc00d/attachment.sig>

From mal at egenix.com  Tue May 12 12:56:56 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 12 May 2015 12:56:56 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<5541E0E6.6060701@egenix.com>	<CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>	<554C7944.2020905@egenix.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>	<CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>	<5550F9D4.5050701@egenix.com>	<CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>	<5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
Message-ID: <5551DC78.1090200@egenix.com>

On 12.05.2015 12:04, Donald Stufft wrote:
> 
>> On May 12, 2015, at 3:57 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>
>> In a user based installation (which most applications shipping
>> their own Python installation are), you can always do this
>> provided you can gain the application user permissions.
> 
> Of course, if the application is shipping its own Python then
> it has to actually do something to update to 2.7.9, and it can
> add its own option to disable TLS verification. I personally
> think that the application providing that option is the *right*
> way, and all these other things are, at best, just temporary
> shims until the applications do that.

I still believe that requiring users to monkeypatch Python is a
very poor approach in terms of software design quality. We can do
better, and we should.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 12 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From donald at stufft.io  Tue May 12 13:09:56 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 12 May 2015 07:09:56 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <5551DC78.1090200@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
Message-ID: <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>

If you control the app, you don't need to do that. All the relevant APIs accept the context parameter. The shims are only useful when you don't control the app, so an app shipping its own Python doesn't fall under that.
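Concretely, the per-call opt-out available to an app author who controls the code looks something like the following (a minimal sketch; the URL is a made-up placeholder, and note that an unverified context disables certificate checking entirely, so it should only be used for known-broken internal infrastructure):

```python
import ssl
import urllib.request  # urllib2/httplib gained the same context parameter in 2.7.9

# Explicit, per-call opt-out: build a context that skips verification
# and pass it only to the connections that actually need it.
ctx = ssl._create_unverified_context()

# The unverified context really does turn the checks off:
# ctx.check_hostname is False and ctx.verify_mode is ssl.CERT_NONE.

# resp = urllib.request.urlopen("https://intranet.example/", context=ctx)
```

Because the context is passed explicitly, the rest of the process keeps the secure verifying defaults.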

> On May 12, 2015, at 6:56 AM, M.-A. Lemburg <mal at egenix.com> wrote:
> 
>> On 12.05.2015 12:04, Donald Stufft wrote:
>> 
>>> On May 12, 2015, at 3:57 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>> 
>>> In a user based installation (which most applications shipping
>>> their own Python installation are), you can always do this
>>> provided you can gain the application user permissions.
>> 
>> Of course, if the application is shipping its own Python then
>> it has to actually do something to update to 2.7.9 and it can
>> add its own option to disable TLS verification. I personally
>> think that the application providing that option is the *right* way
>> and all these other things are, at best, just temporary shims until
>> the applications do that.
> 
> I still believe that requiring users to monkeypatch Python is a very poor
> approach in terms of software design quality. We can do better, and
> we should.
> 
> -- 
> Marc-Andre Lemburg
> eGenix.com
> 
> Professional Python Services directly from the Source  (#1, May 12 2015)
>>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
> ________________________________________________________________________
> 
> ::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::
> 
>   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
>    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
>           Registered at Amtsgericht Duesseldorf: HRB 46611
>               http://www.egenix.com/company/contact/

From ncoghlan at gmail.com  Tue May 12 13:10:11 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 21:10:11 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <5551DC78.1090200@egenix.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
Message-ID: <CADiSq7fhy86WzNic1NjYGbO=M92389bxASquBHwL70M_KA=8uQ@mail.gmail.com>

On 12 May 2015 at 20:56, M.-A. Lemburg <mal at egenix.com> wrote:
> On 12.05.2015 12:04, Donald Stufft wrote:
>>
>>> On May 12, 2015, at 3:57 AM, M.-A. Lemburg <mal at egenix.com> wrote:
>>>
>>> In a user based installation (which most applications shipping
>>> their own Python installation are), you can always do this
>>> provided you can gain the application user permissions.
>>
>> Of course, if the application is shipping its own Python then
>> it has to actually do something to update to 2.7.9 and it can
>> add its own option to disable TLS verification. I personally
>> think that the application providing that option is the *right* way
>> and all these other things are, at best, just temporary shims until
>> the applications do that.
>
> I still believe that requiring users to monkeypatch Python is a very poor
> approach in terms of software design quality. We can do better, and
> we should.

It's a deliberate design choice to actively discourage people from
doing it - your "Ewww" reaction to monkeypatching is exactly the one
we want. There's no technical reason for people to be bothered by it,
since it's a documented and supported technique covered by the
relevant PEP - it just so happens that the configuration being done is
to switch a function alias between two different functions.
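The "function alias" in question is the documented PEP 476 opt-out hook, so the monkeypatch under discussion is essentially one line. A sketch, for illustration only (it disables verification process-wide, which is exactly why it is meant to feel uncomfortable):

```python
import ssl

# The documented PEP 476 escape hatch: rebind the alias that the
# stdlib HTTPS clients consult when no explicit context is given.
ssl._create_default_https_context = ssl._create_unverified_context

# From this point on, HTTPS client code in this process that relies on
# the default context will skip certificate verification.
```

Switching the alias back to `ssl.create_default_context` restores the verifying default.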

Both of the recommended options I'm putting in the PEP (essentially
the Red Hat design and the eGenix design, since we cover two different
use cases) still adopt that same basic implementation model, they just
provide ways for redistributors to move the configuration inside the
SSL module itself if they decide it is in their users' interests for
them to do so.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Tue May 12 13:17:21 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 21:17:21 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
Message-ID: <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>

On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
> If you control the app you don't need to do that. All relevant APIs accept the context parameter. The shims are only useful when you don't control the app. So an app shipping its own Python doesn't fall under that.

I think the "bundled Python" scenario MAL is interested in is this one:

1. An application with a bundled CPython runtime is using the
verification defaults
2. Upgraded the bundled Python to 2.7.9
3. Didn't provide new configuration settings to disable certificate verification
4. Is being upgraded in an environment where verifying certificates
makes the app unusable for environmental reasons related to
certificate management

The PyRun single-file Python interpreter has a similar need, where
some apps that ran fine under 2.7.8 will need a way to disable cert
verification in 2.7.9+ on a per-application basis, *without* modifying
the applications.

Both of those make sense to me as cases where the environment variable
based security downgrade approach is the "least bad" answer available,
which is why I eventually agreed it should be one of the
recommendations in the PEP.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Tue May 12 13:19:55 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 21:19:55 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
Message-ID: <CADiSq7fCAfMb+G6FZq+WiTraMdicSr6wi+YfyoaAMw0wpTP78Q@mail.gmail.com>

On 12 May 2015 at 21:17, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Both of those make sense to me as cases where the environment variable
> based security downgrade approach is the "least bad" answer available,
> which is why I eventually agreed it should be one of the
> recommendations in the PEP.

It occurs to me that the subtitle of PEP 493 could be "All software is
terrible, but it's often a system administrator's job to make it run
anyway" :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mal at egenix.com  Tue May 12 13:20:28 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 12 May 2015 13:20:28 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7fCAfMb+G6FZq+WiTraMdicSr6wi+YfyoaAMw0wpTP78Q@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>	<CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>	<5550F9D4.5050701@egenix.com>	<CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>	<5551B26C.2080800@egenix.com>	<CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>	<5551DC78.1090200@egenix.com>	<A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>	<CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <CADiSq7fCAfMb+G6FZq+WiTraMdicSr6wi+YfyoaAMw0wpTP78Q@mail.gmail.com>
Message-ID: <5551E1FC.8020904@egenix.com>

On 12.05.2015 13:19, Nick Coghlan wrote:
> On 12 May 2015 at 21:17, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Both of those make sense to me as cases where the environment variable
>> based security downgrade approach is the "least bad" answer available,
>> which is why I eventually agreed it should be one of the
>> recommendations in the PEP.
> 
> It occurs to me that the subtitle of PEP 493 could be "All software is
> terrible, but it's often a system administrator's job to make it run
> anyway" :)

+1 :-)

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 12 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From donald at stufft.io  Tue May 12 13:21:39 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 12 May 2015 07:21:39 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
Message-ID: <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>


> On May 12, 2015, at 7:17 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
>> If you control the app you don't need to do that. All relevant APIs accept the context parameter. The shims are only useful when you don't control the app. So an app shipping its own Python doesn't fall under that.
> 
> I think the "bundled Python" scenario MAL is interested in is this one:
> 
> 1. An application with a bundled CPython runtime is using the
> verification defaults
> 2. Upgraded the bundled Python to 2.7.9
> 3. Didn't provide new configuration settings to disable certificate verification
> 4. Is being upgraded in an environment where verifying certificates
> makes the app unusable for environmental reasons related to
> certificate management
> 
> The PyRun single-file Python interpreter has a similar need, where
> some apps that ran fine under 2.7.8 will need a way to disable cert
> verification in 2.7.9+ on a per-application basis, *without* modifying
> the applications.
> 
> Both of those make sense to me as cases where the environment variable
> based security downgrade approach is the "least bad" answer available,
> which is why I eventually agreed it should be one of the
> recommendations in the PEP.
> 

Why is "without modifying the app" a reasonable goal? If Python is bundled
with the app, then you have direct control over when that upgrade happens,
so you can delay the upgrade to 2.7.9 until the application bundling
Python has the relevant switches. This is distinctly different from a
situation like downstream distributors, where the version of Python is
provided by a group different from the one providing the application.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 801 bytes
Desc: Message signed with OpenPGP using GPGMail
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/21aab7e1/attachment.sig>

From ncoghlan at gmail.com  Tue May 12 13:40:01 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 21:40:01 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
Message-ID: <CADiSq7dCiQzxr4VKF+9D=r-mxgA5SDfyW+uUASBBy5uc8nr_Nw@mail.gmail.com>

On 12 May 2015 at 21:21, Donald Stufft <donald at stufft.io> wrote:
>
>> On May 12, 2015, at 7:17 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>> On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
>>> If you control the app you don't need to do that. All relevant APIs accept the context parameter. The shims are only useful when you don't control the app. So an app shipping its own Python doesn't fall under that.
>>
>> I think the "bundled Python" scenario MAL is interested in is this one:
>>
>> 1. An application with a bundled CPython runtime is using the
>> verification defaults
>> 2. Upgraded the bundled Python to 2.7.9
>> 3. Didn't provide new configuration settings to disable certificate verification
>> 4. Is being upgraded in an environment where verifying certificates
>> makes the app unusable for environmental reasons related to
>> certificate management
>>
>> The PyRun single-file Python interpreter has a similar need, where
>> some apps that ran fine under 2.7.8 will need a way to disable cert
>> verification in 2.7.9+ on a per-application basis, *without* modifying
>> the applications.
>>
>> Both of those make sense to me as cases where the environment variable
>> based security downgrade approach is the "least bad" answer available,
>> which is why I eventually agreed it should be one of the
>> recommendations in the PEP.
>>
>
> Why is without modifying the app a reasonable goal? If Python is bundled
> with the app then you have direct control over when that upgrade happens,
> so you can delay the upgrade to 2.7.9 until your application which is
> bundling Python has the relevant switches. This is distinctly different
> from a situation like downstream distributors where the version of Python
> being provided is being provided by a group different than the person
> providing the application.

Because of the way redistribution works. MAL was right that I was
thinking specifically in terms of the Linux distributor case, where
we're the OS vendor, so we need a way to offer "off by default, opt-in
on a per-server basis". Once I got past that perspective, I was able
to figure out where he was coming from as someone that offers explicit
support for the "redistribution for bundling" use case.

When apps "bundle Python", what's usually happening is that they'll
just bundle whatever version is used on the build server that does the
bundling. If the app developer's testing all uses valid HTTPS
certificates (or simply doesn't test HTTPS at all), they won't see any
problems with the 2.7.9 upgrade, and hence will ship that along to
their customers, where it may break if that customer's environment
turns out to be relying on the lack of certificate verification in
2.7.8 and earlier.

If that scenario happens with unmodified upstream 2.7.9, the
redistributor has no workaround they can pass along to app developers
to in turn pass on to customers - the app developer simply has to tell
their customers to downgrade back to the previous release, and then
issue an updated version with a configuration setting to disable HTTPS
verification as fast as they can. Customers tend to get rather grouchy
about things like that :)

By contrast, if the redistributor for the bundled version of Python
has injected PYTHONHTTPSVERIFY support, then the app developers can at
least pass along "set PYTHONHTTPSVERIFY=0 in the environment" to their
customers as an interim workaround until they get a release out the
door with a proper configuration setting to control whether or not the
app verifies certificates (assuming they don't decide the environment
variable is a good enough workaround).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mal at egenix.com  Tue May 12 13:41:48 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Tue, 12 May 2015 13:41:48 +0200
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>	<CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>	<554C8C60.8000603@egenix.com>	<CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>	<554E4E67.4040405@egenix.com>	<CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>	<CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>	<55506289.1000705@egenix.com>	<CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>	<555074C1.80909@egenix.com>	<CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>	<5550F9D4.5050701@egenix.com>	<CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>	<5551B26C.2080800@egenix.com>	<CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>	<5551DC78.1090200@egenix.com>	<A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>	<CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
Message-ID: <5551E6FC.4050305@egenix.com>

On 12.05.2015 13:21, Donald Stufft wrote:
> 
>> On May 12, 2015, at 7:17 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>> On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
>>> If you control the app you don't need to do that. All relevant APIs accept the context parameter. The shims are only useful when you don't control the app. So an app shipping its own Python doesn't fall under that.
>>
>> I think the "bundled Python" scenario MAL is interested in is this one:
>>
>> 1. An application with a bundled CPython runtime is using the
>> verification defaults
>> 2. Upgraded the bundled Python to 2.7.9
>> 3. Didn't provide new configuration settings to disable certificate verification
>> 4. Is being upgraded in an environment where verifying certificates
>> makes the app unusable for environmental reasons related to
>> certificate management
>>
>> The PyRun single-file Python interpreter has a similar need, where
>> some apps that ran fine under 2.7.8 will need a way to disable cert
>> verification in 2.7.9+ on a per-application basis, *without* modifying
>> the applications.
>>
>> Both of those make sense to me as cases where the environment variable
>> based security downgrade approach is the "least bad" answer available,
>> which is why I eventually agreed it should be one of the
>> recommendations in the PEP.
>>
> 
> Why is without modifying the app a reasonable goal? If Python is bundled
> with the app then you have direct control over when that upgrade happens,
> so you can delay the upgrade to 2.7.9 until your application which is
> bundling Python has the relevant switches. This is distinctly different
> from a situation like downstream distributors where the version of Python
> being provided is being provided by a group different than the person
> providing the application.

Take a Plone Intranet as example:

The unified installer downloads and installs Python 2.7 for you.
As of Plone 4.3.3 the version is Python 2.7.6.

Now say you are a sys admin and your Intranet users are affected by
some bug in 2.7.6 which is fixed in 2.7.9. The natural approach would
be to upgrade the bundled Python to 2.7.9.

Because it's an Intranet and Plone is used to aggregate
information from other systems which use self-signed
certificates, you don't want to risk breaking your Plone
installation and need a way to disable the cert checks.

The best way to do this is by configuring the bundled
Python to disable the checks, since you would not want
to mess with the Plone application itself.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 12 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From donald at stufft.io  Tue May 12 13:45:43 2015
From: donald at stufft.io (Donald Stufft)
Date: Tue, 12 May 2015 07:45:43 -0400
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7dCiQzxr4VKF+9D=r-mxgA5SDfyW+uUASBBy5uc8nr_Nw@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
 <CADiSq7dCiQzxr4VKF+9D=r-mxgA5SDfyW+uUASBBy5uc8nr_Nw@mail.gmail.com>
Message-ID: <4F56788F-C7A0-478A-B3E7-CF88BA03F17E@stufft.io>


> On May 12, 2015, at 7:40 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> On 12 May 2015 at 21:21, Donald Stufft <donald at stufft.io> wrote:
>> 
>>> On May 12, 2015, at 7:17 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>> 
>>> On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
>>>> If you control the app you don't need to do that. All relevant APIs accept the context parameter. The shims are only useful when you don't control the app. So an app shipping its own Python doesn't fall under that.
>>> 
>>> I think the "bundled Python" scenario MAL is interested in is this one:
>>> 
>>> 1. An application with a bundled CPython runtime is using the
>>> verification defaults
>>> 2. Upgraded the bundled Python to 2.7.9
>>> 3. Didn't provide new configuration settings to disable certificate verification
>>> 4. Is being upgraded in an environment where verifying certificates
>>> makes the app unusable for environmental reasons related to
>>> certificate management
>>> 
>>> The PyRun single-file Python interpreter has a similar need, where
>>> some apps that ran fine under 2.7.8 will need a way to disable cert
>>> verification in 2.7.9+ on a per-application basis, *without* modifying
>>> the applications.
>>> 
>>> Both of those make sense to me as cases where the environment variable
>>> based security downgrade approach is the "least bad" answer available,
>>> which is why I eventually agreed it should be one of the
>>> recommendations in the PEP.
>>> 
>> 
>> Why is without modifying the app a reasonable goal? If Python is bundled
>> with the app then you have direct control over when that upgrade happens,
>> so you can delay the upgrade to 2.7.9 until your application which is
>> bundling Python has the relevant switches. This is distinctly different
>> from a situation like downstream distributors where the version of Python
>> being provided is being provided by a group different than the person
>> providing the application.
> 
> Because of the way redistribution works. MAL was right that I was
> thinking specifically in terms of the Linux distributor case, where
> we're the OS vendor, so we need a way to offer "off by default, opt-in
> on a per-server basis". Once I got past that perspective, I was able
> to figure out where he was coming from as someone that offers explicit
> support for the "redistribution for bundling" use case.
> 
> When apps "bundle Python", what's usually happening is that they'll
> just bundle whatever version is used on the build server that does the
> bundling. If the app developer's testing all uses valid HTTPS
> certificates (or simply doesn't test HTTPS at all), they won't see any
> problems with the 2.7.9 upgrade, and hence will ship that along to
> their customers, where it may break if that customer's environment
> turns out to be relying on the lack of certificate verification in
> 2.7.8 and earlier.
> 
> If that scenario happens with unmodified upstream 2.7.9, the
> redistributor has no workaround they can pass along to app developers
> to in turn pass on to customers - the app developer simply has to tell
> their customers to downgrade back to the previous release, and then
> issue an updated version with a configuration setting to disable HTTPS
> verification as fast as they can. Customers tend to get rather grouchy
> about things like that :)
> 
> By contrast, if the redistributor for the bundled version of Python
> has injected PYTHONHTTPSVERIFY support, then the app developers can at
> least pass along "set PYTHONHTTPSVERIFY=0 in the environment" to their
> customers as an interim workaround until they get a release out the
> door with a proper configuration setting to control whether or not the
> app verifies certificates (assuming they don't decide the environment
> variable is a good enough workaround).
> 

Honestly, this reads like "If the person bundling 2.7.9 with their app doesn't
bother to pay attention to what 2.7.9 means then things might break", but
that's hardly special to TLS, there are lots of things that change in a release
that may end up breaking in certain cases.

Looking at Marc-Andre's latest email though, it appears we're using bundling
in a different context? I'm thinking of things like PyInstaller or such where
you're distributing Python + your own App, but this appears to just be some
third party tool installing Python and an App?

Ultimately, as long as it doesn't end up in upstream CPython the PEP can
recommend any approach and I'm OK with it in the sense that it won't affect me.
Though the PEP should be clear it's for 2.7 only and not 3.x (or 2.8, if that
decision ever gets reversed and we end up with a 2.8).

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 801 bytes
Desc: Message signed with OpenPGP using GPGMail
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/bf7680ba/attachment.sig>

From ncoghlan at gmail.com  Tue May 12 14:34:12 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 12 May 2015 22:34:12 +1000
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <4F56788F-C7A0-478A-B3E7-CF88BA03F17E@stufft.io>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <5541E0E6.6060701@egenix.com>
 <CADiSq7f3AQ1fZtfvEvWLCG+Z-TeuRxftzR7vv_62T2DOU3WKPw@mail.gmail.com>
 <554C7944.2020905@egenix.com>
 <CADiSq7dOazLxnd9DaWY0We6CZ8H=GKnnzLs_kt6tRmqC4aQHdw@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <8E6DA854-D904-4A76-B812-B070C33F2FCE@stufft.io>
 <CADiSq7dCiQzxr4VKF+9D=r-mxgA5SDfyW+uUASBBy5uc8nr_Nw@mail.gmail.com>
 <4F56788F-C7A0-478A-B3E7-CF88BA03F17E@stufft.io>
Message-ID: <CADiSq7d0zTZXUWjs_X9MjrMiyCXQ0BQ1GyW9a8R-8-vShf_UEg@mail.gmail.com>

On 12 May 2015 at 21:45, Donald Stufft <donald at stufft.io> wrote:
>
>> On May 12, 2015, at 7:40 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>> On 12 May 2015 at 21:21, Donald Stufft <donald at stufft.io> wrote:
>>>
>>>> On May 12, 2015, at 7:17 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>>>
>>>> On 12 May 2015 at 21:09, Donald Stufft <donald at stufft.io> wrote:
>>>>> If you control the app you don't need to do that. All relevant api accept the context parameter. The shims are only useful when you don't control the app. So an app shipping their own python doesn't fall under that.
>>>>
>>>> I think the "bundled Python" scenario MAL is interested in is this one:
>>>>
>>>> 1. An application with a bundled CPython runtime is using the
>>>> verification defaults
>>>> 2. Upgraded the bundled Python to 2.7.9
>>>> 3. Didn't provide new configuration settings to disable certificate verification
>>>> 4. Is being upgraded in an environment where verifying certificates
>>>> makes the app unusable for environmental reasons related to
>>>> certificate management
>>>>
>>>> The PyRun single-file Python interpreter has a similar need, where
>>>> some apps than ran fine under 2.7.8 will need a way to disable cert
>>>> verification in 2.7.9+ on a per-application basis, *without* modifying
>>>> the applications.
>>>>
>>>> Both of those make sense to me as cases where the environment variable
>>>> based security downgrade approach is the "least bad" answer available,
>>>> which is why I eventually agreed it should be one of the
>>>> recommendations in the PEP.
>>>>
>>>
>>> Why is without modifying the app a reasonable goal? If Python is bundled
>>> with the app then you have direct control over when that upgrade happens,
>>> so you can delay the upgrade to 2.7.9 until your application which is
>>> bundling Python has the relevant switches. This is distinctly different
>>> from a situation like downstream distributors where the version of Python
>>> being provided is being provided by a group different than the person
>>> providing the application.
>>
>> Because of the way redistribution works. MAL was right that I was
>> thinking specifically in terms of the Linux distributor case, where
>> we're the OS vendor, so we need a way to offer "off by default, opt-in
>> on a per-server basis". Once I got past that perspective, I was able
>> to figure out where he was coming from as someone that offers explicit
>> support for the "redistribution for bundling" use case.
>>
>> When apps "bundle Python", what's usually happening is that they'll
>> just bundle whatever version is used on the build server that does the
>> bundling. If the app developer's testing all uses valid HTTPS
>> certificates (or simply doesn't test HTTPS at all), they won't see any
>> problems with the 2.7.9 upgrade, and hence will ship that along to
>> their customers, where it may break if that customer's environment
>> turns out to be relying on the lack of certificate verification in
>> 2.7.8 and earlier.
>>
>> If that scenario happens with unmodified upstream 2.7.9, the
>> redistributor has no workaround they can pass along to app developers
>> to in turn pass on to customers - the app developer simply has to tell
>> their customers to downgrade back to the previous release, and then
>> issue an updated version with a configuration setting to disable HTTPS
>> verification as fast as they can. Customers tend to get rather grouchy
>> about things like that :)
>>
>> By contrast, if the redistributor for the bundled version of Python
>> has injected PYTHONHTTPSVERIFY support, then the app developers can at
>> least pass along "set PYTHONHTTPSVERIFY=0 in the environment" to their
>> customers as an interim workaround until they get a release out the
>> door with a proper configuration setting to control whether or not the
>> app verifies certificates (assuming they don't decide the environment
>> variable is a good enough workaround).
>>
>
> Honestly, this reads like "If the person bundling 2.7.9 with their app doesn't
> bother to pay attention to what 2.7.9 means then things might break", but
> that's hardly special to TLS, there are lots of things that change in a release
> that may end up breaking in certain cases.

Cert verification is a special case, as cert management in intranets
tends to rely on two things:

* browser users just click through SSL security warnings
* automated internal tools don't check certs at all

Organisations with a specific interest in network security may have
their systems in a happier state, but I wouldn't bet on them being a
substantial minority of organisations, let alone a significant
majority.

It's hard to overstate how big a mindset shift "getting intranet
network security right isn't optional" represents in our industry, and
it's going to take years for that attitude change to filter out
through the later parts of the technology adoption curve.

> Looking at Marc-Andre's latest email though, it appears we're using bundling
> in a different context? I'm thinking of things like PyInstaller or such where
> you're distributing Python + your own App, but this appears to just be some
> third party tool installing Python and an App?

I'm personally talking about both, as for a lot of folks bundling
Python, the actual bundling process is handled by a third party tool,
and a lot of application vendors aren't going to think about what
happens when their app is run in an environment with bad certificate
management.

> Ultimately, as long as it doesn't end up in upstream CPython the PEP can
> recommend any approach and I'm OK with it in the sense that it won't affect me.
> Though the PEP should be clear it's for 2.7 only and not 3.x

Yep, I just pushed an update
(https://hg.python.org/peps/rev/b395246d0af7) that adds MAL's
environment variable based solution as an alternative recommendation,
and that update includes this phrase:

===================
These designs are being proposed as a recommendation for redistributors, rather
than as new upstream features, as they are needed purely to support legacy
environments migrating from older versions of Python 2.7. Neither approach
is being proposed as an upstream Python 2.7 feature, nor as a feature in any
version of Python 3 (whether published directly by the Python Software
Foundation or by a redistributor).
===================
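For reference, the environment variable based shim being recommended to
redistributors boils down to a few lines run at interpreter startup. The
sketch below assumes `ssl._create_default_https_context` is the hook the
standard library's HTTPS clients consult (it is in 2.7.9+); the exact
injection point is up to the redistributor:

```python
import os
import ssl

def apply_https_verify_env_var():
    """Honour PYTHONHTTPSVERIFY=0 by disabling default cert verification.

    Sketch of the downstream-only shim discussed in this thread: it swaps
    the factory that http.client/urllib use to build their default SSL
    contexts for the non-verifying one the ssl module already provides.
    """
    if os.environ.get("PYTHONHTTPSVERIFY", "1") == "0":
        # Restore the pre-2.7.9 behaviour: no certificate verification.
        ssl._create_default_https_context = ssl._create_unverified_context

apply_https_verify_env_var()
```

A redistributor would run something like this before any application code
(e.g. from a patched `site.py`), so that setting `PYTHONHTTPSVERIFY=0` in
the environment takes effect without modifying the app itself.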

> or 2.8 if that
> ever gets reversed and we end up with a 2.8.

There won't be a 2.8, so mentioning that these downstream
modifications wouldn't be used on a release that is never going to
happen would just confuse people :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From skip.montanaro at gmail.com  Tue May 12 15:14:21 2015
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Tue, 12 May 2015 08:14:21 -0500
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CACBhJdEU9e_6_xy9iH0WL5SFmZVhSNwjPS5AZ82B=CQCcgqbAQ@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CACBhJdEU9e_6_xy9iH0WL5SFmZVhSNwjPS5AZ82B=CQCcgqbAQ@mail.gmail.com>
Message-ID: <CANc-5UyNtt5XYTOgXtrkwPO0OWp6CCLAH45pHMFaON0nWQRW=w@mail.gmail.com>

> Twice now, I've gotten this popup: ...

Let me improve my request, as it seems there is some confusion about
what I want. I'm specifically not asking that the popups not be
displayed. I don't mind dismissing them. When they appear, I would,
however, like to glance over at the stream of messages emitted by the
test runner and see a message about it being expected. It seems that
the tests which can trigger the crash reporter do this.

If I get a chance, I will look into where the crash reporter messages
are displayed. Something similar for the network tickling tests should
also be possible.

Skip

From dimaqq at gmail.com  Tue May 12 16:04:56 2015
From: dimaqq at gmail.com (Dima Tisnek)
Date: Tue, 12 May 2015 16:04:56 +0200
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: <CAP7+vJJGgpe1px5RYxhrWEwp3eScq19grSRy=ru2BzDAmt9MQg@mail.gmail.com>
References: <CAGGBzX+yci3B7BQqwxkV1z1U_NA4VT__YaJ8s5XT2S3DPLUtuA@mail.gmail.com>
 <20150430123345.GD5663@ando.pearwood.info>
 <CAGGBzX+0xXk8mPHFs5aKmT1Jvb5hMCfZL1wDa1SU7HTzJNOT4Q@mail.gmail.com>
 <CAP7+vJJGgpe1px5RYxhrWEwp3eScq19grSRy=ru2BzDAmt9MQg@mail.gmail.com>
Message-ID: <CAGGBzXL-MLvbb8WE2KmA=xjF2PBWPeHyo13jijuQ6C0BQvApMA@mail.gmail.com>

re: comprehension

Perhaps the PEP can, at least, have a short list/summary of limitations?
I recall something was mentioned, but I can't find a section like that in the PEP.


re: example

following https://github.com/JukkaL/mypy/blob/master/stubs/3.2/socket.py

# socket.pyi python2
class _socketobject:
    family = 0  # inferred from initializer (?)
    type = 0  # type: int  # explicit

# socket.pyi python3
class socket:
    family = AddressFamily.AF_UNSPEC  # inferred I presume?

    def settimeout(timeout: Union[int, float, None]) -> None: pass  # negative arguments illegal
    timeout = -1.0  # yet, that's what you get by default (set None)


Perhaps, after all, socket module is a bad example.
I suppose you have a point that well-written modules are
self-documenting anyway...
Here's another try:

# _sqlite3.pyi python2 version
# warning, readonly: module allows reassignment, but you really shouldn't!
# instead use sqlite3.register_xxx functions
converters = {}  # type: Dict[str, Callable[[str], Any]]
adapters = {}  # type: Dict[Tuple[Type, SomethingInternal], Callable[[Any], str]]




On 7 May 2015 at 17:39, Guido van Rossum <guido at python.org> wrote:
> On Thu, May 7, 2015 at 7:25 AM, Dima Tisnek <dimaqq at gmail.com> wrote:
>>
>> On 30 April 2015 at 14:33, Steven D'Aprano <steve at pearwood.info> wrote:
>> > On Thu, Apr 30, 2015 at 01:41:53PM +0200, Dima Tisnek wrote:
>> >> # internal vs external
>> >> @intify
>> >> def foo() -> int:
>> >>     b = "42"
>> >>     return b  # check 1
>> >> x = foo() // 2  # check 2
>> >>
>> >> Does the return type apply to implementation (str) or decorated
>> >> callable (int)?
>> >
>> > I would expect that a static type checker would look at foo, and flag
>> > this as an error. The annotation says that foo returns an int, but it
>> > clearly returns a string. That's an obvious error.
>>
>> Is this per PEP, or just a guess?
>>
>> I think PEP needs to be explicit about this.
>
>
> The PEP shouldn't have to explain all the rules for type inferencing.
> There's a section "What is checked?" that says (amongst other things):
>
>   The body of a checked function is checked for consistency with the
>   given annotations.  The annotations are also used to check correctness
>   of calls appearing in other checked functions.
>
>> > Normally local variables will have their type inferred from the
>> > operations done to them:
>> >
>> >     s = arg[1:]  # s has the same type as arg
>>
>> Good point, should be mentioned in PEP.
>
>
> Again, what do you want the PEP to say? I am trying to keep the PEP shorter
> than the actual code that implements the type checker. :-)
>
>>
>> Technically, type can be empty list, mixed list or custom return type
>> for overloaded __getitem__ that accepts slices.
>>
>> I'm sorry if I was not clear. My question was how should type of
>> ephemeral `x` be specified.
>> In other words, can types be specified on anything inside a comprehension?
>
>
> That's actually a good question; the PEP shows some examples of #type:
> comments in peculiar places, but there's no example using list
> comprehensions. Your best bet is to leave it to the type inferencer; if your
> comprehension is so complex that need to put type annotations on parts of
> it, you may be better off rewriting it as a regular for-loop, which offers
> more options for annotations.
>
>>
>> Stub is better (required?) because it allows to specify types of
>> attributes that are not assigned in class scope, but that are expected
>> to be there as result of __init__ or because it's a C extension.
>
>
> Note that the PEP does not explicitly say whether the information of a stub
> might be *merged* with the information gleaned from the source code. The
> basic model is that if a stub is present the implementation source code is
> not read at all by the type checker (and, conversely, information from stubs
> is not available at all at runtime). But it is possible for some type
> checker to improve upon this model.
>
>>
>> An example in PEP would be good.
>
>
> Can you give an example that I can edit and put in the PEP?
>
> --
> --Guido van Rossum (python.org/~guido)
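
As a concrete illustration of the comprehension advice above (my own
sketch, not an example from the PEP): rewriting a comprehension as a
regular for-loop makes room for `# type:` comments on each intermediate
variable:

```python
from typing import Dict, List  # comment-style hints per PEP 484

def first_lengths(lines):
    # type: (List[str]) -> Dict[str, int]
    # The comprehension form offers nowhere to annotate the loop variable:
    #   {line.split()[0]: len(line.split()[0]) for line in lines}
    result = {}  # type: Dict[str, int]
    for line in lines:
        word = line.split()[0]  # type: str
        result[word] = len(word)
    return result
```

A type checker can then verify every assignment in the loop body, which
is exactly what the inferencer cannot be told to do inside the
comprehension.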

From guido at python.org  Tue May 12 17:28:52 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 12 May 2015 08:28:52 -0700
Subject: [Python-Dev] Fwd: Coverity Scan update
In-Reply-To: <BLUPR05MB547266807DC8DBA22B2C983C5DA0@BLUPR05MB547.namprd05.prod.outlook.com>
References: <BLUPR05MB547266807DC8DBA22B2C983C5DA0@BLUPR05MB547.namprd05.prod.outlook.com>
Message-ID: <CAP7+vJ+-DOwV9-6ON2nupsm0h_ZuZFdCj5K=CStcoDtOQe2NwA@mail.gmail.com>

---------- Forwarded message ----------
From: "Dakshesh Vyas" <dvyas at coverity.com>
Date: May 12, 2015 1:08 AM
Subject: Coverity Scan update
To: "guido at python.org" <guido at python.org>
Cc:

Hello Guido van Rossum,

Thank you for using Coverity Scan.
With more than 4000 open source projects now registered at Coverity Scan,
we are committed to helping the open source community find quality and
security issues early in development.

We would like to inform you that you can now nominate your favorite defect!
Tell us which defects you are glad were found using Coverity Scan and get
special gifts like a Samsung Gear 2 Smartwatch, Code Black Drone or Tile
for iOS and Android. Gifts will be distributed every month based on defect
nomination!

We recently added new hardware resource and upgraded our Scan server to our
latest 7.6 version, which includes great new features:

* New security and quality checkers with version 7.6.
* You can now setup automatic approval for viewing defects in read-only
mode for everyone.
* See charts showing defects outstanding vs. fixed over a period of time,
and sort high impact defects by category.
* Support for standard SSL port 443 instead of 8443.
* Subscribe to email notifications for your entire open source project or
for specific components or modules within your project.
* You can now submit builds more frequently.
* Please download the latest Coverity Build tool from
https://scan.coverity.com/download
* The old Coverity Build tool will still continue to work, but it may not
be able to find some security and quality defects that could be found using
our new and improved checkers as a part of 7.6 upgrade.

* Important note for projects that use automation or a script to submit
builds to Coverity Scan: The old URL to upload builds on scan5.coverity.com
is no longer supported. It is applicable to and used by some of the old
projects on Coverity Scan.
* Please sign-in to Coverity Scan to find the updated URL under automation
section of Submit Build.
* Finally, don't forget to nominate your favorite defects to receive
special gifts like a Samsung Gear 2 Smartwatch, Code Black Drone or Tile
for iOS and Android every month!


Post your technical questions at
https://communities.coverity.com/community/scan-(open-source)/content

Please email us at scan-admin at coverity.com if you have any question or
concerns, and we would be happy to help you.


Thanks
Dakshesh Vyas | Technical Manager - Coverity Scan
Office: 415.935.2957 | dvyas at coverity.com
https://scan.coverity.com/
Coverity by Synopsys


To manage Coverity Scan email notifications, click
https://scan.coverity.com/subscriptions/edit?email=guido%40python.org&token=8ba34039b1e46cc590ce8f06179fccdc
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/0f4beded/attachment.html>

From tjreedy at udel.edu  Tue May 12 17:44:25 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 12 May 2015 11:44:25 -0400
Subject: [Python-Dev] anomaly
In-Reply-To: <20150511074053.GG429@tonks>
References: <CAMjeLr-67ZJNF3N5nWuDNGRWjOY_B41++737TxC4V6DYFH1t5Q@mail.gmail.com>
 <20150511074053.GG429@tonks>
Message-ID: <mit751$p5g$1@ger.gmane.org>

On 5/11/2015 3:40 AM, Florian Bruhin wrote:

[snip]

This trollish thread was cross-posted to python-list, where it was 
semi-ok, at least in the beginning, and pydev, where it is not.  It has 
continued on python-list with pydev removed. Please do not continue it 
here (on pydev).

-- 
Terry Jan Reedy


From taleinat at gmail.com  Tue May 12 18:14:17 2015
From: taleinat at gmail.com (Tal Einat)
Date: Tue, 12 May 2015 19:14:17 +0300
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CANc-5UyNtt5XYTOgXtrkwPO0OWp6CCLAH45pHMFaON0nWQRW=w@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CACBhJdEU9e_6_xy9iH0WL5SFmZVhSNwjPS5AZ82B=CQCcgqbAQ@mail.gmail.com>
 <CANc-5UyNtt5XYTOgXtrkwPO0OWp6CCLAH45pHMFaON0nWQRW=w@mail.gmail.com>
Message-ID: <CALWZvp63cQT3FZRk4v7QCMnCs4wdXvLtOL-oLOmKwN4qxF3ong@mail.gmail.com>

On Tue, May 12, 2015 at 4:14 PM, Skip Montanaro
<skip.montanaro at gmail.com> wrote:
>
> > Twice now, I've gotten this popup: ...
>
> Let me improve my request, as it seems there is some confusion about
> what I want. I'm specifically not asking that the popups not be
> displayed. I don't mind dismissing them. When they appear, I would,
> however, like to glance over at the stream of messages emitted by the
> test runner and see a message about it being expected. It seems that
> the tests which can trigger the crash reporter do this.

In my case, the popups appear but then disappear within a fraction of
a second, and this happens about 10-20 times when running the full
test suite. So I don't have a chance to interact with the popups, and
this causes test failures.

Also, when running a large suite of tests, I may not be looking at the
screen by the time these popups appear. I wouldn't want the tests to
fail nor would I want the test run to stall.

I can't test this right now, but does disabling the "network" resource
avoid these popups? Though even if it does we'll still need a way to
run network-related tests on OSX.

Regards,
- Tal Einat

From larry at hastings.org  Tue May 12 19:04:39 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 10:04:39 -0700
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
Message-ID: <555232A7.7060002@hastings.org>



Python 3.5 beta 1 is coming up soon.  After beta is rc; after rc is 
3.5.0 final.  During the beta and rc periods the Python developer 
workflow changes a little--what sorts of checkins are permissible, and 
how to get something accepted and merged generally becomes more complicated.

I was the release manager for Python 3.4, and the way I ran the rc 
period for 3.4 was miserable.  Everyone hated it--and that includes me.  
I'm absolutely not doing it that way for 3.5.  But that leads one to the 
question: okay, how *am* I going to run it?  I have three ideas for 
workflows for the beta and rc periods for 3.5, below.  But first let's 
talk about what I/we hope to achieve.

Traditionally during the beta and rc periods, development of new 
features stops completely.   Obviously, after feature freeze no new 
features can go into the beta / rc releases.  But why can't people 
develop new features for the *next* release?  The reason given is that 
we're trying to guide the core dev community in the right 
direction--they should concentrate on fixing bugs in the new release.  I 
suspect the real reason for this is that back in the bad old days of 
Subversion (and CVS!) branching and merging were painful.  This social 
engineering policy is a justification after-the-fact.

But Mercurial makes branching and merging nearly effortless. Mercurial 
also makes it painless to develop new features in private.  So consider: 
anyone who wants to work on new features during beta and rc can easily 
do so.  All our "no new features during beta and rc" policy really does 
is drive such development out of public view.

I think it's time we experimented with lifting that policy.  The trick 
is finding a good place for us to check in the work.  You see, during 
the 3.5 rc period, we arguably want to accept checkins for *three* 
different revisions:

    3.5.0
    3.5.1
    3.6

3.5.1?  Yep.  During the rc period for 3.4--and particularly after the 
last rc was tagged--there were a lot of minor fixes that were desirable, 
but I didn't want to accept into 3.4 just to avoid destabilizing the 
release.  I still wanted those checkins, I just didn't want them in 
3.4.0.  So the way it worked was, developers would check those bugfixes 
in to trunk, then I'd cherry-pick the revisions I wanted into the rc 
branch.  In other words, during the rc period for 3.4, trunk effectively 
represented 3.4.1.  This was valuable and I absolutely want to do it again.


So here are the workflows.  Workflow 0 most resembles what we've done in 
the past.  Workflow 1 is my favorite and the most ambitious.  All three 
give us a public place to put revisions for 3.5.0 and 3.5.1; Workflow 1 
also gives us a place to check in work for 3.6 during the beta and rc 
periods for 3.5.  Workflow 2 is a less ambitious compromise.


Workflow 0
==========

When I ship beta 1, trunk remains 3.5.

When I ship rc 1, trunk becomes 3.5.1.  I create a publicly visible 
read-only repo that represents 3.5.0, and any checkins that I want to 
accept into 3.5.0 I must manually cherry-pick from trunk.

When I ship Python 3.5.0 final, we branch 3.5, and trunk becomes 3.6.

Workflow 1
==========

When I ship beta 1, we create the 3.5 branch.  trunk becomes 3.6.

When I ship rc 1, the 3.5 branch becomes 3.5.1.  I maintain a publicly 
visible repo /on bitbucket/ for 3.5.0, and we use bitbucket "pull 
requests" for cherry-picks from 3.5.1 into 3.5.0.

This gives us a pilot project to try out a web GUI for merging.  It 
makes my workflow easier, as I can push a button to accept 
cherry-picks.  (If they don't apply cleanly I can tell the author to go 
fix the conflict and resubmit it.)  The downside: it requires core devs 
to have and use bitbucket accounts.

Workflow 2
==========

When I ship beta 1, trunk remains 3.5.

When I ship rc 1, we create the 3.5 branch.  The 3.5 branch is 3.5.0 and 
trunk is 3.5.1.  Only blessed stuff gets cherry-picked from 3.5.1 back 
into 3.5.0.


What do you think?  My votes are as follows:

    Workflow 0: -0.5
    Workflow 1: +1
    Workflow 2: +0.5


Please cast your votes,


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/20cab77b/attachment.html>

From rosuav at gmail.com  Tue May 12 19:23:38 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Wed, 13 May 2015 03:23:38 +1000
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <CAPTjJmoUQCESyAG4_+Zm4p16AgVh3H5k60dd6H6gEVwZHo_yVg@mail.gmail.com>

On Wed, May 13, 2015 at 3:04 AM, Larry Hastings <larry at hastings.org> wrote:
> Workflow 1
> ==========
>
> When I ship beta 1, we create the 3.5 branch.  trunk become 3.6.
>
> When I ship rc 1, the 3.5 branch becomes 3.5.1.  I maintain a publically
> visible repo on bitbucket for 3.5.0, and we use bitbucket "pull requests"
> for cherry-picks from 3.5.1 into 3.5.0.
>

As a non-core-dev, I would be in favour of this model. I use the
CPython trunk as my build branch for "give me the absolute latest
CPython", and this means that that will always be the case. Same with
the 3.5 branch always meaning "the absolute latest CPython 3.5".
Whether the 3.5.0 branch is on the main hg.python.org or bitbucket
makes no difference to me, as I wouldn't be building against it, so do
whatever makes sense for you and the core dev team. Testing a patch
off the issue tracker would normally want to be done on trunk, I
presume.

Will this model be plausibly extensible to every release? For
instance, when a 3.5.1 rc is cut, will the 3.5 branch immediately
become 3.5.2, with a new 3.5.1 branch being opened on bitbucket?

This model seems the easiest to explain. Every branch is the latest of
whatever it describes; to access anything earlier, including proposed
versions, you seek a different branch.

ChrisA

From nad at acm.org  Tue May 12 19:23:47 2015
From: nad at acm.org (Ned Deily)
Date: Tue, 12 May 2015 10:23:47 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
	Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org>

On May 12, 2015, at 10:04, Larry Hastings <larry at hastings.org> wrote:

> Workflow 1
> ==========
> 
> When I ship beta 1, we create the 3.5 branch.  trunk become 3.6.
> 
> When I ship rc 1, the 3.5 branch becomes 3.5.1.  I maintain a publically visible repo on bitbucket for 3.5.0, and we use bitbucket "pull requests" for cherry-picks from 3.5.1 into 3.5.0.
> 
> This gives us a pilot project to try out a web GUI for merging.  It makes my workflow easier, as I can push a button to accept cherry-picks.  (If they don't apply cleanly I can tell the author to go fix the conflict and resubmit it.)  The downside: it requires core devs to have and use bitbucket accounts.

One possible issue with Workflow 1 is that there would need to be an additional set of buildbots (for 3.5, in addition to the existing 3.x (AKA "trunk"), 3.4, and 2.7 ones) for the period from beta 1 until at least 3.5.0 is released and, ideally, until 3.4 support ends, which, following recent past practice, would be relatively soon after 3.5.0.

> Workflow 2
> ==========
> 
> When I ship beta 1, trunk remains 3.5.
> 
> When I ship rc 1, we create the 3.5 branch.  The 3.5 branch is 3.5.0 and trunk is 3.5.1.  Only blessed stuff gets cherry-picked from 3.5.1 back into 3.5.0.

Where does 3.6.x fit into Workflow 2?

--
  Ned Deily
  nad at acm.org -- []



From larry at hastings.org  Tue May 12 19:38:23 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 10:38:23 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org>
Message-ID: <55523A8F.60509@hastings.org>

On 05/12/2015 10:23 AM, Ned Deily wrote:
> One possible issue with Workflow 1 is that there would need to be an additional set of buildbots (for 3.5, in addition to the existing 3.x (AKA "trunk"), 3.4, and 2.7 ones) for the period from beta 1 until at least 3.5.0 is released and, ideally, until 3.4 support ends, which following recent past practice, would be relatively soon after 3.5.0.

Good point.  Though I could conceivably push the 3.5.0 rc repo up to an 
hg.python.org "server-side clone" and kick off the buildbots from there.
> Where does 3.6.x fit into Workflow 2?

It doesn't.  Workflows 0 and 2 mean no public development of 3.6 until 
after 3.5.0 final ships, as per tradition.


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/5a0dc756/attachment.html>

From larry at hastings.org  Tue May 12 19:39:39 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 10:39:39 -0700
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
In-Reply-To: <CAPTjJmoUQCESyAG4_+Zm4p16AgVh3H5k60dd6H6gEVwZHo_yVg@mail.gmail.com>
References: <555232A7.7060002@hastings.org>
 <CAPTjJmoUQCESyAG4_+Zm4p16AgVh3H5k60dd6H6gEVwZHo_yVg@mail.gmail.com>
Message-ID: <55523ADB.4000508@hastings.org>

On 05/12/2015 10:23 AM, Chris Angelico wrote:
> Will this model be plausibly extensible to every release? For
> instance, when a 3.5.1 rc is cut, will the 3.5 branch immediately
> become 3.5.2, with a new 3.5.1 branch being opened on bitbucket?

Yes, we could always do it that way, though in the past we haven't 
bothered.  Development on previous versions slows down a great deal 
after x.y.1.


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150512/3620406c/attachment.html>

From brett at python.org  Tue May 12 19:20:21 2015
From: brett at python.org (Brett Cannon)
Date: Tue, 12 May 2015 17:20:21 +0000
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <CAP1=2W7PGFVKmPXSMhy_YbqQC-zqxpFm+xYkU2drxnNZ2LsAxw@mail.gmail.com>

On Tue, May 12, 2015 at 1:05 PM Larry Hastings <larry at hastings.org> wrote:

> [SNIP]
>
> What do you think?  My votes are as follows:
>
> Workflow 0: -0.5
> Workflow 1: +1
> Workflow 2: +0.5
>
>
> Please cast your votes,
>

Workflow 0: -0
Workflow 1: +1
Workflow 2:  +0

From solipsis at pitrou.net  Tue May 12 19:53:13 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Tue, 12 May 2015 19:53:13 +0200
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
References: <555232A7.7060002@hastings.org>
Message-ID: <20150512195313.784fd074@fsol>

On Tue, 12 May 2015 10:04:39 -0700
Larry Hastings <larry at hastings.org> wrote:
> 
> Workflow 1
> ==========
> 
> When I ship beta 1, we create the 3.5 branch.  trunk becomes 3.6.
> 
> When I ship rc 1, the 3.5 branch becomes 3.5.1.  I maintain a publicly 
> visible repo /on bitbucket/ for 3.5.0, and we use bitbucket "pull 
> requests" for cherry-picks from 3.5.1 into 3.5.0.
> 
> This gives us a pilot project to try out a web GUI for merging.  It 
> makes my workflow easier, as I can push a button to accept 
> cherry-picks.  (If they don't apply cleanly I can tell the author to go 
> fix the conflict and resubmit it.)  The downside: it requires core devs 
> to have and use bitbucket accounts.

Only if they want to submit cherry-picks, of course.

> What do you think?  My votes are as follows:
> 
>     Workflow 0: -0.5
>     Workflow 1: +1
>     Workflow 2: +0.5
> 
> Please cast your votes,

Well, since you're the one doing the work, and you seem to be ok with
the most complicated workflow, I'll vote exactly as you :)

Another entirely different approach, though, is to rework the release
cycle and shorten the time between feature releases. Then we can have
shorter freeze periods (the current freeze durations are really long).

Regards

Antoine.



From ethan at stoneleaf.us  Tue May 12 19:55:41 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Tue, 12 May 2015 10:55:41 -0700
Subject: [Python-Dev] PYTHONHTTPSVERIFY env var
In-Reply-To: <CADiSq7fCAfMb+G6FZq+WiTraMdicSr6wi+YfyoaAMw0wpTP78Q@mail.gmail.com>
References: <CADiSq7cXL-5M+O67i9Vp66g2ksgiqU9xr0oxxOvOY2Y-77hRCQ@mail.gmail.com>
 <554C8C60.8000603@egenix.com>
 <CADiSq7cxy2-vBmitGjZM7cAJ1-marG_2fXHVd05Ot-PgeupCEA@mail.gmail.com>
 <554E4E67.4040405@egenix.com>
 <CAPTjJmryWbDUr6N-vkmUULwNXqCzh6N+xQyN=546_ZkKfM0YUw@mail.gmail.com>
 <CAJ3HoZ1MgTAhDqh1WqwCtPOc=kK2CoLp+pFDDk+GkcVQ3Y1_OA@mail.gmail.com>
 <55506289.1000705@egenix.com>
 <CADiSq7fvUpS62S+uX8eyRvxd=iGi7B7ewLPdR9dzbLn=7BJ4ow@mail.gmail.com>
 <555074C1.80909@egenix.com>
 <CADiSq7eCqv=E9X9wP4U1DLeGA7UpQpjBM=1nRsL9ZRVR=mpp7Q@mail.gmail.com>
 <5550F9D4.5050701@egenix.com>
 <CADiSq7cMaKbzD+BRE4q+0BmiGRa=eePTJ18G7X+CCh+H9a_LAA@mail.gmail.com>
 <5551B26C.2080800@egenix.com>
 <CB0EC535-701A-453A-A0D8-0644D5721E7C@stufft.io>
 <5551DC78.1090200@egenix.com>
 <A8C7AE0F-3812-43E6-BC2C-4426001B27AC@stufft.io>
 <CADiSq7dEGwLnho8ey2OTKkoRBbBvCyNqx1ZTxrrSbPy46YuX_g@mail.gmail.com>
 <CADiSq7fCAfMb+G6FZq+WiTraMdicSr6wi+YfyoaAMw0wpTP78Q@mail.gmail.com>
Message-ID: <55523E9D.9090101@stoneleaf.us>

On 05/12/2015 04:19 AM, Nick Coghlan wrote:

> It occurs to me that the subtitle of PEP 493 could be "All software is
> terrible, but it's often a system administrator's job to make it run
> anyway" :)

+1 QoTW

--
~Ethan~


From nad at acm.org  Tue May 12 20:21:43 2015
From: nad at acm.org (Ned Deily)
Date: Tue, 12 May 2015 11:21:43 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
	Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <55523A8F.60509@hastings.org>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org> <55523A8F.60509@hastings.org>
Message-ID: <0C1213E6-B7CD-4ED7-82ED-0278C8C55EA2@acm.org>

On May 12, 2015, at 10:38, Larry Hastings <larry at hastings.org> wrote:
> On 05/12/2015 10:23 AM, Ned Deily wrote:
>> One possible issue with Workflow 1 is that there would need to be an additional set of buildbots (for 3.5, in addition to the existing 3.x (AKA "trunk"), 3.4, and 2.7 ones) for the period from beta 1 until at least 3.5.0 is released and, ideally, until 3.4 support ends, which following recent past practice, would be relatively soon after 3.5.0.
> Good point.  Though I could conceivably push the 3.5.0 rc repo up to an hg.python.org "server-side clone" and kick off the buildbots from there.

I wasn't worrying about the 3.5.0rc "branch", but, yeah, we could probably improvise ones for that as you suggest. So, buildbots would be: 3.5 branch (->3.5.1 as of beta1), 3.5rc (as needed from rc1 to final), along with the current 3.x (->3.5.0 today, -> 3.6.0, as of beta1), 3.4, and 2.7 buildbots.

I like the idea of experimentally trying the push workflow but, if we are all doing our jobs right, there should be very few changes going in after rc1 so most committers won't need to push anything to the 3.5.0rc repo and, if for some reason they aren't able to use Bitbucket, someone could do it for them.  In other words, while nice, the use of Bitbucket is not a critical feature of Workflow 1.  I think 1 is the best choice with or without the use of Bitbucket.  But we should also recognize that adopting it can make more work for committers fixing bugs over the next few months (between beta1 and final), as we need to consider testing and pushing each fix to default (for 3.6.0), 3.5 (for 3.5.0 until rc1, then for 3.5.1), 3.4 (for 3.4.4), and/or 2.7 (for 2.7.11).  That's the tradeoff for allowing feature work to be committed, integrated, and tested during the beta and rc periods.

>> Where does 3.6.x fit into Workflow 2?
> It doesn't.  Workflows 0 and 2 mean no public development of 3.6 until after 3.5.0 final ships, as per tradition.

Workflow 0 = -1
Workflow 1 = +1
Workflow 2 = -0.5

--
  Ned Deily
  nad at acm.org -- []



From ethan at stoneleaf.us  Tue May 12 20:29:30 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Tue, 12 May 2015 11:29:30 -0700
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <5552468A.1000201@stoneleaf.us>

On 05/12/2015 10:04 AM, Larry Hastings wrote:

Workflow 1: +1

--
~Ethan~

From barry at python.org  Tue May 12 21:03:26 2015
From: barry at python.org (Barry Warsaw)
Date: Tue, 12 May 2015 15:03:26 -0400
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <55523A8F.60509@hastings.org>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org>
 <55523A8F.60509@hastings.org>
Message-ID: <20150512150326.5f5f0d25@anarchist.wooz.org>

On May 12, 2015, at 10:38 AM, Larry Hastings wrote:

>It doesn't.  Workflows 0 and 2 mean no public development of 3.6 until after
>3.5.0 final ships, as per tradition.

I still think it's a good idea to focus primarily on 3.5 while it's in the
beta/rc period, but if you want to allow for landings on what will be 3.6,
then going with workflow 1 will be an interesting social experiment.

I'm fine with any of them as long as the workflow is *well documented*,
preferably in the devguide.

Cheers,
-Barry

From nad at acm.org  Tue May 12 21:13:14 2015
From: nad at acm.org (Ned Deily)
Date: Tue, 12 May 2015 12:13:14 -0700
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
	periods? (Please vote!)
References: <555232A7.7060002@hastings.org>
 <CAPTjJmoUQCESyAG4_+Zm4p16AgVh3H5k60dd6H6gEVwZHo_yVg@mail.gmail.com>
 <55523ADB.4000508@hastings.org>
Message-ID: <nad-267C41.12131412052015@news.gmane.org>

In article <55523ADB.4000508 at hastings.org>,
 Larry Hastings <larry at hastings.org> wrote:

> On 05/12/2015 10:23 AM, Chris Angelico wrote:
> > Will this model be plausibly extensible to every release? For
> > instance, when a 3.5.1 rc is cut, will the 3.5 branch immediately
> > become 3.5.2, with a new 3.5.1 branch being opened on bitbucket?
> Yes, we could always do it that way, though in the past we haven't 
> bothered.  Development on previous versions slows down a great deal 
> after x.y.1.

Well, we *could*.  But that is hardly worth it as we don't do long-term 
maintenance of each maintenance release.  In other words, from a 
python-dev perspective, 3.5.1 is dead the moment 3.5.2 goes out the 
door; we only maintain the most recent maintenance release of a major 
release cycle.  So there is no need for a 3.5.x branch other than 
possibly during the typically very short rc cycle of a maintenance 
release and even then we usually see at most a couple of rc fixes and 
usually none, as expected.

-- 
 Ned Deily,
 nad at acm.org


From tjreedy at udel.edu  Wed May 13 00:08:30 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 12 May 2015 18:08:30 -0400
Subject: [Python-Dev] Tracker reviews look like spam
Message-ID: <mittl6$bj3$1@ger.gmane.org>

Gmail dumps patch review email in my junk box. The problem seems to be 
the spoofed From: header.

Received: from psf.upfronthosting.co.za ([2a01:4f8:131:2480::3])
         by mx.google.com with ESMTP id 
m1si26039166wjy.52.2015.05.12.00.20.38
         for <tjreedy at udel.edu>;
         Tue, 12 May 2015 00:20:38 -0700 (PDT)
Received-SPF: softfail (google.com: domain of transitioning 
storchaka at gmail.com does not designate 2a01:4f8:131:2480::3 as permitted 
sender) client-ip=2a01:4f8:131:2480::3;

Tracker reviews are the only false positives in my junk list. Otherwise, 
I might stop reviewing. Verizon does not even deliver mail that fails 
its junk test, so I would not be surprised if there are people who 
simply do not get emailed reviews.

Tracker posts are sent from Person Name <report at bugs.python.org>
Perhaps reviews could come 'from' Person Name <review at bugs.python.org>

Even direct tracker posts just get a neutral score.
Received-SPF: neutral (google.com: 2a01:4f8:131:2480::3 is neither 
permitted nor denied by best guess record for domain of 
roundup-admin at psf.upfronthosting.co.za) client-ip=2a01:4f8:131:2480::3;

SPF is Sender Policy Framework
https://en.wikipedia.org/wiki/Sender_Policy_Framework

Checkins mail, for instance, gets an SPF 'pass' because python.org 
designates mail.python.org as a permitted sender.

-- 
Terry Jan Reedy


From dw+python-dev at hmmz.org  Wed May 13 00:15:25 2015
From: dw+python-dev at hmmz.org (David Wilson)
Date: Tue, 12 May 2015 22:15:25 +0000
Subject: [Python-Dev] Tracker reviews look like spam
In-Reply-To: <mittl6$bj3$1@ger.gmane.org>
References: <mittl6$bj3$1@ger.gmane.org>
Message-ID: <20150512221524.GC1768@k3>

SPF only covers the envelope sender, so it should be possible to set
that to something that validates with SPF, keep the RFC822 From: header
as it is, and maybe(?) include a separate Sender: header matching the
envelope address.
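For illustration, here is a minimal sketch of that separation in Python (the addresses are hypothetical; the tracker's real sender names would differ). The envelope sender is handed to SMTP separately from the message headers, so it is the only address SPF needs to validate:

```python
from email.message import EmailMessage
import smtplib  # used only in the commented-out send step below

msg = EmailMessage()
# The RFC822 From: header stays exactly as the tracker sends it today.
msg["From"] = "Person Name <report@bugs.python.org>"
msg["To"] = "tjreedy@udel.edu"
msg["Subject"] = "[issue12345] patch review"
# Optional Sender: header matching the envelope address (hypothetical name).
envelope_sender = "roundup-bounces@bugs.python.org"
msg["Sender"] = envelope_sender
msg.set_content("Review comments go here.")

# SPF checks the envelope sender (SMTP MAIL FROM), which is passed
# independently of the message headers:
# with smtplib.SMTP("localhost") as s:
#     s.send_message(msg, from_addr=envelope_sender)
```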


David

On Tue, May 12, 2015 at 06:08:30PM -0400, Terry Reedy wrote:
> Gmail dumps patch review email in my junk box. The problem seems to be the
> spoofed From: header.
> 
> Received: from psf.upfronthosting.co.za ([2a01:4f8:131:2480::3])
>         by mx.google.com with ESMTP id
> m1si26039166wjy.52.2015.05.12.00.20.38
>         for <tjreedy at udel.edu>;
>         Tue, 12 May 2015 00:20:38 -0700 (PDT)
> Received-SPF: softfail (google.com: domain of transitioning
> storchaka at gmail.com does not designate 2a01:4f8:131:2480::3 as permitted
> sender) client-ip=2a01:4f8:131:2480::3;
> 
> Tracker reviews are the only false positives in my junk list. Otherwise, I
> might stop reviewing. Verizon does not even deliver mail that fails its junk
> test, so I would not be surprised if there are people who simply do not get
> emailed reviews.
> 
> Tracker posts are sent from Person Name <report at bugs.python.org>
> Perhaps reviews could come 'from' Person Name <review at bugs.python.org>
> 
> Even direct tracker posts just get a neutral score.
> Received-SPF: neutral (google.com: 2a01:4f8:131:2480::3 is neither permitted
> nor denied by best guess record for domain of
> roundup-admin at psf.upfronthosting.co.za) client-ip=2a01:4f8:131:2480::3;
> 
> SPF is Sender Policy Framework
> https://en.wikipedia.org/wiki/Sender_Policy_Framework
> 
> Checkins mail, for instance, gets an SPF 'pass' because python.org
> designates mail.python.org as a permitted sender.

From christian at python.org  Wed May 13 00:18:58 2015
From: christian at python.org (Christian Heimes)
Date: Wed, 13 May 2015 00:18:58 +0200
Subject: [Python-Dev] Fwd: Coverity Scan update
In-Reply-To: <CAP7+vJ+-DOwV9-6ON2nupsm0h_ZuZFdCj5K=CStcoDtOQe2NwA@mail.gmail.com>
References: <BLUPR05MB547266807DC8DBA22B2C983C5DA0@BLUPR05MB547.namprd05.prod.outlook.com>
 <CAP7+vJ+-DOwV9-6ON2nupsm0h_ZuZFdCj5K=CStcoDtOQe2NwA@mail.gmail.com>
Message-ID: <55527C52.8040808@python.org>

On 2015-05-12 17:28, Guido van Rossum wrote:
> ---------- Forwarded message ----------
> From: "Dakshesh Vyas" <dvyas at coverity.com <mailto:dvyas at coverity.com>>
> Date: May 12, 2015 1:08 AM
> Subject: Coverity Scan update
> To: "guido at python.org <mailto:guido at python.org>" <guido at python.org
> <mailto:guido at python.org>>
> Cc:
> 
> Hello Guido van Rossum,
> 
> Thank you for using Coverity Scan.
> With more than 4000 open source projects now registered at Coverity
> Scan, we are committed to help the open source community find quality
> and security issues early in development.
> 
> We would like to inform you that you can now nominate your favorite
> defect! Tell us which defects you are glad were found using Coverity
> Scan and get special gifts like a Samsung Gear 2 Smartwatch, Code Black
> Drone or Tile for iOS and Android. Gifts will be distributed every month
> based on defect nomination!
> 
> We recently added new hardware resource and upgraded our Scan server to
> our latest 7.6 version, which includes great new features:

Thanks Guido,

I didn't get the mail although I'm a project admin. Dakshesh must have
sent it out to the initial project owner.

I'm planning to do a final round of fixes when the first beta is out.
Currently CPython is down to about 5 issues. The new version might
reveal additional problems or false positives. Let's see if I can get it
down to zero again.

Christian

From cs at zip.com.au  Wed May 13 00:57:29 2015
From: cs at zip.com.au (Cameron Simpson)
Date: Wed, 13 May 2015 08:57:29 +1000
Subject: [Python-Dev] Tracker reviews look like spam
In-Reply-To: <20150512221524.GC1768@k3>
References: <20150512221524.GC1768@k3>
Message-ID: <20150512225729.GA5739@cskk.homeip.net>

On 12May2015 22:15, David Wilson <dw+python-dev at hmmz.org> wrote:
>SPF only covers the envelope sender, so it should be possible to set
>that to something that validates with SPF, keep the RFC822 From: header
>as it is, and maybe(?) include a separate Sender: header matching the
>envelope address.
>
>David

Indeed. That sounds sane to me too. Google's complaint is SPF specific, so 
hopefully that is the criterion for the spam rating.

It looks like bugs.python.org does not have an SPF record at all - neither SPF 
nor TXT. (You really want both, same format, to support older DNS clients).

I'm not sure you need a Sender: (though it wouldn't hurt), given that the From: 
is already a "system" like address ("<report@") and not a forged From: eg 
"Terry Reedy <tjreedy at udel.edu>" as a mailing list would do.

Cheers,
Cameron Simpson <cs at zip.com.au>


From larry at hastings.org  Wed May 13 01:36:22 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 16:36:22 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <0C1213E6-B7CD-4ED7-82ED-0278C8C55EA2@acm.org>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org> <55523A8F.60509@hastings.org>
 <0C1213E6-B7CD-4ED7-82ED-0278C8C55EA2@acm.org>
Message-ID: <55528E76.4090505@hastings.org>

On 05/12/2015 11:21 AM, Ned Deily wrote:
> I like the idea of experimentally trying the push workflow but, if we are all doing our jobs right, there should be very few changes going in after rc1 so most committers won't need to push anything to the 3.5.0rc repo and, if for some reason they aren't able to use Bitbucket, someone could do it for them.

For 3.4, I had an amazing number of cherry-picked revisions.  By the end 
it was... 72? over 100?  I'm no longer even sure.

Granted, 3.4 had the new module asyncio, and I got a deluge of 
cherry-pick requests just for that one module.  I permitted 'em because 
a) they wanted it to be ready for prime time when 3.4 shipped, b) there 
was no installed base, and c) a healthy percentage of those changes were 
doc-only.  (But if Victor tries that again during the 3.5 betas, he may 
have another thing coming!)

BTW, this workload was exacerbated by my foolish desire to keep the 
revision DAG nice and clean.  So I was actually starting over from 
scratch and redoing all the cherry-picking every couple of days, just so 
I could get all the cherry-picked revisions in strict chronological 
order.  By the end I had evolved a pretty elaborate guided-process 
automation script to do all the cherry-picking, which you can see here 
if you're curious:

    https://hg.python.org/release/file/b7529318e7cc/3.4/threefourtool.py

I have since given up this foolish desire.  I'll be happy to have a nice 
grubby revision DAG if it means I can spend more time on the internet 
cruising for funny cat videos.


In short, as release manager, I'm pretty stoked by the idea of pressing 
a big shiny button on a website exactly once when I accept a cherry-pick 
request.


//arry/

From dirkjan at ochtman.nl  Wed May 13 02:11:44 2015
From: dirkjan at ochtman.nl (Dirkjan Ochtman)
Date: Tue, 12 May 2015 17:11:44 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <55528E76.4090505@hastings.org>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org>
 <55523A8F.60509@hastings.org> <0C1213E6-B7CD-4ED7-82ED-0278C8C55EA2@acm.org>
 <55528E76.4090505@hastings.org>
Message-ID: <CAKmKYaA=bLLcaejRVpszGpWvLSY_vKvdERNHNDCWV0wNukQ07A@mail.gmail.com>

On Tue, May 12, 2015 at 4:36 PM, Larry Hastings <larry at hastings.org> wrote:
> BTW, this workload was exacerbated by my foolish desire to keep the revision
> DAG nice and clean.  So I was actually starting over from scratch and
> redoing all the cherry-picking every couple of days, just so I could get all
> the cherry-picked revisions in strict chronological order.  By the end I had
> evolved a pretty elaborate guided-process automation script to do all the
> cherry-picking, which you can see here if you're curious:

Couldn't you just keep this as a branch that you then keep rebasing
(without unlinking the original branch)? It doesn't seem like
something that needs a one-off script, to me.

Cheers,

Dirkjan

From larry at hastings.org  Wed May 13 02:15:19 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 17:15:19 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <CAKmKYaA=bLLcaejRVpszGpWvLSY_vKvdERNHNDCWV0wNukQ07A@mail.gmail.com>
References: <555232A7.7060002@hastings.org>
 <B378A343-D109-460D-8EC7-C2CAFF8C6B96@acm.org> <55523A8F.60509@hastings.org>
 <0C1213E6-B7CD-4ED7-82ED-0278C8C55EA2@acm.org>
 <55528E76.4090505@hastings.org>
 <CAKmKYaA=bLLcaejRVpszGpWvLSY_vKvdERNHNDCWV0wNukQ07A@mail.gmail.com>
Message-ID: <55529797.3070804@hastings.org>

On 05/12/2015 05:11 PM, Dirkjan Ochtman wrote:
> Couldn't you just keep this as a branch that you then keep rebasing
> (without unlinking the original branch)? It doesn't seem like
> something that needs a one-off script, to me.

Probably.  It's water under the bridge now--that all happened last 
February and I'm not doing it that way again.


//arry/

From ncoghlan at gmail.com  Wed May 13 02:19:56 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 13 May 2015 10:19:56 +1000
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <CAP1=2W7PGFVKmPXSMhy_YbqQC-zqxpFm+xYkU2drxnNZ2LsAxw@mail.gmail.com>
References: <555232A7.7060002@hastings.org>
 <CAP1=2W7PGFVKmPXSMhy_YbqQC-zqxpFm+xYkU2drxnNZ2LsAxw@mail.gmail.com>
Message-ID: <CADiSq7c0AYZOWRaawwAR6Vrzd75vXXfKz7H8Z8x8kXRz9UrTmQ@mail.gmail.com>

On 13 May 2015 03:47, "Brett Cannon" <brett at python.org> wrote:
>
>
>
> On Tue, May 12, 2015 at 1:05 PM Larry Hastings <larry at hastings.org> wrote:
>>
>> [SNIP]
>>
>>
>> What do you think?  My votes are as follows:
>>>
>>> Workflow 0: -0.5
>>> Workflow 1: +1
>>> Workflow 2: +0.5
>>
>>
>> Please cast your votes,
>
>
> Workflow 0: -0
> Workflow 1: +1
> Workflow 2:  +0

Workflow 0: -0
Workflow 1: +1
Workflow 2:  +0

That's taking into account the clarification that the buildbots will be set
up to track the 3.5.x branch after the beta is forked, and that Larry will
also push the 3.5rcX repo to hg.python.org for branch testing.

(Possible alternative plan for the latter: rc1 isn't until August, and I
could aim to have a pilot Kallithea instance set up by then that uses
bugs.python.org credentials to log in. If we didn't get that up and running
for some reason, BitBucket could still be a fallback plan)

Cheers,
Nick.

From larry at hastings.org  Wed May 13 02:29:59 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 12 May 2015 17:29:59 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <555243F2.802@jcea.es>
References: <555232A7.7060002@hastings.org> <555243F2.802@jcea.es>
Message-ID: <55529B07.70805@hastings.org>

On 05/12/2015 11:18 AM, Jesus Cea wrote:
> Larry, could you comment about the impact in the buildbots?. I suppose
> option #1 could allows us to test both 3.5 and 3.6 changes. Would you
> confirm this?

Workflow #1 gets us automatic buildbot testing for the 3.5 branch (betas 
and 3.5.1) and trunk (3.6).  It doesn't get us testing of the 3.5.0 
release candidates automatically, because those would be hosted at 
bitbucket.  However: hg.python.org allows creating "server-side clones" 
which can manually run tests against the buildbots.  So if I create a 
server-side clone, then push the release candidate branch there, I can 
kick off buildbot tests.  Who knows, maybe I'd even automate that process.


//arry/

From rosuav at gmail.com  Wed May 13 07:20:28 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Wed, 13 May 2015 15:20:28 +1000
Subject: [Python-Dev] Tracker reviews look like spam
In-Reply-To: <20150512221524.GC1768@k3>
References: <mittl6$bj3$1@ger.gmane.org>
	<20150512221524.GC1768@k3>
Message-ID: <CAPTjJmoz+nDE4dXnN7CaVyt7D+L8XY8C27Y-F2Qoj+JGBdojBw@mail.gmail.com>

On Wed, May 13, 2015 at 8:15 AM, David Wilson <dw+python-dev at hmmz.org> wrote:
> SPF only covers the envelope sender, so it should be possible to set
> that to something that validates with SPF, keep the RFC822 From: header
> as it is, and maybe(?) include a separate Sender: header matching the
> envelope address.

As Cameron says, Sender: isn't necessary - just have the envelope
address be bounces@ or something and it should be fine. This is how
SPF and (e.g.) mailing lists interact.

ChrisA

From storchaka at gmail.com  Wed May 13 08:50:24 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Wed, 13 May 2015 09:50:24 +0300
Subject: [Python-Dev] cpython: Issue #20172: Convert the winsound module
 to Argument Clinic.
In-Reply-To: <20150513063228.8157.35819@psf.io>
References: <20150513063228.8157.35819@psf.io>
Message-ID: <mius7g$b6m$1@ger.gmane.org>

On 13.05.15 09:32, zach.ware wrote:
> https://hg.python.org/cpython/rev/d3582826d24c
> changeset:   96006:d3582826d24c
> user:        Zachary Ware <zachary.ware at gmail.com>
> date:        Wed May 13 01:21:21 2015 -0500
> summary:
>    Issue #20172: Convert the winsound module to Argument Clinic.

> +/*[clinic input]
> +winsound.PlaySound
> +
> +    sound: Py_UNICODE(nullable=True)

I think this is no longer correct syntax; it should be 
Py_UNICODE(accept={str, NoneType}).
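With that fix, the clinic block would presumably read something like this (a sketch only -- the `flags` parameter and the docstring line here are assumed, not taken from the commit):

```c
/*[clinic input]
winsound.PlaySound

    sound: Py_UNICODE(accept={str, NoneType})
    flags: int

A wrapper around the Windows PlaySound API.
[clinic start generated code]*/
```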



From berker.peksag at gmail.com  Wed May 13 08:57:15 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Wed, 13 May 2015 09:57:15 +0300
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <CAF4280+GF0HJV87BYY9_nJsiROn8yMOQJLzMwycPG4sY9B6-Rg@mail.gmail.com>

On Tue, May 12, 2015 at 8:04 PM, Larry Hastings <larry at hastings.org> wrote:
> What do you think?  My votes are as follows:
>
> Workflow 0: -0.5
> Workflow 1: +1
> Workflow 2: +0.5
>
>
> Please cast your votes,

Workflow 0: -0
Workflow 1: +1
Workflow 2: +0

--Berker

From larry at hastings.org  Wed May 13 09:59:44 2015
From: larry at hastings.org (Larry Hastings)
Date: Wed, 13 May 2015 00:59:44 -0700
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <CADiSq7c0AYZOWRaawwAR6Vrzd75vXXfKz7H8Z8x8kXRz9UrTmQ@mail.gmail.com>
References: <555232A7.7060002@hastings.org>	<CAP1=2W7PGFVKmPXSMhy_YbqQC-zqxpFm+xYkU2drxnNZ2LsAxw@mail.gmail.com>
 <CADiSq7c0AYZOWRaawwAR6Vrzd75vXXfKz7H8Z8x8kXRz9UrTmQ@mail.gmail.com>
Message-ID: <55530470.3010705@hastings.org>

On 05/12/2015 05:19 PM, Nick Coghlan wrote:
>
> Workflow 0: -0
> Workflow 1: +1
> Workflow 2:  +0
>
> That's taking into account the clarification that the buildbots will 
> be set up to track the 3.5.x branch after the beta is forked, and that 
> Larry will also push the 3.5rcX repo to hg.python.org 
> <http://hg.python.org> for branch testing.
>

I sort of assumed the buildbots would start building the 3.5 branch once 
it was created, yes.  (Are there any branches in the cpython repo that 
they ignore?)

When you say "branch testing", you mean "running the buildbots against 
it"?  Right now the UI for doing that is pretty clunky. Kicking off a 
build against a server-side clone (iirc) requires clicking through a 
couple web pages, filling out a form, and clicking on a teeny-tiny 
button.  It would help *tremendously* here if I could get this 
automated, so I could run a script locally that made everything happen.

Is there a remote API for starting builds?  Or existing automation of 
any kind?  Who should I talk to about this stuff?


> (Possible alternative plan for the latter: rc1 isn't until August, and 
> I could aim to have a pilot Kallithea instance set up by then that 
> uses bugs.python.org <http://bugs.python.org> credentials to log in. 
> If we didn't get that up and running for some reason, BitBucket could 
> still be a fallback plan)
>

I'm happy to consider it.  My proposed workflow is only using a very 
small set of features, and I gather Kallithea already has those 
features.  Bolting on authentication from bugs.python.org would mean 
*less* friction than using Bitbucket.

But I couldn't say for sure until I got to try it.  So get cracking, Nick!


//arry/

From eric at trueblade.com  Wed May 13 10:56:55 2015
From: eric at trueblade.com (Eric V. Smith)
Date: Wed, 13 May 2015 04:56:55 -0400
Subject: [Python-Dev] [Python-checkins] cpython: Issue #24064:
 Property() docstrings are now writeable.
In-Reply-To: <20150513081022.11936.7286@psf.io>
References: <20150513081022.11936.7286@psf.io>
Message-ID: <555311D7.5040108@trueblade.com>

On 5/13/2015 4:10 AM, raymond.hettinger wrote:
> https://hg.python.org/cpython/rev/1e8a768fa0a5
> changeset:   96010:1e8a768fa0a5
> user:        Raymond Hettinger <python at rcn.com>
> date:        Wed May 13 01:09:59 2015 -0700
> summary:
>   Issue #24064: Property() docstrings are now writeable.
> (Patch by Berker Peksag.)

> diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst
> --- a/Doc/library/collections.rst
> +++ b/Doc/library/collections.rst
> @@ -924,6 +924,15 @@
>  
>      >>> Point3D = namedtuple('Point3D', Point._fields + ('z',))
>  
> +Docstrings can be customized by making direct assignments to the ``__doc__``
> +fields:
> +
> +   >>> Book = namedtuple('Book', ['id', 'title', 'authors'])
> +   >>> Book.__doc__ = 'Hardcover book in active collection'
> +   >>> Book.id = '13-digit ISBN'
> +   >>> Book.title = 'Title of first printing'
> +   >>> Book.author = 'List of authors sorted by last name'

Should these be:
Book.id.__doc__ = ...
etc.?

> +    Point = namedtuple('Point', ['x', 'y'])
> +    Point.__doc__ = 'ordered pair'
> +    Point.x.__doc__ = 'abscissa'
> +    Point.y.__doc__ = 'ordinate'

These lines from /Doc/whatsnew/3.5.rst would make me think so.
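
For reference, a minimal sketch of what those whatsnew lines imply
(assuming Python 3.5+, where issue #24064 makes property docstrings
writable):

```python
from collections import namedtuple

Book = namedtuple('Book', ['id', 'title', 'authors'])
Book.__doc__ = 'Hardcover book in active collection'
# Assign to each field descriptor's __doc__, not to the field itself;
# assigning to Book.id directly would clobber the descriptor.
Book.id.__doc__ = '13-digit ISBN'
Book.title.__doc__ = 'Title of first printing'
Book.authors.__doc__ = 'List of authors sorted by last name'
```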

Eric.


From berker.peksag at gmail.com  Wed May 13 11:18:22 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Wed, 13 May 2015 12:18:22 +0300
Subject: [Python-Dev] [Python-checkins] cpython: Issue #24064:
 Property() docstrings are now writeable.
In-Reply-To: <555311D7.5040108@trueblade.com>
References: <20150513081022.11936.7286@psf.io> <555311D7.5040108@trueblade.com>
Message-ID: <CAF4280+WtfZ=S62-NROu5iQ1y29PSmT+zSfTyMjhaLkcWeieWg@mail.gmail.com>

On Wed, May 13, 2015 at 11:56 AM, Eric V. Smith <eric at trueblade.com> wrote:
> On 5/13/2015 4:10 AM, raymond.hettinger wrote:
>> https://hg.python.org/cpython/rev/1e8a768fa0a5
>> changeset:   96010:1e8a768fa0a5
>> user:        Raymond Hettinger <python at rcn.com>
>> date:        Wed May 13 01:09:59 2015 -0700
>> summary:
>>   Issue #24064: Property() docstrings are now writeable.
>> (Patch by Berker Peksag.)
>
>> diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst
>> --- a/Doc/library/collections.rst
>> +++ b/Doc/library/collections.rst
>> @@ -924,6 +924,15 @@
>>
>>      >>> Point3D = namedtuple('Point3D', Point._fields + ('z',))
>>
>> +Docstrings can be customized by making direct assignments to the ``__doc__``
>> +fields:
>> +
>> +   >>> Book = namedtuple('Book', ['id', 'title', 'authors'])
>> +   >>> Book.__doc__ = 'Hardcover book in active collection'
>> +   >>> Book.id = '13-digit ISBN'
>> +   >>> Book.title = 'Title of first printing'
>> +   >>> Book.author = 'List of authors sorted by last name'
>
> Should these be:
> Book.id.__doc__ = ...
> etc.?

Hi Eric,

Good catch. Fixed in https://hg.python.org/cpython/rev/bde652ae05fd

Thanks!

--Berker

>> +    Point = namedtuple('Point', ['x', 'y'])
>> +    Point.__doc__ = 'ordered pair'
>> +    Point.x.__doc__ = 'abscissa'
>> +    Point.y.__doc__ = 'ordinate'
>
> These lines from /Doc/whatsnew/3.5.rst would make me think so.
>
> Eric.
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/berker.peksag%40gmail.com

From zachary.ware+pydev at gmail.com  Wed May 13 17:02:11 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Wed, 13 May 2015 10:02:11 -0500
Subject: [Python-Dev] cpython: Issue #20172: Convert the winsound module
 to Argument Clinic.
In-Reply-To: <mius7g$b6m$1@ger.gmane.org>
References: <20150513063228.8157.35819@psf.io> <mius7g$b6m$1@ger.gmane.org>
Message-ID: <CAKJDb-NJaSjG0v6WrgxnTaYpCRG8Std9oH3+z8CSa81W_g3R_g@mail.gmail.com>

On Wed, May 13, 2015 at 1:50 AM, Serhiy Storchaka <storchaka at gmail.com> wrote:
> On 13.05.15 09:32, zach.ware wrote:
>>
>> https://hg.python.org/cpython/rev/d3582826d24c
>> changeset:   96006:d3582826d24c
>> user:        Zachary Ware <zachary.ware at gmail.com>
>> date:        Wed May 13 01:21:21 2015 -0500
>> summary:
>>    Issue #20172: Convert the winsound module to Argument Clinic.
>
>
>> +/*[clinic input]
>> +winsound.PlaySound
>> +
>> +    sound: Py_UNICODE(nullable=True)
>
>
> I think this is no longer correct syntax. Should be Py_UNICODE(accept={str,
> NoneType}).

Oh, whoops.  I forgot to run clinic over things again before committing.

We need to stick that into the Windows build process somewhere...

-- 
Zach

From sandro.tosi at gmail.com  Wed May 13 19:22:17 2015
From: sandro.tosi at gmail.com (Sandro Tosi)
Date: Wed, 13 May 2015 13:22:17 -0400
Subject: [Python-Dev] Fwd: [docs] Python documentation missing
In-Reply-To: <CAEjkP-LQjy2q6o-kd2=GnBUTh-HHLrT+p_7qSAEFOq707GjAFQ@mail.gmail.com>
References: <CAEjkP-LQjy2q6o-kd2=GnBUTh-HHLrT+p_7qSAEFOq707GjAFQ@mail.gmail.com>
Message-ID: <CAB4XWXwDr2U3tZiAPQCBYwOfOZ6rtcAoDr4his7i+RF70ZrJPQ@mail.gmail.com>

Hello,
this happens every time we cut a RC release: the files referenced in
the download section are missing and (rightfully so) people complain


---------- Forwarded message ----------
From: Ronald Legere <rjljr2 at gmail.com>
Date: Wed, May 13, 2015 at 11:14 AM
Subject: [docs] Python documentation missing
To: docs at python.org



The links to the downloadable documentation for python 2.7 are 404 on this page:

https://docs.python.org/2/download.html

Are they available somewhere????

--
Ron Legere (rjljr2 at gmail.com)
 It is the time you have wasted for your rose that makes your rose so important


_______________________________________________
docs mailing list
docs at python.org
https://mail.python.org/mailman/listinfo/docs



-- 
Sandro Tosi (aka morph, morpheus, matrixhasu)
My website: http://matrixhasu.altervista.org/
Me at Debian: http://wiki.debian.org/SandroTosi

From benjamin at python.org  Wed May 13 19:36:53 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Wed, 13 May 2015 13:36:53 -0400
Subject: [Python-Dev] Fwd: [docs] Python documentation missing
In-Reply-To: <CAB4XWXwDr2U3tZiAPQCBYwOfOZ6rtcAoDr4his7i+RF70ZrJPQ@mail.gmail.com>
References: <CAEjkP-LQjy2q6o-kd2=GnBUTh-HHLrT+p_7qSAEFOq707GjAFQ@mail.gmail.com>
 <CAB4XWXwDr2U3tZiAPQCBYwOfOZ6rtcAoDr4his7i+RF70ZrJPQ@mail.gmail.com>
Message-ID: <1431538613.3052112.267951521.146E6A1E@webmail.messagingengine.com>



On Wed, May 13, 2015, at 13:22, Sandro Tosi wrote:
> Hello,
> this happens every time we cut a RC release: the files referenced in
> the download section are missing and (rightfully so) people complain

In this case, it was because the Python 3.5 docs were failing to build.
Should be fixed in ~12 hours.

From sandro.tosi at gmail.com  Wed May 13 20:02:21 2015
From: sandro.tosi at gmail.com (Sandro Tosi)
Date: Wed, 13 May 2015 14:02:21 -0400
Subject: [Python-Dev] Fwd: [docs] Python documentation missing
In-Reply-To: <1431538613.3052112.267951521.146E6A1E@webmail.messagingengine.com>
References: <CAEjkP-LQjy2q6o-kd2=GnBUTh-HHLrT+p_7qSAEFOq707GjAFQ@mail.gmail.com>
 <CAB4XWXwDr2U3tZiAPQCBYwOfOZ6rtcAoDr4his7i+RF70ZrJPQ@mail.gmail.com>
 <1431538613.3052112.267951521.146E6A1E@webmail.messagingengine.com>
Message-ID: <CAB4XWXxBvgAu8yV=Ss5oXwe5C4fNgDyJYBHKngtFZiKtM5mn3w@mail.gmail.com>

On Wed, May 13, 2015 at 1:36 PM, Benjamin Peterson <benjamin at python.org> wrote:
>
>
> On Wed, May 13, 2015, at 13:22, Sandro Tosi wrote:
>> Hello,
>> this happens every time we cut a RC release: the files referenced in
>> the download section are missing and (rightfully so) people complain
>
> In this case, it was because the Python 3.5 docs were failing to build.
> Should be fixed in ~12 hours.

Awesome! That was also meant as a heads-up that cutting an RC release
has regularly broken the doc download in the past (since at least last
summer, as far as I can remember).

Cheers,
-- 
Sandro Tosi (aka morph, morpheus, matrixhasu)
My website: http://matrixhasu.altervista.org/
Me at Debian: http://wiki.debian.org/SandroTosi

From tjreedy at udel.edu  Wed May 13 23:05:16 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 13 May 2015 17:05:16 -0400
Subject: [Python-Dev] Repository builds as 2.7.8+
Message-ID: <mj0eak$9ji$1@ger.gmane.org>

Python 2.7.8+ (default, May 13 2015, 16:46:29) [MSC v.1500 32 bit 
(Intel)] on win32

Shouldn't this be 2.7.9+ or 2.7.10rc1?

-- 
Terry Jan Reedy


From zachary.ware+pydev at gmail.com  Wed May 13 23:45:28 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Wed, 13 May 2015 16:45:28 -0500
Subject: [Python-Dev] Repository builds as 2.7.8+
In-Reply-To: <mj0eak$9ji$1@ger.gmane.org>
References: <mj0eak$9ji$1@ger.gmane.org>
Message-ID: <CAKJDb-Pc1Z9viuigX7N4aecWtdW9v24AakmcLCvw+cza--k1XQ@mail.gmail.com>

On Wed, May 13, 2015 at 4:05 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> Python 2.7.8+ (default, May 13 2015, 16:46:29) [MSC v.1500 32 bit (Intel)]
> on win32
>
> Shouldn't this be 2.7.9+ or 2.7.10rc1?

Make sure your repository is up to date, the patchlevel is correct at
the current tip of the 2.7 branch.

-- 
Zach

From tjreedy at udel.edu  Thu May 14 00:15:10 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 13 May 2015 18:15:10 -0400
Subject: [Python-Dev] Repository builds as 2.7.8+
In-Reply-To: <CAKJDb-Pc1Z9viuigX7N4aecWtdW9v24AakmcLCvw+cza--k1XQ@mail.gmail.com>
References: <mj0eak$9ji$1@ger.gmane.org>
 <CAKJDb-Pc1Z9viuigX7N4aecWtdW9v24AakmcLCvw+cza--k1XQ@mail.gmail.com>
Message-ID: <mj0idm$cjk$1@ger.gmane.org>

On 5/13/2015 5:45 PM, Zachary Ware wrote:
> On Wed, May 13, 2015 at 4:05 PM, Terry Reedy <tjreedy at udel.edu> wrote:
>> Python 2.7.8+ (default, May 13 2015, 16:46:29) [MSC v.1500 32 bit (Intel)]
>> on win32
>>
>> Shouldn't this be 2.7.9+ or 2.7.10rc1?
>
> Make sure your repository is up to date, the patchlevel is correct at
> the current tip of the 2.7 branch.

I had just installed tortoise/hg 3.4, pulled and updated twice.
Then ran external.bat on all branches.
2.7 tip is Kuchling's push at 15:25 utc

I deleted python_d.exe and rebuilt it.  External timestamp in Explorer 
is updated, internal one displayed on startup is not.

The only changes in my working directory are to idle files.

-- 
Terry Jan Reedy


From solipsis at pitrou.net  Thu May 14 00:22:48 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 14 May 2015 00:22:48 +0200
Subject: [Python-Dev] Repository builds as 2.7.8+
References: <mj0eak$9ji$1@ger.gmane.org>
 <CAKJDb-Pc1Z9viuigX7N4aecWtdW9v24AakmcLCvw+cza--k1XQ@mail.gmail.com>
 <mj0idm$cjk$1@ger.gmane.org>
Message-ID: <20150514002248.4d92e981@fsol>

On Wed, 13 May 2015 18:15:10 -0400
Terry Reedy <tjreedy at udel.edu> wrote:

> On 5/13/2015 5:45 PM, Zachary Ware wrote:
> > On Wed, May 13, 2015 at 4:05 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> >> Python 2.7.8+ (default, May 13 2015, 16:46:29) [MSC v.1500 32 bit (Intel)]
> >> on win32
> >>
> >> Shouldn't this be 2.7.9+ or 2.7.10rc1?
> >
> > Make sure your repository is up to date, the patchlevel is correct at
> > the current tip of the 2.7 branch.
> 
> I had just installed tortoise/hg 3.4, pulled and updated twice.
> Then ran external.bat on all branches.
> 2.7 tip is Kuchling's push at 15:25 utc

What does "hg sum" say?




From tjreedy at udel.edu  Thu May 14 01:48:40 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 13 May 2015 19:48:40 -0400
Subject: [Python-Dev] Repository builds as 2.7.8+
In-Reply-To: <20150514002248.4d92e981@fsol>
References: <mj0eak$9ji$1@ger.gmane.org>
 <CAKJDb-Pc1Z9viuigX7N4aecWtdW9v24AakmcLCvw+cza--k1XQ@mail.gmail.com>
 <mj0idm$cjk$1@ger.gmane.org> <20150514002248.4d92e981@fsol>
Message-ID: <mj0nt0$ul5$1@ger.gmane.org>

On 5/13/2015 6:22 PM, Antoine Pitrou wrote:
> On Wed, 13 May 2015 18:15:10 -0400
> Terry Reedy <tjreedy at udel.edu> wrote:
>
>> On 5/13/2015 5:45 PM, Zachary Ware wrote:
>>> On Wed, May 13, 2015 at 4:05 PM, Terry Reedy <tjreedy at udel.edu> wrote:
>>>> Python 2.7.8+ (default, May 13 2015, 16:46:29) [MSC v.1500 32 bit (Intel)]
>>>> on win32
>>>>
>>>> Shouldn't this be 2.7.9+ or 2.7.10rc1?
>>>
>>> Make sure your repository is up to date, the patchlevel is correct at
>>> the current tip of the 2.7 branch.
>>
>> I had just installed tortoise/hg 3.4, pulled and updated twice.
>> Then ran external.bat on all branches.
>> 2.7 tip is Kuchling's push at 15:25 utc
>
> What does "hg sum" say?

F:\Python\dev\27>hg sum
parent: 96026:68d653f9a2c9
  #19934: fix mangled wording
branch: 2.7
commit: 6 modified
update: (current)

-- 
Terry Jan Reedy


From ronaldoussoren at mac.com  Thu May 14 11:36:09 2015
From: ronaldoussoren at mac.com (Ronald Oussoren)
Date: Thu, 14 May 2015 11:36:09 +0200
Subject: [Python-Dev] Mac popups running make test
In-Reply-To: <CALWZvp63cQT3FZRk4v7QCMnCs4wdXvLtOL-oLOmKwN4qxF3ong@mail.gmail.com>
References: <CANc-5UyQnnxaBkdSdaBx0_QmtQY7dNUpiH7AJmA5bgqf+Hd6Cg@mail.gmail.com>
 <CACBhJdEU9e_6_xy9iH0WL5SFmZVhSNwjPS5AZ82B=CQCcgqbAQ@mail.gmail.com>
 <CANc-5UyNtt5XYTOgXtrkwPO0OWp6CCLAH45pHMFaON0nWQRW=w@mail.gmail.com>
 <CALWZvp63cQT3FZRk4v7QCMnCs4wdXvLtOL-oLOmKwN4qxF3ong@mail.gmail.com>
Message-ID: <49131D27-1A6E-4BEF-B9CE-F71F6381FEF0@mac.com>


> On 12 May 2015, at 18:14, Tal Einat <taleinat at gmail.com> wrote:
> 
> On Tue, May 12, 2015 at 4:14 PM, Skip Montanaro
> <skip.montanaro at gmail.com> wrote:
>> 
>>> Twice now, I've gotten this popup: ...
>> 
>> Let me improve my request, as it seems there is some confusion about
>> what I want. I'm specifically not asking that the popups not be
>> displayed. I don't mind dismissing them. When they appear, I would,
>> however, like to glance over at the stream of messages emitted by the
>> test runner and see a message about it being expected. It seems that
>> the tests which can trigger the crash reporter do this.
> 
> In my case, the popups appear but then disappear within a fraction of
> a second, and this happens about 10-20 times when running the full
> test suite. So I don't have a chance to interact with the popups, and
> this causes test failures.
> 
> Also, when running a large suite of tests, I may not be looking at the
> screen by the time these popups appear. I wouldn't want the tests to
> fail nor would I want the test run to stall.
> 
> I can't test this right now, but does disabling the "network" resource
> avoid these popups? Though even if it does we'll still need a way to
> run network-related tests on OSX.

The only way I know to easily avoid these pop-ups is to turn off the local firewall while testing.  

Signing the interpreter likely also works, but probably only when using a paid developer certificate.

Ronald


From larry at hastings.org  Thu May 14 16:15:34 2015
From: larry at hastings.org (Larry Hastings)
Date: Thu, 14 May 2015 07:15:34 -0700
Subject: [Python-Dev] How shall we conduct the Python 3.5 beta and rc
 periods? (Please vote!)
In-Reply-To: <555232A7.7060002@hastings.org>
References: <555232A7.7060002@hastings.org>
Message-ID: <5554AE06.8030108@hastings.org>


On 05/12/2015 10:04 AM, Larry Hastings wrote:
> What do you think?  [...] Please cast your votes

workflow           0    1    2

Larry Hastings    -0.5  1    0.5
Brett Cannon       0    1    0
Nick Coghlan       0    1    0
Chris Angelico     0    0    0    "in favor of [Workflow 1]"
Ned Deily         -1    1   -0.5
Barry Warsaw       0    0    0    "I'm fine with any of them as long as 
the workflow is well documented"
Antoine Pitrou    -0.5  1    0.5
Ethan Furman       0    1    0    didn't cast explicit votes for 
anything but workflow 1
Berker Peksag      0    1    0
Yuri Selivanov     0    1    0    didn't cast explicit votes for 
anything but workflow 1
Jesus Cea          0    1    0

total             -2    9    0.5


I'm calling it--the winner by a landslide is the ambitious Workflow 1.  
Barry and Chris A didn't cast any explicit votes, but both approved of 
Workflow 1.   Every other voter was +1 on Workflow 1 and uninterested or 
negative on the others.

I'll start experimenting with the workflow(s) and will add documentation 
to the Dev Guide.

The fun starts next weekend,


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150514/469c8437/attachment.html>

From raymond.hettinger at gmail.com  Thu May 14 16:29:55 2015
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Thu, 14 May 2015 07:29:55 -0700
Subject: [Python-Dev] PEP 455 -- TransformDict
Message-ID: <1F07BE60-C41C-41C1-AF1D-06B6AE9BFC0F@gmail.com>

Before the Python 3.5 feature freeze, I should step up and
formally reject PEP 455 for "Adding a key-transforming
dictionary to collections".

I had completed an involved review effort a long time ago
and I apologize for the delay in making the pronouncement.

What made it an interesting choice from the outset is that the
idea of a "transformation" is an enticing concept that seems
full of possibility.  I spent a good deal of time exploring
what could be done with it but found that it mostly fell short
of its promise.

There were many issues.  Here are some that were at the top:

* Most use cases don't need or want the reverse lookup feature
  (what is wanted is a set of one-way canonicalization functions).
  Those that do would want to have a choice of what is saved
  (first stored, last stored, n most recent, a set of all inputs,
  a list of all inputs, nothing, etc).  In database terms, it
  models a many-to-one table (the canonicalization or
  transformation function) with the one being a primary key into
  another possibly surjective table of two columns (the
  key/value store).  A surjection into another surjection isn't
  inherently reversible in a useful way, nor does it seem to be a
  common way to model data.

* People are creative at coming up with use cases for the TD
  but then find that the resulting code is less clear, slower,
  less intuitive, more memory intensive, and harder to debug than
  just using a plain dict with a function call before the lookup:
  d[func(key)].  It was challenging to find any existing code
  that would be made better by the availability of the TD.

* The TD seems to be all about combining data scrubbing
  (case-folding, unicode canonicalization, type-folding, object
  identity, unit-conversion, or finding a canonical member of an
  equivalence class) with a mapping (looking-up a value for a
  given key).  Those two operations are conceptually orthogonal.
  The former doesn't get easier when hidden behind a mapping API
  and the latter loses the flexibility of choosing your preferred
  mapping (an ordereddict, a persistentdict, a chainmap, etc) and
  the flexibility of establishing your own rules for whether and
  how to do a reverse lookup.
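
As a concrete illustration of that last point, here is a small sketch
of the plain-dict alternative the review recommends (my example, not
from the PEP; canon() is a hypothetical canonicalization function):

```python
# Keep the canonicalization explicit instead of hiding it in a mapping.
def canon(key):
    return key.casefold()  # example one-way canonicalization

d = {}
d[canon("Foo")] = 1
assert d[canon("FOO")] == 1   # lookups canonicalize the same way

# If the original key matters, store it alongside the value,
# with whatever retention policy the application wants.
d2 = {}
d2[canon("MixedCase")] = ("MixedCase", 42)
original, value = d2[canon("mixedcase")]
assert original == "MixedCase"
```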


Raymond Hettinger


P.S.  Besides the core conceptual issues listed above, there
are a number of smaller issues with the TD that surfaced
during design review sessions.  In no particular order, here
are a few of the observations:

* It seems to require above average skill to figure-out what
  can be used as a transform function.  It is more
  expert-friendly than beginner friendly.  It takes a little
  while to get used to it.  It wasn't self-evident that
  transformations happen both when a key is stored and again
  when it is looked-up (contrast this with key-functions for
  sorting which are called at most once per key).

* The name, TransformDict, suggests that it might transform the
  value instead of the key or that it might transform the
  dictionary into something else.  The name TransformDict is so
  general that it would be hard to discover when faced with a
  specific problem.  The name also limits perception of what
  could be done with it (i.e. a function that logs accesses
  but doesn't actually change the key).

* The tool doesn't describe itself well.  Looking at the
  help(), or the __repr__(), or the tooltips did not provide
  much insight or clarity.  The dir() shows many of the
  _abc implementation details rather than the API itself.

* The original key is stored and if you change it, the change
  isn't stored.  The _original dict is private (perhaps to
  reduce the risk of putting the TD in an inconsistent state)
  but this limits access to the stored data.

* The TD is unsuitable for bijections because the API is
  inherently biased with a rich group of operators and methods
  for forward lookup but has only one method for reverse lookup.

* The reverse feature is hard to find (getitem vs __getitem__)
  and its output pair is surprising and a bit awkward to use.
  It provides only one accessor method rather than the full
  dict API that would be given by a second dictionary.  The
  API hides the fact that there are two underlying dictionaries.

* It was surprising that when d[k] failed, it failed with
  a transformation exception rather than a KeyError, violating
  the expectations of the calling code (for example, if the
  transformation function is int(), the call d["12"]
  transforms to d[12] and either succeeds in returning a value
  or in raising a KeyError, but the call d["12.0"] fails with
  a TypeError).  The latter issue limits its substitutability
  into existing code that expects real mappings and its
  suitability for exposing to end-users as if it were a normal
  dictionary.

* There were other issues with dict invariants as well and
  these affected substitutability in a sometimes subtle way.
  For example, the TD does not work with __missing__().
  Also, "k in td" does not imply that "k in list(td.keys())".

* The API is at odds with wanting to access the transformations.
  You pay a transformation cost both when storing and when
  looking up, but you can't access the transformed value itself.
  For example, if the transformation is a function that scrubs
  hand entered mailing addresses and puts them into a standard
  format with standard abbreviations, you have no way of getting
  back to the cleaned-up address.

* One design reviewer summarized her thoughts like this:
  "There is a learning curve to be climbed to figure out what
  it does, how to use it, and what the applications [are].
  But, the [working out the same] examples with plain dicts
  requires only basic knowledge."  -- Patricia

From guido at python.org  Thu May 14 16:41:55 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 14 May 2015 07:41:55 -0700
Subject: [Python-Dev] PEP 455 -- TransformDict
In-Reply-To: <1F07BE60-C41C-41C1-AF1D-06B6AE9BFC0F@gmail.com>
References: <1F07BE60-C41C-41C1-AF1D-06B6AE9BFC0F@gmail.com>
Message-ID: <CAP7+vJLVT=hr0DMZuxuNmiagyKwfgBNO5ysjVs+UHtty8L4Jjw@mail.gmail.com>

Thanks for this thorough review, Raymond! Especially the user research is
amazing.

And thanks to Antoine for writing the PEP -- you never know how an idea
pans out until you've tried it.

--Guido

On Thu, May 14, 2015 at 7:29 AM, Raymond Hettinger <
raymond.hettinger at gmail.com> wrote:

> Before the Python 3.5 feature freeze, I should step up and
> formally reject PEP 455 for "Adding a key-transforming
> dictionary to collections".
>
> I had completed an involved review effort a long time ago
> and I apologize for the delay in making the pronouncement.
>
> What made it an interesting choice from the outset is that the
> idea of a "transformation" is an enticing concept that seems
> full of possibility.  I spent a good deal of time exploring
> what could be done with it but found that it mostly fell short
> of its promise.
>
> There were many issues.  Here are some that were at the top:
>
> * Most use cases don't need or want the reverse lookup feature
>   (what is wanted is a set of one-way canonicalization functions).
>   Those that do would want to have a choice of what is saved
>   (first stored, last stored, n most recent, a set of all inputs,
>   a list of all inputs, nothing, etc).  In database terms, it
>   models a many-to-one table (the canonicalization or
>   transformation function) with the one being a primary key into
>   another possibly surjective table of two columns (the
>   key/value store).  A surjection into another surjection isn't
>   inherently reversible in a useful way, nor does it seem to be a
>   common way to model data.
>
> * People are creative at coming up with use cases for the TD
>   but then find that the resulting code is less clear, slower,
>   less intuitive, more memory intensive, and harder to debug than
>   just using a plain dict with a function call before the lookup:
>   d[func(key)].  It was challenging to find any existing code
>   that would be made better by the availability of the TD.
>
> * The TD seems to be all about combining data scrubbing
>   (case-folding, unicode canonicalization, type-folding, object
>   identity, unit-conversion, or finding a canonical member of an
>   equivalence class) with a mapping (looking-up a value for a
>   given key).  Those two operations are conceptually orthogonal.
>   The former doesn't get easier when hidden behind a mapping API
>   and the latter loses the flexibility of choosing your preferred
>   mapping (an ordereddict, a persistentdict, a chainmap, etc) and
>   the flexibility of establishing your own rules for whether and
>   how to do a reverse lookup.
>
>
> Raymond Hettinger
>
>
> P.S.  Besides the core conceptual issues listed above, there
> are a number of smaller issues with the TD that surfaced
> during design review sessions.  In no particular order, here
> are a few of the observations:
>
> * It seems to require above average skill to figure-out what
>   can be used as a transform function.  It is more
>   expert-friendly than beginner friendly.  It takes a little
>   while to get used to it.  It wasn't self-evident that
>   transformations happen both when a key is stored and again
>   when it is looked-up (contrast this with key-functions for
>   sorting which are called at most once per key).
>
> * The name, TransformDict, suggests that it might transform the
>   value instead of the key or that it might transform the
>   dictionary into something else.  The name TransformDict is so
>   general that it would be hard to discover when faced with a
>   specific problem.  The name also limits perception of what
>   could be done with it (i.e. a function that logs accesses
>   but doesn't actually change the key).
>
> * The tool doesn't describe itself well.  Looking at the
>   help(), or the __repr__(), or the tooltips did not provide
>   much insight or clarity.  The dir() shows many of the
>   _abc implementation details rather than the API itself.
>
> * The original key is stored and if you change it, the change
>   isn't stored.  The _original dict is private (perhaps to
>   reduce the risk of putting the TD in an inconsistent state)
>   but this limits access to the stored data.
>
> * The TD is unsuitable for bijections because the API is
>   inherently biased with a rich group of operators and methods
>   for forward lookup but has only one method for reverse lookup.
>
> * The reverse feature is hard to find (getitem vs __getitem__)
>   and its output pair is surprising and a bit awkward to use.
>   It provides only one accessor method rather than the full
>   dict API that would be given by a second dictionary.  The
>   API hides the fact that there are two underlying dictionaries.
>
> * It was surprising that when d[k] failed, it failed with
>   a transformation exception rather than a KeyError, violating
>   the expectations of the calling code (for example, if the
>   transformation function is int(), the call d["12"]
>   transforms to d[12] and either succeeds in returning a value
>   or in raising a KeyError, but the call d["12.0"] fails with
>   a TypeError).  The latter issue limits its substitutability
>   into existing code that expects real mappings and its
>   suitability for exposing to end-users as if it were a normal
>   dictionary.
>
> * There were other issues with dict invariants as well and
>   these affected substitutability in a sometimes subtle way.
>   For example, the TD does not work with __missing__().
>   Also, "k in td" does not imply that "k in list(td.keys())".
>
> * The API is at odds with wanting to access the transformations.
>   You pay a transformation cost both when storing and when
>   looking up, but you can't access the transformed value itself.
>   For example, if the transformation is a function that scrubs
>   hand entered mailing addresses and puts them into a standard
>   format with standard abbreviations, you have no way of getting
>   back to the cleaned-up address.
>
> * One design reviewer summarized her thoughts like this:
>   "There is a learning curve to be climbed to figure out what
>   it does, how to use it, and what the applications [are].
>   But, the [working out the same] examples with plain dicts
>   requires only basic knowledge."  -- Patricia
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150514/be3bf4d0/attachment.html>

From ncoghlan at gmail.com  Thu May 14 18:56:55 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 15 May 2015 02:56:55 +1000
Subject: [Python-Dev] PEP 455 -- TransformDict
In-Reply-To: <CAP7+vJLVT=hr0DMZuxuNmiagyKwfgBNO5ysjVs+UHtty8L4Jjw@mail.gmail.com>
References: <1F07BE60-C41C-41C1-AF1D-06B6AE9BFC0F@gmail.com>
 <CAP7+vJLVT=hr0DMZuxuNmiagyKwfgBNO5ysjVs+UHtty8L4Jjw@mail.gmail.com>
Message-ID: <CADiSq7dWWRcuQfH5_xLdWx1jxQ1qQvxEgom_4UwwciHueWvHyg@mail.gmail.com>

On 15 May 2015 at 00:41, Guido van Rossum <guido at python.org> wrote:
> Thanks for this thorough review, Raymond! Especially the user research is
> amazing.
>
>  And thanks for Antoine for writing the PEP -- you never know how an idea
> pans out until you've tried it.

Hear, hear! I thought the TransformDict idea sounded interesting when
Antoine proposed it, but Raymond's rationale for the rejection makes a
great deal of sense.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From njs at pobox.com  Fri May 15 02:45:11 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 14 May 2015 17:45:11 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
Message-ID: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>

Hi all,

While attempting to clean up some of the more squamous aspects of
numpy's operator dispatch code [1][2], I've encountered a situation
where the semantics we want and are using are possible according to
CPython-the-interpreter, but AFAICT ought not to be possible according
to Python-the-language, i.e., it's not clear to me whether it's
possible even in principle to implement an object that works the way
numpy.ndarray does in any other interpreter. Which makes me a bit
nervous, so I wanted to check if there was any ruling on this.

Specifically, the quirk we are relying on is this: in CPython, if you do

  [1, 2] * my_object

then my_object's __rmul__ gets called *before* list.__mul__,
*regardless* of the inheritance relationship between list and
type(my_object). This occurs as a side-effect of the weirdness
involved in having both tp_as_number->nb_multiply and
tp_as_sequence->sq_repeat in the C API -- when evaluating "a * b",
CPython tries a's nb_multiply, then b's nb_multiply, then a's
sq_repeat, then b's sq_repeat. Since list has an sq_repeat but not an
nb_multiply, this means that my_object's nb_multiply gets called
before any list method.

Here's an example demonstrating how weird this is. list.__mul__ wants
an integer, and by "integer" it means "any object with an __index__
method". So here's a class that list is happy to be multiplied by --
according to the ordinary rules for operator dispatch, in the example
below Indexable.__mul__ and __rmul__ shouldn't even get a look-in:

In [3]: class Indexable(object):
   ...:     def __index__(self):
   ...:         return 2
   ...:

In [4]: [1, 2] * Indexable()
Out[4]: [1, 2, 1, 2]

But, if I add an __rmul__ method, then this actually wins:

In [6]: class IndexableWithMul(object):
   ...:     def __index__(self):
   ...:         return 2
   ...:     def __mul__(self, other):
   ...:         return "indexable forward mul"
   ...:     def __rmul__(self, other):
   ...:         return "indexable reverse mul"

In [7]: [1, 2] * IndexableWithMul()
Out[7]: 'indexable reverse mul'

In [8]: IndexableWithMul() * [1, 2]
Out[8]: 'indexable forward mul'

NumPy arrays, of course, correctly define both an __index__ method (which
raises an error for general arrays but coerces to int for arrays that
contain exactly one integer) and an nb_multiply slot that accepts lists
and performs elementwise multiplication:

In [9]: [1, 2] * np.array(2)
Out[9]: array([2, 4])

And that's all great! Just what we want. But the only reason this is
possible, AFAICT, is that CPython 'list' is a weird type with
undocumented behaviour that you can't actually define using pure
Python code.

Should I be worried?

-n

[1] https://github.com/numpy/numpy/pull/5864
[2] https://github.com/numpy/numpy/issues/5844

-- 
Nathaniel J. Smith -- http://vorpus.org

From Stefan.Richthofer at gmx.de  Fri May 15 04:43:05 2015
From: Stefan.Richthofer at gmx.de (Stefan Richthofer)
Date: Fri, 15 May 2015 04:43:05 +0200
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
Message-ID: <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>

>>Should I be worried?

You mean should *I* be worried ;)

Stuff like this is highly relevant for JyNI, so thanks very much for clarifying this
subtle behavior. It just went onto my todo list: make sure that JyNI emulates this
behavior once I am done with gc support. (Hopefully it will be feasible, but I can
only tell in half a year or so, since there are currently other priorities.)
Still, this "essay" will potentially save me a lot of time.

So, everybody please feel encouraged to post things like this as they come up. Maybe
there could be some kind of pitfalls page somewhere in the docs collecting these things.

Best

Stefan



From guido at python.org  Fri May 15 06:29:45 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 14 May 2015 21:29:45 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
Message-ID: <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>

I expect you can make something that behaves like list by defining __mul__
and __rmul__ and returning NotImplemented.

-- 
--Guido van Rossum (on iPad)

From storchaka at gmail.com  Fri May 15 08:07:05 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Fri, 15 May 2015 09:07:05 +0300
Subject: [Python-Dev] cpython: inspect: Micro-optimize __eq__ for
 Signature, Parameter and BoundArguments
In-Reply-To: <20150514222301.27985.39244@psf.io>
References: <20150514222301.27985.39244@psf.io>
Message-ID: <mj42ea$gbc$1@ger.gmane.org>

On 15.05.15 01:23, yury.selivanov wrote:
> https://hg.python.org/cpython/rev/f0b10980b19e
> changeset:   96056:f0b10980b19e
> parent:      96054:15701e89d710
> user:        Yury Selivanov <yselivanov at sprymix.com>
> date:        Thu May 14 18:20:01 2015 -0400
> summary:
>    inspect: Micro-optimize __eq__ for Signature, Parameter and BoundArguments
>
> Provide __ne__ method for consistency.
>
> files:
>    Lib/inspect.py |  32 ++++++++++++++++++++++----------
>    1 files changed, 22 insertions(+), 10 deletions(-)
>
>
> diff --git a/Lib/inspect.py b/Lib/inspect.py
> --- a/Lib/inspect.py
> +++ b/Lib/inspect.py
> @@ -2353,11 +2353,15 @@
>           return hash((self.name, self.kind, self.annotation, self.default))
>
>       def __eq__(self, other):
> -        return (issubclass(other.__class__, Parameter) and
> -                self._name == other._name and
> -                self._kind == other._kind and
> -                self._default == other._default and
> -                self._annotation == other._annotation)
> +        return (self is other or
> +                    (issubclass(other.__class__, Parameter) and
> +                     self._name == other._name and
> +                     self._kind == other._kind and
> +                     self._default == other._default and
> +                     self._annotation == other._annotation))

It would be better to return NotImplemented if other is not an instance 
of Parameter.

         if self is other:
             return True
         if not isinstance(other, Parameter):
             return NotImplemented
         return (self._name == other._name and
                 self._kind == other._kind and
                 self._default == other._default and
                 self._annotation == other._annotation)

And why do you use issubclass() instead of isinstance()?

> +    def __ne__(self, other):
> +        return not self.__eq__(other)

This is not needed (and is incorrect if __eq__ returns NotImplemented). The 
default __ne__ implementation calls __eq__ and correctly handles 
NotImplemented.
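To illustrate the pitfall (toy classes, not taken from the patch): NotImplemented is truthy, so `not self.__eq__(other)` silently turns a deferral into a bogus False answer, while the default __ne__ propagates the deferral to the reflected operation:

```python
class Broken:
    def __eq__(self, other):
        if not isinstance(other, Broken):
            return NotImplemented
        return True
    def __ne__(self, other):
        # Bug: if __eq__ returns NotImplemented (which is truthy),
        # this returns False instead of deferring to the other side.
        return not self.__eq__(other)

class Fixed:
    def __eq__(self, other):
        if not isinstance(other, Fixed):
            return NotImplemented
        return True
    # No __ne__: the default negates __eq__ and correctly
    # propagates NotImplemented to the reflected operation.

print(Broken() != "spam")   # False -- wrongly claims equality
print(Fixed() != "spam")    # True  -- falls back to the identity test
```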



From storchaka at gmail.com  Fri May 15 08:28:07 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Fri, 15 May 2015 09:28:07 +0300
Subject: [Python-Dev] cpython: inspect: Add __slots__ to BoundArguments.
In-Reply-To: <20150513213811.1939.45761@psf.io>
References: <20150513213811.1939.45761@psf.io>
Message-ID: <mj43ln$428$1@ger.gmane.org>

On 14.05.15 00:38, yury.selivanov wrote:
> https://hg.python.org/cpython/rev/ee31277386cb
> changeset:   96038:ee31277386cb
> user:        Yury Selivanov <yselivanov at sprymix.com>
> date:        Wed May 13 17:18:41 2015 -0400
> summary:
>    inspect: Add __slots__ to BoundArguments.

Note that adding __slots__ breaks pickleability.


From njs at pobox.com  Fri May 15 08:53:35 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 14 May 2015 23:53:35 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
Message-ID: <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>

On Thu, May 14, 2015 at 9:29 PM, Guido van Rossum <guido at python.org> wrote:
> I expect you can make something that behaves like list by defining __mul__
> and __rmul__ and returning NotImplemented.

Hmm, it's fairly tricky, and part of the trick is that you can never
return NotImplemented (because you have to pretty much take over and
entirely replace the normal dispatch rules inside __mul__ and
__rmul__), but see attached for something I think should work.
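The attached sequence.py is not reproduced in the archive; purely as a guess at the shape of the trick (a hypothetical FakeList, not Nathaniel's actual code), a list-like __mul__ that manually tries the other operand's __rmul__ first -- and therefore never needs to return NotImplemented itself -- could look like this:

```python
class FakeList:
    """List-like repetition that gives the other operand's __rmul__
    a chance before its own repeat logic, imitating CPython's
    nb_multiply-before-sq_repeat ordering."""

    def __init__(self, items):
        self.items = list(items)

    def __mul__(self, other):
        # Step 1: emulate "b's nb_multiply runs first" by manually
        # invoking the other operand's __rmul__, unless it's one of us.
        if not isinstance(other, FakeList):
            rmul = getattr(type(other), "__rmul__", None)
            if rmul is not None:
                result = rmul(other, self)
                if result is not NotImplemented:
                    return result
        # Step 2: fall back to repeat-by-__index__, like sq_repeat.
        try:
            n = other.__index__()
        except AttributeError:
            raise TypeError("can't multiply FakeList by %r" % type(other))
        return FakeList(self.items * n)

    __rmul__ = __mul__

class RMulWins:
    def __rmul__(self, other):
        return "reverse mul"

print(FakeList([1, 2]) * RMulWins())   # reverse mul -- as with a real list
print((FakeList([1, 2]) * 2).items)    # [1, 2, 1, 2]
```

Note that int.__rmul__ returns NotImplemented for a FakeList operand, so plain integer multiplication still falls through to the repeat path in step 2.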

So I guess this is just how Python's list, tuple, etc. work, and PyPy
and friends need to match...

-n




-- 
Nathaniel J. Smith -- http://vorpus.org
-------------- next part --------------
A non-text attachment was scrubbed...
Name: sequence.py
Type: text/x-python
Size: 2868 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150514/56a62e7e/attachment.py>

From solipsis at pitrou.net  Fri May 15 13:57:22 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 15 May 2015 13:57:22 +0200
Subject: [Python-Dev] cpython: inspect: Add __slots__ to BoundArguments.
References: <20150513213811.1939.45761@psf.io>
	<mj43ln$428$1@ger.gmane.org>
Message-ID: <20150515135722.0d2e7fba@fsol>

On Fri, 15 May 2015 09:28:07 +0300
Serhiy Storchaka <storchaka at gmail.com> wrote:
> On 14.05.15 00:38, yury.selivanov wrote:
> > https://hg.python.org/cpython/rev/ee31277386cb
> > changeset:   96038:ee31277386cb
> > user:        Yury Selivanov <yselivanov at sprymix.com>
> > date:        Wed May 13 17:18:41 2015 -0400
> > summary:
> >    inspect: Add __slots__ to BoundArguments.
> 
> Note that adding __slots__ breaks pickleability.

That's a problem indeed. I think picklability should be part of the
Signature and BoundArguments contracts.

Regards

Antoine.



From status at bugs.python.org  Fri May 15 18:08:20 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 15 May 2015 18:08:20 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150515160820.0924756755@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2015-05-08 - 2015-05-15)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issue counts and deltas:
  open    4840 ( +2)
  closed 31123 (+54)
  total  35963 (+56)

Open issues with patches: 2237 


Issues opened (39)
==================

#21162: code in multiprocessing.pool freeze if inside some code from s
http://bugs.python.org/issue21162  reopened by Ivan.K

#21593: Clarify re.search documentation first match
http://bugs.python.org/issue21593  reopened by Joshua.Landau

#23757: tuple function gives wrong answer when called on list subclass
http://bugs.python.org/issue23757  reopened by r.david.murray

#24134: assertRaises can behave differently
http://bugs.python.org/issue24134  reopened by serhiy.storchaka

#24147: doublequote are not well recognized with Dialect class
http://bugs.python.org/issue24147  opened by MiK

#24148: 'cum' not a valid sort key for pstats.Stats.sort_stats
http://bugs.python.org/issue24148  opened by ramiro

#24151: test_pydoc fails
http://bugs.python.org/issue24151  opened by hsiehr

#24152: test_mailcap fails if a mailcap file contains a non-decodable 
http://bugs.python.org/issue24152  opened by petrosr2

#24153: threading/multiprocessing tests fail on chromebook under crout
http://bugs.python.org/issue24153  opened by shiprex

#24154: pathlib.Path.rename moves file to Path.cwd() when argument is 
http://bugs.python.org/issue24154  opened by yurit

#24157: test_urandom_fd_reopened failure if OS X crash reporter defaul
http://bugs.python.org/issue24157  opened by skip.montanaro

#24159: Misleading TypeError when pickling bytes to a file opened as t
http://bugs.python.org/issue24159  opened by jason.coombs

#24160: Pdb sometimes crashes when trying to remove a breakpoint defin
http://bugs.python.org/issue24160  opened by ppperry

#24162: [2.7 regression] test_asynchat test failure on i586-linux-gnu
http://bugs.python.org/issue24162  opened by doko

#24163: shutil.copystat fails when attribute security.selinux is prese
http://bugs.python.org/issue24163  opened by sstirlin

#24164: Support pickling objects with __new__ with keyword arguments w
http://bugs.python.org/issue24164  opened by serhiy.storchaka

#24165: Free list for single-digits ints
http://bugs.python.org/issue24165  opened by serhiy.storchaka

#24166: ArgumentParser behavior does not match generated help
http://bugs.python.org/issue24166  opened by tellendil

#24168: Unittest discover fails with namespace package if the path con
http://bugs.python.org/issue24168  opened by toshishige hagihara

#24169: sockets convert out-of-range port numbers % 2**16
http://bugs.python.org/issue24169  opened by Kurt.Rose

#24172: Errors in resource.getpagesize module documentation
http://bugs.python.org/issue24172  opened by mahmoud

#24173: curses HOWTO/implementation disagreement
http://bugs.python.org/issue24173  opened by White_Rabbit

#24174: Python crash on exit
http://bugs.python.org/issue24174  opened by gnumdk

#24175: Test failure in test_utime on FreeBSD
http://bugs.python.org/issue24175  opened by koobs

#24176: Incorrect parsing of unpacked expressions in call
http://bugs.python.org/issue24176  opened by tcaswell

#24177: Add https?_proxy support to http.client
http://bugs.python.org/issue24177  opened by demian.brecht

#24180: PEP 492: Documentation
http://bugs.python.org/issue24180  opened by yselivanov

#24181: test_fileio crash, 3.5, Win 7
http://bugs.python.org/issue24181  opened by terry.reedy

#24182: test_tcl assertion failure, 2.7, Win 7
http://bugs.python.org/issue24182  opened by terry.reedy

#24186: OpenSSL causes buffer overrun exception
http://bugs.python.org/issue24186  opened by steve.dower

#24190: BoundArguments facility to inject defaults
http://bugs.python.org/issue24190  opened by pitrou

#24192: unexpected system error with pep420 style namespace packages
http://bugs.python.org/issue24192  opened by ronaldoussoren

#24193: Make LOGGING_FORMAT of assertLogs configurable
http://bugs.python.org/issue24193  opened by berker.peksag

#24194: tokenize yield an ERRORTOKEN if an identifier uses Other_ID_St
http://bugs.python.org/issue24194  opened by Joshua.Landau

#24195: Add `Executor.filter` to concurrent.futures
http://bugs.python.org/issue24195  opened by cool-RR

#24198: please align the platform tag for windows
http://bugs.python.org/issue24198  opened by doko

#24199: Idle: remove idlelib.idlever.py and its use in About dialog
http://bugs.python.org/issue24199  opened by terry.reedy

#24200: Redundant id in informative reprs
http://bugs.python.org/issue24200  opened by serhiy.storchaka

#24201: _winreg PyHKEY Type Confusion
http://bugs.python.org/issue24201  opened by JohnLeitch



Most recent 15 issues with no replies (15)
==========================================

#24200: Redundant id in informative reprs
http://bugs.python.org/issue24200

#24194: tokenize yield an ERRORTOKEN if an identifier uses Other_ID_St
http://bugs.python.org/issue24194

#24193: Make LOGGING_FORMAT of assertLogs configurable
http://bugs.python.org/issue24193

#24180: PEP 492: Documentation
http://bugs.python.org/issue24180

#24177: Add https?_proxy support to http.client
http://bugs.python.org/issue24177

#24176: Incorrect parsing of unpacked expressions in call
http://bugs.python.org/issue24176

#24174: Python crash on exit
http://bugs.python.org/issue24174

#24173: curses HOWTO/implementation disagreement
http://bugs.python.org/issue24173

#24164: Support pickling objects with __new__ with keyword arguments w
http://bugs.python.org/issue24164

#24162: [2.7 regression] test_asynchat test failure on i586-linux-gnu
http://bugs.python.org/issue24162

#24159: Misleading TypeError when pickling bytes to a file opened as t
http://bugs.python.org/issue24159

#24154: pathlib.Path.rename moves file to Path.cwd() when argument is 
http://bugs.python.org/issue24154

#24148: 'cum' not a valid sort key for pstats.Stats.sort_stats
http://bugs.python.org/issue24148

#24143: Makefile in tarball don't provide make uninstall target
http://bugs.python.org/issue24143

#24137: Force not using _default_root in IDLE
http://bugs.python.org/issue24137



Most recent 15 issues waiting for review (15)
=============================================

#24198: please align the platform tag for windows
http://bugs.python.org/issue24198

#24195: Add `Executor.filter` to concurrent.futures
http://bugs.python.org/issue24195

#24190: BoundArguments facility to inject defaults
http://bugs.python.org/issue24190

#24176: Incorrect parsing of unpacked expressions in call
http://bugs.python.org/issue24176

#24165: Free list for single-digits ints
http://bugs.python.org/issue24165

#24164: Support pickling objects with __new__ with keyword arguments w
http://bugs.python.org/issue24164

#24145: Support |= for parameters in converters
http://bugs.python.org/issue24145

#24142: ConfigParser._read doesn't join multi-line values collected wh
http://bugs.python.org/issue24142

#24136: document PEP 448
http://bugs.python.org/issue24136

#24134: assertRaises can behave differently
http://bugs.python.org/issue24134

#24130: Remove -fno-common compile option from OS X framework builds?
http://bugs.python.org/issue24130

#24129: Incorrect (misleading) statement in the execution model docume
http://bugs.python.org/issue24129

#24117: Wrong range checking in GB18030 decoder.
http://bugs.python.org/issue24117

#24114: ctypes.utils uninitialized variable 'path'
http://bugs.python.org/issue24114

#24109: Documentation for difflib uses optparse
http://bugs.python.org/issue24109



Top 10 most discussed issues (10)
=================================

#24181: test_fileio crash, 3.5, Win 7
http://bugs.python.org/issue24181  10 msgs

#23699: Add a macro to ease writing rich comparisons
http://bugs.python.org/issue23699   9 msgs

#23970: Update distutils.msvccompiler for VC14
http://bugs.python.org/issue23970   9 msgs

#24195: Add `Executor.filter` to concurrent.futures
http://bugs.python.org/issue24195   8 msgs

#23971: dict(list) and dict.fromkeys() doesn't account for 2/3 fill ra
http://bugs.python.org/issue23971   7 msgs

#24165: Free list for single-digits ints
http://bugs.python.org/issue24165   7 msgs

#24190: BoundArguments facility to inject defaults
http://bugs.python.org/issue24190   7 msgs

#23857: Make default HTTPS certificate verification setting configurab
http://bugs.python.org/issue23857   6 msgs

#24182: test_tcl assertion failure, 2.7, Win 7
http://bugs.python.org/issue24182   6 msgs

#24198: please align the platform tag for windows
http://bugs.python.org/issue24198   6 msgs



Issues closed (52)
==================

#1322: Deprecate platform.dist() and platform.linux_distribution() fu
http://bugs.python.org/issue1322  closed by berker.peksag

#9514: platform.linux_distribution() under Ubuntu returns ('debian', 
http://bugs.python.org/issue9514  closed by lemburg

#12018: No tests for ntpath.samefile, ntpath.sameopenfile
http://bugs.python.org/issue12018  closed by r.david.murray

#17762: platform.linux_distribution() should honor /etc/os-release
http://bugs.python.org/issue17762  closed by lemburg

#19934: collections.Counter.most_common does not document `None` as ac
http://bugs.python.org/issue19934  closed by rhettinger

#20172: Derby #3: Convert 67 sites to Argument Clinic across 4 files (
http://bugs.python.org/issue20172  closed by zach.ware

#21795: smtpd.SMTPServer should announce 8BITMIME when supported and a
http://bugs.python.org/issue21795  closed by r.david.murray

#22064: Misleading message from 2to3 when skipping optional fixers
http://bugs.python.org/issue22064  closed by berker.peksag

#22486: Add math.gcd()
http://bugs.python.org/issue22486  closed by benjamin.peterson

#22547: Implement an informative `BoundArguments.__repr__`
http://bugs.python.org/issue22547  closed by yselivanov

#22681: Add support of KOI8-T encoding
http://bugs.python.org/issue22681  closed by serhiy.storchaka

#22682: Add support of KZ1048 (RK1048) encoding
http://bugs.python.org/issue22682  closed by serhiy.storchaka

#22906: PEP 479: Change StopIteration handling inside generators
http://bugs.python.org/issue22906  closed by yselivanov

#23042: ctypes module doesn't build on FreeBSD, RHEL (x86) - Undefined
http://bugs.python.org/issue23042  closed by benjamin.peterson

#23088: Document that PyUnicode_AsUTF8() returns a null-terminated str
http://bugs.python.org/issue23088  closed by r.david.murray

#23201: Decimal(0)**0 is an error, 0**0 is 1, but Decimal(0) == 0
http://bugs.python.org/issue23201  closed by rhettinger

#23227: Generator's finally block not run if close() called before fir
http://bugs.python.org/issue23227  closed by vadmium

#23290: Faster set copying
http://bugs.python.org/issue23290  closed by rhettinger

#23488: Random objects twice as big as necessary on 64-bit builds
http://bugs.python.org/issue23488  closed by serhiy.storchaka

#23695: idiom for clustering a data series into n-length groups
http://bugs.python.org/issue23695  closed by rhettinger

#23796: BufferedReader.peek() crashes if closed
http://bugs.python.org/issue23796  closed by berker.peksag

#23870: pprint collections classes
http://bugs.python.org/issue23870  closed by serhiy.storchaka

#23983: Update example in the pty documentation
http://bugs.python.org/issue23983  closed by berker.peksag

#23995: msvcrt could not be imported
http://bugs.python.org/issue23995  closed by r.david.murray

#24013: Improve os.scandir() and DirEntry documentation
http://bugs.python.org/issue24013  closed by python-dev

#24017: Implemenation of the PEP 492 - Coroutines with async and await
http://bugs.python.org/issue24017  closed by yselivanov

#24018: add a Generator ABC
http://bugs.python.org/issue24018  closed by rhettinger

#24032: urlparse.urljoin does not add query part
http://bugs.python.org/issue24032  closed by r.david.murray

#24042: Convert os._getfullpathname() and os._isdir() to Argument Clin
http://bugs.python.org/issue24042  closed by serhiy.storchaka

#24064: Make the property doctstring writeable
http://bugs.python.org/issue24064  closed by rhettinger

#24138: Speed up range() by caching and modifying long objects
http://bugs.python.org/issue24138  closed by larry

#24149: Issue with unit tests
http://bugs.python.org/issue24149  closed by ned.deily

#24150: text_contextlib fails on Mac OSX 10.10.3
http://bugs.python.org/issue24150  closed by ned.deily

#24155: Optimize heapify for better cache utililzation
http://bugs.python.org/issue24155  closed by rhettinger

#24156: test.test_ssl.ThreadedTests unit test failed
http://bugs.python.org/issue24156  closed by ned.deily

#24158: Error of the hint of upgrading pip
http://bugs.python.org/issue24158  closed by dstufft

#24161: PyIter_Check returns false positive for objects of type instan
http://bugs.python.org/issue24161  closed by rhettinger

#24167: 2.4.X links on www.python.org/downloads/windows point to the w
http://bugs.python.org/issue24167  closed by berker.peksag

#24170: IDLE crashes when I press ^ key
http://bugs.python.org/issue24170  closed by ned.deily

#24171: httplib
http://bugs.python.org/issue24171  closed by r.david.murray

#24178: asyncio: support 'async with' for locks
http://bugs.python.org/issue24178  closed by yselivanov

#24179: asyncio: support 'async for' for StreamReader
http://bugs.python.org/issue24179  closed by yselivanov

#24183: ABCMeta classes do not support the **kwargs standard class int
http://bugs.python.org/issue24183  closed by r.david.murray

#24184: PEP 492: Add AsyncIterator and AsyncIterable to collections.ab
http://bugs.python.org/issue24184  closed by yselivanov

#24185: Add Function for Sending File to Trash (or Recycling Bin)
http://bugs.python.org/issue24185  closed by r.david.murray

#24187: del statement documentation doesn't mention name binding behav
http://bugs.python.org/issue24187  closed by jc13

#24188: Signature objects not hashable
http://bugs.python.org/issue24188  closed by pitrou

#24189: Parameter doesn't expose its index
http://bugs.python.org/issue24189  closed by yselivanov

#24191: BoundArguments.signature not documented
http://bugs.python.org/issue24191  closed by yselivanov

#24196: Fail to create file if name starts with prn.
http://bugs.python.org/issue24196  closed by pitrou

#24197: minidom parses comments wrongly
http://bugs.python.org/issue24197  closed by ned.deily

#24202: Multiprocessing Pool not working for userdefined function
http://bugs.python.org/issue24202  closed by paul.moore

From yselivanov.ml at gmail.com  Fri May 15 18:40:55 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 15 May 2015 12:40:55 -0400
Subject: [Python-Dev] cpython: inspect: Add __slots__ to BoundArguments.
In-Reply-To: <mj43ln$428$1@ger.gmane.org>
References: <20150513213811.1939.45761@psf.io> <mj43ln$428$1@ger.gmane.org>
Message-ID: <55562197.9040007@gmail.com>

No, it does not.

I implemented __getstate__/__setstate__ specifically to maintain 
backwards compatibility (I also tested it).
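The idea can be sketched with a hypothetical class (not the actual
inspect.BoundArguments code): __getstate__ pickles the slot values as a
plain dict, and __setstate__ restores from a dict, so pickles made before
__slots__ was added still unpickle cleanly:

```python
import pickle

class Point:
    """A slotted class that can still unpickle old, __dict__-based pickles."""
    __slots__ = ('x', 'y')

    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __getstate__(self):
        # Pickle slot values as a plain dict, like a pre-__slots__
        # version of the class would have stored in __dict__.
        return {'x': self.x, 'y': self.y}

    def __setstate__(self, state):
        # Works for both new pickles and old __dict__-based ones.
        for name, value in state.items():
            setattr(self, name, value)

p = pickle.loads(pickle.dumps(Point(1, 2)))
assert (p.x, p.y) == (1, 2)

# Simulate restoring state from an "old" pickle's __dict__ payload:
old = Point.__new__(Point)
old.__setstate__({'x': 3, 'y': 4})
assert (old.x, old.y) == (3, 4)
```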

Yury

On 2015-05-15 2:28 AM, Serhiy Storchaka wrote:
> On 14.05.15 00:38, yury.selivanov wrote:
>> https://hg.python.org/cpython/rev/ee31277386cb
>> changeset:   96038:ee31277386cb
>> user:        Yury Selivanov <yselivanov at sprymix.com>
>> date:        Wed May 13 17:18:41 2015 -0400
>> summary:
>>    inspect: Add __slots__ to BoundArguments.
>
> Note that adding __slots__ breaks pickleability.
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: 
> https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com


From yselivanov.ml at gmail.com  Fri May 15 18:46:44 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 15 May 2015 12:46:44 -0400
Subject: [Python-Dev] cpython: inspect: Add __slots__ to BoundArguments.
In-Reply-To: <20150515135722.0d2e7fba@fsol>
References: <20150513213811.1939.45761@psf.io>	<mj43ln$428$1@ger.gmane.org>
 <20150515135722.0d2e7fba@fsol>
Message-ID: <555622F4.6000905@gmail.com>

There are pickle tests for all signature related classes (all protocols 
for BoundArguments, Signature and Parameter).

This patch ensures that old pickled BoundArguments will be unpickled 
without a problem in 3.5.

Yury

On 2015-05-15 7:57 AM, Antoine Pitrou wrote:
> On Fri, 15 May 2015 09:28:07 +0300
> Serhiy Storchaka <storchaka at gmail.com> wrote:
>> On 14.05.15 00:38, yury.selivanov wrote:
>>> https://hg.python.org/cpython/rev/ee31277386cb
>>> changeset:   96038:ee31277386cb
>>> user:        Yury Selivanov <yselivanov at sprymix.com>
>>> date:        Wed May 13 17:18:41 2015 -0400
>>> summary:
>>>     inspect: Add __slots__ to BoundArguments.
>> Note that adding __slots__ breaks pickleability.
> That's a problem indeed. I think picklability should be part of the
> Signature and BoundArguments contracts.
>
> Regards
>
> Antoine.
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com


From pje at telecommunity.com  Fri May 15 19:03:41 2015
From: pje at telecommunity.com (PJ Eby)
Date: Fri, 15 May 2015 13:03:41 -0400
Subject: [Python-Dev] Minimal async event loop and async utilities (Was:
 PEP 492: async/await in Python; version 4)
In-Reply-To: <CAP7+vJKt1PLFJ8peP2JEk8oJSUh4g9DPC40hXGPasiO7ZTWQJQ@mail.gmail.com>
References: <CACac1F_YnzjoQhix__LXEhoRjocDPHrtAzFvAXb6Z93PdfMr4A@mail.gmail.com>
 <CAP7+vJ+KL5NmLJZLuKbDYX8i2V_zZPUNaqXmfacxb7Cj43jqgg@mail.gmail.com>
 <CACac1F9a5eV4O0rjLvGADc1JrXjxdX7PW9g6TPpvgbk+b8vf8Q@mail.gmail.com>
 <CAP7+vJKt1PLFJ8peP2JEk8oJSUh4g9DPC40hXGPasiO7ZTWQJQ@mail.gmail.com>
Message-ID: <CALeMXf5Ye+15u+jj7pp81-qefZbFoP+gf_vY13B1_9QJLZ9ORQ@mail.gmail.com>

On Mon, May 11, 2015 at 6:05 PM, Guido van Rossum <guido at python.org> wrote:
> OTOH you may look at micropython's uasyncio -- IIRC it doesn't have Futures
> and it definitely has I/O waiting.

Here's a sketch of an *extremely* minimal main loop that can do I/O
without Futures, and might be suitable as a PEP example.  (Certainly,
it would be hard to write a *simpler* example than this, since it
doesn't even use any *classes* or require any specially named methods,
works with present-day generators, and is (I think) both 2.x/3.x
compatible.)

    coroutines = []     # round-robin of currently "running" coroutines

    def schedule(coroutine, val=None, err=None):
        coroutines.insert(0, (coroutine, val, err))

    def runLoop():
        while coroutines:
            (coroutine, val, err) = coroutines.pop()
            try:
                if err is not None:
                    suspend = coroutine.throw(err)
                else:
                    suspend = coroutine.send(val)
            except StopIteration:
                # coroutine is finished, so don't reschedule it
                continue

            except Exception:
                # Framework-specific detail (e.g., log it, send it
                # to an error-handling coroutine, or just stop the program).
                # Here, we just ignore it and stop the coroutine.
                continue

            else:
                if hasattr(suspend, '__call__') and suspend(coroutine):
                    continue
                else:
                    # put it back on the round-robin list
                    schedule(coroutine)

To use it, `schedule()` one or more coroutines, then call `runLoop()`,
which will run as long as there are things to do.  Each coroutine
scheduled must yield *thunks*: callable objects that take a coroutine
as a parameter, and return True if the coroutine should be suspended,
or False if it should continue to run.  If the thunk returns true,
that means the thunk has taken responsibility for arranging to
`schedule()` the coroutine with a value or error when it's time to
send it the result of the suspension.
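To make the protocol concrete, here is a self-contained toy run of the
scheduler (condensed, with a trivial thunk that immediately reschedules
the coroutine with a value; `give` and `worker` are illustrative names,
not part of the sketch above):

```python
coroutines = []  # round-robin of currently "running" coroutines

def schedule(coroutine, val=None, err=None):
    coroutines.insert(0, (coroutine, val, err))

def runLoop():
    while coroutines:
        coroutine, val, err = coroutines.pop()
        try:
            suspend = coroutine.throw(err) if err else coroutine.send(val)
        except StopIteration:
            continue  # coroutine finished; don't reschedule
        if callable(suspend) and suspend(coroutine):
            continue  # thunk took over responsibility for rescheduling
        schedule(coroutine)

def give(value):
    """yield give(value) suspends, then resumes with `value`."""
    def suspend(coroutine):
        schedule(coroutine, value)  # resume on the next loop pass
        return True
    return suspend

results = []

def worker():
    got = yield give(21)   # suspends; resumed with 21
    results.append(got * 2)

schedule(worker())
runLoop()
assert results == [42]
```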

You might be asking, "wait, but where's the I/O?"  Why, in a
coroutine, of course...

    import time
    from heapq import heappush, heappop
    from select import select

    readers = {}
    writers = {}
    timers = []

    def readable(fileno):
        """yield readable(fileno) resumes when fileno is readable"""
        def suspend(coroutine):
            readers[fileno] = coroutine
            return True
        return suspend

    def writable(fileno):
        """yield writable(fileno) resumes when fileno is writable"""
        def suspend(coroutine):
            writers[fileno] = coroutine
            return True
        return suspend

    def sleepFor(seconds):
        """yield sleepFor(seconds) resumes after that much time"""
        return suspendUntil(time.time() + seconds)

    def suspendUntil(timestamp):
        """yield suspendUntil(timestamp) resumes when that time is reached"""
        def suspend(coroutine):
            heappush(timers, (timestamp, coroutine))
        return suspend

    def doIO():
        while coroutines or readers or writers or timers:

            # Resume scheduled tasks
            while timers and timers[0][0] <= time.time():
                ts, coroutine = heappop(timers)
                schedule(coroutine)

            if readers or writers:
                if coroutines:
                    # Other tasks are running; use minimal timeout
                    timeout = 0.001
                elif timers:
                    timeout = max(timers[0][0] - time.time(), 0.001)
                else:
                    timeout = None  # block until something is ready
                r, w, e = select(readers, writers, [], timeout)
                for rr in r: schedule(readers.pop(rr))
                for ww in w: schedule(writers.pop(ww))

            yield   # allow other coroutines to run

    schedule(doIO())  # run the I/O loop as a coroutine

(This is painfully incomplete for a real framework, but it's a rough
sketch of how one of peak.events' first drafts worked, circa early
2004.)

Basically, you just need a coroutine whose job is to resume coroutines
whose scheduled time has arrived, or whose I/O is ready.  And of
course, some data structures to keep track of such things, and an API
to update the data structures and suspend the coroutines.  The I/O
loop exits once there are no more running tasks and nothing waiting on
I/O...  which will also exit the runLoop.  (A bit like a miniature
version of NodeJS for Python.)

And, while you need to preferably have only *one* such I/O coroutine
(to prevent busy-waiting), the I/O coroutine is completely
replaceable.  All that's required to implement one is that the core
runloop expose the count of active coroutines.  (Notice that, apart
from checking the length of `coroutines`, the I/O loop shown above
uses only the public `schedule()` API and the exposed thunk-suspension
protocol to do its thing.)

Also, note that you *can* indeed have multiple I/O coroutines running
at the same time, as long as you don't mind busy-waiting.  In fact,
you can refactor this to move the time-based scheduling inside the
runloop, and expose the "time until next task" and "number of running
non-I/O coroutines" to allow multiple I/O waiters to co-ordinate and
avoid busy-waiting.  (A later version of peak.events did this, though
it really wasn't to allow multiple I/O waiters, so much as to simplify
I/O waiters by providing a core time-scheduler, and to support
simulated time for running tests.)

So, there's definitely no requirement for I/O to be part of a "core"
runloop system.  The overall approach is *extremely* open to
extension, hardcodes next to nothing, and is super-easy to write new
yieldables for, since they need only have a method (or function) that
returns a suspend function.

At the time I *first* implemented this approach in '03/'04, I hadn't
thought of using plain functions as suspend targets; I used objects
with a `shouldSuspend()` method.  But in fairness, I was working with
Python 2.2 and closures were still a pretty new feature back then.
;-)

Since then, though, I've seen this approach implemented elsewhere
using closures in almost exactly this way.  For example, the `co`
library for Javascript implements almost exactly the above sketch's
approach, in not much more code.  It just uses the built-in Javascript
event loop facilities, and supports yielding other things besides
thunks.  (Its thunks also don't return a value, and take a callback
rather than a coroutine.  But these are superficial differences.)

This approach is super-flexible in practice, as there are a ton of
add-on libraries for `co` that implement their control flow using
these thunks.  You can indeed fully generalize control flow in such
terms, without the need for futures or similar objects.  For example,
if you want to provide sugar for yielding to futures or other types of
objects, you just write a thunk-returning function or method, e.g.:

    def await_future(future):
        def suspend(coroutine):
            @future.add_done_callback
            def resume(future):
                err = future.exception()
                if err:
                    schedule(coroutine, None, err)
                else:
                    schedule(coroutine, future.result())
            return True
        return suspend

So `yield await_future(someFuture)` will arrange for suspension until
the future is ready.  Libraries or frameworks can also be written that
wrap a generator with one that provides automatic translation to
thunks for a variety of types or protocols.  Similarly, you can write
functions that take multiple awaitables, or that provide cancellation,
etc. on top of thunks.
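As a concrete (hypothetical, single-threaded) illustration, `await_future`
composes with the minimal scheduler from above and a
`concurrent.futures.Future`; here the future is resolved before the loop
runs, for determinism:

```python
from concurrent.futures import Future

coroutines = []

def schedule(coroutine, val=None, err=None):
    coroutines.insert(0, (coroutine, val, err))

def runLoop():
    while coroutines:
        coroutine, val, err = coroutines.pop()
        try:
            suspend = coroutine.throw(err) if err else coroutine.send(val)
        except StopIteration:
            continue
        if callable(suspend) and suspend(coroutine):
            continue
        schedule(coroutine)

def await_future(future):
    def suspend(coroutine):
        @future.add_done_callback
        def resume(future):
            err = future.exception()
            if err:
                schedule(coroutine, None, err)
            else:
                schedule(coroutine, future.result())
        return True
    return suspend

results = []

def task(fut):
    # yield await_future(fut) suspends until the future resolves
    results.append((yield await_future(fut)))

fut = Future()
schedule(task(fut))
fut.set_result('done')   # already done when the callback is registered
runLoop()
assert results == ['done']
```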

From benhoyt at gmail.com  Fri May 15 21:30:34 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Fri, 15 May 2015 15:30:34 -0400
Subject: [Python-Dev] Scandir module's C code updated to Python 3.5 code
Message-ID: <CAL9jXCECLyx2-47DaPnnuyVVFuv9=Go0bHcf_SghYY0wUtOVNA@mail.gmail.com>

Hi folks,

With os.scandir() now in the Python 3.5 stdlib, I just thought I'd let
folks know that I've released the scandir module version 1.0. So this
is now basically a copy-n-paste of the C code that went into CPython
3.5's posixmodule.c with the necessary changes to make it work on
Python 2.x (2.6+).

You can use the following import to pick os.scandir/os.walk if on
Python 3.5+ or the scandir module version otherwise:

    try:
        from os import scandir, walk
    except ImportError:
        from scandir import scandir, walk

I've tested it and it all looks good and performs well, but please let
me know if you have any issues!
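For example, with that compatibility import in place, the same code runs
on either version; a minimal sketch listing a directory via the cached
DirEntry info:

```python
try:
    from os import scandir, walk       # Python 3.5+
except ImportError:
    from scandir import scandir, walk  # the PyPI backport

import os
import tempfile

# Build a throwaway directory with one file and one subdirectory,
# then list it with scandir; DirEntry.is_dir() avoids extra stat calls.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, 'a.txt'), 'w').close()
    os.mkdir(os.path.join(d, 'sub'))
    names = sorted((e.name, e.is_dir()) for e in scandir(d))
    assert names == [('a.txt', False), ('sub', True)]
```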

* PyPI: https://pypi.python.org/pypi/scandir
* Github project: https://github.com/benhoyt/scandir

Would love to hear any success/speedup stories, too!

-Ben

From njs at pobox.com  Fri May 15 23:35:38 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 15 May 2015 14:35:38 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
Message-ID: <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>

On Thu, May 14, 2015 at 11:53 PM, Nathaniel Smith <njs at pobox.com> wrote:
> On Thu, May 14, 2015 at 9:29 PM, Guido van Rossum <guido at python.org> wrote:
>> I expect you can make something that behaves like list by defining __mul__
>> and __rmul__ and returning NotImplemented.
>
> Hmm, it's fairly tricky, and part of the trick is that you can never
> return NotImplemented (because you have to pretty much take over and
> entirely replace the normal dispatch rules inside __mul__ and
> __rmul__), but see attached for something I think should work.
>
> So I guess this is just how Python's list, tuple, etc. work, and PyPy
> and friends need to match...

For the record, it looks like PyPy does already have a hack to
implement this -- they do it by having a hidden flag on the built-in
sequence types which the implementations of '*' and '+' check for, and
if it's found it triggers a different rule for dispatching to the
__op__ methods:
    https://bitbucket.org/pypy/pypy/src/a1a494787f4112e42f50c6583e0fea18db3fb4fa/pypy/objspace/descroperation.py?at=default#cl-692
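For purely user-defined classes the documented protocol does behave as
specified: returning NotImplemented from __mul__ hands control to the other
operand's __rmul__. (The trouble discussed here is that the builtin
sequence types don't play by that rule.) A minimal sketch:

```python
class Scalar:
    def __mul__(self, other):
        # Decline: let the other operand's __rmul__ handle it.
        return NotImplemented

class Vector:
    def __rmul__(self, other):
        return 'Vector.__rmul__ handled it'

# Scalar.__mul__ returns NotImplemented, so Python falls back to
# Vector.__rmul__, per the language reference.
assert Scalar() * Vector() == 'Vector.__rmul__ handled it'
```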

-- 
Nathaniel J. Smith -- http://vorpus.org

From songofacandy at gmail.com  Sat May 16 04:45:38 2015
From: songofacandy at gmail.com (INADA Naoki)
Date: Sat, 16 May 2015 11:45:38 +0900
Subject: [Python-Dev] No tags in semi-official github mirror of cpython
	repository.
Message-ID: <CAEfz+Txe5nOJKriwCcwtSr5JW7PvKYD1oZnZpqpiOA_EtMV7zA@mail.gmail.com>

Hi.

I found the "semi-official github mirror" of cpython.
https://github.com/python/cpython

I want to use it as upstream of our project (Translating docs in Japanese).
But it doesn't have tags.

Is the repository stable enough for a project like ours to fork? Or should
we use mercurial?
Could you mirror tags too?

Thanks
-- 
INADA Naoki  <songofacandy at gmail.com>

From ncoghlan at gmail.com  Sat May 16 10:15:56 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 16 May 2015 18:15:56 +1000
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
Message-ID: <CADiSq7fDhRGsACG-DUagk+6QRcNj0ZEJGoE2_Akdi=7J0vKusA@mail.gmail.com>

On 15 May 2015 at 10:45, Nathaniel Smith <njs at pobox.com> wrote:
> Hi all,
>
> While attempting to clean up some of the more squamous aspects of
> numpy's operator dispatch code [1][2], I've encountered a situation
> where the semantics we want and are using are possible according to
> CPython-the-interpreter, but AFAICT ought not to be possible according
> to Python-the-language, i.e., it's not clear to me whether it's
> possible even in principle to implement an object that works the way
> numpy.ndarray does in any other interpreter. Which makes me a bit
> nervous, so I wanted to check if there was any ruling on this.

It's a known CPython operand precedence bug due to the fact several of
the builtin types only implement sq_concat & sq_repeat without
implementing nb_add & nb_mul: http://bugs.python.org/issue11477

There's then a related problem where we *don't* process
"NotImplemented" results from sq_concat and sq_repeat properly, so all
the builtin sequence types throw TypeError directly, instead of
returning NotImplemented when they don't recognise the other type.

I wrote a preliminary patch attempting to fix it a few years back
after the issue was discovered by Mike Bayer and Alex Gaynor when
porting SQL Alchemy to PyPy, but never committed it because my own
verdict on the approach I used was that it rendered the abstract
object API implementation for __mul__ and __add__ utterly
unmaintainable.

The better fix would be to make defining sq_concat and sq_repeat more
like defining __add__ and __mul__ at the Python level: PyType_Ready
should implicitly fill in nb_add and nb_mul references to standard
implementations that delegate to sq_concat and sq_repeat, and we
should update the implementations of the latter for the standard
library sequence types implemented in C to return NotImplemented
rather than throwing TypeError directly.

However, my intermittent attempts to get anyone else interested in
fixing it haven't borne any fruit, and I've prioritised other projects
over coming up with a different patch myself.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Sat May 16 10:16:37 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 16 May 2015 18:16:37 +1000
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
Message-ID: <CADiSq7fVmwzobhHkwh2YFX6Dedgn-C7jf5O7V1hbTU5yigx-XQ@mail.gmail.com>

On 15 May 2015 at 16:53, Nathaniel Smith <njs at pobox.com> wrote:
> On Thu, May 14, 2015 at 9:29 PM, Guido van Rossum <guido at python.org> wrote:
>> I expect you can make something that behaves like list by defining __mul__
>> and __rmul__ and returning NotImplemented.
>
> Hmm, it's fairly tricky, and part of the trick is that you can never
> return NotImplemented (because you have to pretty much take over and
> entirely replace the normal dispatch rules inside __mul__ and
> __rmul__), but see attached for something I think should work.
>
> So I guess this is just how Python's list, tuple, etc. work, and PyPy
> and friends need to match...

No, CPython is broken.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Sat May 16 10:31:34 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 16 May 2015 18:31:34 +1000
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
 <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>
Message-ID: <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>

On 16 May 2015 at 07:35, Nathaniel Smith <njs at pobox.com> wrote:
> On Thu, May 14, 2015 at 11:53 PM, Nathaniel Smith <njs at pobox.com> wrote:
>> On Thu, May 14, 2015 at 9:29 PM, Guido van Rossum <guido at python.org> wrote:
>>> I expect you can make something that behaves like list by defining __mul__
>>> and __rmul__ and returning NotImplemented.
>>
>> Hmm, it's fairly tricky, and part of the trick is that you can never
>> return NotImplemented (because you have to pretty much take over and
>> entirely replace the normal dispatch rules inside __mul__ and
>> __rmul__), but see attached for something I think should work.
>>
>> So I guess this is just how Python's list, tuple, etc. work, and PyPy
>> and friends need to match...
>
> For the record, it looks like PyPy does already have a hack to
> implement this -- they do it by having a hidden flag on the built-in
> sequence types which the implementations of '*' and '+' check for, and
> if it's found it triggers a different rule for dispatching to the
> __op__ methods:
>     https://bitbucket.org/pypy/pypy/src/a1a494787f4112e42f50c6583e0fea18db3fb4fa/pypy/objspace/descroperation.py?at=default#cl-692

Oh, that's rather annoying that the PyPy team implemented bug-for-bug
compatibility there, and didn't follow up on the operand precedence
bug report to say that they had done so. We also hadn't previously
been made aware that NumPy is relying on this operand precedence bug
to implement publicly documented API behaviour, so fixing it *would*
break end user code :(

I guess that means someone in the numeric community will need to write
a PEP to make this "try the other operand first" "feature" part of the
language specification, so that other interpreters can implement it up
front, rather than all having to come up with their own independent
custom hacks just to make NumPy work.

Regards,
Nick.

P.S. It would also be nice if someone could take on the PEP for a
Python level buffer API for 3.6: http://bugs.python.org/issue13797

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From phd at phdru.name  Sat May 16 13:34:56 2015
From: phd at phdru.name (Oleg Broytman)
Date: Sat, 16 May 2015 13:34:56 +0200
Subject: [Python-Dev] No tags in semi-official github mirror of cpython
 repository.
In-Reply-To: <CAEfz+Txe5nOJKriwCcwtSr5JW7PvKYD1oZnZpqpiOA_EtMV7zA@mail.gmail.com>
References: <CAEfz+Txe5nOJKriwCcwtSr5JW7PvKYD1oZnZpqpiOA_EtMV7zA@mail.gmail.com>
Message-ID: <20150516113456.GA27431@phdru.name>

Hi!

On Sat, May 16, 2015 at 11:45:38AM +0900, INADA Naoki <songofacandy at gmail.com> wrote:
> I found the "semi-official github mirror" of cpython.
> https://github.com/python/cpython
> 
> I want to use it as upstream of our project (Translating docs in Japanese).
> But it doesn't have tags.
> 
> Is the repository stable enough for forking project like us? Or should we
> use mercurial?
> Could you mirror tags too?

   If you prefer to use git for development instead of mercurial, like I
do, you can try some hg-to-git gateways. I tried hg-fast-export and
git-remote-hg and found the latter to be much better.

   See https://github.com/felipec/git-remote-hg and
https://github.com/felipec/git/wiki/Comparison-of-git-remote-hg-alternatives

> Thanks
> -- 
> INADA Naoki  <songofacandy at gmail.com>

Oleg.
-- 
     Oleg Broytman            http://phdru.name/            phd at phdru.name
           Programmers don't die, they just GOSUB without RETURN.

From nad at acm.org  Sun May 17 01:44:13 2015
From: nad at acm.org (Ned Deily)
Date: Sat, 16 May 2015 16:44:13 -0700
Subject: [Python-Dev] cpython (merge 3.4 -> default): Added tests for
	more builtin types.
References: <20150516183940.21146.77232@psf.io>
Message-ID: <nad-6332FE.16441316052015@news.gmane.org>

In article <20150516183940.21146.77232 at psf.io>,
 serhiy.storchaka <python-checkins at python.org> wrote:
> https://hg.python.org/cpython/rev/7b350f712c0e
> changeset:   96099:7b350f712c0e
> parent:      96096:f0c94892ac31
> parent:      96098:955dffec3d94
> user:        Serhiy Storchaka <storchaka at gmail.com>
> date:        Sat May 16 21:35:56 2015 +0300
> summary:
>   Added tests for more builtin types.
> Made test_pprint discoverable.
> 
> files:
>   Lib/test/test_pprint.py |  17 ++++++++---------
>   1 files changed, 8 insertions(+), 9 deletions(-)

======================================================================
ERROR: test_coverage (test.test_trace.TestCoverage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 312, 
in test_coverage
    self._coverage(tracer)
  File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305, 
in _coverage
    tracer.run(cmd)
  File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
    self.runctx(cmd, dict, dict)
  File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
    exec(cmd, globals, locals)
  File "<string>", line 1, in <module>
AttributeError: module 'test.test_pprint' has no attribute 'test_main'

======================================================================
ERROR: test_coverage_ignore (test.test_trace.TestCoverage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 327, 
in test_coverage_ignore
    self._coverage(tracer)
  File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305, 
in _coverage
    tracer.run(cmd)
  File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
    self.runctx(cmd, dict, dict)
  File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
    exec(cmd, globals, locals)
  File "<string>", line 1, in <module>
AttributeError: module 'test.test_pprint' has no attribute 'test_main'

Also breaks 3.4.

-- 
 Ned Deily,
 nad at acm.org


From storchaka at gmail.com  Sun May 17 07:57:31 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 17 May 2015 08:57:31 +0300
Subject: [Python-Dev] cpython (merge 3.4 -> default): Added tests for
 more builtin types.
In-Reply-To: <nad-6332FE.16441316052015@news.gmane.org>
References: <20150516183940.21146.77232@psf.io>
 <nad-6332FE.16441316052015@news.gmane.org>
Message-ID: <mj9akc$ack$1@ger.gmane.org>

On 17.05.15 02:44, Ned Deily wrote:
> In article <20150516183940.21146.77232 at psf.io>,
>   serhiy.storchaka <python-checkins at python.org> wrote:
>> https://hg.python.org/cpython/rev/7b350f712c0e
>> changeset:   96099:7b350f712c0e
>> parent:      96096:f0c94892ac31
>> parent:      96098:955dffec3d94
>> user:        Serhiy Storchaka <storchaka at gmail.com>
>> date:        Sat May 16 21:35:56 2015 +0300
>> summary:
>>    Added tests for more builtin types.
>> Made test_pprint discoverable.
>>
>> files:
>>    Lib/test/test_pprint.py |  17 ++++++++---------
>>    1 files changed, 8 insertions(+), 9 deletions(-)
>
> ======================================================================
> ERROR: test_coverage (test.test_trace.TestCoverage)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 312,
> in test_coverage
>      self._coverage(tracer)
>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305,
> in _coverage
>      tracer.run(cmd)
>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
>      self.runctx(cmd, dict, dict)
>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
>      exec(cmd, globals, locals)
>    File "<string>", line 1, in <module>
> AttributeError: module 'test.test_pprint' has no attribute 'test_main'
>
> ======================================================================
> ERROR: test_coverage_ignore (test.test_trace.TestCoverage)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 327,
> in test_coverage_ignore
>      self._coverage(tracer)
>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305,
> in _coverage
>      tracer.run(cmd)
>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
>      self.runctx(cmd, dict, dict)
>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
>      exec(cmd, globals, locals)
>    File "<string>", line 1, in <module>
> AttributeError: module 'test.test_pprint' has no attribute 'test_main'
>
> Also breaks 3.4.
>

Thank you Ned. Opened issue24215 for this because just restoring 
test_main is perhaps not the best way.


From yselivanov.ml at gmail.com  Sun May 17 17:04:19 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sun, 17 May 2015 11:04:19 -0400
Subject: [Python-Dev] [Python-checkins] peps: Apply Chris's changes,
 including an acceptance mark
In-Reply-To: <20150517021227.21144.86257@psf.io>
References: <20150517021227.21144.86257@psf.io>
Message-ID: <5558ADF3.5070402@gmail.com>

Chris,

Could you please add a link to the email where the PEP was accepted?

Thanks,
Yury

On 2015-05-16 10:12 PM, chris.angelico wrote:
> https://hg.python.org/peps/rev/f876276ce076
> changeset:   5854:f876276ce076
> user:        Chris Angelico <rosuav at gmail.com>
> date:        Sun May 17 12:12:19 2015 +1000
> summary:
>    Apply Chris's changes, including an acceptance mark
>
> files:
>    pep-0485.txt |  6 +++---
>    1 files changed, 3 insertions(+), 3 deletions(-)
>
>
> diff --git a/pep-0485.txt b/pep-0485.txt
> --- a/pep-0485.txt
> +++ b/pep-0485.txt
> @@ -3,7 +3,7 @@
>   Version: $Revision$
>   Last-Modified: $Date$
>   Author: Christopher Barker <Chris.Barker at noaa.gov>
> -Status: Draft
> +Status: Accepted
>   Type: Standards Track
>   Content-Type: text/x-rst
>   Created: 20-Jan-2015
> @@ -391,9 +391,9 @@
>   The most common use case is expected to be small tolerances -- on order of the
>   default 1e-9. However there may be use cases where a user wants to know if two
>   fairly disparate values are within a particular range of each other: "is a
> -within 200% (rel_tol = 2.0) of b? In this case, the string test would never
> +within 200% (rel_tol = 2.0) of b? In this case, the strong test would never
>   indicate that two values are within that range of each other if one of them is
> -zero. The strong case, however would use the larger (non-zero) value for the
> +zero. The weak case, however would use the larger (non-zero) value for the
>   test, and thus return true if one value is zero. For example: is 0 within 200%
>   of 10? 200% of ten is 20, so the range within 200% of ten is -10 to +30. Zero
>   falls within that range, so it will return True.
>
>
>
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> https://mail.python.org/mailman/listinfo/python-checkins


From rosuav at gmail.com  Sun May 17 17:15:54 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Mon, 18 May 2015 01:15:54 +1000
Subject: [Python-Dev] [Python-checkins] peps: Apply Chris's changes,
 including an acceptance mark
In-Reply-To: <5558ADF3.5070402@gmail.com>
References: <20150517021227.21144.86257@psf.io>
	<5558ADF3.5070402@gmail.com>
Message-ID: <CAPTjJmozObPGmLHVyXXrBJOJvMB96oCp=Hos4z2nfAZwsLd8cg@mail.gmail.com>

On Mon, May 18, 2015 at 1:04 AM, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> Chris,
>
> Could you please add a link to the email where the PEP was accepted?

Sure. Is a Resolution: header the right way to do this? Done.

ChrisA

From tjreedy at udel.edu  Sun May 17 19:00:28 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sun, 17 May 2015 13:00:28 -0400
Subject: [Python-Dev] cpython (merge 3.4 -> default): Added tests for
 more builtin types.
In-Reply-To: <mj9akc$ack$1@ger.gmane.org>
References: <20150516183940.21146.77232@psf.io>
 <nad-6332FE.16441316052015@news.gmane.org> <mj9akc$ack$1@ger.gmane.org>
Message-ID: <mjahfc$49b$1@ger.gmane.org>

On 5/17/2015 1:57 AM, Serhiy Storchaka wrote:
> On 17.05.15 02:44, Ned Deily wrote:
>> In article <20150516183940.21146.77232 at psf.io>,
>>   serhiy.storchaka <python-checkins at python.org> wrote:
>>> https://hg.python.org/cpython/rev/7b350f712c0e
>>> changeset:   96099:7b350f712c0e
>>> parent:      96096:f0c94892ac31
>>> parent:      96098:955dffec3d94
>>> user:        Serhiy Storchaka <storchaka at gmail.com>
>>> date:        Sat May 16 21:35:56 2015 +0300
>>> summary:
>>>    Added tests for more builtin types.
>>> Made test_pprint discoverable.
>>>
>>> files:
>>>    Lib/test/test_pprint.py |  17 ++++++++---------
>>>    1 files changed, 8 insertions(+), 9 deletions(-)
>>
>> ======================================================================
>> ERROR: test_coverage (test.test_trace.TestCoverage)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 312,
>> in test_coverage
>>      self._coverage(tracer)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305,
>> in _coverage
>>      tracer.run(cmd)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
>>      self.runctx(cmd, dict, dict)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
>>      exec(cmd, globals, locals)
>>    File "<string>", line 1, in <module>
>> AttributeError: module 'test.test_pprint' has no attribute 'test_main'
>>
>> ======================================================================
>> ERROR: test_coverage_ignore (test.test_trace.TestCoverage)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 327,
>> in test_coverage_ignore
>>      self._coverage(tracer)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/test/test_trace.py", line 305,
>> in _coverage
>>      tracer.run(cmd)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 500, in run
>>      self.runctx(cmd, dict, dict)
>>    File "/py/dev/3x/root/uxd/lib/python3.5/trace.py", line 508, in runctx
>>      exec(cmd, globals, locals)
>>    File "<string>", line 1, in <module>
>> AttributeError: module 'test.test_pprint' has no attribute 'test_main'
>>
>> Also breaks 3.4.
>>
>
> Thank you Ned. Opened issue24215 for this because just restoring
> test_main perhaps not the best way.

test_trace can be easily modified to use test_pprint as revised. See issue.


-- 
Terry Jan Reedy


From njs at pobox.com  Sun May 17 23:38:30 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Sun, 17 May 2015 14:38:30 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
 <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>
 <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>
Message-ID: <CAPJVwBmCkOHHe3QHppQK0BJgKsDAWGTnwLyA=cLTdPYH_ax1JQ@mail.gmail.com>

On Sat, May 16, 2015 at 1:31 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 16 May 2015 at 07:35, Nathaniel Smith <njs at pobox.com> wrote:
>> On Thu, May 14, 2015 at 11:53 PM, Nathaniel Smith <njs at pobox.com> wrote:
>>> On Thu, May 14, 2015 at 9:29 PM, Guido van Rossum <guido at python.org> wrote:
>>>> I expect you can make something that behaves like list by defining __mul__
>>>> and __rmul__ and returning NotImplemented.
>>>
>>> Hmm, it's fairly tricky, and part of the trick is that you can never
>>> return NotImplemented (because you have to pretty much take over and
>>> entirely replace the normal dispatch rules inside __mul__ and
>>> __rmul__), but see attached for something I think should work.
>>>
>>> So I guess this is just how Python's list, tuple, etc. work, and PyPy
>>> and friends need to match...
>>
>> For the record, it looks like PyPy does already have a hack to
>> implement this -- they do it by having a hidden flag on the built-in
>> sequence types which the implementations of '*' and '+' check for, and
>> if it's found it triggers a different rule for dispatching to the
>> __op__ methods:
>>     https://bitbucket.org/pypy/pypy/src/a1a494787f4112e42f50c6583e0fea18db3fb4fa/pypy/objspace/descroperation.py?at=default#cl-692
>
> Oh, that's rather annoying that the PyPy team implemented bug-for-bug
> compatibility there, and didn't follow up on the operand precedence
> bug report to say that they had done so. We also hadn't previously
> been made aware that NumPy is relying on this operand precedence bug
> to implement publicly documented API behaviour, so fixing it *would*
> break end user code :(

I don't think any of us were aware of it either :-).

It is a fairly obscure case -- it only comes up specifically if you
have a single-element integer array that you are trying to multiply by
a list that you expect to be auto-coerced to an array. If Python
semantics were such that this became impossible to handle correctly
then we would survive. (We've certainly survived worse, e.g.
arr[array_of_indices] += 1 silently gives the wrong/unexpected result
when array_of_indices has duplicate entries, and this bites people
constantly. Unfortunately I can't see any reasonable way to fix this
within Python's semantics, so... oh well.)

But yeah, given that we're at a point where list dispatch actually has
worked this way forever and across multiple interpreter
implementations, I think it's de facto going to end up part of the
language specification unless someone does something pretty quick...
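For the record, the documented rule being bent here can be sketched in pure
Python (Base/Sub are made-up names for illustration): normally the left
operand's __mul__ is tried first, and the one sanctioned exception is a right
operand whose type is a proper subclass overriding the reflected method. The
list/NumPy behaviour discussed above effectively extends that priority to a
non-subclass.

```python
class Base:
    def __mul__(self, other):
        return "Base.__mul__"
    def __rmul__(self, other):
        return "Base.__rmul__"

class Sub(Base):
    def __rmul__(self, other):
        return "Sub.__rmul__"

# Normal rule: the left operand's __mul__ is tried first.
assert Base() * Base() == "Base.__mul__"

# Documented exception: a right operand that is a proper subclass of the
# left's type and overrides the reflected method gets __rmul__ tried first.
assert Base() * Sub() == "Sub.__rmul__"
```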

> I guess that means someone in the numeric community will need to write
> a PEP to make this "try the other operand first" "feature" part of the
> language specification, so that other interpreters can implement it up
> front, rather than all having to come up with their own independent
> custom hacks just to make NumPy work.

I'll make a note...

> P.S. It would also be nice if someone could take on the PEP for a
> Python level buffer API for 3.6: http://bugs.python.org/issue13797

At a guess, if you want to find people who feel this itch strongly
enough to try scratching it, then numpy users are probably not your best
bet, because if you have numpy then you already have workarounds. In
particular, numpy still supports a legacy Python-level
buffer export API:

   http://docs.scipy.org/doc/numpy/reference/arrays.interface.html#python-side

So if all you want is to hand a buffer to numpy (rather than to an
arbitrary PEP 3118 consumer) then this works fine, and if you do need
an arbitrary PEP 3118 consumer then you can use numpy as an adaptor
(use __array_interface__ to convert your object to ndarray -> ndarray
supports the PEP 3118 API).

-n

-- 
Nathaniel J. Smith -- http://vorpus.org

From alex.gronholm at nextday.fi  Mon May 18 00:07:12 2015
From: alex.gronholm at nextday.fi (=?UTF-8?B?QWxleCBHcsO2bmhvbG0=?=)
Date: Mon, 18 May 2015 01:07:12 +0300
Subject: [Python-Dev] PEP 484 wishes
Message-ID: <55591110.6060709@nextday.fi>

Looking at PEP 484, I came up with two use cases that I felt were not 
catered for:

 1. Specifying that a parameter should be a subclass of another
    (example: Type[dict] would match dict or OrderedDict; plain "Type"
    would equal "type" from builtins)
 2. Specifying that a callable should take at least the specified
    arguments but would not be limited to them: Callable[[str, int,
    ...], Any]

Case #2 works already (Callable[[str, int], Any]) if the unspecified 
arguments are optional, but not if they're mandatory. Any thoughts?
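For illustration, a sketch of what the two wishes amount to in code
(Type[dict] is the notation proposed here, not part of typing at the time of
writing; annotations are not enforced at runtime, so this runs as-is, and
Callable[..., Any] is shown as the closest existing spelling of wish #2):

```python
from collections import OrderedDict
from typing import Any, Callable, Type

# Wish 1: the parameter is a *class* -- dict itself or a subclass of it --
# rather than an instance of one.
def make_mapping(cls: Type[dict]) -> dict:
    return cls()

assert type(make_mapping(OrderedDict)) is OrderedDict

# Wish 2 has no exact spelling; Callable[..., Any] is the closest
# approximation, leaving the argument list completely open.
def register_handler(handler: Callable[..., Any]) -> None:
    pass
```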

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/72eb00e7/attachment.html>

From chris.barker at noaa.gov  Mon May 18 01:02:49 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Sun, 17 May 2015 16:02:49 -0700
Subject: [Python-Dev] PEP 485 isclose() implementation review requested
Message-ID: <CALGmxEJASd+Y+fMQXQE-6PtBZB1jRu6JwUkn+dNSvWAd3dSBtg@mail.gmail.com>

Folks,

After a huge delay, I finally found the time to implement the PEP 485
isclose() function in C. I think it's time for some review. I apologise
for the fact that I have little experience with C, and haven't used the raw
C API for years, but it's a pretty simple function, and there's lots of
code to copy, so I think it's in OK shape. I have not yet integrated it with
the CPython source code -- it belongs in mathmodule.c, but I found it easier
to put it in a separate module while figuring it out.

You can find the code in the same GitHub repo as the draft PEP and the
Python prototype code:

https://github.com/PythonCHB/close_pep

the code is in:
  is_close_module.c

There is a test module in:
  test_isclose_c.py

and it can be built with:

python3 setup.py build_ext --inplace

Let me know if I should restructure it or put it anywhere else before it
gets reviewed, but in the meantime, I welcome any feedback.

Thanks,
  -Chris

A few questions I have off the bat:

C-API (and plain C) questions:
=============================

* Is there a better way to create a False or True than::

    PyBool_FromLong(0) and PyBool_FromLong(1)

* Style question: should I put brackets in an if clause that has only one
line?::

    if (some_check) {
        just_this_one_expression
    }

* I can't find docs for PyDoc_STRVAR, but it looks like I should use it --
how?

* I'm getting a warning in my PyMethodDef clause::

    static PyMethodDef IsCloseMethods[] = {
        {"isclose", isclose_c, METH_VARARGS | METH_KEYWORDS,
         "determine if two floating point numbers are close"},
        {NULL, NULL, 0, NULL}        /* Sentinel */
    };

is_close_module.c:61:17: warning: incompatible pointer types initializing
      'PyCFunction' (aka 'PyObject *(*)(PyObject *, PyObject *)') with an
      expression of type 'PyObject *(PyObject *, PyObject *, PyObject *)'
      [-Wincompatible-pointer-types]
    {"isclose", isclose_c, METH_VARARGS | METH_KEYWORDS,

but it seems to be working -- and I've copied, as well as I can, the
examples.

Functionality Questions:
========================

* What to do with other numeric types?
  - Integers cast fine...
  - Decimal and Fraction cast fine, too -- but precision is presumably
lost.
  - Complex? -- add it to cmath?


* It's raising an exception for negative tolerances, which don't make sense
  but don't really cause harm either (fabs() is used anyway). I can't recall
  why I did that for the Python prototype, but I reproduced it in the C
  version. Should I?

* What about zero tolerance? Should equal values still be considered close?
They are now, and the tests reflect that.

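For reference, the comparison at the heart of the PEP's Python prototype is a
single expression; a minimal sketch of it (ignoring the inf/NaN special cases
the C version must also handle):

```python
def isclose(a, b, rel_tol=1e-09, abs_tol=0.0):
    # Negative tolerances are rejected, matching the prototype's behavior.
    if rel_tol < 0.0 or abs_tol < 0.0:
        raise ValueError("tolerances must be non-negative")
    # With both tolerances zero, only exact equality counts as close.
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)

assert isclose(1.0, 1.0 + 1e-10)
assert not isclose(0.0, 1e-8)
assert isclose(5.0, 5.0, rel_tol=0.0, abs_tol=0.0)  # equal values are close
```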
-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150517/bdd265e1/attachment.html>

From christian at python.org  Mon May 18 01:16:13 2015
From: christian at python.org (Christian Heimes)
Date: Mon, 18 May 2015 01:16:13 +0200
Subject: [Python-Dev] PEP 485 isclose() implementation review requested
In-Reply-To: <CALGmxEJASd+Y+fMQXQE-6PtBZB1jRu6JwUkn+dNSvWAd3dSBtg@mail.gmail.com>
References: <CALGmxEJASd+Y+fMQXQE-6PtBZB1jRu6JwUkn+dNSvWAd3dSBtg@mail.gmail.com>
Message-ID: <5559213D.2050709@python.org>

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

On 2015-05-18 01:02, Chris Barker wrote:
> * Is there a better way to create a False or True than::
> 
> PyBool_FromLong(0) and PyBool_FromLong(1)

You can use the macros Py_RETURN_TRUE and Py_RETURN_FALSE instead of
returning PyBool_FromLong(1) or PyBool_FromLong(0).


> * Style question: should I put brackets in an if clause that has
> only one line?::
> 
> if (some_check) { just_this_one_expression }

I prefer the extra brackets because they make the code more explicit.
It's really a matter of taste.

> * I can't find docs for PyDoc_STRVAR: but it looks like it should
> use it -- how?

PyDoc_STRVAR(functionname_doc,
"isclose(foo) -> bool\n\
\n\
long doc string.");

> * I'm getting a warning in my PyMethodDef clause::
> 
> static PyMethodDef IsCloseMethods[] = { {"isclose", isclose_c,
> METH_VARARGS | METH_KEYWORDS, "determine if two floating point
> numbers are close"}, {NULL, NULL, 0, NULL}        /* Sentinel */ 
> };

You have to type cast the function pointer to a PyCFunction here:

  (PyCFunction)isclose_c

The type cast is required for KEYWORD functions and NOARGS functions.

Christian

From guido at python.org  Mon May 18 01:50:22 2015
From: guido at python.org (Guido van Rossum)
Date: Sun, 17 May 2015 16:50:22 -0700
Subject: [Python-Dev] PEP 484 wishes
In-Reply-To: <55591110.6060709@nextday.fi>
References: <55591110.6060709@nextday.fi>
Message-ID: <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>

On Sun, May 17, 2015 at 3:07 PM, Alex Gr?nholm <alex.gronholm at nextday.fi>
wrote:

>  Looking at PEP 484, I came up with two use cases that I felt were not
> catered for:
>
>    1. Specifying that a parameter should be a subclass of another
>    (example: Type[dict] would match dict or OrderedDict; plain "Type" would
>    equal "type" from builtins)
>
>
I don't understand. What is "Type"? Can you work this out in a full
example? This code is already okay:

def foo(a: dict):
    ...

foo(OrderedDict())


>
>    1. Specifying that a callable should take at least the specified
>    arguments but would not be limited to them: Callable[[str, int, ...], Any]
>
> Case #2 works already (Callable[[str, int], Any] if the unspecified
> arguments are optional, but not if they're mandatory. Any thoughts?
>
For #2 we explicitly debated this and found that there aren't use cases
known that are strong enough to need additional flexibility in the args of
a callable. (How is the code calling the callable going to know what
arguments are safe to pass?) If there really is a need we can address in a
future revision.

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150517/28d2bbdd/attachment.html>

From storchaka at gmail.com  Mon May 18 11:05:03 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Mon, 18 May 2015 12:05:03 +0300
Subject: [Python-Dev] PyObject_IsInstance is dangerous
Message-ID: <mjca00$9u9$1@ger.gmane.org>

PyObject_IsInstance is not safe when used to check whether an object is an 
instance of a specified builtin type. Typical code:

     rc = PyObject_IsInstance(obj, (PyObject *)&Something_Type);
     if (rc < 0) return NULL;
     if (rc) {
         SomethingObject *something = (SomethingObject *)obj;
         something->some_field ...
     }

The __class__ attribute can be modified, so PyObject_IsInstance() can 
return true even when the object's layout is not compatible with the 
specified structure. Even worse, __class__ can be a dynamic property, so 
PyObject_IsInstance() can execute arbitrary Python code that invalidates 
cached pointers and sizes in the C code.

A safer way would be to use PyObject_IsSubclass():

     rc = PyObject_IsSubclass((PyObject *)Py_TYPE(obj),
                              (PyObject *)&Something_Type);
     if (rc < 0) return NULL;
     if (rc) {
         SomethingObject *something = (SomethingObject *)obj;
         something->some_field ...
     }

For example see issue24102 [1], issue24091 [2] and many other issues 
opened by pkt.

[1] http://bugs.python.org/issue24102
[2] http://bugs.python.org/issue24091
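The hazard is easy to reproduce from the Python side (a sketch; Liar is a
made-up name): isinstance() falls back to __class__, so a property there both
misreports the type and runs arbitrary code in the middle of the check.

```python
class Liar:
    @property
    def __class__(self):
        # Arbitrary Python code runs here, during isinstance() --
        # exactly the hazard described above for C callers.
        return dict

obj = Liar()
assert isinstance(obj, dict)   # reports dict...
assert type(obj) is not dict   # ...but the real layout is not dict's
```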


From greg.ewing at canterbury.ac.nz  Mon May 18 14:14:24 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 19 May 2015 00:14:24 +1200
Subject: [Python-Dev] PyObject_IsInstance is dangerous
In-Reply-To: <mjca00$9u9$1@ger.gmane.org>
References: <mjca00$9u9$1@ger.gmane.org>
Message-ID: <5559D7A0.6040901@canterbury.ac.nz>

Serhiy Storchaka wrote:
> PyObject_IsInstance is not safe when used to check if the object is an 
> instance of specified builtin type.
> 
> The __class__ attribute can be modified and PyObject_IsInstance() can 
> return true if the object has not layout compatible with specified 
> structure.

Code that requires a particular C layout should be using
PyObject_TypeCheck, not PyObject_IsInstance.

-- 
Greg


From storchaka at gmail.com  Mon May 18 14:34:15 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Mon, 18 May 2015 15:34:15 +0300
Subject: [Python-Dev] PyObject_IsInstance is dangerous
In-Reply-To: <5559D7A0.6040901@canterbury.ac.nz>
References: <mjca00$9u9$1@ger.gmane.org> <5559D7A0.6040901@canterbury.ac.nz>
Message-ID: <mjcm88$nna$1@ger.gmane.org>

On 18.05.15 15:14, Greg Ewing wrote:
> Serhiy Storchaka wrote:
>> PyObject_IsInstance is not safe when used to check if the object is an
>> instance of specified builtin type.
>>
>> The __class__ attribute can be modified and PyObject_IsInstance() can
>> return true if the object has not layout compatible with specified
>> structure.
>
> Code that requires a particular C layout should be using
> PyObject_TypeCheck, not PyObject_IsInstance.

Thank you. I didn't know about this helper.

It looks as if most (if not all) usages of PyObject_IsInstance are 
incorrect. Maybe PyObject_IsInstance should be modified so that it never 
returns true if the layouts are not compatible?


From greg.ewing at canterbury.ac.nz  Mon May 18 14:41:48 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 19 May 2015 00:41:48 +1200
Subject: [Python-Dev] PyObject_IsInstance is dangerous
In-Reply-To: <mjcm88$nna$1@ger.gmane.org>
References: <mjca00$9u9$1@ger.gmane.org> <5559D7A0.6040901@canterbury.ac.nz>
 <mjcm88$nna$1@ger.gmane.org>
Message-ID: <5559DE0C.2050105@canterbury.ac.nz>

Serhiy Storchaka wrote:
> May be modify PyObject_IsInstance so that it will never return 
> true if layouts are not compatible?

That wouldn't be a good idea, since PyObject_IsInstance is
meant to reflect the behaviour of python's isinstance()
function, which doesn't care about C layouts.

-- 
Greg

From alex.gronholm at nextday.fi  Mon May 18 09:14:09 2015
From: alex.gronholm at nextday.fi (=?UTF-8?B?QWxleCBHcsO2bmhvbG0=?=)
Date: Mon, 18 May 2015 10:14:09 +0300
Subject: [Python-Dev] PEP 484 wishes
In-Reply-To: <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>
References: <55591110.6060709@nextday.fi>
 <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>
Message-ID: <55599141.8080400@nextday.fi>



18.05.2015, 02:50, Guido van Rossum wrote:
> On Sun, May 17, 2015 at 3:07 PM, Alex Gr?nholm 
> <alex.gronholm at nextday.fi <mailto:alex.gronholm at nextday.fi>> wrote:
>
>     Looking at PEP 484, I came up with two use cases that I felt were
>     not catered for:
>
>      1. Specifying that a parameter should be a subclass of another
>         (example: Type[dict] would match dict or OrderedDict; plain
>         "Type" would equal "type" from builtins)
>
>
> I don't understand. What is "Type"? Can you work this out in a full 
> example? This code is already okay:
>
> def foo(a: dict):
>     ...
>
> foo(OrderedDict())
This code is passing an /instance/ of OrderedDict. But how can I specify 
that foo() accepts a /subclass/ of dict, and not an instance thereof?

A full example:

def foo(a: Type[dict]):
     ...

foo(dict)  # ok
foo(OrderedDict)  # ok
foo({'x': 1})  # error
>
>      1. Specifying that a callable should take at least the specified
>         arguments but would not be limited to them: Callable[[str,
>         int, ...], Any]
>
>     Case #2 works already (Callable[[str, int], Any] if the
>     unspecified arguments are optional, but not if they're mandatory.
>     Any thoughts?
>
> For #2 we explicitly debated this and found that there aren't use 
> cases known that are strong enough to need additional flexibility in 
> the args of a callable. (How is the code calling the callable going to 
> know what arguments are safe to pass?) If there really is a need we 
> can address in a future revision.
Consider a framework where a request handler always takes a Request 
object as its first argument, but the rest of the arguments could be 
anything. If you want to only allow registration of such callables, you 
could do this:

def calculate_sum(request: Request, *values):
    return sum(values)

def register_request_handler(handler: Callable[[Request, ...], Any]):
    ...
> -- 
> --Guido van Rossum (python.org/~guido <http://python.org/%7Eguido>)

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/81c295c1/attachment.html>

From levkivskyi at gmail.com  Mon May 18 13:59:11 2015
From: levkivskyi at gmail.com (Ivan Levkivskyi)
Date: Mon, 18 May 2015 13:59:11 +0200
Subject: [Python-Dev] PEP 484 wishes
Message-ID: <CAOMjWk=18CKTCokzbk7ys+xasJEFvT8yekTiP4A53Lo5ao9E1Q@mail.gmail.com>

> >  Looking at PEP 484, I came up with two use cases that I felt were not
> > catered for:
> >
> >    1. Specifying that a parameter should be a subclass of another
> >    (example: Type[dict] would match dict or OrderedDict; plain "Type"
would
> >    equal "type" from builtins)
> >
> >
> I don't understand. What is "Type"? Can you work this out in a full
> example? This code is already okay:
>
> def foo(a: dict):
>     ...
>
> foo(OrderedDict())

I think Alex means this: https://github.com/ambv/typehinting/issues/107
This could be really useful, for example:

def fancy_instantiate(cls: Type[T]) -> T:
    ...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/1d508d26/attachment.html>

From guido at python.org  Mon May 18 17:05:14 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 18 May 2015 08:05:14 -0700
Subject: [Python-Dev] PEP 484 wishes
In-Reply-To: <55599141.8080400@nextday.fi>
References: <55591110.6060709@nextday.fi>
 <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>
 <55599141.8080400@nextday.fi>
Message-ID: <CAP7+vJK-s97bRPQ6nkV1ZN0FBhyND2Vw7OS_gUh6AoN4eibrAw@mail.gmail.com>

On Mon, May 18, 2015 at 12:14 AM, Alex Gr?nholm <alex.gronholm at nextday.fi>
wrote:

>
>
> 18.05.2015, 02:50, Guido van Rossum kirjoitti:
>
>  On Sun, May 17, 2015 at 3:07 PM, Alex Gr?nholm <alex.gronholm at nextday.fi>
> wrote:
>
>>  Looking at PEP 484, I came up with two use cases that I felt were not
>> catered for:
>>
>>    1. Specifying that a parameter should be a subclass of another
>>    (example: Type[dict] would match dict or OrderedDict; plain "Type" would
>>    equal "type" from builtins)
>>
>>
>  I don't understand. What is "Type"? Can you work this out in a full
> example? This code is already okay:
>
>  def foo(a: dict):
>     ...
>
>  foo(OrderedDict())
>
> This code is passing an *instance* of OrderedDict. But how can I specify
> that foo() accepts a *subclass* of dict, and not an instance thereof?
>
> A full example:
>
> def foo(a: Type[dict]):
>     ...
>
> foo(dict)  # ok
> foo(OrderedDict)  # ok
> foo({'x': 1})  # error
>

You want the argument to be a *class*. We currently don't support that
beyond using 'type' as the annotation. We may get to this in a future
version; it is relatively uncommon. As to what notation to use, perhaps it
would make more sense to use Class and Class[dict], since in the world of
PEP 484, a class is a concrete thing that you can instantiate, while a type
is an abstraction used to describe the possible values of a
variable/argument/etc.

Also, what you gave is still not a full example, since you don't show what
you are going to do with that type. Not every class can be easily
instantiated (without knowing the specific signature). So if you were
planning to instantiate it, perhaps you should use Callable[..., dict] as
the type instead. (The ellipsis is not yet supported by mypy --
https://github.com/JukkaL/mypy/issues/393 -- but it is allowed by the PEP.)
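A minimal sketch of the Callable[..., dict] suggestion (build is a made-up
helper name; annotations are not enforced at runtime, so this runs as-is):

```python
from collections import OrderedDict
from typing import Any, Callable

def build(factory: Callable[..., dict], *args: Any, **kwargs: Any) -> dict:
    # The factory may be dict itself, a dict subclass, or any function
    # returning a dict; the ellipsis leaves its signature open.
    return factory(*args, **kwargs)

assert build(dict, x=1) == {'x': 1}
assert type(build(OrderedDict)) is OrderedDict
```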


>
>
>>
>>    1. Specifying that a callable should take at least the specified
>>    arguments but would not be limited to them: Callable[[str, int, ...], Any]
>>
>> Case #2 works already (Callable[[str, int], Any] if the unspecified
>> arguments are optional, but not if they're mandatory. Any thoughts?
>>
> For #2 we explicitly debated this and found that there aren't use cases
> known that are strong enough to need additional flexibility in the args of
> a callable. (How is the code calling the callable going to know what
> arguments are safe to pass?) If there really is a need we can address in a
> future revision.
>
> Consider a framework where a request handler always takes a Request object
> as its first argument, but the rest of the arguments could be anything. If
> you want to only allow registration of such callables, you could do this:
>
> def calculate_sum(request: Request, *values):
>    return sum(values)
>
> def register_request_handler(handler: Callable[[Request, ...], Any]):
>    ...
>

Hm... Yeah, you'd be stuck with using Callable[..., Any] for now. Maybe in
a future version of the PEP. (We can't boil the ocean of typing in one PEP.
:-)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/fea1a6dd/attachment.html>

From chris.barker at noaa.gov  Mon May 18 17:49:52 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 18 May 2015 08:49:52 -0700
Subject: [Python-Dev] PEP 485 isclose() implementation review requested
In-Reply-To: <5559213D.2050709@python.org>
References: <CALGmxEJASd+Y+fMQXQE-6PtBZB1jRu6JwUkn+dNSvWAd3dSBtg@mail.gmail.com>
 <5559213D.2050709@python.org>
Message-ID: <CALGmxE+dLxKSvdkRU4hZCC5KvZUs6jFPgL8+scof=4WC-3QZqw@mail.gmail.com>

Thanks Christian, that clears up a couple of things -- got it compiling
without warnings.

But I also discovered that I must not have pushed the latest copy yesterday.

It's on a machine at home -- I'll push it tonight. But the copy on GitHub
now is mostly good -- I think the only changes are handling the docstrings
better and some more/better tests.

-Chris

On Sun, May 17, 2015 at 4:16 PM, Christian Heimes <christian at python.org>
wrote:

> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA512
>
> On 2015-05-18 01:02, Chris Barker wrote:
> > * Is there a better way to create a False or True than::
> >
> > PyBool_FromLong(0) and PyBool_FromLong(1)
>
> You can use the macros Py_RETURN_TRUE and Py_RETURN_FALSE instead of
> return PyBool_FromLong(0).
>
>
> > * Style question: should I put brackets in an if clause that has
> > only one line?::
> >
> > if (some_check) { just_this_one_expression }
>
> I prefer the extra brackets because they make the code more explicit.
> It's really a matter of taste.
>
> > * I can't find docs for PyDoc_STRVAR: but it looks like it should
> > use it -- how?
>
> PyDoc_STRVAR(functionname_doc,
> "isclose(foo) -> bool\n\
> \n\
> long doc string.");
>
> > * I'm getting a warning in my PyMethodDef clause::
> >
> > static PyMethodDef IsCloseMethods[] = { {"isclose", isclose_c,
> > METH_VARARGS | METH_KEYWORDS, "determine if two floating point
> > numbers are close"}, {NULL, NULL, 0, NULL}        /* Sentinel */
> > };
>
> You have to type cast the function pointer to a PyCFunction here:
>
>   (PyCFunction)isclose_c
>
> The type cast is required for KEYWORD functions and NOARGS functions.
>
> Christian
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/d490b2dc/attachment.html>

From christian at python.org  Mon May 18 18:00:51 2015
From: christian at python.org (Christian Heimes)
Date: Mon, 18 May 2015 18:00:51 +0200
Subject: [Python-Dev] PEP 485 isclose() implementation review requested
In-Reply-To: <CALGmxE+dLxKSvdkRU4hZCC5KvZUs6jFPgL8+scof=4WC-3QZqw@mail.gmail.com>
References: <CALGmxEJASd+Y+fMQXQE-6PtBZB1jRu6JwUkn+dNSvWAd3dSBtg@mail.gmail.com>
 <5559213D.2050709@python.org>
 <CALGmxE+dLxKSvdkRU4hZCC5KvZUs6jFPgL8+scof=4WC-3QZqw@mail.gmail.com>
Message-ID: <555A0CB3.4060608@python.org>

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

On 2015-05-18 17:49, Chris Barker wrote:
> Thanks Christian, that clears up a couple of things -- got it
> compiling without warning.
> 
> But I also discovered that I must have not pushed the latest copy
> yesterday.
> 
> It's on a machine at home -- I'll push it tonight. But the copy on 
> GitHub now is mostly good -- I think the only changes are handling
> the docstrings better and some more/better tests.

You're welcome!

Does your latest patch handle NaN, too? I only noticed infinity checks
but no explicit check for NaN.
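(An aside for the archive: a minimal pure-Python sketch of the PEP 485 semantics -- what eventually shipped as math.isclose -- showing how NaN and infinities fall out of the comparison rules. This is an illustrative sketch, not the patch under review.)

```python
import math

def isclose(a, b, rel_tol=1e-09, abs_tol=0.0):
    """Sketch of PEP 485: True if a and b are close to each other."""
    if rel_tol < 0.0 or abs_tol < 0.0:
        raise ValueError("tolerances must be non-negative")
    if a == b:
        # Fast path; also makes equal infinities "close".
        # NaN != NaN, so NaN never takes this branch.
        return True
    if math.isinf(a) or math.isinf(b):
        # A finite number is never close to an infinity,
        # and +inf is never close to -inf.
        return False
    diff = abs(b - a)
    # Symmetric relative test per PEP 485, plus the absolute floor.
    # If a or b is NaN, diff is NaN and every comparison is False.
    return (diff <= abs(rel_tol * b) or
            diff <= abs(rel_tol * a) or
            diff <= abs_tol)
```

NaN needs no explicit branch here: every comparison involving NaN is false, so the function returns False.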

Christian

-----BEGIN PGP SIGNATURE-----

iQEcBAEBCgAGBQJVWgyrAAoJEIZoUkkhLbaJhBsH/3gH7Z1+otWwR6hIYjNU4OjK
xjPmGeypisU6UDxfQa+lIHg1rEmyxlSbkXtn2DYysw9CMentK/XclF8GzWAA/ySV
m0UE4+hzcWB7fsOLgbCxQKfNTEM/jp7D2z3qIZTknEecHENx552AaEfTXRTWQKHK
QLh0sLA3QrOzUkOf+EXJQHlYvxf+F71PVyfOX8/m3XHhaQrpb70AGktsUPRDN3yc
blY6SSQoV1uhw+/crqz34BoPGipAkZdq9abyz4Ja0adC8hT++7rbVldFdsrDIPdQ
MX30atV+ZQ2Mb5NqJkmEjCKF5uXvwvlP8ijgz5nZKZ9db+9Z8YS0/e7UrPb85uM=
=7N7z
-----END PGP SIGNATURE-----

From alex.gronholm at nextday.fi  Mon May 18 20:01:06 2015
From: alex.gronholm at nextday.fi (=?UTF-8?B?QWxleCBHcsO2bmhvbG0=?=)
Date: Mon, 18 May 2015 21:01:06 +0300
Subject: [Python-Dev] PEP 484 wishes
In-Reply-To: <CAP7+vJK-s97bRPQ6nkV1ZN0FBhyND2Vw7OS_gUh6AoN4eibrAw@mail.gmail.com>
References: <55591110.6060709@nextday.fi>
 <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>
 <55599141.8080400@nextday.fi>
 <CAP7+vJK-s97bRPQ6nkV1ZN0FBhyND2Vw7OS_gUh6AoN4eibrAw@mail.gmail.com>
Message-ID: <555A28E2.1090701@nextday.fi>



18.05.2015, 18:05, Guido van Rossum kirjoitti:
> On Mon, May 18, 2015 at 12:14 AM, Alex Grönholm 
> <alex.gronholm at nextday.fi <mailto:alex.gronholm at nextday.fi>> wrote:
>
>
>
>     18.05.2015, 02:50, Guido van Rossum kirjoitti:
>>     On Sun, May 17, 2015 at 3:07 PM, Alex Grönholm
>>     <alex.gronholm at nextday.fi <mailto:alex.gronholm at nextday.fi>> wrote:
>>
>>         Looking at PEP 484, I came up with two use cases that I felt
>>         were not catered for:
>>
>>          1. Specifying that a parameter should be a subclass of
>>             another (example: Type[dict] would match dict or
>>             OrderedDict; plain "Type" would equal "type" from builtins)
>>
>>
>>     I don't understand. What is "Type"? Can you work this out in a
>>     full example? This code is already okay:
>>
>>     def foo(a: dict):
>>         ...
>>
>>     foo(OrderedDict())
>     This code is passing an /instance/ of OrderedDict. But how can I
>     specify that foo() accepts a /subclass/ of dict, and not an
>     instance thereof?
>
>     A full example:
>
>     def foo(a: Type[dict]):
>         ...
>
>     foo(dict)  # ok
>     foo(OrderedDict)  # ok
>     foo({'x': 1})  # error
>
>
> You want the argument to be a *class*. We currently don't support that 
> beyond using 'type' as the annotation. We may get to this in a future 
> version; it is relatively uncommon. As to what notation to use, 
> perhaps it would make more sense to use Class and Class[dict], since 
> in the world of PEP 484, a class is a concrete thing that you can 
> instantiate, while a type is an abstraction used to describe the 
> possible values of a variable/argument/etc.
>
> Also, what you gave is still not a full example, since you don't show 
> what you are going to do with that type. Not every class can be easily 
> instantiated (without knowing the specific signature). So if you were 
> planning to instantiate it, perhaps you should use Callable[..., dict] 
> as the type instead. (The ellipsis is not yet supported by mypy -- 
> https://github.com/JukkaL/mypy/issues/393 -- but it is allowed by the 
> PEP.)
Here's one example, straight from the code of my new framework:

@typechecked
def register_extension_type(ext_type: str, extension_class: type, 
replace: bool=False):
     """
     Adds a new extension type that can be used with a dictionary based 
configuration.

     :param ext_type: the extension type identifier
     :param extension_class: a class that implements IExtension
     :param replace: ``True`` to replace an existing type
     """

     assert_subclass('extension_class', extension_class, IExtension)
     if ext_type in extension_types and not replace:
         raise ValueError('Extension type "{}" already 
exists'.format(ext_type))

     extension_types[ext_type] = extension_class

I would like to declare the second argument as "extension_class: 
Type[IExtension]" (or Class[IExtension], doesn't matter to me). 
Likewise, the type hint for "extension_types" should be "Dict[str, 
Type[IExtension]]".
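(Editor's note: a Type[C] form along these lines was later added to the typing module, so the wish above can now be sketched directly. The IExtension and MyExtension classes below are illustrative stand-ins, and the runtime check merely mirrors the assert_subclass() call from the original code:)

```python
from typing import Dict, Type

class IExtension:
    """Stand-in for the framework's extension interface."""

extension_types: Dict[str, Type[IExtension]] = {}

def register_extension_type(ext_type: str,
                            extension_class: Type[IExtension],
                            replace: bool = False) -> None:
    # Runtime guard standing in for assert_subclass().
    if not (isinstance(extension_class, type) and
            issubclass(extension_class, IExtension)):
        raise TypeError("extension_class must be a subclass of IExtension")
    if ext_type in extension_types and not replace:
        raise ValueError('Extension type "{}" already exists'.format(ext_type))
    extension_types[ext_type] = extension_class

class MyExtension(IExtension):
    pass

register_extension_type("my", MyExtension)   # passing the class itself
```

Note that the registry maps names to classes, not instances, which is exactly what Type[IExtension] expresses.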
>
>>          1. Specifying that a callable should take at least the
>>             specified arguments but would not be limited to them:
>>             Callable[[str, int, ...], Any]
>>
>>         Case #2 works already (Callable[[str, int], Any]) if the
>>         unspecified arguments are optional, but not if they're
>>         mandatory. Any thoughts?
>>
>>     For #2 we explicitly debated this and found that there aren't use
>>     cases known that are strong enough to need additional flexibility
>>     in the args of a callable. (How is the code calling the callable
>>     going to know what arguments are safe to pass?) If there really
>>     is a need we can address in a future revision.
>     Consider a framework where a request handler always takes a
>     Request object as its first argument, but the rest of the
>     arguments could be anything. If you want to only allow
>     registration of such callables, you could do this:
>
>     def calculate_sum(request: Request, *values):
>        return sum(values)
>
>     def register_request_handler(handler: Callable[[Request, ...], Any]):
>        ...
>
>
> Hm... Yeah, you'd be stuck with using Callable[..., Any] for now. 
> Maybe in a future version of the PEP. (We can't boil the ocean of 
> typing in one PEP. :-)
>
> -- 
> --Guido van Rossum (python.org/~guido <http://python.org/%7Eguido>)
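(Editor's note: until something like Callable[[Request, ...], Any] is expressible, an interim sketch is to annotate with Callable[..., Any] and enforce the Request-first convention at registration time with inspect.signature. Request and the handlers registry here are illustrative stand-ins, not framework API:)

```python
import inspect
from typing import Any, Callable, List

class Request:
    """Illustrative stand-in for the framework's Request type."""

handlers: List[Callable[..., Any]] = []

def register_request_handler(handler: Callable[..., Any]) -> None:
    # Check at runtime that the first parameter is annotated as Request.
    params = list(inspect.signature(handler).parameters.values())
    if not params or params[0].annotation is not Request:
        raise TypeError("handler must take a Request as its first argument")
    handlers.append(handler)

def calculate_sum(request: Request, *values):
    return sum(values)

register_request_handler(calculate_sum)   # accepted: Request comes first
```

This trades static checking for a registration-time error, which at least fails early rather than at call time.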

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/b1556df0/attachment-0001.html>

From guido at python.org  Mon May 18 21:32:35 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 18 May 2015 12:32:35 -0700
Subject: [Python-Dev] PEP 484 wishes
In-Reply-To: <555A28E2.1090701@nextday.fi>
References: <55591110.6060709@nextday.fi>
 <CAP7+vJK8hqBhTXXz7U2cRuOcs4LWDejQd4SE8pRA-prSNhGu1w@mail.gmail.com>
 <55599141.8080400@nextday.fi>
 <CAP7+vJK-s97bRPQ6nkV1ZN0FBhyND2Vw7OS_gUh6AoN4eibrAw@mail.gmail.com>
 <555A28E2.1090701@nextday.fi>
Message-ID: <CAP7+vJLdt9_0R8WAdKZ8YrsnPP3GqUhufarjvVet=f47HYK2JQ@mail.gmail.com>

Can you add your example to the issue?
https://github.com/ambv/typehinting/issues/107

We're trying to finish up PEP 484 in the next few days (wait for an
announcement :-) and we just don't have time for every use case; but over
the course of 3.5 we will be adding features that are considered useful,
and we'll keep the issue open to remind us of it. Until then you'll have to
use plain "type" as the annotation (still better than "Any". :-)

On Mon, May 18, 2015 at 11:01 AM, Alex Grönholm <alex.gronholm at nextday.fi>
wrote:

>
>
> 18.05.2015, 18:05, Guido van Rossum kirjoitti:
>
>  On Mon, May 18, 2015 at 12:14 AM, Alex Grönholm <alex.gronholm at nextday.fi
> > wrote:
>
>>
>>
>> 18.05.2015, 02:50, Guido van Rossum kirjoitti:
>>
>>  On Sun, May 17, 2015 at 3:07 PM, Alex Grönholm <alex.gronholm at nextday.fi
>> > wrote:
>>
>>>  Looking at PEP 484, I came up with two use cases that I felt were not
>>> catered for:
>>>
>>>    1. Specifying that a parameter should be a subclass of another
>>>    (example: Type[dict] would match dict or OrderedDict; plain "Type" would
>>>    equal "type" from builtins)
>>>
>>>
>>  I don't understand. What is "Type"? Can you work this out in a full
>> example? This code is already okay:
>>
>>  def foo(a: dict):
>>     ...
>>
>>  foo(OrderedDict())
>>
>>  This code is passing an *instance* of OrderedDict. But how can I
>> specify that foo() accepts a *subclass* of dict, and not an instance
>> thereof?
>>
>> A full example:
>>
>> def foo(a: Type[dict]):
>>     ...
>>
>>   foo(dict)  # ok
>> foo(OrderedDict)  # ok
>> foo({'x': 1})  # error
>>
>
>  You want the argument to be a *class*. We currently don't support that
> beyond using 'type' as the annotation. We may get to this in a future
> version; it is relatively uncommon. As to what notation to use, perhaps it
> would make more sense to use Class and Class[dict], since in the world of
> PEP 484, a class is a concrete thing that you can instantiate, while a type
> is an abstraction used to describe the possible values of a
> variable/argument/etc.
>
> Also, what you gave is still not a full example, since you don't show what
> you are going to do with that type. Not every class can be easily
> instantiated (without knowing the specific signature). So if you were
> planning to instantiate it, perhaps you should use Callable[..., dict] as
> the type instead. (The ellipsis is not yet supported by mypy --
> https://github.com/JukkaL/mypy/issues/393 -- but it is allowed by the
> PEP.)
>
> Here's one example, straight from the code of my new framework:
>
>  @typechecked
> def register_extension_type(ext_type: str, extension_class: type, replace:
> bool=False):
>      """
>     Adds a new extension type that can be used with a dictionary based
> configuration.
>
>     :param ext_type: the extension type identifier
>     :param extension_class: a class that implements IExtension
>     :param replace: ``True`` to replace an existing type
>     """
>
>     assert_subclass('extension_class', extension_class, IExtension)
>     if ext_type in extension_types and not replace:
>         raise ValueError('Extension type "{}" already
> exists'.format(ext_type))
>
>     extension_types[ext_type] = extension_class
>
> I would like to declare the second argument as "extension_class:
> Type[IExtension]" (or Class[IExtension], doesn't matter to me). Likewise,
> the type hint for "extension_types" should be "Dict[str, Type[IExtension]]".
>
>
>
>>
>>
>>>
>>>    1. Specifying that a callable should take at least the specified
>>>    arguments but would not be limited to them: Callable[[str, int, ...], Any]
>>>
>>> Case #2 works already (Callable[[str, int], Any]) if the unspecified
>>> arguments are optional, but not if they're mandatory. Any thoughts?
>>>
>> For #2 we explicitly debated this and found that there aren't use cases
>> known that are strong enough to need additional flexibility in the args of
>> a callable. (How is the code calling the callable going to know what
>> arguments are safe to pass?) If there really is a need we can address in a
>> future revision.
>>
>>  Consider a framework where a request handler always takes a Request
>> object as its first argument, but the rest of the arguments could be
>> anything. If you want to only allow registration of such callables, you
>> could do this:
>>
>> def calculate_sum(request: Request, *values):
>>    return sum(values)
>>
>> def register_request_handler(handler: Callable[[Request, ...], Any]):
>>    ...
>>
>
>  Hm... Yeah, you'd be stuck with using Callable[..., Any] for now. Maybe
> in a future version of the PEP. (We can't boil the ocean of typing in one
> PEP. :-)
>
> --
> --Guido van Rossum (python.org/~guido <http://python.org/%7Eguido>)
>
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>
>


-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/af035a48/attachment.html>

From Lcolvard at horrycountyschools.net  Mon May 18 21:02:40 2015
From: Lcolvard at horrycountyschools.net (Lisa Colvard)
Date: Mon, 18 May 2015 19:02:40 +0000
Subject: [Python-Dev] Gcode path
Message-ID: <SN1PR0201MB159798A95892455150A9D6E7A5C40@SN1PR0201MB1597.namprd02.prod.outlook.com>

I am trying to get ReplicatorG to use Python to generate gcode, but it keeps giving the error "generate Gcode requires that a Python interpreter be installed.  Would you like to visit the Python download page now?"

I tell it no because I have it installed.  I have even gone in and selected the path for the gcode.  It still gives me the same error.  How can I fix this?

Thanks,

Lisa Colvard, Ed.D.
Conway High School
Media/Technology Specialist
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/30e64bd3/attachment.html>

From guido at python.org  Mon May 18 22:18:22 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 18 May 2015 13:18:22 -0700
Subject: [Python-Dev] Gcode path
In-Reply-To: <SN1PR0201MB159798A95892455150A9D6E7A5C40@SN1PR0201MB1597.namprd02.prod.outlook.com>
References: <SN1PR0201MB159798A95892455150A9D6E7A5C40@SN1PR0201MB1597.namprd02.prod.outlook.com>
Message-ID: <CAP7+vJJ7uRFfh3d7rNUPvpmLwBkh_=Mp1_AVz25wKt46b3EEag@mail.gmail.com>

Hi Lisa,

It's unlikely that anyone on this list can help you with this. It's a
question you should ask of the ReplicatorG people, not the Python people...

--Guido

On Mon, May 18, 2015 at 12:02 PM, Lisa Colvard <
Lcolvard at horrycountyschools.net> wrote:

>  I am trying to get Replicatorg to use python to gcode but it keeps
> giving the error "generate Gcode  requires that a Python interpreter be
> installed.  Would you like to visit the Python download page now?"
>
>
>
> I tell it no because I have it installed.  I have even gone in and
> selected the path for the gcode.  It still gives me the same error.  How
> can I fix this?
>
>
>
> Thanks,
>
>
>
> Lisa Colvard, Ed.D.
>
> Conway High School
>
> Media/Technology Specialist
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>
>


-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/5806a165/attachment.html>

From tjreedy at udel.edu  Tue May 19 00:44:11 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 18 May 2015 18:44:11 -0400
Subject: [Python-Dev] Gcode path
In-Reply-To: <CAP7+vJJ7uRFfh3d7rNUPvpmLwBkh_=Mp1_AVz25wKt46b3EEag@mail.gmail.com>
References: <SN1PR0201MB159798A95892455150A9D6E7A5C40@SN1PR0201MB1597.namprd02.prod.outlook.com>
 <CAP7+vJJ7uRFfh3d7rNUPvpmLwBkh_=Mp1_AVz25wKt46b3EEag@mail.gmail.com>
Message-ID: <mjdpvt$a4j$1@ger.gmane.org>

On 5/18/2015 4:18 PM, Guido van Rossum wrote:

> It's unlikely that anyone on this list can help you with this. It's a
> question you should ask of the ReplicatorG people, not the Python people...

If that does not work, try Stack Overflow, or possibly even python-list.
This list is focused only on development of future Python releases.

-- 
Terry Jan Reedy


From chris.barker at noaa.gov  Tue May 19 01:50:09 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 18 May 2015 16:50:09 -0700
Subject: [Python-Dev] Gcode path
In-Reply-To: <CAP7+vJJ7uRFfh3d7rNUPvpmLwBkh_=Mp1_AVz25wKt46b3EEag@mail.gmail.com>
References: <SN1PR0201MB159798A95892455150A9D6E7A5C40@SN1PR0201MB1597.namprd02.prod.outlook.com>
 <CAP7+vJJ7uRFfh3d7rNUPvpmLwBkh_=Mp1_AVz25wKt46b3EEag@mail.gmail.com>
Message-ID: <CALGmxELJ4a9ByX-xaV6kJ-APH0PE9mtdG1T77BZuwdzX5PvinQ@mail.gmail.com>

Lisa,

As noted, not the right list.

But seeing this kind of stuff done in High Schools is GREAT!

So one suggestion:

If this is Windows, note that there are two versions of Python for Windows: 32-bit
and 64-bit -- if an installer for a third-party package is looking for one of
those and the other is installed, you will get a confusing error,
something like "Python is not installed", when it really means "the correct
version of Python is not installed".

So check which version ReplicatorG is expecting and make sure
that's the one you installed.

Good luck,

-Chris



On Mon, May 18, 2015 at 1:18 PM, Guido van Rossum <guido at python.org> wrote:

> Hi Lisa,
>
> It's unlikely that anyone on this list can help you with this. It's a
> question you should ask of the ReplicatorG people, not the Python people...
>
> --Guido
>
> On Mon, May 18, 2015 at 12:02 PM, Lisa Colvard <
> Lcolvard at horrycountyschools.net> wrote:
>
>>  I am trying to get Replicatorg to use python to gcode but it keeps
>> giving the error "generate Gcode  requires that a Python interpreter be
>> installed.  Would you like to visit the Python download page now?"
>>
>>
>>
>> I tell it no because I have it installed.  I have even gone in and
>> selected the path for the gcode.  It still gives me the same error.  How
>> can I fix this?
>>
>>
>>
>> Thanks,
>>
>>
>>
>> Lisa Colvard, Ed.D.
>>
>> Conway High School
>>
>> Media/Technology Specialist
>>
>>
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>>
>>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov
>
>


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150518/f45ceba3/attachment.html>

From arigo at tunes.org  Tue May 19 16:00:54 2015
From: arigo at tunes.org (Armin Rigo)
Date: Tue, 19 May 2015 16:00:54 +0200
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
 <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>
 <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>
Message-ID: <CAMSv6X0XkLt9w-gPcxsm75wwsmBA0rLNfpLkuPz5jvK12aS+oQ@mail.gmail.com>

Hi Nick,

On 16 May 2015 at 10:31, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Oh, that's rather annoying that the PyPy team implemented bug-for-bug
> compatibility there, and didn't follow up on the operand precedence
> bug report to say that they had done so.

It's sadly not the only place, by far, where a behavior of CPython
could be considered an implementation detail, but people rely on it
and so we need to write a workaround.  We don't report all of them,
particularly not the ones that are clearly of the kind "won't be
changed in CPython 2.7".  Maybe we should?

Another example where this same bug occurs is:

    class T(tuple):
       def __radd__(self, other):
          return 42

    lst = [ ]
    lst += T()

which calls T.__radd__ in contradiction to all the general rules.
(Yes, if you print(lst) afterwards, you get 42.  And oops, trying this
out on PyPy does not give 42; only "lst + T()" does.  Probably another
corner case to fix...)
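(Editor's note: the snippet above is self-contained and can be run directly to see the surprise; this is the behavior observed on CPython, both 2.7 and 3.x, at the time of writing:)

```python
class T(tuple):
    def __radd__(self, other):
        return 42

lst = []
lst += T()       # INPLACE_ADD ends up calling T.__radd__ on CPython,
print(lst)       # so the name lst is rebound to 42 rather than extended

print([] + T())  # plain + also dispatches to T.__radd__
```

The augmented assignment never reaches list's in-place concatenation: the numeric add slot on T wins first.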


A bientôt,

Armin.

From kushaldas at gmail.com  Tue May 19 17:02:51 2015
From: kushaldas at gmail.com (Kushal Das)
Date: Tue, 19 May 2015 20:32:51 +0530
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
Message-ID: <20150519150251.GJ20426@kdas-laptop>

Hi,

With the help of CentOS project I am happy to announce an automated
system [1] to test patches from bugs.python.org. This can be fully automated
to test the patches whenever someone uploads a patch in the roundup, but
for now it accepts IRC commands on #python-dev channel. I worked on a
docker based prototype during sprints in PyCon.

How to use it?
---------------

1. Join #python-dev on irc.freenode.net.
2. Ask for test privileges from any one of kushal, Taggnostr, or bitdancer.
3. They will issue a simple command: #add: YOUR_NICK_NAME
4. You can then test by issuing the following command in the channel:
    
    #test: BUGNUMBER
    like #test: 21271

This will do the following:
Start a new job on ci.centos.org, announce it on the channel, and
announce the result also.

I will be working on a minimal linter for patches, and will include it
in the workflow.

The current steps can be found at [2]. Each build is happening on a
fresh system.

Limitations
-----------

1. It expects a single patch to contain all the required changes
(instead of a series of patches).
2. It runs only on x86_64 architecture, CentOS7 based systems.



[1] https://ci.centos.org/job/cPython-build-patch/
[2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh

Kushal
-- 
Fedora Cloud Engineer
CPython Core Developer
Director @ Python Software Foundation
http://kushaldas.in

From berker.peksag at gmail.com  Tue May 19 17:48:27 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Tue, 19 May 2015 18:48:27 +0300
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <20150519150251.GJ20426@kdas-laptop>
References: <20150519150251.GJ20426@kdas-laptop>
Message-ID: <CAF4280LgsZe4tp5d8mKx8xwmgdvWhEffTLB5pk5Po0pzKTPZEw@mail.gmail.com>

On Tue, May 19, 2015 at 6:02 PM, Kushal Das <kushaldas at gmail.com> wrote:
> Hi,
>
> With the help of CentOS project I am happy to announce an automated
> system [1] to test patches from bugs.python.org. This can be fully automated
> to test the patches whenever someone uploads a patch in the roundup, but
> for now it accepts IRC commands on #python-dev channel. I worked on a
> docker based prototype during sprints in PyCon.
>
> How to use it?
> ---------------
>
> 1. Join #python-dev on irc.freenode.net.
> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
> 3. They will issue a simple command. #add: YOUR_NICK_NAME
> 4. You can then test by issuing the following command in the channel:
>
>     #test: BUGNUMBER
>     like #test: 21271
>
> This will do the following:
> Start a new job on ci.centos.org, announce it on the channel, and
> announce the result also.

Hi Kushal,

Looks great, thanks! :)

Two comments:

* It would be good to have a pypatcher repository at hg.python.org (at
least a mirror), so we can work on it together without dealing with
"add me to the repo" messages on GitHub.
* Do you have a roadmap or a TODO list? For example, I think
downloading a tarball of the default branch every time (or is it
cached?) would be a little bit slow. Do you have a plan to make the
workflow Mercurial based (e.g. "hg pull -u, hg imp --no-c
issueXXXX.diff, compile" instead of "wget tarball, extract it, apply
patch, compile")?

> I will be working on a minimal linter for patches, and will include it
> in the workflow.

Could you give more details about the linter? Can we use
Tools/scripts/patchcheck.py?

Thanks!

--Berker

From kushaldas at gmail.com  Tue May 19 18:53:15 2015
From: kushaldas at gmail.com (Kushal Das)
Date: Tue, 19 May 2015 22:23:15 +0530
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <CAF4280LgsZe4tp5d8mKx8xwmgdvWhEffTLB5pk5Po0pzKTPZEw@mail.gmail.com>
References: <20150519150251.GJ20426@kdas-laptop>
 <CAF4280LgsZe4tp5d8mKx8xwmgdvWhEffTLB5pk5Po0pzKTPZEw@mail.gmail.com>
Message-ID: <20150519165315.GA12901@kdas-laptop.redhat.com>

On 19/05/15, Berker Peksağ wrote:
> On Tue, May 19, 2015 at 6:02 PM, Kushal Das <kushaldas at gmail.com> wrote:
> > Hi,
> >
> > With the help of CentOS project I am happy to announce an automated
> > system [1] to test patches from bugs.python.org. This can be fully automated
> > to test the patches whenever someone uploads a patch in the roundup, but
> > for now it accepts IRC commands on #python-dev channel. I worked on a
> > docker based prototype during sprints in PyCon.
> >
> > How to use it?
> > ---------------
> >
> > 1. Join #python-dev on irc.freenode.net.
> > 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
> > 3. They will issue a simple command. #add: YOUR_NICK_NAME
> > 4. You can then test by issuing the following command in the channel:
> >
> >     #test: BUGNUMBER
> >     like #test: 21271
> >
> > This will do the following:
> > Start a new job on ci.centos.org, announce it on the channel, and
> > announce the result also.
> 
> Hi Kushal,
> 
> Looks great, thanks! :)
> 
> Two comments:
> 
> * It would be good to have a pypatcher repository at hg.python.org (at
> least a mirror), so we can work on it together without dealing with
> "add me to the repo" messages on GitHub.

We can surely do this. I started with GitHub since most people
are already there. Do you know what the procedure is for creating a new
repo on hg.python.org?


> * Do you have a roadmap or a TODO list? For example, I think
> downloading a tarball of the default branch every time (or is it
> cached?) would be a little bit slow. Do you have a plan to make the
> workflow Mercurial based (e.g. "hg pull -u, hg imp --no-c
> issueXXXX.diff, compile" instead of "wget tarball, extract it, apply
> patch, compile")?
I will have to work on the TODO list; I will post it in the repo itself.
Downloading the tarball currently takes around 15-16 seconds, which I
found fast enough to start with. I personally always use the standard patch
command, which is why I chose this approach instead of hg. We can always
improve the workflow :)


> > I will be working on a minimal linter for patches, and will include it
> > in the workflow.
> 
> Could you give more details about the linter? Can we use
> Tools/scripts/patchcheck.py?
I haven't really thought much about this yet. We should discuss it more to
find out what we can do.

Kushal
-- 
Fedora Cloud Engineer
CPython Core Developer
Director @ Python Software Foundation
http://kushaldas.in

From wes.turner at gmail.com  Tue May 19 20:22:42 2015
From: wes.turner at gmail.com (Wes Turner)
Date: Tue, 19 May 2015 13:22:42 -0500
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <20150519150251.GJ20426@kdas-laptop>
References: <20150519150251.GJ20426@kdas-laptop>
Message-ID: <CACfEFw8B0PT9EXdQyAhSYyOUFe=EPtZnavopAyps-wmpsJUnHg@mail.gmail.com>

Cool! Thanks!

BuildBot integration?

* http://docs.buildbot.net/latest/full.html#source-stamps
* http://docs.buildbot.net/latest/full.html#enabling-the-irc-bot
* http://docs.buildbot.net/latest/full.html#choosing-a-change-source

https://github.com/audreyr/cookiecutter-pypackage (requirements.txt,
setup.py, Makefile)

   pip install cookiecutter
   cookiecutter gh:audreyr/cookiecutter-pypackage


On Tue, May 19, 2015 at 10:02 AM, Kushal Das <kushaldas at gmail.com> wrote:

> Hi,
>
> With the help of CentOS project I am happy to announce an automated
> system [1] to test patches from bugs.python.org. This can be fully
> automated
> to test the patches whenever someone uploads a patch in the roundup, but
> for now it accepts IRC commands on #python-dev channel. I worked on a
> docker based prototype during sprints in PyCon.
>
> How to use it?
> ---------------
>
> 1. Join #python-dev on irc.freenode.net.
> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
> 3. They will issue a simple command. #add: YOUR_NICK_NAME
> 4. You can then test by issuing the following command in the channel:
>
>     #test: BUGNUMBER
>     like #test: 21271
>
> This will do the following:
> Start a new job on ci.centos.org, announce it on the channel, and
> announce the result also.
>
> I will be working on a minimal linter for patches, and will include it
> in the workflow.
>
> The current steps can be found at [2]. Each build is happening on a
> fresh system.
>
> Limitations
> -----------
>
> 1. It expects a single patch to contain all the required changes
> (instead of a series of patches).
> 2. It runs only on x86_64 architecture, CentOS7 based systems.
>
>
>
> [1] https://ci.centos.org/job/cPython-build-patch/
> [2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh
>
> Kushal
> --
> Fedora Cloud Engineer
> CPython Core Developer
> Director @ Python Software Foundation
> http://kushaldas.in
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150519/1fe92b3e/attachment.html>

From brett at python.org  Tue May 19 17:25:26 2015
From: brett at python.org (Brett Cannon)
Date: Tue, 19 May 2015 15:25:26 +0000
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <20150519150251.GJ20426@kdas-laptop>
References: <20150519150251.GJ20426@kdas-laptop>
Message-ID: <CAP1=2W4sbjavM5R1BaHnkVg1nKQxeMv8nLybxru1i9-R1N68mA@mail.gmail.com>

In the airport but I wanted to say thanks for this!

On Tue, May 19, 2015, 11:03 Kushal Das <kushaldas at gmail.com> wrote:

> Hi,
>
> With the help of CentOS project I am happy to announce an automated
> system [1] to test patches from bugs.python.org. This can be fully
> automated
> to test the patches whenever someone uploads a patch in the roundup, but
> for now it accepts IRC commands on #python-dev channel. I worked on a
> docker based prototype during sprints in PyCon.
>
> How to use it?
> ---------------
>
> 1. Join #python-dev on irc.freenode.net.
> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
> 3. They will issue a simple command. #add: YOUR_NICK_NAME
> 4. You can then test by issuing the following command in the channel:
>
>     #test: BUGNUMBER
>     like #test: 21271
>
> This will do the following:
> Start a new job on ci.centos.org, announce it on the channel, and
> announce the result also.
>
> I will be working on a minimal linter for patches, and will include it
> in the workflow.
>
> The current steps can be found at [2]. Each build is happening on a
> fresh system.
>
> Limitations
> -----------
>
> 1. It expects a single patch to contain all the required changes
> (instead of a series of patches).
> 2. It runs only on x86_64 architecture, CentOS7 based systems.
>
>
>
> [1] https://ci.centos.org/job/cPython-build-patch/
> [2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh
>
> Kushal
> --
> Fedora Cloud Engineer
> CPython Core Developer
> Director @ Python Software Foundation
> http://kushaldas.in
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/brett%40python.org
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150519/6845a7a5/attachment.html>

From wes.turner at gmail.com  Tue May 19 20:25:19 2015
From: wes.turner at gmail.com (Wes Turner)
Date: Tue, 19 May 2015 13:25:19 -0500
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <CACfEFw8B0PT9EXdQyAhSYyOUFe=EPtZnavopAyps-wmpsJUnHg@mail.gmail.com>
References: <20150519150251.GJ20426@kdas-laptop>
 <CACfEFw8B0PT9EXdQyAhSYyOUFe=EPtZnavopAyps-wmpsJUnHg@mail.gmail.com>
Message-ID: <CACfEFw8vrbhrQ-nwQgTrioWgqUDtDhALWLUFVRHYF300YcRhVw@mail.gmail.com>

http://docs.buildbot.net/latest/search.html?q=docker ...

Here's a BuildBot Dockerfile:
http://docs.buildbot.net/latest/manual/cfg-buildslaves-docker.html#image-creation

On Tue, May 19, 2015 at 1:22 PM, Wes Turner <wes.turner at gmail.com> wrote:

> Cool! Thanks!
>
> BuildBot integration?
>
> * http://docs.buildbot.net/latest/full.html#source-stamps
> * http://docs.buildbot.net/latest/full.html#enabling-the-irc-bot
> * http://docs.buildbot.net/latest/full.html#choosing-a-change-source
>
> https://github.com/audreyr/cookiecutter-pypackage (requirements.txt,
> setup.py, Makefile)
>
>    pip install cookiecutter
>    cookiecutter gh:audreyr/cookiecutter-pypackage
>
>
> On Tue, May 19, 2015 at 10:02 AM, Kushal Das <kushaldas at gmail.com> wrote:
>
>> Hi,
>>
>> With the help of CentOS project I am happy to announce an automated
>> system [1] to test patches from bugs.python.org. This can be fully
>> automated
>> to test the patches whenever someone uploads a patch in the roundup, but
>> for now it accepts IRC commands on #python-dev channel. I worked on a
>> docker based prototype during sprints in PyCon.
>>
>> How to use it?
>> ---------------
>>
>> 1. Join #python-dev on irc.freenode.net.
>> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
>> 3. They will issue a simple command. #add: YOUR_NICK_NAME
>> 4. You can then test by issuing the following command in the channel:
>>
>>     #test: BUGNUMBER
>>     like #test: 21271
>>
>> This will do the following:
>> Start a new job on ci.centos.org, announce it on the channel, and
>> announce the result also.
>>
>> I will be working on a minimal lint for patches, and will include it
>> in the workflow.
>>
>> The current steps can be found at [2]. Each build is happening on a
>> fresh system.
>>
>> Limitations
>> -----------
>>
>> 1. It expects a single patch to contain all the required changes
>> (instead of a series of patches).
>> 2. It runs only on x86_64 architecture, CentOS7 based systems.
>>
>>
>>
>> [1] https://ci.centos.org/job/cPython-build-patch/
>> [2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh
>>
>> Kushal
>> --
>> Fedora Cloud Engineer
>> CPython Core Developer
>> Director @ Python Software Foundation
>> http://kushaldas.in
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>> https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com
>>
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150519/6a0a5fc3/attachment-0001.html>

From wes.turner at gmail.com  Tue May 19 20:27:32 2015
From: wes.turner at gmail.com (Wes Turner)
Date: Tue, 19 May 2015 13:27:32 -0500
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <CACfEFw8vrbhrQ-nwQgTrioWgqUDtDhALWLUFVRHYF300YcRhVw@mail.gmail.com>
References: <20150519150251.GJ20426@kdas-laptop>
 <CACfEFw8B0PT9EXdQyAhSYyOUFe=EPtZnavopAyps-wmpsJUnHg@mail.gmail.com>
 <CACfEFw8vrbhrQ-nwQgTrioWgqUDtDhALWLUFVRHYF300YcRhVw@mail.gmail.com>
Message-ID: <CACfEFw-x7iV0r5COJvaRk-N9K=dMyspi-PJBqYoxu_W1WvGCqQ@mail.gmail.com>

Here's a GerritChangeSource for BuildBot events:
http://docs.buildbot.net/latest/full.html#chsrc-GerritChangeSource

Great idea, thanks again!

On Tue, May 19, 2015 at 1:25 PM, Wes Turner <wes.turner at gmail.com> wrote:

> http://docs.buildbot.net/latest/search.html?q=docker ...
>
> Here's a BuildBot Dockerfile:
> http://docs.buildbot.net/latest/manual/cfg-buildslaves-docker.html#image-creation
>
> On Tue, May 19, 2015 at 1:22 PM, Wes Turner <wes.turner at gmail.com> wrote:
>
>> Cool! Thanks!
>>
>> BuildBot integration?
>>
>> * http://docs.buildbot.net/latest/full.html#source-stamps
>> * http://docs.buildbot.net/latest/full.html#enabling-the-irc-bot
>> * http://docs.buildbot.net/latest/full.html#choosing-a-change-source
>>
>> https://github.com/audreyr/cookiecutter-pypackage (requirements.txt,
>> setup.py, Makefile)
>>
>>    pip install cookiecutter
>>    cookiecutter gh:audreyr/cookiecutter-pypackage
>>
>>
>> On Tue, May 19, 2015 at 10:02 AM, Kushal Das <kushaldas at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> With the help of CentOS project I am happy to announce an automated
>>> system [1] to test patches from bugs.python.org. This can be fully
>>> automated
>>> to test the patches whenever someone uploads a patch in the roundup, but
>>> for now it accepts IRC commands on #python-dev channel. I worked on a
>>> docker based prototype during sprints in PyCon.
>>>
>>> How to use it?
>>> ---------------
>>>
>>> 1. Join #python-dev on irc.freenode.net.
>>> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
>>> 3. They will issue a simple command. #add: YOUR_NICK_NAME
>>> 4. You can then test by issuing the following command in the channel:
>>>
>>>     #test: BUGNUMBER
>>>     like #test: 21271
>>>
>>> This will do the following:
>>> Start a new job on ci.centos.org, announce it on the channel, and
>>> announce the result also.
>>>
>>> I will be working on a minimal lint for patches, and will include it
>>> in the workflow.
>>>
>>> The current steps can be found at [2]. Each build is happening on a
>>> fresh system.
>>>
>>> Limitations
>>> -----------
>>>
>>> 1. It expects a single patch to contain all the required changes
>>> (instead of a series of patches).
>>> 2. It runs only on x86_64 architecture, CentOS7 based systems.
>>>
>>>
>>>
>>> [1] https://ci.centos.org/job/cPython-build-patch/
>>> [2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh
>>>
>>> Kushal
>>> --
>>> Fedora Cloud Engineer
>>> CPython Core Developer
>>> Director @ Python Software Foundation
>>> http://kushaldas.in
>>> _______________________________________________
>>> Python-Dev mailing list
>>> Python-Dev at python.org
>>> https://mail.python.org/mailman/listinfo/python-dev
>>> Unsubscribe:
>>> https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com
>>>
>>
>>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150519/593211ee/attachment-0001.html>

From tjreedy at udel.edu  Tue May 19 22:37:54 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 19 May 2015 16:37:54 -0400
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <20150519150251.GJ20426@kdas-laptop>
References: <20150519150251.GJ20426@kdas-laptop>
Message-ID: <mjg6v7$m02$1@ger.gmane.org>

On 5/19/2015 11:02 AM, Kushal Das wrote:
> Hi,
>
> With the help of CentOS project I am happy to announce an automated
> system [1] to test patches from bugs.python.org. This can be fully automated
> to test the patches whenever someone uploads a patch in the roundup, but
> for now it accepts IRC commands on #python-dev channel. I worked on a
> docker based prototype during sprints in PyCon.
>
> How to use it?
> ---------------
>
> 1. Join #python-dev on irc.freenode.net.
> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
> 3. They will issue a simple command. #add: YOUR_NICK_NAME
> 4. You can then test by issuing the following command in the channel:
>
>      #test: BUGNUMBER
>      like #test: 21271

What if there are multiple patches on the issue?  Pick the latest?
This is not correct if someone follows up a patch with a 2.7 backport, 
or if there are competing patches.

> This will do the following:
> Start a new job on ci.centos.org, announce it on the channel, and
> announce the result also.
>
> I will be working on a minimal lint for patches, and will include it
> in the workflow.
>
> The current steps can be found at [2]. Each build is happening on a
> fresh system.
>
> Limitations
> -----------
>
> 1. It expects a single patch to contain all the required changes
> (instead of a series of patches).



> 2. It runs only on x86_64 architecture, CentOS7 based systems.
>
>
>
> [1] https://ci.centos.org/job/cPython-build-patch/
> [2] https://github.com/kushaldas/pypatcher/blob/master/pypatcher.sh
>
> Kushal
>


-- 
Terry Jan Reedy


From kevmod at gmail.com  Tue May 19 22:58:31 2015
From: kevmod at gmail.com (Kevin Modzelewski)
Date: Tue, 19 May 2015 13:58:31 -0700
Subject: [Python-Dev] Python-versus-CPython question for __mul__ dispatch
In-Reply-To: <CAMSv6X0XkLt9w-gPcxsm75wwsmBA0rLNfpLkuPz5jvK12aS+oQ@mail.gmail.com>
References: <CAPJVwBncV9jVQESapVd4s2fLN34-HatrCausgbSh7nr6GZvm+Q@mail.gmail.com>
 <trinity-60022aaa-c0b5-4b4d-b5cb-497088670776-1431657785648@3capp-gmx-bs34>
 <CAP7+vJ+hB7SefycT-U2M5ib01o-p5kCx0E-L6P-b_9iTEsuMbg@mail.gmail.com>
 <CAPJVwBm3G-zfeAqogjmsDs6NZdp7iLjVzo8_ijY9O-2-Z=AwZg@mail.gmail.com>
 <CAPJVwBkkVahf7kk50WJ1g9K1_qR-52jOX9+gEkLNkM06TA+_wQ@mail.gmail.com>
 <CADiSq7erx=p7S7kg75pdwOOEDNYu-NEATwwL6xP22ez_c6M1Jw@mail.gmail.com>
 <CAMSv6X0XkLt9w-gPcxsm75wwsmBA0rLNfpLkuPz5jvK12aS+oQ@mail.gmail.com>
Message-ID: <CALRAs_cgT5uwzmzoBfYOH3Gq+Ghcg5MioH5AKZuafb+Lfxor0A@mail.gmail.com>

We have had a similar experience -- Pyston runs into the same issue with
sqlalchemy (with "str() + foo" calling foo.__radd__ before str's
sq_concat), and we are working to match CPython's behavior.

On Tue, May 19, 2015 at 7:00 AM, Armin Rigo <arigo at tunes.org> wrote:

> Hi Nick,
>
> On 16 May 2015 at 10:31, Nick Coghlan <ncoghlan at gmail.com> wrote:
> > Oh, that's rather annoying that the PyPy team implemented bug-for-bug
> > compatibility there, and didn't follow up on the operand precedence
> > bug report to say that they had done so.
>
> It's sadly not the only place, by far, where a behavior of CPython
> could be considered an implementation detail, but people rely on it
> and so we need to write a workaround.  We don't report all of them,
> particularly not the ones that are clearly of the kind "won't be
> changed in CPython 2.7".  Maybe we should?
>
> Another example where this same bug occurs is:
>
>     class T(tuple):
>        def __radd__(self, other):
>           return 42
>
>     lst = [ ]
>     lst += T()
>
> which calls T.__radd__ in contradiction to all the general rules.
> (Yes, if you print(lst) afterwards, you get 42.  And oops, trying this
> out on PyPy does not give 42; only "lst + T()" does.  Probably another
> corner case to fix...)
>
>
> A bientôt,
>
> Armin.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/kevmod%40gmail.com
>
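For anyone wanting to verify, Armin's quoted example reproduces as
described on CPython (behavior as of this thread; the augmented
assignment consults the right operand's ``__radd__`` via the numeric
slots before ``list``'s in-place concatenation):

```python
# Reproduce the operator-dispatch quirk described above: on CPython,
# "lst += T()" binds the result of T.__radd__, not an extended list.
class T(tuple):
    def __radd__(self, other):
        return 42

lst = []
lst += T()          # numeric path wins: calls T.__radd__(lst)
assert lst == 42    # lst is now the int 42, not a list
```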
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150519/d2f553c8/attachment.html>

From techtonik at gmail.com  Wed May 20 09:44:13 2015
From: techtonik at gmail.com (anatoly techtonik)
Date: Wed, 20 May 2015 10:44:13 +0300
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <mjg6v7$m02$1@ger.gmane.org>
References: <20150519150251.GJ20426@kdas-laptop> <mjg6v7$m02$1@ger.gmane.org>
Message-ID: <CAPkN8xKn2EuttDkJ6kb6t_ey6OBQV7nQ_NsekCk_SuZ21BEXHg@mail.gmail.com>

On Tue, May 19, 2015 at 11:37 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> On 5/19/2015 11:02 AM, Kushal Das wrote:
>>
>> Hi,
>>
>> With the help of CentOS project I am happy to announce an automated
>> system [1] to test patches from bugs.python.org. This can be fully
>> automated
>> to test the patches whenever someone uploads a patch in the roundup, but
>> for now it accepts IRC commands on #python-dev channel. I worked on a
>> docker based prototype during sprints in PyCon.
>>
>> How to use it?
>> ---------------
>>
>> 1. Join #python-dev on irc.freenode.net.
>> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
>> 3. They will issue a simple command. #add: YOUR_NICK_NAME
>> 4. You can then test by issuing the following command in the channel:
>>
>>      #test: BUGNUMBER
>>      like #test: 21271
>
>
> What if there are multiple patches on the issue?  Pick the latest?
> This is not correct if someone follows up a patch with a 2.7 backport, or if
> there are competing patches.

Here is some code that checks how many outstanding patches a given
module has, by downloading all patches from open issues, parsing them,
and comparing the paths. The parser could be reused to check the paths
in a patch against the paths present in particular Python versions, or
to add other heuristics.

https://bitbucket.org/techtonik/python-stdlib

All this is pure Python and should work cross-platform too.


This was intended to add status information for bugs.python.org, but
the work on Roundup has stalled due to uncertainty and despair over how
to handle utf-8 (internal to Roundup) vs unicode (internal to Jinja2)
in this issue:
http://issues.roundup-tracker.org/issue2550811
The root of the problem is that Python 2.7 uses 'ascii' rather than
'utf-8' internally, so the Jinja2 engine fails with an "'ascii' codec
... ordinal not in range" error somewhere along the way. I need expert
advice on how to handle this.
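The usual remedy for this class of problem is to decode UTF-8 byte
strings to text once, at the boundary, before anything reaches the
template engine. A minimal Python 3 sketch of the principle (plain
``str.format`` stands in for Jinja2 here):

```python
# Roundup stores and emits UTF-8-encoded byte strings; template engines
# such as Jinja2 work on unicode text. Decode once at the boundary and
# never mix raw bytes into template rendering.
raw = b'caf\xc3\xa9'                 # UTF-8 bytes, as Roundup would hand them out
text = raw.decode('utf-8')           # boundary decode: bytes -> text
rendered = 'Issue title: {}'.format(text)
assert rendered == 'Issue title: caf\xe9'   # i.e. "Issue title: café"
```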

From guido at python.org  Wed May 20 17:29:52 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 20 May 2015 08:29:52 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- penultimate(?) draft
Message-ID: <CAP7+vJKQu3nmmsV5o0Lj9cOm6Z+epSY7+oZD6rMMsv+xt+SWYA@mail.gmail.com>

I'm happy to present a much updated PEP 484 for your review. It will
(hopefully) appear on python.org within the next hour. I'm also working on
an implementation (
https://github.com/ambv/typehinting/tree/master/prototyping) which I hope
will be good enough to include in beta 1, assuming the BDFL-Delegate (Mark
Shannon) approves.

--Guido

PEP: 484
Title: Type Hints
Version: $Revision$
Last-Modified: $Date$
Author: Guido van Rossum <guido at python.org>, Jukka Lehtosalo <
jukka.lehtosalo at iki.fi>, Łukasz Langa <lukasz at langa.pl>
BDFL-Delegate: Mark Shannon
Discussions-To: Python-Dev <python-dev at python.org>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 29-Sep-2014
Post-History: 16-Jan-2015,20-Mar-2015,17-Apr-2015,20-May-2015
Resolution:


Abstract
========

PEP 3107 introduced syntax for function annotations, but the semantics
were deliberately left undefined.  There has now been enough 3rd party
usage for static type analysis that the community would benefit from
a standard vocabulary and baseline tools within the standard library.

This PEP introduces a provisional module to provide these standard
definitions and tools, along with some conventions for situations
where annotations are not available.

Note that this PEP still explicitly does NOT prevent other uses of
annotations, nor does it require (or forbid) any particular processing
of annotations, even when they conform to this specification.  It
simply enables better coordination, as PEP 333 did for web frameworks.

For example, here is a simple function whose argument and return type
are declared in the annotations::

  def greeting(name: str) -> str:
      return 'Hello ' + name

While these annotations are available at runtime through the usual
``__annotations__`` attribute, *no type checking happens at runtime*.
Instead, the proposal assumes the existence of a separate off-line
type checker which users can run over their source code voluntarily.
Essentially, such a type checker acts as a very powerful linter.
(While it would of course be possible for individual users to employ
a similar checker at run time for Design By Contract enforcement or
JIT optimization, those tools are not yet as mature.)

The proposal is strongly inspired by mypy [mypy]_.  For example, the
type "sequence of integers" can be written as ``Sequence[int]``.  The
square brackets mean that no new syntax needs to be added to the
language.  The example here uses a custom type ``Sequence``, imported
from a pure-Python module ``typing``.  The ``Sequence[int]`` notation
works at runtime by implementing ``__getitem__()`` in the metaclass
(but its significance is primarily to an offline type checker).

The type system supports unions, generic types, and a special type
named ``Any`` which is consistent with (i.e. assignable to and from) all
types.  This latter feature is taken from the idea of gradual typing.
Gradual typing and the full type system are explained in PEP 483.

Other approaches from which we have borrowed or to which ours can be
compared and contrasted are described in PEP 482.


Rationale and Goals
===================

PEP 3107 added support for arbitrary annotations on parts of a
function definition.  Although no meaning was assigned to annotations
then, there has always been an implicit goal to use them for type
hinting [gvr-artima]_, which is listed as the first possible use case
in said PEP.

This PEP aims to provide a standard syntax for type annotations,
opening up Python code to easier static analysis and refactoring,
potential runtime type checking, and (perhaps, in some contexts)
code generation utilizing type information.

Of these goals, static analysis is the most important.  This includes
support for off-line type checkers such as mypy, as well as providing
a standard notation that can be used by IDEs for code completion and
refactoring.

Non-goals
---------

While the proposed typing module will contain some building blocks for
runtime type checking -- in particular a useful ``isinstance()``
implementation -- third party packages would have to be developed to
implement specific runtime type checking functionality, for example
using decorators or metaclasses.  Using type hints for performance
optimizations is left as an exercise for the reader.

It should also be emphasized that **Python will remain a dynamically
typed language, and the authors have no desire to ever make type hints
mandatory, even by convention.**


The meaning of annotations
==========================

Any function without annotations should be treated as having the most
general type possible, or ignored, by any type checker.  Functions
with the ``@no_type_check`` decorator or with a ``# type: ignore``
comment should be treated as having no annotations.

It is recommended but not required that checked functions have
annotations for all arguments and the return type.  For a checked
function, the default annotation for arguments and for the return type
is ``Any``.  An exception is that the first argument of instance and
class methods does not need to be annotated; it is assumed to have the
type of the containing class for instance methods, and ``type`` for
class methods.

(Note that the return type of ``__init__`` ought to be annotated with
``-> None``.  The reason for this is subtle.  If ``__init__`` assumed
a return annotation of ``-> None``, would that mean that an
argument-less, un-annotated ``__init__`` method should still be
type-checked?  Rather than leaving this ambiguous or introducing an
exception to the exception, we simply say that ``__init__`` ought to
have a return annotation; the default behavior is thus the same as for
other methods.)
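Concretely, the recommendation looks like this (the runtime check below
is for illustration only; the annotation's significance is to the type
checker, and nothing is enforced at runtime):

```python
class Point:
    # Per the PEP, __init__ ought to carry an explicit "-> None".
    def __init__(self, x: int, y: int) -> None:
        self.x = x
        self.y = y

p = Point(1, 2)
# The return annotation is stored at runtime but never enforced:
assert Point.__init__.__annotations__['return'] is None
assert p.x == 1 and p.y == 2
```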

A type checker is expected to check the body of a checked function for
consistency with the given annotations.  The annotations may also be used
to check correctness of calls appearing in other checked functions.

Type checkers are expected to attempt to infer as much information as
necessary.  The minimum requirement is to handle the builtin
decorators ``@property``, ``@staticmethod`` and ``@classmethod``.


Type Definition Syntax
======================

The syntax leverages PEP 3107-style annotations with a number of
extensions described in sections below.  In its basic form, type
hinting is used by filling function annotation slots with classes::

  def greeting(name: str) -> str:
      return 'Hello ' + name

This states that the expected type of the ``name`` argument is
``str``.  Similarly, the expected return type is ``str``.

Expressions whose type is a subtype of a specific argument type are
also accepted for that argument.


Acceptable type hints
---------------------

Type hints may be built-in classes (including those defined in
standard library or third-party extension modules), abstract base
classes, types available in the ``types`` module, and user-defined
classes (including those defined in the standard library or
third-party modules).

While annotations are normally the best format for type hints,
there are times when it is more appropriate to represent them
by a special comment, or in a separately distributed interface
file.  (See below for examples.)

Annotations must be valid expressions that evaluate without raising
exceptions at the time the function is defined (but see below for
forward references).

Annotations should be kept simple or static analysis tools may not be
able to interpret the values. For example, dynamically computed types
are unlikely to be understood.  (This is an
intentionally somewhat vague requirement, specific inclusions and
exclusions may be added to future versions of this PEP as warranted by
the discussion.)

In addition to the above, the following special constructs defined
below may be used: ``None``, ``Any``, ``Union``, ``Tuple``,
``Callable``, all ABCs and stand-ins for concrete classes exported
from ``typing`` (e.g. ``Sequence`` and ``Dict``), type variables, and
type aliases.

All newly introduced names used to support features described in
following sections (such as ``Any`` and ``Union``) are available in
the ``typing`` module.


Using None
----------

When used in a type hint, the expression ``None`` is considered
equivalent to ``type(None)``.


Type aliases
------------

Type aliases are defined by simple variable assignments::

  Url = str

  def retry(url: Url, retry_count: int) -> None: ...

Note that we recommend capitalizing alias names, since they represent
user-defined types, which (like user-defined classes) are typically
spelled that way.

Type aliases may be as complex as type hints in annotations --
anything that is acceptable as a type hint is acceptable in a type
alias::

    from typing import TypeVar, Iterable, Tuple

    T = TypeVar('T', int, float, complex)
    Vector = Iterable[Tuple[T, T]]

    def inproduct(v: Vector) -> T:
        return sum(x*y for x, y in v)

This is equivalent to::

    from typing import TypeVar, Iterable, Tuple

    T = TypeVar('T', int, float, complex)

    def inproduct(v: Iterable[Tuple[T, T]]) -> T:
        return sum(x*y for x, y in v)


Callable
--------

Frameworks expecting callback functions of specific signatures might be
type hinted using ``Callable[[Arg1Type, Arg2Type], ReturnType]``.
Examples::

  from typing import Callable

  def feeder(get_next_item: Callable[[], str]) -> None:
      # Body

  def async_query(on_success: Callable[[int], None],
                  on_error: Callable[[int, Exception], None]) -> None:
      # Body

It is possible to declare the return type of a callable without
specifying the call signature by substituting a literal ellipsis
(three dots) for the list of arguments::

  def partial(func: Callable[..., str], *args) -> Callable[..., str]:
      # Body

Note that there are no square brackets around the ellipsis.  The
arguments of the callback are completely unconstrained in this case
(and keyword arguments are acceptable).

Since using callbacks with keyword arguments is not perceived as a
common use case, there is currently no support for specifying keyword
arguments with ``Callable``.  Similarly, there is no support for
specifying callback signatures with a variable number of argument of a
specific type.


Generics
--------

Since type information about objects kept in containers cannot be
statically inferred in a generic way, abstract base classes have been
extended to support subscription to denote expected types for container
elements.  Example::

  from typing import Mapping, Set

  def notify_by_email(employees: Set[Employee],
                      overrides: Mapping[str, str]) -> None: ...

Generics can be parametrized by using a new factory available in
``typing`` called ``TypeVar``.  Example::

  from typing import Sequence, TypeVar

  T = TypeVar('T')      # Declare type variable

  def first(l: Sequence[T]) -> T:   # Generic function
      return l[0]

In this case the contract is that the returned value is consistent with
the elements held by the collection.

``TypeVar`` supports constraining parametric types to a fixed set of
possible types.  For example, we can define a type variable that ranges
over just ``str`` and ``bytes``.  By default, a type variable ranges
over all possible types.  Example of constraining a type variable::

  from typing import TypeVar

  AnyStr = TypeVar('AnyStr', str, bytes)

  def concat(x: AnyStr, y: AnyStr) -> AnyStr:
      return x + y

The function ``concat`` can be called with either two ``str`` arguments
or two ``bytes`` arguments, but not with a mix of ``str`` and ``bytes``
arguments.

There should be at least two constraints, if any; specifying a single
constraint is disallowed.
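The ``typing`` module enforces this rule at ``TypeVar`` creation time,
independently of any type checker:

```python
from typing import TypeVar

Ok = TypeVar('Ok', str, bytes)   # two constraints: accepted

raised = False
try:
    Bad = TypeVar('Bad', int)    # a single constraint is rejected
except TypeError:
    raised = True
assert raised
```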

Subtypes of types constrained by a type variable should be treated
as their respective explicitly listed base types in the context of the
type variable.  Consider this example::

  class MyStr(str): ...

  x = concat(MyStr('apple'), MyStr('pie'))

The call is valid but the type variable ``AnyStr`` will be set to
``str`` and not ``MyStr``. In effect, the inferred type of the return
value assigned to ``x`` will also be ``str``.

Additionally, ``Any`` is a valid value for every type variable.
Consider the following::

  def count_truthy(elements: List[Any]) -> int:
      return sum(1 for elem in elements if elem)

This is equivalent to omitting the generic notation and just saying
``elements: List``.


User-defined generic types
--------------------------

You can include a ``Generic`` base class to define a user-defined class
as generic.  Example::

  from typing import TypeVar, Generic

  T = TypeVar('T')

  class LoggedVar(Generic[T]):
      def __init__(self, value: T, name: str, logger: Logger) -> None:
          self.name = name
          self.logger = logger
          self.value = value

      def set(self, new: T) -> None:
          self.log('Set ' + repr(self.value))
          self.value = new

      def get(self) -> T:
          self.log('Get ' + repr(self.value))
          return self.value

      def log(self, message: str) -> None:
          self.logger.info('{}: {}'.format(self.name, message))

``Generic[T]`` as a base class defines that the class ``LoggedVar``
takes a single type parameter ``T``. This also makes ``T`` valid as
a type within the class body.

The ``Generic`` base class uses a metaclass that defines ``__getitem__``
so that ``LoggedVar[t]`` is valid as a type::

  from typing import Iterable

  def zero_all_vars(vars: Iterable[LoggedVar[int]]) -> None:
      for var in vars:
          var.set(0)

A generic type can have any number of type variables, and type variables
may be constrained. This is valid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')
  S = TypeVar('S')

  class Pair(Generic[T, S]):
      ...

Each type variable argument to ``Generic`` must be distinct. This is
thus invalid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')

  class Pair(Generic[T, T]):   # INVALID
      ...

You can use multiple inheritance with ``Generic``::

  from typing import TypeVar, Generic, Sized

  T = TypeVar('T')

  class LinkedList(Sized, Generic[T]):
      ...

Subclassing a generic class without specifying type parameters assumes
``Any`` for each position.  In the following example, ``MyIterable``
is not generic but implicitly inherits from ``Iterable[Any]``::

  from typing import Iterable

  class MyIterable(Iterable):  # Same as Iterable[Any]
      ...


Instantiating generic classes and type erasure
----------------------------------------------

Generic types like ``List`` or ``Sequence`` cannot be instantiated.
However, user-defined classes derived from them can be instantiated.
Given a generic class ``Node[T]`` there are three forms of
instantiation:

* ``x = Node()`` -- the type of x is ``Node[Any]``.

* ``x = Node[T]()`` -- the type of x is ``Node[T]``.

* ``x = Node[int]()`` -- the type of x is ``Node[int]``.

At runtime the type is not preserved, and the observable type of x is
just ``Node``.  This is type erasure and common practice in languages
with generics (e.g. Java, TypeScript).
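A quick runtime check of the erasure, using a minimal hypothetical
``Node`` class:

```python
from typing import TypeVar, Generic

T = TypeVar('T')

class Node(Generic[T]):
    pass

x = Node[int]()           # parameterized instantiation...
assert type(x) is Node    # ...but the [int] is erased at runtime
```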



Arbitrary generic types as base classes
---------------------------------------

``Generic[T]`` is only valid as a base class -- it's not a proper type.
However, user-defined generic types such as ``LinkedList[T]`` from the
above example and built-in generic types and ABCs such as ``List[T]``
and ``Iterable[T]`` are valid both as types and as base classes. For
example, we can define a subclass of ``Dict`` that specializes type
arguments::

  from typing import Dict, List, Optional

  class Node:
      ...

  class SymbolTable(Dict[str, List[Node]]):
      def push(self, name: str, node: Node) -> None:
          self.setdefault(name, []).append(node)

      def pop(self, name: str) -> Node:
          return self[name].pop()

      def lookup(self, name: str) -> Optional[Node]:
          nodes = self.get(name)
          if nodes:
              return nodes[-1]
          return None

``SymbolTable`` is a subclass of ``dict`` and a subtype of ``Dict[str,
List[Node]]``.
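At runtime the specialized base behaves exactly like ``dict``;
restating the class from above so the check is self-contained:

```python
from typing import Dict, List, Optional

class Node:
    pass

class SymbolTable(Dict[str, List[Node]]):
    def push(self, name: str, node: Node) -> None:
        self.setdefault(name, []).append(node)

    def pop(self, name: str) -> Node:
        return self[name].pop()

    def lookup(self, name: str) -> Optional[Node]:
        nodes = self.get(name)
        if nodes:
            return nodes[-1]
        return None

st = SymbolTable()
n = Node()
st.push('x', n)
assert isinstance(st, dict)       # a real dict subclass at runtime
assert st.lookup('x') is n
assert st.pop('x') is n
assert st.lookup('x') is None     # popped entries are gone
```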

If a generic base class has a type variable as a type argument, this
makes the defined class generic. For example, we can define a generic
``LinkedList`` class that is iterable and a container::

  from typing import TypeVar, Iterable, Container

  T = TypeVar('T')

  class LinkedList(Iterable[T], Container[T]):
      ...

Now ``LinkedList[int]`` is a valid type. Note that we can use ``T``
multiple times in the base class list, as long as we don't use the
same type variable ``T`` multiple times within ``Generic[...]``.

Also consider the following example::

  from typing import TypeVar, Mapping

  T = TypeVar('T')

  class MyDict(Mapping[str, T]):
      ...

In this case ``MyDict`` has a single type parameter, ``T``.


Abstract generic types
----------------------

The metaclass used by ``Generic`` is a subclass of ``abc.ABCMeta``.
A generic class can be an ABC by including abstract methods
or properties, and generic classes can also have ABCs as base
classes without a metaclass conflict.
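A sketch of combining the two. (Note: the PEP describes the 3.5-era
``GenericMeta``; in current Python, ``Generic`` no longer carries
``ABCMeta``, so this sketch mixes in ``abc.ABC`` explicitly. The point
being illustrated, that generics and ABCs combine without a metaclass
conflict, still holds.)

```python
from abc import ABC, abstractmethod
from typing import TypeVar, Generic

T = TypeVar('T')

# A generic abstract base class: no metaclass conflict.
class Sink(Generic[T], ABC):
    @abstractmethod
    def put(self, item: T) -> None: ...

raised = False
try:
    Sink()                  # abstract: cannot be instantiated
except TypeError:
    raised = True
assert raised

class ListSink(Sink[int]):
    def __init__(self) -> None:
        self.items: list = []
    def put(self, item: int) -> None:
        self.items.append(item)

s = ListSink()
s.put(3)
assert s.items == [3]
```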


Type variables with an upper bound
----------------------------------

A type variable may specify an upper bound using ``bound=<type>``.
This means that an actual type substituted (explicitly or implicitly)
for the type variable must be a subclass of the boundary type.  A
common example is the definition of a Comparable type that works well
enough to catch the most common errors::

  from abc import ABCMeta, abstractmethod
  from typing import Any, TypeVar

  class Comparable(metaclass=ABCMeta):
      @abstractmethod
      def __lt__(self, other: Any) -> bool: ...
      ... # __gt__ etc. as well

  CT = TypeVar('CT', bound=Comparable)

  def min(x: CT, y: CT) -> CT:
      if x < y:
          return x
      else:
          return y

  min(1, 2) # ok, return type int
  min('x', 'y') # ok, return type str

(Note that this is not ideal -- for example ``min('x', 1)`` is invalid
at runtime but a type checker would simply infer the return type
``Comparable``.  Unfortunately, addressing this would require
introducing a much more powerful and also much more complicated
concept, F-bounded polymorphism.  We may revisit this in the future.)

An upper bound cannot be combined with type constraints (as used
for ``AnyStr``, see the example earlier); type constraints cause the
inferred type to be _exactly_ one of the constraint types, while an
upper bound just requires that the actual type is a subclass of the
boundary type.
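To make the contrast concrete, here is a small sketch showing both
forms side by side; ``concat`` and the bare ``Comparable`` stand-in are
illustrative names, not part of this PEP::

```python
from typing import TypeVar

# Constrained: a checker infers exactly str or exactly bytes,
# never a user-defined subclass of either.
AnyStr = TypeVar('AnyStr', str, bytes)

def concat(x: AnyStr, y: AnyStr) -> AnyStr:
    return x + y

# Bounded: any subclass of the bound is acceptable, and the
# inferred type is the actual (possibly more derived) type.
class Comparable:
    def __lt__(self, other): ...

CT = TypeVar('CT', bound=Comparable)
```

At runtime both kinds of type variable behave identically; only a
static checker treats them differently.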


Covariance and contravariance
-----------------------------

Consider a class ``Employee`` with a subclass ``Manager``.  Now
suppose we have a function with an argument annotated with
``List[Employee]``.  Should we be allowed to call this function with a
variable of type ``List[Manager]`` as its argument?  Many people would
answer "yes, of course" without even considering the consequences.
But unless we know more about the function, a type checker should
reject such a call: the function might append an ``Employee`` instance
to the list, which would violate the variable's type in the caller.

It turns out such an argument acts _contravariantly_, whereas the
intuitive answer (which is correct in case the function doesn't mutate
its argument!) requires the argument to act _covariantly_.  A longer
introduction to these concepts can be found on Wikipedia
[wiki-variance]_; here we just show how to control a type checker's
behavior.

By default type variables are considered _invariant_, which means that
arguments for parameters annotated with types like ``List[Employee]``
must exactly match the type annotation -- no subclasses or
superclasses of the type parameter (in this example ``Employee``) are
allowed.
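For example (a sketch; ``process`` is an illustrative name), a checker
treats ``List[Manager]`` as unrelated to ``List[Employee]`` for the
purposes of such a call::

```python
from typing import List

class Employee: ...
class Manager(Employee): ...

def process(emps: List[Employee]) -> None:
    emps.append(Employee())  # legal for a List[Employee]

emps = [Employee()]  # type: List[Employee]
process(emps)        # OK: the types match exactly

mgrs = [Manager()]   # type: List[Manager]
# process(mgrs)      # rejected: List is invariant, and the appended
#                    # Employee would violate mgrs' declared type
```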

To facilitate the declaration of container types where covariant type
checking is acceptable, a type variable can be declared using
``covariant=True``.  For the (rare) case where contravariant behavior
is desirable, pass ``contravariant=True``.  At most one of these may
be passed.

A typical example involves defining an immutable container class::

  from typing import TypeVar, Generic

  T = TypeVar('T', covariant=True)

  class ImmutableList(Generic[T]):
      def append(self, item: T) -> None: ...
      ...

  class Employee: ...

  class Manager(Employee): ...

  def dump_employees(emps: ImmutableList[Employee]) -> None: ...

  mgrs = ...  # type: ImmutableList[Manager]
  mgrs.append(Manager())

  dump_employees(mgrs)  # OK

The immutable collection classes in ``typing`` are all defined using a
covariant type variable (e.g. ``Mapping`` and ``Sequence``).  The
mutable collection classes (e.g. ``MutableMapping`` and
``MutableSequence``) are defined using regular invariant type
variables.  The one example of a contravariant type variable is the
``Generator`` type, which is contravariant in the ``send()`` argument
type (see below).
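A sketch of a generator annotated with all three type parameters (the
``echo_round`` name and behavior are illustrative only)::

```python
from typing import Generator

def echo_round() -> Generator[int, float, str]:
    # yields ints, accepts floats via send(), and returns a str
    sent = yield 0
    while sent >= 0:
        sent = yield round(sent)
    return 'Done'
```

At runtime the annotation changes nothing; the three parameters only
tell a checker what ``next()``, ``send()`` and the ``StopIteration``
value carry.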

Note: variance affects type parameters for generic types -- it does
not affect regular parameters.  For example, the following example is
fine::

  from typing import TypeVar

  class Employee: ...

  class Manager(Employee): ...

  E = TypeVar('E', bound=Employee)  # Invariant

  def dump_employee(e: E) -> None: ...

  dump_employee(Manager())  # OK


The numeric tower
-----------------

PEP 3141 defines Python's numeric tower, and the stdlib module
``numbers`` implements the corresponding ABCs (``Number``,
``Complex``, ``Real``, ``Rational`` and ``Integral``).  There are some
issues with these ABCs, but the built-in concrete numeric classes
``complex``, ``float`` and ``int`` are ubiquitous (especially the
latter two :-).

Rather than requiring that users write ``import numbers`` and then use
``numbers.Float`` etc., this PEP proposes a straightforward shortcut
that is almost as effective: when an argument is annotated as having
type ``float``, an argument of type ``int`` is acceptable; similarly,
for an argument annotated as having type ``complex``, arguments of
type ``float`` or ``int`` are acceptable.  This does not handle
classes implementing the corresponding ABCs or the
``fractions.Fraction`` class, but we believe those use cases are
exceedingly rare.
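For instance, under this rule a checker accepts ``int`` arguments
wherever ``float`` is annotated (``scale`` is an illustrative name)::

```python
def scale(x: float, factor: float = 2.0) -> float:
    return x * factor

scale(3)       # int accepted where float is annotated
scale(1.5, 3)  # likewise for the second argument
```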


The bytes types
---------------

There are three different builtin classes used for arrays of bytes
(not counting the classes available in the ``array`` module):
``bytes``, ``bytearray`` and ``memoryview``.  Of these, ``bytes`` and
``bytearray`` have many behaviors in common (though not all --
``bytearray`` is mutable).

While there is an ABC ``ByteString`` defined in ``collections.abc``
and a corresponding type in ``typing``, functions accepting bytes (of
some form) are so common that it would be cumbersome to have to write
``typing.ByteString`` everywhere.  So, as a shortcut similar to that
for the builtin numeric classes, when an argument is annotated as
having type ``bytes``, arguments of type ``bytearray`` or
``memoryview`` are acceptable.  (Again, there are situations where
this isn't sound, but we believe those are exceedingly rare in
practice.)
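Under this shortcut, a function annotated as taking ``bytes`` can be
called with any of the three classes.  In this sketch the runtime
normalization via ``bytes()`` is an implementation choice of the
example, not something the annotation performs::

```python
def hexdump(data: bytes) -> str:
    # bytes() copies bytearray and memoryview inputs at runtime
    return bytes(data).hex()

hexdump(b'\x01\xff')              # bytes
hexdump(bytearray(b'\x01\xff'))   # bytearray: accepted per shortcut
hexdump(memoryview(b'\x01\xff'))  # memoryview: likewise
```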


Forward references
------------------

When a type hint contains names that have not been defined yet, that
definition may be expressed as a string literal, to be resolved later.

A situation where this occurs commonly is the definition of a
container class, where the class being defined occurs in the signature
of some of the methods.  For example, the following code (the start of
a simple binary tree implementation) does not work::

  class Tree:
      def __init__(self, left: Tree, right: Tree):
          self.left = left
          self.right = right

To address this, we write::

  class Tree:
      def __init__(self, left: 'Tree', right: 'Tree'):
          self.left = left
          self.right = right

The string literal should contain a valid Python expression (i.e.,
``compile(lit, '', 'eval')`` should be a valid code object) and it
should evaluate without errors once the module has been fully loaded.
The local and global namespace in which it is evaluated should be the
same namespaces in which default arguments to the same function would
be evaluated.

Moreover, the expression should be parseable as a valid type hint, i.e.,
it is constrained by the rules from the section `Acceptable type hints`_
above.

It is allowable to use string literals as *part* of a type hint, for
example::

    class Tree:
        ...
        def leaves(self) -> List['Tree']:
            ...

A common use for forward references is when e.g. Django models are
needed in the signatures.  Typically, each model is in a separate
file, and has methods with arguments whose types involve other models.
Because of the way circular imports work in Python, it is often not
possible to import all the needed models directly::

    # File models/a.py
    from models.b import B
    class A(Model):
        def foo(self, b: B): ...

    # File models/b.py
    from models.a import A
    class B(Model):
        def bar(self, a: A): ...

    # File main.py
    from a import A
    from b import B

Assuming main is imported first, this will fail with an ImportError at
the line ``from models.a import A`` in models/b.py, which is being
imported from models/a.py before module ``a`` has defined class ``A``.
The solution is to switch to module-only imports and reference the
models by their _module_._class_ name::

    # File models/a.py
    from models import b
    class A(Model):
        def foo(self, b: 'b.B'): ...

    # File models/b.py
    from models import a
    class B(Model):
        def bar(self, a: 'a.A'): ...

    # File main.py
    from a import A
    from b import B


Union types
-----------

Since accepting a small, limited set of expected types for a single
argument is common, there is a new special factory called ``Union``.
Example::

  from typing import Union

  def handle_employees(e: Union[Employee, Sequence[Employee]]) -> None:
      if isinstance(e, Employee):
          e = [e]
      ...

A type formed by ``Union[T1, T2, ...]`` responds ``True`` to
``issubclass`` checks for ``T1`` and any of its subtypes, ``T2`` and
any of its subtypes, and so on.

One common case of union types is the *optional* type.  By default,
``None`` is an invalid value for any type, unless a default value of
``None`` has been provided in the function definition.  Examples::

  def handle_employee(e: Union[Employee, None]) -> None: ...

As a shorthand for ``Union[T1, None]`` you can write ``Optional[T1]``;
for example, the above is equivalent to::

  from typing import Optional

  def handle_employee(e: Optional[Employee]) -> None: ...

An optional type is also automatically assumed when the default value is
``None``, for example::

  def handle_employee(e: Employee = None): ...

This is equivalent to::

  def handle_employee(e: Optional[Employee] = None) -> None: ...

The ``Any`` type
----------------

A special kind of type is ``Any``.  Every type is a subtype of
``Any``.  This is also true for the builtin type ``object``.
However, to the static type checker these are completely different.

When the type of a value is ``object``, the type checker will reject
almost all operations on it, and assigning it to a variable (or using
it as a return value) of a more specialized type is a type error.  On
the other hand, when a value has type ``Any``, the type checker will
allow all operations on it, and a value of type ``Any`` can be assigned
to a variable (or used as a return value) of a more constrained type.
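A sketch of the asymmetry (the function names are illustrative; the
commented-out line is the kind of operation a checker would reject)::

```python
from typing import Any

def from_object(o: object) -> str:
    # o.upper()  # a checker rejects this: object has no .upper()
    return str(o)

def from_any(a: Any) -> str:
    s = a  # type: str
    # a value of type Any may be assigned to a str variable, and
    # every operation on it is allowed statically
    return s.upper()
```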


Predefined constants
--------------------

Some predefined Boolean constants are defined in the ``typing``
module to enable platform-specific type definitions and such::

  from typing import PY2, PY3, WINDOWS, POSIX

  if PY2:
      text = unicode
  else:
      text = str

  def f() -> text: ...

  if WINDOWS:
      loop = ProactorEventLoop
  else:
      loop = UnixSelectorEventLoop

It is up to the type checker implementation to define their values, as
long as ``PY2 == not PY3`` and ``WINDOWS == not POSIX``.  When the
program is being executed these always reflect the current platform,
and this is also the suggested default when the program is being
type-checked.


Default argument values
-----------------------

In stubs it may be useful to declare an argument as having a default
without specifying the actual default value.  For example::

  def foo(x: AnyStr, y: AnyStr = ...) -> AnyStr: ...

What should the default value look like?  Any of the options ``""``,
``b""`` or ``None`` fails to satisfy the type constraint (actually,
``None`` will *modify* the type to become ``Optional[AnyStr]``).

In such cases the default value may be specified as a literal
ellipsis, i.e. the above example is literally what you would write.


Compatibility with other uses of function annotations
=====================================================

A number of existing or potential use cases for function annotations
exist, which are incompatible with type hinting.  These may confuse
a static type checker.  However, since type hinting annotations have no
runtime behavior (other than evaluation of the annotation expression and
storing annotations in the ``__annotations__`` attribute of the function
object), this does not make the program incorrect -- it just may cause
a type checker to emit spurious warnings or errors.

To mark portions of the program that should not be covered by type
hinting, you can use one or more of the following:

* a ``# type: ignore`` comment;

* a ``@no_type_check`` decorator on a class or function;

* a custom class or function decorator marked with
  ``@no_type_check_decorator``.

For more details see later sections.

In order for maximal compatibility with offline type checking it may
eventually be a good idea to change interfaces that rely on annotations
to switch to a different mechanism, for example a decorator.  In Python
3.5 there is no pressure to do this, however.  See also the longer
discussion under `Rejected alternatives`_ below.


Type comments
=============

No first-class syntax support for explicitly marking variables as being
of a specific type is added by this PEP.  To help with type inference in
complex cases, a comment of the following format may be used::

  x = []   # type: List[Employee]
  x, y, z = [], [], []  # type: List[int], List[int], List[str]
  x, y, z = [], [], []  # type: (List[int], List[int], List[str])
  x = [
     1,
     2,
  ]  # type: List[int]

Type comments should be put on the last line of the statement that
contains the variable definition. They can also be placed on
``with`` statements and ``for`` statements, right after the colon.

Examples of type comments on ``with`` and ``for`` statements::

  with frobnicate() as foo:  # type: int
      # Here foo is an int
      ...

  for x, y in points:  # type: float, float
      # Here x and y are floats
      ...

In stubs it may be useful to declare the existence of a variable
without giving it an initial value.  This can be done using a literal
ellipsis::

  from typing import IO

  stream = ...  # type: IO[str]

In non-stub code, there is a similar special case::

  from typing import IO

  stream = None  # type: IO[str]

Type checkers should not complain about this (despite the value
``None`` not matching the given type), nor should they change the
inferred type to ``Optional[...]`` (despite the rule that does this
for annotated arguments with a default value of ``None``).  The
assumption here is that other code will ensure that the variable is
given a value of the proper type, and all uses can assume that the
variable has the given type.

The ``# type: ignore`` comment should be put on the line that the
error refers to::

  import http.client
  errors = {
      'not_found': http.client.NOT_FOUND  # type: ignore
  }

A ``# type: ignore`` comment on a line by itself disables all type
checking for the rest of the file.

If type hinting proves useful in general, a syntax for typing variables
may be provided in a future Python version.

Casts
=====

Occasionally the type checker may need a different kind of hint: the
programmer may know that an expression is of a more constrained type
than a type checker may be able to infer.  For example::

  from typing import List, cast

  def find_first_str(a: List[object]) -> str:
      index = next(i for i, x in enumerate(a) if isinstance(x, str))
      # We only get here if there's at least one string in a
      return cast(str, a[index])

Some type checkers may not be able to infer that the type of
``a[index]`` is ``str`` and only infer ``object`` or ``Any``, but we
know that (if the code gets to that point) it must be a string.  The
``cast(t, x)`` call tells the type checker that we are confident that
the type of ``x`` is ``t``.  At runtime a cast always returns the
expression unchanged -- it does not check the type, and it does not
convert or coerce the value.

Casts differ from type comments (see the previous section).  When using
a type comment, the type checker should still verify that the inferred
type is consistent with the stated type.  When using a cast, the type
checker should blindly believe the programmer.  Also, casts can be used
in expressions, while type comments only apply to assignments.
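The runtime no-op nature of ``cast()`` is easy to demonstrate::

```python
from typing import cast

value = cast(str, 42)
# no check, no conversion: the int comes back unchanged
```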


Stub Files
==========

Stub files are files containing type hints that are only for use by
the type checker, not at runtime.  There are several use cases for
stub files:

* Extension modules

* Third-party modules whose authors have not yet added type hints

* Standard library modules for which type hints have not yet been
  written

* Modules that must be compatible with Python 2 and 3

* Modules that use annotations for other purposes

Stub files have the same syntax as regular Python modules.  There is one
feature of the ``typing`` module that may only be used in stub files:
the ``@overload`` decorator described below.

The type checker should only check function signatures in stub files;
it is recommended that function bodies in stub files just be a single
ellipsis (``...``).

The type checker should have a configurable search path for stub files.
If a stub file is found the type checker should not read the
corresponding "real" module.

While stub files are syntactically valid Python modules, they use the
``.pyi`` extension to make it possible to maintain stub files in the
same directory as the corresponding real module.  This also reinforces
the notion that no runtime behavior should be expected of stub files.

Additional notes on stub files:

* Modules and variables imported into the stub are not considered
  exported from the stub unless the import uses the ``import ... as
  ...`` form.

Function overloading
--------------------

The ``@overload`` decorator allows describing functions that support
multiple different combinations of argument types.  This pattern is
used frequently in builtin modules and types.  For example, the
``__getitem__()`` method of the ``bytes`` type can be described as
follows::

  from typing import overload

  class bytes:
    ...
    @overload
    def __getitem__(self, i: int) -> int: ...
    @overload
    def __getitem__(self, s: slice) -> bytes: ...

This description is more precise than would be possible using unions
(which cannot express the relationship between the argument and return
types)::

  from typing import Union
  class bytes:
    ...
    def __getitem__(self, a: Union[int, slice]) -> Union[int, bytes]: ...

Another example where ``@overload`` comes in handy is the type of the
builtin ``map()`` function, which takes a different number of
arguments depending on the type of the callable::

  from typing import Callable, Iterable, Iterator, Tuple, TypeVar, overload

  T1 = TypeVar('T1')
  T2 = TypeVar('T2')
  S = TypeVar('S')

  @overload
  def map(func: Callable[[T1], S], iter1: Iterable[T1]) -> Iterator[S]: ...
  @overload
  def map(func: Callable[[T1, T2], S],
          iter1: Iterable[T1], iter2: Iterable[T2]) -> Iterator[S]: ...
  # ... and we could add more items to support more than two iterables

Note that we could also easily add items to support ``map(None, ...)``::

  @overload
  def map(func: None, iter1: Iterable[T1]) -> Iterable[T1]: ...
  @overload
  def map(func: None,
          iter1: Iterable[T1],
          iter2: Iterable[T2]) -> Iterable[Tuple[T1, T2]]: ...

The ``@overload`` decorator may only be used in stub files.  While it
would be possible to provide a multiple dispatch implementation using
this syntax, its implementation would require using
``sys._getframe()``, which is frowned upon.  Also, designing and
implementing an efficient multiple dispatch mechanism is hard, which
is why previous attempts were abandoned in favor of
``functools.singledispatch()``.  (See PEP 443, especially its section
"Alternative approaches".)  In the future we may come up with a
satisfactory multiple dispatch design, but we don't want such a design
to be constrained by the overloading syntax defined for type hints in
stub files.  In the meantime, using the ``@overload`` decorator or
calling ``overload()`` directly raises ``RuntimeError``.

Storing and distributing stub files
-----------------------------------

The easiest form of stub file storage and distribution is to put them
alongside Python modules in the same directory.  This makes them easy to
find by both programmers and the tools.  However, since package
maintainers are free not to add type hinting to their packages,
third-party stubs installable by ``pip`` from PyPI are also supported.
In this case we have to consider three issues: naming, versioning,
and installation path.

This PEP does not provide a recommendation on a naming scheme that
should be used for third-party stub file packages.  Discoverability will
hopefully be based on package popularity, like with Django packages for
example.

Third-party stubs have to be versioned using the lowest version of the
source package that is compatible.  Example: FooPackage has versions
1.0, 1.1, 1.2, 1.3, 2.0, 2.1, 2.2.  There are API changes in versions
1.1, 2.0 and 2.2.  The stub file package maintainer is free to release
stubs for all versions but at least 1.0, 1.1, 2.0 and 2.2 are needed
to enable the end user to type check all versions.  This is because the
user knows that the closest *lower or equal* version of stubs is
compatible.  In the provided example, for FooPackage 1.3 the user would
choose stubs version 1.1.

Note that if the user decides to use the "latest" available source
package, using the "latest" stub files should generally also work if
they're updated often.

Third-party stub packages can use any location for stub storage.  Type
checkers should search for them using PYTHONPATH.  A default fallback
directory that is always checked is ``shared/typehints/python3.5/`` (or
3.6, etc.).  Since there can only be one package installed for a given
Python version per environment, no additional versioning is performed
under that directory (just like bare directory installs by ``pip`` in
site-packages).  Stub file package authors might use the following
snippet in ``setup.py``::

  ...
  data_files=[
      (
          'shared/typehints/python{}.{}'.format(*sys.version_info[:2]),
          [str(p) for p in pathlib.Path(SRC_PATH).glob('**/*.pyi')],
      ),
  ],
  ...

The Typeshed Repo
-----------------

There is a shared repository where useful stubs are being collected
[typeshed]_.  Note that stubs for a given package will not be included
here without the explicit consent of the package owner.  Further
policies regarding the stubs collected here will be decided at a later
time, after discussion on python-dev, and reported in the typeshed
repo's README.


Exceptions
==========

No syntax for listing explicitly raised exceptions is proposed.
Currently the only known use case for this feature is documentational,
in which case the recommendation is to put this information in a
docstring.


The ``typing`` Module
=====================

To open the usage of static type checking to Python 3.5 as well as older
versions, a uniform namespace is required.  For this purpose, a new
module in the standard library is introduced called ``typing``.

It defines the fundamental building blocks for constructing types
(e.g. ``Any``), types representing generic variants of builtin
collections (e.g. ``List``), types representing generic
collection ABCs (e.g. ``Sequence``), and a small collection of
convenience definitions.

Fundamental building blocks:

* Any, used as ``def get(key: str) -> Any: ...``

* Union, used as ``Union[Type1, Type2, Type3]``

* Callable, used as ``Callable[[Arg1Type, Arg2Type], ReturnType]``

* Tuple, used by listing the element types, for example
  ``Tuple[int, int, str]``.
  Arbitrary-length homogeneous tuples can be expressed
  using one type and ellipsis, for example ``Tuple[int, ...]``.
  (The ``...`` here are part of the syntax, a literal ellipsis.)

* TypeVar, used as ``X = TypeVar('X', Type1, Type2, Type3)`` or simply
  ``Y = TypeVar('Y')`` (see above for more details)

Generic variants of builtin collections:

* Dict, used as ``Dict[key_type, value_type]``

* List, used as ``List[element_type]``

* Set, used as ``Set[element_type]``. See remark for ``AbstractSet``
  below.

* FrozenSet, used as ``FrozenSet[element_type]``

Note: ``Dict``, ``List``, ``Set`` and ``FrozenSet`` are mainly useful
for annotating return values.  For arguments, prefer the abstract
collection types defined below, e.g.  ``Mapping``, ``Sequence`` or
``AbstractSet``.

Generic variants of container ABCs (and a few non-containers):

* ByteString

* Callable (see above, listed here for completeness)

* Container

* Generator, used as ``Generator[yield_type, send_type,
  return_type]``.  This represents the return value of generator
  functions.  It is a subtype of ``Iterable`` and it has additional
  type variables for the type accepted by the ``send()`` method (which
  is contravariant -- a generator that accepts being sent ``Employee``
  instances is valid in a context that requires a generator accepting
  ``Manager`` instances) and the return type of the
  generator.

* Hashable (not generic, but present for completeness)

* ItemsView

* Iterable

* Iterator

* KeysView

* Mapping

* MappingView

* MutableMapping

* MutableSequence

* MutableSet

* Sequence

* Set, renamed to ``AbstractSet``. This name change was required
  because ``Set`` in the ``typing`` module means ``set()`` with
  generics.

* Sized (not generic, but present for completeness)

* ValuesView

A few one-off types are defined that test for single special methods
(similar to ``Hashable`` or ``Sized``):

* Reversible, to test for ``__reversed__``

* SupportsAbs, to test for ``__abs__``

* SupportsComplex, to test for ``__complex__``

* SupportsFloat, to test for ``__float__``

* SupportsInt, to test for ``__int__``

* SupportsRound, to test for ``__round__``

* SupportsBytes, to test for ``__bytes__``

Constants for platform-specific type hinting:

* PY2

* PY3, equivalent to ``not PY2``

* WINDOWS

* POSIX, equivalent to ``not WINDOWS``

Convenience definitions:

* AnyStr, defined as ``TypeVar('AnyStr', str, bytes)``

* NamedTuple, used as
  ``NamedTuple(type_name, [(field_name, field_type), ...])``
  and equivalent to
  ``collections.namedtuple(type_name, [field_name, ...])``.
  This is useful to declare the types of the fields of a named tuple
  type.

* cast(), described earlier

* @no_type_check, a decorator to disable type checking per class or
  function (see below)

* @no_type_check_decorator, a decorator to create your own decorators
  with the same meaning as ``@no_type_check`` (see below)

* @overload, described earlier

* get_type_hints(), a utility function to retrieve the type hints from a
  function or method.  Given a function or method object, it returns
  a dict with the same format as ``__annotations__``, but evaluating
  forward references (which are given as string literals) as expressions
  in the context of the original function or method definition.
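For example, ``get_type_hints()`` resolves the string-literal forward
reference from the ``Tree`` example above::

```python
from typing import List, get_type_hints

class Tree:
    def leaves(self) -> List['Tree']:
        return []

hints = get_type_hints(Tree.leaves)
# the forward reference 'Tree' has been evaluated to the class itself
assert hints['return'] == List[Tree]
```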

Types available in the ``typing.io`` submodule:

* IO (generic over ``AnyStr``)

* BinaryIO (a simple subtype of ``IO[bytes]``)

* TextIO (a simple subtype of ``IO[str]``)

Types available in the ``typing.re`` submodule:

* Match and Pattern, types of ``re.match()`` and ``re.compile()``
  results (generic over ``AnyStr``)


Rejected Alternatives
=====================

During discussion of earlier drafts of this PEP, various objections
were raised and alternatives were proposed.  We discuss the main ones
here and explain why we reject them.

Which brackets for generic type parameters?
-------------------------------------------

Most people are familiar with the use of angular brackets
(e.g. ``List<int>``) in languages like C++, Java, C# and Swift to
express the parametrization of generic types.  The problem with these
is that they are really hard to parse, especially for a simple-minded
parser like Python.  In most languages the ambiguities are usually
dealt with by only allowing angular brackets in specific syntactic
positions, where general expressions aren't allowed.  (And also by
using very powerful parsing techniques that can backtrack over an
arbitrary section of code.)

But in Python, we'd like type expressions to be (syntactically) the
same as other expressions, so that we can use e.g. variable assignment
to create type aliases.  Consider this simple type expression::

    List<int>

From the Python parser's perspective, the expression begins with the
same four tokens (NAME, LESS, NAME, GREATER) as a chained comparison::

    a < b > c  # I.e., (a < b) and (b > c)

We can even make up an example that could be parsed both ways::

    a < b > [ c ]

Assuming we had angular brackets in the language, this could be
interpreted as either of the following two::

    (a<b>)[c]      # I.e., (a<b>).__getitem__(c)
    a < b > ([c])  # I.e., (a < b) and (b > [c])

It would surely be possible to come up with a rule to disambiguate
such cases, but to most users the rules would feel arbitrary and
complex.  It would also require us to dramatically change the CPython
parser (and every other parser for Python).  It should be noted that
Python's current parser is intentionally "dumb" -- a simple grammar is
easier for users to reason about.

For all these reasons, square brackets (e.g. ``List[int]``) are (and
have long been) the preferred syntax for generic type parameters.
They can be implemented by defining the ``__getitem__()`` method on
the metaclass, and no new syntax is required at all.  This option
works in all recent versions of Python (starting with Python 2.2).
Python is not alone in this syntactic choice -- generic classes in
Scala also use square brackets.

What about existing uses of annotations?
----------------------------------------

One line of argument points out that PEP 3107 explicitly supports
the use of arbitrary expressions in function annotations.  The new
proposal is then considered incompatible with the specification of PEP
3107.

Our response to this is that, first of all, the current proposal does
not introduce any direct incompatibilities, so programs using
annotations in Python 3.4 will still work correctly and without
prejudice in Python 3.5.

We do hope that type hints will eventually become the sole use for
annotations, but this will require additional discussion and a
deprecation period after the initial roll-out of the typing module
with Python 3.5.  The current PEP will have provisional status (see
PEP 411) until Python 3.6 is released.  The fastest conceivable scheme
would introduce silent deprecation of non-type-hint annotations in
3.6, full deprecation in 3.7, and declare type hints as the only
allowed use of annotations in Python 3.8.  This should give authors of
packages that use annotations plenty of time to devise another
approach, even if type hints become an overnight success.

Another possible outcome would be that type hints will eventually
become the default meaning for annotations, but that there will always
remain an option to disable them.  For this purpose the current
proposal defines a decorator ``@no_type_check`` which disables the
default interpretation of annotations as type hints in a given class
or function.  It also defines a meta-decorator
``@no_type_check_decorator`` which can be used to decorate a decorator
(!), causing annotations in any function or class decorated with the
latter to be ignored by the type checker.

There are also ``# type: ignore`` comments, and static checkers should
support configuration options to disable type checking in selected
packages.

Despite all these options, proposals have been circulated to allow
type hints and other forms of annotations to coexist for individual
arguments.  One proposal suggests that if an annotation for a given
argument is a dictionary literal, each key represents a different form
of annotation, and the key ``'type'`` would be used for type hints.
The problem with this idea and its variants is that the notation
becomes very "noisy" and hard to read.  Also, in most cases where
existing libraries use annotations, there would be little need to
combine them with type hints.  So the simpler approach of selectively
disabling type hints appears sufficient.

The problem of forward declarations
-----------------------------------

The current proposal is admittedly sub-optimal when type hints must
contain forward references.  Python requires all names to be defined
by the time they are used.  Apart from circular imports this is rarely
a problem: "use" here means "look up at runtime", and with most
"forward" references there is no problem in ensuring that a name is
defined before the function using it is called.

The problem with type hints is that annotations (per PEP 3107, and
similar to default values) are evaluated at the time a function is
defined, and thus any names used in an annotation must be already
defined when the function is being defined.  A common scenario is a
class definition whose methods need to reference the class itself in
their annotations.  (More generally, it can also occur with mutually
recursive classes.)  This is natural for container types, for
example::

  class Node:
      """Binary tree node."""

      def __init__(self, left: Node, right: Node):
          self.left = left
          self.right = right

As written this will not work, because of the peculiarity in Python
that class names become defined once the entire body of the class has
been executed.  Our solution, which isn't particularly elegant, but
gets the job done, is to allow using string literals in annotations.
Most of the time you won't have to use this though -- most *uses* of
type hints are expected to reference builtin types or types defined in
other modules.
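
Applied to the example above, the string-literal workaround looks like this (a sketch; the ``tree`` value is illustrative):

```python
class Node:
    """Binary tree node, with string literals for the self-reference."""

    def __init__(self, left: 'Node', right: 'Node'):
        # The strings defer the name lookup, so ``Node`` need not be
        # defined while the class body is executing.
        self.left = left
        self.right = right

tree = Node(Node(None, None), None)
print(Node.__init__.__annotations__)  # {'left': 'Node', 'right': 'Node'}
```

A checker is expected to resolve the strings to the enclosing class; at runtime they remain ordinary strings in ``__annotations__``.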

A counterproposal would change the semantics of type hints so they
aren't evaluated at runtime at all (after all, type checking happens
off-line, so why would type hints need to be evaluated at runtime?).
This of course would run afoul of backwards compatibility,
since the Python interpreter doesn't actually know whether a
particular annotation is meant to be a type hint or something else.

A compromise is possible where a ``__future__`` import could enable
turning *all* annotations in a given module into string literals, as
follows::

  from __future__ import annotations

  class ImSet:
      def add(self, a: ImSet) -> List[ImSet]: ...

  assert ImSet.add.__annotations__ == {'a': 'ImSet',
                                       'return': 'List[ImSet]'}

Such a ``__future__`` import statement may be proposed in a separate
PEP.


The double colon
----------------

A few creative souls have tried to invent solutions for this problem.
For example, it was proposed to use a double colon (``::``) for type
hints, solving two problems at once: disambiguating between type hints
and other annotations, and changing the semantics to preclude runtime
evaluation.  There are several things wrong with this idea, however.

* It's ugly.  The single colon in Python has many uses, and all of
  them look familiar because they resemble the use of the colon in
  English text.  This is a general rule of thumb by which Python
  abides for most forms of punctuation; the exceptions are typically
  well known from other programming languages.  But this use of ``::``
  is unheard of in English, and in other languages (e.g. C++) it is
  used as a scoping operator, which is a very different beast.  In
  contrast, the single colon for type hints reads naturally -- and no
  wonder, since it was carefully designed for this purpose (the idea
  long predates PEP 3107 [gvr-artima]_).  It is also used in the same
  fashion in other languages from Pascal to Swift.

* What would you do for return type annotations?

* It's actually a feature that type hints are evaluated at runtime.

  * Making type hints available at runtime allows runtime type
    checkers to be built on top of type hints.

  * It catches mistakes even when the type checker is not run.  Since
    it is a separate program, users may choose not to run it (or even
    install it), but might still want to use type hints as a concise
    form of documentation.  Broken type hints are no use even for
    documentation.

* Because it's new syntax, using the double colon for type hints would
  limit them to code that works with Python 3.5 only.  By using
  existing syntax, the current proposal can easily work for older
  versions of Python 3.  (And in fact mypy supports Python 3.2 and
  newer.)

* If type hints become successful we may well decide to add new syntax
  in the future to declare the type for variables, for example
  ``var age: int = 42``.  If we were to use a double colon for
  argument type hints, for consistency we'd have to use the same
  convention for future syntax, perpetuating the ugliness.

Other forms of new syntax
-------------------------

A few other forms of alternative syntax have been proposed, e.g. the
introduction of a ``where`` keyword [roberge]_, and Cobra-inspired
``requires`` clauses.  But these all share a problem with the double
colon: they won't work for earlier versions of Python 3.  The same
would apply to a new ``__future__`` import.

Other backwards compatible conventions
--------------------------------------

The ideas put forward include:

* A decorator, e.g. ``@typehints(name=str, returns=str)``.  This could
  work, but it's pretty verbose (an extra line, and the argument names
  must be repeated), and a far cry in elegance from the PEP 3107
  notation.

* Stub files.  We do want stub files, but they are primarily useful
  for adding type hints to existing code that doesn't lend itself to
  adding type hints, e.g. 3rd party packages, code that needs to
  support both Python 2 and Python 3, and especially extension
  modules.  For most situations, having the annotations in line with
  the function definitions makes them much more useful.

* Docstrings.  There is an existing convention for docstrings, based
  on the Sphinx notation (``:type arg1: description``).  This is
  pretty verbose (an extra line per parameter), and not very elegant.
  We could also make up something new, but the annotation syntax is
  hard to beat (because it was designed for this very purpose).
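
For concreteness, a rough sketch of how the decorator idea could populate ``__annotations__`` (the ``typehints`` name and signature are made up for illustration; they are not part of any actual proposal's API):

```python
def typehints(returns=None, **hints):
    # Hypothetical decorator: attaches type hints without using
    # PEP 3107 annotation syntax.  Illustrative only.
    def decorate(func):
        func.__annotations__ = dict(hints)
        if returns is not None:
            func.__annotations__['return'] = returns
        return func
    return decorate

@typehints(name=str, returns=str)
def greet(name):
    return 'Hello, ' + name

print(greet.__annotations__ == {'name': str, 'return': str})  # True
```

Note how the argument name must be repeated in the decorator call, which is exactly the verbosity complaint raised above.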

It's also been proposed to simply wait another release.  But what
problem would that solve?  It would just be procrastination.


PEP Development Process
=======================

A live draft for this PEP lives on GitHub [github]_.  There is also an
issue tracker [issues]_, where much of the technical discussion takes
place.

The draft on GitHub is updated regularly in small increments.  The
official PEPs repo [peps]_ is (usually) only updated when a new draft
is posted to python-dev.


Acknowledgements
================

This document could not be completed without valuable input,
encouragement and advice from Jim Baker, Jeremy Siek, Michael Matson
Vitousek, Andrey Vlasovskikh, Radomir Dopieralski, Peter Ludemann,
and the BDFL-Delegate, Mark Shannon.

Influences include existing languages, libraries and frameworks
mentioned in PEP 482.  Many thanks to their creators, in alphabetical
order: Stefan Behnel, William Edwards, Greg Ewing, Larry Hastings,
Anders Hejlsberg, Alok Menghrajani, Travis E. Oliphant, Joe Pamer,
Raoul-Gabriel Urma, and Julien Verlaguet.


References
==========

.. [mypy]
   http://mypy-lang.org

.. [gvr-artima]
   http://www.artima.com/weblogs/viewpost.jsp?thread=85551

.. [wiki-variance]
   http://en.wikipedia.org/wiki/Covariance_and_contravariance_%28computer_science%29

.. [typeshed]
   https://github.com/JukkaL/typeshed/

.. [pyflakes]
   https://github.com/pyflakes/pyflakes/

.. [pylint]
   http://www.pylint.org

.. [roberge]
   http://aroberge.blogspot.com/2015/01/type-hinting-in-python-focus-on.html

.. [github]
   https://github.com/ambv/typehinting

.. [issues]
   https://github.com/ambv/typehinting/issues

.. [peps]
   https://hg.python.org/peps/file/tip/pep-0484.txt


Copyright
=========

This document has been placed in the public domain.



..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End:


-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150520/a6d256fa/attachment-0001.html>

From tjreedy at udel.edu  Wed May 20 20:03:22 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 20 May 2015 14:03:22 -0400
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <CAPkN8xKn2EuttDkJ6kb6t_ey6OBQV7nQ_NsekCk_SuZ21BEXHg@mail.gmail.com>
References: <20150519150251.GJ20426@kdas-laptop> <mjg6v7$m02$1@ger.gmane.org>
 <CAPkN8xKn2EuttDkJ6kb6t_ey6OBQV7nQ_NsekCk_SuZ21BEXHg@mail.gmail.com>
Message-ID: <mjii9i$mv3$1@ger.gmane.org>

On 5/20/2015 3:44 AM, anatoly techtonik wrote:

> This was intended to add status for bugs.python.org, but the work on
> Roundup had stalled due to uncertainty and despair on how to handle
> utf-8 (internal to Roundup) vs unicode (internal to Jinja2) in this issue:
> http://issues.roundup-tracker.org/issue2550811
> The root of the problem is that Python 2.7 uses 'ascii' and not 'utf-8'
> internally, so Jinja2 engine fails with 'ascii' ordinal not in range
> somewhere in the way. Need an expert advice how to handle that, because
> my brain power is not enough to process it.

In my view, the root of the problem is using Python 2 and working with 
encoded bytes.  The fix is to upgrade to Python 3.3+, with an improved 
text model (unicode str class), and work with text.  Follow the standard 
process: decode encoded bytes to text when received from browsers, work 
internally with text, and encode output just before sending to browsers.
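
A minimal sketch of that boundary pattern (the function and the `.upper()` step are illustrative):

```python
def handle(raw: bytes) -> bytes:
    # Decode once at the input boundary ...
    text = raw.decode('utf-8')
    # ... do all internal work with str ...
    reply = text.upper()
    # ... and encode once at the output boundary.
    return reply.encode('utf-8')

print(handle('héllo'.encode('utf-8')).decode('utf-8'))  # HÉLLO
```

Keeping bytes confined to the two boundary lines is what makes the internal code encoding-agnostic.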

-- 
Terry Jan Reedy


From olemis at gmail.com  Wed May 20 22:42:31 2015
From: olemis at gmail.com (Olemis Lang)
Date: Wed, 20 May 2015 15:42:31 -0500
Subject: [Python-Dev] Automated testing of patches from bugs.python.org
In-Reply-To: <mjg6v7$m02$1@ger.gmane.org>
References: <20150519150251.GJ20426@kdas-laptop> <mjg6v7$m02$1@ger.gmane.org>
Message-ID: <CAGMZAuPWKg5VzyYZ-F=PMoTO9ypGBn8wwFySP0h-QW-RYbKsQg@mail.gmail.com>

On 5/19/15, Terry Reedy <tjreedy at udel.edu> wrote:
> On 5/19/2015 11:02 AM, Kushal Das wrote:
>> Hi,
>>

Hi !

I'm not very familiar with python-dev development workflows .
Nonetheless I just wanted to mention something that proved to be
useful for me in the past .

>> With the help of CentOS project I am happy to announce an automated
>> system [1] to test patches from bugs.python.org. This can be fully
>> automated
>> to test the patches whenever someone uploads a patch in the roundup, but
>> for now it accepts IRC commands on #python-dev channel. I worked on a
>> docker based prototype during sprints in PyCon.
>>
>> How to use it?
>> ---------------
>>
>> 1. Join #python-dev on irc.freenode.net.
>> 2. Ask for test privilege  from any one of kushal,Taggnostr,bitdancer
>> 3. They will issue a simple command. #add: YOUR_NICK_NAME
>> 4. You can then test by issuing the following command in the channel:
>>
>>      #test: BUGNUMBER
>>      like #test: 21271
>
> What if there are multiple patches on the issue?  Pick the latest?
> This is not correct if someone follows up a patch with a 2.7 backport,
> or if there are competing patches.
>
[...]

It is a fact that running automated tests for patches is a really
useful feature . Nevertheless , IMHO for this to succeed at large
scale there is a need to manage the content of patches themselves ,
the base version they were built upon , as well as their order should
they be stacked . My suggestion for you therefore is to use Hg patch
repositories [1]_ as the starting point for your patch CI system .
Some of the benefits I could mention :

  - triggering (patch) builds on commit via web hooks
  - CI infrastructure needed turns out to be very
    similar to the one setup for the main project
  - Commands to work on patch queue repositories are easy to learn
  - The possibility of editing series file is also useful for ignoring
    some patches without removing their contents .
  - halt if patch cannot be applied upon latest version
    * ... but still be able to see it in action by checking out the
right version
      of the code base used to build it in first place .
  - try the patch against different versions of the code base as it evolves
  - fuzzy refresh
  - version control for patches
  - multiple branches
    * which may be bound to tickets in many ways e.g. naming conventions
    * ... particularly useful for competing patches .

There are a few black spots too.  Patch repositories deal with the
diff of a diff, hence some operations applied upon patches (e.g.
merging) might be quite messy.  Most of the time this is no big deal
though.

The following are repositories I used while developing Apache
Bloodhound , during incubation and after it became a TLP . I'm
including them to illustrate branching and naming conventions (I used)
to keep track of tickets .

https://bitbucket.org/olemis/bloodhound-incubator-mq/
https://bitbucket.org/olemis/bloodhound-mq

HTH, since this way the workflow would be tightly integrated with
Mercurial, as highlighted by Berker Peksağ in previous messages.

.. [1] http://mercurial.selenic.com/wiki/MqTutorial

-- 
Regards,

Olemis - @olemislc

Apache? Bloodhound contributor
http://issues.apache.org/bloodhound
http://blood-hound.net

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/


From benhoyt at gmail.com  Thu May 21 04:26:55 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Wed, 20 May 2015 22:26:55 -0400
Subject: [Python-Dev] Enable access to the AST for Python code
Message-ID: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>

Hi Python devs,

Enabling access to the AST for compiled code would make some cool
things possible (C# LINQ-style ORMs, for example), and not knowing too
much about this part of Python internals, I'm wondering how possible
and practical this would be.

Context: PonyORM (http://ponyorm.com/) allows you to write regular
Python generator expressions like this:

    select(c for c in Customer if sum(c.orders.price) > 1000)

which compile into and run SQL like this:

    SELECT "c"."id"
    FROM "Customer" "c"
    LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
    GROUP BY "c"."id"
    HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000

I think the Pythonic syntax here is beautiful. But the tricks PonyORM
has to go to get it are ... not quite so beautiful. Because the AST is
not available, PonyORM decompiles Python bytecode into an AST first,
and then converts that to SQL. (More details on all that from author's
EuroPython talk at http://pyvideo.org/video/2968)

I believe PonyORM needs the AST just for generator expressions and
lambda functions, but obviously if this kind of AST access feature
were in Python it'd probably be more general.

I believe C#'s LINQ provides something similar, where if you're
developing a LINQ converter library (say LINQ to SQL), you essentially
get the AST of the code ("expression tree") and the library can do
what it wants with that.

What would it take to enable this kind of AST access in Python? Is it
possible? Is it a good idea?

-Ben

From guido at python.org  Thu May 21 04:31:13 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 20 May 2015 19:31:13 -0700
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
Message-ID: <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>

Hey Ben, this is probably a better topic for python-ideas. I'll warn you
that a hurdle for ideas like this is that ideally you don't want to support
this just for CPython. It's definitely cool though! (Using movie poster
style quotes you can turn this into a ringing endorsement: "definitely
cool" -- The BDFL. :-)

On Wed, May 20, 2015 at 7:26 PM, Ben Hoyt <benhoyt at gmail.com> wrote:

> [...]



-- 
--Guido van Rossum (python.org/~guido)

From ncoghlan at gmail.com  Thu May 21 06:57:45 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 21 May 2015 14:57:45 +1000
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
Message-ID: <CADiSq7fv2AeD6Y+H7SdjB3yf62cWpVZHK1qhgy-YqX2U+3KYDA@mail.gmail.com>

On 21 May 2015 at 12:31, Guido van Rossum <guido at python.org> wrote:
> Hey Ben, this is probably a better topic for python-ideas. I'll warn you
> that a hurdle for ideas like this is that ideally you don't want to support
> this just for CPython. It's definitely cool though! (Using movie poster
> style quotes you can turn this into a ringing endorsement: "definitely cool"
> -- The BDFL. :-)

Agreed this is python-ideas territory, but yes there's definitely
interest in being able to mark a section of code as being compiled to
an AST object at compile time, and then further processed at runtime
(essentially having syntax to switch on PyCF_ONLY_AST for a
subexpression and/or entire statement).

At the moment, you can do this all through the ast module and the
PyCF_ONLY_AST flag to compile(), but you need to pass the code to be
compiled around as strings, which tends to be somewhat user (and IDE!)
unfriendly.
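
For example (the generator-expression source here is illustrative, and still has to travel as a string):

```python
import ast

src = "(c for c in customers if c.total > 1000)"

# PyCF_ONLY_AST tells compile() to stop after parsing and return the
# AST instead of a code object; ast.parse(src, mode='eval') is the
# equivalent shorthand.
tree = compile(src, "<expr>", "eval", flags=ast.PyCF_ONLY_AST)

print(type(tree.body).__name__)  # GeneratorExp
```

The hypothetical new syntax would let that AST be produced directly from inline code, with no string round-trip.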

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From greg.ewing at canterbury.ac.nz  Thu May 21 07:32:45 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 21 May 2015 17:32:45 +1200
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
Message-ID: <555D6DFD.1070602@canterbury.ac.nz>

Guido van Rossum wrote:
> Hey Ben, this is probably a better topic for python-ideas. I'll warn you 
> that a hurdle for ideas like this is that ideally you don't want to 
> support this just for CPython. It's definitely cool though!

This would effectively be a macro system. I thought
your position on macros was that they're uncool?

If you've changed your mind about this, that's cool
too -- just checking.

-- 
Greg

From Steve.Dower at microsoft.com  Thu May 21 15:05:29 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 21 May 2015 13:05:29 +0000
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555D6DFD.1070602@canterbury.ac.nz>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>,
 <555D6DFD.1070602@canterbury.ac.nz>
Message-ID: <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>

It's only a macro system when you generate code in unexpected/unobvious places with it. This is more like inspect.getsource(), but going straight to the AST.
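
Something in that spirit is possible today, with the caveat that it only works when the source file is on disk (a sketch; the helper and predicate names are made up):

```python
import ast
import inspect

def func_ast(func):
    # Recover an AST from a callable's source text; fails for code
    # whose source is unavailable (the REPL, exec'd strings, ...).
    return ast.parse(inspect.getsource(func))

def over_1000(c):
    return c.total > 1000

tree = func_ast(over_1000)
print(type(tree.body[0]).__name__)  # FunctionDef
```

Dedicated syntax would remove the dependence on source availability and make the AST a first-class runtime value.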

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Greg Ewing<mailto:greg.ewing at canterbury.ac.nz>
Sent: ?5/?20/?2015 22:33
To: Python-Dev<mailto:python-dev at python.org>
Subject: Re: [Python-Dev] Enable access to the AST for Python code

Guido van Rossum wrote:
> Hey Ben, this is probably a better topic for python-ideas. I'll warn you
> that a hurdle for ideas like this is that ideally you don't want to
> support this just for CPython. It's definitely cool though!

This would effectively be a macro system. I thought
your position on macros was that they're uncool?

If you've changed your mind about this, that's cool
too -- just checking.

--
Greg

From benhoyt at gmail.com  Thu May 21 16:01:04 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Thu, 21 May 2015 10:01:04 -0400
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
Message-ID: <CAL9jXCFhySYGQKBLOmeTmDpW5vSrF2eU2jnUVJS5wtVEPZ00RA@mail.gmail.com>

Thanks. Good point about python-ideas -- I was thinking that after I sent
it too. I'll repost there soon.

Out of interest, what specifically were you referring to as "definitely
cool" here: LINQ-style generator expressions that build SQL ala PonyORM, or
the more general feature of enabling AST access?

-Ben

On Wed, May 20, 2015 at 10:31 PM, Guido van Rossum <guido at python.org> wrote:

> [...]

From ericsnowcurrently at gmail.com  Thu May 21 16:55:06 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 08:55:06 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C OrderedDict
Message-ID: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>

(see http://bugs.python.org/issue16991)

I am working on resolving an intermittent segfault that my C
OrderedDict patch introduces.  The failure happens in
test_configparser (RawConfigParser uses OrderedDict internally), but
only sporadically.  However, Ned pointed out to me that it appears to
be related to hash randomization, which I have verified.  I'm looking
into it.

In the meantime, here's a specific question.  What would lead to the
pattern of failures I'm seeing?  I've verified that the segfault
happens consistently for certain hash randomization seeds and never
for the rest.  I don't immediately recognize the pattern but expect
that it would shed some light on where the problem lies.  I ran the
following command with the OrderedDict patch applied:

  for i in `seq 1 100`; do echo $i; PYTHONHASHSEED=$i \
      ./python -m test.regrtest -m test_basic test_configparser; done

Through 100 I get segfaults with seeds of 7, 15, 35, 37, 39, 40, 42,
47, 50, 66, 67, 85, 87, 88, and 92.  I expect the distribution across
all seeds is uniform, but I haven't verified that.

Thoughts?

-eric

From guido at python.org  Thu May 21 17:01:01 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 21 May 2015 08:01:01 -0700
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <555DDCA4.2080100@hotpy.org>
References: <555DDCA4.2080100@hotpy.org>
Message-ID: <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>

Hi Mark,

We're down to the last few items here. I'm CC'ing python-dev so folks can
see how close we are. I'll answer point by point.

On Thu, May 21, 2015 at 6:24 AM, Mark Shannon <mark at hotpy.org> wrote:

> Hi,
>
> The PEP itself is looking fairly good.
>

I hope you'll accept it at least provisionally so we can iterate over the
finer points while a prototype of typing.py is in beta 1.


> However, I don't think that typing.py is ready yet, for a number of
> reasons:
>
> 1.
> As I've said before, there needs to be a distinction between classes and
> types.
> There is no need for Any, Generic, Generic's subtypes, or Union to subclass
> builtins.type.
>

I strongly disagree. They can appear in many positions where real classes
are acceptable, in particular annotations can have classes (e.g. int) or
types (e.g. Union[int, str]).


> Playing around with typing.py, it has also become clear to me that it
> is also important to distinguish type constructors from types.
>
> What do I mean by a type constructor?
> A type constructor makes types.
> "List" is an example of a type constructor. It constructs types such as
> List[T] and List[int].
> Saying that something is a List (as opposed to a list) should be rejected.
>

The PEP actually says that plain List (etc.) is equivalent to List[Any].
(Well, at least that's the intention; it's implied by the section about the
equivalence between Node() and Node[Any]().)


> 2.
> Usability of typing as it stands:
>
> Let's try to make a class that implements a mutable mapping.
>
> >>> import typing as tp
> #Make some variables.
> >>> T = tp.TypeVar('T')
> >>> K = tp.TypeVar('K')
> >>> V = tp.TypeVar('V')
>
> #Then make our class:
>
> >>> class MM(tp.MutableMapping): pass
> ...
> #Oh that worked, but it shouldn't. MutableMapping is a type constructor.
>

It means MutableMapping[Any].


> #Let's make one
> >>> MM()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/mark/repositories/typehinting/prototyping/typing.py", line
> 1095, in __new__
>     if _gorg(c) is Generic:
>   File "/home/mark/repositories/typehinting/prototyping/typing.py", line
> 887, in _gorg
>     while a.__origin__ is not None:
> AttributeError: type object 'Sized' has no attribute '__origin__'
>
> # ???
>

Sorry, that's a bug I introduced in literally the last change to typing.py.
I will fix it. The expected behavior is

TypeError: Can't instantiate abstract class MM with abstract methods __len__



> #Well let's try using type variables.
> class MM2(tp.MutableMapping[K, V]): pass
> ...
> >>> MM2()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/mark/repositories/typehinting/prototyping/typing.py", line
> 1095, in __new__
>     if _gorg(c) is Generic:
>   File "/home/mark/repositories/typehinting/prototyping/typing.py", line
> 887, in _gorg
>     while a.__origin__ is not None:
> AttributeError: type object 'Sized' has no attribute '__origin__'
>

Ditto, and sorry.

>
> At this point, we have to resort to using 'Dict', which forces us to
> subclass 'dict' which may not be what we want as it may cause metaclass
> conflicts.
>
> 3.
> Memory consumption is also a worry. There is no caching, which means every
> time I use "List[int]" as an annotation, a new class object is created.
> Each class may only be a few KB, but collectively this could easily add up
> to several MBs.
> This should be easy to fix.
>

I can work on this after the beta-1 release. Until then, type aliases can
be used to avoid redundant type creation (and often they are clearer anyway
:-).
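
A minimal sketch of the alias workaround (names are illustrative):

```python
from typing import List

Vector = List[float]  # the parameterized type is built once and reused

def scale(v: Vector, factor: float) -> Vector:
    return [x * factor for x in v]

def total(v: Vector) -> float:
    return sum(v)

print(scale([1.0, 2.0], 3.0))  # [3.0, 6.0]
```

Every annotation that names ``Vector`` refers to the single object created at the alias definition, instead of constructing a fresh ``List[float]`` per use.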


> 4.
> PY2, etc. really need to go.
> Assuming that this code type checks OK:
>
>  if typing.PY2:
>      type_safe_under_py2_only()
>  else:
>      type_safe_under_py3_only()
>
> Is the checker supposed to pass this:
>
>  if sys.hexversion < 0x03000000:
>      type_safe_under_py2_only()
>  else:
>      type_safe_under_py3_only()
>
> If it should pass, then why have PY2, etc. at all.
> If it should fail, well that is just stupid and annoying.
>
> Pylint already understands version checks, as does our (Semmle's) checker.
> I suspect most IDEs do as well.
>

I have to negotiate this with Jukka but I think he'll agree.


> 5.
> Removing isinstance() support:
>
> As I said before, this is the job of a checker not typing.py.
>
> It also introduces some strange situations:
> D = tp.Dict[str,int]
> d = {}
> assert isinstance(d, D)
> d["x"] = None
> assert isinstance(d, D)
>
> In the above case the first check passes, and the second fails.
> But d is either of type D or it isn't. It can't be both, as types
> are static properties of programs, unlike classes.
>

Well, isinstance() is a dynamic function. The type checker has no authority
over its behavior beyond its signature.


> And it's broken anyway:
> >>> D = tp.Dict[str,'D']
> >>> d = {"x": {}}
> >>> isinstance(d, D)
> False
>

That's because _ForwardRef doesn't implement __instancheck__ or
__subclasscheck__. It's easily fixed.

>
> Realistically, I don't see typing.py being ready in time for 3.5.
> I'd be happy to be proved wrong.
>
> Cheers,
> Mark.
>
>
> P.S.
> I am worried by the lack of formal specification. It all seems a bit
> hand-waving. A formal spec reduces the likelihood of some unforeseen corner
> case being a permanent wart.
>

Formal specs are not my cup of tea. :-( (I'm not proud of this, but it just
is a fact -- see how terrible a job I've done of the Python reference
manual.) The best I could come up with is PEP 483.


> Take the recursive type above. There is no mention of recursive types in
> the PEP and they are clearly possible. Are they allowed?
>

They should be allowed. I imagine you could create one for which a naive
isinstance() implementation ends up in an infinite loop. That can be fixed
too (we fixed this for printing self-referential lists and dicts).
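The recursion guard mentioned for printing can be seen directly in the interpreter:

```python
# repr() detects the cycle and substitutes an ellipsis instead of
# recursing forever; a recursion-aware isinstance() could use the same
# approach for recursive types.
l = []
l.append(l)
print(repr(l))   # [[...]]
```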


> I'm guessing that Jukka's thesis should cover a lot of this.
> Has it been published yet?
>

Hopefully Jukka can answer that. :-)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150521/409b5458/attachment.html>

From mark at hotpy.org  Thu May 21 17:45:56 2015
From: mark at hotpy.org (Mark Shannon)
Date: Thu, 21 May 2015 16:45:56 +0100
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>
References: <555DDCA4.2080100@hotpy.org>
 <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>
Message-ID: <555DFDB4.7050503@hotpy.org>



On 21/05/15 16:01, Guido van Rossum wrote:
> Hi Mark,
>
> We're down to the last few items here. I'm CC'ing python-dev so folks
> can see how close we are. I'll answer point by point.
>
> On Thu, May 21, 2015 at 6:24 AM, Mark Shannon <mark at hotpy.org
> <mailto:mark at hotpy.org>> wrote:
>
>     Hi,
>
>     The PEP itself is looking fairly good.
>
>
> I hope you'll accept it at least provisionally so we can iterate over
> the finer points while a prototype of typing.py is in beta 1.
>
>     However, I don't think that typing.py is ready yet, for a number of
>     reasons:
>
>     1.
>     As I've said before, there needs to be a distinction between classes
>     and types.
> There is no need for Any, Generic, Generic's subtypes, or Union to
>     subclass builtins.type.
>
>
> I strongly disagree. They can appear in many positions where real
> classes are acceptable, in particular annotations can have classes (e.g.
> int) or types (e.g. Union[int, str]).

Why does this mean that they have to be classes? Annotations can be any 
object.

It might help to think, not in terms of types being classes, but 
classes being shorthand for the nominal type for that class (from the 
point of view of the checker and type geeks).
So when the checker sees 'int', it treats it as Type(int).

Subtyping is distinct from subclassing;
Type(int) <: Union[Type(int), Type(str)]
has no parallel in subclassing.
There is no class that corresponds to a Union, Any or a Generic.

In order to support the
class C(ParameterType[T]): pass
syntax, parametric types do indeed need to be classes, but Python has 
multiple inheritance, so that's not a problem:
class ParameterType(type, Type): ...
Otherwise typing.Types shouldn't be builtin.types and vice versa.

I think a lot of these issues on the tracker would not have been issues 
had the distinction been more clearly enforced.

>
>     Playing around with typing.py, it has also become clear to me that it
>     is also important to distinguish type constructors from types.
>
>     What do I mean by a type constructor?
>     A type constructor makes types.
>     "List" is an example of a type constructor. It constructs types such
>     as List[T] and List[int].
>     Saying that something is a List (as opposed to a list) should be
>     rejected.
>
>
> The PEP actually says that plain List (etc.) is equivalent to List[Any].
> (Well, at least that's the intention; it's implied by the section about
> the equivalence between Node() and Node[Any]().)

Perhaps we should change that. Using 'List', rather than 'list' or 
'List[Any]' suggests an error, or misunderstanding, to me.

Is there a use case where 'List' is needed, and 'list' will not suffice?
I'm assuming that the type checker knows that 'list' is a MutableSequence.

>
>     2.
>     Usability of typing as it stands:
>
>     Let's try to make a class that implements a mutable mapping.
>
>      >>> import typing as tp
>     #Make some variables.
>      >>> T = tp.TypeVar('T')
>      >>> K = tp.TypeVar('K')
>      >>> V = tp.TypeVar('V')
>
>     #Then make our class:
>
>      >>> class MM(tp.MutableMapping): pass
>     ...
>     #Oh that worked, but it shouldn't. MutableMapping is a type constructor.
>
>
> It means MutableMapping[Any].
>
>     #Let's make one
>      >>> MM()
>     Traceback (most recent call last):
>        File "<stdin>", line 1, in <module>
>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>     line 1095, in __new__
>          if _gorg(c) is Generic:
>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>     line 887, in _gorg
>          while a.__origin__ is not None:
>     AttributeError: type object 'Sized' has no attribute '__origin__'
>
>     # ???
>
>
> Sorry, that's a bug I introduced in literally the last change to
> typing.py. I will fix it. The expected behavior is
>
> TypeError: Can't instantiate abstract class MM with abstract methods __len__
>
>     #Well let's try using type variables.
>     class MM2(tp.MutableMapping[K, V]): pass
>     ...
>      >>> MM2()
>     Traceback (most recent call last):
>        File "<stdin>", line 1, in <module>
>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>     line 1095, in __new__
>          if _gorg(c) is Generic:
>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>     line 887, in _gorg
>          while a.__origin__ is not None:
>     AttributeError: type object 'Sized' has no attribute '__origin__'
>
>
> Ditto, and sorry.
No need to apologise, I'm just a bit worried about how easy it was for 
me to expose this sort of bug.
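For comparison, here is the error-free path the session above was aiming for: a minimal concrete mutable mapping supplying the five abstract methods (a plain collections.abc sketch, independent of the typing.py bug):

```python
from collections.abc import MutableMapping

# A minimal concrete MutableMapping backed by a plain dict; the five
# methods below are exactly the abstract ones the ABC requires.
class MM(MutableMapping):
    def __init__(self):
        self._d = {}
    def __getitem__(self, key):
        return self._d[key]
    def __setitem__(self, key, value):
        self._d[key] = value
    def __delitem__(self, key):
        del self._d[key]
    def __iter__(self):
        return iter(self._d)
    def __len__(self):
        return len(self._d)

m = MM()
m["x"] = 1
```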

>
>
>     At this point, we have to resort to using 'Dict', which forces us to
>     subclass 'dict' which may not be what we want as it may cause
>     metaclass conflicts.
>
>     3.
>     Memory consumption is also a worry. There is no caching, which means
>     every time I use "List[int]" as an annotation, a new class object is
>     created. Each class may only be a few KB, but collectively this
>     could easily add up to several MBs.
>     This should be easy to fix.
>
>
> I can work on this after the beta-1 release. Until then, type aliases
> can be used to avoid redundant type creation (and often they are clearer
> anyway :-).
Sure.

>
>     4.
>     PY2, etc. really need to go.
>     Assuming that this code type checks OK:
>
>       if typing.PY2:
>           type_safe_under_py2_only()
>       else:
>           type_safe_under_py3_only()
>
>     Is the checker supposed to pass this:
>
>       if sys.hexversion < 0x03000000:
>           type_safe_under_py2_only()
>       else:
>           type_safe_under_py3_only()
>
>     If it should pass, then why have PY2, etc. at all.
>     If it should fail, well that is just stupid and annoying.
>
>     Pylint already understands version checks, as does our (Semmle's)
>     checker. I suspect most IDEs do as well.
>
>
> I have to negotiate this with Jukka but I think he'll agree.
>
>     5.
>     Removing isinstance() support:
>
>     As I said before, this is the job of a checker not typing.py.
>
>     It also introduces some strange situations:
>     D = tp.Dict[str,int]
>     d = {}
>     assert isinstance(d, D)
>     d["x"] = None
>     assert isinstance(d, D)
>
>     In the above case the first check passes, and the second fails.
>     But d is either of type D or it isn't. It can't be both, as types
>     are static properties of programs, unlike classes.
>
>
> Well, isinstance() is a dynamic function. The type checker has no
> authority over its behavior beyond its signature.
>
>     And it's broken anyway:
>      >>> D = tp.Dict[str,'D']
>      >>> d = {"x": {}}
>      >>> isinstance(d, D)
>     False
>
>
> That's because _ForwardRef doesn't implement __instancecheck__ or
> __subclasscheck__. It's easily fixed.
>
>
>     Realistically, I don't see typing.py being ready in time for 3.5.
>     I'd be happy to be proved wrong.
>
>     Cheers,
>     Mark.
>
>
>     P.S.
>     I am worried by the lack of formal specification. It all seems a bit
>     hand-waving. A formal spec reduces the likelihood of some unforeseen
>     corner case being a permanent wart.
>
>
> Formal specs are not my cup of tea. :-( (I'm not proud of this, but it
> just is a fact -- see how terrible a job I've done of the Python
> reference manual.) The best I could come up with is PEP 483.
>
>     Take the recursive type above. There is no mention of recursive
>     types in the PEP and they are clearly possible. Are they allowed?
>
>
> They should be allowed. I imagine you could create one for which a naive
> isinstance() implementation ends up in an infinite loop. That can be
> fixed too (we fixed this for printing self-referential lists and dicts).
>
>     I'm guessing that Jukka's thesis should cover a lot of this.
>     Has it been published yet?
>
>
> Hopefully Jukka can answer that. :-)
>
> --
> --Guido van Rossum (python.org/~guido <http://python.org/~guido>)

From python at mrabarnett.plus.com  Thu May 21 19:17:05 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Thu, 21 May 2015 18:17:05 +0100
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
Message-ID: <555E1311.5090105@mrabarnett.plus.com>

On 2015-05-21 15:55, Eric Snow wrote:
> (see http://bugs.python.org/issue16991)
>
> I am working on resolving an intermittent segfault that my C
> OrderedDict patch introduces.  The failure happens in
> test_configparser (RawConfigParser uses OrderedDict internally), but
> only sporadically.  However, Ned pointed out to me that it appears to
> be related to hash randomization, which I have verified.  I'm looking
> into it.
>
> In the meantime, here's a specific question.  What would lead to the
> pattern of failures I'm seeing?  I've verified that the segfault
> happens consistently for certain hash randomization seeds and never
> for the rest.  I don't immediately recognize the pattern but expect
> that it would shed some light on where the problem lies.  I ran the
> following command with the OrderedDict patch applied:
>
>    for i in `seq 1 100`; do echo $i; PYTHONHASHSEED=$i ./python -m
> test.regrtest -m test_basic test_configparser ; done
>
> Through 100 I get segfaults with seeds of 7, 15, 35, 37, 39, 40, 42,
> 47, 50, 66, 67, 85, 87, 88, and 92.  I expect the distribution across
> all seeds is uniform, but I haven't verified that.
>
> Thoughts?
>
In "_odict_get_index", for example (there are others), you're caching
"ma_keys":

     PyDictKeysObject *keys = ((PyDictObject *)od)->ma_keys;

If it resizes, you go back to the label "start", which is after that
line, but could "ma_keys" change when it's resized?


From guido at python.org  Thu May 21 20:17:29 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 21 May 2015 11:17:29 -0700
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAL9jXCFhySYGQKBLOmeTmDpW5vSrF2eU2jnUVJS5wtVEPZ00RA@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <CAL9jXCFhySYGQKBLOmeTmDpW5vSrF2eU2jnUVJS5wtVEPZ00RA@mail.gmail.com>
Message-ID: <CAP7+vJKtQP+sxdnALkoqPPWNcsFVff_AQyuGWkkCtbj1j1_4NQ@mail.gmail.com>

Dang it. :-) I just want to encourage you to continue pursuing this idea,
one way or another.
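For context, Python's ast module can already parse source text; what the proposal asks for is the same structure attached to already-compiled code objects, which is what PonyORM currently has to reconstruct from bytecode:

```python
import ast

# Parsing the PonyORM-style condition as source text works today; there
# is no equivalent API for recovering the AST of a compiled generator
# expression, hence the bytecode decompilation trick.
tree = ast.parse("sum(c.orders.price) > 1000", mode="eval")
print(type(tree.body).__name__)   # Compare
```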

On Thu, May 21, 2015 at 7:01 AM, Ben Hoyt <benhoyt at gmail.com> wrote:

> Thanks. Good point about python-ideas -- I was thinking that after I sent
> it too. I'll repost there soon.
>
> Out of interest, what specifically were you referring to as "definitely
> cool" here: LINQ-style generator expressions that build SQL ala PonyORM, or
> the more general feature of enabling AST access?
>
> -Ben
>
> On Wed, May 20, 2015 at 10:31 PM, Guido van Rossum <guido at python.org>
> wrote:
>
>> Hey Ben, this is probably a better topic for python-ideas. I'll warn you
>> that a hurdle for ideas like this is that ideally you don't want to support
>> this just for CPython. It's definitely cool though! (Using movie poster
>> style quotes you can turn this into a ringing endorsement: "definitely
>> cool" -- The BDFL. :-)
>>
>> On Wed, May 20, 2015 at 7:26 PM, Ben Hoyt <benhoyt at gmail.com> wrote:
>>
>>> Hi Python devs,
>>>
>>> Enabling access to the AST for compiled code would make some cool
>>> things possible (C# LINQ-style ORMs, for example), and not knowing too
>>> much about this part of Python internals, I'm wondering how possible
>>> and practical this would be.
>>>
>>> Context: PonyORM (http://ponyorm.com/) allows you to write regular
>>> Python generator expressions like this:
>>>
>>>     select(c for c in Customer if sum(c.orders.price) > 1000)
>>>
>>> which compile into and run SQL like this:
>>>
>>>     SELECT "c"."id"
>>>     FROM "Customer" "c"
>>>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>>>     GROUP BY "c"."id"
>>>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
>>>
>>> I think the Pythonic syntax here is beautiful. But the tricks PonyORM
>>> has to go to get it are ... not quite so beautiful. Because the AST is
>>> not available, PonyORM decompiles Python bytecode into an AST first,
>>> and then converts that to SQL. (More details on all that from author's
>>> EuroPython talk at http://pyvideo.org/video/2968)
>>>
>>> I believe PonyORM needs the AST just for generator expressions and
>>> lambda functions, but obviously if this kind of AST access feature
>>> were in Python it'd probably be more general.
>>>
>>> I believe C#'s LINQ provides something similar, where if you're
>>> developing a LINQ converter library (say LINQ to SQL), you essentially
>>> get the AST of the code ("expression tree") and the library can do
>>> what it wants with that.
>>>
>>> What would it take to enable this kind of AST access in Python? Is it
>>> possible? Is it a good idea?
>>>
>>> -Ben
>>> _______________________________________________
>>> Python-Dev mailing list
>>> Python-Dev at python.org
>>> https://mail.python.org/mailman/listinfo/python-dev
>>> Unsubscribe:
>>> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>>>
>>
>>
>>
>> --
>> --Guido van Rossum (python.org/~guido)
>>
>
>


-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150521/7c77ad56/attachment.html>

From benhoyt at gmail.com  Thu May 21 20:42:36 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Thu, 21 May 2015 14:42:36 -0400
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAP7+vJKtQP+sxdnALkoqPPWNcsFVff_AQyuGWkkCtbj1j1_4NQ@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <CAL9jXCFhySYGQKBLOmeTmDpW5vSrF2eU2jnUVJS5wtVEPZ00RA@mail.gmail.com>
 <CAP7+vJKtQP+sxdnALkoqPPWNcsFVff_AQyuGWkkCtbj1j1_4NQ@mail.gmail.com>
Message-ID: <CAL9jXCE8nftiuwufbierw=g3-LXz0XkcJoGOSctfERcN8Aqnyg@mail.gmail.com>

Heh, thanks. :-)

On Thu, May 21, 2015 at 2:17 PM, Guido van Rossum <guido at python.org> wrote:

> Dang it. :-) I just want to encourage you to continue pursuing this idea,
> one way or another.
>
> On Thu, May 21, 2015 at 7:01 AM, Ben Hoyt <benhoyt at gmail.com> wrote:
>
>> Thanks. Good point about python-ideas -- I was thinking that after I sent
>> it too. I'll repost there soon.
>>
>> Out of interest, what specifically were you referring to as "definitely
>> cool" here: LINQ-style generator expressions that build SQL ala PonyORM, or
>> the more general feature of enabling AST access?
>>
>> -Ben
>>
>> On Wed, May 20, 2015 at 10:31 PM, Guido van Rossum <guido at python.org>
>> wrote:
>>
>>> Hey Ben, this is probably a better topic for python-ideas. I'll warn you
>>> that a hurdle for ideas like this is that ideally you don't want to support
>>> this just for CPython. It's definitely cool though! (Using movie poster
>>> style quotes you can turn this into a ringing endorsement: "definitely
>>> cool" -- The BDFL. :-)
>>>
>>> On Wed, May 20, 2015 at 7:26 PM, Ben Hoyt <benhoyt at gmail.com> wrote:
>>>
>>>> Hi Python devs,
>>>>
>>>> Enabling access to the AST for compiled code would make some cool
>>>> things possible (C# LINQ-style ORMs, for example), and not knowing too
>>>> much about this part of Python internals, I'm wondering how possible
>>>> and practical this would be.
>>>>
>>>> Context: PonyORM (http://ponyorm.com/) allows you to write regular
>>>> Python generator expressions like this:
>>>>
>>>>     select(c for c in Customer if sum(c.orders.price) > 1000)
>>>>
>>>> which compile into and run SQL like this:
>>>>
>>>>     SELECT "c"."id"
>>>>     FROM "Customer" "c"
>>>>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>>>>     GROUP BY "c"."id"
>>>>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
>>>>
>>>> I think the Pythonic syntax here is beautiful. But the tricks PonyORM
>>>> has to go to get it are ... not quite so beautiful. Because the AST is
>>>> not available, PonyORM decompiles Python bytecode into an AST first,
>>>> and then converts that to SQL. (More details on all that from author's
>>>> EuroPython talk at http://pyvideo.org/video/2968)
>>>>
>>>> I believe PonyORM needs the AST just for generator expressions and
>>>> lambda functions, but obviously if this kind of AST access feature
>>>> were in Python it'd probably be more general.
>>>>
>>>> I believe C#'s LINQ provides something similar, where if you're
>>>> developing a LINQ converter library (say LINQ to SQL), you essentially
>>>> get the AST of the code ("expression tree") and the library can do
>>>> what it wants with that.
>>>>
>>>> What would it take to enable this kind of AST access in Python? Is it
>>>> possible? Is it a good idea?
>>>>
>>>> -Ben
>>>> _______________________________________________
>>>> Python-Dev mailing list
>>>> Python-Dev at python.org
>>>> https://mail.python.org/mailman/listinfo/python-dev
>>>> Unsubscribe:
>>>> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>>>>
>>>
>>>
>>>
>>> --
>>> --Guido van Rossum (python.org/~guido)
>>>
>>
>>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150521/02003167/attachment.html>

From guido at python.org  Thu May 21 22:27:50 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 21 May 2015 13:27:50 -0700
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <555DFDB4.7050503@hotpy.org>
References: <555DDCA4.2080100@hotpy.org>
 <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>
 <555DFDB4.7050503@hotpy.org>
Message-ID: <CAP7+vJJHZj0=3MgkNLQKtkY2AF-H_9t_ah68jSW4Q74PX9oGtQ@mail.gmail.com>

Things are looking up. I think we're down to a very small number of issues
where we still disagree -- hopefully you'll allow me some leeway. :-)

On Thu, May 21, 2015 at 8:45 AM, Mark Shannon <mark at hotpy.org> wrote:

>
>
> On 21/05/15 16:01, Guido van Rossum wrote:
>
>> Hi Mark,
>>
>> We're down to the last few items here. I'm CC'ing python-dev so folks
>> can see how close we are. I'll answer point by point.
>>
>> On Thu, May 21, 2015 at 6:24 AM, Mark Shannon <mark at hotpy.org
>> <mailto:mark at hotpy.org>> wrote:
>>
>>     Hi,
>>
>>     The PEP itself is looking fairly good.
>>
>>
>> I hope you'll accept it at least provisionally so we can iterate over
>> the finer points while a prototype of typing.py is in beta 1.
>>
>>     However, I don't think that typing.py is ready yet, for a number of
>>     reasons:
>>
>>     1.
>>     As I've said before, there needs to be a distinction between classes
>>     and types.
>>     There is no need for Any, Generic, Generic's subtypes, or Union to
>>     subclass builtins.type.
>>
>>
>> I strongly disagree. They can appear in many positions where real
>> classes are acceptable, in particular annotations can have classes (e.g.
>> int) or types (e.g. Union[int, str]).
>>
>
> Why does this mean that they have to be classes? Annotations can be any
> object.
>

I want to encourage users to think about annotations as types, and for most
users the distinction between type and class is too subtle, so a simpler
rule is to say they are classes. This works out nicely when the annotations
are simple types such as 'int' or 'str' or user-defined classes (e.g.
'Employee').


> It might help to think, not in terms of types being classes, but
> classes being shorthand for the nominal type for that class (from the point
> of view of the checker and type geeks).
> So when the checker sees 'int', it treats it as Type(int).
>

I'm fine with that being the formal interpretation (except that I don't
want to introduce a function named Type()). But it's too subtle for most
users.


> Subtyping is distinct from subclassing;
> Type(int) <: Union[Type(int), Type(str)]
> has no parallel in subclassing.
> There is no class that corresponds to a Union, Any or a Generic.
>

Again, for most people the distinction is too subtle. People expect to be
able to play around with things interactively. I think it will be helpful
if they can experiment with the objects exported by typing too:
experimenting with things like isinstance(42, Union[int, str]) or
issubclass(Any, Employee) and issubclass(Employee, Any) is a useful thing
to explore how these things work (always with the caveat that when Any is
involved, issubclass is not transitive). Of course it won't work when they
advance to type variables -- at that point you just *have* to understand
the theory and switch from using the interactive interpreter to writing
small test programs and seeing how mypy (or some other checker) responds to
them.


> In order to support the
> class C(ParameterType[T]): pass
>

I presume you mean class C(Generic[T])?


> syntax, parametric types do indeed need to be classes, but Python has
> multiple inheritance, so that's not a problem:
> class ParameterType(type, Type): ...
> Otherwise typing.Types shouldn't be builtin.types and vice versa.
>

There's one thing here that Jukka has convinced me of. While I really want
Union[...] to act like a class (though not subclassable!), plain Union
(without the [...]) needn't. The same is true for Callable and Tuple
without [...]. I've filed https://github.com/ambv/typehinting/issues/133
for this. I'm not sure how much work it will be to fix this but I don't
think it absolutely needs to be done in beta 1 -- there's not much you can
do with them anyway.


> I think a lot of these issues on the tracker would not have been issues had
> the distinction been more clearly enforced.
>
>
>>     Playing around with typing.py, it has also become clear to me that it
>>     is also important to distinguish type constructors from types.
>>
>>     What do I mean by a type constructor?
>>     A type constructor makes types.
>>     "List" is an example of a type constructor. It constructs types such
>>     as List[T] and List[int].
>>     Saying that something is a List (as opposed to a list) should be
>>     rejected.
>>
>>
>> The PEP actually says that plain List (etc.) is equivalent to List[Any].
>> (Well, at least that's the intention; it's implied by the section about
> the equivalence between Node() and Node[Any]().)
>>
>
> Perhaps we should change that. Using 'List', rather than 'list' or
> 'List[Any]' suggests an error, or misunderstanding, to me.
>
> Is there a use case where 'List' is needed, and 'list' will not suffice?
> I'm assuming that the type checker knows that 'list' is a MutableSequence.
>

I think it's easier if we ask people to always write 'List' rather than
'list' when they are talking about types, and 'List[Any]' will probably be
a popular type (lots of people don't want to think about exactly what the
item type is, but they are sure that the container is a list).

There's also an argument from consistency with the collection ABCs. As you
know, typing defines a bunch of types that act as "stand ins" for the
corresponding ABCs defined in collections.abc (Iterable, Sequence, Sized,
etc.). The intention here is that anywhere one of the collection ABCs is
valid it should be okay to use the corresponding class imported from typing
-- so that if you have code that currently uses "from collections.abc
import Sequence, Mapping" you can just replace that with "from typing
import Sequence, Mapping" and your code will still work. (You can then
iterate at leisure on parametrizing the types.)

So we can use e.g. Sequence as a base class and it means the same as
Sequence[Any]. Given this rule, it would be somewhat surprising if you
couldn't use List but were forced to write List[Any] in other places.
(Neither Sequence[Any] nor List[Any] can be instantiated so that's not a
concern.)
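The stand-in behavior is easy to verify interactively (with unparametrized forms only):

```python
from typing import Sequence

# The typing alias passes the same ABC checks as collections.abc.Sequence,
# so "from typing import Sequence" is a drop-in replacement in code that
# previously imported from collections.abc.
print(isinstance([1, 2], Sequence))   # True
print(isinstance(42, Sequence))       # False
```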


>
>
>>     2.
>>     Usability of typing as it stands:
>>
>>     Let's try to make a class that implements a mutable mapping.
>>
>>      >>> import typing as tp
>>     #Make some variables.
>>      >>> T = tp.TypeVar('T')
>>      >>> K = tp.TypeVar('K')
>>      >>> V = tp.TypeVar('V')
>>
>>     #Then make our class:
>>
>>      >>> class MM(tp.MutableMapping): pass
>>     ...
>>     #Oh that worked, but it shouldn't. MutableMapping is a type
>> constructor.
>>
>>
>> It means MutableMapping[Any].
>>
>>     #Let's make one
>>      >>> MM()
>>     Traceback (most recent call last):
>>        File "<stdin>", line 1, in <module>
>>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>>     line 1095, in __new__
>>          if _gorg(c) is Generic:
>>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>>     line 887, in _gorg
>>          while a.__origin__ is not None:
>>     AttributeError: type object 'Sized' has no attribute '__origin__'
>>
>>     # ???
>>
>>
>> Sorry, that's a bug I introduced in literally the last change to
>> typing.py. I will fix it. The expected behavior is
>>
>> TypeError: Can't instantiate abstract class MM with abstract methods
>> __len__
>>
>>     #Well let's try using type variables.
>>     class MM2(tp.MutableMapping[K, V]): pass
>>     ...
>>      >>> MM2()
>>     Traceback (most recent call last):
>>        File "<stdin>", line 1, in <module>
>>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>>     line 1095, in __new__
>>          if _gorg(c) is Generic:
>>        File "/home/mark/repositories/typehinting/prototyping/typing.py",
>>     line 887, in _gorg
>>          while a.__origin__ is not None:
>>     AttributeError: type object 'Sized' has no attribute '__origin__'
>>
>>
>> Ditto, and sorry.
>>
> No need to apologise, I'm just a bit worried about how easy it was for me
> to expose this sort of bug.
>

Well, I'm just glad you exposed it so soon after I introduced it. :-)

>
>
>>
>>     At this point, we have to resort to using 'Dict', which forces us to
>>     subclass 'dict' which may not be what we want as it may cause
>>     metaclass conflicts.
>>
>>     3.
>>     Memory consumption is also a worry. There is no caching, which means
>>     every time I use "List[int]" as an annotation, a new class object is
>>     created. Each class may only be a few KB, but collectively this
>>     could easily add up to several MBs.
>>     This should be easy to fix.
>>
>>
>> I can work on this after the beta-1 release. Until then, type aliases
>> can be used to avoid redundant type creation (and often they are clearer
>> anyway :-).
>>
> Sure.
>

OK. Tracking this in https://github.com/ambv/typehinting/issues/130

[...]

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150521/d9db4af5/attachment-0001.html>

From ericsnowcurrently at gmail.com  Thu May 21 23:52:52 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 15:52:52 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E1311.5090105@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
Message-ID: <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>

Good catch.  Unfortunately, sticking "keys = ((PyDictObject
*)od)->ma_keys;" right after "hash = ..." did not make a difference.
I still get the same segfault.

-eric

On Thu, May 21, 2015 at 11:17 AM, MRAB <python at mrabarnett.plus.com> wrote:
> On 2015-05-21 15:55, Eric Snow wrote:
>>
>> (see http://bugs.python.org/issue16991)
>>
>> I am working on resolving an intermittent segfault that my C
>> OrderedDict patch introduces.  The failure happens in
>> test_configparser (RawConfigParser uses OrderedDict internally), but
>> only sporadically.  However, Ned pointed out to me that it appears to
>> be related to hash randomization, which I have verified.  I'm looking
>> into it.
>>
>> In the meantime, here's a specific question.  What would lead to the
>> pattern of failures I'm seeing?  I've verified that the segfault
>> happens consistently for certain hash randomization seeds and never
>> for the rest.  I don't immediately recognize the pattern but expect
>> that it would shed some light on where the problem lies.  I ran the
>> following command with the OrderedDict patch applied:
>>
>>    for i in `seq 1 100`; do echo $i; PYTHONHASHSEED=$i ./python -m
>> test.regrtest -m test_basic test_configparser ; done
>>
>> Through 100 I get segfaults with seeds of 7, 15, 35, 37, 39, 40, 42,
>> 47, 50, 66, 67, 85, 87, 88, and 92.  I expect the distribution across
>> all seeds is uniform, but I haven't verified that.
>>
>> Thoughts?
>>
> In "_odict_get_index", for example (there are others), you're caching
> "ma_keys":
>
>     PyDictKeysObject *keys = ((PyDictObject *)od)->ma_keys;
>
> If it resizes, you go back to the label "start", which is after that
> line, but could "ma_keys" change when it's resized?
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com

From python at mrabarnett.plus.com  Fri May 22 00:06:25 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Thu, 21 May 2015 23:06:25 +0100
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>	<555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
Message-ID: <555E56E1.6060904@mrabarnett.plus.com>

On 2015-05-21 22:52, Eric Snow wrote:
 > Good catch.  Unfortunately, sticking "keys = ((PyDictObject
 > *)od)->ma_keys;" right after "hash = ..." did not make a difference.
 > I still get the same segfault.

So, does it change sometimes?

 >
 > On Thu, May 21, 2015 at 11:17 AM, MRAB <python at mrabarnett.plus.com> 
wrote:
 > > On 2015-05-21 15:55, Eric Snow wrote:
 > >>
 > >> (see http://bugs.python.org/issue16991)
 > >>
 > >> I am working on resolving an intermittent segfault that my C
 > >> OrderedDict patch introduces.  The failure happens in
 > >> test_configparser (RawConfigParser uses OrderedDict internally), but
 > >> only sporadically.  However, Ned pointed out to me that it appears to
 > >> be related to hash randomization, which I have verified.  I'm looking
 > >> into it.
 > >>
 > >> In the meantime, here's a specific question.  What would lead to the
 > >> pattern of failures I'm seeing?  I've verified that the segfault
 > >> happens consistently for certain hash randomization seeds and never
 > >> for the rest.  I don't immediately recognize the pattern but expect
 > >> that it would shed some light on where the problem lies.  I ran the
 > >> following command with the OrderedDict patch applied:
 > >>
 > >>    for i in `seq 1 100`; do echo $i; PYTHONHASHSEED=$i ./python -m
 > >> test.regrtest -m test_basic test_configparser ; done
 > >>
 > >> Through 100 I get segfaults with seeds of 7, 15, 35, 37, 39, 40, 42,
 > >> 47, 50, 66, 67, 85, 87, 88, and 92.  I expect the distribution across
 > >> all seeds is uniform, but I haven't verified that.
 > >>
 > >> Thoughts?
 > >>
 > > In "_odict_get_index", for example (there are others), you're caching
 > > "ma_keys":
 > >
 > >     PyDictKeysObject *keys = ((PyDictObject *)od)->ma_keys;
 > >
 > > If it resizes, you go back to the label "start", which is after that
 > > line, but could "ma_keys" change when it's resized?
 > >


From ericsnowcurrently at gmail.com  Fri May 22 00:17:39 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 16:17:39 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E56E1.6060904@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
 <555E56E1.6060904@mrabarnett.plus.com>
Message-ID: <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>

On Thu, May 21, 2015 at 4:06 PM, MRAB <python at mrabarnett.plus.com> wrote:
> On 2015-05-21 22:52, Eric Snow wrote:
>> Good catch.  Unfortunately, sticking "keys = ((PyDictObject
>> *)od)->ma_keys;" right after "hash = ..." did not make a difference.
>> I still get the same segfault.
>
> So, does it change sometimes?

The segfault is consistent if I use the same seed (e.g. 7):

  PYTHONHASHSEED=7 ./python -m test.regrtest -m test_basic test_configparser

Some seeds always segfault and some seeds never segfault.

-eric
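(For convenience, Eric's shell loop could be sketched as a small Python helper; `sweep_seeds` is invented here for illustration and is not part of the test suite.)

```python
# Hypothetical helper mirroring the shell loop: run a command once per
# PYTHONHASHSEED value and report which seeds kill the process with a signal.
import os
import subprocess

def sweep_seeds(cmd, seeds):
    """Return the seeds for which `cmd` dies on a signal (e.g. SIGSEGV)."""
    crashing = []
    for seed in seeds:
        env = dict(os.environ, PYTHONHASHSEED=str(seed))
        proc = subprocess.run(cmd, env=env,
                              stdout=subprocess.DEVNULL,
                              stderr=subprocess.DEVNULL)
        if proc.returncode < 0:  # negative returncode == killed by a signal
            crashing.append(seed)
    return crashing
```

For example, `sweep_seeds(["./python", "-m", "test.regrtest", "-m", "test_basic", "test_configparser"], range(1, 101))` would reproduce the sweep above.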

From python at mrabarnett.plus.com  Fri May 22 00:41:47 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Thu, 21 May 2015 23:41:47 +0100
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>	<555E1311.5090105@mrabarnett.plus.com>	<CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>	<555E56E1.6060904@mrabarnett.plus.com>
 <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
Message-ID: <555E5F2B.5030004@mrabarnett.plus.com>

On 2015-05-21 23:17, Eric Snow wrote:
 > On Thu, May 21, 2015 at 4:06 PM, MRAB <python at mrabarnett.plus.com> wrote:
 > > On 2015-05-21 22:52, Eric Snow wrote:
 > >> Good catch.  Unfortunately, sticking "keys = ((PyDictObject
 > >> *)od)->ma_keys;" right after "hash = ..." did not make a difference.
 > >> I still get the same segfault.
 > >
 > > So, does it change sometimes?
 >
 > The segfault is consistent if I use the same seed (e.g. 7):
 >
 >   PYTHONHASHSEED=7 ./python -m test.regrtest -m test_basic 
test_configparser
 >
 > Some seeds always segfault and some seeds never segfault.
 >
OK, another thought.

In "_odict_get_index" again, you say that if the hash has changed, the
dict might've been resized, but could the dict be resized _without_ the
hash changing?

Could the value of "keys" still become invalid even if the hash is the same?


From ericsnowcurrently at gmail.com  Fri May 22 01:22:58 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 17:22:58 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E5F2B.5030004@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
 <555E56E1.6060904@mrabarnett.plus.com>
 <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
 <555E5F2B.5030004@mrabarnett.plus.com>
Message-ID: <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>

On Thu, May 21, 2015 at 4:41 PM, MRAB <python at mrabarnett.plus.com> wrote:
> On 2015-05-21 23:17, Eric Snow wrote:
>> The segfault is consistent if I use the same seed (e.g. 7):
>>
>>   PYTHONHASHSEED=7 ./python -m test.regrtest -m test_basic
>> test_configparser
>>
>> Some seeds always segfault and some seeds never segfault.
>>
> OK, another thought.
>
> In "_odict_get_index" again, you say that if the hash has changed, the
> dict might've been resized, but could the dict be resized _without_ the
> hash changing?
>
> Could the value of "keys" still become invalid even if the hash is the same?

Good question.  The only way I can see here that the dict would resize
is during re-entrance to the interpreter eval loop via Python code
potentially triggered through the PyObject_Hash call.

Also, there's no check for a changed hash.  The code compares the size
of ma_keys (effectively the dict keys hash table) against the size of
the odict "fast nodes" table.

-eric
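(The re-entrancy Eric describes is easy to provoke from pure Python: an object's `__hash__` can run arbitrary code that mutates, and resizes, the very dict being probed. A contrived sketch; `EvilKey` is invented for illustration, and the built-in dict is shown coping with it.)

```python
# Contrived illustration of the hazard: __hash__ re-enters the eval loop and
# resizes the dict mid-operation.  The built-in dict copes; the question in
# this thread is whether the C OrderedDict's cached table pointers also do.
class EvilKey:
    def __init__(self, victim):
        self.victim = victim

    def __hash__(self):
        # Grow the dict while an operation involving this key is in flight,
        # forcing at least one resize of its internal key table.
        for i in range(32):
            self.victim[i] = i
        return 1

    def __eq__(self, other):
        return self is other

d = {}
key = EvilKey(d)
d[key] = "value"            # __hash__ resizes d during this insertion
assert d[key] == "value"    # the plain dict survives the re-entrant resize
assert len(d) == 33         # 32 integer keys plus EvilKey
```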

From greg.ewing at canterbury.ac.nz  Fri May 22 01:33:10 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 22 May 2015 11:33:10 +1200
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <555E6B36.9020402@canterbury.ac.nz>

Steve Dower wrote:
> It's only a macro system when you generate code in unexpected/unobvious 
> places with it. This is more like inspect.getsource(), but going 
> straight to the AST.

Is it really that much different? The end result is
the same -- the user writes something that looks like
a Python expression, but it gets evaluated using some
other set of semantics that can be arbitrarily different
from Python's.

-- 
Greg

From python at mrabarnett.plus.com  Fri May 22 01:55:55 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Fri, 22 May 2015 00:55:55 +0100
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>	<555E1311.5090105@mrabarnett.plus.com>	<CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>	<555E56E1.6060904@mrabarnett.plus.com>	<CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>	<555E5F2B.5030004@mrabarnett.plus.com>
 <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>
Message-ID: <555E708B.2000203@mrabarnett.plus.com>



On 2015-05-22 00:22, Eric Snow wrote:
> On Thu, May 21, 2015 at 4:41 PM, MRAB <python at mrabarnett.plus.com> wrote:
> > On 2015-05-21 23:17, Eric Snow wrote:
> >> The segfault is consistent if I use the same seed (e.g. 7):
> >>
> >>   PYTHONHASHSEED=7 ./python -m test.regrtest -m test_basic
> >> test_configparser
> >>
> >> Some seeds always segfault and some seeds never segfault.
> >>
> > OK, another thought.
> >
> > In "_odict_get_index" again, you say that if the hash has changed, the
> > dict might've been resized, but could the dict be resized _without_ the
> > hash changing?
> >
> > Could the value of "keys" still become invalid even if the hash is the same?
>
> Good question.  The only way I can see here that the dict would resize
> is during re-entrance to the interpreter eval loop via Python code
> potentially triggered through the PyObject_Hash call.
>
> Also, there's no check for a changed hash.  The code compares the size
> of ma_keys (effectively the dict keys hash table) against the size of
> the odict "fast nodes" table.
Ah, OK.

I'm now looking at the use of "PyTuple_Pack". As I understand it,
"PyTuple_Pack" borrows the references of the objects passed, and when
the tuple itself is DECREFed, those objects will be DECREFed.

"odict_reduce" calls "PyTuple_Pack", passing 1 or 2 references to
Py_None which aren't INCREFed first, so could there be a bug there?
(There might be similar issues in other functions.)

From ethan at stoneleaf.us  Fri May 22 02:12:08 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 21 May 2015 17:12:08 -0700
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E6B36.9020402@canterbury.ac.nz>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz>
Message-ID: <555E7458.3040207@stoneleaf.us>

On 05/21/2015 04:33 PM, Greg Ewing wrote:
> Steve Dower wrote:
>>
>> It's only a macro system when you generate code in unexpected/unobvious places with it. This is more like inspect.getsource(), but going straight to the AST.
>
> Is it really that much different? The end result is
> the same -- the user writes something that looks like
> a Python expression, but it gets evaluated using some
> other set of semantics that can be arbitrarily different
> from Python's.

I think the key difference is that the AST is not going to be converted to run different Python code under Python, but under some other language -- presumably to implement the semantics of the Python snippet.

--
~Ethan~

From ericsnowcurrently at gmail.com  Fri May 22 02:12:45 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 18:12:45 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E708B.2000203@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
 <555E56E1.6060904@mrabarnett.plus.com>
 <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
 <555E5F2B.5030004@mrabarnett.plus.com>
 <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>
 <555E708B.2000203@mrabarnett.plus.com>
Message-ID: <CALFfu7DoXrU0_sffOmF-E+ensr1iK51chWQ5y=bZEj-fmd+rkw@mail.gmail.com>

On Thu, May 21, 2015 at 5:55 PM, MRAB <python at mrabarnett.plus.com> wrote:
> I'm now looking at the use of "PyTuple_Pack". As I understand it,
> "PyTuple_Pack" borrows the references of the objects passed, and when
> the tuple itself is DECREFed, those objects will be DECREFed.

From the docs [1] it seems that PyTuple_Pack does not steal any
references and it returns a new reference.  Perhaps you were thinking
of PyTuple_SetItem (and PyTuple_SET_ITEM)?

[1] https://docs.python.org/3.5//c-api/tuple.html

>
> "odict_reduce" calls "PyTuple_Pack", passing 1 or 2 references to Py_None
> which aren't INCREFed
> first, so could there be a bug there? (There might be similar issues in
> other functions.)

Alas, I don't think it is. :(

I'll point out that the configparser test in question does a lot of
resizes.  It may be that the problem only surfaces after many resizes
and apparently only for certain hash randomization seeds.  At the
moment I'm looking at how hash randomization impacts resizing.  I'm
certainly seeing that the resizes happen at different item counts
depending on the seed.

-eric
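(One crude, CPython-specific way to watch where a growing dict actually reallocates its table is via `sys.getsizeof`; `resize_points` is a hypothetical helper, useful only as a baseline for comparing runs under different seeds.)

```python
# Record the item counts at which a growing dict reallocates its backing
# table, as seen through sys.getsizeof.  CPython- and version-dependent,
# but enough to compare runs side by side.
import sys

def resize_points(n=1000):
    d = {}
    points = []
    prev = sys.getsizeof(d)
    for i in range(n):
        d[str(i)] = i
        cur = sys.getsizeof(d)
        if cur != prev:
            points.append(len(d))
            prev = cur
    return points
```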

From python at mrabarnett.plus.com  Fri May 22 02:22:41 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Fri, 22 May 2015 01:22:41 +0100
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <CALFfu7DoXrU0_sffOmF-E+ensr1iK51chWQ5y=bZEj-fmd+rkw@mail.gmail.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>	<555E1311.5090105@mrabarnett.plus.com>	<CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>	<555E56E1.6060904@mrabarnett.plus.com>	<CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>	<555E5F2B.5030004@mrabarnett.plus.com>	<CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>	<555E708B.2000203@mrabarnett.plus.com>
 <CALFfu7DoXrU0_sffOmF-E+ensr1iK51chWQ5y=bZEj-fmd+rkw@mail.gmail.com>
Message-ID: <555E76D1.4040901@mrabarnett.plus.com>

On 2015-05-22 01:12, Eric Snow wrote:
> On Thu, May 21, 2015 at 5:55 PM, MRAB <python at mrabarnett.plus.com> wrote:
> > I'm not looking at the use of "PyTuple_Pack". As I understand it,
> > "PyTuple_Pack" borrows the
> > references of the objects passed, and when the tuple itself is DECREFed,
> > those objects will be
> > DECREFed
>
> >From the docs [1] it seems that PyTuple_Pack does not steal any
> references and it returns a new reference.  Perhaps you were thinking
> of PyTuple_SetItem (and PyTuple_SET_ITEM)?
>
> [1] https://docs.python.org/3.5//c-api/tuple.html
>
> >
> > "odict_reduce" calls "PyTuple_Pack", passing 1 or 2 references to Py_None
> > which aren't INCREFed
> > first, so could there be a bug there? (There might be similar issues in
> > other functions.)
>
> Alas, I don't think it is. :(
I'd come to the same conclusion.

Oh, well, I'll keep looking...
> I'll point out that the configparser test in question does a lot of
> resizes.  It may be that the problem only surfaces after many resizes
> and apparently only for certain hash randomization seeds.  At the
> moment I'm looking at how hash randomization impacts resizing.  I'm
> certainly seeing that the resizes happen at different item counts
> depending on the seed.
>

From greg.ewing at canterbury.ac.nz  Fri May 22 02:28:54 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 22 May 2015 12:28:54 +1200
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E7458.3040207@stoneleaf.us>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz> <555E7458.3040207@stoneleaf.us>
Message-ID: <555E7846.9030707@canterbury.ac.nz>

Ethan Furman wrote:
> I think the key difference is that the AST is not going to be converted 
> to run different Python code under Python, but under some other language 
> -- presumably to implement the semantics of the Python snippet.

If the semantics were exactly the same as the Python
snippet, there would be no need to convert it to another
language -- you might as well just run the Python
code as-is.

The whole point of this kind of facility is to express
things that you *can't* express the way you would like
using standard Python semantics.

From the user's point of view, it doesn't matter whether
the implementation works by generating Python code, or
generating some other language, or processing the AST
directly. The effect is to assign non-Python semantics
to Python syntax.

(At least it *is* still Python syntax -- I can understand
Guido being wary of letting people redefine the syntax
as well.)

-- 
Greg

From ericsnowcurrently at gmail.com  Fri May 22 02:30:01 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 18:30:01 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E76D1.4040901@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
 <555E56E1.6060904@mrabarnett.plus.com>
 <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
 <555E5F2B.5030004@mrabarnett.plus.com>
 <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>
 <555E708B.2000203@mrabarnett.plus.com>
 <CALFfu7DoXrU0_sffOmF-E+ensr1iK51chWQ5y=bZEj-fmd+rkw@mail.gmail.com>
 <555E76D1.4040901@mrabarnett.plus.com>
Message-ID: <CALFfu7A7z-DvkOEWT_5e58WYJMRsgMv0_k=4S+-EoANoasEE-g@mail.gmail.com>

On Thu, May 21, 2015 at 6:22 PM, MRAB <python at mrabarnett.plus.com> wrote:
> Oh, well, I'll keep looking...

Thanks!

-eric

From pludemann at google.com  Fri May 22 02:29:17 2015
From: pludemann at google.com (Peter Ludemann)
Date: Thu, 21 May 2015 17:29:17 -0700
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E7458.3040207@stoneleaf.us>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz> <555E7458.3040207@stoneleaf.us>
Message-ID: <CACsRUKKKWKmnJr-0sqb4+HfgYb1yus1HV-15b1Q8gh_B2YMBvA@mail.gmail.com>

On 21 May 2015 at 17:12, Ethan Furman <ethan at stoneleaf.us> wrote:

> On 05/21/2015 04:33 PM, Greg Ewing wrote:
>
>> Steve Dower wrote:
>>
>>>
>>> It's only a macro system when you generate code in unexpected/unobvious
>>> places with it. This is more like inspect.getsource(), but going straight
>>> to the AST.
>>>
>>
>> Is it really that much different? The end result is
>> the same -- the user writes something that looks like
>> a Python expression, but it gets evaluated using some
>> other set of semantics that can be arbitrarily different
>> from Python's.
>>
>
> I think the key difference is that the AST is not going to be converted to
> run different Python code under Python, but under some other language --
> presumably to implement the semantics of the Python snippet.
>

As a simple example, a "macro" with access to the AST could decide to not
evaluate something, whereas normal Python rules would be to evaluate
(similar to wrapping with LISP QUOTE or LAMBDA). This would make PEP484
simpler (e.g., no need for special handling of "forward" references to
types).


>
> --
> ~Ethan~
>

From Steve.Dower at microsoft.com  Fri May 22 02:45:05 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Fri, 22 May 2015 00:45:05 +0000
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E7846.9030707@canterbury.ac.nz>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz>
 <555E7458.3040207@stoneleaf.us>,<555E7846.9030707@canterbury.ac.nz>
Message-ID: <BY1PR03MB146658775E3A32B91EC25439F5C00@BY1PR03MB1466.namprd03.prod.outlook.com>

The semantics could be the same while the execution plan is different, just like numba compiled code runs with the same semantics as the original.

A better way of getting the AST than decompiling byte code is all that's being asked for. Maybe not easy to do in the general case, but certainly not an unreasonable request.

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Greg Ewing<mailto:greg.ewing at canterbury.ac.nz>
Sent: 5/21/2015 17:29
To: python-dev at python.org<mailto:python-dev at python.org>
Subject: Re: [Python-Dev] Enable access to the AST for Python code

Ethan Furman wrote:
> I think the key difference is that the AST is not going to be converted
> to run different Python code under Python, but under some other language
> -- presumably to implement the semantics of the Python snippet.

If the semantics were exactly the same as the Python
snippet, there would be no need to convert it to another
language -- you might as well just run the Python
code as-is.

The whole point of this kind of facility is to express
things that you *can't* express the way you would like
using standard Python semantics.

From the user's point of view, it doesn't matter whether
the implementation works by generating Python code, or
generating some other language, or processing the AST
directly. The effect is to assign non-Python semantics
to Python syntax.

(At least it *is* still Python syntax -- I can understand
Guido being wary of letting people redefine the syntax
as well.)

--
Greg

From benhoyt at gmail.com  Fri May 22 03:19:27 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Thu, 21 May 2015 21:19:27 -0400
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
Message-ID: <CAL9jXCG=eOeJdRbPKy=sAOhX_SkwJ3ziuGJiCO+4y1zNPEHGHg@mail.gmail.com>

FYI, I've re-posted this on python-ideas now:
https://mail.python.org/pipermail/python-ideas/2015-May/033621.html

-Ben

On Wed, May 20, 2015 at 10:31 PM, Guido van Rossum <guido at python.org> wrote:
> Hey Ben, this is probably a better topic for python-ideas. I'll warn you
> that a hurdle for ideas like this is that ideally you don't want to support
> this just for CPython. It's definitely cool though! (Using movie poster
> style quotes you can turn this into a ringing endorsement: "definitely cool"
> -- The BDFL. :-)
>
> On Wed, May 20, 2015 at 7:26 PM, Ben Hoyt <benhoyt at gmail.com> wrote:
>>
>> Hi Python devs,
>>
>> Enabling access to the AST for compiled code would make some cool
>> things possible (C# LINQ-style ORMs, for example), and not knowing too
>> much about this part of Python internals, I'm wondering how possible
>> and practical this would be.
>>
>> Context: PonyORM (http://ponyorm.com/) allows you to write regular
>> Python generator expressions like this:
>>
>>     select(c for c in Customer if sum(c.orders.price) > 1000)
>>
>> which compile into and run SQL like this:
>>
>>     SELECT "c"."id"
>>     FROM "Customer" "c"
>>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>>     GROUP BY "c"."id"
>>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
>>
>> I think the Pythonic syntax here is beautiful. But the tricks PonyORM
>> has to go to get it are ... not quite so beautiful. Because the AST is
>> not available, PonyORM decompiles Python bytecode into an AST first,
>> and then converts that to SQL. (More details on all that from author's
>> EuroPython talk at http://pyvideo.org/video/2968)
>>
>> I believe PonyORM needs the AST just for generator expressions and
>> lambda functions, but obviously if this kind of AST access feature
>> were in Python it'd probably be more general.
>>
>> I believe C#'s LINQ provides something similar, where if you're
>> developing a LINQ converter library (say LINQ to SQL), you essentially
>> get the AST of the code ("expression tree") and the library can do
>> what it wants with that.
>>
>> What would it take to enable this kind of AST access in Python? Is it
>> possible? Is it a good idea?
>>
>> -Ben
>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
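(Today the closest approximation to what Ben asks for is re-parsing source text with the `ast` module; the proposal is to get the same tree for already-compiled generator expressions without the source round-trip. A sketch of the tree an ORM would want:)

```python
# Parse the PonyORM-style query as text and pull out the generator expression
# an ORM would translate to SQL.  This only works because we still have the
# source; the request is to get the same tree from compiled code directly.
import ast

src = "select(c for c in Customer if sum(c.orders.price) > 1000)"
tree = ast.parse(src, mode="eval")

call = tree.body                      # the select(...) call
genexp = call.args[0]                 # the generator expression inside it
assert isinstance(genexp, ast.GeneratorExp)

comp = genexp.generators[0]
assert isinstance(comp.iter, ast.Name) and comp.iter.id == "Customer"
assert isinstance(comp.ifs[0], ast.Compare)   # the sum(...) > 1000 filter
```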

From ethan at stoneleaf.us  Fri May 22 03:33:20 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 21 May 2015 18:33:20 -0700
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E7846.9030707@canterbury.ac.nz>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz> <555E7458.3040207@stoneleaf.us>
 <555E7846.9030707@canterbury.ac.nz>
Message-ID: <555E8760.8050404@stoneleaf.us>

On 05/21/2015 05:28 PM, Greg Ewing wrote:
> Ethan Furman wrote:
>>
>> I think the key difference is that the AST is not going to be
>>  converted to run different Python code under Python, but under
>>  some other language -- presumably to implement the semantics of
>> the Python snippet.
>
> If the semantics were exactly the same as the Python
> snippet, there would be no need to convert it to another
> language -- you might as well just run the Python
> code as-is.

Going back to the OP:

> Context: PonyORM (http://ponyorm.com/) allows you to write regular
> Python generator expressions like this:
>
>     select(c for c in Customer if sum(c.orders.price) > 1000)
>
> which compile into and run SQL like this:
>
>     SELECT "c"."id"
>     FROM "Customer" "c"
>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>     GROUP BY "c"."id"
>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000

That last code is /not/ Python.  ;)

--
~Ethan~

From greg.ewing at canterbury.ac.nz  Fri May 22 04:06:54 2015
From: greg.ewing at canterbury.ac.nz (Greg)
Date: Fri, 22 May 2015 14:06:54 +1200
Subject: [Python-Dev] Enable access to the AST for Python code
In-Reply-To: <555E8760.8050404@stoneleaf.us>
References: <CAL9jXCGCYTRFan3gb8WV6Et5ckr4D91o9uSpqvUuT6BK=4c13A@mail.gmail.com>
 <CAP7+vJ+Tqb_We68EpD2ms61_fkSV+_=waHOA98tU+x0Z13A8tg@mail.gmail.com>
 <555D6DFD.1070602@canterbury.ac.nz>
 <BY1PR03MB1466B550282CCDF1681981ABF5C10@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555E6B36.9020402@canterbury.ac.nz> <555E7458.3040207@stoneleaf.us>
 <555E7846.9030707@canterbury.ac.nz> <555E8760.8050404@stoneleaf.us>
Message-ID: <555E8F3E.2060707@canterbury.ac.nz>

On 22/05/2015 1:33 p.m., Ethan Furman wrote:
> Going back to the OP:
>
>>     select(c for c in Customer if sum(c.orders.price) > 1000)
>>
>> which compile into and run SQL like this:
>>
>>     SELECT "c"."id"
>>     FROM "Customer" "c"
>>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>>     GROUP BY "c"."id"
>>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
>
> That last code is /not/ Python.  ;)

More importantly, it's not Python *semantics*. You can't view
it as simply a translation of the Python expression into a
different language.

I still think this is really a macro facility by a different
name. I'm not saying that's a bad thing, just pointing it out.

The main difference is that a macro would (or at least could)
be expanded at compile time, whereas this would require
processing the AST each time it's used.

-- 
Greg


From ericsnowcurrently at gmail.com  Fri May 22 04:42:28 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 21 May 2015 20:42:28 -0600
Subject: [Python-Dev] segfaults due to hash randomization in C
	OrderedDict
In-Reply-To: <555E76D1.4040901@mrabarnett.plus.com>
References: <CALFfu7AdAOMYjVGTMbn=2=OOZK3Vbz05eugncrjb-rnp36mWGg@mail.gmail.com>
 <555E1311.5090105@mrabarnett.plus.com>
 <CALFfu7BNjoo641veBkaJ1Uy6GP_ZttxF2iywPXCKCUsfJy5Jxw@mail.gmail.com>
 <555E56E1.6060904@mrabarnett.plus.com>
 <CALFfu7Cn-B7KCO2aOa-_BvkJav=s-Nqqk0EPH3nf6qbBcnLffg@mail.gmail.com>
 <555E5F2B.5030004@mrabarnett.plus.com>
 <CALFfu7C6cqjJ6Gb2WkJCZUZ-XP-_AYzjVJY6A8QJ+xEf=DVFCw@mail.gmail.com>
 <555E708B.2000203@mrabarnett.plus.com>
 <CALFfu7DoXrU0_sffOmF-E+ensr1iK51chWQ5y=bZEj-fmd+rkw@mail.gmail.com>
 <555E76D1.4040901@mrabarnett.plus.com>
Message-ID: <CALFfu7BQ5VhJXVr8EZS35uN5C8kQyt3z96mBXBG4tBztZ8bSeQ@mail.gmail.com>

On Thu, May 21, 2015 at 6:22 PM, MRAB <python at mrabarnett.plus.com> wrote:
> Oh, well, I'll keep looking...

I've posted some data to http://bugs.python.org/issue16991 that I hope
will shed some light on the issue.  We can continue the conversation
there.

-eric

From antti at haapala.name  Fri May 22 07:05:49 2015
From: antti at haapala.name (Antti Haapala)
Date: Fri, 22 May 2015 08:05:49 +0300
Subject: [Python-Dev] Tracker reviews look like spam
In-Reply-To: <CAPTjJmoz+nDE4dXnN7CaVyt7D+L8XY8C27Y-F2Qoj+JGBdojBw@mail.gmail.com>
References: <mittl6$bj3$1@ger.gmane.org> <20150512221524.GC1768@k3>
 <CAPTjJmoz+nDE4dXnN7CaVyt7D+L8XY8C27Y-F2Qoj+JGBdojBw@mail.gmail.com>
Message-ID: <CA+HSgbL3YXmvv3qsbKCDp_8OYnYM1wuGyjXxM+J3EzMxKtRgHA@mail.gmail.com>

There's an issue about this at
http://psf.upfronthosting.co.za/roundup/meta/issue562

I believe the problem is not that of the SPF, but the fact that mail gets
sent using IPv6 from an address that has neither a name mapping to it nor a
reverse pointer from IP address to name in DNS. See the second-to-last
comment, where R. David Murray states that "Mail is consistently sent from
report at bugs.python.org, always from the same IP address, 46.4.197.70.
 46.4.197.70 resolves to bugs.python.org.", which clearly is false.


On 13 May 2015 at 08:20, Chris Angelico <rosuav at gmail.com> wrote:

> On Wed, May 13, 2015 at 8:15 AM, David Wilson <dw+python-dev at hmmz.org>
> wrote:
> > SPF only covers the envelope sender, so it should be possible to set
> > that to something that validates with SPF, keep the RFC822 From: header
> > as it is, and maybe(?) include a separate Sender: header matching the
> > envelope address.
>
> As Cameron says, Sender: isn't necessary - just have the envelope
> address be bounces@ or something and it should be fine. This is how
> SPF and (eg) mailing lists interact.
>
> ChrisA



-- 
Antti Haapala
antti.haapala at iki.fi
http://antti.haapala.name/
+358503693535

From p.andrefreitas at gmail.com  Fri May 22 17:11:09 2015
From: p.andrefreitas at gmail.com (=?UTF-8?Q?Andr=C3=A9_Freitas?=)
Date: Fri, 22 May 2015 15:11:09 +0000
Subject: [Python-Dev] =?utf-8?q?Hello=2C_I_am_Andr=C3=A9_Freitas_=3A=29?=
Message-ID: <CAMkX=YVin0NYSGqs5SdFCxPZp7gthMBsuoDs7Uox4cbKdPsLig@mail.gmail.com>

Hi there,
My name is André Freitas, I'm 22 years old and I live in Portugal (Porto).
I'm currently finishing my Masters in Informatics and Computer Science with
a thesis in Mining Software Repositories, where I am able to predict
defects in Software components: https://github.com/andrefreitas/schwa

I'm a Python developer with 4 years of experience and as a Speaker, did a
lot of Python workshops in Engineering Universities. I'm always learning
new things and I really love Python (it's my religion)! I have skills in
Security, Tests and Quality, Devops, Software Architecture and Engineering,
UI/UX and Compilers.

I am reading your guidelines and just checking around to see how this
mailing list works. Hope to send some patches soon and share some ideas.

Feel free to follow me on Github https://github.com/andrefreitas

Best regards,
André Freitas.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150522/8be1018a/attachment.html>

From guido at python.org  Fri May 22 17:40:29 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 May 2015 08:40:29 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- penultimate(?) draft
In-Reply-To: <CAP7+vJKQu3nmmsV5o0Lj9cOm6Z+epSY7+oZD6rMMsv+xt+SWYA@mail.gmail.com>
References: <CAP7+vJKQu3nmmsV5o0Lj9cOm6Z+epSY7+oZD6rMMsv+xt+SWYA@mail.gmail.com>
Message-ID: <CAP7+vJLhrQ4NLtBY4dvwG3ug7ZapU41YGsbZ5ZyzHFXM28WRBw@mail.gmail.com>

Another draft. This is mostly a bunch of clarifications and minor edits,
but it also removes the four version/platform constants (PY2, PY3, WINDOWS,
POSIX) in favor of asking type checkers to recognize common version checks
e.g. using sys.version_info or sys.platform. This time I think the new
version *will* appear on python.org. For more frequent updates, watch
https://github.com/ambv/typehinting .

Also note: I'm probably going to commit the typing.py module to the CPython
repo optimistically, while Mark is still pondering his decision. Off-list
he's told me he's happy with the PEP. I have to make some changes to
typing.py to satisfy him; I won't have time to work on those this
afternoon, and I don't want to miss (or hold up) Larry's tagging of the
tree for beta 1. So a few things may end up as bugs in the issue tracker (
https://github.com/ambv/typehinting/issues) and I'll rectify those before
beta 2.

--Guido

PEP: 484
Title: Type Hints
Version: $Revision$
Last-Modified: $Date$
Author: Guido van Rossum <guido at python.org>, Jukka Lehtosalo <
jukka.lehtosalo at iki.fi>, Łukasz Langa <lukasz at langa.pl>
BDFL-Delegate: Mark Shannon
Discussions-To: Python-Dev <python-dev at python.org>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 29-Sep-2014
Post-History: 16-Jan-2015,20-Mar-2015,17-Apr-2015,20-May-2015,22-May-2015
Resolution:


Abstract
========

PEP 3107 introduced syntax for function annotations, but the semantics
were deliberately left undefined.  There has now been enough 3rd party
usage for static type analysis that the community would benefit from
a standard vocabulary and baseline tools within the standard library.

This PEP introduces a provisional module to provide these standard
definitions and tools, along with some conventions for situations
where annotations are not available.

Note that this PEP still explicitly does NOT prevent other uses of
annotations, nor does it require (or forbid) any particular processing
of annotations, even when they conform to this specification.  It
simply enables better coordination, as PEP 333 did for web frameworks.

For example, here is a simple function whose argument and return type
are declared in the annotations::

  def greeting(name: str) -> str:
      return 'Hello ' + name

While these annotations are available at runtime through the usual
``__annotations__`` attribute, *no type checking happens at runtime*.
Instead, the proposal assumes the existence of a separate off-line
type checker which users can run over their source code voluntarily.
Essentially, such a type checker acts as a very powerful linter.
(While it would of course be possible for individual users to employ
a similar checker at run time for Design By Contract enforcement or
JIT optimization, those tools are not yet as mature.)

The proposal is strongly inspired by mypy [mypy]_.  For example, the
type "sequence of integers" can be written as ``Sequence[int]``.  The
square brackets mean that no new syntax needs to be added to the
language.  The example here uses a custom type ``Sequence``, imported
from a pure-Python module ``typing``.  The ``Sequence[int]`` notation
works at runtime by implementing ``__getitem__()`` in the metaclass
(but its significance is primarily to an offline type checker).

The type system supports unions, generic types, and a special type
named ``Any`` which is consistent with (i.e. assignable to and from) all
types.  This latter feature is taken from the idea of gradual typing.
Gradual typing and the full type system are explained in PEP 483.

Other approaches from which we have borrowed or to which ours can be
compared and contrasted are described in PEP 482.


Rationale and Goals
===================

PEP 3107 added support for arbitrary annotations on parts of a
function definition.  Although no meaning was assigned to annotations
then, there has always been an implicit goal to use them for type
hinting [gvr-artima]_, which is listed as the first possible use case
in said PEP.

This PEP aims to provide a standard syntax for type annotations,
opening up Python code to easier static analysis and refactoring,
potential runtime type checking, and (perhaps, in some contexts)
code generation utilizing type information.

Of these goals, static analysis is the most important.  This includes
support for off-line type checkers such as mypy, as well as providing
a standard notation that can be used by IDEs for code completion and
refactoring.

Non-goals
---------

While the proposed typing module will contain some building blocks for
runtime type checking -- in particular a useful ``isinstance()``
implementation -- third party packages would have to be developed to
implement specific runtime type checking functionality, for example
using decorators or metaclasses.  Using type hints for performance
optimizations is left as an exercise for the reader.

It should also be emphasized that **Python will remain a dynamically
typed language, and the authors have no desire to ever make type hints
mandatory, even by convention.**


The meaning of annotations
==========================

Any function without annotations should be treated as having the most
general type possible, or ignored, by any type checker.  Functions
with the ``@no_type_check`` decorator or with a ``# type: ignore``
comment should be treated as having no annotations.

It is recommended but not required that checked functions have
annotations for all arguments and the return type.  For a checked
function, the default annotation for arguments and for the return type
is ``Any``.  An exception is that the first argument of instance and
class methods does not need to be annotated; it is assumed to have the
type of the containing class for instance methods, and a type object
type corresponding to the containing class object for class methods.
For example, in class ``A`` the first argument of an instance method
has the implicit type ``A``. In a class method, the precise type of
the first argument cannot be represented using the available type
notation.

(Note that the return type of ``__init__`` ought to be annotated with
``-> None``.  The reason for this is subtle.  If ``__init__`` assumed
a return annotation of ``-> None``, would that mean that an
argument-less, un-annotated ``__init__`` method should still be
type-checked?  Rather than leaving this ambiguous or introducing an
exception to the exception, we simply say that ``__init__`` ought to
have a return annotation; the default behavior is thus the same as for
other methods.)

A type checker is expected to check the body of a checked function for
consistency with the given annotations.  The annotations may also be
used to check the correctness of calls appearing in other checked
functions.

Type checkers are expected to attempt to infer as much information as
necessary.  The minimum requirement is to handle the builtin
decorators ``@property``, ``@staticmethod`` and ``@classmethod``.


Type Definition Syntax
======================

The syntax leverages PEP 3107-style annotations with a number of
extensions described in sections below.  In its basic form, type
hinting is used by filling function annotation slots with classes::

  def greeting(name: str) -> str:
      return 'Hello ' + name

This states that the expected type of the ``name`` argument is
``str``.  Similarly, the expected return type is ``str``.

Expressions whose type is a subtype of a specific argument type are
also accepted for that argument.


Acceptable type hints
---------------------

Type hints may be built-in classes (including those defined in
standard library or third-party extension modules), abstract base
classes, types available in the ``types`` module, and user-defined
classes (including those defined in the standard library or
third-party modules).

While annotations are normally the best format for type hints,
there are times when it is more appropriate to represent them
by a special comment, or in a separately distributed stub
file.  (See below for examples.)

Annotations must be valid expressions that evaluate without raising
exceptions at the time the function is defined (but see below for
forward references).

Annotations should be kept simple or static analysis tools may not be
able to interpret the values. For example, dynamically computed types
are unlikely to be understood.  (This is an
intentionally somewhat vague requirement, specific inclusions and
exclusions may be added to future versions of this PEP as warranted by
the discussion.)

In addition to the above, the following special constructs defined
below may be used: ``None``, ``Any``, ``Union``, ``Tuple``,
``Callable``, all ABCs and stand-ins for concrete classes exported
from ``typing`` (e.g. ``Sequence`` and ``Dict``), type variables, and
type aliases.

All newly introduced names used to support features described in
following sections (such as ``Any`` and ``Union``) are available in
the ``typing`` module.


Using None
----------

When used in a type hint, the expression ``None`` is considered
equivalent to ``type(None)``.


Type aliases
------------

Type aliases are defined by simple variable assignments::

  Url = str

  def retry(url: Url, retry_count: int) -> None: ...

Note that we recommend capitalizing alias names, since they represent
user-defined types, which (like user-defined classes) are typically
spelled that way.

Type aliases may be as complex as type hints in annotations --
anything that is acceptable as a type hint is acceptable in a type
alias::

    from typing import TypeVar, Iterable, Tuple

    T = TypeVar('T', int, float, complex)
    Vector = Iterable[Tuple[T, T]]

    def inproduct(v: Vector) -> T:
        return sum(x*y for x, y in v)

This is equivalent to::

    from typing import TypeVar, Iterable, Tuple

    T = TypeVar('T', int, float, complex)

    def inproduct(v: Iterable[Tuple[T, T]]) -> T:
        return sum(x*y for x, y in v)


Callable
--------

Frameworks expecting callback functions of specific signatures might be
type hinted using ``Callable[[Arg1Type, Arg2Type], ReturnType]``.
Examples::

  from typing import Callable

  def feeder(get_next_item: Callable[[], str]) -> None:
      # Body

  def async_query(on_success: Callable[[int], None],
                  on_error: Callable[[int, Exception], None]) -> None:
      # Body

It is possible to declare the return type of a callable without
specifying the call signature by substituting a literal ellipsis
(three dots) for the list of arguments::

  def partial(func: Callable[..., str], *args) -> Callable[..., str]:
      # Body

Note that there are no square brackets around the ellipsis.  The
arguments of the callback are completely unconstrained in this case
(and keyword arguments are acceptable).

Since using callbacks with keyword arguments is not perceived as a
common use case, there is currently no support for specifying keyword
arguments with ``Callable``.  Similarly, there is no support for
specifying callback signatures with a variable number of argument of a
specific type.


Generics
--------

Since type information about objects kept in containers cannot be
statically inferred in a generic way, abstract base classes have been
extended to support subscription to denote expected types for container
elements.  Example::

  from typing import Mapping, Set

  def notify_by_email(employees: Set[Employee],
                      overrides: Mapping[str, str]) -> None: ...

Generics can be parametrized by using a new factory available in
``typing`` called ``TypeVar``.  Example::

  from typing import Sequence, TypeVar

  T = TypeVar('T')      # Declare type variable

  def first(l: Sequence[T]) -> T:   # Generic function
      return l[0]

In this case the contract is that the returned value is consistent with
the elements held by the collection.

A ``TypeVar()`` expression must always directly be assigned to a
variable (it should not be used as part of a larger expression).  The
argument to ``TypeVar()`` must be a string equal to the variable name
to which it is assigned.  Type variables must not be redefined.

``TypeVar`` supports constraining parametric types to a fixed set of
possible types.  For example, we can define a type variable that ranges
over just ``str`` and ``bytes``.  By default, a type variable ranges
over all possible types.  Example of constraining a type variable::

  from typing import TypeVar

  AnyStr = TypeVar('AnyStr', str, bytes)

  def concat(x: AnyStr, y: AnyStr) -> AnyStr:
      return x + y

The function ``concat`` can be called with either two ``str`` arguments
or two ``bytes`` arguments, but not with a mix of ``str`` and ``bytes``
arguments.

There should be at least two constraints, if any; specifying a single
constraint is disallowed.
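
As an illustrative sketch (relying on the runtime behaviour of the
current ``typing`` module), the single-constraint rule is even
enforced when the ``TypeVar`` is created::

```python
from typing import TypeVar

AnyStr = TypeVar('AnyStr', str, bytes)  # OK: two constraints

try:
    Bad = TypeVar('Bad', str)  # a single constraint is disallowed
except TypeError as exc:
    print('rejected:', exc)
```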

Subtypes of types constrained by a type variable should be treated
as their respective explicitly listed base types in the context of the
type variable.  Consider this example::

  class MyStr(str): ...

  x = concat(MyStr('apple'), MyStr('pie'))

The call is valid but the type variable ``AnyStr`` will be set to
``str`` and not ``MyStr``. In effect, the inferred type of the return
value assigned to ``x`` will also be ``str``.

Additionally, ``Any`` is a valid value for every type variable.
Consider the following::

  def count_truthy(elements: List[Any]) -> int:
      return sum(1 for elem in elements if elem)

This is equivalent to omitting the generic notation and just saying
``elements: List``.


User-defined generic types
--------------------------

You can include a ``Generic`` base class to define a user-defined class
as generic.  Example::

  from typing import TypeVar, Generic

  T = TypeVar('T')

  class LoggedVar(Generic[T]):
      def __init__(self, value: T, name: str, logger: Logger) -> None:
          self.name = name
          self.logger = logger
          self.value = value

      def set(self, new: T) -> None:
          self.log('Set ' + repr(self.value))
          self.value = new

      def get(self) -> T:
          self.log('Get ' + repr(self.value))
          return self.value

      def log(self, message: str) -> None:
          self.logger.info('{}: {}'.format(self.name, message))

``Generic[T]`` as a base class defines that the class ``LoggedVar``
takes a single type parameter ``T``. This also makes ``T`` valid as
a type within the class body.

The ``Generic`` base class uses a metaclass that defines ``__getitem__``
so that ``LoggedVar[t]`` is valid as a type::

  from typing import Iterable

  def zero_all_vars(vars: Iterable[LoggedVar[int]]) -> None:
      for var in vars:
          var.set(0)

A generic type can have any number of type variables, and type variables
may be constrained. This is valid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')
  S = TypeVar('S')

  class Pair(Generic[T, S]):
      ...

Each type variable argument to ``Generic`` must be distinct. This is
thus invalid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')

  class Pair(Generic[T, T]):   # INVALID
      ...

You can use multiple inheritance with ``Generic``::

  from typing import TypeVar, Generic, Sized

  T = TypeVar('T')

  class LinkedList(Sized, Generic[T]):
      ...

Subclassing a generic class without specifying type parameters assumes
``Any`` for each position.  In the following example, ``MyIterable``
is not generic but implicitly inherits from ``Iterable[Any]``::

  from typing import Iterable

  class MyIterable(Iterable):  # Same as Iterable[Any]
      ...

Generic metaclasses are not supported.


Instantiating generic classes and type erasure
----------------------------------------------

Generic types like ``List`` or ``Sequence`` cannot be instantiated.
However, user-defined classes derived from them can be instantiated.
Suppose we write a ``Node`` class inheriting from ``Generic[T]``::

  from typing import TypeVar, Generic

  T = TypeVar('T')

  class Node(Generic[T]):
      ...

Now there are two ways we can instantiate this class; the type
inferred by a type checker may be different depending on the form we
use.  The first way is to give the value of the type parameter
explicitly -- this overrides whatever type inference the type
checker would otherwise perform::

  x = Node[T]()  # The type inferred for x is Node[T].

  y = Node[int]()  # The type inferred for y is Node[int].

If no explicit types are given, the type checker is given some
freedom. Consider this code::

  x = Node()

The inferred type could be ``Node[Any]``, as there isn't enough
context to infer a more precise type.  Alternatively, a type checker
may reject the line and require an explicit annotation, like this::

  x = Node()  # type: Node[int]  # Inferred type is Node[int].

A type checker with more powerful type inference could look at how
``x`` is used elsewhere in the file and try to infer a more precise
type such as ``Node[int]`` even without an explicit type annotation.
However, it is probably impossible to make such type inference work
well in all cases, since Python programs can be very dynamic.

This PEP doesn't specify the details of how type inference should
work.  We allow different tools to experiment with various approaches.
We may give more explicit rules in future revisions.

At runtime the type is not preserved, and the class of ``x`` is just
``Node`` in all cases.  This behavior is called "type erasure"; it is
common practice in languages with generics (e.g. Java, TypeScript).
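
A small runnable sketch of type erasure (the ``Node`` class is the one
defined above)::

```python
from typing import TypeVar, Generic

T = TypeVar('T')

class Node(Generic[T]):
    pass

x = Node[int]()   # type parameter given explicitly
y = Node()        # type parameter left to inference

# At runtime the parameter is erased: both are plain Node instances.
assert type(x) is Node
assert type(y) is Node
```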


Arbitrary generic types as base classes
---------------------------------------

``Generic[T]`` is only valid as a base class -- it's not a proper type.
However, user-defined generic types such as ``LinkedList[T]`` from the
above example and built-in generic types and ABCs such as ``List[T]``
and ``Iterable[T]`` are valid both as types and as base classes. For
example, we can define a subclass of ``Dict`` that specializes type
arguments::

  from typing import Dict, List, Optional

  class Node:
      ...

  class SymbolTable(Dict[str, List[Node]]):
      def push(self, name: str, node: Node) -> None:
          self.setdefault(name, []).append(node)

      def pop(self, name: str) -> Node:
          return self[name].pop()

      def lookup(self, name: str) -> Optional[Node]:
          nodes = self.get(name)
          if nodes:
              return nodes[-1]
          return None

``SymbolTable`` is a subclass of ``dict`` and a subtype of ``Dict[str,
List[Node]]``.
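
For instance, the subclass above can be instantiated and used like any
other ``dict`` (a quick runnable sketch repeating the definitions)::

```python
from typing import Dict, List, Optional

class Node:
    pass

class SymbolTable(Dict[str, List[Node]]):
    def push(self, name: str, node: Node) -> None:
        self.setdefault(name, []).append(node)

    def lookup(self, name: str) -> Optional[Node]:
        nodes = self.get(name)
        if nodes:
            return nodes[-1]
        return None

st = SymbolTable()
n = Node()
st.push('x', n)
assert st.lookup('x') is n   # the most recent binding is returned
assert isinstance(st, dict)  # SymbolTable is a real dict subclass
```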

If a generic base class has a type variable as a type argument, this
makes the defined class generic. For example, we can define a generic
``LinkedList`` class that is iterable and a container::

  from typing import TypeVar, Iterable, Container

  T = TypeVar('T')

  class LinkedList(Iterable[T], Container[T]):
      ...

Now ``LinkedList[int]`` is a valid type. Note that we can use ``T``
multiple times in the base class list, as long as we don't use the
same type variable ``T`` multiple times within ``Generic[...]``.

Also consider the following example::

  from typing import TypeVar, Mapping

  T = TypeVar('T')

  class MyDict(Mapping[str, T]):
      ...

In this case ``MyDict`` has a single type parameter, ``T``.


Abstract generic types
----------------------

The metaclass used by ``Generic`` is a subclass of ``abc.ABCMeta``.
A generic class can be an ABC by including abstract methods
or properties, and generic classes can also have ABCs as base
classes without a metaclass conflict.


Type variables with an upper bound
----------------------------------

A type variable may specify an upper bound using ``bound=<type>``.
This means that an actual type substituted (explicitly or implicitly)
for the type variable must be a subclass of the boundary type.  A
common example is the definition of a Comparable type that works well
enough to catch the most common errors::

  from typing import TypeVar

  class Comparable(metaclass=ABCMeta):
      @abstractmethod
      def __lt__(self, other: Any) -> bool: ...
      ... # __gt__ etc. as well

  CT = TypeVar('CT', bound=Comparable)

  def min(x: CT, y: CT) -> CT:
      if x < y:
          return x
      else:
          return y

  min(1, 2) # ok, return type int
  min('x', 'y') # ok, return type str

(Note that this is not ideal -- for example ``min('x', 1)`` is invalid
at runtime but a type checker would simply infer the return type
``Comparable``.  Unfortunately, addressing this would require
introducing a much more powerful and also much more complicated
concept, F-bounded polymorphism.  We may revisit this in the future.)

An upper bound cannot be combined with type constraints (as used for
``AnyStr`` in the example earlier); type constraints cause the
inferred type to be *exactly* one of the constraint types, while an
upper bound just requires that the actual type is a subclass of the
boundary type.


Covariance and contravariance
-----------------------------

Consider a class ``Employee`` with a subclass ``Manager``.  Now
suppose we have a function with an argument annotated with
``List[Employee]``.  Should we be allowed to call this function with a
variable of type ``List[Manager]`` as its argument?  Many people would
answer "yes, of course" without even considering the consequences.
But unless we know more about the function, a type checker should
reject such a call: the function might append an ``Employee`` instance
to the list, which would violate the variable's type in the caller.

It turns out such an argument acts *contravariantly*, whereas the
intuitive answer (which is correct in case the function doesn't mutate
its argument!) requires the argument to act *covariantly*.  A longer
introduction to these concepts can be found on Wikipedia
[wiki-variance]_; here we just show how to control a type checker's
behavior.

By default type variables are considered *invariant*, which means that
values for arguments annotated with types like ``List[Employee]``
must exactly match the type annotation -- no subclasses or
superclasses of the type parameter (in this example ``Employee``) are
allowed.

To facilitate the declaration of container types where covariant type
checking is acceptable, a type variable can be declared using
``covariant=True``.  For the (rare) case where contravariant behavior
is desirable, pass ``contravariant=True``.  At most one of these may
be passed.

A typical example involves defining an immutable (or read-only)
container class::

  from typing import TypeVar, Generic, Iterable, Iterator

  T = TypeVar('T', covariant=True)

  class ImmutableList(Generic[T]):
      def __init__(self, items: Iterable[T]) -> None: ...
      def __iter__(self) -> Iterator[T]: ...
      ...

  class Employee: ...

  class Manager(Employee): ...

  def dump_employees(emps: ImmutableList[Employee]) -> None:
      for emp in emps:
          ...

  mgrs = ImmutableList([Manager()])  # type: ImmutableList[Manager]
  dump_employees(mgrs)  # OK

The read-only collection classes in ``typing`` are all defined using a
covariant type variable (e.g. ``Mapping`` and ``Sequence``).  The
mutable collection classes (e.g. ``MutableMapping`` and
``MutableSequence``) are defined using regular invariant type
variables.  The one example of a contravariant type variable is the
``Generator`` type, which is contravariant in the ``send()`` argument
type (see below).

Note: variance affects type parameters for generic types -- it does
not affect regular parameters.  For example, the following example is
fine::

  from typing import TypeVar

  class Employee: ...

  class Manager(Employee): ...

  E = TypeVar('E', bound=Employee)  # Invariant

  def dump_employee(e: E) -> None: ...

  dump_employee(Manager())  # OK


The numeric tower
-----------------

PEP 3141 defines Python's numeric tower, and the stdlib module
``numbers`` implements the corresponding ABCs (``Number``,
``Complex``, ``Real``, ``Rational`` and ``Integral``).  There are some
issues with these ABCs, but the built-in concrete numeric classes
``complex``, ``float`` and ``int`` are ubiquitous (especially the
latter two :-).

Rather than requiring that users write ``import numbers`` and then use
``numbers.Float`` etc., this PEP proposes a straightforward shortcut
that is almost as effective: when an argument is annotated as having
type ``float``, an argument of type ``int`` is acceptable; similarly,
for an argument annotated as having type ``complex``, arguments of
type ``float`` or ``int`` are acceptable.  This does not handle
classes implementing the corresponding ABCs or the
``fractions.Fraction`` class, but we believe those use cases are
exceedingly rare.
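
To illustrate (the function name is invented for this sketch): a
checker following this rule accepts an ``int`` argument where
``float`` is annotated, and the call naturally works at runtime too::

```python
def scale(x: float) -> float:
    # Under the numeric-tower shortcut, an int argument is acceptable here.
    return x * 2.0

assert scale(3) == 6.0    # int accepted where float is annotated
assert scale(1.5) == 3.0
```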


The bytes types
---------------

There are three different builtin classes used for arrays of bytes
(not counting the classes available in the ``array`` module):
``bytes``, ``bytearray`` and ``memoryview``.  Of these, ``bytes`` and
``bytearray`` have many behaviors in common (though not all --
``bytearray`` is mutable).

While there is an ABC ``ByteString`` defined in ``collections.abc``
and a corresponding type in ``typing``, functions accepting bytes (of
some form) are so common that it would be cumbersome to have to write
``typing.ByteString`` everywhere.  So, as a shortcut similar to that
for the builtin numeric classes, when an argument is annotated as
having type ``bytes``, arguments of type ``bytearray`` or
``memoryview`` are acceptable.  (Again, there are situations where
this isn't sound, but we believe those are exceedingly rare in
practice.)
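
A minimal sketch (``hashlib`` is used only for illustration): a
function annotated with ``bytes`` also works with ``bytearray`` and
``memoryview`` values at runtime, matching the shortcut above::

```python
import hashlib

def digest(data: bytes) -> str:
    # bytearray and memoryview arguments are acceptable under this shortcut
    return hashlib.sha256(data).hexdigest()

assert digest(b'abc') == digest(bytearray(b'abc')) == digest(memoryview(b'abc'))
```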


Forward references
------------------

When a type hint contains names that have not been defined yet, that
definition may be expressed as a string literal, to be resolved later.

A situation where this occurs commonly is the definition of a
container class, where the class being defined occurs in the signature
of some of the methods.  For example, the following code (the start of
a simple binary tree implementation) does not work::

  class Tree:
      def __init__(self, left: Tree, right: Tree):
          self.left = left
          self.right = right

To address this, we write::

  class Tree:
      def __init__(self, left: 'Tree', right: 'Tree'):
          self.left = left
          self.right = right

The string literal should contain a valid Python expression (i.e.,
``compile(lit, '', 'eval')`` should be a valid code object) and it
should evaluate without errors once the module has been fully loaded.
The local and global namespace in which it is evaluated should be the
same namespaces in which default arguments to the same function would
be evaluated.

Moreover, the expression should be parseable as a valid type hint, i.e.,
it is constrained by the rules from the section `Acceptable type hints`_
above.

It is allowable to use string literals as *part* of a type hint, for
example::

    class Tree:
        ...
        def leaves(self) -> List['Tree']:
            ...

A common use for forward references is when e.g. Django models are
needed in the signatures.  Typically, each model is in a separate
file, and has methods with arguments whose types involve other models.
Because of the way circular imports work in Python, it is often not
possible to import all the needed models directly::

    # File models/a.py
    from models.b import B
    class A(Model):
        def foo(self, b: B): ...

    # File models/b.py
    from models.a import A
    class B(Model):
        def bar(self, a: A): ...

    # File main.py
    from models.a import A
    from models.b import B

Assuming main is imported first, this will fail with an ImportError at
the line ``from models.a import A`` in models/b.py, which is being
imported from models/a.py before module ``a`` has defined class ``A``.
The solution is to switch to module-only imports and reference the
models by their *module*.*class* name::

    # File models/a.py
    from models import b
    class A(Model):
        def foo(self, b: 'b.B'): ...

    # File models/b.py
    from models import a
    class B(Model):
        def bar(self, a: 'a.A'): ...

    # File main.py
    from models.a import A
    from models.b import B


Union types
-----------

Since accepting a small, limited set of expected types for a single
argument is common, there is a new special factory called ``Union``.
Example::

  from typing import Union

  def handle_employees(e: Union[Employee, Sequence[Employee]]) -> None:
      if isinstance(e, Employee):
          e = [e]
      ...

A type factored by ``Union[T1, T2, ...]`` responds ``True`` to
``issubclass`` checks for ``T1`` and any of its subtypes, ``T2`` and
any of its subtypes, and so on.

One common case of union types is *optional* types.  By default,
``None`` is an invalid value for any type, unless a default value of
``None`` has been provided in the function definition.  Examples::

  def handle_employee(e: Union[Employee, None]) -> None: ...

As a shorthand for ``Union[T1, None]`` you can write ``Optional[T1]``;
for example, the above is equivalent to::

  from typing import Optional

  def handle_employee(e: Optional[Employee]) -> None: ...

An optional type is also automatically assumed when the default value is
``None``, for example::

  def handle_employee(e: Employee = None): ...

This is equivalent to::

  def handle_employee(e: Optional[Employee] = None) -> None: ...

The ``Any`` type
----------------

A special kind of type is ``Any``.  Every type is a subtype of
``Any``.  This is also true for the builtin type ``object``.
However, to the static type checker these are completely different.

When the type of a value is ``object``, the type checker will reject
almost all operations on it, and assigning it to a variable (or using
it as a return value) of a more specialized type is a type error.  On
the other hand, when a value has type ``Any``, the type checker will
allow all operations on it, and a value of type ``Any`` can be assigned
to a variable (or used as a return value) of a more constrained type.
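
A hedged sketch of the difference (the helper names are invented for
illustration; the comments describe what a static checker would
report, since nothing is enforced at runtime)::

```python
from typing import Any

def from_any(x: Any) -> int:
    return x   # fine for a checker: Any is assignable to int

def from_object(x: object) -> int:
    return x   # a checker rejects this: object is not assignable to int

# At runtime neither function checks anything:
assert from_any(5) == 5
```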


Version and platform checking
-----------------------------

Type checkers are expected to understand simple version and platform
checks, e.g.::

  import sys

  if sys.version_info[0] >= 3:
      # Python 3 specific definitions
  else:
      # Python 2 specific definitions

  if sys.platform == 'win32':
      # Windows specific definitions
  else:
      # Posix specific definitions

Don't expect a checker to understand obfuscations like
``"".join(reversed(sys.platform)) == "xunil"``.


Default argument values
-----------------------

In stubs it may be useful to declare an argument as having a default
without specifying the actual default value.  For example::

  def foo(x: AnyStr, y: AnyStr = ...) -> AnyStr: ...

What should the default value look like?  Any of the options ``""``,
``b""`` or ``None`` fails to satisfy the type constraint (actually,
``None`` will *modify* the type to become ``Optional[AnyStr]``).

In such cases the default value may be specified as a literal
ellipsis, i.e. the above example is literally what you would write.


Compatibility with other uses of function annotations
=====================================================

A number of existing or potential use cases for function annotations
are incompatible with type hinting.  These may confuse
a static type checker.  However, since type hinting annotations have no
runtime behavior (other than evaluation of the annotation expression and
storing annotations in the ``__annotations__`` attribute of the function
object), this does not make the program incorrect -- it just may cause
a type checker to emit spurious warnings or errors.

To mark portions of the program that should not be covered by type
hinting, you can use one or more of the following:

* a ``# type: ignore`` comment;

* a ``@no_type_check`` decorator on a class or function;

* a custom class or function decorator marked with
  ``@no_type_check_decorator``.

For more details see later sections.

For maximal compatibility with offline type checking it may
eventually be a good idea to change interfaces that rely on annotations
to switch to a different mechanism, for example a decorator.  In Python
3.5 there is no pressure to do this, however.  See also the longer
discussion under `Rejected alternatives`_ below.


Type comments
=============

No first-class syntax support for explicitly marking variables as being
of a specific type is added by this PEP.  To help with type inference in
complex cases, a comment of the following format may be used::

  x = []   # type: List[Employee]
  x, y, z = [], [], []  # type: List[int], List[int], List[str]
  x, y, z = [], [], []  # type: (List[int], List[int], List[str])
  x = [
     1,
     2,
  ]  # type: List[int]

Type comments should be put on the last line of the statement that
contains the variable definition. They can also be placed on
``with`` statements and ``for`` statements, right after the colon.

Examples of type comments on ``with`` and ``for`` statements::

  with frobnicate() as foo:  # type: int
      # Here foo is an int
      ...

  for x, y in points:  # type: float, float
      # Here x and y are floats
      ...

In stubs it may be useful to declare the existence of a variable
without giving it an initial value.  This can be done using a literal
ellipsis::

  from typing import IO

  stream = ...  # type: IO[str]

In non-stub code, there is a similar special case::

  from typing import IO

  stream = None  # type: IO[str]

Type checkers should not complain about this (despite the value
``None`` not matching the given type), nor should they change the
inferred type to ``Optional[...]`` (despite the rule that does this
for annotated arguments with a default value of ``None``).  The
assumption here is that other code will ensure that the variable is
given a value of the proper type, and all uses can assume that the
variable has the given type.

The ``# type: ignore`` comment should be put on the line that the
error refers to::

  import http.client
  errors = {
      'not_found': http.client.NOT_FOUND  # type: ignore
  }

A ``# type: ignore`` comment on a line by itself disables all type
checking for the rest of the file.

If type hinting proves useful in general, a syntax for typing variables
may be provided in a future Python version.

Casts
=====

Occasionally the type checker may need a different kind of hint: the
programmer may know that an expression is of a more constrained type
than a type checker may be able to infer.  For example::

  from typing import List, cast

  def find_first_str(a: List[object]) -> str:
      index = next(i for i, x in enumerate(a) if isinstance(x, str))
      # We only get here if there's at least one string in a
      return cast(str, a[index])

Some type checkers may not be able to infer that the type of
``a[index]`` is ``str`` and only infer ``object`` or ``Any``, but we
know that (if the code gets to that point) it must be a string.  The
``cast(t, x)`` call tells the type checker that we are confident that
the type of ``x`` is ``t``.  At runtime a cast always returns the
expression unchanged -- it does not check the type, and it does not
convert or coerce the value.

Casts differ from type comments (see the previous section).  When using
a type comment, the type checker should still verify that the inferred
type is consistent with the stated type.  When using a cast, the type
checker should blindly believe the programmer.  Also, casts can be used
in expressions, while type comments only apply to assignments.
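
The runtime behavior of a cast can be demonstrated directly, since
``cast()`` simply returns its second argument:

```python
from typing import cast

x = "hello"
widened = cast(object, x)   # static widening; same object at runtime
wrong = cast(int, x)        # even an incorrect cast is not checked

# No copy, no conversion, no verification took place.
assert widened is x and wrong is x
```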


Stub Files
==========

Stub files are files containing type hints that are only for use by
the type checker, not at runtime.  There are several use cases for
stub files:

* Extension modules

* Third-party modules whose authors have not yet added type hints

* Standard library modules for which type hints have not yet been
  written

* Modules that must be compatible with Python 2 and 3

* Modules that use annotations for other purposes

Stub files have the same syntax as regular Python modules.  There is one
feature of the ``typing`` module that may only be used in stub files:
the ``@overload`` decorator described below.

The type checker should only check function signatures in stub files;
it is recommended that function bodies in stub files just be a single
ellipsis (``...``).
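
For illustration, a stub for a hypothetical module ``frob`` could
consist entirely of ellipsis bodies (the module and all names in it
are made up for this sketch):

```python
# frob.pyi -- hypothetical stub; only the signatures matter to the checker
from typing import List

def frobnicate(data: List[int], strength: int = ...) -> List[int]: ...

class Frobber:
    max_strength = ...  # type: int
    def reset(self) -> None: ...
```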

The type checker should have a configurable search path for stub files.
If a stub file is found the type checker should not read the
corresponding "real" module.

While stub files are syntactically valid Python modules, they use the
``.pyi`` extension to make it possible to maintain stub files in the
same directory as the corresponding real module.  This also reinforces
the notion that no runtime behavior should be expected of stub files.

Additional notes on stub files:

* Modules and variables imported into the stub are not considered
  exported from the stub unless the import uses the ``import ... as
  ...`` form.

Function overloading
--------------------

The ``@overload`` decorator allows describing functions that support
multiple different combinations of argument types.  This pattern is
used frequently in builtin modules and types.  For example, the
``__getitem__()`` method of the ``bytes`` type can be described as
follows::

  from typing import overload

  class bytes:
      ...
      @overload
      def __getitem__(self, i: int) -> int: ...
      @overload
      def __getitem__(self, s: slice) -> bytes: ...

This description is more precise than would be possible using unions
(which cannot express the relationship between the argument and return
types)::

  from typing import Union

  class bytes:
      ...
      def __getitem__(self, a: Union[int, slice]) -> Union[int, bytes]: ...

Another example where ``@overload`` comes in handy is the type of the
builtin ``map()`` function, which takes a different number of
arguments depending on the type of the callable::

  from typing import Callable, Iterable, Iterator, Tuple, TypeVar, overload

  T1 = TypeVar('T1')
  T2 = TypeVar('T2')
  S = TypeVar('S')

  @overload
  def map(func: Callable[[T1], S], iter1: Iterable[T1]) -> Iterator[S]: ...
  @overload
  def map(func: Callable[[T1, T2], S],
          iter1: Iterable[T1], iter2: Iterable[T2]) -> Iterator[S]: ...
  # ... and we could add more items to support more than two iterables

Note that we could also easily add items to support ``map(None, ...)``::

  @overload
  def map(func: None, iter1: Iterable[T1]) -> Iterable[T1]: ...
  @overload
  def map(func: None,
          iter1: Iterable[T1],
          iter2: Iterable[T2]) -> Iterable[Tuple[T1, T2]]: ...

The ``@overload`` decorator may only be used in stub files.  While it
would be possible to provide a multiple dispatch implementation using
this syntax, its implementation would require using
``sys._getframe()``, which is frowned upon.  Also, designing and
implementing an efficient multiple dispatch mechanism is hard, which
is why previous attempts were abandoned in favor of
``functools.singledispatch()``.  (See PEP 443, especially its section
"Alternative approaches".)  In the future we may come up with a
satisfactory multiple dispatch design, but we don't want such a design
to be constrained by the overloading syntax defined for type hints in
stub files.  In the meantime, using the ``@overload`` decorator or
calling ``overload()`` directly raises ``RuntimeError``.

A constrained ``TypeVar`` type can often be used instead of the
``@overload`` decorator.  For example, the definitions of ``concat1``
and ``concat2`` in this stub file are equivalent::

  from typing import TypeVar

  AnyStr = TypeVar('AnyStr', str, bytes)

  def concat1(x: AnyStr, y: AnyStr) -> AnyStr: ...

  @overload
  def concat2(x: str, y: str) -> str: ...
  @overload
  def concat2(x: bytes, y: bytes) -> bytes: ...

Some functions, such as ``map`` or ``bytes.__getitem__`` above, can't
be represented precisely using type variables.  However, unlike
``@overload``, type variables can also be used outside stub files.  We
recommend that ``@overload`` be used only in cases where a type
variable is not sufficient, due to its special stub-only status.

Another important difference between type variables such as ``AnyStr``
and using ``@overload`` is that the former can also be used to define
constraints for generic class type parameters.  For example, the type
parameter of the generic class ``typing.IO`` is constrained (only
``IO[str]``, ``IO[bytes]`` and ``IO[Any]`` are valid)::

  class IO(Generic[AnyStr]): ...

Storing and distributing stub files
-----------------------------------

The easiest form of stub file storage and distribution is to put them
alongside Python modules in the same directory.  This makes them easy to
find by both programmers and the tools.  However, since package
maintainers are free not to add type hinting to their packages,
third-party stubs installable by ``pip`` from PyPI are also supported.
In this case we have to consider three issues: naming, versioning,
installation path.

This PEP does not provide a recommendation on a naming scheme that
should be used for third-party stub file packages.  Discoverability will
hopefully be based on package popularity, like with Django packages for
example.

Third-party stubs have to be versioned using the lowest version of the
source package that is compatible.  Example: FooPackage has versions
1.0, 1.1, 1.2, 1.3, 2.0, 2.1, 2.2.  There are API changes in versions
1.1, 2.0 and 2.2.  The stub file package maintainer is free to release
stubs for all versions but at least 1.0, 1.1, 2.0 and 2.2 are needed
to enable the end user to type check all versions.  This is because the
user knows that the closest *lower or equal* version of stubs is
compatible.  In the provided example, for FooPackage 1.3 the user would
choose stubs version 1.1.

Note that if the user decides to use the "latest" available source
package, using the "latest" stub files should generally also work if
they're updated often.

Third-party stub packages can use any location for stub storage.  Type
checkers should search for them using PYTHONPATH.  A default fallback
directory that is always checked is ``shared/typehints/python3.5/`` (or
3.6, etc.).  Since there can only be one package installed for a given
Python version per environment, no additional versioning is performed
under that directory (just like bare directory installs by ``pip`` in
site-packages).  Stub file package authors might use the following
snippet in ``setup.py``::

  ...
  data_files=[
      (
          'shared/typehints/python{}.{}'.format(*sys.version_info[:2]),
          pathlib.Path(SRC_PATH).glob('**/*.pyi'),
      ),
  ],
  ...

The Typeshed Repo
-----------------

There is a shared repository where useful stubs are being collected
[typeshed]_.  Note that stubs for a given package will not be included
here without the explicit consent of the package owner.  Further
policies regarding the stubs collected here will be decided at a later
time, after discussion on python-dev, and reported in the typeshed
repo's README.


Exceptions
==========

No syntax for listing explicitly raised exceptions is proposed.
Currently the only known use case for this feature is documentational,
in which case the recommendation is to put this information in a
docstring.


The ``typing`` Module
=====================

To open the usage of static type checking to Python 3.5 as well as older
versions, a uniform namespace is required.  For this purpose, a new
module in the standard library is introduced called ``typing``.

It defines the fundamental building blocks for constructing types
(e.g. ``Any``), types representing generic variants of builtin
collections (e.g. ``List``), types representing generic
collection ABCs (e.g. ``Sequence``), and a small collection of
convenience definitions.

Fundamental building blocks:

* Any, used as ``def get(key: str) -> Any: ...``

* Union, used as ``Union[Type1, Type2, Type3]``

* Callable, used as ``Callable[[Arg1Type, Arg2Type], ReturnType]``

* Tuple, used by listing the element types, for example
  ``Tuple[int, int, str]``.
  Arbitrary-length homogeneous tuples can be expressed
  using one type and ellipsis, for example ``Tuple[int, ...]``.
  (The ``...`` here are part of the syntax, a literal ellipsis.)

* TypeVar, used as ``X = TypeVar('X', Type1, Type2, Type3)`` or simply
  ``Y = TypeVar('Y')`` (see above for more details)

* Generic, used to create user-defined generic classes
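
As a sketch of the last item, a user-defined generic class (``Stack``
is a made-up example) can be written as:

```python
from typing import Generic, List, TypeVar

T = TypeVar('T')

class Stack(Generic[T]):
    """A LIFO container parametrized by its element type."""

    def __init__(self) -> None:
        self.items = []  # type: List[T]

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

ints = Stack()  # type: Stack[int]
ints.push(42)
```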

Generic variants of builtin collections:

* Dict, used as ``Dict[key_type, value_type]``

* List, used as ``List[element_type]``

* Set, used as ``Set[element_type]``. See remark for ``AbstractSet``
  below.

* FrozenSet, used as ``FrozenSet[element_type]``

Note: ``Dict``, ``List``, ``Set`` and ``FrozenSet`` are mainly useful
for annotating return values.  For arguments, prefer the abstract
collection types defined below, e.g.  ``Mapping``, ``Sequence`` or
``AbstractSet``.

Generic variants of container ABCs (and a few non-containers):

* ByteString

* Callable (see above, listed here for completeness)

* Container

* Generator, used as ``Generator[yield_type, send_type,
  return_type]``.  This represents the return value of generator
  functions.  It is a subtype of ``Iterable`` and it has additional
  type variables for the type accepted by the ``send()`` method (which
  is contravariant -- a generator that accepts sending it ``Employee``
  instances is valid in a context where a generator is required that
  accepts sending it ``Manager`` instances) and the return type of the
  generator.

* Hashable (not generic, but present for completeness)

* ItemsView

* Iterable

* Iterator

* KeysView

* Mapping

* MappingView

* MutableMapping

* MutableSequence

* MutableSet

* Sequence

* Set, renamed to ``AbstractSet``. This name change was required
  because ``Set`` in the ``typing`` module means ``set()`` with
  generics.

* Sized (not generic, but present for completeness)

* ValuesView

A few one-off types are defined that test for single special methods
(similar to ``Hashable`` or ``Sized``):

* Reversible, to test for ``__reversed__``

* SupportsAbs, to test for ``__abs__``

* SupportsComplex, to test for ``__complex__``

* SupportsFloat, to test for ``__float__``

* SupportsInt, to test for ``__int__``

* SupportsRound, to test for ``__round__``

* SupportsBytes, to test for ``__bytes__``

Convenience definitions:

* Optional, defined by ``Optional[t] == Union[t, type(None)]``

* AnyStr, defined as ``TypeVar('AnyStr', str, bytes)``

* NamedTuple, used as
  ``NamedTuple(type_name, [(field_name, field_type), ...])``
  and equivalent to
  ``collections.namedtuple(type_name, [field_name, ...])``.
  This is useful to declare the types of the fields of a named tuple
  type.

* cast(), described earlier

* @no_type_check, a decorator to disable type checking per class or
  function (see below)

* @no_type_check_decorator, a decorator to create your own decorators
  with the same meaning as ``@no_type_check`` (see below)

* @overload, described earlier

* get_type_hints(), a utility function to retrieve the type hints from a
  function or method.  Given a function or method object, it returns
  a dict with the same format as ``__annotations__``, but evaluating
  forward references (which are given as string literals) as expressions
  in the context of the original function or method definition.
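
A small sketch of ``get_type_hints()`` resolving a forward reference
given as a string literal:

```python
from typing import get_type_hints

class Node:
    def set_left(self, child: 'Node') -> None: ...

hints = get_type_hints(Node.set_left)
assert hints['child'] is Node                            # resolved to the class
assert Node.set_left.__annotations__['child'] == 'Node'  # raw string is kept
```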

Types available in the ``typing.io`` submodule:

* IO (generic over ``AnyStr``)

* BinaryIO (a simple subtype of ``IO[bytes]``)

* TextIO (a simple subtype of ``IO[str]``)

Types available in the ``typing.re`` submodule:

* Match and Pattern, types of ``re.match()`` and ``re.compile()``
  results (generic over ``AnyStr``)


Rejected Alternatives
=====================

During discussion of earlier drafts of this PEP, various objections
were raised and alternatives were proposed.  We discuss some of these
here and explain why we reject them.

Which brackets for generic type parameters?
-------------------------------------------

Most people are familiar with the use of angular brackets
(e.g. ``List<int>``) in languages like C++, Java, C# and Swift to
express the parametrization of generic types.  The problem with these
is that they are really hard to parse, especially for a simple-minded
parser like Python.  In most languages the ambiguities are usually
dealt with by only allowing angular brackets in specific syntactic
positions, where general expressions aren't allowed.  (And also by
using very powerful parsing techniques that can backtrack over an
arbitrary section of code.)

But in Python, we'd like type expressions to be (syntactically) the
same as other expressions, so that we can use e.g. variable assignment
to create type aliases.  Consider this simple type expression::

    List<int>

From the Python parser's perspective, the expression begins with the
same four tokens (NAME, LESS, NAME, GREATER) as a chained comparison::

    a < b > c  # I.e., (a < b) and (b > c)

We can even make up an example that could be parsed both ways::

    a < b > [ c ]

Assuming we had angular brackets in the language, this could be
interpreted as either of the following two::

    (a<b>)[c]      # I.e., (a<b>).__getitem__(c)
    a < b > ([c])  # I.e., (a < b) and (b > [c])

It would surely be possible to come up with a rule to disambiguate
such cases, but to most users the rules would feel arbitrary and
complex.  It would also require us to dramatically change the CPython
parser (and every other parser for Python).  It should be noted that
Python's current parser is intentionally "dumb" -- a simple grammar is
easier for users to reason about.

For all these reasons, square brackets (e.g. ``List[int]``) are (and
have long been) the preferred syntax for generic type parameters.
They can be implemented by defining the ``__getitem__()`` method on
the metaclass, and no new syntax is required at all.  This option
works in all recent versions of Python (starting with Python 2.2).
Python is not alone in this syntactic choice -- generic classes in
Scala also use square brackets.
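
The mechanism can be sketched in a few lines (a toy version; the real
``typing`` implementation records the parameters and does much more
bookkeeping):

```python
class GenericMeta(type):
    def __getitem__(cls, params):
        # A real implementation would store ``params`` and return a
        # parametrized alias; here we just return the class unchanged.
        return cls

class MyList(metaclass=GenericMeta):
    pass

alias = MyList[int]   # plain subscription; no new syntax involved
assert alias is MyList
```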

What about existing uses of annotations?
----------------------------------------

One line of argument points out that PEP 3107 explicitly supports
the use of arbitrary expressions in function annotations.  The new
proposal is then considered incompatible with the specification of PEP
3107.

Our response to this is that, first of all, the current proposal does
not introduce any direct incompatibilities, so programs using
annotations in Python 3.4 will still work correctly and without
prejudice in Python 3.5.

We do hope that type hints will eventually become the sole use for
annotations, but this will require additional discussion and a
deprecation period after the initial roll-out of the typing module
with Python 3.5.  The current PEP will have provisional status (see
PEP 411) until Python 3.6 is released.  The fastest conceivable scheme
would introduce silent deprecation of non-type-hint annotations in
3.6, full deprecation in 3.7, and declare type hints as the only
allowed use of annotations in Python 3.8.  This should give authors of
packages that use annotations plenty of time to devise another
approach, even if type hints become an overnight success.

Another possible outcome would be that type hints will eventually
become the default meaning for annotations, but that there will always
remain an option to disable them.  For this purpose the current
proposal defines a decorator ``@no_type_check`` which disables the
default interpretation of annotations as type hints in a given class
or function.  It also defines a meta-decorator
``@no_type_check_decorator`` which can be used to decorate a decorator
(!), causing annotations in any function or class decorated with the
latter to be ignored by the type checker.

There are also ``# type: ignore`` comments, and static checkers should
support configuration options to disable type checking in selected
packages.

Despite all these options, proposals have been circulated to allow
type hints and other forms of annotations to coexist for individual
arguments.  One proposal suggests that if an annotation for a given
argument is a dictionary literal, each key represents a different form
of annotation, and the key ``'type'`` would be used for type hints.
The problem with this idea and its variants is that the notation
becomes very "noisy" and hard to read.  Also, in most cases where
existing libraries use annotations, there would be little need to
combine them with type hints.  So the simpler approach of selectively
disabling type hints appears sufficient.

The problem of forward declarations
-----------------------------------

The current proposal is admittedly sub-optimal when type hints must
contain forward references.  Python requires all names to be defined
by the time they are used.  Apart from circular imports this is rarely
a problem: "use" here means "look up at runtime", and with most
"forward" references there is no problem in ensuring that a name is
defined before the function using it is called.

The problem with type hints is that annotations (per PEP 3107, and
similar to default values) are evaluated at the time a function is
defined, and thus any names used in an annotation must be already
defined when the function is being defined.  A common scenario is a
class definition whose methods need to reference the class itself in
their annotations.  (More generally, it can also occur with mutually
recursive classes.)  This is natural for container types, for
example::

  class Node:
      """Binary tree node."""

      def __init__(self, left: Node, right: Node):
          self.left = left
          self.right = right

As written this will not work, because of the peculiarity in Python
that class names become defined once the entire body of the class has
been executed.  Our solution, which isn't particularly elegant, but
gets the job done, is to allow using string literals in annotations.
Most of the time you won't have to use this though -- most *uses* of
type hints are expected to reference builtin types or types defined in
other modules.
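
With string literals the example above type-checks, at the cost of
some quoting:

```python
class Node:
    """Binary tree node."""

    # 'Node' in quotes is a forward reference: the class name is not
    # yet bound while the class body is still being executed.
    def __init__(self, left: 'Node', right: 'Node') -> None:
        self.left = left
        self.right = right
```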

A counterproposal would change the semantics of type hints so they
aren't evaluated at runtime at all (after all, type checking happens
off-line, so why would type hints need to be evaluated at runtime at
all).  This of course would run afoul of backwards compatibility,
since the Python interpreter doesn't actually know whether a
particular annotation is meant to be a type hint or something else.

A compromise is possible where a ``__future__`` import could enable
turning *all* annotations in a given module into string literals, as
follows::

  from __future__ import annotations

  class ImSet:
      def add(self, a: ImSet) -> List[ImSet]: ...

  assert ImSet.add.__annotations__ == {'a': 'ImSet',
                                       'return': 'List[ImSet]'}

Such a ``__future__`` import statement may be proposed in a separate
PEP.


The double colon
----------------

A few creative souls have tried to invent solutions for this problem.
For example, it was proposed to use a double colon (``::``) for type
hints, solving two problems at once: disambiguating between type hints
and other annotations, and changing the semantics to preclude runtime
evaluation.  There are several things wrong with this idea, however.

* It's ugly.  The single colon in Python has many uses, and all of
  them look familiar because they resemble the use of the colon in
  English text.  This is a general rule of thumb by which Python
  abides for most forms of punctuation; the exceptions are typically
  well known from other programming languages.  But this use of ``::``
  is unheard of in English, and in other languages (e.g. C++) it is
  used as a scoping operator, which is a very different beast.  In
  contrast, the single colon for type hints reads naturally -- and no
  wonder, since it was carefully designed for this purpose (the idea
  long predates PEP 3107 [gvr-artima]_).  It is also used in the same
  fashion in other languages from Pascal to Swift.

* What would you do for return type annotations?

* It's actually a feature that type hints are evaluated at runtime.

  * Making type hints available at runtime allows runtime type
    checkers to be built on top of type hints.

  * It catches mistakes even when the type checker is not run.  Since
    it is a separate program, users may choose not to run it (or even
    install it), but might still want to use type hints as a concise
    form of documentation.  Broken type hints are no use even for
    documentation.

* Because it's new syntax, using the double colon for type hints would
  limit them to code that works with Python 3.5 only.  By using
  existing syntax, the current proposal can easily work for older
  versions of Python 3.  (And in fact mypy supports Python 3.2 and
  newer.)

* If type hints become successful we may well decide to add new syntax
  in the future to declare the type for variables, for example
  ``var age: int = 42``.  If we were to use a double colon for
  argument type hints, for consistency we'd have to use the same
  convention for future syntax, perpetuating the ugliness.

Other forms of new syntax
-------------------------

A few other forms of alternative syntax have been proposed, e.g. the
introduction of a ``where`` keyword [roberge]_, and Cobra-inspired
``requires`` clauses.  But these all share a problem with the double
colon: they won't work for earlier versions of Python 3.  The same
would apply to a new ``__future__`` import.

Other backwards compatible conventions
--------------------------------------

The ideas put forward include:

* A decorator, e.g. ``@typehints(name=str, returns=str)``.  This could
  work, but it's pretty verbose (an extra line, and the argument names
  must be repeated), and a far cry in elegance from the PEP 3107
  notation.

* Stub files.  We do want stub files, but they are primarily useful
  for adding type hints to existing code that doesn't lend itself to
  adding type hints, e.g. 3rd party packages, code that needs to
  support both Python 2 and Python 3, and especially extension
  modules.  For most situations, having the annotations in line with
  the function definitions makes them much more useful.

* Docstrings.  There is an existing convention for docstrings, based
  on the Sphinx notation (``:type arg1: description``).  This is
  pretty verbose (an extra line per parameter), and not very elegant.
  We could also make up something new, but the annotation syntax is
  hard to beat (because it was designed for this very purpose).

It's also been proposed to simply wait another release.  But what
problem would that solve?  It would just be procrastination.


PEP Development Process
=======================

A live draft for this PEP lives on GitHub [github]_.  There is also an
issue tracker [issues]_, where much of the technical discussion takes
place.

The draft on GitHub is updated regularly in small increments.  The
official PEPs repo [peps]_ is (usually) only updated when a new draft
is posted to python-dev.


Acknowledgements
================

This document could not be completed without valuable input,
encouragement and advice from Jim Baker, Jeremy Siek, Michael Matson
Vitousek, Andrey Vlasovskikh, Radomir Dopieralski, Peter Ludemann,
and the BDFL-Delegate, Mark Shannon.

Influences include existing languages, libraries and frameworks
mentioned in PEP 482.  Many thanks to their creators, in alphabetical
order: Stefan Behnel, William Edwards, Greg Ewing, Larry Hastings,
Anders Hejlsberg, Alok Menghrajani, Travis E. Oliphant, Joe Pamer,
Raoul-Gabriel Urma, and Julien Verlaguet.


References
==========

.. [mypy]
   http://mypy-lang.org

.. [gvr-artima]
   http://www.artima.com/weblogs/viewpost.jsp?thread=85551

.. [wiki-variance]

http://en.wikipedia.org/wiki/Covariance_and_contravariance_%28computer_science%29

.. [typeshed]
   https://github.com/JukkaL/typeshed/

.. [pyflakes]
   https://github.com/pyflakes/pyflakes/

.. [pylint]
   http://www.pylint.org

.. [roberge]
   http://aroberge.blogspot.com/2015/01/type-hinting-in-python-focus-on.html

.. [github]
   https://github.com/ambv/typehinting

.. [issues]
   https://github.com/ambv/typehinting/issues

.. [peps]
   https://hg.python.org/peps/file/tip/pep-0484.txt


Copyright
=========

This document has been placed in the public domain.



..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End:

-- 
--Guido van Rossum (python.org/~guido)

From status at bugs.python.org  Fri May 22 18:08:18 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 22 May 2015 18:08:18 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150522160818.15553566E7@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2015-05-15 - 2015-05-22)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    4833 ( -7)
  closed 31194 (+71)
  total  36027 (+64)

Open issues with patches: 2219 


Issues opened (40)
==================

#24147: Dialect class defaults are not documented.
http://bugs.python.org/issue24147  reopened by r.david.murray

#24203: Depreciate threading.Thread.isDaemon etc
http://bugs.python.org/issue24203  opened by anon

#24204: string.strip() documentation is misleading
http://bugs.python.org/issue24204  opened by PhoenixofMT

#24206: Issues with equality of inspect objects
http://bugs.python.org/issue24206  opened by serhiy.storchaka

#24207: Argument Clinic doesn't mangle conflicting names
http://bugs.python.org/issue24207  opened by serhiy.storchaka

#24209: Allow IPv6 bind in http.server
http://bugs.python.org/issue24209  opened by Link Mauve

#24212: Idle, 2.7, backport idlelib.__main__, enable py -m idlelib
http://bugs.python.org/issue24212  opened by terry.reedy

#24214: Exception with utf-8, surrogatepass and incremental decoding
http://bugs.python.org/issue24214  opened by RalfM

#24215: test_trace uses test_pprint
http://bugs.python.org/issue24215  opened by serhiy.storchaka

#24217: O_RDWR undefined in mmapmodule.c
http://bugs.python.org/issue24217  opened by Jeffrey.Armstrong

#24219: Repeated integer in Lexical analysis/Integer literals section
http://bugs.python.org/issue24219  opened by vlth

#24224: test_msilib is inadequate
http://bugs.python.org/issue24224  opened by zach.ware

#24225: Idlelib: changing file names
http://bugs.python.org/issue24225  opened by Al.Sweigart

#24228: Interpreter triggers  segmentation fault  at the starting
http://bugs.python.org/issue24228  opened by mdootb

#24229: pathlib.Path should have a copy() method
http://bugs.python.org/issue24229  opened by jshholland

#24230: tempfile.mkdtemp() doesn't work with bytes paths
http://bugs.python.org/issue24230  opened by durin42

#24231: os.makedirs('/', exist_ok=True) fails on Darwin
http://bugs.python.org/issue24231  opened by mew

#24234: Should we define complex.__complex__ and bytes.__bytes__?
http://bugs.python.org/issue24234  opened by gvanrossum

#24235: ABCs don't fail metaclass instantiation
http://bugs.python.org/issue24235  opened by Devin Jeanpierre

#24238: Avoid entity expansion attacks in Element Tree
http://bugs.python.org/issue24238  opened by vadmium

#24239: Allow to  configure which gpg to use in distutils upload
http://bugs.python.org/issue24239  opened by ced

#24241: webbrowser default browser detection and/or public API for _tr
http://bugs.python.org/issue24241  opened by daves

#24243: behavior for finding an empty string is inconsistent with docu
http://bugs.python.org/issue24243  opened by swanson

#24244: Python exception on strftime with %f on Python 3 and Python 2 
http://bugs.python.org/issue24244  opened by MajeedArni

#24247: "unittest discover" does modify sys.path
http://bugs.python.org/issue24247  opened by redixin

#24249: unittest API for detecting test failure in cleanup/teardown
http://bugs.python.org/issue24249  opened by r.david.murray

#24251: Different behavior for argparse between 2.7.8 and 2.7.9 when a
http://bugs.python.org/issue24251  opened by hhuang

#24252: IDLE removes elements from tracebacks.
http://bugs.python.org/issue24252  opened by ppperry

#24253: pydoc for namespace packages indicates FILE as built-in
http://bugs.python.org/issue24253  opened by Antony.Lee

#24254: Make class definition namespace ordered by default
http://bugs.python.org/issue24254  opened by eric.snow

#24255: Replace debuglevel-related logic with logging
http://bugs.python.org/issue24255  opened by demian.brecht

#24256: threading.Timer is not a class
http://bugs.python.org/issue24256  opened by jrunyon

#24258: BZ2File objects do not have name attribute
http://bugs.python.org/issue24258  opened by jojko.sivek

#24259: tar.extractall() does not recognize unexpected EOF
http://bugs.python.org/issue24259  opened by Thomas Güttler

#24260: TabError behavior doesn't match documentation
http://bugs.python.org/issue24260  opened by abacabadabacaba

#24261: Add a command line flag to suppress default signal handlers
http://bugs.python.org/issue24261  opened by abacabadabacaba

#24263: Why VALID_MODULE_NAME in unittest/loader.py is r'[_a-z]\w*\.py
http://bugs.python.org/issue24263  opened by sih4sing5hong5

#24264: imageop Unsafe Arithmetic
http://bugs.python.org/issue24264  opened by JohnLeitch

#24265: IDLE produces error message when run with both -s and -c.
http://bugs.python.org/issue24265  opened by ppperry

#24266: raw_input function (with readline): Ctrl+C (during search mode
http://bugs.python.org/issue24266  opened by sping



Most recent 15 issues with no replies (15)
==========================================

#24266: raw_input function (with readline): Ctrl+C (during search mode
http://bugs.python.org/issue24266

#24265: IDLE produces error message when run with both -s and -c.
http://bugs.python.org/issue24265

#24264: imageop Unsafe Arithmetic
http://bugs.python.org/issue24264

#24263: Why VALID_MODULE_NAME in unittest/loader.py is r'[_a-z]\w*\.py
http://bugs.python.org/issue24263

#24260: TabError behavior doesn't match documentation
http://bugs.python.org/issue24260

#24259: tar.extractall() does not recognize unexpected EOF
http://bugs.python.org/issue24259

#24258: BZ2File objects do not have name attribute
http://bugs.python.org/issue24258

#24253: pydoc for namespace packages indicates FILE as built-in
http://bugs.python.org/issue24253

#24247: "unittest discover" does modify sys.path
http://bugs.python.org/issue24247

#24239: Allow to  configure which gpg to use in distutils upload
http://bugs.python.org/issue24239

#24238: Avoid entity expansion attacks in Element Tree
http://bugs.python.org/issue24238

#24235: ABCs don't fail metaclass instantiation
http://bugs.python.org/issue24235

#24234: Should we define complex.__complex__ and bytes.__bytes__?
http://bugs.python.org/issue24234

#24224: test_msilib is inadequate
http://bugs.python.org/issue24224

#24214: Exception with utf-8, surrogatepass and incremental decoding
http://bugs.python.org/issue24214



Most recent 15 issues waiting for review (15)
=============================================

#24254: Make class definition namespace ordered by default
http://bugs.python.org/issue24254

#24244: Python exception on strftime with %f on Python 3 and Python 2 
http://bugs.python.org/issue24244

#24238: Avoid entity expansion attacks in Element Tree
http://bugs.python.org/issue24238

#24230: tempfile.mkdtemp() doesn't work with bytes paths
http://bugs.python.org/issue24230

#24225: Idlelib: changing file names
http://bugs.python.org/issue24225

#24219: Repeated integer in Lexical analysis/Integer literals section
http://bugs.python.org/issue24219

#24217: O_RDWR undefined in mmapmodule.c
http://bugs.python.org/issue24217

#24215: test_trace uses test_pprint
http://bugs.python.org/issue24215

#24209: Allow IPv6 bind in http.server
http://bugs.python.org/issue24209

#24206: Issues with equality of inspect objects
http://bugs.python.org/issue24206

#24204: string.strip() documentation is misleading
http://bugs.python.org/issue24204

#24198: please align the platform tag for windows
http://bugs.python.org/issue24198

#24195: Add `Executor.filter` to concurrent.futures
http://bugs.python.org/issue24195

#24165: Free list for single-digits ints
http://bugs.python.org/issue24165

#24164: Support pickling objects with __new__ with keyword arguments w
http://bugs.python.org/issue24164



Top 10 most discussed issues (10)
=================================

#16991: Add OrderedDict written in C
http://bugs.python.org/issue16991  22 msgs

#4709: Mingw-w64 and python on windows x64
http://bugs.python.org/issue4709  18 msgs

#24230: tempfile.mkdtemp() doesn't work with bytes paths
http://bugs.python.org/issue24230  16 msgs

#24244: Python exception on strftime with %f on Python 3 and Python 2 
http://bugs.python.org/issue24244  12 msgs

#24195: Add `Executor.filter` to concurrent.futures
http://bugs.python.org/issue24195  10 msgs

#12319: [http.client] HTTPConnection.request not support "chunked" Tra
http://bugs.python.org/issue12319   9 msgs

#24225: Idlelib: changing file names
http://bugs.python.org/issue24225   9 msgs

#23699: Add a macro to ease writing rich comparisons
http://bugs.python.org/issue23699   8 msgs

#24215: test_trace uses test_pprint
http://bugs.python.org/issue24215   8 msgs

#6598: calling email.utils.make_msgid frequently has a non-trivial pr
http://bugs.python.org/issue6598   7 msgs



Issues closed (61)
==================

#4254: _cursesmodule.c callable update_lines_cols()
http://bugs.python.org/issue4254  closed by r.david.murray

#9858: Python and C implementations of io are out of sync
http://bugs.python.org/issue9858  closed by pitrou

#10170: Relationship between turtle speed setting and actual speed is 
http://bugs.python.org/issue10170  closed by terry.reedy

#15267: tempfile.TemporaryFile and httplib incompatibility
http://bugs.python.org/issue15267  closed by serhiy.storchaka

#15836: unittest assertRaises should verify excClass is actually a Bas
http://bugs.python.org/issue15836  closed by serhiy.storchaka

#16261: Fix bare excepts in various places in std lib
http://bugs.python.org/issue16261  closed by serhiy.storchaka

#18682: [PATCH] remove bogus codepath from pprint._safe_repr
http://bugs.python.org/issue18682  closed by serhiy.storchaka

#18986: Add a case-insensitive case-preserving dict
http://bugs.python.org/issue18986  closed by serhiy.storchaka

#20098: email policy needs a mangle_from setting
http://bugs.python.org/issue20098  closed by r.david.murray

#20438: inspect: Deprecate getfullargspec?
http://bugs.python.org/issue20438  closed by yselivanov

#20596: Support for alternate wcstok syntax for Windows compilers
http://bugs.python.org/issue20596  closed by Jeffrey.Armstrong

#20691: inspect.signature: Consider exposing 'follow_wrapper_chains' o
http://bugs.python.org/issue20691  closed by yselivanov

#21083: Add get_content_disposition() to email.message.Message
http://bugs.python.org/issue21083  closed by r.david.murray

#21800: Implement RFC 6855 (IMAP Support for UTF-8) in imaplib.
http://bugs.python.org/issue21800  closed by r.david.murray

#21804: Implement the UTF8 command (RFC 6856) in poplib.
http://bugs.python.org/issue21804  closed by r.david.murray

#21931: Nonsense errors reported by msilib.FCICreate for bad argument
http://bugs.python.org/issue21931  closed by python-dev

#22027: RFC 6531 (SMTPUTF8) support in smtplib
http://bugs.python.org/issue22027  closed by r.david.murray

#22107: tempfile module misinterprets access denied error on Windows
http://bugs.python.org/issue22107  closed by serhiy.storchaka

#22155: Out of date code example for tkinter's createfilehandler
http://bugs.python.org/issue22155  closed by terry.reedy

#22804: Can't run Idle in Windows 8 or windows 64
http://bugs.python.org/issue22804  closed by terry.reedy

#23184: Unused imports, variables, file in IDLE
http://bugs.python.org/issue23184  closed by terry.reedy

#23780: Surprising behaviour when passing list to os.path.join.
http://bugs.python.org/issue23780  closed by serhiy.storchaka

#23889: Speedup inspect.Signature.bind
http://bugs.python.org/issue23889  closed by yselivanov

#23898: inspect() changes in Python3.4 are not compatible with objects
http://bugs.python.org/issue23898  closed by yselivanov

#23964: Update README documentation for IDLE tests.
http://bugs.python.org/issue23964  closed by terry.reedy

#23985: Crash when deleting slices from duplicated bytearray
http://bugs.python.org/issue23985  closed by pitrou

#24004: avoid explicit generator type check in asyncio
http://bugs.python.org/issue24004  closed by yselivanov

#24091: Use after free in Element.extend (1)
http://bugs.python.org/issue24091  closed by serhiy.storchaka

#24102: Multiple type confusions in unicode error handlers
http://bugs.python.org/issue24102  closed by serhiy.storchaka

#24127: Fatal error in launcher: Job information querying failed
http://bugs.python.org/issue24127  closed by paul.moore

#24162: [2.7 regression] test_asynchat test failure on i586-linux-gnu
http://bugs.python.org/issue24162  closed by python-dev

#24176: Incorrect parsing of unpacked expressions in call
http://bugs.python.org/issue24176  closed by python-dev

#24180: PEP 492: Documentation
http://bugs.python.org/issue24180  closed by yselivanov

#24190: BoundArguments facility to inject defaults
http://bugs.python.org/issue24190  closed by yselivanov

#24192: unexpected system error with pep420 style namespace packages
http://bugs.python.org/issue24192  closed by eric.snow

#24200: Redundant id in informative reprs
http://bugs.python.org/issue24200  closed by yselivanov

#24205: signature.bind error messages are sub-optimal
http://bugs.python.org/issue24205  closed by yselivanov

#24208: test_inspect leaks temporary directory
http://bugs.python.org/issue24208  closed by serhiy.storchaka

#24210: Tests failed with -Werror
http://bugs.python.org/issue24210  closed by berker.peksag

#24211: Add RFC 6532 support to the email library
http://bugs.python.org/issue24211  closed by r.david.murray

#24213: ProcessPoolExecutor().map() fails following an identical map()
http://bugs.python.org/issue24213  closed by ned.deily

#24216: Typo in bytes.join/bytearray.join documentation
http://bugs.python.org/issue24216  closed by r.david.murray

#24218: Also support SMTPUTF8 in smtplib's send_message method.
http://bugs.python.org/issue24218  closed by r.david.murray

#24220: ast.Call signature changed
http://bugs.python.org/issue24220  closed by willingc

#24221: Clean-up and optimization for heapq siftup() and siftdown()
http://bugs.python.org/issue24221  closed by rhettinger

#24222: Idle 2.7 -c, -r compile with print as function.
http://bugs.python.org/issue24222  closed by terry.reedy

#24223: argparse parsing (mingling --option and optional positional ar
http://bugs.python.org/issue24223  closed by vadmium

#24226: [3.5 Regression] unable to byte-compile the attached IN.py
http://bugs.python.org/issue24226  closed by yselivanov

#24227: IndentationError caused by async / await changes in parser
http://bugs.python.org/issue24227  closed by zach.ware

#24232: Speling fixes
http://bugs.python.org/issue24232  closed by berker.peksag

#24233: Link to getfqdn rather than "see above"
http://bugs.python.org/issue24233  closed by berker.peksag

#24236: TestNG results to Junit results conversion
http://bugs.python.org/issue24236  closed by tusharm

#24237: PEP 479: missing DeprecationWarning when generator_stop is not
http://bugs.python.org/issue24237  closed by yselivanov

#24240: PEP 448: Update the language reference
http://bugs.python.org/issue24240  closed by benjamin.peterson

#24242: property performance regression
http://bugs.python.org/issue24242  closed by rhettinger

#24245: Eliminate do-nothing exception handlers
http://bugs.python.org/issue24245  closed by serhiy.storchaka

#24246: mimetypes.guess_extension returns different values when init()
http://bugs.python.org/issue24246  closed by r.david.murray

#24248: Deprecate inspect.Signature.from_function and from_builtin
http://bugs.python.org/issue24248  closed by yselivanov

#24250: Optimization for strcpy(..., "") in file 'install.c'
http://bugs.python.org/issue24250  closed by benjamin.peterson

#24257: Incorrect use of PyObject_IsInstance
http://bugs.python.org/issue24257  closed by serhiy.storchaka

#24262: logging.FileHandler.close() is not thread-safe
http://bugs.python.org/issue24262  closed by dubglan

From jimjjewett at gmail.com  Fri May 22 18:45:52 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 22 May 2015 09:45:52 -0700 (PDT)
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>
Message-ID: <555f5d40.0a958c0a.3d53.ffff9ccf@mx.google.com>



Mark Shannon wrote:

> PY2, etc. really need to go.
> Assuming that this code type checks OK:
>
>  if typing.PY2:
>      type_safe_under_py2_only()
>  else:
>      type_safe_under_py3_only()
>
> Is the checker supposed to pass this:
>
>  if sys.hexversion < 0x03000000:
>      type_safe_under_py2_only()
>  else:
>      type_safe_under_py3_only()
>
> If it should pass, then why have PY2, etc. at all.

My immediate response was that there really is a difference,
when doing the equivalent of cross-compilation.  It would
help to make this explicit in the PEP.

But ...
> If it should fail, well that is just stupid and annoying.

so I'm not sure regular authors (as opposed to typing tools)
would ever have reason to use it, and making stub files diverge
further from regular Python creates an attractive nuisance
bigger than the clarification it buys.

So in the end, I believe PY2 should merely be part of the calling
convention for type tools, and that may not be worth standardizing
yet.  It *is* worth explaining why they were taken out, though.

And it is worth saying explicitly that typing tools should override
the sys module when checking for non-native environments.
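The convention under discussion can be illustrated with ordinary version-guarded code (the function below is purely illustrative): a checker told to target Python 2 would analyze only the `else` branch, while the interpreter actually running the module picks its branch at runtime.

```python
import sys

# A type checker targeting Python 2 analyzes only the "else" branch;
# the same source executed under Python 3 takes the first branch.
if sys.version_info[0] >= 3:
    def greet(name: str) -> str:
        return "hello, " + name
else:
    def greet(name):  # Python 2 path: no annotations, bytes result
        return b"hello, " + name

print(greet("world"))
```

This is exactly why the checker must not consult the `sys` of its own process: the target version is an input to the analysis, not a runtime fact.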


-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From guido at python.org  Fri May 22 18:54:45 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 May 2015 09:54:45 -0700
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <555f5d40.0a958c0a.3d53.ffff9ccf@mx.google.com>
References: <CAP7+vJ+qtRF4kngnHQEh-C+BVaS6cxPk2kifRaB03NA-NiC3sQ@mail.gmail.com>
 <555f5d40.0a958c0a.3d53.ffff9ccf@mx.google.com>
Message-ID: <CAP7+vJJNkJOGSsZczb=itWF=Dh2cHYKCBFSbxcfGNi2-Qk7uww@mail.gmail.com>

On Fri, May 22, 2015 at 9:45 AM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

>
>
> Mark Shannon wrote:
>
> > PY2, etc. really need to go.
> > Assuming that this code type checks OK:
> >
> >  if typing.PY2:
> >      type_safe_under_py2_only()
> >  else:
> >      type_safe_under_py3_only()
> >
> > Is the checker supposed to pass this:
> >
> >  if sys.hexversion < 0x03000000:
> >      type_safe_under_py2_only()
> >  else:
> >      type_safe_under_py3_only()
> >
> > If it should pass, then why have PY2, etc. at all.
>
> My immediate response was that there really is a difference,
> when doing the equivalent of cross-compilation.  It would
> help to make this explicit in the PEP.
>

That seems obvious. There's no reason why a type checker should care about
what sys.*version* is in the process that runs the type checker (that
process may not even be a Python interpreter).


> But ...
> > If it should fail, well that is just stupid and annoying.
>
> so I'm not sure regular authors (as opposed to typing tools)
> would ever have reason to use it, and making stub files more
> different from regular python creates an attractive nuisance
> bigger than the clarification.
>
> So in the end, I believe PY2 should merely be part of the calling
> convention for type tools, and that may not be worth standardizing
> yet.  It *is* worth explaining why they were taken out, though.
>

Because there is no advantage (either to the user or to the type checker)
of using e.g. typing.WINDOWS instead of using sys.platform == "win32".


> And it is worth saying explicitly that typing tools should override
> the sys module when checking for non-native environments.
>

OK, I am saying it here. People writing type checkers can decide for
themselves what they want to support. (It is already the case that mypy can
check code for conformance with various Python versions, but mypy itself
must always run in Python 3.4 or later.)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150522/fef92fe8/attachment.html>

From jimjjewett at gmail.com  Fri May 22 19:23:16 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Fri, 22 May 2015 10:23:16 -0700 (PDT)
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <CAP7+vJJHZj0=3MgkNLQKtkY2AF-H_9t_ah68jSW4Q74PX9oGtQ@mail.gmail.com>
Message-ID: <555f6604.1435370a.7cc8.ffffa683@mx.google.com>



At Thu May 21 22:27:50 CEST 2015, Guido wrote:

> I want to encourage users to think about annotations as types,
> and for most users the distinction between type and class is
> too subtle,

So what is the distinction that you are trying to make?

That a type refers to a variable (name), and a class refers to a
piece of data (object) that might be bound to that name?

Whatever the intended distinction is, please be explicit in the
PEP, even if you decide to paper it over in normal code.  For
example, the above distinction would help to explain why the
typing types can't be directly instantiated, since they aren't
meant to refer to specific data. (They can still be used as
superclasses because practicality beats purity, and using them
as a marker base class is practical.)

-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From guido at python.org  Fri May 22 19:32:37 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 May 2015 10:32:37 -0700
Subject: [Python-Dev] Status of PEP 484 and the typing module
In-Reply-To: <555f6604.1435370a.7cc8.ffffa683@mx.google.com>
References: <CAP7+vJJHZj0=3MgkNLQKtkY2AF-H_9t_ah68jSW4Q74PX9oGtQ@mail.gmail.com>
 <555f6604.1435370a.7cc8.ffffa683@mx.google.com>
Message-ID: <CAP7+vJJ9KBBfBox5DCiKsVi8UJUz1VrD=AqjVm8sP-WhGzBuhA@mail.gmail.com>

On Fri, May 22, 2015 at 10:23 AM, Jim J. Jewett <jimjjewett at gmail.com>
wrote:

>
>
> At Thu May 21 22:27:50 CEST 2015, Guido wrote:
>
> > I want to encourage users to think about annotations as types,
> > and for most users the distinction between type and class is
> > too subtle,
>
> So what is the distinction that you are trying to make?
>
> That a type refers to a variable (name), and a class refers to a
> piece of data (object) that might be bound to that name?
>

Sort of. But really a type is something in the mind of the type checker (or
the programmer) while the class is a concept that can be inspected at
runtime.


> Whatever the intended distinction is, please be explicit in the
> PEP, even if you decide to paper it over in normal code.  For
> example, the above distinction would help to explain why the
> typing types can't be directly instantiated, since they aren't
> meant to refer to specific data. (They can still be used as
> superclasses because practicality beats purity, and using them
> as a marker base class is practical.)
>

There will have to be documentation and tutorials beyond the PEP. The PEP
mostly defines a standard to be used by people implementing type checkers.
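The runtime half of that distinction can be demonstrated directly (a minimal sketch; the exact error message varies across Python versions):

```python
from typing import List

# typing constructs work as marker base classes at runtime...
class IntList(List[int]):
    pass

values = IntList([1, 2, 3])
print(isinstance(values, list))  # True: IntList is an ordinary class

# ...but a subscripted type such as List[int] is a checker-side concept
# and refuses to take part in runtime class checks:
try:
    isinstance(values, List[int])
except TypeError:
    print("subscripted generics are not runtime classes")
```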

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150522/bb89fa79/attachment.html>

From larry at hastings.org  Fri May 22 21:53:34 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 12:53:34 -0700
Subject: [Python-Dev] Reminder: Python 3.5 beta 1 will be tagged tomorrow
Message-ID: <555F893E.6080806@hastings.org>



Howdy howdy.  It's-a me, Larry, your friendly neighborhood Python 3.5 
Release Manager.

Somewhere around 2 or 3pm tomorrow I expect to tag Python 3.5 beta 1.  
We'll actually release beta 1 on Sunday, once the binary installers are 
all built.

Beta 1 is also feature-freeze, meaning no new features may be added to 
3.5 without my permission.  Since it seems useful to have a specific 
cutoff time, please stop adding features at ** 8pm Saturday UTC **.  
(That's 1pm Pacific Daylight Time.  It's also almost exactly 24 hours 
from... now.)

I remind you that this time we're trying something new: we're going to 
create the 3.5 branch when we release beta 1, allowing feature 
development (for 3.6) to continue in trunk.  At the point that I check 
in and push beta 1, I'll also merge all checkins from trunk back into 
the 3.5 branch.  After that it'll be the responsibility of the person 
checking in to check their bug fixes in to the appropriate place.  So 
please keep in mind: once the 3.5 branch becomes generally available on 
Sunday, the usual rules for a release branch will apply: bug fixes for 
3.5 should be checked in to the 3.5 branch and get merged forward into 
trunk.

If you have new features you want to ship with Python 3.5, please check 
them in as soon as possible!


Thank you for helping to make Python better,


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150522/7bd4cb7c/attachment.html>

From martin at v.loewis.de  Fri May 22 22:00:54 2015
From: martin at v.loewis.de (=?windows-1252?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Fri, 22 May 2015 22:00:54 +0200
Subject: [Python-Dev] [python-committers] How shall we conduct the
 Python 3.5 beta and rc periods? (Please vote!)
In-Reply-To: <55530470.3010705@hastings.org>
References: <555232A7.7060002@hastings.org>	<CAP1=2W7PGFVKmPXSMhy_YbqQC-zqxpFm+xYkU2drxnNZ2LsAxw@mail.gmail.com>
 <CADiSq7c0AYZOWRaawwAR6Vrzd75vXXfKz7H8Z8x8kXRz9UrTmQ@mail.gmail.com>
 <55530470.3010705@hastings.org>
Message-ID: <555F8AF6.9000909@v.loewis.de>

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Am 13.05.15 um 09:59 schrieb Larry Hastings:
> When you say "branch testing", you mean "running the buildbots
> against it"?  Right now the UI for doing that is pretty clunky.
> Kicking off a build against a server-side clone (iirc) requires
> clicking through a couple web pages, filling out a form, and
> clicking on a teeny-tiny button.  It would help *tremendously* here
> if I could get this automated, so I could run a script locally that
> made everything happen.
> 
> Is there a remote API for starting builds?  Or existing automation
> of any kind?  Who should I talk to about this stuff?

Antoine, or me. For branch builds, it would be better to configure
them into the buildbot configuration, instead of trying to force them
from the outside.

To make this happen, you need to add a repository URL and branch name
into the buildbot configuration, and a post-push hook on the repository
to trigger the build. It's actually possible to configure a bitbucket
POST hook to trigger a buildbot build, but we haven't yet integrated
that into the buildbot master.

Regards,
Martin

-----BEGIN PGP SIGNATURE-----
Version: GnuPG/MacGPG2 v2.0.22 (Darwin)
Comment: GPGTools - http://gpgtools.org

iEYEARECAAYFAlVfivYACgkQavBT8H2dyNIzCACdG3yHShN/ZEc1sIiOVYj0lcg0
K9IAnjqLCFN+EewBPLfh651wQUq64nun
=0j5m
-----END PGP SIGNATURE-----

From mark at hotpy.org  Fri May 22 22:51:36 2015
From: mark at hotpy.org (Mark Shannon)
Date: Fri, 22 May 2015 21:51:36 +0100
Subject: [Python-Dev] PEP 484 (Type Hints) announcement
Message-ID: <555F96D8.8050504@hotpy.org>

Hello all,

I am pleased to announce that I am accepting PEP 484 (Type Hints).

Given the proximity of the beta release I thought I would get this 
announcement out now, even though there are some (very) minor details to 
iron out.
(If you want to know the details, it's all at 
https://github.com/ambv/typehinting)


I hope that PEP 484 will be a benefit to all users of Python.
I think the proposed annotation semantics and accompanying module are 
technically sound and I hope that they are socially acceptable to the 
Python community.

I have long been aware that as well as a powerful, sophisticated and 
"production quality" language, Python is also used by many casual 
programmers, and as a language to introduce children to programming.
I also realise that this PEP does not look like it will be any help to 
the part-time programmer or beginner. However, I am convinced that it 
will enable significant improvements to IDEs (hopefully including IDLE), 
static checkers and other tools.
These tools will then help us all, beginners included.

This PEP has been a huge amount of work, involving a lot of people.
So thank you to everyone involved. If I were to list names I would 
inevitably miss someone out. You know who you are.

Finally, if you are worried that this will make Python ugly and turn it 
into some sort of inferior Java, then I share your concerns, but I would 
like to remind you of another potential ugliness: operator overloading.

C++, Perl and Haskell have operator overloading and it gets abused 
something rotten to produce "concise" (a.k.a. line noise) code.
Python also has operator overloading and it is used sensibly, as it 
should be. Why?
It's a cultural issue; readability matters.

Python is your language, please use type-hints responsibly :)

Cheers,
Mark.

From guido at python.org  Fri May 22 23:20:11 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 May 2015 14:20:11 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) announcement
In-Reply-To: <555F96D8.8050504@hotpy.org>
References: <555F96D8.8050504@hotpy.org>
Message-ID: <CAP7+vJLudofoe6ZGDJMApuL2OdD1fq1qf8_zzfhO=Tnu1__osg@mail.gmail.com>

Thanks Mark!
On May 22, 2015 1:52 PM, "Mark Shannon" <mark at hotpy.org> wrote:

> Hello all,
>
> I am pleased to announce that I am accepting PEP 484 (Type Hints).
>
> Given the proximity of the beta release I thought I would get this
> announcement out now, even though there are some (very) minor details to
> iron out.
> (If you want to know the details, it's all at
> https://github.com/ambv/typehinting)
>
>
> I hope that PEP 484 will be a benefit to all users of Python.
> I think the proposed annotation semantics and accompanying module are
> technically sound and I hope that they are socially acceptable to the
> Python community.
>
> I have long been aware that as well as a powerful, sophisticated and
> "production quality" language, Python is also used by many casual
> programmers, and as a language to introduce children to programming.
> I also realise that this PEP does not look like it will be any help to the
> part-time programmer or beginner. However, I am convinced that it will
> enable significant improvements to IDEs (hopefully including IDLE), static
> checkers and other tools.
> These tools will then help us all, beginners included.
>
> This PEP has been a huge amount of work, involving a lot of people.
> So thank you to everyone involved. If I were to list names I would
> inevitably miss someone out. You know who you are.
>
> Finally, if you are worried that this will make Python ugly and turn it
> into some sort of inferior Java, then I share your concerns, but I would
> like to remind you of another potential ugliness: operator overloading.
>
> C++, Perl and Haskell have operator overloading and it gets abused
> something rotten to produce "concise" (a.k.a. line noise) code.
> Python also has operator overloading and it is used sensibly, as it should
> be. Why?
> It's a cultural issue; readability matters.
>
> Python is your language, please use type-hints responsibly :)
>
> Cheers,
> Mark.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150522/fe2d5f3d/attachment.html>

From chris.barker at noaa.gov  Fri May 22 23:29:13 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 22 May 2015 14:29:13 -0700
Subject: [Python-Dev] Reminder: Python 3.5 beta 1 will be tagged tomorrow
Message-ID: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>

Is it too late to get the isclose() code (PEP 485) into 3.5?

I posted the code here, and got a tiny bit of review, but have not yet
merged it into the source tree -- and don't know the process for getting it
committed to the official source.

So -- too late, or should I try to get that merge done soon -- if so, how?
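For anyone who hasn't read the PEP: the accepted semantics amount to a symmetric relative tolerance with an optional absolute floor. A rough pure-Python equivalent of the proposed function (the actual implementation is in C, as math.isclose) looks like:

```python
def isclose(a, b, rel_tol=1e-09, abs_tol=0.0):
    # PEP 485: True when the difference is within rel_tol of the larger
    # magnitude, or within the absolute floor abs_tol (useful near zero).
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)

print(isclose(1.0, 1.0 + 1e-10))          # True: within relative tolerance
print(isclose(0.0, 1e-10, abs_tol=1e-9))  # True: caught by absolute floor
print(isclose(1.0, 1.1))                  # False
```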

-Chris



On Fri, May 22, 2015 at 12:53 PM, Larry Hastings <larry at hastings.org> wrote:

>
>
> Howdy howdy.  It's-a me, Larry, your friendly neighborhood Python 3.5
> Release Manager.
>
> Somewhere around 2 or 3pm tomorrow I expect to tag Python 3.5 beta 1.
> We'll actually release beta 1 on Sunday, once the binary installers are all
> built.
>
> Beta 1 is also feature-freeze, meaning no new features may be added to 3.5
> without my permission.  Since it seems useful to have a specific cutoff
> time, please stop adding features at ** 8pm Saturday UTC **.  (That's 1pm
> Pacific Daylight Time.  It's also almost exactly 24 hours from... now.)
>
> I remind you that this time we're trying something new: we're going to
> create the 3.5 branch when we release beta 1, allowing feature development
> (for 3.6) to continue in trunk.  At the point that I check in and push beta
> 1, I'll also merge all checkins from trunk back into the 3.5 branch.  After
> that it'll be the responsibility of the person checking in to check their bug
> fixes in to the appropriate place.  So please keep in mind: once the 3.5
> branch becomes generally available on Sunday, the usual rules for a release
> branch will apply: bug fixes for 3.5 should be checked in to the 3.5 branch
> and get merged forward into trunk.
>
> If you have new features you want to ship with Python 3.5, please check
> them in as soon as possible!
>
>
> Thank you for helping to make Python better,
>
>
> */arry*
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov
>
>


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From larry at hastings.org  Fri May 22 23:33:45 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 14:33:45 -0700
Subject: [Python-Dev] Reminder: Python 3.5 beta 1 will be tagged tomorrow
In-Reply-To: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
Message-ID: <555FA0B9.2010209@hastings.org>

On 05/22/2015 02:29 PM, Chris Barker wrote:
> Is it too late to get the isclose() code (PEP 485) into 3.5?
>
> I posted the code here, and got a tiny bit of review, but have not yet 
> merged it into the source tree -- and don't know the process for 
> getting it committed to the official source.
>
> So -- too late, or should I try to get that merge done soon -- if so, how?

Posting your plea here is a good start.  Hopefully you can find a core 
dev familiar enough with the issues involved that they can (quickly!) 
guide it through the process of getting it checked in.

Given that this is a new feature I can live with it being checked in at 
the last minute, as it shouldn't destabilize the build.  Still, I would 
prefer that it was checked in far enough in advance that you can watch 
the buildbots ( 
http://buildbot.python.org/all/waterfall?category=3.x.stable ) and maybe 
even iterate if the checkin causes problems.


//arry/

From ericsnowcurrently at gmail.com  Fri May 22 23:44:04 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Fri, 22 May 2015 15:44:04 -0600
Subject: [Python-Dev] Accepting PEP 489 (Multi-phase extension module
	initialization)
Message-ID: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>

Hi all,

After extended discussion over the last several months on import-sig,
the resulting proposal for multi-phase (PEP 451) extension module
initialization has finalized.  The resulting PEP provides a clean,
straight-forward, and backward-compatible way to import extension
modules using ModuleSpecs.

With that in mind and given the improvement it provides, PEP 489 is
now accepted.  I want to thank Petr, Nick, and Stefan for the time,
thought, and effort they put into the proposal (and implementation).
It was a disappointment to me when, at the time, we couldn't find a
good way to apply PEP 451 to builtins and extension modules.  So
thanks for easing my anxiety!

-eric
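
PEP 489 itself is a C-API change, but the ModuleSpec machinery from PEP 451 that it builds on is easy to poke at from pure Python. A small illustration (not part of the PEP, just showing the spec that the import system now uses uniformly for extension and builtin modules):

```python
import importlib.util

# find_spec returns the ModuleSpec the import system would use for a
# module.  For a builtin extension module such as math, the spec exists
# whether or not the module has been imported in this process yet.
spec = importlib.util.find_spec("math")

print(spec.name)    # math
print(spec.origin)  # "built-in" on a standard CPython build
```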

From guido at python.org  Fri May 22 23:52:11 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 May 2015 14:52:11 -0700
Subject: [Python-Dev] Accepting PEP 489 (Multi-phase extension module
	initialization)
In-Reply-To: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
References: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
Message-ID: <CAP7+vJK9=yJAyN8aAad4=69JSU3Oxwvay1NtKVnNioQMxOWBng@mail.gmail.com>

Congrats! Many thanks to all who contributed.
On May 22, 2015 2:45 PM, "Eric Snow" <ericsnowcurrently at gmail.com> wrote:

> Hi all,
>
> After extended discussion over the last several months on import-sig,
> the resulting proposal for multi-phase (PEP 451) extension module
> initialization has finalized.  The resulting PEP provides a clean,
> straight-forward, and backward-compatible way to import extension
> modules using ModuleSpecs.
>
> With that in mind and given the improvement it provides, PEP 489 is
> now accepted.  I want to thank Petr, Nick, and Stefan for the time,
> thought, and effort they put into the proposal (and implementation).
> It was a disappointment to me when, at the time, we couldn't find a
> good way to apply PEP 451 to builtins and extension modules.  So
> thanks for easing my anxiety!
>
> -eric
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>

From chris.barker at noaa.gov  Fri May 22 23:53:28 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 22 May 2015 14:53:28 -0700
Subject: [Python-Dev] Reminder: Python 3.5 beta 1 will be tagged tomorrow
In-Reply-To: <555FA0B9.2010209@hastings.org>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
Message-ID: <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>

On Fri, May 22, 2015 at 2:33 PM, Larry Hastings <larry at hastings.org> wrote:

>  On 05/22/2015 02:29 PM, Chris Barker wrote:
>
> Is it too late to get the isclose() code (PEP 485) into 3.5?
>
> ...

>   Hopefully you can find a core dev familiar enough with the issues
> involved that they can (quickly!) guide it through the process of getting
> it checked in.
>
> Ping!  Anyone willing to sponsor this?

Sorry I waited 'till this late in the game.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From Steve.Dower at microsoft.com  Sat May 23 00:06:48 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Fri, 22 May 2015 22:06:48 +0000
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <555FA32A.3030907@hastings.org>
References: <555FA32A.3030907@hastings.org>
Message-ID: <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>

The Windows 7 buildbots are failing on test_asdl_parser, but I have no idea why -- the test works for me just fine. Yury and Benjamin made the most recent changes to Python.asdl, but I have no idea what effect they would have here, or why it's Windows only.

The WS2K3 machine needs a reboot -- I pinged Trent about that months ago -- and the XP one isn't supported for 3.5.

Pending the test_asdl_parser fix, I'd also like to see AMD64 Windows 8 (http://buildbot.python.org/all/builders/AMD64%20Windows8%203.x) be promoted to stable, as it's one of only two currently using the right compiler.

Cheers,
Steve

Coming to PyData Seattle 2015<http://conf.pydata.org/seattle2015/>? Hosted by Microsoft on our Redmond campus, July 24-26

From: python-committers [mailto:python-committers-bounces+steve.dower=microsoft.com at python.org] On Behalf Of Larry Hastings
Sent: Friday, May 22, 2015 1444
To: Python Dev; python-committers
Subject: [python-committers] Can we clean up the buildbots please?



Right now we have eight online buildbots for Python trunk.  Of those, currently *six* are reporting errors in either the compile or test phases.
http://buildbot.python.org/all/waterfall?category=3.x.stable
There's one platform ("AMD64 Snow Leop") where the failures are sporadic "stack overflow" errors encountered during the test suite.  But the other five platforms have consistent failures, build after build.  Those platforms:
AMD64 OpenIndiana
AMD64 Windows7 SP1
x86 Windows Server 2003 [SB]
x86 Windows7
x86 XP-4
That includes *all* of the Windows buildbots.

Gosh, it sure would be nice if Beta 1 didn't fail on Windows, wouldn't it?  Could some Windows core devs take a look at the failures and see about cleaning them up?

Naturally the OpenIndiana and OS X devs are invited to fix the errors on those platforms too,


/arry

p.s. My apologies for not bringing attention to this sooner.  But, well, we still have a day left, right?

From larry at hastings.org  Fri May 22 23:44:10 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 14:44:10 -0700
Subject: [Python-Dev] Can we clean up the buildbots please?
Message-ID: <555FA32A.3030907@hastings.org>



Right now we have eight online buildbots for Python trunk.  Of those, 
currently *six* are reporting errors in either the compile or test phases.

    http://buildbot.python.org/all/waterfall?category=3.x.stable

There's one platform ("AMD64 Snow Leop") where the failures are sporadic 
"stack overflow" errors encountered during the test suite. But the other 
five platforms have consistent failures, build after build.  Those 
platforms:

    AMD64 OpenIndiana
    AMD64 Windows7 SP1
    x86 Windows Server 2003 [SB]
    x86 Windows7
    x86 XP-4

That includes *all* of the Windows buildbots.

Gosh, it sure would be nice if Beta 1 didn't fail on Windows, wouldn't 
it?  Could some Windows core devs take a look at the failures and see 
about cleaning them up?

Naturally the OpenIndiana and OS X devs are invited to fix the errors on 
those platforms too,


//arry/

p.s. My apologies for not bringing attention to this sooner.  But, well, 
we still have a day left, right?

From larry at hastings.org  Sat May 23 00:29:33 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 15:29:33 -0700
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <555FADCD.8@hastings.org>

On 05/22/2015 03:06 PM, Steve Dower wrote:
>
> The Windows 7 buildbots are failing on test_asdl_parser, but I have no 
> idea why -- the test works for me just fine. Yury and Benjamin made the 
> most recent changes to Python.asdl, but I have no idea what effect 
> they would have here, or why it's Windows only.
>
> The WS2K3 machine needs a reboot -- I pinged Trent about that months 
> ago -- and the XP one isn't supported for 3.5.
>
> Pending the test_asdl_parser fix, I'd also like to see AMD64 Windows 8 
> (http://buildbot.python.org/all/builders/AMD64%20Windows8%203.x) be 
> promoted to stable, as it's one of only two currently using the right 
> compiler.
>

So what you seem to be saying is, the Windows buildbots provide no 
useful signal and should be ignored?

Is MSVS 2015 the only supported compiler for Python 3.5 on Windows? 
What's the other buildbot using MSVS 2015?


//arry/

From berker.peksag at gmail.com  Sat May 23 00:34:50 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Sat, 23 May 2015 01:34:50 +0300
Subject: [Python-Dev] Reminder: Python 3.5 beta 1 will be tagged tomorrow
In-Reply-To: <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
Message-ID: <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>

On Sat, May 23, 2015 at 12:53 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> On Fri, May 22, 2015 at 2:33 PM, Larry Hastings <larry at hastings.org> wrote:
>>
>> On 05/22/2015 02:29 PM, Chris Barker wrote:
>>
>> Is it too late to get the isclose() code (PEP 485) into 3.5?
>
> ...
>>
>>   Hopefully you can find a core dev familiar enough with the issues
>> involved that they can (quickly!) guide it through the process of getting it
>> checked in.
>
> Ping!  Anyone willing to sponsor this?

Hi Chris,

Thanks for the PEP and the implementation!

You'll get more attention if you open an issue with a patch at
bugs.python.org. Having a GitHub repository is good, but the isclose()
code (with tests and documentation) needs to be integrated into the
CPython code base:

* The C implementation should be in Modules/mathmodule.c
* Tests should be in Lib/test/test_math.py
* Documentation should be in Doc/library/math.rst
* Add an entry to Doc/whatsnew/3.5.rst
* If I remember correctly, we don't need the Python implementation and its tests
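
For reference, the check PEP 485 specifies is essentially a one-liner; a rough pure-Python sketch of the intended semantics (the patch itself should be the C version in Modules/mathmodule.c):

```python
import math

def isclose(a, b, rel_tol=1e-9, abs_tol=0.0):
    # Sketch of the PEP 485 semantics: symmetric relative tolerance,
    # plus an absolute floor for comparisons near zero.
    if rel_tol < 0.0 or abs_tol < 0.0:
        raise ValueError("tolerances must be non-negative")
    if a == b:
        return True   # exact equality, including same-sign infinities
    if math.isinf(a) or math.isinf(b):
        return False  # an infinity is never "close" to anything else
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)

print(isclose(1.0, 1.0 + 1e-10))          # True
print(isclose(0.0, 1e-10))                # False: relative test fails at zero
print(isclose(0.0, 1e-10, abs_tol=1e-9))  # True: absolute floor kicks in
```

Note the abs_tol default of 0.0 -- comparisons against zero only succeed if the caller opts in with an explicit absolute tolerance.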

--Berker

From Steve.Dower at microsoft.com  Sat May 23 00:47:11 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Fri, 22 May 2015 22:47:11 +0000
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <555FADCD.8@hastings.org>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555FADCD.8@hastings.org>
Message-ID: <BY1PR03MB1466BA4F6ED1CA349E7080D8F5C00@BY1PR03MB1466.namprd03.prod.outlook.com>

Two of them are useless (x86 Windows Server 2003 [SB] and x86 XP-4, to be precise), but the fact that everything other than test_asdl_parser passes is a very valuable signal. AMD64 Windows 7 SP1 is also using the correct compiler.

Since some of our core developers are yet to upgrade, I'm not against keeping the one VS 2010 buildbot around for now. When MSFT comes up with a better way of getting the compiler than installing 8GB+ of interactive environment, then I'll be more forceful about it (and yes, I'm helping encourage the relevant teams).

Cheers,
Steve

Coming to PyData Seattle 2015<http://conf.pydata.org/seattle2015/>? Hosted by Microsoft on our Redmond campus, July 24-26

From: Larry Hastings [mailto:larry at midwinter.com] On Behalf Of Larry Hastings
Sent: Friday, May 22, 2015 1530
To: Steve Dower; Python Dev; python-committers
Cc: Yury Selivanov; Benjamin Peterson
Subject: Re: [python-committers] Can we clean up the buildbots please?

On 05/22/2015 03:06 PM, Steve Dower wrote:
The Windows 7 buildbots are failing on test_asdl_parser, but I have no idea why ? the test works for me just fine. Yury and Benjamin made the most recent changes to Python.asdl, but I have no idea what effect they would have here, or why it?s Windows only.

The WS2K3 machine needs a reboot -- I pinged Trent about that months ago -- and the XP one isn't supported for 3.5.

Pending the test_asdl_parser fix, I'd also like to see AMD64 Windows 8 (http://buildbot.python.org/all/builders/AMD64%20Windows8%203.x) be promoted to stable, as it's one of only two currently using the right compiler.

So what you seem to be saying is, the Windows buildbots provide no useful signal and should be ignored?

Is MSVS 2015 the only supported compiler for Python 3.5 on Windows?  What's the other buildbot using MSVS 2015?


/arry

From larry at hastings.org  Sat May 23 00:53:13 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 15:53:13 -0700
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <BY1PR03MB1466BA4F6ED1CA349E7080D8F5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555FADCD.8@hastings.org>
 <BY1PR03MB1466BA4F6ED1CA349E7080D8F5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <555FB359.5080307@hastings.org>


> *From:*Larry Hastings [mailto:larry at midwinter.com] *On Behalf Of 
> *Larry Hastings
> *Sent:* Friday, May 22, 2015 1530
> *To:* Steve Dower; Python Dev; python-committers
> *Cc:* Yury Selivanov; Benjamin Peterson
> *Subject:* Re: [python-committers] Can we clean up the buildbots please?
>
> Is MSVS 2015 the only supported compiler for Python 3.5 on Windows?  
> What's the other buildbot using MSVS 2015?

I'll answer my own question here.  According to PCbuild/readme.txt:

    This script will use the env.bat script to detect one of Visual
    Studio 2015, 2013, 2012, or 2010, any of which may be used to build
    Python, though only Visual Studio 2015 is officially supported.


I'll admit I'm puzzled by the wisdom of using unsupported compilers on 
buildbots.  I guess it's a historical thing.  But I gently suggest that 
we should either upgrade those buildbots to a supported compiler or 
remove them entirely.  Definitely we should remove the two unsupported 
platforms from the buildbots--that's just crazy.


//arry/

From raymond.hettinger at gmail.com  Sat May 23 01:18:04 2015
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Fri, 22 May 2015 16:18:04 -0700
Subject: [Python-Dev] Accepting PEP 489 (Multi-phase extension module
	initialization)
In-Reply-To: <CAP7+vJK9=yJAyN8aAad4=69JSU3Oxwvay1NtKVnNioQMxOWBng@mail.gmail.com>
References: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
 <CAP7+vJK9=yJAyN8aAad4=69JSU3Oxwvay1NtKVnNioQMxOWBng@mail.gmail.com>
Message-ID: <9DF82386-B143-4E69-9E3D-43AA295DD6D9@gmail.com>


> On May 22, 2015, at 2:52 PM, Guido van Rossum <guido at python.org> wrote:
> 
> Congrats! Many thanks to all who contributed.
> 
> On May 22, 2015 2:45 PM, "Eric Snow" <ericsnowcurrently at gmail.com> wrote:
> Hi all,
> 
> After extended discussion over the last several months on import-sig,
> the resulting proposal for multi-phase (PEP 451) extension module
> initialization has finalized.  The resulting PEP provides a clean,
> straight-forward, and backward-compatible way to import extension
> modules using ModuleSpecs.
> 
> With that in mind and given the improvement it provides, PEP 489 is
> now accepted.

I echo that sentiment.  Thank you for your work.


Raymond

From ncoghlan at gmail.com  Sat May 23 01:33:47 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 23 May 2015 09:33:47 +1000
Subject: [Python-Dev] Accepting PEP 489 (Multi-phase extension module
	initialization)
In-Reply-To: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
References: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
Message-ID: <CADiSq7cbjCtEhD+earVshGyjn5KSCeCcSe4fqByWcxQi3a2G8g@mail.gmail.com>

On 23 May 2015 07:45, "Eric Snow" <ericsnowcurrently at gmail.com> wrote:
>
> Hi all,
>
> After extended discussion over the last several months on import-sig,
> the resulting proposal for multi-phase (PEP 451) extension module
> initialization has finalized.  The resulting PEP provides a clean,
> straight-forward, and backward-compatible way to import extension
> modules using ModuleSpecs.
>
> With that in mind and given the improvement it provides, PEP 489 is
> now accepted.  I want to thank Petr, Nick, and Stefan for the time,
> thought, and effort they put into the proposal (and implementation).
> It was a disappointment to me when, at the time, we couldn't find a
> good way to apply PEP 451 to builtins and extension modules.  So
> thanks for easing my anxiety!

Ah, what a nice way to start my weekend :)

Thanks especially to Petr both for writing the PEP itself, and for taking
the general ideas Stefan and I originally had and turning them into a real
implementation.

Cheers,
Nick.

>
> -eric
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com

From encukou at gmail.com  Sat May 23 00:30:55 2015
From: encukou at gmail.com (Petr Viktorin)
Date: Sat, 23 May 2015 00:30:55 +0200
Subject: [Python-Dev] Accepting PEP 489 (Multi-phase extension module
	initialization)
In-Reply-To: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
References: <CALFfu7CD2EwMTXZNSLW7BwPadmdbO8t19_AmLfZU7q9QN4KR4A@mail.gmail.com>
Message-ID: <CA+=+wqB8=Gg=WLzNb2cB=gcJn6zyP08g-jYGDfSV-+qWce5_jg@mail.gmail.com>

On Fri, May 22, 2015 at 11:44 PM, Eric Snow <ericsnowcurrently at gmail.com> wrote:
> Hi all,
>
> After extended discussion over the last several months on import-sig,
> the resulting proposal for multi-phase (PEP 451) extension module
> initialization has finalized.  The resulting PEP provides a clean,
> straight-forward, and backward-compatible way to import extension
> modules using ModuleSpecs.
>
> With that in mind and given the improvement it provides, PEP 489 is
> now accepted.  I want to thank Petr, Nick, and Stefan for the time,
> thought, and effort they put into the proposal (and implementation).
> It was a disappointment to me when, at the time, we couldn't find a
> good way to apply PEP 451 to builtins and extension modules.  So
> thanks for easing my anxiety!

Thank you for the thorough review, Eric! Also thanks to everyone
involved for insightful discussion, and especially Nick for guiding me
through writing my first PEP.
Let me know if there's anything more I can (or should -- still my
first time) do.
I'm off now but I'll be online tomorrow all day (UTC-ish hours).

From ncoghlan at gmail.com  Sat May 23 01:39:50 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 23 May 2015 09:39:50 +1000
Subject: [Python-Dev] =?utf-8?q?Hello=2C_I_am_Andr=C3=A9_Freitas_=3A=29?=
In-Reply-To: <CAMkX=YVin0NYSGqs5SdFCxPZp7gthMBsuoDs7Uox4cbKdPsLig@mail.gmail.com>
References: <CAMkX=YVin0NYSGqs5SdFCxPZp7gthMBsuoDs7Uox4cbKdPsLig@mail.gmail.com>
Message-ID: <CADiSq7fJZPcb6EzB0AgD=UNyQ=LtxtStpOvSMx4ij1RKM=7=3A@mail.gmail.com>

On 23 May 2015 01:15, "André Freitas" <p.andrefreitas at gmail.com> wrote:
>
> Hi there,
> My name is Andr? Freitas, I'm 22 years old and I live in Portugal
(Porto). I'm currently finishing my Masters in Informatics and Computer
Science with a thesis in Mining Software Repositories, where I am able to
predict defects in Software components:
https://github.com/andrefreitas/schwa

Oh, nice. I've been intrigued by the MSR conference series since Greg
Wilson pointed it out to me at SciPy last year, but haven't had the time to
go look at the research myself.

> I'm a Python developer with 4 years of experience and as a Speaker, did a
lot of Python workshops in Engineering Universities. I'm always learning
new things and I really love Python (it's my religion)! I have skills in
Security, Tests and Quality, Devops, Software Architecture and Engineering,
UI/UX and Compilers.
>
> I am reading your guidelines and just checking around to see how this
mailing list works. Hope to send some patches soon and share some ideas.

Excellent! You may want to sign up for the core-mentorship mailing list (
http://pythonmentors.com/), as that's our preferred venue for helping folks
get used to core development processes.

Cheers,
Nick.

From trent at snakebite.org  Sat May 23 02:11:39 2015
From: trent at snakebite.org (Trent Nelson)
Date: Fri, 22 May 2015 20:11:39 -0400
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <20150523000714.GA7927@snakebite.org>

On Fri, May 22, 2015 at 10:06:48PM +0000, Steve Dower wrote:
> The Windows 7 buildbots are failing on test_asdl_parser, but I have no
> idea why -- the test works for me just fine. Yury and Benjamin made the
> most recent changes to Python.asdl, but I have no idea what effect
> they would have here, or why it's Windows only.
> 
> The WS2K3 machine needs a reboot -- I pinged Trent about that months
> ago -- and the XP one isn't supported for 3.5.

    Gave it a little bit of love just then (haven't been able to access
    it for months as the main switch needed a reboot).  There were like,
    155 cl.exe processes wedged and a bunch of error reporting dialogs.

    Do we still support WS2K3?  (Can I even install VS 2015 on that?  I
    would have thought not.)

        Trent.

From larry at hastings.org  Sat May 23 02:24:53 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 22 May 2015 17:24:53 -0700
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <20150523000714.GA7927@snakebite.org>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <20150523000714.GA7927@snakebite.org>
Message-ID: <555FC8D5.6090401@hastings.org>

On 05/22/2015 05:11 PM, Trent Nelson wrote:
> Do we still support WS2K3? (Can I even install VS 2015 on that? I 
> would have thought not.) 

According to PCbuild/readme.txt, no.  It says:

    This directory is used to build CPython for Microsoft Windows NT
    version 6.0 or higher (Windows Vista, Windows Server 2008, or later)
    on 32 and 64 bit platforms.



//arry/

From trent at snakebite.org  Sat May 23 02:31:05 2015
From: trent at snakebite.org (Trent Nelson)
Date: Fri, 22 May 2015 20:31:05 -0400
Subject: [Python-Dev] [python-committers] Can we clean up the buildbots
	please?
In-Reply-To: <555FC8D5.6090401@hastings.org>
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <20150523000714.GA7927@snakebite.org>
 <555FC8D5.6090401@hastings.org>
Message-ID: <20150523003052.GB7927@snakebite.org>

On Fri, May 22, 2015 at 05:24:53PM -0700, Larry Hastings wrote:
> On 05/22/2015 05:11 PM, Trent Nelson wrote:
> >Do we still support WS2K3? (Can I even install VS 2015 on that? I would
> >have thought not.)
> 
> According to PCbuild/readme.txt, no.  It says:
> 
>    This directory is used to build CPython for Microsoft Windows NT
>    version 6.0 or higher (Windows Vista, Windows Server 2008, or later)
>    on 32 and 64 bit platforms.

    Ah, yeah, thought so.  Pity, that box is probably the only one that
    hasn't had any form of hardware failure during its tenure ;-)

    Tried to get the W2K8 one back up on Monday when I had some remote
    hands but alas, no luck.  Think it has borked HDDs or something.

    The Solaris 11 AMD64 and Solaris 10 SPARC ones are back up now
    though, and I just cleared out their 700+ build backlogs, FWIW.

        Trent.

From stephen at xemacs.org  Sat May 23 05:05:08 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Sat, 23 May 2015 12:05:08 +0900
Subject: [Python-Dev]  PEP 484 (Type Hints) announcement
In-Reply-To: <555F96D8.8050504@hotpy.org>
References: <555F96D8.8050504@hotpy.org>
Message-ID: <87mw0wgfiz.fsf@uwakimon.sk.tsukuba.ac.jp>

Mark Shannon writes:
 > Hello all,
 > 
 > I am pleased to announce that I am accepting PEP 484 (Type Hints).

Congratulations to all concerned!

 > Python is your language, please use type-hints responsibly :)

+1 QOTW (not to mention ROTFLMAO)

From taleinat at gmail.com  Sat May 23 11:29:02 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sat, 23 May 2015 12:29:02 +0300
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
Message-ID: <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>

On Sat, May 23, 2015 at 1:34 AM, Berker Peksağ <berker.peksag at gmail.com> wrote:
>
> On Sat, May 23, 2015 at 12:53 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> > On Fri, May 22, 2015 at 2:33 PM, Larry Hastings <larry at hastings.org> wrote:
> >>
> >> On 05/22/2015 02:29 PM, Chris Barker wrote:
> >>
> >> Is it too late to get the isclose() code (PEP 485) into 3.5?
> >
> > ...
> >>
> >>   Hopefully you can find a core dev familiar enough with the issues
> >> involved that they can (quickly!) guide it through the process of getting it
> >> checked in.
> >
> > Ping!  Anyone willing to sponsor this?
>
> ...
>
> * The C implementation should be in Modules/mathmodule.c
> * Tests should be in Lib/test/test_math.py
> * Documentation should be in Doc/library/math.rst
> * Add an entry to Doc/whatsnew/3.5.rst
> * If I remember correctly, we don't need the Python implementation and its tests

I'll happily review the patch once it's on the bug tracker as Berker described.

- Tal Einat

From ncoghlan at gmail.com  Sat May 23 15:25:19 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 23 May 2015 23:25:19 +1000
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
 <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
Message-ID: <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>

On 23 May 2015 at 19:29, Tal Einat <taleinat at gmail.com> wrote:
> On Sat, May 23, 2015 at 1:34 AM, Berker Peksağ <berker.peksag at gmail.com> wrote:
>>
>> On Sat, May 23, 2015 at 12:53 AM, Chris Barker <chris.barker at noaa.gov> wrote:
>> > On Fri, May 22, 2015 at 2:33 PM, Larry Hastings <larry at hastings.org> wrote:
>> >>
>> >> On 05/22/2015 02:29 PM, Chris Barker wrote:
>> >>
>> >> Is it too late to get the isclose() code (PEP 485) into 3.5?
>> >
>> > ...
>> >>
>> >>   Hopefully you can find a core dev familiar enough with the issues
>> >> involved that they can (quickly!) guide it through the process of getting it
>> >> checked in.
>> >
>> > Ping!  Anyone willing to sponsor this?
>>
>> ...
>>
>> * The C implementation should be in Modules/mathmodule.c
>> * Tests should be in Lib/test/test_math.py
>> * Documentation should be in Doc/library/math.rst
>> * Add an entry to Doc/whatsnew/3.5.rst
>> * If I remember correctly, we don't need the Python implementation and its tests
>
> I'll happily review the patch once it's on the bug tracker as Berker described.

I filed http://bugs.python.org/issue24270 to track this, but there's a
fair bit of work to be done to integrate the changes into the existing
math module's code, tests and documentation.

And correct, there's no need for a pure Python implementation - Guido
rejected the idea of a pure Python fallback for the math module a
while back (http://bugs.python.org/issue23595)
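For readers following along, the comparison being merged boils down to a symmetric relative/absolute tolerance test. A rough pure-Python sketch of the intended PEP 485 semantics (illustrative only, not the C code under review):

```python
import math

def isclose(a, b, rel_tol=1e-09, abs_tol=0.0):
    """Sketch of PEP 485: True when a and b differ by no more than
    rel_tol times the larger magnitude, or by no more than abs_tol."""
    if rel_tol < 0.0 or abs_tol < 0.0:
        raise ValueError("tolerances must be non-negative")
    if a == b:          # fast path; also makes inf close to itself
        return True
    if math.isinf(a) or math.isinf(b):
        return False    # an infinity is never close to a finite value
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)
```

Note that near zero the relative term vanishes, so comparisons against 0.0 only succeed when an explicit abs_tol is given — one of the design points PEP 485 discusses at length.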

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From alex.gronholm at nextday.fi  Sat May 23 11:20:16 2015
From: alex.gronholm at nextday.fi (=?windows-1252?Q?Alex_Gr=F6nholm?=)
Date: Sat, 23 May 2015 12:20:16 +0300
Subject: [Python-Dev] PEP 484 (Type Hints) announcement
In-Reply-To: <555F96D8.8050504@hotpy.org>
References: <555F96D8.8050504@hotpy.org>
Message-ID: <55604650.8030608@nextday.fi>

Would you mind updating the "typing" package on PyPI now to contain 
something useful? Thanks.

On 22.05.2015 at 23:51, Mark Shannon wrote:
> Hello all,
>
> I am pleased to announce that I am accepting PEP 484 (Type Hints).
>
> Given the proximity of the beta release I thought I would get this 
> announcement out now, even though there are some (very) minor details 
> to iron out.
> (If you want to know the details, it's all at 
> https://github.com/ambv/typehinting)
>
>
> I hope that PEP 484 will be a benefit to all users of Python.
> I think the proposed annotation semantics and accompanying module are 
> technically sound and I hope that they are socially acceptable to the 
> Python community.
>
> I have long been aware that as well as a powerful, sophisticated and 
> "production quality" language, Python is also used by many casual 
> programmers, and as a language to introduce children to programming.
> I also realise that this PEP does not look like it will be any help to 
> the part-time programmer or beginner. However, I am convinced that it 
> will enable significant improvements to IDEs (hopefully including 
> IDLE), static checkers and other tools.
> These tools will then help us all, beginners included.
>
> This PEP has been a huge amount of work, involving a lot of people.
> So thank you to everyone involved. If I were to list names I would 
> inevitably miss someone out. You know who you are.
>
> Finally, if you are worried that this will make Python ugly and turn 
> it into some sort of inferior Java, then I share your concerns, but I 
> would like to remind you of another potential ugliness: operator 
> overloading.
>
> C++, Perl and Haskell have operator overloading and it gets abused 
> something rotten to produce "concise" (a.k.a. line noise) code.
> Python also has operator overloading and it is used sensibly, as it 
> should be. Why?
> It's a cultural issue; readability matters.
>
> Python is your language, please use type-hints responsibly :)
>
> Cheers,
> Mark.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: 
> https://mail.python.org/mailman/options/python-dev/alex.gronholm%40nextday.fi


From db3l.net at gmail.com  Sat May 23 21:39:09 2015
From: db3l.net at gmail.com (David Bolen)
Date: Sat, 23 May 2015 15:39:09 -0400
Subject: [Python-Dev] Can we clean up the buildbots please?
References: <555FA32A.3030907@hastings.org>
 <BY1PR03MB14664E2808229B57A040E23CF5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555FADCD.8@hastings.org>
 <BY1PR03MB1466BA4F6ED1CA349E7080D8F5C00@BY1PR03MB1466.namprd03.prod.outlook.com>
 <555FB359.5080307@hastings.org>
Message-ID: <m24mn3ytgi.fsf@valheru.db3l.homeip.net>

Larry Hastings <larry at hastings.org> writes:

>> Is MSVS 2015 the only supported compiler for Python 3.5 on Windows?
>> What's the other buildbot using MSVS 2015?

For a while I think the only buildbot was my 8.1 slave, but I believe
at this point Jeremy may also have it on his 7 slave.  The latest on
my 7 slave is still 2010 (which is still working, sans recent test
failures).

> I'll answer my own question here.  According to PCbuild/readme.txt:
>
>    This script will use the env.bat script to detect one of Visual
>    Studio 2015, 2013, 2012, or 2010, any of which may be used to build
>    Python, though only Visual Studio 2015 is officially supported.
>
> I'll admit I'm puzzled by the wisdom of using unsupported compilers on
> buildbots.  I guess it's a historical thing.  But I gently suggest
> that we should either upgrade those buildbots to a supported compiler
> or remove them entirely.  Definitely we should remove the two
> unsupported platforms from the buildbots--that's just crazy.

To be fair, VS 2015 hasn't been officially released yet.  It only
recently (as in a few weeks ago) reached RC stage.  Given the size of
installing it, and earlier uncertainty about upgrading during the
pre-release cycle, plus some early issues with the build process, for
my part I've opted to hold off with my older slaves until it hits
release status, using only the 8.1 slave until then.  (Arguably the
current RC is supposed to be at most a minor update away from full
release, so we're probably close)

Along the way it was concluded that XP just wasn't worth making work
for the 3.5+ development, but the slave was still valuable for the 2.7
branch, so would be left around for now for that purpose.  It is a bit
misleading to still be trying to build the 3.x branch on it but I
suspect eliminating the branch from that slave is just an oversight,
or nobody with the proper access has had time yet.

-- David


From larry at hastings.org  Sat May 23 21:39:27 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 23 May 2015 12:39:27 -0700
Subject: [Python-Dev] Fwd: Re: [python-committers] Reminder: Python 3.5 beta
 1 will be tagged tomorrow
In-Reply-To: <5560D39D.1070603@hastings.org>
References: <5560D39D.1070603@hastings.org>
Message-ID: <5560D76F.7090002@hastings.org>


Whoops, didn't send my reply to both lists.  Forwarded, below.

-------- Forwarded Message --------
Subject: 	Re: [python-committers] [Python-Dev] Reminder: Python 3.5 beta 
1 will be tagged tomorrow
Date: 	Sat, 23 May 2015 12:23:09 -0700
From: 	Larry Hastings <larry at hastings.org>
To: 	python-committers at python.org



On 05/23/2015 06:25 AM, Nick Coghlan wrote:
> I filed http://bugs.python.org/issue24270 to track this, but there's a
> fair bit of work to be done to integrate the changes into the existing
> math module's code, tests and documentation.

I'm willing to consider a feature freeze exception for this, as long as

  * it doesn't make invasive changes (it looks like it will literally
    add one new entry point, which is acceptable)
  * it's cleaned up in the way the core devs are proposing (integrate it
    into the math module, including tests and documentation)
  * it's done before beta 2

Somebody, please take that as an encouragement to get this cleaned up 
and ready for checkin.


//arry/

p.s. Would it make sense to add a form of isclose to unittest?


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150523/bfdeac2c/attachment.html>

From bcannon at gmail.com  Sat May 23 22:22:49 2015
From: bcannon at gmail.com (Brett Cannon)
Date: Sat, 23 May 2015 20:22:49 +0000
Subject: [Python-Dev] [Python-checkins] peps: PEP 489: The PEP is
	accepted.
In-Reply-To: <20150522214552.126970.6375@psf.io>
References: <20150522214552.126970.6375@psf.io>
Message-ID: <CAP1=2W5_pNtcXZfcM1+RcG=pB-yyBNPBj8rYFvibFd86TbWydg@mail.gmail.com>

Are you also going to check the code in or is someone else doing it?

On Fri, May 22, 2015, 17:47 eric.snow <python-checkins at python.org> wrote:

> https://hg.python.org/peps/rev/1fbc23a1078c
> changeset:   5874:1fbc23a1078c
> user:        Eric Snow <ericsnowcurrently at gmail.com>
> date:        Fri May 22 15:45:38 2015 -0600
> summary:
>   PEP 489: The PEP is accepted.
>
> files:
>   pep-0489.txt |  11 ++++++++---
>   1 files changed, 8 insertions(+), 3 deletions(-)
>
>
> diff --git a/pep-0489.txt b/pep-0489.txt
> --- a/pep-0489.txt
> +++ b/pep-0489.txt
> @@ -7,13 +7,13 @@
>          Nick Coghlan <ncoghlan at gmail.com>
>  BDFL-Delegate: Eric Snow <ericsnowcurrently at gmail.com>
>  Discussions-To: import-sig at python.org
> -Status: Draft
> +Status: Final
>  Type: Standards Track
>  Content-Type: text/x-rst
>  Created: 11-Aug-2013
>  Python-Version: 3.5
> -Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015
> -Resolution:
> +Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015, 7-May-2015,
> 18-May-2015
> +Resolution:
> https://mail.python.org/pipermail/python-dev/2015-May/140108.html
>
>
>  Abstract
> @@ -730,8 +730,13 @@
>
>  * PyModuleDef_Slot
>
> +Other changes:
> +
>  PyModuleDef.m_reload changes to PyModuleDef.m_slots.
>
> +``BuiltinImporter`` and ``ExtensionFileLoader`` will now implement
> +``create_module`` and ``exec_module``.
> +
>  The internal ``_imp`` module will have backwards incompatible changes:
>  ``create_builtin``, ``create_dynamic``, and ``exec_dynamic`` will be
> added;
>  ``init_builtin``, ``load_dynamic`` will be removed.
>
> --
> Repository URL: https://hg.python.org/peps
> _______________________________________________
> Python-checkins mailing list
> Python-checkins at python.org
> https://mail.python.org/mailman/listinfo/python-checkins
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150523/d61f3b55/attachment.html>

From tjreedy at udel.edu  Sat May 23 22:24:07 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 23 May 2015 16:24:07 -0400
Subject: [Python-Dev] 3.5 doc warnings
Message-ID: <mjqnli$7mr$1@ger.gmane.org>

35\Doc\whatsnew\3.5.rst:686: ERROR: Unknown interpreted text role "module".

35\Doc\library\typing.rst:: WARNING: document isn't included in any toctree

from building html docs just now

-- 
Terry Jan Reedy


From ericsnowcurrently at gmail.com  Sat May 23 22:32:58 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Sat, 23 May 2015 14:32:58 -0600
Subject: [Python-Dev] [Python-checkins] peps: PEP 489: The PEP is
	accepted.
In-Reply-To: <CAP1=2W5_pNtcXZfcM1+RcG=pB-yyBNPBj8rYFvibFd86TbWydg@mail.gmail.com>
References: <20150522214552.126970.6375@psf.io>
 <CAP1=2W5_pNtcXZfcM1+RcG=pB-yyBNPBj8rYFvibFd86TbWydg@mail.gmail.com>
Message-ID: <CALFfu7BxtVPHg=1HXk0sdvM9hALazc=9ESiMvR_WGWytW=eFrA@mail.gmail.com>

On Sat, May 23, 2015 at 2:22 PM, Brett Cannon <bcannon at gmail.com> wrote:
> Are you also going to check the code in or is someone else doing it?

Nick already did:

http://bugs.python.org/issue24268
https://hg.python.org/cpython/rev/e729b946cc03

:)

-eric

>
>
> On Fri, May 22, 2015, 17:47 eric.snow <python-checkins at python.org> wrote:
>>
>> https://hg.python.org/peps/rev/1fbc23a1078c
>> changeset:   5874:1fbc23a1078c
>> user:        Eric Snow <ericsnowcurrently at gmail.com>
>> date:        Fri May 22 15:45:38 2015 -0600
>> summary:
>>   PEP 489: The PEP is accepted.
>>
>> files:
>>   pep-0489.txt |  11 ++++++++---
>>   1 files changed, 8 insertions(+), 3 deletions(-)
>>
>>
>> diff --git a/pep-0489.txt b/pep-0489.txt
>> --- a/pep-0489.txt
>> +++ b/pep-0489.txt
>> @@ -7,13 +7,13 @@
>>          Nick Coghlan <ncoghlan at gmail.com>
>>  BDFL-Delegate: Eric Snow <ericsnowcurrently at gmail.com>
>>  Discussions-To: import-sig at python.org
>> -Status: Draft
>> +Status: Final
>>  Type: Standards Track
>>  Content-Type: text/x-rst
>>  Created: 11-Aug-2013
>>  Python-Version: 3.5
>> -Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015
>> -Resolution:
>> +Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015, 7-May-2015,
>> 18-May-2015
>> +Resolution:
>> https://mail.python.org/pipermail/python-dev/2015-May/140108.html
>>
>>
>>  Abstract
>> @@ -730,8 +730,13 @@
>>
>>  * PyModuleDef_Slot
>>
>> +Other changes:
>> +
>>  PyModuleDef.m_reload changes to PyModuleDef.m_slots.
>>
>> +``BuiltinImporter`` and ``ExtensionFileLoader`` will now implement
>> +``create_module`` and ``exec_module``.
>> +
>>  The internal ``_imp`` module will have backwards incompatible changes:
>>  ``create_builtin``, ``create_dynamic``, and ``exec_dynamic`` will be
>> added;
>>  ``init_builtin``, ``load_dynamic`` will be removed.
>>
>> --
>> Repository URL: https://hg.python.org/peps
>> _______________________________________________
>> Python-checkins mailing list
>> Python-checkins at python.org
>> https://mail.python.org/mailman/listinfo/python-checkins
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com
>

From berker.peksag at gmail.com  Sat May 23 23:28:20 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Sun, 24 May 2015 00:28:20 +0300
Subject: [Python-Dev] 3.5 doc warnings
In-Reply-To: <mjqnli$7mr$1@ger.gmane.org>
References: <mjqnli$7mr$1@ger.gmane.org>
Message-ID: <CAF4280+qb9orSn1JY+jf1YLMdxqeZeO2XPobeE-KjtQ2Gd7ObQ@mail.gmail.com>

On Sat, May 23, 2015 at 11:24 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> 35\Doc\whatsnew\3.5.rst:686: ERROR: Unknown interpreted text role "module".
>
> 35\Doc\library\typing.rst:: WARNING: document isn't included in any toctree
>
> from building html docs just now

Fixed in https://hg.python.org/cpython/rev/ec1e187173f7

--Berker

From bcannon at gmail.com  Sat May 23 23:33:06 2015
From: bcannon at gmail.com (Brett Cannon)
Date: Sat, 23 May 2015 21:33:06 +0000
Subject: [Python-Dev] [Python-checkins] peps: PEP 489: The PEP is
	accepted.
In-Reply-To: <CALFfu7BxtVPHg=1HXk0sdvM9hALazc=9ESiMvR_WGWytW=eFrA@mail.gmail.com>
References: <20150522214552.126970.6375@psf.io>
 <CAP1=2W5_pNtcXZfcM1+RcG=pB-yyBNPBj8rYFvibFd86TbWydg@mail.gmail.com>
 <CALFfu7BxtVPHg=1HXk0sdvM9hALazc=9ESiMvR_WGWytW=eFrA@mail.gmail.com>
Message-ID: <CAP1=2W6wXarcpvRVywrLvZJLsZkLtgNOWSiTUM=JuPWeJo5sEw@mail.gmail.com>

Ah thanks. I just kept an eye out for your name. :)

On Sat, May 23, 2015, 16:32 Eric Snow <ericsnowcurrently at gmail.com> wrote:

> On Sat, May 23, 2015 at 2:22 PM, Brett Cannon <bcannon at gmail.com> wrote:
> > Are you also going to check the code in or is someone else doing it?
>
> Nick already did:
>
> http://bugs.python.org/issue24268
> https://hg.python.org/cpython/rev/e729b946cc03
>
> :)
>
> -eric
>
> >
> >
> > On Fri, May 22, 2015, 17:47 eric.snow <python-checkins at python.org>
> wrote:
> >>
> >> https://hg.python.org/peps/rev/1fbc23a1078c
> >> changeset:   5874:1fbc23a1078c
> >> user:        Eric Snow <ericsnowcurrently at gmail.com>
> >> date:        Fri May 22 15:45:38 2015 -0600
> >> summary:
> >>   PEP 489: The PEP is accepted.
> >>
> >> files:
> >>   pep-0489.txt |  11 ++++++++---
> >>   1 files changed, 8 insertions(+), 3 deletions(-)
> >>
> >>
> >> diff --git a/pep-0489.txt b/pep-0489.txt
> >> --- a/pep-0489.txt
> >> +++ b/pep-0489.txt
> >> @@ -7,13 +7,13 @@
> >>          Nick Coghlan <ncoghlan at gmail.com>
> >>  BDFL-Delegate: Eric Snow <ericsnowcurrently at gmail.com>
> >>  Discussions-To: import-sig at python.org
> >> -Status: Draft
> >> +Status: Final
> >>  Type: Standards Track
> >>  Content-Type: text/x-rst
> >>  Created: 11-Aug-2013
> >>  Python-Version: 3.5
> >> -Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015
> >> -Resolution:
> >> +Post-History: 23-Aug-2013, 20-Feb-2015, 16-Apr-2015, 7-May-2015,
> >> 18-May-2015
> >> +Resolution:
> >> https://mail.python.org/pipermail/python-dev/2015-May/140108.html
> >>
> >>
> >>  Abstract
> >> @@ -730,8 +730,13 @@
> >>
> >>  * PyModuleDef_Slot
> >>
> >> +Other changes:
> >> +
> >>  PyModuleDef.m_reload changes to PyModuleDef.m_slots.
> >>
> >> +``BuiltinImporter`` and ``ExtensionFileLoader`` will now implement
> >> +``create_module`` and ``exec_module``.
> >> +
> >>  The internal ``_imp`` module will have backwards incompatible changes:
> >>  ``create_builtin``, ``create_dynamic``, and ``exec_dynamic`` will be
> >> added;
> >>  ``init_builtin``, ``load_dynamic`` will be removed.
> >>
> >> --
> >> Repository URL: https://hg.python.org/peps
> >> _______________________________________________
> >> Python-checkins mailing list
> >> Python-checkins at python.org
> >> https://mail.python.org/mailman/listinfo/python-checkins
> >
> >
> > _______________________________________________
> > Python-Dev mailing list
> > Python-Dev at python.org
> > https://mail.python.org/mailman/listinfo/python-dev
> > Unsubscribe:
> >
> https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com
> >
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150523/34b4c846/attachment-0001.html>

From benjamin at python.org  Sun May 24 01:50:04 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 23 May 2015 19:50:04 -0400
Subject: [Python-Dev] [RELEASE] Python 2.7.10
Message-ID: <1432425004.1774211.276632913.5FA67C5E@webmail.messagingengine.com>

The next bugfix release of the Python 2.7.x series, Python 2.7.10, has
been released. The only interesting change since the release candidate
is a fix for a regression in cookie parsing.

Downloads are available at:
  https://www.python.org/downloads/release/python-2710/

Report bugs at:
  https://bugs.python.org

Enjoy your 2 digit versions,
Benjamin
(on behalf of 2.7.10's contributors)

From nad at acm.org  Sun May 24 02:44:45 2015
From: nad at acm.org (Ned Deily)
Date: Sat, 23 May 2015 17:44:45 -0700
Subject: [Python-Dev] devguide: Updated dev guide to reflect the new
	workflow we're trying for 3.5.
References: <20150523083226.126986.26481@psf.io> <5560B054.307@udel.edu>
Message-ID: <nad-03FA9D.17444523052015@news.gmane.org>

In article <5560B054.307 at udel.edu>, Terry Reedy <tjreedy at udel.edu> 
wrote:
> I somehow did not understand this last part before.  Rather I thought 
> the need for pull requests would be highly restricted (and not affect me 
> ;-).  3.5 bugfixes (and idlelib patches, unclassified), especially those 
> applied to 3.4, should automatically be in the next 3.5 release.  I 
> cannot imagine a reason not to do so. Otherwise, we could end up with 
> the awful anomaly that a bugfix (or Idle change) could be in 3.4.4 (the 
> final maintenance version that comes out after 3.5.0) but not in 3.5.0 
> itself.

The need for "pull requests" *is* highly restricted.  Note that the 
"pull request" proposal appears in the Release Candidate section and 
only applies after 3.5.0rc1 is finalized.  During the Beta phase that 
we're entering now, bugfixes should be checked into the new 3.5 branch 
and they will be released first in 3.5.0b2 or 3.5.0b3.  After rc1, 
bugfixes checked into the 3.5 branch will be released first in 3.5.1 
unless they are deemed release critical for 3.5.0 in which case the pull 
request would be needed.  The goal is to have zero release critical 
fixes after rc1; usually there are very few.

-- 
 Ned Deily,
 nad at acm.org


From ericsnowcurrently at gmail.com  Sun May 24 03:15:35 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Sat, 23 May 2015 19:15:35 -0600
Subject: [Python-Dev] Preserving the definition order of class namespaces.
Message-ID: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>

tl;dr Are there any objections to making the default
cls.__prepare__ return OrderedDict instead of dict (and preserve that
order in a list on the class)?

A couple years ago [1][2] I proposed making class definition
namespaces use OrderedDict by default.  Said Guido [3]:

    I'm fine with doing this by default for a class namespace; the type of
    cls.__dict__ is already a non-dict (it's a proxy) and it's unlikely to
    have 100,000 entries.

It turns out making cls.__dict__ an OrderedDict isn't reasonably
tractable (due to the concrete API v. subclasses), but really that
isn't what I was looking for anyway.

Regardless, since it's been a while I just want to run the proposal by
the group again.  I'm hopeful about landing my C implementation of
OrderedDict [4] in the next few days.  Also, I have a patch up [5]
that implements using OrderedDict for class definitions.  So mostly I
just want to double check that I'm still good to go.

Just to be clear on what I'm proposing specifically, I've summed it up below.

-eric

---------------------

Currently if you want to preserve the order of a class definition you
have to use a metaclass with a __prepare__ method (see PEP 3115).
However, as that PEP points out [6], the common case for __prepare__
is to use OrderedDict.  I'm proposing that we return OrderedDict() by
default from __prepare__.  Considering the common case, we should also
expose that definition order on the class afterward since otherwise
the extra information from the class definition namespace is discarded
(type.__new__ copies it into a dict which is then used for
cls.__dict__).
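
For anyone who hasn't used PEP 3115, the status quo described above looks roughly like this (a minimal sketch using only what exists today; the metaclass and attribute names are made up for illustration):

```python
from collections import OrderedDict

class OrderingMeta(type):
    @classmethod
    def __prepare__(meta, name, bases, **kwds):
        # The namespace the class body executes in is an OrderedDict,
        # so assignment order is recorded as the body runs.
        return OrderedDict()

    def __new__(meta, name, bases, ns, **kwds):
        cls = super().__new__(meta, name, bases, dict(ns))
        # Capture the order before it is lost in the plain-dict copy.
        cls._order = [k for k in ns if not k.startswith('__')]
        return cls

class Spam(metaclass=OrderingMeta):
    a = 1
    b = 2
    c = 3

print(Spam._order)  # ['a', 'b', 'c']
```

The proposal would make the OrderedDict namespace (and a captured order attribute) the default, so no metaclass is needed for the common case.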

So the key changes are:

* use OrderedDict by default for class definition namespace (e.g. from
type.__prepare__)
* expose that definition order as cls.__definition_order__ (a list)

(Note that I will not be changing the actual type of cls.__dict__
(i.e. tp_dict) which will remain a dict.)

The effect of the change would be that the following are basically
equivalent (relative to the definition namespace):

    class Meta(type):
        @classmethod
        def __prepare__(meta, *args, **kwargs):
            return OrderedDict()

    class SpamOld(metaclass=Meta):
        a = 1
        b = 2
        c = 3
        __definition_order__ = list(locals())

    class SpamNew:
        a = 1
        b = 2
        c = 3

    assert SpamOld.__definition_order__ == SpamNew.__definition_order__

The key differences are:

* for SpamNew you don't need to use a metaclass [7][8]
* for SpamNew you don't need to rely on the behavior of locals()
* for SpamNew the class definition isn't cluttered with extra
boilerplate for __definition_order__
* class decorators that care about definition order [9] don't have to
require that classes like SpamNew manually preserve that order somehow

The patch for the change is pretty minimal. [5]

Also, Nick Coghlan recently expressed that he favored using
OrderedDict by default over the alternative presented by PEP 422/487.
[10]


[1] https://mail.python.org/pipermail/python-ideas/2013-February/019690.html
[2] https://mail.python.org/pipermail/python-dev/2013-June/127103.html
[3] Guido: https://mail.python.org/pipermail/python-ideas/2013-February/019704.html
[4] http://bugs.python.org/issue16991
[5] http://bugs.python.org/issue24254
[6] see the "Alternate Proposals" section of
https://www.python.org/dev/peps/pep-3115/
[7] PEPs 422 and 487 relatedly focus on the benefits of reducing the
need to use metaclasses
[8] https://mail.python.org/pipermail/python-ideas/2013-February/019706.html
[9] see "Key points" on
https://mail.python.org/pipermail/python-dev/2013-February/124439.html
[10] Nick: https://mail.python.org/pipermail/python-ideas/2015-March/032254.html

From ncoghlan at gmail.com  Sun May 24 04:04:21 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 24 May 2015 12:04:21 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
Message-ID: <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>

On 24 May 2015 at 11:15, Eric Snow <ericsnowcurrently at gmail.com> wrote:
> tl;dr Are there any objections to making the default
> cls.__prepare__ return OrderedDict instead of dict (and preserve that
> order in a list on the class)?
>
> A couple years ago [1][2] I proposed making class definition
> namespaces use OrderedDict by default.  Said Guido [3]:
>
>     I'm fine with doing this by default for a class namespace; the type of
>     cls.__dict__ is already a non-dict (it's a proxy) and it's unlikely to
>     have 100,000 entries.
>
> It turns out making cls.__dict__ an OrderedDict isn't reasonably
> tractable (due to the concrete API v. subclasses), but really that
> isn't what I was looking for anyway.
>
> Regardless, since it's been a while I just want to run the proposal by
> the group again.  I'm hopeful about landing my C implementation of
> OrderedDict [4] in the next few days.  Also, I have a patch up [5]
> that implements using OrderedDict for class definitions.  So mostly I
> just want to double check that I'm still good to go.

While it isn't controversial (since you already have the +1 from
Guido), it's worth writing up the change as a PEP for 3.6 anyway,
since that then provides clearer guidance to alternate implementations
that they're going to need to change the way their class namespace
evaluation works for 3.6.

Let's not repeat the zip archive and directory execution mistake that
3.5's PEP 441 aimed to resolve :)

PEP 487 could then be updated to reference that PEP as part of the
rationale for dropping the "easy namespace customisation" aspect of
the proposal.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Sun May 24 04:38:34 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 24 May 2015 12:38:34 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
Message-ID: <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>

On 24 May 2015 at 12:04, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 24 May 2015 at 11:15, Eric Snow <ericsnowcurrently at gmail.com> wrote:
>> tl;dr Are there any objections to making the default
>> cls.__prepare__ return OrderedDict instead of dict (and preserve that
>> order in a list on the class)?
>>
>> A couple years ago [1][2] I proposed making class definition
>> namespaces use OrderedDict by default.  Said Guido [3]:
>>
>>     I'm fine with doing this by default for a class namespace; the type of
>>     cls.__dict__ is already a non-dict (it's a proxy) and it's unlikely to
>>     have 100,000 entries.
>>
>> It turns out making cls.__dict__ an OrderedDict isn't reasonably
>> tractable (due to the concrete API v. subclasses), but really that
>> isn't what I was looking for anyway.
>>
>> Regardless, since it's been a while I just want to run the proposal by
>> the group again.  I'm hopeful about landing my C implementation of
>> OrderedDict [4] in the next few days.  Also, I have a patch up [5]
>> that implements using OrderedDict for class definitions.  So mostly I
>> just want to double check that I'm still good to go.
>
> While it isn't controversial (since you already have the +1 from
> Guido), it's worth writing up the change as a PEP for 3.6 anyway,
> since that then provides clearer guidance to alternate implementations
> that they're going to need to change the way their class namespace
> evaluation works for 3.6.

Eric clarified for me that Larry was considering granting a feature
freeze exemption to defer landing this to beta 2 while Eric tracked
down a segfault bug in the current patch that provides a C
implementation of OrderedDict. That sounds like a nicer approach than
what I did for PEP 489 (where I checked in an initial version that I
knew still had a refleak bug in it), so +1 from me for going down that
path.

A top level section in the What's New would cover my concerns
regarding making sure folks are suitably aware of the change (as I
believe leaving it out of the original 2.6 What's New document was the
real problem with making people aware of the addition of zip archive
and directory execution support).

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From larry at hastings.org  Sun May 24 05:14:56 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 23 May 2015 20:14:56 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
Message-ID: <55614230.5010904@hastings.org>



On 05/23/2015 07:38 PM, Nick Coghlan wrote:
> Eric clarified for me that Larry was considering granting a feature
> freeze exemption to defer landing this to beta 2 while Eric tracked
> down a segfault bug in the current patch that provides a C
> implementation of OrderedDict.

Yeah, I'm willing to grant the feature freeze exception, assuming he can 
find general approval from the community (and assuming he still has 
Guido's blessing).  I just wanted a little more sunlight on the topic, 
rather than rushing to check it in.


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150523/daa538c8/attachment.html>

From guido at python.org  Sun May 24 06:46:57 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 23 May 2015 21:46:57 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
Message-ID: <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>

How will __definition_order__ be set in the case where __prepare__ doesn't
return an OrderedDict? Or where a custom metaclass's __new__ calls its
superclass's __new__ with a plain dict? (I just wrote some code that does
that. :-)
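
(The pattern in question — current-Python semantics, hypothetical names — is something like the following, where the definition order never reaches type.__new__:

```python
from collections import OrderedDict

class Meta(type):
    @classmethod
    def __prepare__(meta, name, bases, **kwds):
        return OrderedDict()

    def __new__(meta, name, bases, ns, **kwds):
        # The order is discarded here: only a plain dict reaches
        # type.__new__, so the default machinery would have no
        # ordered namespace from which to record definition order.
        return super().__new__(meta, name, bases, dict(ns))

class C(metaclass=Meta):
    x = 1
    y = 2
```

How the proposal handles such metaclasses is exactly what the question above asks.)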

On Sat, May 23, 2015 at 7:38 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:

> On 24 May 2015 at 12:04, Nick Coghlan <ncoghlan at gmail.com> wrote:
> > On 24 May 2015 at 11:15, Eric Snow <ericsnowcurrently at gmail.com> wrote:
> >> tl;dr Are there any objections to making the default
> >> cls.__prepare__ return OrderedDict instead of dict (and preserve that
> >> order in a list on the class)?
> >>
> >> A couple years ago [1][2] I proposed making class definition
> >> namespaces use OrderedDict by default.  Said Guido [3]:
> >>
> >>     I'm fine with doing this by default for a class namespace; the type
> of
> >>     cls.__dict__ is already a non-dict (it's a proxy) and it's unlikely
> to
> >>     have 100,000 entries.
> >>
> >> It turns out making cls.__dict__ an OrderedDict isn't reasonably
> >> tractable (due to the concrete API v. subclasses), but really that
> >> isn't what I was looking for anyway.
> >>
> >> Regardless, since it's been a while I just want to run the proposal by
> >> the group again.  I'm hopeful about landing my C implementation of
> >> OrderedDict [4] in the next few days.  Also, I have a patch up [5]
> >> that implements using OrderedDict for class definitions.  So mostly I
> >> just want to double check that I'm still good to go.
> >
> > While it isn't controversial (since you already have the +1 from
> > Guido), it's worth writing up the change as a PEP for 3.6 anyway,
> > since that then provides clearer guidance to alternate implementations
> > that they're going to need to change the way their class namespace
> > evaluation works for 3.6.
>
> Eric clarified for me that Larry was considering granting a feature
> freeze exemption to defer landing this to beta 2 while Eric tracked
> down a segfault bug in the current patch that provides a C
> implementation of OrderedDict. That sounds like a nicer approach than
> what I did for PEP 489 (where I checked in an initial version that I
> knew still had a refleak bug in it), so +1 from me for going down that
> path.
>
> A top level section in the What's New would cover my concerns
> regarding making sure folks are suitably aware of the change (as I
> believe leaving it out of the original 2.6 What's New document was the
> real problem with making people aware of the addition of zip archive
> and directory execution support).
>
> Regards,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>



-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150523/5176c663/attachment.html>

From larry at hastings.org  Sun May 24 06:58:26 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 23 May 2015 21:58:26 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
Message-ID: <55615A72.1000906@hastings.org>



On 05/23/2015 09:46 PM, Guido van Rossum wrote:
> How will __definition_order__ be set in the case where __prepare__ 
> doesn't return an OrderedDict? Or where a custom metaclass's __new__ 
> calls its superclass's __new__ with a plain dict? (I just wrote some 
> code that does that. :-)

In his patch, type_new tests to see if the dict passed in is an ordered 
dict (PyODict_Check).  __definition_order__ is only created and 
populated if it passes the test.

    http://bugs.python.org/file39446/odict-class-definition-namespace.diff
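For readers who don't want to dig into the C patch, here is a rough Python
sketch of what that check amounts to (illustrative names only; the real
logic lives in C inside type_new):

```python
from collections import OrderedDict

class OrderingMeta(type):
    """Rough Python analogue of the PyODict_Check logic in type_new."""
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        return OrderedDict()

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # Only remember the order when the namespace actually carries one;
        # a plain dict passed in by a custom metaclass skips this branch.
        if isinstance(namespace, OrderedDict):
            cls.__definition_order__ = [key for key in namespace
                                        if not key.startswith('__')]
        return cls

class Example(metaclass=OrderingMeta):
    b = 1
    a = 2

print(Example.__definition_order__)  # ['b', 'a']
```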


//arry/

From guido at python.org  Sun May 24 07:32:27 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 23 May 2015 22:32:27 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <55615A72.1000906@hastings.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <55615A72.1000906@hastings.org>
Message-ID: <CAP7+vJJPH2LaW3KSiYgzeb6rVZOBUjT6MkX2t+hU8ubb_xTvaA@mail.gmail.com>

But isn't that also a problem? It would make the existence of that member a
bit unpredictable.

On Saturday, May 23, 2015, Larry Hastings <larry at hastings.org> wrote:

>
>
> On 05/23/2015 09:46 PM, Guido van Rossum wrote:
>
> How will __definition_order__ be set in the case where __prepare__ doesn't
> return an OrderedDict? Or where a custom metaclass's __new__ calls its
> superclass's __new__ with a plain dict? (I just wrote some code that does
> that. :-)
>
>
> In his patch, type_new tests to see if the dict passed in is an ordered
> dict (PyODict_Check).  __definition_order__ is only created and populated
> if it passes the test.
>
> http://bugs.python.org/file39446/odict-class-definition-namespace.diff
>
>
> */arry*
>


-- 
--Guido van Rossum (on iPad)

From ericsnowcurrently at gmail.com  Sun May 24 07:53:36 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Sat, 23 May 2015 23:53:36 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
Message-ID: <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>

On May 23, 2015 10:47 PM, "Guido van Rossum" <guido at python.org> wrote:
>
> How will __definition_order__ be set in the case where __prepare__
> doesn't return an OrderedDict? Or where a custom metaclass's __new__
> calls its superclass's __new__ with a plain dict? (I just wrote some
> code that does that. :-)

I was planning on setting it to None if the order is not available.  At the
moment that's just a check for OrderedDict.

-eric

>
> On Sat, May 23, 2015 at 7:38 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>> On 24 May 2015 at 12:04, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> > On 24 May 2015 at 11:15, Eric Snow <ericsnowcurrently at gmail.com> wrote:
>> >> tl;dr Are there any objections to making the default
>> >> cls.__prepare__ return OrderedDict instead of dict (and preserve that
>> >> order in a list on the class)?
>> >>
>> >> A couple years ago [1][2] I proposed making class definition
>> >> namespaces use OrderedDict by default.  Said Guido [3]:
>> >>
>> >>     I'm fine with doing this by default for a class namespace; the type
>> >>     of cls.__dict__ is already a non-dict (it's a proxy) and it's
>> >>     unlikely to have 100,000 entries.
>> >>
>> >> It turns out making cls.__dict__ an OrderedDict isn't reasonably
>> >> tractable (due to the concrete API v. subclasses), but really that
>> >> isn't what I was looking for anyway.
>> >>
>> >> Regardless, since it's been a while I just want to run the proposal by
>> >> the group again.  I'm hopeful about landing my C implementation of
>> >> OrderedDict [4] in the next few days.  Also, I have a patch up [5]
>> >> that implements using OrderedDict for class definitions.  So mostly I
>> >> just want to double check that I'm still good to go.
>> >
>> > While it isn't controversial (since you already have the +1 from
>> > Guido), it's worth writing up the change as a PEP for 3.6 anyway,
>> > since that then provides clearer guidance to alternate implementations
>> > that they're going to need to change the way their class namespace
>> > evaluation works for 3.6.
>>
>> Eric clarified for me that Larry was considering granting a feature
>> freeze exemption to defer landing this to beta 2 while Eric tracked
>> down a segfault bug in the current patch that provides a C
>> implementation of OrderedDict. That sounds like a nicer approach than
>> what I did for PEP 489 (where I checked in an initial version that I
>> knew still had a refleak bug in it), so +1 from me for going down that
>> path.
>>
>> A top level section in the What's New would cover my concerns
>> regarding making sure folks are suitably aware of the change (as I
>> believe leaving it out of the original 2.6 What's New document was the
>> real problem with making people aware of the addition of zip archive
>> and directory execution support).
>>
>> Regards,
>> Nick.
>>
>> --
>> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/guido%40python.org
>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)

From ncoghlan at gmail.com  Sun May 24 11:35:13 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 24 May 2015 19:35:13 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
Message-ID: <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>

On 24 May 2015 at 15:53, Eric Snow <ericsnowcurrently at gmail.com> wrote:
>
> On May 23, 2015 10:47 PM, "Guido van Rossum" <guido at python.org> wrote:
>>
>> How will __definition_order__ be set in the case where __prepare__ doesn't
>> return an OrderedDict? Or where a custom metaclass's __new__ calls its
>> superclass's __new__ with a plain dict? (I just wrote some code that does
>> that. :-)
>
> I was planning on setting it to None if the order is not available.  At the
> moment that's just a check for OrderedDict.

Is it specifically necessary to save the order by default? Metaclasses
would be able to access the ordered namespace in their __new__ method
regardless, and for 3.6, I still like the __init_subclass__ hook idea
proposed in PEP 487, which includes passing the original namespace to
the new hook.

So while I'm sold on the value of making class execution namespaces
ordered by default, I'm not yet sold on the idea of *remembering* that
order without opting in to doing so in the metaclass.

If we leave __definition_order__ out for the time being then, for the
vast majority of code, the fact that the ephemeral namespace used to
evaluate the class body switched from being a basic dictionary to an
ordered one would be a hidden implementation detail, rather than
making all type objects a little bigger.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From mark at hotpy.org  Sun May 24 11:44:45 2015
From: mark at hotpy.org (Mark Shannon)
Date: Sun, 24 May 2015 10:44:45 +0100
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
Message-ID: <55619D8D.9070507@hotpy.org>



On 24/05/15 10:35, Nick Coghlan wrote:
> On 24 May 2015 at 15:53, Eric Snow <ericsnowcurrently at gmail.com> wrote:
>>
>> On May 23, 2015 10:47 PM, "Guido van Rossum" <guido at python.org> wrote:
>>>
>>> How will __definition_order__ be set in the case where __prepare__ doesn't
>>> return an OrderedDict? Or where a custom metaclass's __new__ calls its
>>> superclass's __new__ with a plain dict? (I just wrote some code that does
>>> that. :-)
>>
>> I was planning on setting it to None if the order is not available.  At the
>> moment that's just a check for OrderedDict.
>
> Is it specifically necessary to save the order by default? Metaclasses
> would be able to access the ordered namespace in their __new__ method
> regardless, and for 3.6, I still like the __init_subclass__ hook idea
> proposed in PEP 487, which includes passing the original namespace to
> the new hook.
>
> So while I'm sold on the value of making class execution namespaces
> ordered by default, I'm not yet sold on the idea of *remembering* that
> order without opting in to doing so in the metaclass.
>
> If we leave __definition_order__ out for the time being then, for the
> vast majority of code, the fact that the ephemeral namespace used to
> evaluate the class body switched from being a basic dictionary to an
> ordered one would be a hidden implementation detail, rather than
> making all type objects a little bigger.
and a little slower.

Cheers,
Mark.

From ncoghlan at gmail.com  Sun May 24 12:06:40 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 24 May 2015 20:06:40 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <55619D8D.9070507@hotpy.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
 <55619D8D.9070507@hotpy.org>
Message-ID: <CADiSq7fuHp7byY1cbweCNf2vrMocRHZvrHm=fJtYwZCR-=tRyg@mail.gmail.com>

On 24 May 2015 at 19:44, Mark Shannon <mark at hotpy.org> wrote:
> On 24/05/15 10:35, Nick Coghlan wrote:
>> If we leave __definition_order__ out for the time being then, for the
>> vast majority of code, the fact that the ephemeral namespace used to
>> evaluate the class body switched from being a basic dictionary to an
>> ordered one would be a hidden implementation detail, rather than
>> making all type objects a little bigger.
> and a little slower.

The runtime namespace used to store the class attributes is remaining
a plain dict object regardless, it's only the ephemeral one that's
used to evaluate the class body at definition time that Eric's
proposing to switch to an ordered dictionary.

That approach avoids any new runtime overhead when using the defined
class, while still making the order of attribute assignment available
to custom metaclasses by default.
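To make the split concrete, here is a small sketch (illustrative names):
a metaclass that opts in to an ordered class-body namespace via __prepare__
sees the definition order at class-creation time, while the class's own
__dict__ stays backed by a plain dict. Under Eric's patch the __prepare__
override would no longer be needed, since the default namespace would
already be ordered.

```python
from collections import OrderedDict

class CaptureOrder(type):
    # Today you opt in via __prepare__; under the proposal the default
    # class-body namespace would already be an OrderedDict.
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        return OrderedDict()

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # The ordered, ephemeral namespace is only visible here, at
        # definition time; the runtime class namespace is a plain dict.
        cls._attribute_order = [key for key in namespace
                                if not key.startswith('__')]
        return cls

class Point(metaclass=CaptureOrder):
    x = 0
    y = 0
    def move(self, dx, dy):
        return (self.x + dx, self.y + dy)

print(Point._attribute_order)  # ['x', 'y', 'move']
```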

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From taleinat at gmail.com  Sun May 24 13:40:12 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sun, 24 May 2015 14:40:12 +0300
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
 <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
 <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>
Message-ID: <CALWZvp5YWGV-kdb1ijjTwNg-+xOrkup7ZZTyz1SNS0kwVLDUtQ@mail.gmail.com>

On Sat, May 23, 2015 at 4:25 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 23 May 2015 at 19:29, Tal Einat <taleinat at gmail.com> wrote:
>> On Sat, May 23, 2015 at 1:34 AM, Berker Peksa? <berker.peksag at gmail.com> wrote:
>>>
>>> * The C implementation should be in Modules/mathmodule.c
>>> * Tests should be in Lib/test/test_math.py
>>> * Documentation should be in Doc/library/math.rst
>>> * Add an entry to Doc/whatsnew/3.5.rst
>>> * If I remember correctly, we don't need the Python implementation and its tests
>>
>> I'll happily review the patch once it's on the bug tracker as Berker described.
>
> I filed http://bugs.python.org/issue24270 to track this, but there's a
> fair bit of work to be done to integrate the changes into the existing
> math module's code, tests and documentation.

Done. Patch attached to the issue. Awaiting review!

- Tal Einat

From chris.barker at noaa.gov  Sun May 24 17:40:02 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Sun, 24 May 2015 08:40:02 -0700
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CALWZvp5YWGV-kdb1ijjTwNg-+xOrkup7ZZTyz1SNS0kwVLDUtQ@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
 <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
 <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>
 <CALWZvp5YWGV-kdb1ijjTwNg-+xOrkup7ZZTyz1SNS0kwVLDUtQ@mail.gmail.com>
Message-ID: <CALGmxE+=TqkBhOm29r85Rfe7UQFbgvQJwk3w==fhdS4NrEe9bg@mail.gmail.com>

On Sun, May 24, 2015 at 4:40 AM, Tal Einat <taleinat at gmail.com> wrote:

> > I filed http://bugs.python.org/issue24270 to track this, but there's a
> > fair bit of work to be done to integrate the changes into the existing
> > math module's code, tests and documentation.
>
> Done. Patch attached to the issue. Awaiting review!
>

Wow! Thanks so much! I'm a bit tied up with my day job right now
(http://incidentnews.noaa.gov/incident/8934), so I wasn't sure I could
get to that fiddly work soon enough.

I should find a couple of hours to look over it all today, I hope.

What do folks think about adding one to cmath as well, while we are at it?
It should be pretty straightforward -- I could focus what time I have on
doing that.

-Chris





-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From taleinat at gmail.com  Sun May 24 19:00:35 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sun, 24 May 2015 20:00:35 +0300
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CALGmxE+=TqkBhOm29r85Rfe7UQFbgvQJwk3w==fhdS4NrEe9bg@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
 <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
 <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>
 <CALWZvp5YWGV-kdb1ijjTwNg-+xOrkup7ZZTyz1SNS0kwVLDUtQ@mail.gmail.com>
 <CALGmxE+=TqkBhOm29r85Rfe7UQFbgvQJwk3w==fhdS4NrEe9bg@mail.gmail.com>
Message-ID: <CALWZvp5R_OODFtHHCGRMtpv__ZbAeknj0VmnkTyf6=XC6q4vCg@mail.gmail.com>

On Sun, May 24, 2015 at 6:40 PM, Chris Barker <chris.barker at noaa.gov> wrote:
>
> What do folks think about adding one to cmath as well, while we are at it?
> It should be pretty straightforward -- I could focus what time I have to do
> that.

I prefer focusing on getting math.isclose() in before tackling
cmath.isclose(), though it would indeed be very straightforward given
the work already done. Larry has stated he's willing to make an
exception to the "no new features" rule for this, so I think we should
have time to get the cmath version in for 3.5 even if we wait a few
days with it. So if you have time, I'd prefer that you thoroughly
review the patch.

- Tal Einat

From gmludo at gmail.com  Sun May 24 19:22:17 2015
From: gmludo at gmail.com (Ludovic Gasc)
Date: Sun, 24 May 2015 19:22:17 +0200
Subject: [Python-Dev] A yocto change proposal in the logging module to simplify
 structured logs support
Message-ID: <CAON-fpEHScFMFyjL_uYWY4XtUGskYuYqDi5zgno4aYYFe2GJsw@mail.gmail.com>

Hi,

1. The problem

For now, when you want to write a log message, you concatenate the data
from your context to generate a string: in effect, you convert your
structured data into a string.
When a sysadmin needs to debug your logs because something is wrong, he
must write regular expressions to extract the interesting data.

Often, he must find the beginning of the interesting log entry and follow
the trail. Sometimes several requests are interleaved in the log at the
same time, which makes the interesting lines harder to find.
In effect, with regular expressions, the sysadmin tries to convert the log
lines (strings) back into structured data.

2. A possible solution

You could provide a set of regular expressions to your sysadmins to help
them find the right logs; however, another approach is possible:
structured logs.
Instead of breaking your data structure apart to push it into the log
message, the idea is to keep the data structure and attach it as metadata
on the log message.
So far, I know of at least Logstash and journald, which can handle
structured logs and provide a query tool to extract logs easily.

3. A concrete example with structured logs

Like most web developers, we build HTTP daemons used by several different
human clients at the same time.
In the Python source code, supporting structured logs doesn't require a
big change; you can use the "extra" parameter for that, for example:
    [handle HTTP request]
    LOG.debug('Receive a create_or_update request',
              extra={'request_id': request.request_id,
                     'account_id': account_id,
                     'aiohttp_request': request,
                     'payload': str(payload)})
    [create data in database]
    LOG.debug('Callflow created',
              extra={'account_id': account_id,
                     'request_id': request.request_id,
                     'aiopg_cursor': cur,
                     'results': row})

Now, if you want, you can enhance the structured log with a custom logging
Handler, because the standard journald handler doesn't know how to handle
aiohttp_request or aiopg_cursor.
My example is based on journald, but you can write an equivalent version
with python-logstash:
####
from systemdream.journal.handler import JournalHandler

class Handler(JournalHandler):
    # Tip: on a system without journald, use socat to test:
    # socat UNIX-RECV:/run/systemd/journal/socket STDIN
    def emit(self, record):
        if record.extra:
            if 'aiohttp_request' in record.extra:
                request = record.extra['aiohttp_request']
                record.extra['http_method'] = request.method
                record.extra['http_path'] = request.path
                record.extra['http_headers'] = str(request.headers)
                del record.extra['aiohttp_request']
            if 'aiopg_cursor' in record.extra:
                cursor = record.extra['aiopg_cursor']
                record.extra['pg_query'] = cursor.query.decode('utf-8')
                record.extra['pg_status_message'] = cursor.statusmessage
                record.extra['pg_rows_count'] = cursor.rowcount
                del record.extra['aiopg_cursor']
        super().emit(record)
####

And you can enable this custom handler in your logging config file like
this:
[handler_journald]
class=XXXXXXXXXX.utils.logs.Handler
args=()
formatter=detailed

And now, with journalctl, you can easily extract logs. Some examples:
Log messages from the 'lg' account:
    journalctl ACCOUNT_ID=lg
All HTTP requests that modify the 'lg' account (PUT, POST and DELETE):
    journalctl ACCOUNT_ID=lg HTTP_METHOD=PUT HTTP_METHOD=POST HTTP_METHOD=DELETE
Retrieve all logs from one specific HTTP request:
    journalctl REQUEST_ID=130b8fa0-6576-43b6-a624-4a4265a2fbdd
All HTTP requests with a specific path:
    journalctl HTTP_PATH=/v1/accounts/lg/callflows
All logs of the "create" function in the file "example.py":
    journalctl CODE_FUNC=create CODE_FILE=/path/example.py

If you have ever troubleshot a production system, you can see the value of
this: it's like having SQL query capabilities, but oriented toward logs.
We have been using this for a short time on one of our critical daemons,
which handles a lot of requests across several servers, and it has already
been adopted by our support team.

4. The yocto issue with the Python logging module

I'm not recounting this small part of my professional life for my own
pleasure, but to help you understand the context and the use cases,
because my patch for logging is very small.
If you're an expert in Python logging, you already know that the Handler
class example I provided above can't run on stock Python logging, because
LogRecord doesn't have an extra attribute.

The extra parameter exists on the Logger, but in the LogRecord it's merged
into the LogRecord's attributes:
https://github.com/python/cpython/blob/master/Lib/logging/__init__.py#L1386

This means that when the LogRecord is sent to the Handler, you can't
retrieve the dict that was passed as the logger's extra parameter.
The only way to do that without patching Python's logging is to rebuild
the dict yourself from a list of the official attributes of LogRecord, as
is done in python-logstash:
https://github.com/vklochan/python-logstash/blob/master/logstash/formatter.py#L23
To me, that's a little bit dirty.
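For reference, a minimal sketch of that workaround (illustrative names):
recover the extra fields by diffing a record's attributes against the set
that every LogRecord carries, which is essentially what python-logstash's
formatter does.

```python
import logging

# Attributes present on every LogRecord; anything beyond these arrived
# via the logger's 'extra' parameter.
_DEFAULT_ATTRS = set(
    logging.LogRecord('x', logging.INFO, 'path', 0, '', (), None).__dict__
) | {'message', 'asctime'}

def extract_extra(record):
    """Return the fields that were passed via extra={...}."""
    return {key: value for key, value in record.__dict__.items()
            if key not in _DEFAULT_ATTRS}

class CaptureHandler(logging.Handler):
    def emit(self, record):
        self.captured = extract_extra(record)

logger = logging.getLogger('demo')
handler = CaptureHandler()
logger.addHandler(handler)
logger.warning('Callflow created',
               extra={'request_id': 'abc', 'account_id': 'lg'})
print(handler.captured)  # {'request_id': 'abc', 'account_id': 'lg'}
```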

The quick-and-dirty patch I currently use on our production CPython:

diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py
index 104b0be..30fa6ef 100644
--- a/Lib/logging/__init__.py
+++ b/Lib/logging/__init__.py
@@ -1382,6 +1382,7 @@ class Logger(Filterer):
         """
         rv = _logRecordFactory(name, level, fn, lno, msg, args, exc_info, func,
                              sinfo)
+        rv.extra = extra
         if extra is not None:
             for key in extra:
                 if (key in ["message", "asctime"]) or (key in rv.__dict__):

To me, it would be cleaner to add "extra" as a parameter of
_logRecordFactory, but I have no idea about the side effects; I understand
that the logging module is critical, because it's used everywhere.
However, apart from python-logstash, to my knowledge the extra parameter
isn't massively used.
The only backward incompatibility I see with a new extra attribute on
LogRecord is that a log call like this:
    LOG.debug('message', extra={'extra': 'example'})
would raise a KeyError("Attempt to overwrite 'extra' in LogRecord")
exception; but to me, the probability of this use case is close to zero.
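For what it's worth, that reserved-key behaviour already exists today for
names that collide with built-in LogRecord attributes, so a new 'extra'
attribute would simply join the reserved list:

```python
import logging

logger = logging.getLogger('demo')
try:
    # 'msg' is already an attribute of LogRecord, so this collides today.
    logger.warning('message', extra={'msg': 'clash'})
except KeyError as exc:
    # KeyError is raised before anything is logged.
    print(exc)
```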

Instead of "maintaining" this yocto patch, small as it is, I would prefer
to have a clean solution in Python directly.

Thanks for your remarks.

Regards.
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

From chris.barker at noaa.gov  Sun May 24 21:57:22 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Sun, 24 May 2015 12:57:22 -0700
Subject: [Python-Dev] [python-committers] Reminder: Python 3.5 beta 1
 will be tagged tomorrow
In-Reply-To: <CALWZvp5R_OODFtHHCGRMtpv__ZbAeknj0VmnkTyf6=XC6q4vCg@mail.gmail.com>
References: <CALGmxEJmVrCYqyLXZZhvvwfTRAHTMioiCNHuURk4ayzWpJAeBA@mail.gmail.com>
 <555FA0B9.2010209@hastings.org>
 <CALGmxEJwA3GkGGAGqaqm8q6ZSCv9QoOSvcfSjH+UtmLB1Apecw@mail.gmail.com>
 <CAF4280LfAGEyen+7bBFraRw73zNAbjtnzbjhLY760XxnnLrA2g@mail.gmail.com>
 <CALWZvp6Xi7s3MzoLDMt7vmPtL8jDMv8nAi6U21-d1DUuZ5p=gg@mail.gmail.com>
 <CADiSq7c8LeiWUW2x8vGeU3QmsXz-WqWGZz=KFHmPj0wcDxFHhg@mail.gmail.com>
 <CALWZvp5YWGV-kdb1ijjTwNg-+xOrkup7ZZTyz1SNS0kwVLDUtQ@mail.gmail.com>
 <CALGmxE+=TqkBhOm29r85Rfe7UQFbgvQJwk3w==fhdS4NrEe9bg@mail.gmail.com>
 <CALWZvp5R_OODFtHHCGRMtpv__ZbAeknj0VmnkTyf6=XC6q4vCg@mail.gmail.com>
Message-ID: <CALGmxE+Jkd_-chy7sW9vcb=i3HBiOgJ19-_taJrpvu7v=U+cnw@mail.gmail.com>

On Sun, May 24, 2015 at 10:00 AM, Tal Einat <taleinat at gmail.com> wrote:

> On Sun, May 24, 2015 at 6:40 PM, Chris Barker <chris.barker at noaa.gov>
> wrote:
> > What do folks think about adding one to cmath as well, while we are at
> it?
> > It should be pretty straightforward -- I could focus what time I have to
> do
> > that.
>
> I prefer focusing on getting math.isclose() in before tackling
> cmath.isclose(), though it would indeed be very straightforward given
> the work already done. Larry has stated he's willing to make an
> exception to the "no new features" rule for this, so I think we should
> have time to get the cmath version in for 3.5 even if we wait a few
> days with it. So if you have time, I'd prefer that you thoroughly
> review the patch.
>

makes sense. stay tuned.

-CHB


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From ericsnowcurrently at gmail.com  Sun May 24 22:36:25 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Sun, 24 May 2015 14:36:25 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
Message-ID: <CALFfu7BcYKxp6GO1i_jv_XMWGjSwAg9gXQCnT7-8DTzDL+nnJg@mail.gmail.com>

On May 24, 2015 3:35 AM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
> Is it specifically necessary to save the order by default? Metaclasses
> would be able to access the ordered namespace in their __new__ method
> regardless, and for 3.6, I still like the __init_subclass__ hook idea
> proposed in PEP 487, which includes passing the original namespace to
> the new hook.
>
> So while I'm sold on the value of making class execution namespaces
> ordered by default, I'm not yet sold on the idea of *remembering* that
> order without opting in to doing so in the metaclass.
>
> If we leave __definition_order__ out for the time being then, for the
> vast majority of code, the fact that the ephemeral namespace used to
> evaluate the class body switched from being a basic dictionary to an
> ordered one would be a hidden implementation detail, rather than
> making all type objects a little bigger.

It's too late in the 3.5 cycle to negotiate much, so I'll try to make my
case for __definition_order__ here one last time.  If that's not
sufficient, then I'll defer further discussion to 3.6.

My premise for storing the definition order on the class is that Guido was
okay with using OrderedDict for cls.__dict__, which is a bigger change.
Regardless, there are two reasons why it makes sense:

* If it makes sense to use OrderedDict by default for class definition then
it makes sense to preserve the extra information OrderedDict provides.
* As I noted at the beginning of the thread, you could still preserve that
info manually, but that makes it less convenient for library authors.

If you still think that's not enough justification then we can table
__definition_order__ for now.

-eric

From guido at python.org  Sun May 24 23:26:09 2015
From: guido at python.org (Guido van Rossum)
Date: Sun, 24 May 2015 14:26:09 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7BcYKxp6GO1i_jv_XMWGjSwAg9gXQCnT7-8DTzDL+nnJg@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
 <CALFfu7BcYKxp6GO1i_jv_XMWGjSwAg9gXQCnT7-8DTzDL+nnJg@mail.gmail.com>
Message-ID: <CAP7+vJLhKc0EtucrSQR+MOpxtoGTjBqY+eBm92Z6BXqtpFF5Gg@mail.gmail.com>

On Sun, May 24, 2015 at 1:36 PM, Eric Snow <ericsnowcurrently at gmail.com>
wrote:


> On May 24, 2015 3:35 AM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
> > Is it specifically necessary to save the order by default? Metaclasses
> > would be able to access the ordered namespace in their __new__ method
> > regardless, and for 3.6, I still like the __init_subclass__ hook idea
> > proposed in PEP 487, which includes passing the original namespace to
> > the new hook.
> >
> > So while I'm sold on the value of making class execution namespaces
> > ordered by default, I'm not yet sold on the idea of *remembering* that
> > order without opting in to doing so in the metaclass.
> >
> > If we leave __definition_order__ out for the time being then, for the
> > vast majority of code, the fact that the ephemeral namespace used to
> > evaluate the class body switched from being a basic dictionary to an
> > ordered one would be a hidden implementation detail, rather than
> > making all type objects a little bigger.
>
> It's too late for 3.5 to negotiate much so I'll try to make my case here
> for __definition_order__ one last time.   If that's not sufficient then
> I'll defer further discussion to 3.6.
>
> My premise for storing the definition order on the class is that Guido was
> okay with using OrderedDict for cls.__dict__, which is a bigger change.
> Regardless, there are two reasons why it makes sense:
>
> * If it makes sense to use OrderedDict by default for class definition
> then it makes sense to preserve the extra information OrderedDict provides.
> * As I noted at the beginning of the thread, you could still preserve that
> info manually, but that makes it less convenient for library authors.
>
> If you still think that's not enough justification then we can table
> __definition_order__ for now.
>

Let's table it. It's hard to compare alternatives on a single dimension of
"which is a bigger change".

-- 
--Guido van Rossum (python.org/~guido)

From ncoghlan at gmail.com  Mon May 25 00:52:57 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 25 May 2015 08:52:57 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CAP7+vJLhKc0EtucrSQR+MOpxtoGTjBqY+eBm92Z6BXqtpFF5Gg@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
 <CALFfu7BcYKxp6GO1i_jv_XMWGjSwAg9gXQCnT7-8DTzDL+nnJg@mail.gmail.com>
 <CAP7+vJLhKc0EtucrSQR+MOpxtoGTjBqY+eBm92Z6BXqtpFF5Gg@mail.gmail.com>
Message-ID: <CADiSq7eD3EUYOT6NDbyw2sgegU+vzXv4Si3echr69jp8iLaiwg@mail.gmail.com>

On 25 May 2015 07:26, "Guido van Rossum" <guido at python.org> wrote:
>
> On Sun, May 24, 2015 at 1:36 PM, Eric Snow <ericsnowcurrently at gmail.com>
wrote:

>>
>> My premise for storing the definition order on the class is that Guido
was okay with using OrderedDict for cls.__dict__, which is a bigger
change.  Regardless, there are two reasons why it makes sense:
>>
>> * If it makes sense to use OrderedDict by default for class definition
then it makes sense to preserve the extra information OrderedDict provides.
>> * As I noted at the beginning of the thread, you could still preserve
that info manually, but that makes it less convenient for library authors.

It occurs to me that even the basic change makes it possible to provide
initialisation helpers that accept locals() from a currently running class
definition and return a definition-ordered list of fields (perhaps
restricted to values of a certain type, such as database column
definitions, or webform field definitions).
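A sketch of what such a helper might look like (everything below -- `Field`, `ordered_fields`, `OrderedMeta` -- is invented for illustration, not code from any patch in this thread; the metaclass supplies the ordered namespace explicitly, which is what ordered-by-default class bodies would make unnecessary):

```python
from collections import OrderedDict

class Field:
    """Marker for declarations we want to collect in definition order."""
    def __init__(self, ftype):
        self.ftype = ftype

def ordered_fields(namespace):
    """Return (name, Field) pairs from a class namespace, in the order
    they were defined.  Assumes the namespace preserves insertion order
    (e.g. an OrderedDict supplied by a metaclass's __prepare__)."""
    return [(name, value) for name, value in namespace.items()
            if isinstance(value, Field)]

class OrderedMeta(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        return OrderedDict()          # record the class body's order

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        cls._fields = [n for n, _ in ordered_fields(namespace)]
        return cls

class Form(metaclass=OrderedMeta):
    email = Field(str)
    age = Field(int)
```

With ordered class namespaces by default, the `ordered_fields()` half of this could be called on a plain `locals()` from inside a running class body, no metaclass required.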

>> If you still think that's not enough justification then we can table
__definition_order__ for now.
>
>
> Let's table it. It's hard to compare alternatives on a single dimension
of "which is a bigger change".

Right, it isn't that I think __definition_order__ is necessarily a bad
idea, I just suspect it's redundant if we end up going ahead with
__init_subclass__ (which would allow a base class to opt in to preserving
the definition order, either of all fields or selected ones), and the
latter change is definitely out of scope for 3.5 at this point.
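As a rough sketch of that opt-in idea (nothing here is from PEP 487 itself; the base-class and attribute names are invented, and it assumes the insertion-ordered class namespaces discussed in this thread, which CPython eventually provided in 3.6):

```python
class OrderedFieldsBase:
    """Hypothetical opt-in base class that remembers the definition
    order of its subclasses' attributes, relying on class bodies being
    evaluated in an insertion-ordered namespace."""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Record the non-underscore names in the order they appeared
        # in the class body.
        cls.__definition_order__ = [
            name for name in cls.__dict__ if not name.startswith('_')
        ]

class Column:
    """Stand-in for e.g. a database column definition."""

class Table(OrderedFieldsBase):
    ident = Column()
    name = Column()
    created = Column()
```

Only classes deriving from such a base pay for remembering the order; everyone else keeps the smaller type objects.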

There are also other open questions, like whether or not dir() should
respect the order when reporting attribute names, or if dict_proxy should
respect the order when iterating.

Regards,
Nick.

>
> --
> --Guido van Rossum (python.org/~guido)

From tjreedy at udel.edu  Mon May 25 01:05:35 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sun, 24 May 2015 19:05:35 -0400
Subject: [Python-Dev] An yocto change proposal in logging module to
 simplify structured logs support
In-Reply-To: <CAON-fpEHScFMFyjL_uYWY4XtUGskYuYqDi5zgno4aYYFe2GJsw@mail.gmail.com>
References: <CAON-fpEHScFMFyjL_uYWY4XtUGskYuYqDi5zgno4aYYFe2GJsw@mail.gmail.com>
Message-ID: <mjtlgb$ps1$1@ger.gmane.org>

Please post your idea to the python-ideas list.

-- 
Terry Jan Reedy


From larry at hastings.org  Mon May 25 01:39:00 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 24 May 2015 16:39:00 -0700
Subject: [Python-Dev] [RELEASED] Python 3.5.0b1 is now available
Message-ID: <55626114.6070003@hastings.org>



On behalf of the Python development community and the Python 3.5 release 
team, I'm pleased to announce the availability of Python 3.5.0b1.  
Python 3.5 has now entered "feature freeze". By default new features may 
no longer be added to Python 3.5. (However, there are a handful of 
features that weren't quite ready for Python 3.5.0 beta 1; these were 
granted exceptions to the freeze, and are scheduled to be added before 
beta 2.)

This is a preview release, and its use is not recommended for production 
settings.

Three important notes for Windows users about Python 3.5.0b1:

  * If you have previously installed Python 3.5.0a1, you may need to
    manually uninstall it before installing Python 3.5.0b1 (issue23612).
  * If installing Python 3.5.0b1 as a non-privileged user, you may need
    to escalate to administrator privileges to install an update to your
    C runtime libraries.
  * There is now a third type of Windows build for Python 3.5.  In
    addition to the conventional installer and the web-based installer,
    Python 3.5 now has an embeddable release designed to be deployed as
    part of a larger application's installer for apps using or extending
    Python.  During the 3.5 alpha releases, this was an executable
    installer; as of 3.5.0 beta 1 the embeddable build of Python is now
    shipped in a zip file.


You can find Python 3.5.0b1 here:

    https://www.python.org/downloads/release/python-350b1/

Happy hacking,


//arry/

From larry at hastings.org  Mon May 25 01:49:17 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 24 May 2015 16:49:17 -0700
Subject: [Python-Dev] Reminder: 3.5 now has its own branch! "default" branch
	is now 3.6!
Message-ID: <5562637D.8070809@hastings.org>



I've now pushed the 3.5.0 beta 1 release-engineering checkins to 
hg.python.org.  At the same time I did this, I also created the 3.5 branch.

Quick FAQ:

Q: Where should I check in bugfixes for 3.5?
A: In the "3.5" branch.  You should also merge them forward into "default".

Q: Where should I check in new features for 3.5?
A: You sillyhead!  New features aren't allowed for 3.5 anymore; it's in 
feature freeze.

Q: What is "default" now?
A: "default" is now 3.6.  Meaning, you can now start on new features for 
3.6!  You don't have to wait until 3.5 final is released, like how we 
used to do it.

Q: What's all this about bitbucket and push requests?
A: We don't start doing that until 3.5.0 release candidate 1.  Don't 
worry about it for now.  When the time comes, I'll post instructions 
here and to the devguide.


A new day is dawning,


//arry/

From ericsnowcurrently at gmail.com  Mon May 25 02:00:58 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Sun, 24 May 2015 18:00:58 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7eD3EUYOT6NDbyw2sgegU+vzXv4Si3echr69jp8iLaiwg@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <CAP7+vJ+yRHZMdnKNDAn1Xe2fgK14-m4vp+KAmO74L4p0A5YKqw@mail.gmail.com>
 <CALFfu7CO=RPb2zzGnS368a4d6=fsBXFH-rjrcR0xW33eqhxXWA@mail.gmail.com>
 <CADiSq7exV6w+u-=8ib9r1AdaE__-W3ON9FAGyYb=+UGNXDVfhw@mail.gmail.com>
 <CALFfu7BcYKxp6GO1i_jv_XMWGjSwAg9gXQCnT7-8DTzDL+nnJg@mail.gmail.com>
 <CAP7+vJLhKc0EtucrSQR+MOpxtoGTjBqY+eBm92Z6BXqtpFF5Gg@mail.gmail.com>
 <CADiSq7eD3EUYOT6NDbyw2sgegU+vzXv4Si3echr69jp8iLaiwg@mail.gmail.com>
Message-ID: <CALFfu7AUf=c6qQUjMWHFBhwQSpSx5hK-Ok91rk2W+Fm3tVvzrw@mail.gmail.com>

On May 24, 2015 4:52 PM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
>
>
> On 25 May 2015 07:26, "Guido van Rossum" <guido at python.org> wrote:
> >
> > On Sun, May 24, 2015 at 1:36 PM, Eric Snow <ericsnowcurrently at gmail.com>
wrote:
> >> If you still think that's not enough justification then we can table
__definition_order__ for now.
> >
> >
> > Let's table it. It's hard to compare alternatives on a single dimension
of "which is a bigger change".

Sounds good.

>
> Right, it isn't that I think __definition_order__ is necessarily a bad
idea, I just suspect it's redundant if we end up going ahead with
__init_subclass__ (which would allow a base class to opt in to preserving
the definition order, either of all fields or selected ones),
> and the latter change is definitely out of scope for 3.5 at this point.
>
> There are also other open questions, like whether or not dir() should
respect the order when reporting attribute names, or if dict_proxy should
respect the order when iterating.

Yeah, I'll start up a thread on python-ideas once I've gotten the other
stuff wrapped up.  Thanks for the feedback.

-eric

From schesis at gmail.com  Mon May 25 02:15:22 2015
From: schesis at gmail.com (Zero Piraeus)
Date: Sun, 24 May 2015 21:15:22 -0300
Subject: [Python-Dev] [RELEASED] Python 3.5.0b1 is now available
In-Reply-To: <55626114.6070003@hastings.org>
References: <55626114.6070003@hastings.org>
Message-ID: <20150525001522.GA30305@piedra>

:

On Sun, May 24, 2015 at 04:39:00PM -0700, Larry Hastings wrote:
> 
> You can find Python 3.5.0b1 here:
> 
>    https://www.python.org/downloads/release/python-350b1/

Source tarballs (both .tgz and .tar.xz) are missing ...

 -[]z.

-- 
Zero Piraeus: vive ut vivas
http://etiol.net/pubkey.asc

From rosuav at gmail.com  Mon May 25 03:01:41 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Mon, 25 May 2015 11:01:41 +1000
Subject: [Python-Dev] Reminder: 3.5 now has its own branch! "default"
 branch is now 3.6!
In-Reply-To: <5562637D.8070809@hastings.org>
References: <5562637D.8070809@hastings.org>
Message-ID: <CAPTjJmorO0n7CG8h+72Xi_ukgBeSscb8SWMTL6+iROWogMtFgA@mail.gmail.com>

On Mon, May 25, 2015 at 9:49 AM, Larry Hastings <larry at hastings.org> wrote:
> I've now pushed the 3.5.0 beta 1 release-engineering checkins to
> hg.python.org.  At the same time I did this, I also created the 3.5 branch.
>
> Quick FAQ:

Additional Q. What does this mean for buildbots? Will they immediately
pick up the new branch?

Apologies if this is a dumb question. My buildbot is temporarily down
(hardware failure, working on it) so I can't easily check what it's
doing. (And I'm not sure I'd know for sure if I saw the right result
anyway.)

ChrisA

From nad at acm.org  Mon May 25 03:03:50 2015
From: nad at acm.org (Ned Deily)
Date: Sun, 24 May 2015 18:03:50 -0700
Subject: [Python-Dev] [RELEASED] Python 3.5.0b1 is now available
References: <55626114.6070003@hastings.org> <20150525001522.GA30305@piedra>
Message-ID: <nad-A2583B.18035024052015@news.gmane.org>

In article <20150525001522.GA30305 at piedra>,
 Zero Piraeus <schesis at gmail.com> wrote:
> On Sun, May 24, 2015 at 04:39:00PM -0700, Larry Hastings wrote:
> > You can find Python 3.5.0b1 here:
> >    https://www.python.org/downloads/release/python-350b1/
> Source tarballs (both .tgz and .tar.xz) are missing ...

They seem to be there now.  Are you still not able to download them?

-- 
 Ned Deily,
 nad at acm.org


From larry at hastings.org  Mon May 25 03:18:42 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 24 May 2015 18:18:42 -0700
Subject: [Python-Dev] Reminder: 3.5 now has its own branch! "default"
 branch is now 3.6!
In-Reply-To: <CAPTjJmorO0n7CG8h+72Xi_ukgBeSscb8SWMTL6+iROWogMtFgA@mail.gmail.com>
References: <5562637D.8070809@hastings.org>
 <CAPTjJmorO0n7CG8h+72Xi_ukgBeSscb8SWMTL6+iROWogMtFgA@mail.gmail.com>
Message-ID: <55627872.6090604@hastings.org>



On 05/24/2015 06:01 PM, Chris Angelico wrote:
> Additional Q. What does this mean for buildbots? Will they immediately
> pick up the new branch?

I don't know about "immediately", but yes, the buildbots should be 
configured to point at the 3.5 branch, preferably soon.


//arry/

From chris.barker at noaa.gov  Mon May 25 06:53:17 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Sun, 24 May 2015 21:53:17 -0700
Subject: [Python-Dev] PEP 485: math.isclose()
Message-ID: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>

I don't think I have permissions to comment on the issue, so I'm posting
here. If there is a way for me to post to the issue, someone let me know...

In the issue (http://bugs.python.org/issue24270) Tal wrote

"""
I have a question regarding complex values. The code (from Chris Barker)
doesn't support complex values (only things that can be converted into
doubles). However, the PEP states the following under "Non-float types":

"complex : for complex, the absolute value of the complex values will be
used for scaling and comparison. If a complex tolerance is passed in, the
absolute value will be used as the tolerance."
"""

Right -- that was written before it was decided that isclose() needed to be
written in C -- the Python version supported that.

"""
Should math.isclose() support complex values?
"""
nope -- the math module is all about floats.

"""
Should an equivalent function be added to cmath?
"""

I think so -- let's see if we can do that in time for 3.5 -- but first get
the float one done.

"""
 Should we just leave things as they are and remove mention of complex
values from the PEP (it isn't mentioned in the docs)?
"""
I'll update the PEP.
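For concreteness, here is a pure-Python sketch of the complex variant the PEP describes (the function name and signature are illustrative only; this is not the C implementation under discussion):

```python
def isclose_complex(a, b, rel_tol=1e-09, abs_tol=0.0):
    """Symmetric closeness test in the style of PEP 485, extended to
    complex: the modulus abs() of the values is used for scaling and
    comparison, and abs() of any complex tolerance is used as the
    (real) tolerance."""
    rel_tol = abs(rel_tol)
    abs_tol = abs(abs_tol)
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)
```

Note that a complex tolerance such as `1e-9j` collapses to the real tolerance `1e-9` via `abs()`, matching the PEP's wording.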

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From taleinat at gmail.com  Mon May 25 08:39:07 2015
From: taleinat at gmail.com (Tal Einat)
Date: Mon, 25 May 2015 09:39:07 +0300
Subject: [Python-Dev] PEP 485: math.isclose()
In-Reply-To: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>
References: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>
Message-ID: <CALWZvp7OQfMNeoSoUgmRkHxGmH6E9rtQF5gH305pwbAsUfC+Gg@mail.gmail.com>

On Mon, May 25, 2015 at 7:53 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> I don't think I have permissions to comment on the issue,so I'm posting
> here. If there is a way for me to post to the issue, someone let me know...

You just need to register on the issue tracker. On bugs.python.org,
there is a "register" link under the "User" section on the navigation
bar.

> In the issue (http://bugs.python.org/issue24270) Tal wrote
>
> [...]
>
> """
>  Should we just leave things as they are and remove mention of complex
> values from the PEP (it isn't mentioned in the docs)?
> """
> I'll update the PEP.

If we're going to add a separate complex version of isclose() to the
cmath module, then you should probably leave the PEP as it is.

While we're on the subject of the PEP, is there a reason that the
cmath version should accept complex values for the tolerances? I'd
expect it to accept only floats, and I think that allowing complex
values would be more confusing than useful.

- Tal Einat

From chris.barker at noaa.gov  Mon May 25 08:45:04 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Sun, 24 May 2015 23:45:04 -0700
Subject: [Python-Dev] PEP 485: math.isclose()
In-Reply-To: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>
References: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>
Message-ID: <CALGmxE+9urfr58=-GUDN753Ri6DcP2tquW95gTAuvOr8-XEy8w@mail.gmail.com>

And a few comments on the patch (I have no idea how to patch a patch...)
Is there a branch somewhere with this patch applied?

I'm going through PEP 7, and cleaned up the docstring a bit:

diff -r 15af4f58d143 Modules/mathmodule.c
--- a/Modules/mathmodule.c      Sun May 24 22:27:00 2015 -0700
+++ b/Modules/mathmodule.c      Sun May 24 22:57:52 2015 -0700
@@ -2051,8 +2051,8 @@
 }

 PyDoc_STRVAR(math_isclose_doc,
-"Determine if two floating point numbers are  in value\n\n"
-
+"is_close(a, b, rel_tol, abs_tol) -> bool\n\n"
+"Determine if two floating point numbers are similar in value\n\n"
 "Returns True if a is close in value to b. False otherwise\n\n"
 ":param a: one of the values to be tested\n\n"
 ":param b: the other value to be tested\n\n"

and there is a missing space in the docs:

in math.rst:

   Return ``True`` if the values *a* and *b* are close to each other and
   ``False`` otherwise.

needs a space between "each" and "other"

But it all looks good otherwise -- thanks!

-Chris


On Sun, May 24, 2015 at 9:53 PM, Chris Barker <chris.barker at noaa.gov> wrote:

> I don't think I have permissions to comment on the issue,so I'm posting
> here. If there is a way for me to post to the issue, someone let me know...
>
> In the issue (http://bugs.python.org/issue24270) Tal wrote
>
> """
> I have a question regarding complex values. The code (from Chris Barker)
> doesn't support complex values (only things that can be converted into
> doubles). However, the PEP states the following under "Non-float types":
>
> "complex : for complex, the absolute value of the complex values will be
> used for scaling and comparison. If a complex tolerance is passed in, the
> absolute value will be used as the tolerance."
> """
>
> right -- that was written before it was decided that isclose() needed to
> be written in C -- the python version supported that.
>
> """
> Should math.isclose() support complex values?
> """
> nope -- the math module is all about floats.
>
> """
> Should an equivalent function be added to cmath?
> """
>
> I think so -- lets see if we can do that in time for 3.5 -- but first get
> the float one done.
>
> """
>  Should we just leave things as they are and remove mention of complex
> values from the PEP (it isn't mentioned in the docs)?
> """
> I'll update the PEP.
>
> -Chris
>
>
>
> --
>
> Christopher Barker, Ph.D.
> Oceanographer
>
> Emergency Response Division
> NOAA/NOS/OR&R            (206) 526-6959   voice
> 7600 Sand Point Way NE   (206) 526-6329   fax
> Seattle, WA  98115       (206) 526-6317   main reception
>
> Chris.Barker at noaa.gov
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From solipsis at pitrou.net  Mon May 25 09:33:14 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Mon, 25 May 2015 09:33:14 +0200
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org>
Message-ID: <20150525093314.3ce18048@fsol>

On Sat, 23 May 2015 20:14:56 -0700
Larry Hastings <larry at hastings.org> wrote:
> 
> On 05/23/2015 07:38 PM, Nick Coghlan wrote:
> > Eric clarified for me that Larry was considering granting a feature
> > freeze exemption to defer landing this to beta 2 while Eric tracked
> > down a segfault bug in the current patch that provides a C
> > implementation of OrderedDict.
> 
> Yeah, I'm willing to grant the feature freeze exception, assuming he can 
> find general approval from the community (and assuming he still has 
> Guido's blessing).  I just wanted a little more sunlight on the topic, 
> rather than rushing to check it in.

Given the pain that has gone into making the patch segfault- and
reference leak-free, and given it adds a lot of complication in the
data types area, I'm frankly uneasy with having this land after the
feature freeze. It's a sure recipe to *add* instability rather than
remove it.

Regards

Antoine.



From taleinat at gmail.com  Mon May 25 11:07:26 2015
From: taleinat at gmail.com (Tal Einat)
Date: Mon, 25 May 2015 12:07:26 +0300
Subject: [Python-Dev] PEP 485: math.isclose()
In-Reply-To: <CALGmxE+9urfr58=-GUDN753Ri6DcP2tquW95gTAuvOr8-XEy8w@mail.gmail.com>
References: <CALGmxEJOTNWThHtM-PmRmeEWJ-wQzi=DevK-H10iUqq0DbLXeg@mail.gmail.com>
 <CALGmxE+9urfr58=-GUDN753Ri6DcP2tquW95gTAuvOr8-XEy8w@mail.gmail.com>
Message-ID: <CALWZvp7C+-7hS50bvBKhdZ9rW0uNKdTMdb=omKWT1TTtf8YBzg@mail.gmail.com>

On Mon, May 25, 2015 at 9:45 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> And a few comments on the patch ( I have not idea how to patch a patch...)
> Is there a branch somewhere with this patch applied?

Not at the moment. But if you click the "review" link next to the
patch on the tracker then you can leave comments "inside" the patch,
and we can discuss them there directly. For future reference, that's
the preferred place for these types of comments.

I'll work your comment into a revised version of the patch and have it
up later today.

- Tal Einat

From mikekozulya at kipt.kharkov.ua  Mon May 25 10:02:42 2015
From: mikekozulya at kipt.kharkov.ua (Mike Kozulya)
Date: Mon, 25 May 2015 11:02:42 +0300
Subject: [Python-Dev] PEP 484
Message-ID: <5562D722.3050207@kipt.kharkov.ua>

May I suggest eliminating "->" from function definitions?

     def function1 (variable1: variable1_type, variable2: variable2_type): function1_type
         return str2function1_type(str(variable1)+str(' ')+str(variable2))

OR

     def function1: function1_type (variable1: variable1_type, variable2: variable2_type):
         return str2function1_type(str(variable1)+str(' ')+str(variable2))

both look a bit simpler than

     def function1 (variable1: variable1_type, variable2: variable2_type) -> function1_type:
         return str2function1_type(str(variable1)+str(' ')+str(variable2))

Are there any convincing reasons to introduce syntactic sugar?

Yours Mike Kozulya

From schesis at gmail.com  Mon May 25 14:07:49 2015
From: schesis at gmail.com (Zero Piraeus)
Date: Mon, 25 May 2015 09:07:49 -0300
Subject: [Python-Dev] [RELEASED] Python 3.5.0b1 is now available
In-Reply-To: <nad-A2583B.18035024052015@news.gmane.org>
References: <55626114.6070003@hastings.org> <20150525001522.GA30305@piedra>
 <nad-A2583B.18035024052015@news.gmane.org>
Message-ID: <20150525120749.GA2408@piedra>

:

On Sun, May 24, 2015 at 06:03:50PM -0700, Ned Deily wrote:
> Zero Piraeus <schesis at gmail.com> wrote:
> > Source tarballs (both .tgz and .tar.xz) are missing ...
> 
> They seem to be there now.  Are you still not able to download them?

Oops. Both Larry's reply to me and my thankyou to him turn out to have
been offlist.

Yes, got 'em now. Turns out to have been a permissions error (which the
403 error would have alerted me to, had I been paying attention).

So, in public this time: Thanks, Larry (and thanks for all the work on
the release) ...

 -[]z.

-- 
Zero Piraeus: inter caetera
http://etiol.net/pubkey.asc

From benjamin at python.org  Mon May 25 15:40:17 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Mon, 25 May 2015 09:40:17 -0400
Subject: [Python-Dev] PEP 484
In-Reply-To: <5562D722.3050207@kipt.kharkov.ua>
References: <5562D722.3050207@kipt.kharkov.ua>
Message-ID: <1432561217.2255896.277501049.56980F34@webmail.messagingengine.com>



On Mon, May 25, 2015, at 04:02, Mike Kozulya wrote:
> May I suggest to eliminate "->" in function definition?
> 
>      def function1 (variable1: variable1_type, variable2: 
> variable2_type): function1_type
>          return str2function1_type(str(variable1)+str('
>          ')+str(variable2))
> 
> OR
> 
>      def function1: function1_type (variable1: variable1_type, 
> variable2: variable2_type):
>          return str2function1_type(str(variable1)+str('
>          ')+str(variable2))
> 
> both look a bit simpler than
> 
>      def function1 (variable1: variable1_type, variable2: 
> variable2_type) -> function1_type:
>          return str2function1_type(str(variable1)+str('
>          ')+str(variable2))
> 
> Are there any convincing reasons to introduce syntactic sugar?

That's simply preexisting function annotation syntax.
https://www.python.org/dev/peps/pep-3107/

It's not invented by the type hinting pep.
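A minimal illustration of that preexisting syntax (the function is invented for the example): the parameter annotations and the "->" return annotation are simply collected into `__annotations__`; PEP 3107 assigns them no runtime meaning.

```python
def scale(value: float, factor: float = 2.0) -> float:
    """PEP 3107: annotations are stored, not enforced."""
    return value * factor

# Parameter annotations and the '->' return annotation all land in the
# same mapping; nothing about the call is checked at runtime.
print(scale.__annotations__)
```

So "->" is not new sugar introduced by PEP 484; the type hinting PEP only standardizes what to put in an annotation slot that the language already had.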

From benjamin at python.org  Mon May 25 21:01:33 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Mon, 25 May 2015 15:01:33 -0400
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <20150525093314.3ce18048@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
Message-ID: <1432580493.2320587.277693865.73C7D27E@webmail.messagingengine.com>



On Mon, May 25, 2015, at 03:33, Antoine Pitrou wrote:
> On Sat, 23 May 2015 20:14:56 -0700
> Larry Hastings <larry at hastings.org> wrote:
> > 
> > On 05/23/2015 07:38 PM, Nick Coghlan wrote:
> > > Eric clarified for me that Larry was considering granting a feature
> > > freeze exemption to defer landing this to beta 2 while Eric tracked
> > > down a segfault bug in the current patch that provides a C
> > > implementation of OrderedDict.
> > 
> > Yeah, I'm willing to grant the feature freeze exception, assuming he can 
> > find general approval from the community (and assuming he still has 
> > Guido's blessing).  I just wanted a little more sunlight on the topic, 
> > rather than rushing to check it in.
> 
> Given the pain that has gone into making the patch segfault- and
> reference leak-free, and given it adds a lot of complication in the
> data types area, I'm frankly uneasy with having this land after the
> feature freeze. It's a sure recipe to *add* instability rather than
> remove it.

I agree completely with Antoine. All the hard work that's gone into it
recently should make it easy to land stably in 3.6. :)

From ericsnowcurrently at gmail.com  Mon May 25 21:40:11 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 25 May 2015 13:40:11 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <20150525093314.3ce18048@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
Message-ID: <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>

On Mon, May 25, 2015 at 1:33 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 23 May 2015 20:14:56 -0700
> Larry Hastings <larry at hastings.org> wrote:
>> Yeah, I'm willing to grant the feature freeze exception, assuming he can
>> find general approval from the community (and assuming he still has
>> Guido's blessing).  I just wanted a little more sunlight on the topic,
>> rather than rushing to check it in.
>
> Given the pain that has gone into making the patch segfault- and
> reference leak-free, and given it adds a lot of complication in the
> data types area, I'm frankly uneasy with having this land after the
> feature freeze. It's a sure recipe to *add* instability rather than
> remove it.

Well, the exception for C OrderedDict itself is a separate matter.  I
chose to be more public than I could have been about the last
remaining bugs in the interest of getting them resolved a bit faster.
At this point I wouldn't consider C OrderedDict to add a whole lot of
risk to 3.5.

That said, at this point landing it in 3.5 doesn't matter to me
much because my main motivator (__definition_order__) isn't landing in
3.5.  The fact that 3.6 is already open to new features eases the
sting a bit.  I'd still prefer to land OrderedDict-by-default class
definition namespaces in 3.5, which is dependent on C OrderedDict, but
alone that isn't as important to me for 3.5 as
cls.__definition_order__ was.

Regardless, I know there were a few folks (e.g. Yury) that wanted to
see C OrderedDict in 3.5 and there may be others that would really
like OrderedDict-by-default in 3.5 (Nick?).  Since Larry already gave
an exception, I'd still be glad to land both in 3.5 if Yury (or
others) wants to make that case.  The patches will be ready.

-eric

From python at mrabarnett.plus.com  Mon May 25 22:06:01 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Mon, 25 May 2015 21:06:01 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5 32-bit
Message-ID: <556380A9.4000206@mrabarnett.plus.com>

As the subject says, I've been unable to build the regex module against
Python 3.5b1 for 32-bit. MinGW says:

     skipping incompatible .../libpython35.a when searching for -lpython35

It builds without a problem against Python 3.5 for 64-bit.

Any ideas? Should I just wait until beta 2?

From larry at hastings.org  Mon May 25 22:10:31 2015
From: larry at hastings.org (Larry Hastings)
Date: Mon, 25 May 2015 13:10:31 -0700
Subject: [Python-Dev] [python-committers] Reminder: 3.5 now has its own
 branch! "default" branch is now 3.6!
In-Reply-To: <mjuogm$d12$1@ger.gmane.org>
References: <5562637D.8070809@hastings.org> <mjuogm$d12$1@ger.gmane.org>
Message-ID: <556381B7.6080107@hastings.org>


On 05/25/2015 02:03 AM, Serhiy Storchaka wrote:
> Perhaps a version bump is needed in the default branch? I think now 
> Misc/NEWS will have two modifiable sections - for 3.5 (bugfixes) and 
> for 3.6 (new features).

That's a good point!  I've added a "3.6.0 alpha 1" section as you suggest.

That suggests more FAQs:

Q: When I check in just to the default branch (3.6), where should I put 
my news items in Misc/NEWS?
A: There's a section for "3.6.0 alpha 1", put them there.

Q: When I check in to 3.5 and merge into the default branch, where 
should I put my news items in Misc/NEWS?
A: It should go in the same section (3.5.0 beta 1, beta 2, rc 1, etc).

I suspect I'll still have to do some cleanup in Misc/NEWS when we ship 
3.5.0 final.  Isn't it always the way!


//arry/

From rymg19 at gmail.com  Mon May 25 22:09:57 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Mon, 25 May 2015 15:09:57 -0500
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <556380A9.4000206@mrabarnett.plus.com>
References: <556380A9.4000206@mrabarnett.plus.com>
Message-ID: <22A97B85-D139-406E-A927-B5C6B0307F9F@gmail.com>

Try building the module with -m32. The error message basically means: "../libpython35.a is 32-bit, but what you're building is 64-bit." Gotta love ld!

On May 25, 2015 3:06:01 PM CDT, MRAB <python at mrabarnett.plus.com> wrote:
>As the subject says, I've been unable to build the regex module against
>Python 3.5b1 for 32-bit. MinGW says:
>
>  skipping incompatible .../libpython35.a when searching for -lpython35
>
>It builds without a problem against Python 3.5 for 64-bit.
>
>Any ideas? Should I just wait until beta 2?

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.

From python at mrabarnett.plus.com  Mon May 25 22:20:47 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Mon, 25 May 2015 21:20:47 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
 32-bit
In-Reply-To: <22A97B85-D139-406E-A927-B5C6B0307F9F@gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <22A97B85-D139-406E-A927-B5C6B0307F9F@gmail.com>
Message-ID: <5563841F.1050801@mrabarnett.plus.com>

On 2015-05-25 21:09, Ryan Gonzalez wrote:
 > Try building the module with -m32. The error message basically means:
 > "../libpython35.a is 32-bit, but what you're building is 64-bit." Gotta
 > love ld!
 >
Unless I'm missing something, I'm already passing -m32 to gcc.

All of the other versions build without a problem: Python 2.5-2.7 and 
Python 3.1-3.4, both 32-bit and 64-bit, and now Python 3.5 64-bit.

That's 15 building and 1 failing!

 > On May 25, 2015 3:06:01 PM CDT, MRAB <python at mrabarnett.plus.com> wrote:
 >
 >     As the subject says, I've been unable to build the regex module against
 >     Python 3.5b1 for 32-bit. MinGW says:
 >
 >          skipping incompatible .../libpython35.a when searching for -lpython35
 >
 >     It builds without a problem against Python 3.5 for 64-bit.
 >
 >     Any ideas? Should I just wait until beta 2?


From yselivanov.ml at gmail.com  Mon May 25 22:22:58 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Mon, 25 May 2015 16:22:58 -0400
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
Message-ID: <556384A2.2070505@gmail.com>

On 2015-05-25 3:40 PM, Eric Snow wrote:
> I'd still be glad to land both in 3.5 if Yury (or
> others) wants to make that case.

I'm a big +1 for a speedy OrderedDict in 3.5 (TBH I
thought it had been merged into 3.5 long before alpha 4).

I doubt that merging it will add instability so
significant that we can't track it down and fix it in the
several months before the release, given that all tests
pass without refleaks/segfaults before the implementation
is committed.

Yury

From tjreedy at udel.edu  Mon May 25 22:40:30 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 25 May 2015 16:40:30 -0400
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
Message-ID: <mk01cd$lp1$1@ger.gmane.org>

On 5/25/2015 3:40 PM, Eric Snow wrote:
> On Mon, May 25, 2015 at 1:33 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
>> On Sat, 23 May 2015 20:14:56 -0700
>> Larry Hastings <larry at hastings.org> wrote:
>>> Yeah, I'm willing to grant the feature freeze exception, assuming he can
>>> find general approval from the community (and assuming he still has

To me, the message from Antoine, below, and Benjamin's second message 
suggest a lack of 'general approval'.

>>> Guido's blessing).  I just wanted a little more sunlight on the topic,
>>> rather than rushing to check it in.
>>
>> Given the pain that has gone into making the patch segfault- and
>> reference leak-free, and given it adds a lot of complication in the
>> data types area, I'm frankly uneasy with having this land after the
>> feature freeze. It's a sure recipe to *add* instability rather than
>> remove it.
>
> Well, the exception for C OrderedDict itself is a separate matter.  I
> chose to be more public than I could have been about the last
> remaining bugs in the interest of getting them resolved a bit faster.
> At this point I wouldn't consider C OrderedDict to add a whole lot of
> risk to 3.5.

> That said, at this point landing it in 3.5 doesn't matter to me
> much because my main motivator (__definition_order__) isn't landing in
> 3.5.  The fact that 3.6 is already open to new features eases the
> sting a bit.  I'd still prefer to land OrderedDict-by-default class
> definition namespaces in 3.5, which is dependent on C OrderedDict, but
> alone that isn't as important to me for 3.5 as
> cls.__definition_order__ was.
>
> Regardless, I know there were a few folks (e.g. Yury) that wanted to
> see C OrderedDict in 3.5 and there may be others that would really
> like OrderedDict-by-default in 3.5 (Nick?).  Since Larry already gave
> an exception,

Conditional on 'general approval of the community'.

 > I'd still be glad to land both in 3.5 if Yury (or
> others) wants to make that case.  The patches will be ready.

-- 
Terry Jan Reedy


From p.f.moore at gmail.com  Mon May 25 23:59:34 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 25 May 2015 22:59:34 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <556380A9.4000206@mrabarnett.plus.com>
References: <556380A9.4000206@mrabarnett.plus.com>
Message-ID: <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>

On 25 May 2015 at 21:06, MRAB <python at mrabarnett.plus.com> wrote:
> As the subject says, I've been unable to build the regex module against
> Python 3.5b1 for 32-bit. MinGW says:
>
>     skipping incompatible .../libpython35.a when searching for -lpython35
>
> It builds without a problem against Python 3.5 for 64-bit.
>
> Any ideas? Should I just wait until beta 2?

MinGW is (and always has been) only marginally supported,
unfortunately. I'd rather it didn't break totally for 3.5, but I am
anticipating some difficulties (there have been a lot of
compiler-related changes with 3.5).

Could you raise a bug, including details of precisely how you tried to
build the module (presumably
https://pypi.python.org/pypi/regex/2015.05.10) and assign it to me?
I'll try to take a look and reproduce the issue. With luck, it may be
as simple as the wrong version of libpython35.a being picked up
somewhere.

Just to check the obvious - you *are* using 32-bit Python 3.5b1 and a
32-bit MinGW to build the 32-bit version, and 64-bit Python 3.5b1 and
a 64-bit MinGW to build the 64-bit one? (I.e., two installations of
Python and two of MinGW)

Paul

From ericsnowcurrently at gmail.com  Tue May 26 00:22:02 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 25 May 2015 16:22:02 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <mk01cd$lp1$1@ger.gmane.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
Message-ID: <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>

On Mon, May 25, 2015 at 2:40 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> On 5/25/2015 3:40 PM, Eric Snow wrote:
>> Since Larry already gave an exception,
>
> Conditional on 'general approval of the community'.

Unless I misunderstood him, Larry gave me an unconditional exception
for OrderedDict itself (as long as it is in before beta 2).  The
condition only applied to making OrderedDict the default class
definition namespace and adding cls.__definition_order__.
Furthermore, the condition related to the semantic changes to Python,
not to concerns about destabilizing Python.
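For readers unfamiliar with what `cls.__definition_order__` would have
provided: the effect can be approximated today with a metaclass whose
`__prepare__` returns an `OrderedDict`. This is only a rough sketch of the
idea; `OrderedMeta` and the `_definition_order` attribute are hypothetical
names, not the API proposed for 3.5.

```python
from collections import OrderedDict


class OrderedMeta(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        # Use an OrderedDict as the class body's namespace so the
        # order of assignments in the body is observable afterwards.
        return OrderedDict()

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # Hypothetical stand-in for the proposed cls.__definition_order__.
        cls._definition_order = tuple(namespace)
        return cls


class Example(metaclass=OrderedMeta):
    b = 1
    a = 2

    def method(self):
        pass


# _definition_order records names in the order they were bound,
# including implicit entries like __module__ and __qualname__.
print(Example._definition_order)
```

The attraction of doing this by default is that plain `class` statements
would get this for free, without every user writing a metaclass.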

I don't mean to imply that Larry can't retract (or modify) the
exceptions he's given me.  In fact, if there is sufficient risk of
de-stabilizing the release then I'd expect him to do so.  However,
ultimately that's his call as release manager; and I do not believe
there is any greater risk now than what I explained to him in our
discussions leading up to the exceptions I received.

-eric

>
>> I'd still be glad to land both in 3.5 if Yury (or
>>
>> others) wants to make that case.  The patches will be ready.

From python at mrabarnett.plus.com  Tue May 26 01:34:35 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Tue, 26 May 2015 00:34:35 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
 32-bit
In-Reply-To: <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
Message-ID: <5563B18B.4040305@mrabarnett.plus.com>

On 2015-05-25 22:59, Paul Moore wrote:
 > On 25 May 2015 at 21:06, MRAB <python at mrabarnett.plus.com> wrote:
 > > As the subject says, I've been unable to build the regex module against
 > > Python 3.5b1 for 32-bit. MinGW says:
 > >
 > >     skipping incompatible .../libpython35.a when searching for -lpython35
 > >
 > > It builds without a problem against Python 3.5 for 64-bit.
 > >
 > > Any ideas? Should I just wait until beta 2?
 >
 > MinGW is (and always has been) only marginally supported,
 > unfortunately. I'd rather it didn't break totally for 3.5, but I am
 > anticipating some difficulties (there have been a lot of
 > compiler-related changes with 3.5).
 >
 > Could you raise a bug, including details of precisely how you tried to
 > build the module (presumably
 > https://pypi.python.org/pypi/regex/2015.05.10) and assign it to me?
 > I'll try to take a look and reproduce the issue. With luck, it may be
 > as simple as the wrong version of libpython35.a being picked up
 > somewhere.
 >
 > Just to check the obvious - you *are* using 32-bit Python 3.5b1 and a
 > 32-bit Mingw to build the 32-bit version, and 64-bit Python 3.5b1 and
 > a 64-bit Mingw to build the 64-bit one? (I.e., two installations of
 > Python and two of Mingw)
 >
I'm not sure what happened, but I'm now getting this for Python 3.5 
(32-bit):

C:\Python35(32-bit)\libs/libpython35.a(dsxbs01290.o):(.idata$7+0x0): 
undefined reference to `_head_C__build_cpython_PCBuild_win32_libpython35_a'
C:\Python35(32-bit)\libs/libpython35.a(dsxbs00283.o):(.idata$7+0x0): 
undefined reference to `_head_C__build_cpython_PCBuild_win32_libpython35_a'
C:\Python35(32-bit)\libs/libpython35.a(dsxbs00291.o):(.idata$7+0x0): 
undefined reference to `_head_C__build_cpython_PCBuild_win32_libpython35_a'
C:\Python35(32-bit)\libs/libpython35.a(dsxbs00273.o):(.idata$7+0x0): 
undefined reference to `_head_C__build_cpython_PCBuild_win32_libpython35_a'
C:\Python35(32-bit)\libs/libpython35.a(dsxbs00255.o):(.idata$7+0x0): 
undefined reference to `_head_C__build_cpython_PCBuild_win32_libpython35_a'
C:\Python35(32-bit)\libs/libpython35.a(dsxbs01280.o):(.idata$7+0x0): 
more undefined references to 
`_head_C__build_cpython_PCBuild_win32_libpython35_a' follow
collect2: ld returned 1 exit status


All other builds, from Python 2.5 to Python 3.4, both 32-bit and 64-bit, 
and also Python 3.5 (64-bit), work.

The 32-bit Python says it's 32-bit and the 64-bit Python says it's 64-bit.

---8<---

C:

rem Compile for Python 3.5 (64-bit) [works]
cd C:\MinGW64\bin
"C:\MinGW64\bin\gcc.exe" -mdll -m64 -DMS_WIN64 -O2 -Wall -Wsign-compare 
-Wconversion -I"C:\Python35\include" -c 
"D:\projects\mrab-regex\regex_3\regex\_regex_unicode.c" -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)\_regex_unicode.o"
"C:\MinGW64\bin\gcc.exe" -mdll -m64 -DMS_WIN64 -O2 -Wall -Wsign-compare 
-Wconversion -I"C:\Python35\include" -c 
"D:\projects\mrab-regex\regex_3\regex\_regex.c" -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.o"
"C:\MinGW64\bin\gcc.exe" -m64 -shared -s 
"D:\projects\mrab-regex\regex_3\Release(3.5)\_regex_unicode.o" 
"D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.o" 
-L"C:\Python35\libs" -lpython35 -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.pyd"

rem Compile for Python 3.5 (32-bit) [fails]
cd C:\MinGW\bin
"C:\MinGW\bin\gcc.exe" -mdll -m32  -O2 -Wall -Wsign-compare -Wconversion 
-I"C:\Python35(32-bit)\include" -c 
"D:\projects\mrab-regex\regex_3\regex\_regex_unicode.c" -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex_unicode.o"
"C:\MinGW\bin\gcc.exe" -mdll -m32  -O2 -Wall -Wsign-compare -Wconversion 
-I"C:\Python35(32-bit)\include" -c 
"D:\projects\mrab-regex\regex_3\regex\_regex.c" -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.o"
"C:\MinGW\bin\gcc.exe" -m32 -shared -s 
"D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex_unicode.o" 
"D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.o" 
-L"C:\Python35(32-bit)\libs" -lpython35 -o 
"D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.pyd"

---8<---
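As an aside, a quick way to confirm the bitness of each interpreter from
within Python itself, rather than trusting the installer's labelling, is a
minimal stdlib-only check like this (a sketch, not part of the build
scripts above):

```python
import platform
import struct
import sys

# The size of a C pointer tells you the build's bitness directly:
# 4 bytes on a 32-bit build, 8 bytes on a 64-bit build.
bits = struct.calcsize("P") * 8
print(bits)  # 32 or 64

# Two cross-checks that should agree with the pointer size.
print(platform.architecture()[0])  # '32bit' or '64bit'
print(sys.maxsize > 2**32)         # True only on 64-bit builds
```

Running this under each `python.exe` involved rules out accidentally
linking a 32-bit extension against a 64-bit interpreter (or vice versa).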



From ncoghlan at gmail.com  Tue May 26 01:06:50 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 26 May 2015 09:06:50 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
Message-ID: <CADiSq7fNum9C5HpJSmxP=VSou4mYX+anCYOtPSskPSseD+VTew@mail.gmail.com>

On 26 May 2015 05:41, "Eric Snow" <ericsnowcurrently at gmail.com> wrote:
>
> Regardless, I know there were a few folks (e.g. Yury) that wanted to
> see C OrderedDict in 3.5 and there may be others that would really
> like OrderedDict-by-default in 3.5 (Nick?).

I think it's the combination with PEP 487 that makes OrderedDict-by-default
genuinely compelling, so I don't mind if the application to class
namespaces slips to 3.6 (and perhaps even becomes part of that PEP).

For the feature freeze exception for the C odict implementation itself, I
think I'm morally obliged to back that given my highly debatable decision
to check in the PEP 489 implementation even though we hadn't worked out all
the kinks in module unloading yet (Petr subsequently got all of the
refleaks in the machinery itself sorted out before the beta, including a
previously undetected one in PyType_FromSpecAndBases, but there's still a
leak when unloading the new multi-phase initialisation test module).

Regards,
Nick.

From larry at hastings.org  Tue May 26 02:30:02 2015
From: larry at hastings.org (Larry Hastings)
Date: Mon, 25 May 2015 17:30:02 -0700
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
Message-ID: <5563BE8A.9070406@hastings.org>



On 05/25/2015 03:22 PM, Eric Snow wrote:
> On Mon, May 25, 2015 at 2:40 PM, Terry Reedy <tjreedy at udel.edu> wrote:
>> On 5/25/2015 3:40 PM, Eric Snow wrote:
>>> Since Larry already gave an exception,
>> Conditional on 'general approval of the community'.
> Unless I misunderstood him, Larry gave me an unconditional exception
> for OrderedDict itself (as long as it is in before beta 2.)

For the record I've granted three exceptions to the beta 1 feature 
freeze (so far):

  * Raymond asked for one (a couple weeks ago!) for adding slice support
    to collections.deque.  He knew he wouldn't have time to finish it
    before beta 1.
  * Serhiy asked for one very-last-minute for a C reimplementation of
    lru_cache.  He checked it in about a half-hour before feature freeze
    and it made all the buildbots fail. (The ones that weren't already
    failing, that is.)
  * Eric asked for one for this C reimplementation of OrderedDict; the
    coding was done, the debugging wasn't.

And yes, as Eric said, I made separate pronouncements.  I said 
COrderedDict could go in as long as it was in before beta 2; "the other 
work" of __definition_order__ and switching type_prepare and 
__build_class__ to using ordered dicts I made conditional on "general 
approval of the community."  The latter has already been tabled for now.


So, in all three cases it's work that's been under development for a 
while.  These people did this work out of the kindness of their hearts, 
to make Python better.  As a community we want to encourage that and 
make sure these developers know we appreciate their efforts.  These 
people would be happier if the work shipped in 3.5 as opposed to 3.6 so 
it got into users' hands sooner.

Also, in Serhiy and Eric's cases, these are reimplementations of 
existing Python libraries in C.  On the one hand, that means we should 
have good regression test coverage in the library--which it seems like 
we do, as both of them are debugging problems uncovered by the 
regression tests.  This gives us a little more confidence that the work 
is good.  On the other hand, it does mean there's a higher chance of 
destabilization, as there's already an installed base using these 
libraries.  (As opposed to something new like math.isclose which has no 
installed base.)  So yes this could introduce bugs that will impact 
existing users.


Bottom line: while an important part of my job is saying "no", I 
also feel like an important part of my job is saying "yes".  On balance, 
what will be best for Python?  In these cases, I think "yes" is better.  
My feeling is, let's check it in (before beta 2), and if it causes 
problems during the betas / rcs we can back them out.


//arry/

From ericsnowcurrently at gmail.com  Tue May 26 04:00:06 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 25 May 2015 20:00:06 -0600
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <5563BE8A.9070406@hastings.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org>
Message-ID: <CALFfu7DDA=JqcQSUZvdbiaoE2gm9kMPu5cV2EN48R5_1othqWQ@mail.gmail.com>

On Mon, May 25, 2015 at 6:30 PM, Larry Hastings <larry at hastings.org> wrote:
> Eric asked for one for this C reimplementation of OrderedDict; the coding
> was done, the debugging wasn't.
>
> And yes, as Eric said, I made separate pronouncements.  I said COrderedDict
> could go in as long as it was in before beta 2; "the other work" of
> __definition_order__ and switching type_prepare and __build_class__ to using
> ordered dicts I made conditional on "general approval of the community."
> The latter has already been tabled for now.
>
>
> So, in all three cases it's work that's been under development for a while.
> These people did this work out of the kindness of their hearts, to make
> Python better.  As a community we want to encourage that and make sure these
> developers know we appreciate their efforts.  These people would be happier
> if the work shipped in 3.5 as opposed to 3.6 so it got into users' hands
> sooner.
>
> Also, in Serhiy and Eric's cases, these are reimplementations of existing
> Python libraries in C.  On the one hand, that means we should have good
> regression test coverage in the library--which it seems like we do, as both
> of them are debugging problems uncovered by the regression tests.  This
> gives us a little more confidence that the work is good.  On the other hand,
> it does mean there's a higher chance of destabilization, as there's already
> an installed base using these libraries.  (As opposed to something new like
> math.isclose which has no installed base.)  So yes this could introduce bugs
> that will impact existing users.
>
>
> Bottom line: while an important part of my job is saying "no", I also
> feel like an important part of my job is saying "yes".  On balance, what
> will be best for Python?  In these cases, I think "yes" is better.  My
> feeling is, let's check it in (before beta 2), and if it causes problems
> during the betas / rcs we can back them out.

Thanks, Larry.  As to the conditional exceptions, I'm just going to
drop those entirely in favor of getting them in 3.6.  I'll still
pursue OrderedDict for 3.5b2 though (and hope to land it this week).

-eric

From yselivanov.ml at gmail.com  Tue May 26 07:33:46 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 26 May 2015 01:33:46 -0400
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
 tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <5563F5E6.3080705@udel.edu>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
Message-ID: <556405BA.3090201@gmail.com>



On 2015-05-26 12:26 AM, Terry Reedy wrote:
>> +    try:
>> +        encoding, lines = detect_encoding(buffer.readline)
>> +        buffer.seek(0)
>> +        text = TextIOWrapper(buffer, encoding, line_buffering=True)
>> +        text.mode = 'r'
>> +        return text
>> +    except:
>> +        buffer.close()
>> +        raise
>
> Please do not add bare 'except:'.  If you mean 'except 
> BaseException:', say so. 

try..finally would be even better.

Yury

From victor.stinner at gmail.com  Tue May 26 08:20:01 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Tue, 26 May 2015 08:20:01 +0200
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
 tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <556405BA.3090201@gmail.com>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
 <556405BA.3090201@gmail.com>
Message-ID: <CAMpsgwbrJCifEj0594ya5vJgonyTgJQgkXHwGM_SkvU1p1L3Vg@mail.gmail.com>

What is wrong with "except:" in this specific case?

Victor

Le mardi 26 mai 2015, Yury Selivanov <yselivanov.ml at gmail.com> a ?crit :

>
>
> On 2015-05-26 12:26 AM, Terry Reedy wrote:
>
>> +    try:
>>> +        encoding, lines = detect_encoding(buffer.readline)
>>> +        buffer.seek(0)
>>> +        text = TextIOWrapper(buffer, encoding, line_buffering=True)
>>> +        text.mode = 'r'
>>> +        return text
>>> +    except:
>>> +        buffer.close()
>>> +        raise
>>>
>>
>> Please do not add bare 'except:'.  If you mean 'except BaseException:',
>> say so.
>>
>
> try..finally would be even better.
>
> Yury
>

From storchaka at gmail.com  Tue May 26 08:44:22 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Tue, 26 May 2015 09:44:22 +0300
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
 tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <556405BA.3090201@gmail.com>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
 <556405BA.3090201@gmail.com>
Message-ID: <mk14o6$bdm$1@ger.gmane.org>

On 26.05.15 08:33, Yury Selivanov wrote:
> On 2015-05-26 12:26 AM, Terry Reedy wrote:
>>> +    try:
>>> +        encoding, lines = detect_encoding(buffer.readline)
>>> +        buffer.seek(0)
>>> +        text = TextIOWrapper(buffer, encoding, line_buffering=True)
>>> +        text.mode = 'r'
>>> +        return text
>>> +    except:
>>> +        buffer.close()
>>> +        raise
>>
>> Please do not add bare 'except:'.  If you mean 'except
>> BaseException:', say so.
>
> try..finally would be even better.

No, finally is not correct there. buffer shouldn't be closed if no 
exception is raised.

And this is one of the cases in which bare 'except:' is absolutely legal.
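To make the distinction concrete: the resource must be released only on
the error path, because on success it is handed to the caller still open.
A `finally` clause would close it unconditionally. Here is a minimal
sketch of the pattern (`open_text` is a hypothetical helper, not the
actual `tokenize.open`):

```python
import io


def open_text(path):
    """Open a file, closing it only if something fails before we return it."""
    buffer = open(path, "rb")
    try:
        header = buffer.read(16)  # stand-in for detect_encoding()
        buffer.seek(0)
        # On success the wrapper (and the underlying buffer) stays open
        # for the caller, so a finally: buffer.close() would be wrong.
        return io.TextIOWrapper(buffer, "utf-8")
    except:  # bare: even KeyboardInterrupt must not leak the file descriptor
        buffer.close()
        raise
```

The bare `except:` plus `raise` re-raises whatever happened, so nothing is
swallowed; it merely guarantees the cleanup runs on *every* possible
exception, which is the stdlib idiom Victor refers to downthread.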


From p.f.moore at gmail.com  Tue May 26 08:49:29 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 26 May 2015 07:49:29 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <5563B18B.4040305@mrabarnett.plus.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
Message-ID: <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>

On 26 May 2015 at 00:34, MRAB <python at mrabarnett.plus.com> wrote:
> The 32-bit Python says it's 32-bit and the 64-bit Python says it's 64-bit.
>
> ---8<---
>
> C:
>
> rem Compile for Python 3.5 (64-bit) [works]
> cd C:\MinGW64\bin
> "C:\MinGW64\bin\gcc.exe" -mdll -m64 -DMS_WIN64 -O2 -Wall -Wsign-compare
> -Wconversion -I"C:\Python35\include" -c
> "D:\projects\mrab-regex\regex_3\regex\_regex_unicode.c" -o
> "D:\projects\mrab-regex\regex_3\Release(3.5)\_regex_unicode.o"
> "C:\MinGW64\bin\gcc.exe" -mdll -m64 -DMS_WIN64 -O2 -Wall -Wsign-compare
> -Wconversion -I"C:\Python35\include" -c
> "D:\projects\mrab-regex\regex_3\regex\_regex.c" -o
> "D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.o"
> "C:\MinGW64\bin\gcc.exe" -m64 -shared -s
> "D:\projects\mrab-regex\regex_3\Release(3.5)\_regex_unicode.o"
> "D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.o" -L"C:\Python35\libs"
> -lpython35 -o "D:\projects\mrab-regex\regex_3\Release(3.5)\_regex.pyd"
>
> rem Compile for Python 3.5 (32-bit) [fails]
> cd C:\MinGW\bin
> "C:\MinGW\bin\gcc.exe" -mdll -m32  -O2 -Wall -Wsign-compare -Wconversion
> -I"C:\Python35(32-bit)\include" -c
> "D:\projects\mrab-regex\regex_3\regex\_regex_unicode.c" -o
> "D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex_unicode.o"
> "C:\MinGW\bin\gcc.exe" -mdll -m32  -O2 -Wall -Wsign-compare -Wconversion
> -I"C:\Python35(32-bit)\include" -c
> "D:\projects\mrab-regex\regex_3\regex\_regex.c" -o
> "D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.o"
> "C:\MinGW\bin\gcc.exe" -m32 -shared -s
> "D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex_unicode.o"
> "D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.o"
> -L"C:\Python35(32-bit)\libs" -lpython35 -o
> "D:\projects\mrab-regex\regex_3\Release(3.5)(32-bit)\_regex.pyd"
>
> ---8<---

Do you get the same failure when using distutils to build the extension?

Paul

PS This discussion should probably be moved to bugs.python.org, as I
mentioned...

From victor.stinner at gmail.com  Tue May 26 09:54:49 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Tue, 26 May 2015 09:54:49 +0200
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
 tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <mk14o6$bdm$1@ger.gmane.org>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
 <556405BA.3090201@gmail.com> <mk14o6$bdm$1@ger.gmane.org>
Message-ID: <CAMpsgwbFNyWkroEgs78SDYAfDzGh_uQyntJOD=iNyHOBLuQQPQ@mail.gmail.com>

2015-05-26 8:44 GMT+02:00 Serhiy Storchaka <storchaka at gmail.com>:
> No, finally is not correct there. buffer shouldn't be closed if no exception
> is raised.

Yep. The binary file must only be closed in case of error, as written
in the commit message.

> And this is one of the cases in which bare 'except:' is absolutely legal.

The "except: <cleanup code>; raise" is common in the Python stdlib.

Victor

From p.f.moore at gmail.com  Tue May 26 11:50:17 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 26 May 2015 10:50:17 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
Message-ID: <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>

On 26 May 2015 at 07:49, Paul Moore <p.f.moore at gmail.com> wrote:
> Do you get the same failure when using distutils to build the extension?

Hmm, I just checked and it seems that only Python 3.5 ships
libpythonXY.a by default - 3.4 and earlier (at least on my machine)
don't have it. Presumably you generated it yourself for the previous
versions?

So I wonder if this is a difference between how you built your version
previously and how the shipped version is now built. The code to build
the shipped version is in Tools\msi\dev\dev.wixproj. It runs

    gendef - "$(BuildPath)$(PyDllName).dll" >
"$(IntermediateOutputPath)mingwlib.def"
    dlltool --dllname $(PyDllName).dll --def
"$(IntermediateOutputPath)mingwlib.def" --output-lib
"$(BuildPath)lib$(PyDllName).a" -m $(_GenDefPlatform)

which looks OK to me (_GenDefPlatform is i386 on 32-bit and
i386:x86-64 on 64-bit).

Paul

From Steve.Dower at microsoft.com  Tue May 26 14:55:59 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Tue, 26 May 2015 12:55:59 +0000
Subject: [Python-Dev] Unable to build regex module against Python
	3.5	32-bit
In-Reply-To: <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>,
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
Message-ID: <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>

The builds I am responsible for include it because someone reported an issue and was persistent and helpful enough that I fixed it for them.

That said, until MinGW agrees on a stable branch/version/fork, there seems to be a good chance that the shipped lib won't work for some people. If this is what's happened here, I see it as a good enough reason to stop shipping the lib and to add instructions on generating it instead (the gendef/dlltool dance).

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Paul Moore<mailto:p.f.moore at gmail.com>
Sent: 5/26/2015 2:50
To: MRAB<mailto:python at mrabarnett.plus.com>
Cc: Python-Dev<mailto:python-dev at python.org>
Subject: Re: [Python-Dev] Unable to build regex module against Python 3.5 32-bit

On 26 May 2015 at 07:49, Paul Moore <p.f.moore at gmail.com> wrote:
> Do you get the same failure when using distutils to build the extension?

Hmm, I just checked and it seems that only Python 3.5 ships
libpythonXY.a by default - 3.4 and earlier (at least on my machine)
don't have it. Presumably you generated it yourself for the previous
versions?

So I wonder if this is a difference between how you built your version
previously and how the shipped version is now built. The code to build
the shipped version is in Tools\msi\dev\dev.wixproj. It runs

    gendef - "$(BuildPath)$(PyDllName).dll" >
"$(IntermediateOutputPath)mingwlib.def"
    dlltool --dllname $(PyDllName).dll --def
"$(IntermediateOutputPath)mingwlib.def" --output-lib
"$(BuildPath)lib$(PyDllName).a" -m $(_GenDefPlatform)

which looks OK to me (_GenDefPlatform is i386 on 32-bit and
i386:x86-64 on 64-bit).

Paul
_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150526/5b1f5f7b/attachment.html>

From p.f.moore at gmail.com  Tue May 26 15:24:09 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 26 May 2015 14:24:09 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
 <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>

On 26 May 2015 at 13:55, Steve Dower <Steve.Dower at microsoft.com> wrote:
> The builds I am responsible for include it because someone reported an issue
> and was persistent and helpful enough that I fixed it for them.
>
> That said, until MinGW agrees on a stable branch/version/fork, there seems
> to be a good chance that the shipped lib won't work for some people. If this
> is what's happened here, I see it as a good enough reason to stop shipping
> the lib and to add instructions on generating it instead (the gendef/dlltool
> dance).

Agreed. If shipping it helps, then great. If it's going to cause bug
reports, let's go back to the status quo of not having it. The
instructions for generating it were in the old distutils docs, now
removed in the cleanup / redirection to packaging.python.org. I'm
inclined to just leave it undocumented - the people who need it know
how to do it or can find it, whereas documenting the process implies a
level of support that we're not yet really able to provide.

Let's wait to see what the OP comes back with before making a final
decision. I see a few questions:

1. Does using distutils work (as opposed to the quoted manual compile steps)?
2. Does using whatever process he used in the past to generate
libpythonXY.a result in a working version?

https://docs.python.org/2.7/install/index.html#gnu-c-cygwin-mingw
suggests using pexports rather than gendef. I don't know if that could
produce a different result, but I can't see how... It also implicitly
assumes you're using dlltool from the toolchain you'll be building
with rather than using -m. Again, I can't see why that would affect
the results.

Paul.

From yselivanov.ml at gmail.com  Tue May 26 15:30:38 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 26 May 2015 09:30:38 -0400
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
 tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <CAMpsgwbFNyWkroEgs78SDYAfDzGh_uQyntJOD=iNyHOBLuQQPQ@mail.gmail.com>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
 <556405BA.3090201@gmail.com> <mk14o6$bdm$1@ger.gmane.org>
 <CAMpsgwbFNyWkroEgs78SDYAfDzGh_uQyntJOD=iNyHOBLuQQPQ@mail.gmail.com>
Message-ID: <5564757E.9090307@gmail.com>



On 2015-05-26 3:54 AM, Victor Stinner wrote:
> 2015-05-26 8:44 GMT+02:00 Serhiy Storchaka <storchaka at gmail.com>:
>> No, finally is not correct there. buffer shouldn't be closed if no exception
>> is raised.
> Yep. The binary file must only be closed in case of error, as written
> in the commit message.


Right.  My bad, sorry Victor ;)

Yury

From jimjjewett at gmail.com  Tue May 26 17:13:18 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Tue, 26 May 2015 08:13:18 -0700 (PDT)
Subject: [Python-Dev] Preserving the definition order of
	class	namespaces.
In-Reply-To: <CADiSq7fuHp7byY1cbweCNf2vrMocRHZvrHm=fJtYwZCR-=tRyg@mail.gmail.com>
Message-ID: <55648d8e.a6ab320a.4475.5787@mx.google.com>



On Sun May 24 12:06:40 CEST 2015, Nick Coghlan wrote:
> On 24 May 2015 at 19:44, Mark Shannon <mark at hotpy.org> wrote:
>> On 24/05/15 10:35, Nick Coghlan wrote:
>>> If we leave __definition_order__ out for the time being then, for the
>>> vast majority of code, the fact that the ephemeral namespace used to
>>> evaluate the class body switched from being a basic dictionary to an
>>> ordered one would be a hidden implementation detail, rather than
>>> making all type objects a little bigger.

>> and a little slower.

> The runtime namespace used to store the class attributes is remaining
> a plain dict object regardless,

Lookup isn't any slower in the ordereddict.

Inserts are slower -- and those would happen in the ordereddict, as
the type object is being defined.

Note that since we're talking about the type objects, rather than the
instances, most* long-running code won't care, but it will hurt startup
time.

*code which creates lots of throwaway classes is obviously an exception.

FWIW, much of the extra per-insert cost is driven by either the need to
keep deletion O(1) or the desire to keep the C layout binary compatible.

A different layout (with its own lookdict) could optimize for the
insert-each-value-once case, or even for small dicts (e.g., keyword
dicts).  I could imagine this producing a speedup, with the ordering
being just a side benefit.  It is too late to use such a new layout by
default in 3.5, but we should be careful not to close it off.  (That
said, I don't think __definition_order__ would actually close it off,
though it might start to look like a wart.)
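[Editorial note: the definition-order behaviour under discussion can already be opted into today with a metaclass whose __prepare__ returns an ordered mapping. A minimal sketch follows — the names OrderedMeta and __definition_order__ here are illustrative of the proposal, not an existing API:]

```python
from collections import OrderedDict


class OrderedMeta(type):
    # The mapping returned here is the namespace the class body executes in;
    # using an OrderedDict records each insert, which is exactly where the
    # extra per-insert cost Jim mentions would be paid.
    @classmethod
    def __prepare__(mcls, name, bases, **kwds):
        return OrderedDict()

    def __new__(mcls, name, bases, namespace, **kwds):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # Keep the user-visible names, in the order they were defined.
        cls.__definition_order__ = tuple(
            n for n in namespace if not n.startswith('__')
        )
        return cls


class Point(metaclass=OrderedMeta):
    x = 0
    y = 0

    def move(self, dx, dy):
        return (self.x + dx, self.y + dy)


print(Point.__definition_order__)  # ('x', 'y', 'move')
```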

-jJ

--

If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them.  -jJ

From python at mrabarnett.plus.com  Tue May 26 19:23:12 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Tue, 26 May 2015 18:23:12 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
 32-bit
In-Reply-To: <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>	<CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>	<5563B18B.4040305@mrabarnett.plus.com>	<CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>	<CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>	<BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
Message-ID: <5564AC00.6010509@mrabarnett.plus.com>

On 2015-05-26 14:24, Paul Moore wrote:
 > On 26 May 2015 at 13:55, Steve Dower <Steve.Dower at microsoft.com> wrote:
 > > The builds I am responsible for include it because someone reported an issue
 > > and was persistent and helpful enough that I fixed it for them.
 > >
 > > That said, until MinGW agrees on a stable branch/version/fork, there seems
 > > to be a good chance that the shipped lib won't work for some people. If this
 > > is what's happened here, I see it as a good enough reason to stop shipping
 > > the lib and to add instructions on generating it instead (the gendef/dlltool
 > > dance).
 >
 > Agreed. If shipping it helps, then great. If it's going to cause bug
 > reports, let's go back to the status quo of not having it. The
 > instructions for generating it were in the old distutils docs, now
 > removed in the cleanup / redirection to packaging.python.org. I'm
 > inclined to just leave it undocumented - the people who need it know
 > how to do it or can find it, whereas documenting the process implies a
 > level of support that we're not yet really able to provide.
 >
 > Let's wait to see what the OP comes back with before making a final
 > decision. I see a few questions:
 >
 > 1. Does using distutils work (as opposed to the quoted manual compile steps)?
 > 2. Does using whatever process he used in the past to generate
 > libpythonXY.a result in a working version?
 >
 > https://docs.python.org/2.7/install/index.html#gnu-c-cygwin-mingw
 > suggests using pexports rather than gendef. I don't know if that could
 > produce a different result, but I can't see how... It also implicitly
 > assumes you're using dlltool from the toolchain you'll be building
 > with rather than using -m. Again, I can't see why that would affect
 > the results.
 >
It's been so long since the last Python release that I didn't remember 
that I had to make the lib file.

I made libpython35.a like I did for the others and it's all working now. 
All the tests pass. :-)


From p.f.moore at gmail.com  Tue May 26 19:27:06 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Tue, 26 May 2015 18:27:06 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <5564AC00.6010509@mrabarnett.plus.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
 <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
 <5564AC00.6010509@mrabarnett.plus.com>
Message-ID: <CACac1F9rNceTVMnQCZmug3eeYseZR+0CKS4YWdzAGo0hOom3zA@mail.gmail.com>

On 26 May 2015 at 18:23, MRAB <python at mrabarnett.plus.com> wrote:
> I made libpython35.a like I did for the others and it's all working now. All
> the tests pass. :-)

Cool. How did you make libpython35.a? Sounds like there is some
difference in the process being used for the shipped version.

Paul

From rdmurray at bitdance.com  Tue May 26 23:12:49 2015
From: rdmurray at bitdance.com (R. David Murray)
Date: Tue, 26 May 2015 17:12:49 -0400
Subject: [Python-Dev] [Python-checkins] cpython (3.4): Issue #23840:
	tokenize.open() now closes the temporary binary file on error to
In-Reply-To: <CAMpsgwbrJCifEj0594ya5vJgonyTgJQgkXHwGM_SkvU1p1L3Vg@mail.gmail.com>
References: <20150525224931.126980.95819@psf.io> <5563F5E6.3080705@udel.edu>
 <556405BA.3090201@gmail.com>
 <CAMpsgwbrJCifEj0594ya5vJgonyTgJQgkXHwGM_SkvU1p1L3Vg@mail.gmail.com>
Message-ID: <20150526211249.5C4E1250DE6@webabinitio.net>

On Tue, 26 May 2015 08:20:01 +0200, Victor Stinner <victor.stinner at gmail.com> wrote:
> What is wrong with "except:" in this specific case?

Nothing is wrong with it from a technical standpoint.  However, if we
use 'except BaseException' that makes it clear that someone has thought
about it and decided that all exceptions should be caught, as opposed to
it being legacy code or a programming mistake.

> > On 2015-05-26 12:26 AM, Terry Reedy wrote:
> >
> >>> +    try:
> >>> +        encoding, lines = detect_encoding(buffer.readline)
> >>> +        buffer.seek(0)
> >>> +        text = TextIOWrapper(buffer, encoding, line_buffering=True)
> >>> +        text.mode = 'r'
> >>> +        return text
> >>> +    except:
> >>> +        buffer.close()
> >>> +        raise
> >>>
> >>
> >> Please do not add bare 'except:'.  If you mean 'except BaseException:',
> >> say so.
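[Editorial note: the idiom being debated — close the resource only on failure, then re-raise — can be sketched generically. The helper name open_wrapped below is hypothetical, not the actual tokenize code:]

```python
import io


def open_wrapped(buffer, encoding='utf-8'):
    # Wrap an already-open binary buffer; on *any* failure, close it and
    # re-raise so the caller never leaks the handle. Spelling the clause as
    # 'except BaseException' rather than a bare 'except:' makes it explicit
    # that even KeyboardInterrupt and SystemExit trigger the cleanup.
    try:
        return io.TextIOWrapper(buffer, encoding=encoding,
                                line_buffering=True)
    except BaseException:
        buffer.close()
        raise


# Success path: the buffer stays open for the wrapper to use.
good = io.BytesIO(b"data")
text = open_wrapped(good)

# Failure path: an unknown encoding raises LookupError at construction,
# so the cleanup clause closes the buffer before re-raising.
bad = io.BytesIO(b"data")
try:
    open_wrapped(bad, encoding="no-such-codec")
except LookupError:
    pass
```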

From python at mrabarnett.plus.com  Wed May 27 04:02:52 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Wed, 27 May 2015 03:02:52 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
 32-bit
In-Reply-To: <CACac1F9rNceTVMnQCZmug3eeYseZR+0CKS4YWdzAGo0hOom3zA@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>	<CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>	<5563B18B.4040305@mrabarnett.plus.com>	<CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>	<CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>	<BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>	<CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>	<5564AC00.6010509@mrabarnett.plus.com>
 <CACac1F9rNceTVMnQCZmug3eeYseZR+0CKS4YWdzAGo0hOom3zA@mail.gmail.com>
Message-ID: <556525CC.4050105@mrabarnett.plus.com>

On 2015-05-26 18:27, Paul Moore wrote:
 > On 26 May 2015 at 18:23, MRAB <python at mrabarnett.plus.com> wrote:
 > > I made libpython35.a like I did for the others and it's all working now.
 > > All the tests pass. :-)
 >
 > Cool. How did you make libpython35.a? Sounds like there is some
 > difference in the process being used for the shipped version.
 >
For making the .def files, I used:

     C:\MinGW64\x86_64-w64-mingw32\bin\gendef.exe

MinGW didn't contain gendef.exe!

For making the .a files, I used:

     C:\MinGW64\bin\dlltool.exe

for the 64-bit builds and:

     C:\MinGW\bin\dlltool.exe

for the 32-bit builds.

They all worked.

When I tried:

     C:\MinGW64\bin\dlltool.exe

or:

     C:\MinGW64\x86_64-w64-mingw32\bin\dlltool.exe

for the 32-bit builds, they wouldn't link.


From p.f.moore at gmail.com  Wed May 27 08:39:58 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 27 May 2015 07:39:58 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <556525CC.4050105@mrabarnett.plus.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
 <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
 <5564AC00.6010509@mrabarnett.plus.com>
 <CACac1F9rNceTVMnQCZmug3eeYseZR+0CKS4YWdzAGo0hOom3zA@mail.gmail.com>
 <556525CC.4050105@mrabarnett.plus.com>
Message-ID: <CACac1F8983X=+kOmQ2_e=s11EHV8rZH6MrQ2S45+Jpy1+7qorg@mail.gmail.com>

On 27 May 2015 at 03:02, MRAB <python at mrabarnett.plus.com> wrote:
> When I tried:
>
>     C:\MinGW64\bin\dlltool.exe
>
> or:
>
>     C:\MinGW64\x86_64-w64-mingw32\bin\dlltool.exe
>
> for the 32-bit builds, they wouldn't link.

Was that with "-m i386"? If so, then I suspect that's the issue.
Steve, did you use 64-bit mingw to build the .a files? Assuming so, I
guess we either stop shipping libpythonXY.a, or the instructions for
building the Windows release need to clearly state that you want a
32-bit mingw on PATH for the 32-bit builds, and a 64-bit mingw on PATH
for the 64-bit builds (which sounds messy and error-prone :-()

I'd be inclined to call this a mingw bug. However, I don't have the
first clue where to report it.

Paul

From ncoghlan at gmail.com  Wed May 27 10:10:30 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 27 May 2015 18:10:30 +1000
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
 <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
Message-ID: <CADiSq7f7odgBhTGVDkXo33qQkHeRy34HjE31aMJaLgfjVBp70w@mail.gmail.com>

On 26 May 2015 23:25, "Paul Moore" <p.f.moore at gmail.com> wrote:
>
> On 26 May 2015 at 13:55, Steve Dower <Steve.Dower at microsoft.com> wrote:
> > The builds I am responsible for include it because someone reported an issue
> > and was persistent and helpful enough that I fixed it for them.
> >
> > That said, until MinGW agrees on a stable branch/version/fork, there seems
> > to be a good chance that the shipped lib won't work for some people. If this
> > is what's happened here, I see it as a good enough reason to stop shipping
> > the lib and to add instructions on generating it instead (the gendef/dlltool
> > dance).
>
> Agreed. If shipping it helps, then great. If it's going to cause bug
> reports, let's go back to the status quo of not having it. The
> instructions for generating it were in the old distutils docs, now
> removed in the cleanup / redirection to packaging.python.org. I'm
> inclined to just leave it undocumented - the people who need it know
> how to do it or can find it, whereas documenting the process implies a
> level of support that we're not yet really able to provide.

The old distutils docs aren't gone; the top-level links just moved to the
distutils package docs: https://docs.python.org/3/library/distutils.html

I kept them (with the same deep link URLs) because I know there's stuff in
there that isn't currently documented anywhere else. I moved them to a more
obscure location because there's also stuff in there that's thoroughly
outdated, and it's a non-trivial task to figure out which is which and move
the still useful stuff to a more appropriate home :)

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/3d701fcf/attachment-0001.html>

From solipsis at pitrou.net  Wed May 27 10:16:22 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 27 May 2015 10:16:22 +0200
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org>
Message-ID: <20150527101622.10a36d75@fsol>

On Mon, 25 May 2015 17:30:02 -0700
Larry Hastings <larry at hastings.org> wrote:
> 
> So, in all three cases it's work that's been under development for a 
> while.  These people did this work out of the kindness of their hearts, 
> to make Python better.  As a community we want to encourage that and 
> make sure these developers know we appreciate their efforts.  These 
> people would be happier if the work shipped in 3.5 as opposed to 3.6 so 
> it got into users' hands sooner.

I second that sentiment. But it strikes me that we're doing this
because our release frequency is completely ill-adapted. If we had
feature releases, say, every 6 or 9 months, the problem wouldn't really
exist in the first place. Exceptions granted by the RM only tackle a
very small portion of the problem, because the long delay before an
accepted patch appears in an official release *still* frustrates everyone,
and the unpredictability of exceptions only makes things *more*
frustrating for most players.

Of course, it was pretty much shut down before:
https://mail.python.org/pipermail/python-dev/2012-January/115619.html

Regards

Antoine.



From p.f.moore at gmail.com  Wed May 27 10:25:01 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 27 May 2015 09:25:01 +0100
Subject: [Python-Dev] Unable to build regex module against Python 3.5
	32-bit
In-Reply-To: <CADiSq7f7odgBhTGVDkXo33qQkHeRy34HjE31aMJaLgfjVBp70w@mail.gmail.com>
References: <556380A9.4000206@mrabarnett.plus.com>
 <CACac1F9EXVXH_8FEeyVoDkZBbGuDPz-+O4LzCLZ8PMjjOsr9_w@mail.gmail.com>
 <5563B18B.4040305@mrabarnett.plus.com>
 <CACac1F9aQ1Jj_HYQisjnDP3SJtVievP=oLgjEBJvjxj8btO9pg@mail.gmail.com>
 <CACac1F_j2-riRTvLzNe=dhrwbRVROqM+gT=rgyy4FzCAe-pibA@mail.gmail.com>
 <BY1PR03MB14661DDB918F7C6D961EE8B5F5CC0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F9hrDLMsiXs+YpP0fzdMG+9PHcT1q=HgpuhOe9xT1ZHtQ@mail.gmail.com>
 <CADiSq7f7odgBhTGVDkXo33qQkHeRy34HjE31aMJaLgfjVBp70w@mail.gmail.com>
Message-ID: <CACac1F-n8zN+5cxQu8DsBGQWBkUkAVTVQUngsHrS1OZMNmUVAQ@mail.gmail.com>

On 27 May 2015 at 09:10, Nick Coghlan <ncoghlan at gmail.com> wrote:
> The old distutils docs aren't gone, the top level links just moved to the
> distutils package docs: https://docs.python.org/3/library/distutils.html
>
> I kept them (with the same deep link URLs) because I know there's stuff in
> there that isn't currently documented anywhere else. I moved them to a more
> obscure location because there's also stuff in there that's thoroughly
> outdated, and it's a non-trivial task to figure out which is which and move
> the still useful stuff to a more appropriate home :)

Thanks.

Your plan worked perfectly, because I never knew they were there :-)

https://docs.python.org/3/install/index.html#older-versions-of-python-and-mingw
implies that the libpythonXY.a files are only needed in older versions
of Python/mingw. I don't know how true that is, although I do know
that mingw should be able to link directly to a DLL without needing a
lib file.

It would be interesting to know if MRAB's build process can use the
DLL, rather than requiring a lib file (or for that matter if distutils
works without the lib file!)

Paul

From ncoghlan at gmail.com  Wed May 27 10:34:29 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 27 May 2015 18:34:29 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <20150527101622.10a36d75@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
Message-ID: <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>

On 27 May 2015 18:18, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>
> On Mon, 25 May 2015 17:30:02 -0700
> Larry Hastings <larry at hastings.org> wrote:
> >
> > So, in all three cases it's work that's been under development for a
> > while.  These people did this work out of the kindness of their hearts,
> > to make Python better.  As a community we want to encourage that and
> > make sure these developers know we appreciate their efforts.  These
> > people would be happier if the work shipped in 3.5 as opposed to 3.6 so
> > it got into users' hands sooner.
>
> I second that sentiment. But it strikes me that we're doing this
> because our release frequency is completely ill-adapted. If we had
> feature releases, say, every 6 or 9 months, the problem wouldn't really
> exist in the first place. Exceptions granted by the RM only tackle a
> very small portion of the problem, because the long delay before an
> accepted patch appears in an official release *still* frustrates everyone,
> and the unpredictability of exceptions only makes things *more*
> frustrating for most players.

I'd actually like to pursue a more nuanced view of what's permitted in
maintenance releases, based on a combination of the language moratorium
PEP, and an approach inspired by PEP 466, requiring that every feature
added in a maintenance release be detectable through an attribute check on
a module (with corresponding support in dependency checking tools).
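[Editorial note: the attribute-check approach Nick describes is the same pattern PEP 466 recommends for its backports. A minimal sketch, using ssl.create_default_context — a real feature added in 3.4 and backported to 2.7.9 — as the detectable attribute:]

```python
import ssl

# Feature-detect by attribute rather than by version number, so the code
# works whether the feature arrived in a major or a maintenance release.
if hasattr(ssl, "create_default_context"):
    context = ssl.create_default_context()
else:
    # Fallback for interpreters predating the PEP 466 backport; the caller
    # would then have to configure verification and ciphers by hand.
    context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
```

Dependency-checking tools could apply the same `hasattr` probe to decide whether a given interpreter satisfies a requirement, without parsing micro-version numbers.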

The problem with simply speeding up the release cycle without constraining
the interim releases in some way is that it creates significant pain for
alternate implementations and for downstream redistributors (many of whom
are still dealing with the fallout of the Python 3 transition).

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/ea243fb7/attachment.html>

From solipsis at pitrou.net  Wed May 27 11:02:34 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 27 May 2015 11:02:34 +0200
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
Message-ID: <20150527110234.5a0490a1@fsol>

On Wed, 27 May 2015 18:34:29 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> I'd actually like to pursue a more nuanced view of what's permitted in
> maintenance releases, based on a combination of the language moratorium
> PEP, and an approach inspired by PEP 466, requiring that every feature
> added in a maintenance release be detectable through an attribute check on
> a module (with corresponding support in dependency checking tools).

I think this will only complicate the rules and make contributing
a specialists' game. Every time you complicate the rules, you add
uncertainty, cognitive overhead, opportunities for lengthy discussions,
controversies (for an extreme example, think of the Wikipedia process).
There is enough boilerplate to care about right now when integrating a
patch.

> The problem with simply speeding up the release cycle without constraining
> the interim releases in some way is that it creates significant pain for
> alternate implementations and for downstream redistributors (many of whom
> are still dealing with the fallout of the Python 3 transition).

At some point, we should recognize our pain is more important than
others' when it comes to the fitness of *our* community. I don't see
those other people caring about our pain, or proposing, e.g., to offload
some of the maintenance burden (for example the 2.7 LTS maintenance
burden, the maintenance of security branches). For some reason it
sounds like we should be altruistic towards people who are not :-)

Regards

Antoine.

From ncoghlan at gmail.com  Wed May 27 14:40:53 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 27 May 2015 22:40:53 +1000
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <20150527110234.5a0490a1@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
 <20150527110234.5a0490a1@fsol>
Message-ID: <CADiSq7f+a3gPNVRcncmPwm4KwN3d+wWR0K6GnaeG7LHo2OLJ5Q@mail.gmail.com>

On 27 May 2015 at 19:02, Antoine Pitrou <solipsis at pitrou.net> wrote:
> At some point, we should recognize our pain is more important than
> others' when it comes to the fitness of *our* community. I don't see
> those other people caring about our pain, and proposing e.g. to offload
> some of the maintenance burden (for example the 2.7 LTS maintenance
> burden, the maintenance of security branches).

Sure we care, corporate politics and business cases are just complex
beasts to wrangle when it comes to justifying funding of upstream
contributions. The main problem I personally have pushing for
increased direct upstream investment at the moment is that we're still
trying to get folks off Python *2.6*, provide a smoother transition
plan for the Python 2.7 network security fixes, and similarly get
ready to help customers handle the Python 3 migration, so it's hard
for me to make the case that upstream maintenance is the task most in
need of immediate investment. (We also don't have a well established
culture of Python users reporting Python bugs to the commercial
redistributor providing their copy of Python, which breaks one of the
key metrics we rely on as redistributors for getting upstream
contributions funded appropriately: "only files bugs and
feature requests directly with the upstream project" and "doesn't use
the project at all" unfortunately look identical from a vendor
perspective)

Even with those sorts of challenges, Red Hat still covered
implementing the extension module importing improvements in 3.5 (as
well as making it possible for me to take the time for reviewing the
async/await changes, amongst other things), and HP have now taken over
from Rackspace as the primary funder of pypi.python.org development
(and a lot of PyPA development as well). The Red Hat sponsored CentOS
QA infrastructure is also the back end powering the pre-merge patch
testing Kushal set up. Red Hat and Canonical have also been major
drivers of Python 3 porting efforts for various projects as they've
aimed to migrate both Ubuntu and Fedora to using Python 3 as the
system Python.

Longer term, the best way I know of to get corporations to pick up the
tab for things like 2.7 maintenance is to measure it and publish the
results (as well as better publicising who is releasing products that
depend on it). Paying customers get nervous when open source
foundations are publishing contribution metrics that call into
question a vendor's ability to support their software (even if the
relevant foundation is too polite to point it out explicitly
themselves, just making this kind of data available means that
competing vendors inevitably use it to snipe at each other).

The OpenStack Foundation's stackalytics.openstack.com is a master
class in doing this well, with the Linux Foundation's annual kernel
development report a close second, so I'd actually like to get the PSF
to fund regular contribution metrics for CPython. Setting that up
unfortunately hasn't made it to the top of my todo list yet
(obviously, given it hasn't been done), but I'm cautiously optimistic
about being able to get to it at some point this year.

> For some reason it
> sounds like we should be altruistic towards people who are not :-)

Nope, if we're doing things for the benefit of corporations, we should
demand they pay for it if they want it done (or at least be happy that
the value of the free software, services and contributions they're
making available to the community are sufficient to earn them a fair
hearing).

However, even if we personally happen to be sceptical of the
contributions particular corporations have made to the open source
community (or choose to discount any contributions that consist
specifically of bug fix commits to the CPython repo), we don't get to
ignore their *users* so cavalierly, and we've ended up in a situation
where the majority of Python users are 5-7 years behind the progress
of core development at this point.

Donald's PyPI statistics
(https://caremad.io/2015/04/a-year-of-pypi-downloads/) suggest to me
that Linux distributions may need to shoulder a lot of the blame for
that, and assuming I'm right about that, fixing it involves changing
the way Linux distributions work in general, rather than being a
Python-specific problem (hence proposals like
https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManagement
and new ways of approaching building Linux distributions, like CoreOS
and Project Atomic).

However, the fact that 2.6 currently still has a larger share of PyPI
downloads than Python 3, and that 2.7 is still by far the single most
popular version is a problem that we need to be worried about upstream
as well.

That doesn't point towards "we should make the entire standard library
move faster" for me: it points me towards leveraging PyPI to further
decouple the rate of evolution of the ecosystem from the rate of
evolution of the core interpreter and standard library. If attribute
level granularity sounds too complex (and I agree that such a scheme
would be awkward to work with), then it would also likely be feasible
to expand on the "bundled package" approach that was pioneered with
ensurepip, and have additional projects provided by default with
CPython, but have them be truly independently versioned using the
standard Python package management tools (this might even be an
appropriate approach to consider for some existing standard library
modules, such as ssl, unittest, idlelib or distutils).
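
To make that concrete, ensurepip already demonstrates the pattern: the
pip it bundles carries its own version number, decoupled from the
CPython release it ships with. A minimal sketch using only the public
ensurepip API:

```python
# The bundled-package pattern described above: ensurepip ships with
# CPython, but the pip wheel it bundles is versioned independently
# and can later be upgraded with the normal packaging tools.
import ensurepip
import sys

# Version of the pip bundled with this CPython, not of CPython itself.
print("bundled pip:", ensurepip.version())
print("interpreter:", sys.version.split()[0])
```

Independently versioned modules like ssl or unittest would follow the
same shape: the interpreter provides a default copy, and the package
manager owns its upgrade cadence.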

Regards,
Nick.

[1] https://help.openshift.com/hc/en-us/articles/202399790-How-to-request-resources-for-Non-Profit-Open-Source-or-Educational-Institutions

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From donald at stufft.io  Wed May 27 16:28:19 2015
From: donald at stufft.io (Donald Stufft)
Date: Wed, 27 May 2015 10:28:19 -0400
Subject: [Python-Dev] Preserving the definition order of class
 namespaces.
In-Reply-To: <20150527101622.10a36d75@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
Message-ID: <etPan.5565d483.2db5806e.1296e@Draupnir.home>

On May 27, 2015 at 4:18:11 AM, Antoine Pitrou (solipsis at pitrou.net) wrote:
On Mon, 25 May 2015 17:30:02 -0700  
Larry Hastings <larry at hastings.org> wrote:  
>  
> So, in all three cases it's work that's been under development for a  
> while. These people did this work out of the kindness of their hearts,  
> to make Python better. As a community we want to encourage that and  
> make sure these developers know we appreciate their efforts. These  
> people would be happier if the work shipped in 3.5 as opposed to 3.6 so  
> it got into users' hands sooner.  

I second that sentiment. But it strikes me that we're doing this  
because our release frequency is completely ill-suited. If we had  
feature releases, say, every 6 or 9 months, the problem wouldn't really  
exist in the first place. Exceptions granted by the RM only tackle a  
very small portion of the problem, because the long delay before an  
accepted patch lands in an official release *still* frustrates everyone,  
and the unpredictability of exceptions only makes things *more*  
frustrating for most players.  

Of course, it was pretty much shut down before:  
https://mail.python.org/pipermail/python-dev/2012-January/115619.html  

Regards  

Antoine.  


_______________________________________________  
Python-Dev mailing list  
Python-Dev at python.org  
https://mail.python.org/mailman/listinfo/python-dev  
Unsubscribe: https://mail.python.org/mailman/options/python-dev/donald%40stufft.io  


I'm in favor of releasing more often as well.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/e3302033/attachment.html>

From barry at python.org  Wed May 27 16:29:48 2015
From: barry at python.org (Barry Warsaw)
Date: Wed, 27 May 2015 10:29:48 -0400
Subject: [Python-Dev] Preserving the definition order of class
 namespaces.
In-Reply-To: <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
Message-ID: <20150527102948.498f7031@anarchist.wooz.org>

On May 27, 2015, at 06:34 PM, Nick Coghlan wrote:

>I'd actually like to pursue a more nuanced view of what's permitted in
>maintenance releases, based on a combination of the language moratorium
>PEP, and an approach inspired by PEP 466, requiring that every feature
>added in a maintenance release be detectable through an attribute check on
>a module (with corresponding support in dependency checking tools).

PEP 466 and Python 2.7 are a special case.  I wouldn't want to adopt such
tactics in normal Python 3 releases.

Imagine the nightmare of some poor library author who wants to make sure their
package works with Python 3.6.  They're faced with a source release of 3.6.5,
but 3.6.3 in Ubuntu, 3.6.4 in Fedora, 3.6.2 in Debian, and users of all
stripes of patch releases on Windows and OS X.  Now they have to pepper their
code with attribute tests just to support "Python 3.6".  In fact, claiming
support for Python 3.6 actually doesn't convey enough information to their
users.

Sure, we can limit this to new features, but even new features introduce risk.
We've decided to accept this risk for Python 2.7 for good and important
reasons, but we shouldn't do the same for ongoing normal releases.
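
To spell out what that peppering looks like, each feature would need its
own guard (a sketch of the pattern only; `create_default_context` here
is a real ssl attribute, used purely as an illustration):

```python
# The kind of per-feature attribute check library authors would need
# to scatter through their code if point releases could add features:
# "supports Python 3.6" alone would no longer say what's available.
import ssl

if hasattr(ssl, "create_default_context"):
    # Feature present on this interpreter build.
    context = ssl.create_default_context()
else:
    # No sane fallback exists on builds lacking the attribute.
    raise RuntimeError("TLS defaults helper not available on this build")
```

Multiply that by every feature added in every maintenance release, and
the support matrix becomes unmanageable.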

>The problem with simply speeding up the release cycle without constraining
>the interim releases in some way is that it creates significant pain for
>alternate implementations and for downstream redistributors (many of whom
>are still dealing with the fallout of the Python 3 transition).

I'm not convinced that relaxing the maintenance release constraints lessens
the pain for anybody.

Cheers,
-Barry


From donald at stufft.io  Wed May 27 16:37:19 2015
From: donald at stufft.io (Donald Stufft)
Date: Wed, 27 May 2015 10:37:19 -0400
Subject: [Python-Dev] Preserving the definition order of class
 namespaces.
In-Reply-To: <20150527102948.498f7031@anarchist.wooz.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
 <20150527102948.498f7031@anarchist.wooz.org>
Message-ID: <etPan.5565d69f.79466355.1296e@Draupnir.home>



On May 27, 2015 at 10:32:47 AM, Barry Warsaw (barry at python.org) wrote:
> On May 27, 2015, at 06:34 PM, Nick Coghlan wrote:
> 
> >I'd actually like to pursue a more nuanced view of what's permitted in
> >maintenance releases, based on a combination of the language moratorium
> >PEP, and an approach inspired by PEP 466, requiring that every feature
> >added in a maintenance release be detectable through an attribute check on
> >a module (with corresponding support in dependency checking tools).
> 
> PEP 466 and Python 2.7 are a special case. I wouldn't want to adopt such
> tactics in normal Python 3 releases.
> 
> Imagine the nightmare of some poor library author who wants to make sure their
> package works with Python 3.6. They're faced with a source release of 3.6.5,
> but 3.6.3 in Ubuntu, 3.6.4 in Fedora, 3.6.2 in Debian, and users of all
> stripes of patch releases on Windows and OS X. Now they have to pepper their
> code with attribute tests just to support "Python 3.6". In fact, claiming
> support for Python 3.6 actually doesn't convey enough information to their
> users.
> 
> Sure, we can limit this to new features, but even new features introduce risk.
> We've decided to accept this risk for Python 2.7 for good and important
> reasons, but we shouldn't do the same for ongoing normal releases.
> 
> >The problem with simply speeding up the release cycle without constraining
> >the interim releases in some way is that it creates significant pain for
> >alternate implementations and for downstream redistributors (many of whom
> >are still dealing with the fallout of the Python 3 transition).
> 
> I'm not convinced that relaxing the maintenance release constraints lessens
> the pain for anybody.
> 
> Cheers,
> -Barry
> 
> 


I think it increases the pain for everyone TBH.

I find the backport we did to Python 2.7 pretty crummy, but I think the only
thing worse than backporting to a random patch release of 2.7 was not making
it available to the 2.x line at all. I think that it would have been better to
release it as a 2.8; however, that wasn't a hill I was willing to die on
personally. Going forward I think we should either stick to the slower
release schedule and just say it is what it is, or release more often.
The in-between is killer.

--- 
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From stephen at xemacs.org  Wed May 27 17:04:52 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Thu, 28 May 2015 00:04:52 +0900
Subject: [Python-Dev] Preserving the definition order of
	class	namespaces.
In-Reply-To: <20150527110234.5a0490a1@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <CADiSq7enL44ujRpDVwZa5fbr5cWaVZ=OOX-_CxHZhU6x9k1zhQ@mail.gmail.com>
 <20150527110234.5a0490a1@fsol>
Message-ID: <87vbfef4dn.fsf@uwakimon.sk.tsukuba.ac.jp>

Antoine Pitrou writes:

 > For some reason it sounds like we should be altruistic towards
 > people who are not :-)

There's always a question of how far to go, of course, but one of the
things I like about this community is the attention the developers
give to others' pain.  In that sense, I'm definitely +1 on altruism.


From newinferno at gmail.com  Wed May 27 19:04:16 2015
From: newinferno at gmail.com (Uladzimir Kryvian)
Date: Wed, 27 May 2015 20:04:16 +0300
Subject: [Python-Dev] Embedded Python. Debug Version and _ctypes
Message-ID: <CAFYy4bjp7025RzReMMc6-nrXUARqf6V9Q+FzWmdvrw9yKmvHfg@mail.gmail.com>

Hi!
I'm trying to use embedding of Python in my program.

Simple C-program, compiled in Debug, that uses py-script that just
imports "ctypes" gives me an error about "no module named "_ctypes".

How to compile python lib in Visual Studio statically with ctypes
support? Or how to use shared ctypes lib in debug mode?

From brett at python.org  Wed May 27 20:17:02 2015
From: brett at python.org (Brett Cannon)
Date: Wed, 27 May 2015 18:17:02 +0000
Subject: [Python-Dev] Embedded Python. Debug Version and _ctypes
In-Reply-To: <CAFYy4bjp7025RzReMMc6-nrXUARqf6V9Q+FzWmdvrw9yKmvHfg@mail.gmail.com>
References: <CAFYy4bjp7025RzReMMc6-nrXUARqf6V9Q+FzWmdvrw9yKmvHfg@mail.gmail.com>
Message-ID: <CAP1=2W7_1wEoQVd17FOVsGFkfqRRB6cK1RZH4Ltygn0VxQ=f9w@mail.gmail.com>

This mailing list is for the development *of* Python, not *with* it. The
best place to ask this would be on python-list at python.org.

On Wed, May 27, 2015 at 1:08 PM Uladzimir Kryvian <newinferno at gmail.com>
wrote:

> Hi!
> I'm trying to use embedding of Python in my program.
>
> Simple C-program, compiled in Debug, that uses py-script that just
> imports "ctypes" gives me an error about "no module named "_ctypes".
>
> How to compile python lib in Visual Studio statically with ctypes
> support? Or how to use shared ctypes lib in debug mode?
>

From tjreedy at udel.edu  Wed May 27 23:15:39 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 27 May 2015 17:15:39 -0400
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
In-Reply-To: <20150527101622.10a36d75@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
Message-ID: <mk5c6e$ruv$1@ger.gmane.org>

On 5/27/2015 4:16 AM, Antoine Pitrou wrote:

> I second that sentiment. But it strikes me that we're doing this
> because our release frequency is completely ill-suited. If we had
> feature releases, say, every 6 or 9 months, the problem wouldn't really
> exist in the first place.

How about a feature release once a year, on a schedule we choose as best 
for us.  For instance, aim for end of May, with a bugfix release and 
alpha of the next version mid to late Sept after summer vacations. 
Encourage linux distributions to include the new version in their fall 
and spring releases.

Features that miss a beta1 deadline would then be available to early 
adopters 4 months later.  In general, I think alpha releases have 
usually been about as good as bugfix releases.  High test coverage and 
green buildbots help. With more frequent releases, there should be fewer 
major new features and perhaps fewer alpha and beta releases needed.

> Exceptions granted by the RM only tackle a
> very small portion of the problem, because the long delay before an
> accepted patch lands in an official release *still* frustrates everyone,
> and the unpredictability of exceptions only makes things *more*
> frustrating for most players.

-- 
Terry Jan Reedy


From barry at python.org  Wed May 27 23:34:26 2015
From: barry at python.org (Barry Warsaw)
Date: Wed, 27 May 2015 17:34:26 -0400
Subject: [Python-Dev] time-based releases (was Re: Preserving the definition
 order of class namespaces.)
In-Reply-To: <mk5c6e$ruv$1@ger.gmane.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
Message-ID: <20150527173426.3a829b78@anarchist.wooz.org>

On May 27, 2015, at 05:15 PM, Terry Reedy wrote:

>How about a feature release once a year, on a schedule we choose as best for
>us.

We discussed timed releases ages ago and they were rejected by the majority.
Time-based releases can make a lot of sense, especially if the interval is
short enough.  If a feature doesn't make it into the May 2015 release, oh
well, there will be another one in X months.

Ubuntu has had a lot of success with X=6 time-based releases.  That's not to
say there aren't plenty of logistics to work out, or that they are a panacea,
or even that they would work with an all-volunteer developer community.  But
time-based releases do have advantages too, and maybe those would outweigh the
disadvantages for Python at this point.

Cheers,
-Barry

From guido at python.org  Wed May 27 23:45:52 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 27 May 2015 14:45:52 -0700
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <20150527173426.3a829b78@anarchist.wooz.org>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org> <20150527173426.3a829b78@anarchist.wooz.org>
Message-ID: <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>

On Wed, May 27, 2015 at 2:34 PM, Barry Warsaw <barry at python.org> wrote:

> On May 27, 2015, at 05:15 PM, Terry Reedy wrote:
>
> >How about a feature release once a year, on a schedule we choose as best
> for
> >us.
>
> We discussed timed releases ages ago and they were rejected by the
> majority.
> Time-based releases can make a lot of sense, especially if the interval is
> short enough.  If a feature doesn't make it into the May 2015 release, oh
> well, there will be another one in X months.
>
> Ubuntu has had a lot of success with X=6 time-based releases.  That's not
> to
> say there aren't plenty of logistics to work out, or that they are a
> panacea,
> or even that they would work with an all-volunteer developer community.
> But
> time-based releases do have advantages too, and maybe those would outweigh
> the
> disadvantages for Python at this point.
>

This favors developers (who want to see their feature launched) and early
adopters (who want to try out shiny new features).

The current system (release every 18-24 months) was established when users
who were decidedly not early adopters started complaining about the
breakneck pace of Python releases.

It's quite possible that the current crop of Python users are less averse
to change (although the conversion rate to Python 3 seems to indicate
otherwise). But it's also hard to compare Ubuntu (which is a roll-up of
thousands of different open-source projects, with a large variety of
different release schedules) to Python (which is a single,
centrally-controlled code base).

What do other projects that are at most 1 order of magnitude smaller or
larger than Python do? E.g. the Linux kernel, or MySQL, or Qt?

-- 
--Guido van Rossum (python.org/~guido)

From solipsis at pitrou.net  Wed May 27 23:49:55 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 27 May 2015 23:49:55 +0200
Subject: [Python-Dev] Preserving the definition order of class
	namespaces.
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
Message-ID: <20150527234955.65bb1dca@fsol>

On Wed, 27 May 2015 17:15:39 -0400
Terry Reedy <tjreedy at udel.edu> wrote:
> On 5/27/2015 4:16 AM, Antoine Pitrou wrote:
> 
> > I second that sentiment. But it strikes me that we're doing this
> > because our release frequency is completely ill-suited. If we had
> > feature releases, say, every 6 or 9 months, the problem wouldn't really
> > exist in the first place.
> 
> How about a feature release once a year, on a schedule we choose as best 
> for us.  For instance, aim for end of May, with a bugfix release and 
> alpha of the next version mid to late Sept after summer vacations. 
> Encourage linux distributions to include the new version in their fall 
> and spring releases.
> 
> Features that miss a beta1 deadline would then be available to early 
> adopters 4 months later.  In general, I think alpha releases have 
> usually been about as good as bugfix releases.

I don't believe alpha releases are attractive for most users.  People
don't want to risk losing time over bugs that may be caused by
regressions in Python.  Regardless of their *actual* stability or
quality, releases labelled "alpha" are perceived as high-risk.

Regards

Antoine.



From ncoghlan at gmail.com  Thu May 28 00:31:45 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 08:31:45 +1000
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
Message-ID: <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>

On 28 May 2015 07:48, "Guido van Rossum" <guido at python.org> wrote:
>
> On Wed, May 27, 2015 at 2:34 PM, Barry Warsaw <barry at python.org> wrote:
>>
>> On May 27, 2015, at 05:15 PM, Terry Reedy wrote:
>>
>> >How about a feature release once a year, on a schedule we choose as
best for
>> >us.
>>
>> We discussed timed releases ages ago and they were rejected by the
majority.
>> Time-based releases can make a lot of sense, especially if the interval
is
>> short enough.  If a feature doesn't make it into the May 2015 release, oh
>> well, there will be another one in X months.
>>
>> Ubuntu has had a lot of success with X=6 time-based releases.  That's
not to
>> say there aren't plenty of logistics to work out, or that they are a
panacea,
>> or even that they would work with an all-volunteer developer community.
But
>> time-based releases do have advantages too, and maybe those would
outweigh the
>> disadvantages for Python at this point.
>
>
> This favors developers (who want to see their feature launched) and early
adopters (who want to try out shiny new features).
>
> The current system (release every 18-24 months) was established when
users who were decidedly not early adopters started complaining about the
breakneck pace of Python releases.
>
> It's quite possible that the current crop of Python users are less averse
to change (although the conversion rate to Python 3 seems to indicate
otherwise). But it's also hard to compare Ubuntu (which is a roll-up of
thousands of different open-source projects, with a large variety of
different release schedules) to Python (which is a single,
centrally-controlled code base).
>
> What do other projects that are at most 1 order of magnitude smaller or
larger than Python do? E.g. the Linux kernel, or MySQL, or Qt?

The Linux kernel iterates fairly rapidly, with redistributors agreeing on
which versions to target for long term support.

Decent overview here: https://ltsi.linuxfoundation.org/what-is-ltsi

I don't think the comparison really works though, because it's not just
redistributors that are affected, it's alternate implementations of the
language specification, as well as educational materials.

If we got to merge gating on trunk, such that it was feasible to release
3.6dev1 within 6 months of the 3.5 release, then that might be a way of
satisfying both crowds, since it would be a matter of speeding up the
pre-release cycle and increasing the stability expectations for
pre-releases, minimising the ripple effects on other parts of the ecosystem.

Regards,
Nick.

From ncoghlan at gmail.com  Thu May 28 00:48:11 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 08:48:11 +1000
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
Message-ID: <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>

On 28 May 2015 08:31, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
>
>
> On 28 May 2015 07:48, "Guido van Rossum" <guido at python.org> wrote:
> >
> > On Wed, May 27, 2015 at 2:34 PM, Barry Warsaw <barry at python.org> wrote:
> >>
> >> On May 27, 2015, at 05:15 PM, Terry Reedy wrote:
> >>
> >> >How about a feature release once a year, on a schedule we choose as
best for
> >> >us.
> >>
> >> We discussed timed releases ages ago and they were rejected by the
majority.
> >> Time-based releases can make a lot of sense, especially if the
interval is
> >> short enough.  If a feature doesn't make it into the May 2015 release,
oh
> >> well, there will be another one in X months.
> >>
> >> Ubuntu has had a lot of success with X=6 time-based releases.  That's
not to
> >> say there aren't plenty of logistics to work out, or that they are a
panacea,
> >> or even that they would work with an all-volunteer developer
community.  But
> >> time-based releases do have advantages too, and maybe those would
outweigh the
> >> disadvantages for Python at this point.
> >
> >
> > This favors developers (who want to see their feature launched) and
early adopters (who want to try out shiny new features).
> >
> > The current system (release every 18-24 months) was established when
users who were decidedly not early adopters started complaining about the
breakneck pace of Python releases.
> >
> > It's quite possible that the current crop of Python users are less
averse to change (although the conversion rate to Python 3 seems to
indicate otherwise). But it's also hard to compare Ubuntu (which is a
roll-up of thousands of different open-source projects, with a large
variety of different release schedules) to Python (which is a single,
centrally-controlled code base).
> >
> > What do other projects that are at most 1 order of magnitude smaller or
larger than Python do? E.g. the Linux kernel, or MySQL, or Qt?
>
> The Linux kernel iterates fairly rapidly, with redistributors agreeing
> on which versions to target for long term support.
>
> Decent overview here: https://ltsi.linuxfoundation.org/what-is-ltsi
>
> I don't think the comparison really works though, because it's not
> just redistributors that are affected, it's alternate implementations
> of the language specification, as well as educational materials.
>
> If we got to merge gating on trunk, such that it was feasible to
> release 3.6dev1 within 6 months of the 3.5 release, then that might be
> a way of satisfying both crowds, since it would be a matter of
> speeding up the pre-release cycle and increasing the stability
> expectations for pre-releases, minimising the ripple effects on other
> parts of the ecosystem.

I just remembered one of the biggest causes of pain: Windows binaries for
projects that aren't using the stable ABI. It used to regularly take 6+
months for the Windows ecosystem to catch up after each 2.x release.

Since community project version compatibility time spans tend to be on the
order of 3 years (exceptional cases like 2.6 & 2.7 aside), we need to
consider:

* are we happy increasing that support matrix from 2 versions to 6?
* if we tell third party community projects to only test against and
provide prebuilt binaries for the less frequent LTS releases, how useful
would they really be over switching from the current alpha/beta/rc
pre-release cycle to a time based merge gated dev release cycle followed by
a beta/rc cycle?

After all, the real difference between the alphas and the final releases
isn't about anything *we* do, it's about the testing *other people* do that
picks up gaps in our test coverage. A gated trunk makes it more feasible
for other projects to do continuous integration against it.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/60735058/attachment.html>

From srinivas.vamsi.parasa at intel.com  Thu May 28 02:17:19 2015
From: srinivas.vamsi.parasa at intel.com (Parasa, Srinivas Vamsi)
Date: Thu, 28 May 2015 00:17:19 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
Message-ID: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>

Hi All,

This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.

Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).

Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).

Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.

Python 2.7.10 (original) vs Computed Goto performance
(Delta % for each of the three rigorous runs, and the average)

Benchmark              Run #1 %   Run #2 %   Run #3 %   Avg. %
iterative_count           24.48      24.36      23.78     24.2
unpack_sequence           19.06      18.47      19.48     19.0
slowspitfire              14.36      13.41      16.65     14.8
threaded_count            15.85      13.43      13.93     14.4
pystone                   10.68      11.67      11.08     11.1
nbody                     10.25       8.93       9.28      9.5
go                         7.96       8.76       7.69      8.1
pybench                    6.3        6.8        7.2       6.8
spectral_norm              5.49       9.37       4.62      6.5
float                      6.09       6.2        6.96      6.4
richards                   6.19       6.41       6.42      6.3
slowunpickle               6.37       8.78       3.55      6.2
json_dump_v2               1.96      12.53       3.57      6.0
call_simple                6.37       5.91       3.92      5.4
chaos                      4.57       5.34       3.85      4.6
call_method_slots          2.63       3.27       7.71      4.5
telco                      5.18       1.83       6.47      4.5
simple_logging             3.48       1.57       7.4       4.2
call_method                2.61       5.4        3.88      4.0
chameleon                  2.03       6.26       3.2       3.8
fannkuch                   3.89       3.19       4.39      3.8
silent_logging             4.33       3.07       3.39      3.6
slowpickle                 5.72      -1.12       6.06      3.6
2to3                       2.99       3.6        3.45      3.3
etree_iterparse            3.41       2.51       3         3.0
regex_compile              3.44       2.48       2.84      2.9
mako_v2                    2.14       1.29       5.22      2.9
meteor_contest             2.01       2.2        3.88      2.7
django                     6.68      -1.23       2.56      2.7
formatted_logging          1.97       5.82      -0.11      2.6
hexiom2                    2.83       2.1        2.55      2.5
django_v2                  1.93       2.53       2.92      2.5
etree_generate             2.38       2.13       2.51      2.3
mako                      -0.3        9.66      -3.11      2.1
bzr_startup                0.35       1.97       3         1.8
etree_process              1.84       1.01       1.9       1.6
spambayes                  1.76       0.76       0.48      1.0
regex_v8                   1.96      -0.66       1.63      1.0
html5lib                   0.83       0.72       0.97      0.8
normal_startup             1.41       0.39       0.24      0.7
startup_nosite             1.2        0.41       0.42      0.7
etree_parse                0.24       0.9        0.79      0.6
json_load                  1.38       0.56      -0.25      0.6
pidigits                   0.45       0.33       0.28      0.4
hg_startup                 0.32       2.07      -1.41      0.3
rietveld                   0.05       0.91      -0.43      0.2
tornado_http               2.34      -0.92      -1.27      0.1
call_method_unknown        0.72       1.26      -1.85      0.0
raytrace                  -0.35      -0.75       0.94     -0.1
regex_effbot               1.97      -1.18      -2.57     -0.6
fastunpickle              -1.65       0.5       -0.88     -0.7
nqueens                   -2.24      -1.53      -0.81     -1.5
fastpickle                -0.74       1.98      -6.26     -1.7


Thanks,
Vamsi

------------------------------------------------------------------------------------------------------------------------------------------------------------
From: gvanrossum at gmail.com [mailto:gvanrossum at gmail.com] On Behalf Of Guido van Rossum
Sent: Tuesday, May 19, 2015 1:59 PM
To: Cohn, Robert S
Cc: R. David Murray (r.david.murray at murrayandwalker.com)
Subject: Re: meeting at PyCon

Hi Robert and David,
I just skimmed that thread. There were a lot of noises about backporting it to 2.7 but the final message on the topic, by Antoine, claimed it was too late for 2.7. However, that was before we had announced the EOL extension of 2.7 till 2020, and perhaps we were also in denial about 3.x uptake vs. 2.x. So I think it's definitely worth bringing this up. I would start with a post on python-dev linking to the source code for your patch, and adding a message to the original tracker issue too (without reopening it though -- just so the people who were on the bug will be pinged about it).
Because of backwards compatibility with previous 2.7.x releases, it's very important that the patch not break anything -- in particular this means you can't add opcodes or change their specification. You will also undoubtedly be asked to test this on a variety of platforms 32-bit and 64-bit that people care about. But I'm sure you're expecting all that. :-)

You might also check with Benjamin Peterson, who is the 2.7 release manager. I think he just announced 2.7.10, so it's too late for that, but I assume we'll keep doing 2.7.x releases until 2020.
Good luck,

--Guido

PS. I am assuming you are contributing this under a PSF-accepted license, e.g. Apache 2.0, otherwise it's an automatic nogo.

On Tue, May 19, 2015 at 9:33 AM, Cohn, Robert S <robert.s.cohn at intel.com> wrote:
Hi Guido,


When we met for lunch at PyCon, I asked if performance related patches would be OK for Python 2.x. My understanding is that you thought it was possible if it did not create a maintainability problem. We have an example now: a 2.7 patch for computed goto, based on the implementation in Python 3 (http://bugs.python.org/issue4753). It increases performance by up to 10% across a wide range of workloads.



As I mentioned at lunch, we hired David Murray's company, and he is guiding Intel through the development process for CPython. David and I thought it would be good to run this by you before raising the issue on python-dev. Do you have a specific concern about this patch, or a more general concern about performance patches to 2.7? Thanks.



Robert
--------

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/d348f039/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: cgoto_py2710_hg_final.patch
Type: application/octet-stream
Size: 49186 bytes
Desc: cgoto_py2710_hg_final.patch
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/d348f039/attachment-0001.obj>
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: Instructions_cgoto_patch.txt
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/d348f039/attachment-0001.txt>

From ncoghlan at gmail.com  Thu May 28 03:31:24 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 11:31:24 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
Message-ID: <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>

On 28 May 2015 at 10:17, Parasa, Srinivas Vamsi <
srinivas.vamsi.parasa at intel.com> wrote:

>  Hi All,
>
>
>
> This is Vamsi from Server Scripting Languages Optimization team at Intel
> Corporation.
>
>
>
> Would like to submit a request to enable the computed goto based dispatch
> in Python 2.x (which happens to be enabled by default in Python 3 given its
> performance benefits on a wide range of workloads). We talked about this
> patch with Guido and he encouraged us to submit a request on Python-dev
> (email conversation with Guido shown at the bottom of this email).
>

+1 from me, for basically the same reasons Guido gives: Python 2.7 is going
to be with us for a long time, and this particular change shouldn't have
any externally visible impacts at either an ABI or API level.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/75ab0af0/attachment.html>

From raymond.hettinger at gmail.com  Thu May 28 04:27:49 2015
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Wed, 27 May 2015 19:27:49 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>
Message-ID: <C3371244-E7BA-4C64-BB75-51C78BAEFD06@gmail.com>


> On May 27, 2015, at 6:31 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> On 28 May 2015 at 10:17, Parasa, Srinivas Vamsi <srinivas.vamsi.parasa at intel.com> wrote:
> Hi All,
> 
>  
> 
> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
> 
>  
> 
> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
> 
> 
> +1 from me, for basically the same reasons Guido gives: Python 2.7 is going to be with us for a long time, and this particular change shouldn't have any externally visible impacts at either an ABI or API level.

+1 from me as well.  We probably should have done this long ago.


Raymond Hettinger

From ned at nedbatchelder.com  Thu May 28 04:51:09 2015
From: ned at nedbatchelder.com (Ned Batchelder)
Date: Wed, 27 May 2015 22:51:09 -0400
Subject: [Python-Dev] Issue 24285: regression for importing extensions in
	packages
Message-ID: <5566829D.60907@nedbatchelder.com>

This issue has been fixed, but a day or two late for 3.5b1.  It prevents 
loading the coverage.py extension.  It'd be great to get a new beta 
release soon. :)

--Ned.

From ncoghlan at gmail.com  Thu May 28 05:02:04 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 13:02:04 +1000
Subject: [Python-Dev] Issue 24285: regression for importing extensions
	in packages
In-Reply-To: <5566829D.60907@nedbatchelder.com>
References: <5566829D.60907@nedbatchelder.com>
Message-ID: <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>

On 28 May 2015 at 12:51, Ned Batchelder <ned at nedbatchelder.com> wrote:
> This issue has been fixed, but a day or two late for 3.5b1.

Aye, we only found out about the missing test case via feedback *on*
the beta. We had never needed to worry about it before, but it turns
out all our extension modules in the standard library are top level
modules and we didn't previously have an explicit test for the
submodule case :(

> It prevents
> loading the coverage.py extension.  It'd be great to get a new beta release
> soon. :)

Until your email, I hadn't fully thought through the consequences, but
the bug is actually going to block a *lot* of potential testing of the
beta release - anything that requires a C extension module that isn't
a top level module isn't going to work with 3.5b1.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From yselivanov.ml at gmail.com  Thu May 28 05:07:34 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 27 May 2015 23:07:34 -0400
Subject: [Python-Dev] Issue 24285: regression for importing extensions
 in packages
In-Reply-To: <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
References: <5566829D.60907@nedbatchelder.com>
 <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
Message-ID: <55668676.5020703@gmail.com>

On 2015-05-27 11:02 PM, Nick Coghlan wrote:
>> >It prevents
>> >loading the coverage.py extension.  It'd be great to get a new beta release
>> >soon. :)
> Until your email, I hadn't fully thought through the consequences, but
> the bug is actually going to block a*lot*  of potential testing of the
> beta release - anything that requires a C extension module that isn't
> a top level module isn't going to work with 3.5b1.

It would also be great to release the new beta shortly with
the new OrderedDict implemented in C and math.isclose().

Yury

From stefan_ml at behnel.de  Thu May 28 05:41:44 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Thu, 28 May 2015 05:41:44 +0200
Subject: [Python-Dev] Issue 24285: regression for importing extensions
	in packages
In-Reply-To: <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
References: <5566829D.60907@nedbatchelder.com>
 <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
Message-ID: <mk62pp$gmu$1@ger.gmane.org>

Nick Coghlan schrieb am 28.05.2015 um 05:02:
> On 28 May 2015 at 12:51, Ned Batchelder wrote:
>> This issue has been fixed, but a day or two late for 3.5b1.
> 
> Aye, we only found out about the missing test case via feedback *on*
> the beta. We had never needed to worry about it before, but it turns
> out all our extension modules in the standard library are top level
> modules and we didn't previously have an explicit test for the
> submodule case :(
> 
>> It prevents
>> loading the coverage.py extension.  It'd be great to get a new beta release
>> soon. :)
> 
> Until your email, I hadn't fully thought through the consequences, but
> the bug is actually going to block a *lot* of potential testing of the
> beta release - anything that requires a C extension module that isn't
> a top level module isn't going to work with 3.5b1.

+1 for a quick beta 2 from me, too (obviously). I've already seen a bug
report because a Cython compiled package doesn't work in Py3.5. Having to
tell them to wait a while for beta 2 is annoying and discouraging for early
testers.

Stefan



From larry at hastings.org  Thu May 28 06:30:56 2015
From: larry at hastings.org (Larry Hastings)
Date: Wed, 27 May 2015 21:30:56 -0700
Subject: [Python-Dev] Issue 24285: regression for importing extensions
 in packages
In-Reply-To: <5566829D.60907@nedbatchelder.com>
References: <5566829D.60907@nedbatchelder.com>
Message-ID: <55669A00.8060704@hastings.org>

On 05/27/2015 07:51 PM, Ned Batchelder wrote:
> This issue has been fixed, but a day or two late for 3.5b1.  It 
> prevents loading the coverage.py extension.  It'd be great to get a 
> new beta release soon. :)

http://legacy.python.org/dev/peps/pep-0478/


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/63b687d6/attachment.html>

From ncoghlan at gmail.com  Thu May 28 06:54:16 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 14:54:16 +1000
Subject: [Python-Dev] Issue 24285: regression for importing extensions
	in packages
In-Reply-To: <55669A00.8060704@hastings.org>
References: <5566829D.60907@nedbatchelder.com> <55669A00.8060704@hastings.org>
Message-ID: <CADiSq7cvisbpmw6cwYWUu52wALpoBSS7gbr0ryUHLybGoHn-3Q@mail.gmail.com>

On 28 May 2015 at 14:30, Larry Hastings <larry at hastings.org> wrote:
> On 05/27/2015 07:51 PM, Ned Batchelder wrote:
>
> This issue has been fixed, but a day or two late for 3.5b1.  It prevents
> loading the coverage.py extension.  It'd be great to get a new beta release
> soon. :)
>
>
> http://legacy.python.org/dev/peps/pep-0478/

Aye, it's the long gap from May 24 (3.5b1) to July 5 (3.5b2) that we
were hoping to see shortened if it was possible to get the release
team together.

The fact folks can't currently import extension modules that aren't at
top level is unfortunately going to limit the amount of 3.5b1 testing
the community can do in those 6 weeks :(

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From tjreedy at udel.edu  Thu May 28 07:01:20 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 28 May 2015 01:01:20 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>
Message-ID: <mk67fk$gdk$1@ger.gmane.org>

On 5/27/2015 9:31 PM, Nick Coghlan wrote:

> +1 from me, for basically the same reasons Guido gives: Python 2.7 is
> going to be with us for a long time, and this particular change
> shouldn't have any externally visible impacts at either an ABI or API level.

Immediately after a release, giving the patch plenty of time to be
tested, seems like a good time.

-- 
Terry Jan Reedy


From njs at pobox.com  Thu May 28 07:19:03 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 27 May 2015 22:19:03 -0700
Subject: [Python-Dev] Issue 24285: regression for importing extensions
	in packages
In-Reply-To: <CADiSq7cvisbpmw6cwYWUu52wALpoBSS7gbr0ryUHLybGoHn-3Q@mail.gmail.com>
References: <5566829D.60907@nedbatchelder.com> <55669A00.8060704@hastings.org>
 <CADiSq7cvisbpmw6cwYWUu52wALpoBSS7gbr0ryUHLybGoHn-3Q@mail.gmail.com>
Message-ID: <CAPJVwBmCi9_kbOC5MG_KQJ0fq98wf8AwRvLcFbC4izyH8OiEMQ@mail.gmail.com>

On Wed, May 27, 2015 at 9:54 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>
> On 28 May 2015 at 14:30, Larry Hastings <larry at hastings.org> wrote:
> > On 05/27/2015 07:51 PM, Ned Batchelder wrote:
> >
> > This issue has been fixed, but a day or two late for 3.5b1.  It prevents
> > loading the coverage.py extension.  It'd be great to get a new beta release
> > soon. :)
> >
> >
> > http://legacy.python.org/dev/peps/pep-0478/
>
> Aye, it's the long gap from May 24 (3.5b1) to July 5 (3.5b2) that we
> were hoping to see shortened if it was possible to get the release
> team together.
>
> The fact folks can't currently import extension modules that aren't at
> top level is unfortunately going to limit the amount of 3.5b1 testing
> the community can do in those 6 weeks :(

Just chiming in to confirm that yep, numpy (and thus the whole
numerical stack) are also missing on 3.5b1. Of course this was filed
as a bug report on numpy :-):

   https://github.com/numpy/numpy/issues/5915

I guess the good news is that people are in fact testing the beta, but
it sounds like pretty much the only code running on b1 is pure Python
code whose entire dependency chain is also pure Python code.

On the upside, compatibility between PyPy and CPython has improved greatly ;-).

-n

-- 
Nathaniel J. Smith -- http://vorpus.org

From larry at hastings.org  Thu May 28 07:35:24 2015
From: larry at hastings.org (Larry Hastings)
Date: Wed, 27 May 2015 22:35:24 -0700
Subject: [Python-Dev] Issue 24285: regression for importing extensions
 in packages
In-Reply-To: <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
References: <5566829D.60907@nedbatchelder.com>
 <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
Message-ID: <5566A91C.4000502@hastings.org>

On 05/27/2015 08:02 PM, Nick Coghlan wrote:
> On 28 May 2015 at 12:51, Ned Batchelder <ned at nedbatchelder.com> wrote:
>> This issue has been fixed, but a day or two late for 3.5b1.
> Aye, we only found out about the missing test case via feedback *on*
> the beta. We had never needed to worry about it before, but it turns
> out all our extension modules in the standard library are top level
> modules and we didn't previously have an explicit test for the
> submodule case :(

Well, certainly this sounds like something that needs to go into the 
regression test suite.  Can someone create the issue?

... and the patch?


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/0615dcff/attachment.html>

From greg at krypto.org  Thu May 28 07:46:59 2015
From: greg at krypto.org (Gregory P. Smith)
Date: Thu, 28 May 2015 05:46:59 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <mk67fk$gdk$1@ger.gmane.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CADiSq7fAhfqn=k-AWss+XMxm6ZXP3fWFRacMFC5EqxARrryGmA@mail.gmail.com>
 <mk67fk$gdk$1@ger.gmane.org>
Message-ID: <CAGE7PNKacfkRoAoFkLCMS7ovQErgXYoh6VO63XH0FFdiE18wCw@mail.gmail.com>

Why now?  We intentionally decided not to do this for 2.7 in the past
because it was too late for the release cutoff.

Has anyone benchmarked compiling Python in profile-opt mode vs having the
computed goto patch?  I'd *expect* the benefits to be roughly the same.
Has this been compared to that?  (Anyone not compiling their Python
interpreter in profile-opt mode is doing themselves a major disservice.)

Does it show noteworthy improvements even when used with a profile-opt
build using a PROFILE_TASK of regrtest.py with the same test exclusion
list as the Debian python2.7 package?

Could you please rerun the benchmarks including the profile-opt build
with and without the patch for comparison.

-gps

PS I recommend attaching the up to date patch against 2.7.10 to issue4753.
That is where anyone will go looking for it, not buried in a mailing list
archive.

On Wed, May 27, 2015 at 10:01 PM Terry Reedy <tjreedy at udel.edu> wrote:

> On 5/27/2015 9:31 PM, Nick Coghlan wrote:
>
> > +1 from me, for basically the same reasons Guido gives: Python 2.7 is
> > going to be with us for a long time, and this particular change
> > shouldn't have any externally visible impacts at either an ABI or API
> level.
>
> Immediately after a release, giving the patch plenty of time to be
> tested, seems like a good time.
>
> --
> Terry Jan Reedy
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/greg%40krypto.org
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/97dbc8d0/attachment-0001.html>

From larry at hastings.org  Thu May 28 07:48:27 2015
From: larry at hastings.org (Larry Hastings)
Date: Wed, 27 May 2015 22:48:27 -0700
Subject: [Python-Dev] Issue 24285: regression for importing extensions
 in packages
In-Reply-To: <5566A91C.4000502@hastings.org>
References: <5566829D.60907@nedbatchelder.com>
 <CADiSq7d5+JxvX+JpZk1bZxEf1jFs1sv2AJxGOnLb+w=6Y7yTdw@mail.gmail.com>
 <5566A91C.4000502@hastings.org>
Message-ID: <5566AC2B.6090209@hastings.org>

On 05/27/2015 10:35 PM, Larry Hastings wrote:
> Well, certainly this sounds like something that needs to go into the 
> regression test suite.  Can someone create the issue?
>
> ... and the patch?

NM, the existing fix already added a test to the regression test suite.  
I should have read the issue first!


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150527/ad78ad04/attachment.html>

From mal at egenix.com  Thu May 28 10:12:03 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Thu, 28 May 2015 10:12:03 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
Message-ID: <5566CDD3.1090001@egenix.com>

On 28.05.2015 02:17, Parasa, Srinivas Vamsi wrote:
> Hi All,
> 
> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
> 
> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).

+1.

It's been successful for Python 3, doesn't change semantics and
increases performance.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 28 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From berker.peksag at gmail.com  Thu May 28 10:54:45 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Thu, 28 May 2015 11:54:45 +0300
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
Message-ID: <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>

On Thu, May 28, 2015 at 3:17 AM, Parasa, Srinivas Vamsi
<srinivas.vamsi.parasa at intel.com> wrote:
> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).

Hi Vamsi,

Thank you for your work and your detailed email.

I'm -1 on the idea because:

* Performance improvements are not bug fixes
* The patch doesn't make the migration process from Python 2 to Python 3 easier
* In long term, it would be nice to work on making Python 3 better:
See http://bugs.python.org/issue11549 for an example task.

--Berker

From ncoghlan at gmail.com  Thu May 28 11:47:08 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 19:47:08 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
Message-ID: <CADiSq7f3iW=0o3GccwN7+e+L26syzZvSwMizo9DE__CQ0th49Q@mail.gmail.com>

On 28 May 2015 at 18:54, Berker Peksağ <berker.peksag at gmail.com> wrote:
> On Thu, May 28, 2015 at 3:17 AM, Parasa, Srinivas Vamsi
> <srinivas.vamsi.parasa at intel.com> wrote:
>> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
>
> Hi Vamsi,
>
> Thank you for your work and your detailed email.
>
> I'm -1 on the idea because:
>
> * Performance improvements are not bug fixes

The "nothing except backwards compatible bug fixes in maintenance
releases" rule was adopted for the *benefit of Python users*. When a
new feature release can be reliably expected every 18-24 months, it
makes sense to err heavily on the side of minimising risks to
stability when it comes to making judgement calls on whether or not a
change is appropriate to a maintenance release or not.

Python 2.7 is an unusual case, as even though there *are* newer
feature releases available, the barriers to migration are much higher
than they would otherwise be. Each progressive 3.x release has brought
those barriers down a bit, and 3.5 and the static type checking work
being done through mypy et al will bring them down even further, but
version migration work is still work where end users don't see any
immediate practical benefit - they only see the benefit *after*
they're able to drop Python 2 compatibility, and can start using
Python 3 only features like matrix multiplication and the async/await
syntax.

*Adding* features to Python 2.7 is quite rightly still controversial,
as they add complexity to the compatibility matrix for testing
purposes. Code that runs correctly with the PEP 466 and 476 changes to
SSL handling, may fail on earlier versions.

Internal performance improvements, by contrast, don't hurt end users
at all beyond the stability risks, and in this case, the request to
make the change is being accompanied by the offer to assist with
ongoing maintenance (including engaging an experienced core developer
to help coach Intel contributors through the contribution process).

So when folks ask "What changed?" in relation to this request, what
changed is the fact that it isn't expected to be a one off
contribution, but rather part of a broader effort focused on improving
the performance of both Python 2 and Python 3, including contributions
to ongoing maintenance activities.

> * The patch doesn't make the migration process from Python 2 to Python 3 easier
> * In long term, it would be nice to work on making Python 3 better:

Indeed, but we also need help living up to the "Python 2.7 will be
supported to 2020" commitment. Python 2.7 maintenance is *not* a
particularly exciting task, and it's only going to get less
interesting as Python 3 adoption climbs, so we're going to need paid
contributors to start filling the gap as volunteers (quite reasonably)
move on to the more inherently rewarding task of working to move
Python 3 forward.

That's going to be a negotiation process - companies don't typically
contribute paid development time to open source projects out of the
kindness of their hearts, they do it either because they're using the
project themselves, because of deals they've made with individual
contributors around how they spend their time, or because it helps
them influence the direction of upstream development in ways that help
them and their customers. (And sometimes it's a mix of all 3 of those
factors)

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Thu May 28 12:01:07 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 20:01:07 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7f3iW=0o3GccwN7+e+L26syzZvSwMizo9DE__CQ0th49Q@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CADiSq7f3iW=0o3GccwN7+e+L26syzZvSwMizo9DE__CQ0th49Q@mail.gmail.com>
Message-ID: <CADiSq7f0Cd0Pz+gV48X-1eLY+Tvw5ByhtCJ_m8gxOtUzhhkA+Q@mail.gmail.com>

On 28 May 2015 at 19:47, Nick Coghlan <ncoghlan at gmail.com> wrote:
> That's going to be a negotiation process - companies don't typically
> contribute paid development time to open source projects out of the
> kindness of their hearts, they do it either because they're using the
> project themselves, because of deals they've made with individual
> contributors around how they spend their time, or because it helps
> them influence the direction of upstream development in ways that help
> them and their customers. (And sometimes it's a mix of all 3 of those
> factors)

And to be completely transparent about this: this probably won't be
the last of these kinds of discussions we're likely to see.

Various folks (including me) have been negotiating to have their
employers fund paid CPython contribution positions and as we coach
colleagues that take up these roles through the core team's
contribution processes, one of the consequences will be that we will
sometimes advocate for acceptance of changes that we would have
historically rejected as imposing too high a maintenance burden for an
all-volunteer development team to be expected to deal with.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From jtaylor.debian at googlemail.com  Thu May 28 13:30:59 2015
From: jtaylor.debian at googlemail.com (Julian Taylor)
Date: Thu, 28 May 2015 13:30:59 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7f0Cd0Pz+gV48X-1eLY+Tvw5ByhtCJ_m8gxOtUzhhkA+Q@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CADiSq7f3iW=0o3GccwN7+e+L26syzZvSwMizo9DE__CQ0th49Q@mail.gmail.com>
 <CADiSq7f0Cd0Pz+gV48X-1eLY+Tvw5ByhtCJ_m8gxOtUzhhkA+Q@mail.gmail.com>
Message-ID: <CAK5FAtG+EPAM+SCpf3iWGhssYPy0iZUa8VhXas4BnUmz1z0+8g@mail.gmail.com>

Won't this need Python compiled with GCC 5.1 to have any effect? Which
compiler version was used for the benchmark?
The issue that negated most computed goto improvements
(https://gcc.gnu.org/bugzilla/show_bug.cgi?id=39284) was only closed
very recently (r212172, 9f4ec746affbde1).

From fijall at gmail.com  Thu May 28 13:55:25 2015
From: fijall at gmail.com (Maciej Fijalkowski)
Date: Thu, 28 May 2015 13:55:25 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
Message-ID: <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>

> I'm -1 on the idea because:
>
> * Performance improvements are not bug fixes
> * The patch doesn't make the migration process from Python 2 to Python 3 easier

And this is why people have been porting Python applications to Go.
Maybe addressing Python performance and making Python (2 or 3) a
better language/platform would mitigate that.

Cheers,
fijal

From doko at ubuntu.com  Thu May 28 14:00:44 2015
From: doko at ubuntu.com (Matthias Klose)
Date: Thu, 28 May 2015 14:00:44 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
Message-ID: <5567036C.4010905@ubuntu.com>

On 05/28/2015 02:17 AM, Parasa, Srinivas Vamsi wrote:
> Hi All,
> 
> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
> 
> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
> 
> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin  at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
> 
> Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.
> 
> Python 2.7.10 (original) vs Computed Goto performance
> Benchmark

-1

As Gregory pointed out, there are other options to build the interpreter, and we
are missing data on how these compare with your patch.

I assume you tested with the Intel compiler, so it would be good to see results
for other compilers as well (GCC, Clang).  Could you please provide the data for
LTO and profile-guided optimized builds (maybe combined too)?  I'm happy to work
with you on setting up these builds, but currently don't have the machine
resources to do so myself.

If the benefits show up for these configurations too, then I'm +/-0 on this patch.

Matthias


From raymond.hettinger at gmail.com  Thu May 28 14:46:56 2015
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Thu, 28 May 2015 05:46:56 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
Message-ID: <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>


> On May 28, 2015, at 1:54 AM, Berker Peksağ <berker.peksag at gmail.com> wrote:
> 
> * Performance improvements are not bug fixes

Practicality beats purity here.   
Recognize that a huge number of Python users will remain in the Python 2.7 world
for some time.  We have a responsibility to the bulk of our users (my estimate is
that adoption rate for Python 3 is under 2%).  The computed goto patch makes
substantial performance improvements.  It is callous to deny the improvement
to 2.7 users.


> * The patch doesn't make the migration process from Python 2 to Python 3 easier

Sorry, that is a red herring (an orthogonal issue).
If you care about 2-to-3 migration, then start
opposing proposals for API changes that increase
the semantic difference between 2 and 3.



Raymond


From ncoghlan at gmail.com  Thu May 28 15:37:39 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 23:37:39 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <5567036C.4010905@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <5567036C.4010905@ubuntu.com>
Message-ID: <CADiSq7dgZfkx74Hffg1LPRxDg-QVZXvDxkZJ+OQneQu99m5j0w@mail.gmail.com>

On 28 May 2015 at 22:00, Matthias Klose <doko at ubuntu.com> wrote:
> On 05/28/2015 02:17 AM, Parasa, Srinivas Vamsi wrote:
>> Hi All,
>>
>> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
>>
>> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
>>
>> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin  at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
>>
>> Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.
>>
>> Python 2.7.10 (original) vs Computed Goto performance
>> Benchmark
>
> -1
>
> As Gregory pointed out, there are other options to build the interpreter, and we
> are missing data how these compare with your patch.

That's shifting the goal posts: suggesting that we should optimise
Python 2 differently from the way we optimised Python 3 isn't a
reasonable request to make in the context of a backporting proposal.

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From srinivas.vamsi.parasa at intel.com  Thu May 28 14:37:09 2015
From: srinivas.vamsi.parasa at intel.com (Parasa, Srinivas Vamsi)
Date: Thu, 28 May 2015 12:37:09 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <5567036C.4010905@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <5567036C.4010905@ubuntu.com>
Message-ID: <34384DEB0F607E42BD61D446586AD4E86B5119A8@ORSMSX103.amr.corp.intel.com>

Hi Matthias and Gregory,

The results shown were obtained on Python 2.7.10 built using GCC. The goal of our team is to make long-term open source contributions, with emphasis on performance optimization and support for the larger community, and hence icc wasn't used.

We experimented with GCC profile-guided optimization (PGO) and LTO a month ago. PGO being an independent/orthogonal optimization, it shows improvement for both the stock version (i.e. the current switch-based dispatch) and the computed-goto version. We ran PGO-optimized Python on the workloads available at the language benchmarks game (http://benchmarksgame.alioth.debian.org/u64/python.php) and found that PGO benefits the computed-goto version more than the stock version. I haven't run PGO-optimized Python with the "grand unified python benchmarks" (GUPB) suite; please give me a day or two and I will get back to you with PGO (and LTO) numbers as well. (LTO hasn't shown much benefit so far on the language benchmarks game workloads.)

Also, in our analysis using CPU performance counters, we found that Python workloads (in general) have a high rate of CPU front-end issues (mainly I-cache misses), and PGO is very helpful in mitigating those issues. We're also investigating and working on ways to further reduce those front-end issues and speed up Python workloads.

Thanks,
Vamsi

-----Original Message-----
From: Matthias Klose [mailto:doko at ubuntu.com] 
Sent: Thursday, May 28, 2015 5:01 AM
To: Parasa, Srinivas Vamsi; 'python-dev at python.org'
Subject: Re: [Python-Dev] Computed Goto dispatch for Python 2

On 05/28/2015 02:17 AM, Parasa, Srinivas Vamsi wrote:
> Hi All,
> 
> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
> 
> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
> 
> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin  at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
> 
> Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.
> 
> Python 2.7.10 (original) vs Computed Goto performance Benchmark

-1

As Gregory pointed out, there are other options to build the interpreter, and we are missing data on how these compare with your patch.

I assume you tested with the Intel compiler, so it would be good to see results for other compilers as well (GCC, Clang).  Could you please provide the data for LTO and profile-guided optimized builds (maybe combined too)?  I'm happy to work with you on setting up these builds, but currently don't have the machine resources to do so myself.

If the benefits show up for these configurations too, then I'm +/-0 on this patch.

Matthias


From ncoghlan at gmail.com  Thu May 28 15:44:03 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 May 2015 23:44:03 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7dgZfkx74Hffg1LPRxDg-QVZXvDxkZJ+OQneQu99m5j0w@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <5567036C.4010905@ubuntu.com>
 <CADiSq7dgZfkx74Hffg1LPRxDg-QVZXvDxkZJ+OQneQu99m5j0w@mail.gmail.com>
Message-ID: <CADiSq7eLRN9EVE80R3NBb=4zBktJk4WyqSJ=FqpTpV=bWk8q6A@mail.gmail.com>

On 28 May 2015 at 23:37, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 28 May 2015 at 22:00, Matthias Klose <doko at ubuntu.com> wrote:
>> On 05/28/2015 02:17 AM, Parasa, Srinivas Vamsi wrote:
>>> Hi All,
>>>
>>> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
>>>
>>> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
>>>
>>> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin  at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
>>>
>>> Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.
>>>
>>> Python 2.7.10 (original) vs Computed Goto performance
>>> Benchmark
>>
>> -1
>>
>> As Gregory pointed out, there are other options to build the interpreter, and we
>> are missing data how these compare with your patch.
>
> That's shifting the goal posts: suggesting that we should optimise
> Python 2 differently from the way we optimised Python 3 isn't a
> reasonable request to make in the context of a backporting proposal.

Sorry, I misread your email. I thought you were talking about the part
of Greg's email where he suggested different optimisation techniques,
but you were actually referring to the part where he requested more
detail on the compiler used and the numbers for GCC and Clang.

*That* request I agree with.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Thu May 28 16:08:30 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 00:08:30 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
Message-ID: <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>

On 28 May 2015 at 21:55, Maciej Fijalkowski <fijall at gmail.com> wrote:
>> I'm -1 on the idea because:
>>
>> * Performance improvements are not bug fixes
>> * The patch doesn't make the migration process from Python 2 to Python 3 easier
>
> And this is why people have been porting Python applications to Go.

For folks hitting the kinds of scalability problems that Go is
designed to help with, a few percentage points here and there in
CPython performance aren't going to make a big difference - they'll
need the kinds of speed multipliers that PyPy can offer.

Given that Go can't run Python C extensions any more than PyPy can,
and involves a rewrite in a different programming language to boot,
we'd do well to ponder what Go currently offers that PyPy doesn't. If
we ignore the fashion-driven aspect of "Google wrote it, so it must be
cool" (which we can't do anything about), and if we ignore the
multi-vendor commercial support question (which tends to significantly
lag community adoption for true community driven projects like PyPy),
then one of the big keys in my view is the easy redistributability of
Go binaries.

For Linux based network services (and even Windows these days), Docker
containers offer a potentially compelling way of bundling the PyPy
runtime with Python applications, and Docker, Inc already maintain a
set of PyPy base images at https://registry.hub.docker.com/_/pypy/

Docker's image layering model then means that applications sharing a
PyPy runtime shouldn't need to download the interpreter runtime itself
more than once.

As a result, I personally suspect that better documenting and
promoting the CPython->Docker+PyPy migration option is likely to offer
a more effective alternative to CPython->Go migrations than the more
modest performance improvements we can make to the CPython runtime
itself. (I still think the latter are a good idea, though - there's no
point leaving Python 2.7 slower than it needs to be given the offer of
assistance in maintaining it)

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From p.f.moore at gmail.com  Thu May 28 16:11:34 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 15:11:34 +0100
Subject: [Python-Dev] Usability of the limited API
Message-ID: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>

With Python 3.5 shipping an embeddable copy of the interpreter on
Windows, I thought I'd try out a simple embedded interpreter as an
experiment. I wanted to use the limited API, as I'd rather it were
easy to upgrade the interpreter without recompiling the embedding app.

But the "Very high-level embedding" example in the docs doesn't
compile with the limited API.

#include <Python.h>

int
main(int argc, char *argv[])
{
    wchar_t *program = Py_DecodeLocale(argv[0], NULL);
    if (program == NULL) {
        fprintf(stderr, "Fatal error: cannot decode argv[0]\n");
        exit(1);
    }
    Py_SetProgramName(program);  /* optional but recommended */
    Py_Initialize();
    PyRun_SimpleString("from time import time,ctime\n"
                       "print('Today is', ctime(time()))\n");
    Py_Finalize();
    PyMem_RawFree(program);
    return 0;
}

The Py_DecodeLocale/Py_SetProgramName/PyMem_RawFree bit can probably
be replaced by a Py_SetProgramName call specifying a static value;
it's not exactly crucial. (Py_DecodeLocale appears to be declared as
part of the limited API by the headers, but is not exported from
python3.dll, by the way, which implies that something's out of sync.)

But PyRun_SimpleString doesn't appear to be exposed in the limited
API, even though https://docs.python.org/3/c-api/veryhigh.html doesn't
mention this, and https://docs.python.org/3/c-api/stable.html says
that functions not part of the stable API will be marked as such.

I dumped out the exported symbols from python3.dll, which is the
simplest way I could think of finding out what is in the limited API
(it's hardly user friendly, but never mind). And frustratingly, none
of the very high level PyRun_XXX APIs are available.

At this point, I think I'll probably just give up and use the full
API, but it does make me question whether the limited API is actually
usable as it stands.

I was hoping to be able to suggest as an application bundling option
that people could write a trivial wrapper script in C to fire up a
Python script, and bundle that along with its dependencies and the
embeddable Python distribution. Looks like that's doable, but only
using the full API, which makes upgrading the bundled Python
interpreter a bit messier. Ah, well, no huge loss :-(

But after this experiment, I do wonder - is the limited API really a
viable option for embedders?

Paul

From ncoghlan at gmail.com  Thu May 28 16:28:39 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 00:28:39 +1000
Subject: [Python-Dev] Usability of the limited API
In-Reply-To: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
References: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
Message-ID: <CADiSq7fUYXnBDC4=GHQC76a1u3jAMcBL6D+xyWOPPoJtVyjqLw@mail.gmail.com>

On 29 May 2015 at 00:11, Paul Moore <p.f.moore at gmail.com> wrote:
> I was hoping to be able to suggest as an application bundling option
> that people could write a trivial wrapper script in C to fire up a
> Python script, and bundle that along with its dependencies and the
> embeddable Python distribution. Looks like that's doable, but only
> using the full API, which makes upgrading the bundled Python
> interpreter a bit messier. Ah, well, no huge loss :-(
>
> But after this experiment, I do wonder - is the limited API really a
> viable option for embedders?

I personally suspect the requirement to preserve source compatibility
with Python 2 has meant that the limited ABI hasn't been exercised
very well to date.

As far as the high level embedding API goes, I expect it's just an
oversight that it's missing from the stable ABI. There are some that
*can't* be exposed (the ones that rely on FILE pointers), but the
others should be OK.

Cheers,
Nick.


-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From brett at python.org  Thu May 28 16:02:03 2015
From: brett at python.org (Brett Cannon)
Date: Thu, 28 May 2015 14:02:03 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
Message-ID: <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>

On Thu, May 28, 2015 at 8:47 AM Raymond Hettinger <
raymond.hettinger at gmail.com> wrote:

>
> > On May 28, 2015, at 1:54 AM, Berker Peksağ <berker.peksag at gmail.com>
> wrote:
> >
> > * Performance improvements are not bug fixes
>
> Practicality beats purity here.

> Recognize that a huge number of Python users will remain in the Python2.7
> world
> for some time.  We have a responsibility to the bulk of our users (my
> estimate is
> that adoption rate for Python 3 is under 2%).


Where does that number come from? I have not seen numbers less than 5% for
the overall community adoption (and all of them are extremely rough and all
skewed towards Python 2 for various technical reasons).


>   The computed goto patch makes
> substantial performance improvements.  It is callous to deny the
> improvement
> to 2.7 users.
>

But you could argue that "Special cases aren't special enough to break the
rules" and that's what we are proposing here by claiming Python 2.7 is a
special case and thus we should accept a patch that is not a one-liner
change to gain some performance in a bugfix release.


>
>
> > * The patch doesn't make the migration process from Python 2 to Python 3
> easier
>
> Sorry, that is a red-herring (an orthogonal issue).
> If you care about 2-to-3 migration, then start
> opposing proposals for API changes that increase
> the semantic difference between 2 and 3.
>

That's misdirection of Berker's point that the proposal at hand does not
help with getting people to Python 3 even though what is proposed is an
enhancement and not a bugfix (since we are not fixing a performance
regression). I had someone on Twitter earlier this month point out that
Python 3 was slower than Python 2 on some benchmark and that was a reason
they weren't switching. Doing this is not going to help make that case
(although I think arguing about performance between 2 and 3 is misleading
when I've seen other workloads win out in Python 3).

I'm -0 on this because I would like to stick to our policy of no
enhancements in a bugfix release, but in the end it's Benjamin's call as
2.7 RM as to whether this is appropriate for 2.7.11.

From donald at stufft.io  Thu May 28 16:37:24 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 10:37:24 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
Message-ID: <etPan.55672824.4d4993ef.12a4d@Draupnir.home>



On May 28, 2015 at 10:10:03 AM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> On 28 May 2015 at 21:55, Maciej Fijalkowski wrote:
> >> I'm -1 on the idea because:
> >>
> >> * Performance improvements are not bug fixes
> >> * The patch doesn't make the migration process from Python 2 to Python 3 easier
> >
> > And this is why people have been porting Python applications to Go.
>  
> For folks hitting the kinds of scalability problems that Go is
> designed to help with, a few percentage points here and there in
> CPython performance aren't going to make a big difference - they'll
> need the kinds of speed multipliers that PyPy can offer.
>  
> Given that Go can't run Python C extensions any more than PyPy can,
> and involves a rewrite in a different programming language to boot,
> we'd do well to ponder what Go currently offers that PyPy doesn't. If
> we ignore the fashion-driven aspect of "Google wrote it, so it must be
> cool" (which we can't do anything about), and if we ignore the
> multi-vendor commercial support question (which tends to significantly
> lag community adoption for true community driven projects like PyPy),
> then one of the big keys in my view is the easy redistributability of
> Go binaries.
>  
> For Linux based network services (and even Windows these days), Docker
> containers offer a potentially compelling way of bundling the PyPy
> runtime with Python applications, and Docker, Inc already maintain a
> set of PyPy base images at https://registry.hub.docker.com/_/pypy/
>  
> Docker's image layering model then means that applications sharing a
> PyPy runtime shouldn't need to download the interpreter runtime itself
> more than once.
>  
> As a result, I personally suspect that better documenting and
> promoting the CPython->Docker+PyPy migration option is likely to offer
> a more effective alternative to CPython->Go migrations than the more
> modest performance improvements we can make to the CPython runtime
> itself. (I still think the latter are a good idea, though - there's no
> point leaving Python 2.7 slower than it needs to be given the offer of
> assistance in maintaining it)
>  
> Regards,
> Nick.
>  
> --
> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
>  


I think Docker is a pretty crummy answer to Go's static binaries. What I would
love is for Python to get:

* The ability to import .so modules via zipimport (ideally without a temporary
  directory, but that might require newer APIs from libc and such).
* The ability to create a "static" Python that links everything it needs into
  the binary to do a zipimport of everything else (including the stdlib).
* The ability to execute a zipfile that has been concatenated onto the end of
  the Python binary.

I think that if we get all of that, we could easily create a single-file
executable with real, native support from Python by simply compiling Python in
that static mode and then appending a zip file containing the standard library
and any other distributions we need to the end of it.

We'd probably want some more quality-of-life improvements around accessing
resources from within that zip file as well, but that can be done as a library
more easily than the above three things can.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From skip.montanaro at gmail.com  Thu May 28 16:47:54 2015
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Thu, 28 May 2015 09:47:54 -0500
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
Message-ID: <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>

On Thu, May 28, 2015 at 9:02 AM, Brett Cannon <brett at python.org> wrote:
> But you could argue that "Special cases aren't special enough to break the
> rules" and that's what we are proposing here by claiming Python 2.7 is a
> special case and thus we should accept a patch that is not a one-liner
> change to gain some performance in a bugfix release.

One can read anything he wants into the Zen. I could respond with this
retort: "Although practicality beats purity," but I won't. :-)

Skip

From p.f.moore at gmail.com  Thu May 28 16:52:56 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 15:52:56 +0100
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
Message-ID: <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>

On 28 May 2015 at 15:37, Donald Stufft <donald at stufft.io> wrote:
> I think docker is a pretty crummy answer to Go's static binaries. What I would
> love is for Python to get:
>
> * The ability to import .so modules via zipimport (ideally without a temporary
>   directory, but that might require newer APIs from libc and such).
> * The ability to create a "static" Python that links everything it needs into
>   the binary to do a zipimport of everything else (including the stdlib).
> * The ability to execute a zipfile that has been concat onto the end of the
>   Python binary.
>
> I think that if we get all of that, we could easily create a single file executable
> with real, native support from Python by simply compiling Python in that static
> mode and then appending a zip file containing the standard library and any other
> distributions we need to the end of it.
>
> We'd probably want some more quality of life improvements around accessing resources
> from within that zip file as well, but that can be done as a library easier than
> the above three things can.

+1. The new embeddable Python distribution for Windows is a great step
forward for this. It's not single-file, but it's easy to produce a
single-directory self-contained application with it. I don't know if
there's anything equivalent for Linux/OSX - maybe it's something we
should look at for them as well (although the whole "static binaries"
concept seems to be fairly frowned on in the Unix world, from what
I've seen).

Paul

From brett at python.org  Thu May 28 16:50:18 2015
From: brett at python.org (Brett Cannon)
Date: Thu, 28 May 2015 14:50:18 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
Message-ID: <CAP1=2W4D9oC-cTQ=gHo20zKsaeDMYA-2jtCsj9OmJ9fRwoBP_g@mail.gmail.com>

On Thu, May 28, 2015 at 10:47 AM Skip Montanaro <skip.montanaro at gmail.com>
wrote:

> On Thu, May 28, 2015 at 9:02 AM, Brett Cannon <brett at python.org> wrote:
> > But you could argue that "Special cases aren't special enough to break
> the
> > rules" and that's what we are proposing here by claiming Python 2.7 is a
> > special case and thus we should accept a patch that is not a one-liner
> > change to gain some performance in a bugfix release.
>
> One can read anything he wants into the Zen. I could respond with this
> retort: "Although practicality beats purity," but I won't. :-)
>

Because Raymond already said that. =) And you explicitly made the point I
was trying to implicitly make: in this instance there is no clear argument
for or against that is going to make this an obvious decision based on PEP
20.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/9e5950bd/attachment.html>

From p.f.moore at gmail.com  Thu May 28 16:49:07 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 15:49:07 +0100
Subject: [Python-Dev] Usability of the limited API
In-Reply-To: <BY1PR03MB1466903FF0712CA9595A5B99F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
 <BY1PR03MB1466903FF0712CA9595A5B99F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <CACac1F-GJPQOhuett3i1G4TkuvNdOskONLZOrQcA+1quoGctZA@mail.gmail.com>

On 28 May 2015 at 15:28, Steve Dower <Steve.Dower at microsoft.com> wrote:
> I don't have the issue number handy, but it should be near the top of the
> recently modified list.

I recall seeing that issue. I'm fine with that - getting the two in
sync is obviously worth doing (and clearly in hand). I'm personally
not sure whether automating the exposure of symbols is the correct
approach, as I'm not sure people typically even consider the stable
API when adding functions. Is the default (what you get if somebody
just blindly adds a symbol with no thought for the stable API) to
expose it or not? If the default is that it's not exposed, then
automation seems reasonable, otherwise I'm not so sure.

The bigger issue for me is that it looks like the stable API doesn't
include functions that allow you to just run a script/file.

At a minimum, PyRun_SimpleString should be available. I'd also like to
see a variant of PyRun_SimpleFile that let you pass a filename (either
a .py file, or a zipfile, or a directory name - basically what you can
pass to the Python interpreter directly). You can sort of do it via
"import runpy; runpy.run_path(filename)", but you get into all sorts
of fun with requoting filenames, etc.
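For reference, the `runpy` fallback mentioned above looks like this when driven from Python itself; `run_path` accepts the same targets the interpreter accepts on the command line (a .py file, a zip with a `__main__.py`, or a directory). The requoting pain only arises when you have to smuggle it through a `-c` string:

```python
import os
import runpy
import tempfile

# run_path accepts a .py file, a zipfile containing __main__.py, or a
# directory -- mirroring what the interpreter itself accepts.
with tempfile.TemporaryDirectory() as tmp:
    script = os.path.join(tmp, "demo.py")
    with open(script, "w") as f:
        f.write("RESULT = 6 * 7\n")
    # Returns the module globals left behind by the executed code.
    result_globals = runpy.run_path(script)

print(result_globals["RESULT"])
```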

With the fact that we're distributing an embeddable interpreter for
Windows, I'd like to be able to promote it as a bundling option,
easier to use than things like py2exe/cx_Freeze.

Paul

From donald at stufft.io  Thu May 28 17:01:45 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 11:01:45 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <etPan.55672dd9.4cf37072.12a4d@Draupnir.home>



On May 28, 2015 at 10:55:08 AM, Steve Dower (steve.dower at microsoft.com) wrote:
> Donald Stufft wrote:
> > I think docker is a pretty crummy answer to Go's static binaries. What I would
> > love is for Python to get:
> >
> > * The ability to import .so modules via zipimport (ideally without a temporary
> > directory, but that might require newer APIs from libc and such).
> > * The ability to create a "static" Python that links everything it needs into
> > the binary to do a zipimport of everything else (including the stdlib).
> > * The ability to execute a zipfile that has been concat onto the end of the
> > Python binary.
> >
> > I think that if we get all of that, we could easily create a single file
> > executable with real, native support from Python by simply compiling Python in
> > that static mode and then appending a zip file containing the standard library
> > and any other distributions we need to the end of it.
>  
> And it would look like a 20MB+ file just for a simple 1KB Python script...
>  
> For Windows at least, I'd prefer to have some app-style installer generation (e.g. http://pynsist.readthedocs.org/en/latest/)  
> which, combined with the embeddable Python distro (new for 3.5.0b1 in case anyone missed  
> it), can simply extract everything into an install directory and run it from there. None  
> of the items on the list above are needed for or would help with this.
>  
> (Some other Windows-specific advantages of the latter - installers get special compatibility  
> treatment when the OS does stuff to break them, modifying official Python binaries breaks  
> the signatures, signed executables are fully scanned before running (slow if the file  
> is big), IT departments know how to deal with installers and users know how to deal with  
> installed binaries, and probably more.)
>  
> Alright everyone, back on topic now unless you want to rename the thread :)
>  
> Cheers,
> Steve
>  
>

Well, the Python 3.4.3 binary is 4KB for me, so you'd have that + your 1KB Python
script + whatever other pieces you need. It would be entirely possible to only
include the parts of the standard library you actually need. There's no rule
that if your single file executable doesn't use xmlrpc you have to include
xmlrpc just for purity's sake.

This is something that can already be done today using something like PyInstaller,
it's just super janky and finicky because it's being hacked in after the fact
by PyInstaller and because it's not an officially supported thing a lot of
projects simply don't support it. A CLI I worked on that uses PyInstaller is
5MB. It's certainly a trade off but it's not nearly as big of a trade off as
you say.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From Steve.Dower at microsoft.com  Thu May 28 17:02:36 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 28 May 2015 15:02:36 +0000
Subject: [Python-Dev] Usability of the limited API
In-Reply-To: <CACac1F-GJPQOhuett3i1G4TkuvNdOskONLZOrQcA+1quoGctZA@mail.gmail.com>
References: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
 <BY1PR03MB1466903FF0712CA9595A5B99F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F-GJPQOhuett3i1G4TkuvNdOskONLZOrQcA+1quoGctZA@mail.gmail.com>
Message-ID: <BY1PR03MB14662DABC5D961ECB86BF590F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>

Paul Moore wrote:
> On 28 May 2015 at 15:28, Steve Dower <Steve.Dower at microsoft.com> wrote:
>> I don't have the issue number handy, but it should be near the top of
>> the recently modified list.
> 
> I recall seeing that issue. I'm fine with that - getting the two in sync is
> obviously worth doing (and clearly in hand). I'm personally not sure whether
> automating the exposure of symbols is the correct approach, as I'm not sure
> people typically even consider the stable API when adding functions. Is the
> default (what you get if somebody just blindly adds a symbol with no thought for
> the stable API) to expose it or not? If the default is that it's not exposed,
> then automation seems reasonable, otherwise I'm not so sure.

Now I'm at my desk, the issue is http://bugs.python.org/issue23903

I believe new symbols are considered stable by default, so perhaps we actually want a test that will generate a C file that imports everything "stable" and will break the buildbots if someone adds something new without explicitly adding it to the list of stable functions? That sounds like the only way to make the extra overhead of two lists work. I guess we could also invert all the logic in the header files so that symbols are unstable by default.
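A sketch of that generated-C-file idea (the symbol list below is a made-up sample, not the real stable-ABI inventory, which would come from the `Py_LIMITED_API` headers or a maintained manifest): emit a translation unit that takes the address of every listed symbol, so the link step fails loudly the moment the list and the exported symbols disagree:

```python
# Hypothetical sample; the real list would be derived from the
# limited-API headers or a checked-in manifest.
STABLE_SYMBOLS = ["Py_Initialize", "Py_Finalize", "PyRun_SimpleString"]

def emit_stable_abi_check(symbols):
    """Generate a C file that references every stable symbol by address."""
    lines = ["#include <Python.h>", "", "void *stable_abi_refs[] = {"]
    lines.extend("    (void *)&%s," % name for name in symbols)
    lines.extend(["};", ""])
    return "\n".join(lines)

c_source = emit_stable_abi_check(STABLE_SYMBOLS)
print(c_source)
```

Compiling the generated file on the buildbots would then catch both symbols that disappear and symbols that were never actually exported.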

> The bigger issue for me is that it looks like the stable API doesn't include
> functions that allow you to just run a script/file.

I think a combination of Py_CompileString and PyEval_EvalCode is what you want here, though I can't think of any good reason not to have a stable helper method (or even a macro?) for this.

> At a minimum, PyRun_SimpleString should be available. I'd also like to see a
> variant of PyRun_SimpleFile that let you pass a filename (either a .py file, or
> a zipfile, or a directory name - basically what you can pass to the Python
> interpreter directly). You can sort of do it via "import runpy;
> runpy.run_path(filename)", but you get into all sorts of fun with requoting
> filenames, etc.
> 
> With the fact that we're distributing an embeddable interpreter for Windows, I'd
> like to be able to promote it as a bundling option, easier to use than things
> like py2exe/cx_Freeze.

Agreed. This is the point of the zip file :)

Cheers,
Steve

> Paul


From breamoreboy at yahoo.co.uk  Thu May 28 17:05:56 2015
From: breamoreboy at yahoo.co.uk (Mark Lawrence)
Date: Thu, 28 May 2015 16:05:56 +0100
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
Message-ID: <mk7asm$37i$1@ger.gmane.org>

On 28/05/2015 15:47, Skip Montanaro wrote:
> On Thu, May 28, 2015 at 9:02 AM, Brett Cannon <brett at python.org> wrote:
>> But you could argue that "Special cases aren't special enough to break the
>> rules" and that's what we are proposing here by claiming Python 2.7 is a
>> special case and thus we should accept a patch that is not a one-liner
>> change to gain some performance in a bugfix release.
>
> One can read anything he wants into the Zen. I could respond with this
> retort: "Although practicality beats purity," but I won't. :-)
>
> Skip
>

That's good, otherwise you'd just be repeating what Raymond said further 
up this subthread two hours and one minute before you didn't say it :)

-- 
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.

Mark Lawrence


From donald at stufft.io  Thu May 28 17:39:50 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 11:39:50 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <etPan.556736c6.581308f4.12a4d@Draupnir.home>



On May 28, 2015 at 11:30:37 AM, Steve Dower (steve.dower at microsoft.com) wrote:
> Donald Stufft wrote:
> > Well Python 3.4.3 binary is 4kb for me, so you'd have that + your 1KB Python script + whatever  
> other pieces you need.
>  
> For contrast, here are the things you need on Windows to be able to get to an interactive  
> prompt (I don't know how other platforms get this down to 4KB...):
>  
> * python.exe (or some equivalent launcher) 39KB
> * python35.dll 3,788KB
> * vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not redistributable  
> so doesn't count here)
> * 26 files in Lib 343KB
>  
> This gets you to ">>>", and basically everything after that is going to fail for some reason.  
> That's an unavoidable 4,257KB.
>  
> The rest of the stdlib adds another ~16MB once you exclude the test suite, so a fully functioning  
> Python is not cheap. (Using compressed .pyc's in a zip file can make a big difference here  
> though, assuming you're willing to trade CPU for HDD.)
>  
> Cheers,
> Steve
>  
>  

You don't need a "fully functioning Python" for a single file binary, you only
need enough to actually run your application. For example, if you're making
an application that can download files over HTTP, you don't need to include
parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite, csv, email,
mailcap, mailbox, imaplib, nntplib, etc.

Of course deciding which pieces you include in the zip file you're appending
to the end of Python is up to whatever tool builds this executable which
doesn't need to be part of Python itself. If Python itself gained the ability
to operate in that manner, then third party tools could handle trying to do the
optimizations where it only includes the things it actually needs in the stdlib
and excludes things it doesn't. The key thing here is that since you're doing
a single file binary, you don't need to have a Python which is suitable to
execute random Python code, you only need one that is suitable to execute this
particular code so you can specialize what that includes.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From barry at python.org  Thu May 28 17:58:34 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 28 May 2015 11:58:34 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.556736c6.581308f4.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
Message-ID: <20150528115834.69284cb1@anarchist.wooz.org>

On May 28, 2015, at 11:39 AM, Donald Stufft wrote:

>You don't need a "fully functioning Python" for a single file binary, you
>only need enough to actually run your application. For example, if you're
>making an application that can download files over HTTP, you don't need to
>include parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite,
>csv, email, mailcap, mailbox, imaplib, nntplib, etc.

There are actually two related but different use cases to "single file
executables".

The first is nicely solved by tools like pex, where you don't need to include
a fully functional Python at the head of the zip file because the environment
you're deploying it into will have enough Python to make the zip work.  This
can certainly result in smaller zip files.  This is the approach I took with
Snappy Ubuntu Core support for Python 3, based on the current situation that
the atomic upgrade client is written in Python 3.  If that changes and Python
3 is removed from the image, then this approach won't work.

pex (and others) does a great job at this, so unless there are things better
refactored into upstream Python, I don't think we need to do much here.

The second use case is as you describe: put a complete functional Python
environment at the head of the zip file so you don't need anything in the
target deployment environment.  "Complete" can easily mean the entire stdlib,
and although that would usually be more bloated than you normally need, hey,
it's just some extra unused bits so who cares? <wink>.  I think this would be
an excellent starting point which can be optimized to trim unnecessary bits
later, maybe by third party tools.

>Of course deciding which pieces you include in the zip file you're appending
>to the end of Python is up to whatever tool builds this executable which
>doesn't need to be part of Python itself. If Python itself gained the ability
>to operate in that manner, then third party tools could handle trying to do
>the optimizations where it only includes the things it actually needs in the
>stdlib and excludes things it doesn't. The key thing here is that since
>you're doing a single file binary, you don't need to have a Python which is
>suitable to execute random Python code, you only need one that is suitable to
>execute this particular code so you can specialize what that includes.

I'd love to see Python itself gain such a tool, but if it had the critical
pieces to execute in this way, that would enable a common approach to
supporting this in third party tools, on a variety of platforms.

I do think single-file executables are an important piece to Python's
long-term competitiveness.

Cheers,
-Barry

From guido at python.org  Thu May 28 18:07:46 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 28 May 2015 09:07:46 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <mk7asm$37i$1@ger.gmane.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
Message-ID: <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>

Wow. Such thread. :-)

This patch could save companies like Dropbox a lot of money. We run a ton
of Python code in large datacenters, and while we are slow in moving to
Python 3, we're good at updating to the latest 2.7.

The patch is forward and backward compatible. I'm strongly in favor.

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/4f95fa0c/attachment.html>

From barry at python.org  Thu May 28 18:13:41 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 28 May 2015 12:13:41 -0400
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
Message-ID: <20150528121341.74d087da@anarchist.wooz.org>

Go seems to be popular where I work.  It is replacing Python in a number of
places, although Python (and especially Python 3) is still a very important
part of our language toolbox.

There are several reasons why Go is gaining popularity.  Single-file
executables is definitely a reason; it makes deployment very easy, even if it
increases the maintenance burden (e.g. without shared libraries, you have
multiple copies of things so when a security fix is required for one of those
things you have to recompile the world).

Start up times and memory footprint are also factors.  Probably not much to be
done about the latter, but perhaps PEP 432 can lead to improvements in the
former.  (Hey Nick, I'm guessing you'll want to bump that one back to 3.6.)

Certainly better support for multi-cores comes up a lot.  It should be a SMoE
to just get rid of the GIL once and for all <wink>.

One thing I've seen more than once is that new development happens in Python
until the problem is understood, then the code is ported to Go.  Python's
short path from idea to working code, along with its ability to quickly morph
as requirements and understanding changes, its batteries included philosophy,
and its "fits-your-brain" consistency are its biggest strengths!

On May 28, 2015, at 10:37 AM, Donald Stufft wrote:

>I think docker is a pretty crummy answer to Go's static binaries. What I would
>love is for Python to get:
>
>* The ability to import .so modules via zipimport (ideally without a
>temporary directory, but that might require newer APIs from libc and such).

+1 - Thomas Wouters mentioned at the language summit some work being done on
glibc to add dlopen_from_memory() (sp?) which would allow for loading .so
files directly from a zip.  Not sure what the status is of that, but it would
be a great addition.

>* The ability to create a "static" Python that links everything it needs into
>the binary to do a zipimport of everything else (including the stdlib).

+1

>* The ability to execute a zipfile that has been concat onto the end of the
>Python binary.

+1

>I think that if we get all of that, we could easily create a single file
>executable with real, native support from Python by simply compiling Python
>in that static mode and then appending a zip file containing the standard
>library and any other distributions we need to the end of it.
>
>We'd probably want some more quality of life improvements around accessing
>resources from within that zip file as well, but that can be done as a
>library easier than the above three things can.

E.g. you really should be using the pkg_resources APIs for loading resources
from your packages, otherwise you're gonna have problems with zip
executables.  We've talked before about adopting some of these APIs into
Python's stdlib.  pkgutil is a start, and the higher level APIs from
pkg_resources should probably go there.
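The `pkgutil` piece already covers the basic case: `pkgutil.get_data` goes through the package's loader, so the same call works whether the package sits on the filesystem or inside a zip executable. A small sketch (the package name is invented for the demo):

```python
import os
import pkgutil
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Create a throwaway package with a data file next to its code.
    pkg = os.path.join(tmp, "demo_pkg")
    os.mkdir(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "data.txt"), "w") as f:
        f.write("hello resource")

    sys.path.insert(0, tmp)
    # Loader-mediated resource access: no os.path arithmetic on __file__,
    # which is exactly what breaks inside zip archives.
    data = pkgutil.get_data("demo_pkg", "data.txt")

print(data.decode())
```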

Cheers,
-Barry


From Steve.Dower at microsoft.com  Thu May 28 16:55:05 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 28 May 2015 14:55:05 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
Message-ID: <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>

Donald Stufft wrote:
> I think docker is a pretty crummy answer to Go's static binaries. What I would
> love is for Python to get:
>
> * The ability to import .so modules via zipimport (ideally without a temporary
> directory, but that might require newer APIs from libc and such).
> * The ability to create a "static" Python that links everything it needs into
> the binary to do a zipimport of everything else (including the stdlib).
> * The ability to execute a zipfile that has been concat onto the end of the
> Python binary.
> 
> I think that if we get all of that, we could easily create a single file
> executable with real, native support from Python by simply compiling Python in
> that static mode and then appending a zip file containing the standard library
> and any other distributions we need to the end of it.

And it would look like a 20MB+ file just for a simple 1KB Python script...

For Windows at least, I'd prefer to have some app-style installer generation (e.g. http://pynsist.readthedocs.org/en/latest/) which, combined with the embeddable Python distro (new for 3.5.0b1 in case anyone missed it), can simply extract everything into an install directory and run it from there. None of the items on the list above are needed for or would help with this.

(Some other Windows-specific advantages of the latter - installers get special compatibility treatment when the OS does stuff to break them, modifying official Python binaries breaks the signatures, signed executables are fully scanned before running (slow if the file is big), IT departments know how to deal with installers and users know how to deal with installed binaries, and probably more.)

Alright everyone, back on topic now unless you want to rename the thread :)

Cheers,
Steve


From chris.barker at noaa.gov  Thu May 28 18:28:16 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 09:28:16 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
Message-ID: <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>

On Thu, May 28, 2015 at 9:23 AM, Chris Barker <chris.barker at noaa.gov> wrote:

> Barry Warsaw wrote:
> >> I do think single-file executables are an important piece to Python's long-term
> competitiveness.
>
> Really? It seems to me that desktop development is dying. What are the
> critical use-cases for a single file executable?
>

oops, sorry -- I see this was addressed in another thread. Though I guess I
still don't see why "single file" is critical, over "single thing to
install" -- like a OS-X app bundle that can just be dragged into the
Applications folder.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/5763fd60/attachment.html>

From p.f.moore at gmail.com  Thu May 28 18:38:49 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 17:38:49 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528115834.69284cb1@anarchist.wooz.org>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
Message-ID: <CACac1F_xbuFXh90Yc9Zvrv-FyfVEZc-YNi52b49diaje9_uP_w@mail.gmail.com>

On 28 May 2015 at 16:58, Barry Warsaw <barry at python.org> wrote:
> On May 28, 2015, at 11:39 AM, Donald Stufft wrote:
>
>>You don't need a "fully functioning Python" for a single file binary, you
>>only need enough to actually run your application. For example, if you're
>>making an application that can download files over HTTP, you don't need to
>>include parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite,
>>csv, email, mailcap, mailbox, imaplib, nntplib, etc.
>
> There are actually two related but different use cases to "single file
> executables".
>
> The first is nicely solved by tools like pex, where you don't need to include
> a fully functional Python at the head of the zip file because the environment
> you're deploying it into will have enough Python to make the zip work.  This
> can certainly result in smaller zip files.  This is the approach I took with
> Snappy Ubuntu Core support for Python 3, based on the current situation that
> the atomic upgrade client is written in Python 3.  If that changes and Python
> 3 is removed from the image, then this approach won't work.
>
> pex (and others) does a great job at this, so unless there are things better
> refactored into upstream Python, I don't think we need to do much here.

One problem with pex is that it doesn't appear to work on Windows (I
just gave it a try, and got errors because it relies on symlinks).

IMO, any solution to "distributing Python applications" that is
intended to compete with the idea that "go produces nice single-file
executables" needs to be cross-platform. At the moment, zipapp (and in
general, the core support for running applications from a zip file)
handles this for the case where you're allowed to assume an already
installed Python interpreter. The proviso here, as Donald pointed out,
is that it doesn't handle C extensions.

The biggest problem with 3rd-party solutions is that they don't always
support the full range of platforms that Python supports. That's fine
for a 3rd party tool, but if we want to have a response to people
asking how to bundle their application written in Python, we need a
better answer than "if you're on Windows, use py2exe, or if you're on
Unix use pex, or maybe..."

Python has core support for the equivalent of Java's jar format in
zipapp. It's not well promoted (and doesn't support C extensions) but
it's a pretty viable option for a lot of situations.
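For the curious, the jar-like workflow with the new-in-3.5 `zipapp` module looks roughly like this (directory and file names are illustrative):

```python
import os
import subprocess
import sys
import tempfile
import zipapp

with tempfile.TemporaryDirectory() as tmp:
    # A minimal "application": a directory with a __main__.py.
    src = os.path.join(tmp, "app")
    os.mkdir(src)
    with open(os.path.join(src, "__main__.py"), "w") as f:
        f.write("print('hello from the pyz')\n")

    # Bundle it into a single runnable archive.
    target = os.path.join(tmp, "app.pyz")
    zipapp.create_archive(src, target)

    # Runs anywhere a compatible interpreter is already installed --
    # which is exactly the limitation under discussion.
    out = subprocess.check_output([sys.executable, target])

print(out.decode().strip())
```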

> The second use case is as you describe: put a complete functional Python
> environment at the head of the zip file so you don't need anything in the
> target deployment environment.  "Complete" can easily mean the entire stdlib,
> and although that would usually be more bloated than you normally need, hey,
> it's just some extra unused bits so who cares? <wink>.  I think this would be
> an excellent starting point which can be optimized to trim unnecessary bits
> later, maybe by third party tools.

Tools like py2exe and cx_Freeze do this, and are pretty commonly used
on Windows. An obvious example of use is Mercurial. If you're looking
at this scenario, a good place to start would probably be
understanding why cx_Freeze isn't more commonly used on Unix (AFAIK,
it supports Unix, but I've only ever really heard of it being used on
Windows).

I suspect "single file executables" just aren't viewed as a desirable
solution on Unix. Although Donald referred to a 4K binary, which
probably means just a stub exe that depends on system-installed .so
files, likely including Python (I'm just guessing here). It's easy to
do something similar on Windows, but it's *not* what most Windows
users think of when you say a "single file executable for a Python
program" (because there's no system package manager doing dependencies
for you).

Again, platform-specific answers are one thing, and are relatively
common, but having a good cross-platform answer at the language level
(a section on docs.python.org "How to ship your Python program") is
much harder.

>>Of course deciding which pieces you include in the zip file you're appending
>>to the end of Python is up to whatever tool builds this executable which
>>doesn't need to be part of Python itself. If Python itself gained the ability
>>to operate in that manner then third party tools could handle trying to do
>>the optimizations where it only includes the things it actually needs in the
>>stdlib and excludes things it doesn't. The key thing here is that since
>>you're doing a single file binary, you don't need to have a Python which is
>>suitable to execute random Python code, you only need one that is suitable to
>>execute this particular code so you can specialize what that includes.
>
> I'd love to see Python itself gain such a tool, but if it had the critical
> pieces to execute in this way, that would enable a common approach to
> supporting this in third party tools, on a variety of platforms.

Stripping out unused code is a hard problem in a language as dynamic
as Python. It would be great to see it happen, but I'm not sure how
much better we can do than existing tools like modulefinder. (consider
that stripping out parts of the stdlib is the same in principle as
stripping out unused bits of a 3rd party library like requests - when
this issue comes up, people often talk about slimming down the stdlib
to just what's needed, but why not take out the json support from
requests if you don't use it?)
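
As a baseline for that kind of analysis, modulefinder can be driven like
this; a minimal sketch (the script name is invented, and note the analysis
is purely static, so dynamic imports are invisible to it):

```python
import os
import tempfile
from modulefinder import ModuleFinder

# A trivial script whose only top-level dependency is json.
script = os.path.join(tempfile.mkdtemp(), "app.py")
with open(script, "w") as f:
    f.write("import json\nprint(json.dumps({'ok': True}))\n")

# run_script scans the script (and everything it imports) for
# import statements, recursively.
finder = ModuleFinder()
finder.run_script(script)

# finder.modules maps dotted names to Module objects; imports done via
# importlib or __import__ with computed names will be missed.
print("json" in finder.modules)
```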

> I do think single-file executables are an important piece to Python's
> long-term competitiveness.

Agreed. But also, I think that "single-file" executables
(single-directory, in practice) are *already* important - as I say,
for projects like Mercurial. Doing better is great, but we could do
worse than start by asking the Mercurial/TortoiseHg project and others
what are the problems with the current situation that changes to the
core could help to improve. I doubt "please make pythonXY.zip 50%
smaller" would be the key issue :-)

Paul

From p.f.moore at gmail.com  Thu May 28 18:43:02 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 17:43:02 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
Message-ID: <CACac1F8LDpuAUBWTHF3mYkAyjWfODCckLi=qNoyk-kqPLD3x3A@mail.gmail.com>

On 28 May 2015 at 17:28, Chris Barker <chris.barker at noaa.gov> wrote:
> On Thu, May 28, 2015 at 9:23 AM, Chris Barker <chris.barker at noaa.gov> wrote:
>>
>> Barry Warsaw wrote:
>> >> I do think single-file executables are an important piece to Python's
>> >> long-term competitiveness.
>>
>> Really? It seems to me that desktop development is dying. What are the
>> critical use-cases for a single file executable?
>
>
> oops, sorry -- I see this was addressed in another thread. Though I guess I
> still don't see why "single file" is critical, over "single thing to
> install" -- like an OS-X app bundle that can just be dragged into the
> Applications folder.

On Windows, there's a strong interest in "no install" download-and-run
applications (.Net has "xcopy deployment", portable applications are a
big deal to some people).

"Just unzip the distribution somewhere and run the exe" is definitely
a selling point for that audience. And "download an exe and just run
it" is even more so. But the latter is definitely an incremental
improvement - the former is the big deal (IMO).

Paul

From cournape at gmail.com  Thu May 28 18:44:09 2015
From: cournape at gmail.com (David Cournapeau)
Date: Fri, 29 May 2015 01:44:09 +0900
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
Message-ID: <CAGY4rcUkUvw3NcB+Y7HdDEVmahpy3LB1T=ScA+-O=9GWJWawPw@mail.gmail.com>

On Fri, May 29, 2015 at 1:28 AM, Chris Barker <chris.barker at noaa.gov> wrote:

> On Thu, May 28, 2015 at 9:23 AM, Chris Barker <chris.barker at noaa.gov>
> wrote:
>
>> Barry Warsaw wrote:
>> >> I do think single-file executables are an important piece to Python's long-term
>> competitiveness.
>>
>> Really? It seems to me that desktop development is dying. What are the
>> critical use-cases for a single file executable?
>>
>
> oops, sorry -- I see this was addressed in another thread. Though I guess
> I still don't see why "single file" is critical, over "single thing to
> install" -- like an OS-X app bundle that can just be dragged into the
> Applications folder.
>

It is much simpler to deploy in an automated, recoverable way (and also
much faster), because you can't have parts of the artefact "unsynchronized"
with other parts of the program. Note also that moving a Python
installation around your filesystem is actually quite unlikely to work in
interesting use cases on Unix, because of the relocatability issue.

Another advantage: it makes it impossible for users to tamper with an
application's content and be surprised when things don't work anymore (a very
common source of issues, familiar to anybody deploying complex Python
applications in the "enterprise world").

I recently started using some services written in Go, and the single-file
approach is definitely a big +. It makes *using* applications written in it
so much easier than Python, even though I am a complete newbie in Go and
relatively comfortable with Python.

One should keep in mind that Go has some inherent advantages over Python in
those contexts, even if Python were to gain single-file distribution
tomorrow. Most of Go's stdlib is written in Go now, I believe, and it is much
more portable across Linux systems on a given CPU arch compared to Python.
IOW, it is more robust against ABI variability.

David
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/c8e8a2d4/attachment.html>

From donald at stufft.io  Thu May 28 18:44:25 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 12:44:25 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528115834.69284cb1@anarchist.wooz.org>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
Message-ID: <etPan.556745e9.61ba8ba3.12a4d@Draupnir.home>



On May 28, 2015 at 12:01:22 PM, Barry Warsaw (barry at python.org) wrote:
> On May 28, 2015, at 11:39 AM, Donald Stufft wrote:
>  
> >You don't need a "fully functioning Python" for a single file binary, you
> >only need enough to actually run your application. For example, if you're
> >making an application that can download files over HTTP, you don't need to
> >include parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite,
> >csv, email, mailcap, mailbox, imaplib, nntplib, etc.
>  
> There are actually two related but different use cases to "single file
> executables".
>  
> The first is nicely solved by tools like pex, where you don't need to include
> a fully functional Python at the head of the zip file because the environment
> you're deploying it into will have enough Python to make the zip work. This
> can certainly result in smaller zip files. This is the approach I took with
> Snappy Ubuntu Core support for Python 3, based on the current situation that
> the atomic upgrade client is written in Python 3. If that changes and Python
> 3 is removed from the image, then this approach won't work.
>  
> pex (and others) does a great job at this, so unless there are things better
> refactored into upstream Python, I don't think we need to do much here.

Pex would be improved by having native support for importing .so's from within
a zipfile via zipimport. It would also be improved by having good, built-in
support for extraneous resources in the stdlib too. It's doing pretty well on
its own, though, besides those quality-of-life improvements.

>  
> The second use case is as you describe: put a complete functional Python
> environment at the head of the zip file so you don't need anything in the
> target deployment environment. "Complete" can easily mean the entire stdlib,
> and although that would usually be more bloated than you normally need, hey,
> it's just some extra unused bits so who cares? <wink>. I think this would be
> an excellent starting point which can be optimized to trim unnecessary bits
> later, maybe by third party tools.
>  
> >Of course deciding which pieces you include in the zip file you're appending
> >to the end of Python is up to whatever tool builds this executable which
> >doesn't need to be part of Python itself. If Python itself gained the ability
> >to operate in that manner then third party tools could handle trying to do
> >the optimizations where it only includes the things it actually needs in the
> >stdlib and excludes things it doesn't. The key thing here is that since
> >you're doing a single file binary, you don't need to have a Python which is
> >suitable to execute random Python code, you only need one that is suitable to
> >execute this particular code so you can specialize what that includes.
>  
> I'd love to see Python itself gain such a tool, but if it had the critical
> pieces to execute in this way, that would enable a common approach to
> supporting this in third party tools, on a variety of platforms.

Right, it would be great to get it built into Python itself, but I consider that
less important than getting the critical pieces into Python. If those pieces are
there then we can iterate outside of the standard library and try different
approaches to *building* such a file, and eventually take a look at the landscape
and bless one approach (or not, if we don't want to).

>  
> I do think single-file executables are an important piece to Python's
> long-term competitiveness.
>  

I completely agree. I talk to a lot of people about packaging of things, and while
I think there are some serious problems with huge parts of Go's packaging and
distribution story, the static linking and compiling down to a "single" file is not
one of them. People *really* enjoy this and it simplifies a ton of things for people.
They don't have to worry about making sure a whole set of files are deployed, they
don't have to worry about what version of Python is installed (or if any version is
installed), they don't have to worry about what other things have been installed, they
just copy an executable and then they are good to go.

I think we could potentially do *better* than Go in this regard, because we can make
it possible to do both "static" and "dynamic" dependencies, as well as provide the
ecosystem around that to do things like provide visibility into what versions of
libraries are "compiled" into that single file executable, etc.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From guido at python.org  Thu May 28 18:47:11 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 28 May 2015 09:47:11 -0700
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <CAP7+vJK5XhAZnxrkPW1aE7kskgJ__YqsnztG4tc2tdm+gkf1YQ@mail.gmail.com>

Single-file binaries are indeed important. (Though in most cases they don't
have to be totally stand-alone -- they can depend on a system python and
its stdlib. At least in typical datacenter setups.) Have people looked at
PEX (a format developed by Twitter) or Pants (which seems to be an
open-source tool that can build PEX files)?

Łukasz has told me that at Facebook they have a similar system that they
are now using to deploy Python 3 binaries.

-- 
--Guido van Rossum (python.org/~guido)

From chris.barker at noaa.gov  Thu May 28 18:23:57 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 09:23:57 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.556736c6.581308f4.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
Message-ID: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>

I'm confused:

Doesn't py2exe (optionally) create a single file executable?

And py2app on the Mac creates an application bundle, but that is
more-or-less the equivalent on OS-X (you may not even be able to have a
single file executable that can access the Window Manager, for instance)

Depending on what extra packages you need, py2exe's single file doesn't
always work, but last I tried, it worked for a fair bit (I think all of the
stdlib).

I don't know what PyInstaller or others create. And I have no idea if there
is a Linux option -- but it seems like the standard practice for an
application for linux is a bunch of files scattered over the system anyway
:-)

Yes, the resulting exe is pretty big, but it does try to include only those
modules and packages that are used, and that kind of optimization could be
improved in any case.

So is something different being asked for here?

Barry Warsaw wrote:
>> I do think single-file executables are an important piece to Python's long-term
competitiveness.

Really? It seems to me that desktop development is dying. What are the
critical use-cases for a single file executable?

And I'd note that getting a good way to use Python to develop for iOS,
Android, and Mobile Windows is FAR more critical!  -- maybe that's the same
problem?

-Chris


On Thu, May 28, 2015 at 8:39 AM, Donald Stufft <donald at stufft.io> wrote:

>
>
> On May 28, 2015 at 11:30:37 AM, Steve Dower (steve.dower at microsoft.com)
> wrote:
> > Donald Stufft wrote:
> > > Well Python 3.4.3 binary is 4kb for me, so you'd have that + your 1KB
> Python script + whatever
> > other pieces you need.
> >
> > For contrast, here are the things you need on Windows to be able to get
> to an interactive
> > prompt (I don't know how other platforms get this down to 4KB...):
> >
> > * python.exe (or some equivalent launcher) 39KB
> > * python35.dll 3,788KB
> > * vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not
> redistributable
> > so doesn't count here)
> > * 26 files in Lib 343KB
> >
> > This gets you to ">>>", and basically everything after that is going to
> fail for some reason.
> > That's an unavoidable 4,257KB.
> >
> > The rest of the stdlib adds another ~16MB once you exclude the test
> suite, so a fully functioning
> > Python is not cheap. (Using compressed .pyc's in a zip file can make a
> big difference here
> > though, assuming you're willing to trade CPU for HDD.)
> >
> > Cheers,
> > Steve
> >
> >
>
> You don't need a "fully functioning Python" for a single file binary, you
> only
> need enough to actually run your application. For example, if you're making
> an application that can download files over HTTP, you don't need to include
> parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite, csv,
> email,
> mailcap, mailbox, imaplib, nntplib, etc.
>
> Of course deciding which pieces you include in the zip file you're
> appending
> to the end of Python is up to whatever tool builds this executable which
> doesn't need to be part of Python itself. If Python itself gained the
> ability
> to operate in that manner then third party tools could handle trying to do
> the
> optimizations where it only includes the things it actually needs in the
> stdlib
> and excludes things it doesn't. The key thing here is that since you're
> doing
> a single file binary, you don't need to have a Python which is suitable to
> execute random Python code, you only need one that is suitable to execute
> this
> particular code so you can specialize what that includes.
>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From rosuav at gmail.com  Thu May 28 18:53:03 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Fri, 29 May 2015 02:53:03 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
Message-ID: <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>

On Fri, May 29, 2015 at 2:28 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> oops, sorry -- I see this was addressed in another thread. Though I guess I
> still don't see why "single file" is critical, over "single thing to
> install" -- like an OS-X app bundle that can just be dragged into the
> Applications folder.

There's also "single thing to uninstall", which IMO is more important.
If I download a tiny program that's supposed to just do one tiny
thing, and it has to install itself into Program Files, Common Files,
Windows\System32, and Documents & Settings\my-user-name\Applications,
then I have to hope it has a proper uninstaller. If it's a single
executable that just does its stuff (or, failing that, a single zip
file that I extract to anywhere and run the program), I can expect
that deleting that file (or directory) will get rid of it all. Of
course, it's entirely possible that it's gone and left its droppings
all over the system, but that's a matter of trust - a legit program
won't lie about that.

Is this a Windows-specific issue, or is it also intended for Linux and
Mac OS, where there'll already be a system Python (so a
single-file-executable would be used to be independent of the system
Python)?

ChrisA

From donald at stufft.io  Thu May 28 18:54:49 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 12:54:49 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
Message-ID: <etPan.55674859.543b4eb3.12a4d@Draupnir.home>



On May 28, 2015 at 12:24:42 PM, Chris Barker (chris.barker at noaa.gov) wrote:
> I'm confused:
>  
> Doesn't py2exe (optionally) create a single file executable?
>  
> And py2app on the Mac creates an application bundle, but that is
> more-or-less the equivalent on OS-X (you may not even be able to have a
> single file executable that can access the Window Manager, for instance)
>  
> Depending on what extra packages you need, py2exe's single file doesn't
> always work, but last I tried, it worked for a fair bit (I think all of the
> stdlib).
>  
> I don't know what PyInstaller or others create. And I have no idea if there
> is a linux option -- but it seems like the standard of practice for an
> application for linux is a bunch of files scattered over the system anyway
> :-)
>  
> Yes, the resulting exe is pretty big, but it does try to include only those
> modules and packages that are used, and that kind of optimization could be
> improved in any case.
>  
> So is something different being asked for here?

All of those solutions "work" to varying degrees; almost all of them rely
on hacks in order to make things "work", because the ability to do it isn't built
into Python itself. If the critical pieces to execute in this way were built into
Python itself, then those tools would work a whole lot better than they currently
do.

>  
> Barry Warsaw wrote:
> >> I do think single-file executables are an important piece to Python's long-term
> competitiveness.
>  
> Really? It seems to me that desktop development is dying. What are the
> critical use-cases for a single file executable?

The desktop isn't dying. Mobile is becoming a very important thing of course,
but that's just because people are using devices *more* to account for the
use of mobile; they aren't really using their desktops less.

See: http://blogs.wsj.com/cmo/2015/05/26/mobile-isnt-killing-the-desktop-internet/

>  
> And I'd note that getting a good way to use Python to develop for iOS,
> Android, and Mobile Windows is FAR more critical! -- maybe that's the same
> problem ?
>  

It's not the same problem, but it's also not very relevant. Volunteer time isn't
fungible; you get what people are willing to work on, regardless of whether it
will help Python as a whole. It's also not an either/or proposition: we can both
improve our ability to develop under iOS/Android/etc. and improve our ability to
handle desktop applications.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From p.f.moore at gmail.com  Thu May 28 19:01:27 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 18:01:27 +0100
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <CACac1F9aQzaNmbgo3O5n00ypD_BcWvu8yi3czWETmmQGPAsSBA@mail.gmail.com>

On 28 May 2015 at 17:13, Barry Warsaw <barry at python.org> wrote:
> On May 28, 2015, at 10:37 AM, Donald Stufft wrote:
>
>>I think docker is a pretty crummy answer to Go's static binaries. What I would
>>love is for Python to get:
>>
>>* The ability to import .so modules via zipimport (ideally without a
>>temporary directory, but that might require newer APIs from libc and such).
>
> +1 - Thomas Wouters mentioned at the language summit some work being done on
> glibc to add dlopen_from_memory() (sp?) which would allow for loading .so
> files directly from a zip.  Not sure what the status is of that, but it would
> be a great addition.

+1 but it needs to be cross-platform - py2exe has something similar
for Windows, which we should expect to use if the glibc solution is
Unix-only.

>>* The ability to create a "static" Python that links everything it needs into
>>the binary to do a zipimport of everything else (including the stdlib).
>
> +1

+0 - I suppose it would be nice, but how important is this really? If
all we do is end up with a single massive file instead of a directory,
then I don't see we've gained much.

Smallish C programs tend to hit the 10-100k executable size. Offhand I
don't know how big a typical go "single file executable" is. If I
could bundle up my Python script (something simple, let's say a "Hello
world" for argument's sake) and get a 10-20k single-file executable,
this would be worthwhile. If the resulting exe was 5MB (which is what
today's solutions that bundle the Python DLL and full zipped stdlib
tend to weigh in at) then I'm not interested.

>>* The ability to execute a zipfile that has been concatenated onto the end
>>of the Python binary.
>
> +1

+1. This would be immensely useful to tack on the front of a pyz
archive. Even just the existing (39K on Windows) python.exe which
relies on the Python installation would be great. Basically it could
replace the setuptools/distlib "script wrapper" executables with
something that doesn't need to run 2 processes just to run a script.
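
The reason "concatenate a zip onto the binary" is viable at all is that the
zip central directory lives at the *end* of the file, so an archive with
arbitrary bytes prepended is still a valid archive. A small sketch of the
principle, with fake stub bytes standing in for a real interpreter
executable:

```python
import io
import os
import tempfile
import zipfile

# Build a zip archive in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("__main__.py", "print('hi')\n")

# Prepend an arbitrary "stub" (in the real scheme this would be the
# python executable itself), then append the archive bytes.
combined = os.path.join(tempfile.mkdtemp(), "app")
with open(combined, "wb") as f:
    f.write(b"#!fake interpreter stub\n")  # hypothetical stand-in
    f.write(buf.getvalue())

# zipfile (and zipimport) locate the end-of-central-directory record by
# scanning backwards from the end, so the prefixed file still opens.
with zipfile.ZipFile(combined) as zf:
    names = zf.namelist()
print(names)
```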

>>We'd probably want some more quality of life improvements around accessing
>>resources from within that zip file as well, but that can be done as a
>>library easier than the above three things can.
>
> E.g. you really should be using the pkg_resources APIs for loading resources
> from your packages, otherwise you're gonna have problems with zip
> executables.  We've talked before about adopting some of these APIs into
> Python's stdlib.  pkgutil is a start, and the higher level APIs from
> pkg_resources should probably go there.

+1. We need the APIs available in the stdlib, then 3rd party libraries
have a solution when users complain "your library isn't zipimport
safe". (Of course "well, we have to support Python 2.7/3.4" remains a
response for some time yet :-() Then we can start working on the
culture change where library authors start expecting their code to be
deployed in single-file "run-from-zip" applications.
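
The stdlib piece that already exists here is pkgutil.get_data(), which goes
through the package's loader rather than the filesystem, so the same call
works whether the package lives on disk or inside a zip. A sketch with an
invented package name:

```python
import os
import pkgutil
import sys
import tempfile
import zipfile

# Build a tiny package with a data file inside a zip archive.
zpath = os.path.join(tempfile.mkdtemp(), "bundle.zip")
with zipfile.ZipFile(zpath, "w") as zf:
    zf.writestr("mypkg/__init__.py", "")
    zf.writestr("mypkg/data.txt", "hello resource")

sys.path.insert(0, zpath)
try:
    # open() relative to __file__ would fail for the zipped package;
    # get_data asks the zipimporter for the bytes instead.
    data = pkgutil.get_data("mypkg", "data.txt")
    print(data.decode())
finally:
    sys.path.remove(zpath)
```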

Paul

From Steve.Dower at microsoft.com  Thu May 28 17:30:33 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 28 May 2015 15:30:33 +0000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
Message-ID: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>

Donald Stufft wrote:
> Well Python 3.4.3 binary is 4kb for me, so you'd have that + your 1KB Python script + whatever other pieces you need.

For contrast, here are the things you need on Windows to be able to get to an interactive prompt (I don't know how other platforms get this down to 4KB...):

* python.exe (or some equivalent launcher) 39KB
* python35.dll 3,788KB
* vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not redistributable so doesn't count here)
* 26 files in Lib 343KB

This gets you to ">>>", and basically everything after that is going to fail for some reason. That's an unavoidable 4,257KB.

The rest of the stdlib adds another ~16MB once you exclude the test suite, so a fully functioning Python is not cheap. (Using compressed .pyc's in a zip file can make a big difference here though, assuming you're willing to trade CPU for HDD.)
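
To illustrate the compression trade-off: zipimport transparently handles
deflate-compressed entries, so a compressed library zip "just works" at the
cost of some CPU on import. A minimal sketch with an invented module name
(a .py source here for brevity; the same applies to compiled .pyc entries):

```python
import os
import sys
import tempfile
import zipfile

# Put a pure-Python module into a deflate-compressed zip archive.
zpath = os.path.join(tempfile.mkdtemp(), "lib.zip")
with zipfile.ZipFile(zpath, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("greet.py", "def hello():\n    return 'hi from the zip'\n")

# Adding the zip to sys.path makes zipimport serve modules from it,
# decompressing each entry as it is imported.
sys.path.insert(0, zpath)
try:
    import greet
    print(greet.hello())
finally:
    sys.path.remove(zpath)
```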

Cheers,
Steve
 

From brian at python.org  Thu May 28 19:04:24 2015
From: brian at python.org (Brian Curtin)
Date: Thu, 28 May 2015 12:04:24 -0500
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
Message-ID: <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>

On Thu, May 28, 2015 at 11:23 AM, Chris Barker <chris.barker at noaa.gov> wrote:
> I'm confused:
>
> Doesn't py2exe (optionally) create a single file executable?
>
> And py2app on the Mac creates an application bundle, but that is
> more-or-less the equivalent on OS-X (you may not even be able to have a
> single file executable that can access the Window Manager, for instance)
>
> Depending on what extra packages you need, py2exe's single file doesn't
> always work, but last I tried, it worked for a fair bit (I think all of the
> stdlib).
>
> I don't know what PyInstaller or others create. And I have no idea if there
> is a linux option -- but it seems like the standard of practice for an
> application for linux is a bunch of files scattered over the system anyway
> :-)
>
> Yes, the resulting exe is pretty big, but it does try to include only those
> modules and packages that are used, and that kind of optimization could be
> improved in any case.
>
> So is something different being asked for here?
>
> Barry Warsaw wrote:
>>> I do think single-file executables are an important piece to Python's
>>> long-term competitiveness.
>
> Really? It seems to me that desktop development is dying. What are the
> critical use-cases for a single file executable?

Donald mentioned one earlier: command line utilities. I want a single
CLI I can deploy to my customers that doesn't make them have to
install Python or even know it's Python at all. My users write code in
all types of languages on all OSes, but I should be able to produce
one thing that they can all use. Donald himself initiated the particular
CLI I'm talking about, but Go is picking up steam here, as we
have other utilities that quickly solved the "write one thing, every
user can run it immediately, no one knows/cares what it's written in"
problem.
When I worked on Ubuntu One, I was the Windows guy responsible for
making sure the end-user experience was the same there as it was on
Ubuntu. On Ubuntu we were a part of the base install and didn't have
to worry about much. On Windows we had none of that, not even the C
runtime, so we had some pre-installer work to do, and then a bunch of
py2exe hacking to make everything play nicely and transparently.

From greg at krypto.org  Thu May 28 19:13:29 2015
From: greg at krypto.org (Gregory P. Smith)
Date: Thu, 28 May 2015 17:13:29 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
Message-ID: <CAGE7PNLbzNvOJS+OquOUn_r69Sg2NPJ6f95kR7NS9NG0z+Hksw@mail.gmail.com>

On Thu, May 28, 2015 at 9:08 AM Guido van Rossum <guido at python.org> wrote:

> Wow. Such thread. :-)
>
> This patch could save companies like Dropbox a lot of money. We run a ton
> of Python code in large datacenters, and while we are slow in moving to
> Python 3, we're good at updating to the latest 2.7.
>

Dropbox should be compiling its own interpreter with whatever patches it
deems appropriate. The people it'll save resources for are companies not
enlightened enough to do that: thousands of them, generally small or
non-tech focused :)

> The patch is forward and backward compatible. I'm strongly in favor.
>

+1 I'm in favor as well.  I mostly wanted to make sure that people were
aware of profile-opt builds and that they were being compared against.  Sounds
like both benefit, even when used together.  Win-win.

This is a 100% API compatible change.  It just rearranges the interpreter
loop on compilers enlightened enough to allow it.  I was always bummed that
it didn't make it into 2.7 itself.  But given the world+dog is going to
have 2.7 around and kicking for a long time, let's save the world some CPU
cycles (read: carbon) for little effort.  Very practical.  Good for the
world.

People who need to save orders of magnitude more cycles shouldn't use an
interpreter; i.e., use PyPy. Or consider the costs of moving to a compiled
language.

-gps
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/af30aabe/attachment.html>

From p.f.moore at gmail.com  Thu May 28 19:15:11 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 18:15:11 +0100
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAP7+vJK5XhAZnxrkPW1aE7kskgJ__YqsnztG4tc2tdm+gkf1YQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CAP7+vJK5XhAZnxrkPW1aE7kskgJ__YqsnztG4tc2tdm+gkf1YQ@mail.gmail.com>
Message-ID: <CACac1F_ZxxVsQ3iDeGLSAydR=hSHdsCQY5ytFQ4PiBKY2=b30Q@mail.gmail.com>

On 28 May 2015 at 17:47, Guido van Rossum <guido at python.org> wrote:
> Single-file binaries are indeed important. (Though in most cases they don't
> have to be totally stand-alone -- they can depend on a system python and its
> stdlib. At least in typical datacenter setups.) Have people looked at PEX (a
> format developed by Twitter) or Pants (which seems to be an open-source tool
> that can build PEX files)?

There appear to be problems with pex on Windows. I've reported a
couple of bugs, which appear to have been fixed (although I don't know
the timeframe, I lost interest before a fix was implemented). But
there are new ones I just found on a quick test.

I'd like pex to work, it looks like a nice tool. But I don't want to
be their Windows support resource, and it seems like they may not
currently have anyone else :-)

Paul

(I'll probably go and lie down now and stop banging the cross-platform
drum for a while :-))

From rosuav at gmail.com  Thu May 28 19:15:41 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Fri, 29 May 2015 03:15:41 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
Message-ID: <CAPTjJmoXKfnOjCxDhcjEaFDxc_fqXEAhBediHWuJJsMHaHOTig@mail.gmail.com>

On Fri, May 29, 2015 at 3:04 AM, Brian Curtin <brian at python.org> wrote:
> Donald mentioned one earlier: command line utilities. I want a single
> CLI I can deploy to my customers that doesn't make them have to
> install Python or even know it's Python at all. My users write code in
> all types of languages on all OSes, but I should be able to produce
> one thing that they can all use. Donald himself initiated the CLI in
> particular I'm talking about, but Go is picking up steam here as we
> have other utilities that quickly solved the "write one thing, every
> user can run it immediately, no one knows/cares what it's written in" problem.

Unix-like systems have this courtesy of the shebang, so as long as
there's some sort of Python installed, people don't need to know or
care that /usr/local/bin/mailmail is implemented in Python. Maybe the
solution is to push for Windows to always include a Python
interpreter, which would allow a tiny stub to go and call on that?
Obviously a full shebang concept would be a huge change to Windows,
but if a 4KB executable can go and locate the rest of Python, or open
up a web browser saying "Please install OS update KB123456", that
would do it for most end users.

ChrisA

From Steve.Dower at microsoft.com  Thu May 28 17:45:20 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 28 May 2015 15:45:20 +0000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.556736c6.581308f4.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
Message-ID: <BY1PR03MB146643D09B0C30D48AB529B6F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>

Donald Stufft wrote:
> On May 28, 2015 at 11:30:37 AM, Steve Dower (steve.dower at microsoft.com) wrote:
>> Donald Stufft wrote:
>> > Well Python 3.4.3 binary is 4kb for me, so you'd have that + your
>> > 1KB Python script + whatever
>> other pieces you need.
>>
>> For contrast, here are the things you need on Windows to be able to
>> get to an interactive prompt (I don't know how other platforms get this down
>> to 4KB...):
>>
>> * python.exe (or some equivalent launcher) 39KB
>> * python35.dll 3,788KB
>> * vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not
>> redistributable so doesn't count here)
>> * 26 files in Lib 343KB
>>
>> This gets you to ">>>", and basically everything after that is going to fail
> for some reason.
>> That's an unavoidable 4,257KB.
>>
>> The rest of the stdlib adds another ~16MB once you exclude the test
>> suite, so a fully functioning Python is not cheap. (Using compressed
>> .pyc's in a zip file can make a big difference here though, assuming
>> you're willing to trade CPU for HDD.)
>>
>> Cheers,
>> Steve
>>
>>
> 
> You don't need a "fully functioning Python" for a single-file binary, you only
> need enough to actually run your application. For example, if you're making an
> application that can download files over HTTP, you don't need to include parts
> of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite, csv, email,
> mailcap, mailbox, imaplib, nntplib, etc.
> 
> Of course, deciding which pieces you include in the zip file you're appending to
> the end of Python is up to whatever tool builds this executable, which doesn't
> need to be part of Python itself. If Python itself gained the ability to operate
> in that manner, then third-party tools could handle doing the optimizations
> where it only includes the things the application actually needs from the stdlib
> and excludes things it doesn't. The key thing here is that since you're doing a
> single-file binary, you don't need to have a Python which is suitable to execute
> random Python code, you only need one that is suitable to execute this
> particular code, so you can specialize what that includes.

Agreed, but the minimally functioning Python is barely under 5MB. That will be considered bloated and won't help us compete with Go, so we should find a better way to fix Python application distribution and stop getting so hung up on putting everything into a single executable file.

Cheers,
Steve


From barry at python.org  Thu May 28 19:19:29 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 28 May 2015 13:19:29 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
Message-ID: <20150528131929.4687ef89@anarchist.wooz.org>

On May 28, 2015, at 09:23 AM, Chris Barker wrote:

>Barry Warsaw wrote:
>>I do think single-file executables are an important piece to Python's
>>long-term competitiveness.
>
>Really? It seems to me that desktop development is dying. What are the
>critical use-cases for a single file executable?
>
>And I'd note that getting a good way to use Python to develop for iOS,
>Android, and Mobile Windows is FAR more critical!  -- maybe that's the same
>problem ?

Well, in my world they are the same problem!  With mobile, IoT, etc. you can't
or shouldn't assume that Python will be available in the base environment.
There are ways to deploy to the various new world of devices that perhaps
don't require single-file executables, but those are more complicated to
manage, and they still have to ship the Python environment along with the
application code.

Cheers,
-Barry


From donald at stufft.io  Thu May 28 19:20:06 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 13:20:06 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
Message-ID: <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>



On May 28, 2015 at 12:54:34 PM, Chris Angelico (rosuav at gmail.com) wrote:
> On Fri, May 29, 2015 at 2:28 AM, Chris Barker wrote:
> > oops, sorry -- I see this was addressed in another thread. Though I guess I
> > still don't see why "single file" is critical, over "single thing to
> > install" -- like a OS-X app bundle that can just be dragged into the
> > Applications folder.
>  
> There's also "single thing to uninstall", which IMO is more important.
> If I download a tiny program that's supposed to just do one tiny
> thing, and it has to install itself into Program Files, Common Files,
> Windows\System32, and Documents & Settings\my-user-name\Applications,
> then I have to hope it has a proper uninstaller. If it's a single
> executable that just does its stuff (or, failing that, a single zip
> file that I extract to anywhere and run the program), I can expect
> that deleting that file (or directory) will get rid of it all. Of
> course, it's entirely possible that it's gone and left its droppings
> all over the system, but that's a matter of trust - a legit program
> won't lie about that.
>  
> Is this a Windows-specific issue, or is it also intended for Linux and
> Mac OS, where there'll already be a system Python (so a
> single-file-executable would be used to be independent of the system
> Python)?
>  

I think it's an issue for all platforms, even when there is a system Python
that can be used.

Here's why:

* Even on Linux systems Python isn't always a guaranteed thing to be installed;
  for instance, Debian works just fine without any Python installed.

* On OS X you have the system Python, yes, but it is in an unknown state. It
  could have any number of changes made to it or things installed or what have
  you.

* Even if you have Python installed already, is it the right one? What if it's
  an ancient RHEL box that has 2.6 or (heaven forbid) 2.4? What if it's a not-
  ancient box that has Python 2.7 but you want to deploy your app in Python 3?

* What if you have Python installed already, but it's been patched by the place
  you got it from and now the behavior is different than what you expected?

etc etc.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From brett at python.org  Thu May 28 19:25:18 2015
From: brett at python.org (Brett Cannon)
Date: Thu, 28 May 2015 17:25:18 +0000
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <CAP1=2W6w-5qz18Fo0rbe3rdG9wu4bLXg1C4PPovHfuORkc=gww@mail.gmail.com>

 On Thu, May 28, 2015, 12:14 Barry Warsaw <barry at python.org> wrote:

Go seems to be popular where I work.  It is replacing Python in a number of
places, although Python (and especially Python 3) is still a very important
part of our language toolbox.

There are several reasons why Go is gaining popularity.  Single-file
executables are definitely one; they make deployment very easy, even if they
increase the maintenance burden (e.g. without shared libraries, you have
multiple copies of things, so when a security fix is required for one of those
things you have to recompile the world).

Start-up times and memory footprint are also factors.  Probably not much to be
done about the latter, but perhaps PEP 432 can lead to improvements in the
former.  (Hey Nick, I'm guessing you'll want to bump that one back to 3.6.)

Certainly better support for multi-cores comes up a lot.  It should be a SMoE
to just get rid of the GIL once and for all <wink>.

One thing I've seen more than once is that new development happens in Python
until the problem is understood, then the code is ported to Go.  Python's
short path from idea to working code, along with its ability to quickly morph
as requirements and understanding change, its batteries-included philosophy,
and its "fits-your-brain" consistency are its biggest strengths!

On May 28, 2015, at 10:37 AM, Donald Stufft wrote:

>I think docker is a pretty crummy answer to Go's static binaries. What I would
>love is for Python to get:
>
>* The ability to import .so modules via zipimport (ideally without a
>temporary directory, but that might require newer APIs from libc and such).

+1 - Thomas Wouters mentioned at the language summit some work being done on
glibc to add dlopen_from_memory() (sp?) which would allow for loading .so
files directly from a zip.  Not sure what the status is of that, but it would
be a great addition.

>* The ability to create a "static" Python that links everything it needs into
>the binary to do a zipimport of everything else (including the stdlib).

+1

>* The ability to execute a zipfile that has been concatenated onto the end of
>the Python binary.

+1

>I think that if we get all of that, we could easily create a single-file
>executable with real, native support from Python by simply compiling Python
>in that static mode and then appending a zip file containing the standard
>library and any other distributions we need to the end of it.
>
>We'd probably want some more quality-of-life improvements around accessing
>resources from within that zip file as well, but that can be done as a
>library more easily than the above three things can.

E.g. you really should be using the pkg_resources APIs for loading resources
from your packages, otherwise you're gonna have problems with zip
executables.  We've talked before about adopting some of these APIs into
Python's stdlib.  pkgutil is a start, and the higher level APIs from
pkg_resources should probably go there.

Donald Stufft proposed importlib.resources a little while back to handle
the storage-agnostic API for reading data, and I have been thinking about it
for years. I plan to make it happen in Python 3.6.
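Until something like that lands, pkgutil.get_data already gives a storage-agnostic read in the stdlib. A minimal sketch (the package and resource names below are hypothetical):

```python
# Minimal sketch of storage-agnostic resource reading with the stdlib's
# pkgutil.get_data -- it works whether the package lives on the filesystem
# or inside a zip archive, which matters for zip executables.
import pkgutil

def read_resource(package, resource):
    """Return the resource's bytes, or None if it cannot be located."""
    return pkgutil.get_data(package, resource)
```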

-brett


Cheers,
-Barry

_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe:
https://mail.python.org/mailman/options/python-dev/brett%40python.org
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/31967f52/attachment.html>

From barry at python.org  Thu May 28 19:25:25 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 28 May 2015 13:25:25 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.556745e9.61ba8ba3.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
 <etPan.556745e9.61ba8ba3.12a4d@Draupnir.home>
Message-ID: <20150528132525.7a289c0c@anarchist.wooz.org>

On May 28, 2015, at 12:44 PM, Donald Stufft wrote:

>Pex would be improved by having native support for importing .so's from within
>a zipfile via zipimport. It would also be improved by having good, built-in
>support for extraneous resources in the stdlib too.

Completely agree on both points.  Having an API for importing .so's from a zip
would be really useful.  Today that can be implemented as "copy to tempdir"
and tomorrow the implementation could optionally dlopen_from_memory() without
any client code changing.
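A rough sketch of that copy-to-tempdir strategy (the function names are illustrative only, not a real or proposed API):

```python
# Rough sketch of the "copy to tempdir" strategy for loading a shared
# library that ships inside a zip archive. The function names are
# illustrative; a future zipimport using dlopen_from_memory() could sit
# behind the same interface without client code changing.
import ctypes
import tempfile
import zipfile

def extract_member(zip_path, member, dest_dir=None):
    """Copy one archive member out to a real file and return its path."""
    dest_dir = dest_dir or tempfile.mkdtemp(prefix="zipso-")
    with zipfile.ZipFile(zip_path) as zf:
        return zf.extract(member, path=dest_dir)

def load_shared_from_zip(zip_path, member):
    """Extract a .so/.dll from the archive, then load it via ctypes."""
    return ctypes.CDLL(extract_member(zip_path, member))
```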

>Right, it would be great to get it built into Python itself, but I consider
>that less important than getting the critical pieces into Python. If those
>pieces are there then we can iterate outside of the standard library and try
>different approaches to *building* such a file, and eventually take a look at
>the landscape and bless one approach (or not, if we don?t want to).

Sounds good to me!

Cheers,
-Barry
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 819 bytes
Desc: OpenPGP digital signature
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/8182b9dc/attachment.sig>

From rymg19 at gmail.com  Thu May 28 19:32:57 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Thu, 28 May 2015 12:32:57 -0500
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
	dispatch for Python 2)
In-Reply-To: <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
Message-ID: <B0CCF1CF-D6A7-4CE6-9043-37BE1A5EAE8C@gmail.com>

py2exe tends to invoke DLL hell if you have various versions of VS or Office or both installed. Because Windows.


On May 28, 2015 11:23:57 AM CDT, Chris Barker <chris.barker at noaa.gov> wrote:
>I'm confused:
>
>Doesn't py2exe (optionally) create a single file executable?
>
>And py2app on the Mac creates an application bundle, but that is
>more-or-less the equivalent on OS-X (you may not even be able to have a
>single file executable that can access the Window Manager, for
>instance)
>
>Depending on what extra packages you need, py2exe's single file doesn't
>always work, but last I tried, it worked for a fair bit (I think all of
>the
>stdlib).
>
>I don't know what PyInstaller or others create. And I have no idea if
>there
>is a linux option -- but it seems like the standard of practice for an
>application for linux is a bunch of files scattered over the system
>anyway
>:-)
>
>Yes, the resulting exe is pretty big, but it does try to include only
>those
>modules and packages that are used, and that kind of optimization could
>be
>improved in any case.
>
>So is something different being asked for here?
>
>Barry Warsaw wrote:
>>> I do think single-file executables are an important piece to
>Python's long-term
>competitiveness.
>
>Really? It seems to me that desktop development is dying. What are the
>critical use-cases for a single file executable?
>
>And I'd note that getting a good way to use Python to develop for iOS,
>Android, and Mobile Windows is FAR more critical!  -- maybe that's the
>same
>problem ?
>
>-Chris
>
>
>On Thu, May 28, 2015 at 8:39 AM, Donald Stufft <donald at stufft.io>
>wrote:
>
>>
>>
>> On May 28, 2015 at 11:30:37 AM, Steve Dower
>(steve.dower at microsoft.com)
>> wrote:
>> > Donald Stufft wrote:
>> > > Well Python 3.4.3 binary is 4kb for me, so you'd have that + your
>1KB
>> Python script + whatever
>> > other pieces you need.
>> >
>> > For contrast, here are the things you need on Windows to be able to
>get
>> to an interactive
>> > prompt (I don't know how other platforms get this down to 4KB...):
>> >
>> > * python.exe (or some equivalent launcher) 39KB
>> > * python35.dll 3,788KB
>> > * vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is
>not
>> redistributable
>> > so doesn't count here)
>> > * 26 files in Lib 343KB
>> >
>> > This gets you to ">>>", and basically everything after that is
>going to
>> fail for some reason.
>> > That's an unavoidable 4,257KB.
>> >
>> > The rest of the stdlib adds another ~16MB once you exclude the test
>> suite, so a fully functioning
>> > Python is not cheap. (Using compressed .pyc's in a zip file can
>make a
>> big difference here
>> > though, assuming you're willing to trade CPU for HDD.)
>> >
>> > Cheers,
>> > Steve
>> >
>> >
>>
>> You don?t need a "fully functioning Python" for a single file binary,
>you
>> only
>> need enough to actually run your application. For example, if you're
>making
>> an application that can download files over HTTP, you don't need to
>include
>> parts of the stdlib like xmlrpc, pickle, shelve, marshall, sqlite,
>csv,
>> email,
>> mailcap, mailbox, imaplib, nntplib, etc.
>>
>> Of course deciding which pieces you include in the zip file you're
>> appending
>> to the end of Python is up to whatever tool builds this executable
>which
>> doesn't need to be part of Python itself. If Python itself gained the
>> ability
>> to operate in that manner than third party tools could handle trying
>to do
>> the
>> optimizations where it only includes the things it actually needs in
>the
>> stdlib
>> and excludes things it doesn't. The key thing here is that since
>you're
>> doing
>> a single file binary, you don't need to have a Python which is
>suitable to
>> execute random Python code, you only need one that is suitable to
>execute
>> this
>> particular code so you can specialize what that includes.
>>
>> ---
>> Donald Stufft
>> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>>
>>
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>>
>https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov
>>
>
>
>
>-- 
>
>Christopher Barker, Ph.D.
>Oceanographer
>
>Emergency Response Division
>NOAA/NOS/OR&R            (206) 526-6959   voice
>7600 Sand Point Way NE   (206) 526-6329   fax
>Seattle, WA  98115       (206) 526-6317   main reception
>
>Chris.Barker at noaa.gov
>
>
>------------------------------------------------------------------------
>
>_______________________________________________
>Python-Dev mailing list
>Python-Dev at python.org
>https://mail.python.org/mailman/listinfo/python-dev
>Unsubscribe:
>https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/bb859e3a/attachment.html>

From rymg19 at gmail.com  Thu May 28 19:40:08 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Thu, 28 May 2015 12:40:08 -0500
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
	dispatch for Python 2)
In-Reply-To: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <5E1F5753-38B2-4264-9E4F-D3380844834C@gmail.com>

I agree that size is an issue, but is it really that bad? Just compare it to the recent "web surge" where everyone is writing desktop apps in HTML5+CSS+JS and bundling a huge WebKit engine in their app's binary.

Python on Windows is seriously in a bad state. IMO, what needs to be prioritized is the ability to make exes that *actually work* with nicer GUI abilities. py2exe gives too many random DLL errors and PyInstaller is an ugly hack that just shoves 20 DLLs with your executable. Mixed with the fact that TkInter looks even uglier when built via py2exe and almost everything else (PyGI, PySide, etc.) requires yet another 20 DLLs (PySide threw in Qt DLLs that I didn't even use!), it's sad. Really sad.

This is most of the reason I write programs that I plan on distributing to various crowds in some statically compiled language (C++, Nim, Felix, not Go) with (when necessary) a statically-linked GUI library. Less DLL hell, fewer loose files, etc.

Oh yeah, and add to that the problems with running both Python 2 and 3 on Windows while using some binaries that want 3 and others that want 2. It's painful.


On May 28, 2015 10:30:33 AM CDT, Steve Dower <Steve.Dower at microsoft.com> wrote:
>Donald Stufft wrote:
>> Well Python 3.4.3 binary is 4kb for me, so you'd have that + your 1KB
>Python script + whatever other pieces you need.
>
>For contrast, here are the things you need on Windows to be able to get
>to an interactive prompt (I don't know how other platforms get this
>down to 4KB...):
>
>* python.exe (or some equivalent launcher) 39KB
>* python35.dll 3,788KB
>* vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not
>redistributable so doesn't count here)
>* 26 files in Lib 343KB
>
>This gets you to ">>>", and basically everything after that is going to
>fail for some reason. That's an unavoidable 4,257KB.
>
>The rest of the stdlib adds another ~16MB once you exclude the test
>suite, so a fully functioning Python is not cheap. (Using compressed
>.pyc's in a zip file can make a big difference here though, assuming
>you're willing to trade CPU for HDD.)
>
>Cheers,
>Steve
> 
>_______________________________________________
>Python-Dev mailing list
>Python-Dev at python.org
>https://mail.python.org/mailman/listinfo/python-dev
>Unsubscribe:
>https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/6f083754/attachment.html>

From rosuav at gmail.com  Thu May 28 19:43:00 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Fri, 29 May 2015 03:43:00 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
Message-ID: <CAPTjJmohzZCyqF_gz5EpM+URtx=14VEwrtdAtfX18qKiZ4hvEg@mail.gmail.com>

On Fri, May 29, 2015 at 3:20 AM, Donald Stufft <donald at stufft.io> wrote:
> On May 28, 2015 at 12:54:34 PM, Chris Angelico (rosuav at gmail.com) wrote:
>> Is this a Windows-specific issue, or is it also intended for Linux and
>> Mac OS, where there'll already be a system Python (so a
>> single-file-executable would be used to be independent of the system
>> Python)?
>>
>
> I think it's an issue for all platforms, even when there is a system Python
> that can be used.
>
> Here's why:
>
> * Even on Linux systems Python isn't always a guaranteed thing to be installed;
>   for instance, Debian works just fine without any Python installed.
>
> * On OS X you have the system Python, yes, but it is in an unknown state. It
>   could have any number of changes made to it or things installed or what have
>   you.
>
> * Even if you have Python installed already, is it the right one? What if it's
>   an ancient RHEL box that has 2.6 or (heaven forbid) 2.4? What if it's a not-
>   ancient box that has Python 2.7 but you want to deploy your app in Python 3?
>
> * What if you have Python installed already, but it's been patched by the place
>   you got it from and now the behavior is different than what you expected?

The independence argument. Yep, reasonable. The trouble is that if
everyone seeks to be independent of the system Python, it makes the
system Python redundant with every single Python app; and the bundled
Python will have to be platform-specific (not just OS, but
architecture, word size, possibly OS version in some cases, etc, etc).
And that independence also means you miss out on
security/compatibility updates from upstream, so you're having to
manage your own updates.

I'd still much rather see a small stub that goes and calls on a system
Python than something that has to duplicate all of Python into every
binary, but it's a choice for the application developer to make. If
there can at least be an easy way to strip off the self-executability
header and get back the base Python application, that would be a big
help - you could have a script that goes through all your executables,
looks for the signature that says "hi, I'm a Python 2.7.10
self-runner", and strips it off in favour of a 2.7.11 that fixes some
critical issue. Or you could strip that off and run the underlying
Python code on a different OS. Binary blobs are so often unworkable.
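One point in favour of the concat trick: because the zip index lives at the *end* of the file, the appended archive stays readable without stripping anything first. A minimal sketch (the stub bytes and file names here are invented):

```python
# Sketch showing that a zip appended to a launcher is still a readable
# zip: the zip central directory sits at the end of the file, so the
# zipfile module tolerates arbitrary prepended bytes when reading.
import io
import zipfile

def build_self_runner(stub: bytes, members: dict) -> bytes:
    """Simulate launcher-plus-archive: stub bytes, then a zip of members."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name, data in members.items():
            zf.writestr(name, data)
    return stub + buf.getvalue()

def read_app(blob: bytes, name: str) -> bytes:
    """Read a member back out, ignoring whatever stub precedes the archive."""
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        return zf.read(name)
```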

ChrisA

From p.f.moore at gmail.com  Thu May 28 19:52:22 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 18:52:22 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
Message-ID: <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>

On 28 May 2015 at 18:04, Brian Curtin <brian at python.org> wrote:
> Donald mentioned one earlier: command line utilities. I want a single
> CLI I can deploy to my customers that doesn't make them have to
> install Python or even know it's Python at all.

Yep, that's the killer for me as well.

I know it's unrealistic in some sense, but my benchmark is what does a
"Hello, world" program look like? In C, it's a 38K executable with no
external dependencies. What does it look like in Python? (I'm not too
worried if it's a bit bigger, but if it's a *lot* bigger that starts
to be noticeable - "Python generates bloated exes").

What I'd like to be able to do is to write Python ports of a range of
core Unix utilities (comm, cut, join, od, seq, tr, ...) and have them
be viable alternatives to my current C builds (a series of single-file
100-200k static exes).

On 28 May 2015 at 18:15, Chris Angelico <rosuav at gmail.com> wrote:
> Unix-like systems have this courtesy of the shebang, so as long as
> there's some sort of Python installed, people don't need to know or
> care that /usr/local/bin/mailmail is implemented in Python. Maybe the
> solution is to push for Windows to always include a Python
> interpreter, which would allow a tiny stub to go and call on that?

Unfortunately (and believe me, I've been down this road many times) on
Windows *only* the exe format is a "first-class" executable.
Executable scripts and shebangs are very useful, but there are always
corner cases where they don't work *quite* like an exe. On Windows,
you have to be prepared to ship an exe if you want to compete with
languages that generate exes.

Having said that, the trick of appending a zipfile to an exe (or
similar) is already common practice in the Python world, and works
really well. Vinay Sajip's pyzzer is a good example of this approach.
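For concreteness, the append-a-zip idea can be sketched with the stdlib
zipapp module (new in 3.5); pyzzer does essentially this with more options.
The "myapp" layout below is invented for the example:

```python
# Minimal sketch of building a single runnable zip application with the
# stdlib zipapp module (Python 3.5+). The "myapp" package is made up here.
import os
import zipapp

# Build a throwaway package with a __main__.py entry point.
os.makedirs("myapp", exist_ok=True)
with open("myapp/__main__.py", "w") as f:
    f.write("print('hello from a zipped app')\n")

# Pack it into a single archive; the interpreter argument becomes the
# shebang line on Unix, so the result is directly executable there.
zipapp.create_archive("myapp", target="myapp.pyz",
                      interpreter="/usr/bin/env python3")

# The result is an ordinary zip file with a shebang prefix, so
# "python myapp.pyz" (or "./myapp.pyz" on Unix) runs it.
```

On Windows the `.pyz` extension is associated with the launcher by the
installer, which is what makes the same archive double-clickable there.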

On 28 May 2015 at 16:45, Steve Dower <Steve.Dower at microsoft.com> wrote:
> Agreed, but the minimally functioning Python is barely under 5MB. That will be
> considered bloated and won't help us compete with Go, so we should find a better
> way to fix Python application distribution and stop getting so hung up on putting
> everything into a single executable file.

There's a perception issue here. You can compile C# code into small
exes (that's how people think) and "all" you need is the .net runtime
installed. If we shipped a "pyc" compiler that "compiled" Python code
into small exes that "just needed the Python runtime installed", would
that feel the same to people? Would they be happy to view that as
comparable to Go-compiled executables? (I assume Go *doesn't* rely on
a separate runtime, though.)

Nevertheless, I would like to understand how Unix can manage to have a
Python 3.4.3 binary at 4kb. Does that *really* have no external
dependencies (other than the C library)? Are we really comparing like
with like here?

Paul

From Steve.Dower at microsoft.com  Thu May 28 16:28:55 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Thu, 28 May 2015 14:28:55 +0000
Subject: [Python-Dev] Usability of the limited API
In-Reply-To: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
References: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
Message-ID: <BY1PR03MB1466903FF0712CA9595A5B99F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>

Zach has a patch to automate putting the right exports in python3.dll, which I'm strongly in favor of, but it was rejected because people may have added APIs that aren't meant to be stable.

Right now, you can #include a number of prototypes that aren't actually available because there are two places to update and so one (in this case, the DLL) doesn't get updated.

I think the current plan is to remove everything not currently in the DLL from the stable ABI and force people to add them back manually. This way we can enable the generator without committing to a large set of new APIs.

I don't have the issue number handy, but it should be near the top of the recently modified list.

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Paul Moore<mailto:p.f.moore at gmail.com>
Sent: ?5/?28/?2015 7:12
To: Python Dev<mailto:python-dev at python.org>
Subject: [Python-Dev] Usability of the limited API

With Python 3.5 shipping an embeddable copy of the interpreter on
Windows, I thought I'd try out a simple embedded interpreter as an
experiment. I wanted to use the limited API, as I'd rather it were
easy to upgrade the interpreter without recompiling the embedding app.

But the "Very high-level embedding" example in the docs doesn't
compile with the limited API.

#include <Python.h>

int
main(int argc, char *argv[])
{
    wchar_t *program = Py_DecodeLocale(argv[0], NULL);
    if (program == NULL) {
        fprintf(stderr, "Fatal error: cannot decode argv[0]\n");
        exit(1);
    }
    Py_SetProgramName(program);  /* optional but recommended */
    Py_Initialize();
    PyRun_SimpleString("from time import time,ctime\n"
                       "print('Today is', ctime(time()))\n");
    Py_Finalize();
    PyMem_RawFree(program);
    return 0;
}

The Py_DecodeLocale/Py_SetProgramName/PyMem_RawFree bit can probably
be replaced by a Py_SetProgramName call specifying a static value,
it's not exactly crucial. (Py_DecodeLocale appears to be declared as part of
the limited API by the headers, but not exported from python3.dll, by
the way, which implies that something's out of sync.)

But PyRun_SimpleString doesn't appear to be exposed in the limited
API, even though https://docs.python.org/3/c-api/veryhigh.html doesn't
mention this, and https://docs.python.org/3/c-api/stable.html says
that functions not part of the stable API will be marked as such.

I dumped out the exported symbols from python3.dll, which is the
simplest way I could think of finding out what is in the limited API
(it's hardly user friendly, but never mind). And frustratingly, none
of the very high level PyRun_XXX APIs are available.

At this point, I think I'll probably just give up and use the full
API, but it does make me question whether the limited API is actually
usable as it stands.

I was hoping to be able to suggest as an application bundling option
that people could write a trivial wrapper script in C to fire up a
Python script, and bundle that along with its dependencies and the
embeddable Python distribution. Looks like that's doable, but only
using the full API, which makes upgrading the bundled Python
interpreter a bit messier. Ah, well, no huge loss :-(

But after this experiment, I do wonder - is the limited API really a
viable option for embedders?

Paul
_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/1c2c474a/attachment.html>

From wes.turner at gmail.com  Thu May 28 20:03:09 2015
From: wes.turner at gmail.com (Wes Turner)
Date: Thu, 28 May 2015 13:03:09 -0500
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_xbuFXh90Yc9Zvrv-FyfVEZc-YNi52b49diaje9_uP_w@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
 <CACac1F_xbuFXh90Yc9Zvrv-FyfVEZc-YNi52b49diaje9_uP_w@mail.gmail.com>
Message-ID: <CACfEFw_9xY=OBsKNqV7zuRRxfpob3LqFJcxMd53wVPbZrx8yLg@mail.gmail.com>

On Thu, May 28, 2015 at 11:38 AM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 28 May 2015 at 16:58, Barry Warsaw <barry at python.org> wrote:
> > On May 28, 2015, at 11:39 AM, Donald Stufft wrote:
> >
> >>You don't need a "fully functioning Python" for a single file binary, you
> >>only need enough to actually run your application. For example, if you're
> >>making an application that can download files over HTTP, you don't need
> to
> >>include parts of the stdlib like xmlrpc, pickle, shelve, marshal,
> sqlite,
> >>csv, email, mailcap, mailbox, imaplib, nntplib, etc.
> >
> > There are actually two related but different use cases to "single file
> > executables".
> >
> > The first is nicely solved by tools like pex, where you don't need to
> include
> > a fully functional Python at the head of the zip file because the
> environment
> > you're deploying it into will have enough Python to make the zip work.
> This
> > can certainly result in smaller zip files.  This is the approach I took
> with
> > Snappy Ubuntu Core support for Python 3, based on the current situation
> that
> > the atomic upgrade client is written in Python 3.  If that changes and
> Python
> > 3 is removed from the image, then this approach won't work.
> >
> > pex (and others) does a great job at this, so unless there are things
> better
> > refactored into upstream Python, I don't think we need to do much here.
>
> One problem with pex is that it doesn't appear to work on Windows (I
> just gave it a try, and got errors because it relies on symlinks).
>
> IMO, any solution to "distributing Python applications" that is
> intended to compete with the idea that "go produces nice single-file
> executables" needs to be cross-platform. At the moment, zipapp (and in
> general, the core support for running applications from a zip file)
> handles this for the case where you're allowed to assume an already
> installed Python interpreter. The proviso here, as Donald pointed out,
> is that it doesn't handle C extensions.
>
> The biggest problem with 3rd-party solutions is that they don't always
> support the full range of platforms that Python supports. That's fine
> for a 3rd party tool, but if we want to have a response to people
> asking how to bundle their application written in Python, we need a
> better answer than "if you're on Windows, use py2exe, or if you're on
> Unix use pex, or maybe..."
>
> Python has core support for the equivalent of Java's jar format in
> zipapp. It's not well promoted (and doesn't support C extensions) but
> it's a pretty viable option for a lot of situations.
>
> > The second use case is as you describe: put a complete functional Python
> > environment at the head of the zip file so you don't need anything in the
> > target deployment environment.  "Complete" can easily mean the entire
> stdlib,
> > and although that would usually be more bloated than you normally need,
> hey,
> > it's just some extra unused bits so who cares? <wink>.  I think this
> would be
> > an excellent starting point which can be optimized to trim unnecessary
> bits
> > later, maybe by third party tools.
>
> Tools like py2exe and cx_Freeze do this, and are pretty commonly used
> on Windows. An obvious example of use is Mercurial. If you're looking
> at this scenario, a good place to start would probably be
> understanding why cx_Freeze isn't more commonly used on Unix (AFAIK,
> it supports Unix, but I've only ever really heard of it being used on
> Windows).
>

Esky https://github.com/cloudmatrix/esky/

* supports "py2exe, py2app, cxfreeze and bbfreeze"
* builds a zip archive containing an .exe
* manages (failed) [auto-]updates

PEX https://pantsbuild.github.io/pex_design.html

* adds an executable header to a (topo-sorted?) ZIP file with a minimal path

* pipsi https://github.com/mitsuhiko/pipsi/blob/master/pipsi.py
  * installs packages with console_scripts into separate virtualenvs with
minimal sys.paths and ~/.local/bin)

At the end of the day I still need packaging, config management, or Nix
for checksums (a manifest wrapped around the executable wrapper).


>
> I suspect "single file executables" just aren't viewed as a desirable
> solution on Unix. Although Donald referred to a 4K binary, which
> probably means just a stub exe that depends on system-installed .so
> files, likely including Python (I'm just guessing here). It's easy to
> do something similar on Windows, but it's *not* what most Windows
> users think of when you say a "single file executable for a Python
> program" (because there's no system package manager doing dependencies
> for you).
>

NuGet, Chocolatey, -> OneGet

It's a topologically sorted adjacency list + build + install + uninstall
scripts.


>
> Again, platform-specific answers are one thing, and are relatively
> common, but having a good cross-platform answer at the language level
> (a section on docs.python.org "How to ship your Python program") is
> much harder.
>
> >>Of course deciding which pieces you include in the zip file you're
> appending
> >>to the end of Python is up to whatever tool builds this executable which
> >>doesn't need to be part of Python itself. If Python itself gained the
> ability
> >>to operate in that manner then third party tools could handle trying to
> do
> >>the optimizations where it only includes the things it actually needs in
> the
> >>stdlib and excludes things it doesn't. The key thing here is that since
> >>you're doing a single file binary, you don't need to have a Python which
> is
> >>suitable to execute random Python code, you only need one that is
> suitable to
> >>execute this particular code so you can specialize what that includes.
> >
> > I'd love to see Python itself gain such a tool, but if it had the
> critical
> > pieces to execute in this way, that would enable a common approach to
> > supporting this in third party tools, on a variety of platforms.
>
> Stripping out unused code is a hard problem in a language as dynamic
> as Python. It would be great to see it happen, but I'm not sure how
> much better we can do than existing tools like modulefinder. (consider
> that stripping out parts of the stdlib is the same in principle as
> stripping out unused bits of a 3rd party library like requests - when
> this issue comes up, people often talk about slimming down the stdlib
> to just what's needed, but why not take out the json support from
> requests if you don't use it?)
>
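The modulefinder approach mentioned above can be sketched in a few lines (a
minimal stdlib-only example; the analyzed script is invented here):

```python
# Minimal modulefinder sketch: list the modules a script pulls in.
# The analyzed file "tiny.py" is a made-up example written on the fly.
from modulefinder import ModuleFinder

with open("tiny.py", "w") as f:
    f.write("import json\nprint(json.dumps({'ok': True}))\n")

finder = ModuleFinder()
finder.run_script("tiny.py")

# finder.modules maps dotted module names to Module objects; this is
# the raw data a "strip the unused stdlib" tool would start from.
found = sorted(finder.modules)
print("json" in found)
```

The limitation is exactly the one described: dynamic imports
(`__import__`, importlib, plugins loaded by name) are invisible to this
static analysis, which is why slimming tools tend to over- or under-include.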

Is this what they do for AppEngine / AppScale Python?


>
> > I do think single-file executables are an important piece to Python's
> > long-term competitiveness.
>
> Agreed. But also, I think that "single-file" executables
> (single-directory, in practice) are *already* important - as I say,
> for projects like Mercurial. Doing better is great, but we could do
> worse than start by asking the Mercurial/TortoiseHg project and others
> what are the problems with the current situation that changes to the
> core could help to improve. I doubt "please make pythonXY.zip 50%
> smaller" would be the key issue :-)
>

"Select your platform" (According to User-Agent)

From tjreedy at udel.edu  Thu May 28 20:08:53 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 28 May 2015 14:08:53 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <mk7lka$uig$1@ger.gmane.org>

On 5/28/2015 10:55 AM, Steve Dower wrote:

> And it would look like a 20MB+ file just for a simple 1KB Python
> script...
>
> For Windows at least, I'd prefer to have some app-style installer
> generation (e.g. http://pynsist.readthedocs.org/en/latest/) which,
> combined with the embeddable Python distro (new for 3.5.0b1 in case
> anyone missed it), can simply extract everything into an install
> directory and run it from there. None of the items on the list above
> are needed for or would help with this.

What I wish, of course, is that Windows just came with Python3, the way 
that DOS came with BASIC, so people could publish and trade Python 
programs the way we once did with BASIC programs.  Then a simple 1KB 
Python script would just take an extra 1KB on disk.  To me, the removal 
of a simple, builtin programming language for everyone was the biggest 
Windows mistake.

Failing that, maybe PSF & edu-sig could somehow encourage universities 
that require students to have a computer to recommend or even require 
that Python be loaded, so students could at least assume that other 
students have Python already loaded.  Python is the BASIC of the 21st century.

Somehow, trying to make it easier to have 50 duplicate copies of Python 
on a system seems the wrong direction to go.

-- 
Terry Jan Reedy


From mal at egenix.com  Thu May 28 20:13:50 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Thu, 28 May 2015 20:13:50 +0200
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528115834.69284cb1@anarchist.wooz.org>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>	<etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
Message-ID: <55675ADE.7090803@egenix.com>

You might want to have a look at eGenix PyRun, which gives you
an almost complete Python runtime in 4-13MB (depending on what
startup performance needs you have):

http://www.egenix.com/products/python/PyRun/

On 28.05.2015 17:58, Barry Warsaw wrote:
> On May 28, 2015, at 11:39 AM, Donald Stufft wrote:
> 
>> You don't need a "fully functioning Python" for a single file binary, you
>> only need enough to actually run your application. For example, if you're
>> making an application that can download files over HTTP, you don't need to
>> include parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite,
>> csv, email, mailcap, mailbox, imaplib, nntplib, etc.
> 
> There are actually two related but different use cases to "single file
> executables".
> 
> The first is nicely solved by tools like pex, where you don't need to include
> a fully functional Python at the head of the zip file because the environment
> you're deploying it into will have enough Python to make the zip work.  This
> can certainly result in smaller zip files.  This is the approach I took with
> Snappy Ubuntu Core support for Python 3, based on the current situation that
> the atomic upgrade client is written in Python 3.  If that changes and Python
> 3 is removed from the image, then this approach won't work.
> 
> pex (and others) does a great job at this, so unless there are things better
> refactored into upstream Python, I don't think we need to do much here.
> 
> The second use case is as you describe: put a complete functional Python
> environment at the head of the zip file so you don't need anything in the
> target deployment environment.  "Complete" can easily mean the entire stdlib,
> and although that would usually be more bloated than you normally need, hey,
> it's just some extra unused bits so who cares? <wink>.  I think this would be
> an excellent starting point which can be optimized to trim unnecessary bits
> later, maybe by third party tools.

See above. This is what eGenix PyRun provides.

Our main motivation is to have a binary which works on all
Unix platforms, without having to rely on the way too many
system-dependent Python distributions (with all their quirks
and whistles ;-)).

On Windows, we use py2exe at the moment, but a port of PyRun
to Windows would be possible as well. You'd still need the
separate Python DLL, though, in order to stay compatible with
C extensions which link against it.

As for application packaging: we don't have a need to put
everything into a single ZIP file or even concatenate such
a ZIP file to PyRun (which is possible: just add sys.executable to
sys.path to import from the executable).
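That parenthetical trick works because zip archives are read from the end,
so zipimport tolerates arbitrary bytes in front of the archive. A rough
pure-Python sketch, with a dummy shell prefix standing in for the real
PyRun binary (all file names here are invented):

```python
# Sketch of the "zip appended to an executable" import trick.
# Zip directories live at the *end* of the file, so zipimport can read
# an archive even when arbitrary bytes (an exe) precede it.
import sys
import zipfile

# Build a zip containing an importable module...
with zipfile.ZipFile("payload.zip", "w") as zf:
    zf.writestr("greet.py", "MESSAGE = 'imported from appended zip'\n")

# ...and concatenate it onto a stand-in "executable".
with open("payload.zip", "rb") as payload, open("fake_exe", "wb") as out:
    out.write(b"#!/bin/sh\nexit 0\n")  # dummy prefix bytes
    out.write(payload.read())

# Adding the combined file to sys.path lets zipimport find the module,
# just as adding sys.executable would for a PyRun-style binary.
sys.path.insert(0, "fake_exe")
import greet
print(greet.MESSAGE)
```

This is the same mechanism py2exe-style tools rely on: the bytes before
the zip can be a real PE/ELF executable without confusing the importer.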

We have plans to create a tool to make such packaging possible,
though, since it's handy to have for building small executable
apps, e.g. to drive installations or larger applications.

>> Of course deciding which pieces you include in the zip file you're appending
>> to the end of Python is up to whatever tool builds this executable which
>> doesn't need to be part of Python itself. If Python itself gained the ability
> >>to operate in that manner then third party tools could handle trying to do
>> the optimizations where it only includes the things it actually needs in the
>> stdlib and excludes things it doesn't. The key thing here is that since
>> you're doing a single file binary, you don't need to have a Python which is
>> suitable to execute random Python code, you only need one that is suitable to
>> execute this particular code so you can specialize what that includes.
> 
> I'd love to see Python itself gain such a tool, but if it had the critical
> pieces to execute in this way, that would enable a common approach to
> supporting this in third party tools, on a variety of platforms.
> 
> I do think single-file executables are an important piece to Python's
> long-term competitiveness.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, May 28 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From rosuav at gmail.com  Thu May 28 20:22:43 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Fri, 29 May 2015 04:22:43 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
Message-ID: <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>

On Fri, May 29, 2015 at 3:52 AM, Paul Moore <p.f.moore at gmail.com> wrote:
> On 28 May 2015 at 18:15, Chris Angelico <rosuav at gmail.com> wrote:
>> Unix-like systems have this courtesy of the shebang, so as long as
>> there's some sort of Python installed, people don't need to know or
>> care that /usr/local/bin/mailmail is implemented in Python. Maybe the
>> solution is to push for Windows to always include a Python
>> interpreter, which would allow a tiny stub to go and call on that?
>
> Unfortunately (and believe me, I've been down this road many times) on
> Windows *only* the exe format is a "first-class" executable.
> Executable scripts and shebangs are very useful, but there are always
> corner cases where they don't work *quite* like an exe. On Windows,
> you have to be prepared to ship an exe if you want to compete with
> languages that generate exes.

I'm aware of that. When I said "a tiny stub", I was thinking in terms
of a small executable. The idea is that its sole purpose is to locate
Python someplace else, and chain to it; that has to be actual
executable code, complete with the 512-byte "MZ" header and
everything, to ensure compatibility. But it should be able to be
small, tight, and easy to verify correctness of, so there aren't (in
theory!) security exploits in the header itself.

ChrisA

From carl at oddbird.net  Thu May 28 20:07:17 2015
From: carl at oddbird.net (Carl Meyer)
Date: Thu, 28 May 2015 12:07:17 -0600
Subject: [Python-Dev] Single-file Python executables
In-Reply-To: <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
Message-ID: <55675955.5010505@oddbird.net>

On 05/28/2015 11:52 AM, Paul Moore wrote:
[snip]
> Nevertheless, I would like to understand how Unix can manage to have a
> Python 3.4.3 binary at 4kb. Does that *really* have no external
> dependencies (other than the C library)? Are we really comparing like
> with like here?

I don't know what Donald was looking at, but I'm not seeing anything
close to that 4k figure here. (Maybe he's on OS X, where framework
builds have a "stub" executable that just execs the real one?)

On my Ubuntu Trusty system, the system Python 3.4 executable is 3.9M,
and the one I compiled myself from source, without any special options,
is almost 12M. (Not really sure what accounts for that difference -
Ubuntu system Python uses shared libraries for more stuff?)

Carl


From donald at stufft.io  Thu May 28 20:33:59 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 14:33:59 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <mk7lka$uig$1@ger.gmane.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <mk7lka$uig$1@ger.gmane.org>
Message-ID: <etPan.55675f97.4794acf7.12a4d@Draupnir.home>



On May 28, 2015 at 2:11:02 PM, Terry Reedy (tjreedy at udel.edu) wrote:
> On 5/28/2015 10:55 AM, Steve Dower wrote:
>  
> > And it would look like a 20MB+ file just for a simple 1KB Python
> > script...
> >
> > For Windows at least, I'd prefer to have some app-style installer
> > generation (e.g. http://pynsist.readthedocs.org/en/latest/) which,
> > combined with the embeddable Python distro (new for 3.5.0b1 in case
> > anyone missed it), can simply extract everything into an install
> > directory and run it from there. None of the items on the list above
> > are needed for or would help with this.
>  
> What I wish, of course, is that Windows just came with Python3, the way
> that DOS came with BASIC, so people could publish and trade Python
> programs the way we once did with BASIC programs. Then a simple 1KB
> Python script would just take an extra 1KB on disk. To me, the removal
> of a simple, builtin programming language for everyone was the biggest
> Windows mistake.
>  
> Failing that, maybe PSF & edu-sig could somehow encourage universities
> that require students to have a computer to recommend or even require
> that Python be loaded, so students could at least assume that other
> students have Python already loaded. Python is the BASIC of the 21st century.
>  
> Somehow, trying to make it easier to have 50 duplicate copies of Python
> on a system seems the wrong direction to go.
>  
> --
> Terry Jan Reedy
>  

Honestly, I'm on an OS that *does* ship Python (OS X) and part of me hopes
that they stop shipping it. It's very rare that someone ships Python as
part of their OS without modifying it in some way, and those modifications
almost always cause pain to some set of users (and since I work on pip, they
tend to come to us with the weirdo problems). Case in point: Python on OS X
adds some preinstalled software, but they put this pre-installed software before
site-packages in sys.path, so pip can't upgrade those pre-installed software
packages at all.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From donald at stufft.io  Thu May 28 20:36:44 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 14:36:44 -0400
Subject: [Python-Dev] Single-file Python executables
In-Reply-To: <55675955.5010505@oddbird.net>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <55675955.5010505@oddbird.net>
Message-ID: <etPan.5567603c.7761bde0.12a4d@Draupnir.home>



On May 28, 2015 at 2:33:25 PM, Carl Meyer (carl at oddbird.net) wrote:
> On 05/28/2015 11:52 AM, Paul Moore wrote:
> [snip]
> > Nevertheless, I would like to understand how Unix can manage to have a
> > Python 3.4.3 binary at 4kb. Does that *really* have no external
> > dependencies (other than the C library)? Are we really comparing like
> > with like here?
> 
> I don't know what Donald was looking at, but I'm not seeing anything
> close to that 4k figure here. (Maybe he's on OS X, where framework
> builds have a "stub" executable that just execs the real one?)
> 
> On my Ubuntu Trusty system, the system Python 3.4 executable is 3.9M,
> and the one I compiled myself from source, without any special options,
> is almost 12M. (Not really sure what accounts for that difference -
> Ubuntu system Python uses shared libraries for more stuff?)
> 
> Carl
> 

The problem is I'm an idiot and did du -h against ``which python``, which of
course resolved to the symlink of python that points to python3.4. The real
executable on my OSX box is 2.6M (built using pyenv).

--- 
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From tjreedy at udel.edu  Thu May 28 20:37:44 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 28 May 2015 14:37:44 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.556745e9.61ba8ba3.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
 <etPan.556745e9.61ba8ba3.12a4d@Draupnir.home>
Message-ID: <mk7nad$rvh$1@ger.gmane.org>

On 5/28/2015 12:44 PM, Donald Stufft wrote:

>> I do think single-file executables are an important piece to Python's
>> long-term competitiveness.
>>
>
> I completely agree. I talk to a lot of people about packaging of things, and while
> I think there are some serious problems with huge parts of Go's packaging and
> distribution story, the static linking and compiling down to a "single" file is not
> one of them.

How about the following compromise between assuming the presence of 
installed python and ignoring the possibility of installed python: we 
provide (at least for Windows) a C startup file that checks for the 
needed version of python and offers to download and start the installer 
if not present.  A user would see 'Python 3.5 is needed to run this 
file; shall I download and install it?' just once on a particular machine.

-- 
Terry Jan Reedy



From chris.barker at noaa.gov  Thu May 28 21:24:28 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 12:24:28 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
Message-ID: <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>

OK, I'm really confused here:

1) what the heck is so special about Go all of a sudden? People have been
writing and deploying single file executables built with C and C++, and
whatever else, forever. (and indeed, it was a big sticking point for me
when I introduced python in my organization)

2) Why the sudden interest in this as a core Python issue? I've been using
Python for desktop apps, on primarily Windows and the Mac, for years -- and
had to deal with py2exe, py2app, etc. forever. And it has been a very very
common question on the various mailing lists for ages: how do I deploy
this? how do I make it easy to install? The answer from the developers of
cPython itself has always been that that's a third party problem -- go
look for py2exe and friends to solve it. And that it is a solved-enough
problem. The biggest "unsolved" issue is that you get a really big
application.

Don't get me wrong -- I've wanted for years for it to be easier to deploy
python-based apps as a single thing for users to easily install and
uninstall where they don't need to know it's python -- but what the heck is
different now?

3) There was mention of a platform-neutral way to do this. Isn't that
simply impossible? The platforms are different in all the ways that matter
for this problem: both technical differences, and conventions. Which isn't
to say you couldn't have one API to produce a single "thing" executable, so
it would look like one solution for multiple platforms to the user. But the
end product should be (would have to be) a different beast altogether.

And doesn't PyInstaller already provide that (maybe it can't do
single-file...)?

Anyway -- if there really is a new interest in this problem such that
people will put some time into, here are some thoughts I've had for ages:

The big one is separation of concerns: in order to build a single "thing"
executable, you need three things:
  a) An API for the developer to specify what they want
  b) Figure out what needs to be included -- what extra modules, etc.
  c) A way to package it all up: App bundle on the Mac, single file
executable on Windows (statically linked? zip file, ???)

That third one -- (c) -- is inherently platform dependent, and there "is
more than one way to do it" even on one platform. But it sure would be nice
if the API between a), b), and c) could be unified so we could mix and
match different implementations.

And, of course, if cPython itself could be built in a way that makes
step(c) easier/less kludgy great!
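Step (b) above -- figuring out what needs to be included -- already has a rough
stdlib starting point in modulefinder. A minimal sketch (the throwaway script
content here is made up purely for illustration):

```python
import os
import tempfile
from modulefinder import ModuleFinder

# Write a tiny throwaway script whose dependencies we want to discover.
src = os.path.join(tempfile.mkdtemp(), "app.py")
with open(src, "w") as f:
    f.write("import json\nprint(json.dumps({'ok': True}))\n")

# ModuleFinder statically walks the import graph without running the code.
finder = ModuleFinder()
finder.run_script(src)

needed = sorted(finder.modules)  # names of the modules the script pulls in
```

It misses dynamic imports and the internals of C extensions, which is exactly
where the py2exe-style tools end up layering hacks on top.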

-Chris


On Thu, May 28, 2015 at 9:54 AM, Donald Stufft <donald at stufft.io> wrote:

>
>
> On May 28, 2015 at 12:24:42 PM, Chris Barker (chris.barker at noaa.gov)
> wrote:
> > I'm confused:
> >
> > Doesn't py2exe (optionally) create a single file executable?
> >
> > And py2app on the Mac creates an application bundle, but that is
> > more-or-less the equivalent on OS-X (you may not even be able to have a
> > single file executable that can access the Window Manager, for instance)
> >
> > Depending on what extra packages you need, py2exe's single file doesn't
> > always work, but last I tried, it worked for a fair bit (I think all of
> > the stdlib).
> >
> > I don't know what PyInstaller or others create. And I have no idea if
> > there is a linux option -- but it seems like the standard of practice
> > for an application for linux is a bunch of files scattered over the
> > system anyway :-)
> >
> > Yes, the resulting exe is pretty big, but it does try to include only
> > those modules and packages that are used, and that kind of optimization
> > could be improved in any case.
> >
> > So is something different being asked for here?
>
> All of those solutions "work" to varying degrees, almost all of them
> rely on hacks in order to make things "work" because the ability to do
> it isn't built into Python itself. If the critical pieces to execute in
> this way were built into Python itself, then those tools would work a
> whole lot better than they currently do.
>
> >
> > Barry Warsaw wrote:
> > >> I do think single-file executables are an important piece to Python's
> > >> long-term competitiveness.
> >
> > Really? It seems to me that desktop development is dying. What are the
> > critical use-cases for a single file executable?
>
> The desktop isn't dying. Mobile is becoming a very important thing of
> course, but that's just because people are using devices *more* to
> account for the use of Mobile; they aren't really using their desktops
> less.
>
> See:
> http://blogs.wsj.com/cmo/2015/05/26/mobile-isnt-killing-the-desktop-internet/
>
> >
> > And I'd note that getting a good way to use Python to develop for iOS,
> > Android, and Mobile Windows is FAR more critical! -- maybe that's the
> > same problem?
> >
>
> It's not the same problem, but it's also not very relevant. Volunteer
> time isn't fungible; you get what people are willing to work on
> regardless of whether it will help Python as a whole. It's also not an
> either/or proposition: we can both improve our ability to develop under
> iOS/Android/etc and improve our ability to handle desktop applications.
>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>
>
>


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/89f3334c/attachment.html>

From sturla.molden at gmail.com  Thu May 28 21:25:17 2015
From: sturla.molden at gmail.com (Sturla Molden)
Date: Thu, 28 May 2015 19:25:17 +0000 (UTC)
Subject: [Python-Dev] Computed Goto dispatch for Python 2
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <mk7lka$uig$1@ger.gmane.org> <etPan.55675f97.4794acf7.12a4d@Draupnir.home>
Message-ID: <1434330486454533457.360774sturla.molden-gmail.com@news.gmane.org>

Donald Stufft <donald at stufft.io> wrote:

> Honestly, I'm on an OS that *does* ship Python (OS X) and part of me hopes
> that they stop shipping it. It's very rare that someone ships Python as
> part of their OS without modifying it in some way, and those modifications
> almost always cause pain to some set of users (and since I work on pip, they
> tend to come to us with the weirdo problems). Case in point: Python on OS X
> adds some preinstalled software, but they put this pre-installed software before
> site-packages in sys.path, so pip can't upgrade those pre-installed software
> packages at all.

Many Unix tools need Python, so Mac OS X (like Linux distros and FreeBSD)
will always need a system Python. Yes, it would be great if it could be
called spython or something other than python. But the main problem is that
it is used by end-users as well, not just the operating system.

Anyone who uses Python on OS X should install their own Python. The system
Python should be left alone as it is.

If the system Python needs updating, it is the responsibility of Apple to
distribute the upgrade. Nobody should attempt to use pip to update the
system Python. Who knows what side-effects it might have. Preferably pip
should have a check for it and bluntly refuse to do it.
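Such a guard could be little more than a prefix heuristic. A rough sketch --
the function name and the exact path tests are illustrative assumptions, not
anything pip actually does:

```python
import sys

def is_vendor_python(prefix: str = sys.prefix) -> bool:
    """Heuristic: does this interpreter look like an OS-vendored Python
    that a package manager should refuse to modify?"""
    vendor_prefixes = (
        "/usr/bin", "/usr/lib",                         # typical Linux distro
        "/System/Library/Frameworks/Python.framework",  # OS X system Python
    )
    return prefix.startswith(vendor_prefixes)

# A pip-like tool could bail out early before touching site-packages:
blocked = is_vendor_python(
    "/System/Library/Frameworks/Python.framework/Versions/2.7")
```

Real detection would need to be more careful (virtualenvs under /usr,
Homebrew's /usr/local, etc.), but the refusal itself is cheap.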

Sturla


From p.f.moore at gmail.com  Thu May 28 21:26:18 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 20:26:18 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
Message-ID: <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>

On 28 May 2015 at 19:22, Chris Angelico <rosuav at gmail.com> wrote:
>> Unfortunately (and believe me, I've been down this road many times) on
>> Windows *only* the exe format is a "first-class" executable.
>> Executable scripts and shebangs are very useful, but there are always
>> corner cases where they don't work *quite* like an exe. On Windows,
>> you have to be prepared to ship an exe if you want to compete with
>> languages that generate exes.
>
> I'm aware of that. When I said "a tiny stub", I was thinking in terms
> of a small executable. The idea is that its sole purpose is to locate
> Python someplace else, and chain to it; that has to be actual
> executable code, complete with the 512-byte "MZ" header and
> everything, to ensure compatibility. But it should be able to be
> small, tight, and easy to verify correctness of, so there aren't (in
> theory!) security exploits in the header itself.

OK, cool. I'm sort of working on that as a bit of a side project - a
tiny stub exe that you can prepend to a Python zipapp which runs it
via the standard embedding APIs. It's little more than an idea at the
moment, but I don't think it'll be too hard to implement...
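The reason a prepended stub can work at all is that the zip format is read
from the end of the file, so leading bytes (a native executable, a #! line)
don't invalidate the archive. A quick illustration using the stdlib zipapp
module, with a fake stub standing in for a real exe:

```python
import os
import tempfile
import zipapp
import zipfile

# Build a minimal zipapp from a one-file source tree.
src = tempfile.mkdtemp()
with open(os.path.join(src, "__main__.py"), "w") as f:
    f.write("print('hello from zipapp')\n")
pyz = os.path.join(tempfile.mkdtemp(), "app.pyz")
zipapp.create_archive(src, pyz)

# Glue a (fake) native stub in front of the archive; the zip central
# directory lives at the end of the file, so the archive still opens.
exe = pyz + ".exe"
with open(pyz, "rb") as payload, open(exe, "wb") as out:
    out.write(b"MZ<imagine a real PE stub here>")
    out.write(payload.read())

still_valid = zipfile.is_zipfile(exe)
```

The real stub would then hand the combined file to an embedded or located
interpreter, which runs it as a zipapp.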

Paul

From srinivas.vamsi.parasa at intel.com  Thu May 28 21:28:50 2015
From: srinivas.vamsi.parasa at intel.com (Parasa, Srinivas Vamsi)
Date: Thu, 28 May 2015 19:28:50 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <5567036C.4010905@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <5567036C.4010905@ubuntu.com>
Message-ID: <34384DEB0F607E42BD61D446586AD4E86B51210C@ORSMSX103.amr.corp.intel.com>

Sorry for missing Julian's question. The GCC version used for the benchmarks is 4.8.2.
Will look into the discussion at https://gcc.gnu.org/bugzilla/show_bug.cgi?id=39284 and investigate it.
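For readers unfamiliar with the technique: the patch targets the interpreter's
opcode dispatch loop. In C, computed goto (a GCC extension) lets each opcode
handler jump directly to the next handler instead of returning to one central
switch, giving branch predictors per-opcode history. The loop being optimized
looks, in Python-flavoured pseudocode, roughly like this (toy opcodes, not
real CPython bytecode):

```python
# A toy stack VM showing the hot dispatch loop that computed goto speeds up.
PUSH, ADD, HALT = range(3)

def run(code):
    stack = []
    pc = 0
    while True:                      # the dispatch loop
        op = code[pc]
        pc += 1
        if op == PUSH:               # in C: one arm of the switch,
            stack.append(code[pc])   # or one computed-goto label
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == HALT:
            return stack.pop()

result = run([PUSH, 2, PUSH, 3, ADD, HALT])
```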

> Julian Taylor jtaylor.debian at googlemail.com 
> Thu May 28 13:30:59 CEST 2015
> won't this need python compiled with gcc 5.1 to have any effect? Which
> compiler version was used for the benchmark?
> the issue that negated most computed goto improvements
> (https://gcc.gnu.org/bugzilla/show_bug.cgi?id=39284) was only closed
> very recently (r212172, 9f4ec746affbde1)

-----Original Message-----
From: Matthias Klose [mailto:doko at ubuntu.com] 
Sent: Thursday, May 28, 2015 5:01 AM
To: Parasa, Srinivas Vamsi; 'python-dev at python.org'
Subject: Re: [Python-Dev] Computed Goto dispatch for Python 2

On 05/28/2015 02:17 AM, Parasa, Srinivas Vamsi wrote:
> Hi All,
> 
> This is Vamsi from Server Scripting Languages Optimization team at Intel Corporation.
> 
> Would like to submit a request to enable the computed goto based dispatch in Python 2.x (which happens to be enabled by default in Python 3 given its performance benefits on a wide range of workloads). We talked about this patch with Guido and he encouraged us to submit a request on Python-dev (email conversation with Guido shown at the bottom of this email).
> 
> Attached is the computed goto patch (along with instructions to run) for Python 2.7.10 (based on the patch submitted by Jeffrey Yasskin  at http://bugs.python.org/issue4753). We built and tested this patch for Python 2.7.10 on a Linux machine (Ubuntu 14.04 LTS server, Intel Xeon - Haswell EP CPU with 18 cores, hyper-threading off, turbo off).
> 
> Below is a summary of the performance we saw on the "grand unified python benchmarks" suite (available at https://hg.python.org/benchmarks/). We made 3 rigorous runs of the following benchmarks. In each rigorous run, a benchmark is run 100 times with and without the computed goto patch. Below we show the average performance boost for the 3 rigorous runs.
> 
> Python 2.7.10 (original) vs Computed Goto performance Benchmark

-1

As Gregory pointed out, there are other options to build the interpreter, and we are missing data on how these compare with your patch.

I assume you tested with the Intel compiler, so it would be good to see results for other compilers as well (GCC, clang).  Please could you provide the data for LTO and profile-guided optimized builds (maybe combined too)?  I'm happy to work with you on setting up these builds, but currently don't have the machine resources to do so myself.

If the benefits show up for these configurations too, then I'm +/-0 on this patch.

Matthias


From brett at python.org  Thu May 28 21:47:37 2015
From: brett at python.org (Brett Cannon)
Date: Thu, 28 May 2015 19:47:37 +0000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
Message-ID: <CAP1=2W7BDX6jqszQrKE4X4QPRrqF3dcMgNXqOH46zkivLnRf8A@mail.gmail.com>

On Thu, May 28, 2015 at 3:25 PM Chris Barker <chris.barker at noaa.gov> wrote:

> OK, I'm really confused here:
>
> 1) what the heck is so special about Go all of a sudden? People have been
> writing and deploying single file executables built with C and C++, and
> whatever else, forever. (and indeed, it was a big sticking point for me
> when I introduced python in my organization)
>

Because Go is much easier to use to write those CLI apps than C or C++. The
barrier now is low enough that the ease of development plus the ability to
do statically compiled binaries is enticing folks to drop Python and Ruby
for Go when making a CLI app.


>
> 2) Why the sudden interest in this as a core Python issue? I've been using
> Python for desktop apps, on primarily Windows and the Mac, for years -- and
> had to deal with py2exe, py2app, etc. forever. And it has been a very very
> common question on the various mailing lists for ages: how do I deploy
> this? how do I make it easy to install? The answer from the developers of
> cPython itself has always been that that's a third party problem -- go
> look for py2exe and friends to solve it. And that it is a solved-enough
> problem. The biggest "unsolved" issue is that you get a really big
> application.
>

Anecdotal evidence suggests Go's user base has a decent amount of converts
from Python and it's currently the only language that seems to be siphoning
people out of our community. You do hear stories of people skipping Python
3 and going to Go, but considering how much more work that is than writing
Python 2/3 code I believe that typically happens when the organization
wanted to jump ship anyway and the Python 3 transition just gave them an
excuse to rewrite their stuff (plus we all know how enticing it can be to
play with a shiny new piece of tech if given the chance).


>
> Don't get me wrong -- I've wanted for years for it to be easier to deploy
> python-based apps as a single thing for users to easily install and
> uninstall where they don't need to know it's python -- but what the heck is
> different now?
>

Active user loss. The biggest reason people are leaving that we can
actively fix now is easy app deployment (the other is performance; some
might argue concurrency, but concurrent.futures and async/await water that
down somewhat for me).


>
> 3) There was mention of a platform-neutral way to do this. Isn't that
> simply impossible? The platforms are different in all the ways that matter
> for this problem: both technical differences, and conventions. Which isn't
> to say you couldn't have one API to produce a single "thing" executable, so
> it would look like one solution for multiple platforms to the user. But the
> end product should be (would have to be) a different beast altogether.
>

I think it's to have a single tool to do it for any platform, not to have
the technical nuts and bolts be the same necessarily. I think it's also to
figure out if there is anything the interpreter and/or stdlib can do to
facilitate this.

-Brett
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/581c0f25/attachment-0001.html>

From storchaka at gmail.com  Thu May 28 21:52:26 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Thu, 28 May 2015 22:52:26 +0300
Subject: [Python-Dev] cpython (3.5): remove STORE_MAP, since it's unused
In-Reply-To: <20150528194021.25413.16032@psf.io>
References: <20150528194021.25413.16032@psf.io>
Message-ID: <mk7rlq$6lt$1@ger.gmane.org>

On 28.05.15 22:40, benjamin.peterson wrote:
> https://hg.python.org/cpython/rev/ac891c518d4e
> changeset:   96342:ac891c518d4e
> branch:      3.5
> parent:      96339:6f05f83c7010
> user:        Benjamin Peterson <benjamin at python.org>
> date:        Thu May 28 14:40:08 2015 -0500
> summary:
>    remove STORE_MAP, since it's unused

Wouldn't it break support of .pyc files compiled with older versions of 
Python 3?



From benjamin at python.org  Thu May 28 21:53:49 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Thu, 28 May 2015 15:53:49 -0400
Subject: [Python-Dev] cpython (3.5): remove STORE_MAP, since it's unused
In-Reply-To: <mk7rlq$6lt$1@ger.gmane.org>
References: <20150528194021.25413.16032@psf.io> <mk7rlq$6lt$1@ger.gmane.org>
Message-ID: <1432842829.100580.280790721.517DF908@webmail.messagingengine.com>



On Thu, May 28, 2015, at 15:52, Serhiy Storchaka wrote:
> On 28.05.15 22:40, benjamin.peterson wrote:
> > https://hg.python.org/cpython/rev/ac891c518d4e
> > changeset:   96342:ac891c518d4e
> > branch:      3.5
> > parent:      96339:6f05f83c7010
> > user:        Benjamin Peterson <benjamin at python.org>
> > date:        Thu May 28 14:40:08 2015 -0500
> > summary:
> >    remove STORE_MAP, since it's unused
> 
> Wouldn't it break support of .pyc files compiled with older versions of 
> Python 3?

Those won't work anyway because the PYC magic version has changed.
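For the curious, the magic number lives in the first four bytes of every .pyc
and is exposed in the stdlib, so the check is easy to demonstrate (the
throwaway module here is purely illustrative):

```python
import importlib.util
import os
import py_compile
import tempfile

# Compile a trivial module and read back the .pyc header.
src = os.path.join(tempfile.mkdtemp(), "m.py")
with open(src, "w") as f:
    f.write("x = 1\n")
pyc = py_compile.compile(src, cfile=src + "c")

with open(pyc, "rb") as f:
    magic = f.read(4)

# A .pyc is only loadable when this matches the running interpreter.
matches = (magic == importlib.util.MAGIC_NUMBER)
```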

From chris.barker at noaa.gov  Thu May 28 21:37:51 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 12:37:51 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <1434330486454533457.360774sturla.molden-gmail.com@news.gmane.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <mk7lka$uig$1@ger.gmane.org> <etPan.55675f97.4794acf7.12a4d@Draupnir.home>
 <1434330486454533457.360774sturla.molden-gmail.com@news.gmane.org>
Message-ID: <CALGmxELkjrFzwAyKC6gn9cp1RnvjqGSLCxp3DWL2xfADy-bf_g@mail.gmail.com>

On Thu, May 28, 2015 at 12:25 PM, Sturla Molden <sturla.molden at gmail.com>
wrote:

> Many Unix tools need Python, so Mac OS X (like Linux distros and FreeBSD)
> will always need a system Python. Yes, it would be great if could be called
> spython or something else than python. But the main problem is that it is
> used by end-users as well, not just the operating system.
>

I think it's great for it to be used by end users as a system library /
utility, i.e. like you would the system libc -- so if you can write a
little python script that only uses the stdlib, you can simply deliver
that script.

But if you want to go and install a bunch of extra non-standard packages
(or, for heaven's sake, want a version with bug fixes!), then you really
are better off installing a new python you can control.

The system
> Python should be left alone as it is.
>

absolutely!

By the way, py2app will build an application bundle that depends on the
system python; indeed, that's all it will do if you run it with the system
python, as Apple has added some non-redistributable bits in there. But
things get kind of confusing if you want to rely on non-system packages...

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/71b50e13/attachment.html>

From p.f.moore at gmail.com  Thu May 28 22:29:28 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 28 May 2015 21:29:28 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CAP1=2W7BDX6jqszQrKE4X4QPRrqF3dcMgNXqOH46zkivLnRf8A@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
 <CAP1=2W7BDX6jqszQrKE4X4QPRrqF3dcMgNXqOH46zkivLnRf8A@mail.gmail.com>
Message-ID: <CACac1F_FpAf-9vZJGScRq_TX5o07GaCd2m4d1J=D0WB7xm9qzQ@mail.gmail.com>

On 28 May 2015 at 20:47, Brett Cannon <brett at python.org> wrote:
> I think it's to have a single tool to do it for any platform, not to have
> the technical nuts and bolts be the same necessarily. I think it's also to
> figure out if there is anything the interpreter and/or stdlib can do to
> facilitate this.

Precisely. At the moment, the story seems to be "if you're on Windows,
use py2exe, if you're on OSX use py2app, or on Unix, ..., or..."

What would be a compelling story is: to build your app into a single
file executable, do "python -m build <myapp>". The machinery behind
the build can be as different as necessary -- but being able to use the
same command on every platform is the goal.

zipapp is a start down that direction, but there's a *lot* more to be
done before we have a story good enough to address the growing trend
towards wanting a strong single-file deployment solution.

Paul

From ned at nedbatchelder.com  Thu May 28 22:43:30 2015
From: ned at nedbatchelder.com (Ned Batchelder)
Date: Thu, 28 May 2015 16:43:30 -0400
Subject: [Python-Dev] cpython (3.5): remove STORE_MAP, since it's unused
In-Reply-To: <mk7rlq$6lt$1@ger.gmane.org>
References: <20150528194021.25413.16032@psf.io> <mk7rlq$6lt$1@ger.gmane.org>
Message-ID: <55677DF2.1090401@nedbatchelder.com>

On 5/28/15 3:52 PM, Serhiy Storchaka wrote:
> On 28.05.15 22:40, benjamin.peterson wrote:
>> https://hg.python.org/cpython/rev/ac891c518d4e
>> changeset:   96342:ac891c518d4e
>> branch:      3.5
>> parent:      96339:6f05f83c7010
>> user:        Benjamin Peterson <benjamin at python.org>
>> date:        Thu May 28 14:40:08 2015 -0500
>> summary:
>>    remove STORE_MAP, since it's unused
>
> Wouldn't it break support of .pyc files compiled with older versions 
> of Python 3?
.pyc files are not compatible across versions of Python.  3.5 changed 
the meaning of the BUILD_MAP opcode, for example.

--Ned.

From sturla.molden at gmail.com  Thu May 28 22:48:11 2015
From: sturla.molden at gmail.com (Sturla Molden)
Date: Thu, 28 May 2015 22:48:11 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CALGmxELkjrFzwAyKC6gn9cp1RnvjqGSLCxp3DWL2xfADy-bf_g@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <mk7lka$uig$1@ger.gmane.org> <etPan.55675f97.4794acf7.12a4d@Draupnir.home>
 <1434330486454533457.360774sturla.molden-gmail.com@news.gmane.org>
 <CALGmxELkjrFzwAyKC6gn9cp1RnvjqGSLCxp3DWL2xfADy-bf_g@mail.gmail.com>
Message-ID: <mk7uud$ruf$1@ger.gmane.org>

On 28/05/15 21:37, Chris Barker wrote:

> I think it's great for it to be used by end users as a system library /
> utility, i.e. like you would the system libc -- so if you can write a
> little python script that only uses the stdlib, you can simply deliver
> that script.

No it is not, because someone will be 'clever' and try to upgrade it 
with pip or install packages into it.

Sturla


From ncoghlan at gmail.com  Thu May 28 23:08:43 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 07:08:43 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
Message-ID: <CADiSq7dtNX3fDbZJgsLF1tkgRaK-_O_L-YZWmBS+az3RFwbmYw@mail.gmail.com>

On 29 May 2015 05:25, "Chris Barker" <chris.barker at noaa.gov> wrote:
>
> OK, I'm really confused here:
>
> 1) what the heck is so special about Go all of a sudden? People have been
writing and deploying single file executables built with C and C++, and
whatever else, forever. (and indeed, it was a big sticking point for me
when I introduced python in my organization)

For scientific Python folks, the equivalent conversations I have are about
Julia.

If you're not used to thinking of Python's competitive position as "best
orchestration language, solid competitor in any given niche", then the rise
of niche specific competitors like Go & Julia can feel terrifying, as the
relatively narrow user base changes the trade-offs you can make in the
language & ecosystem design to better optimise them for that purpose.

We don't need to debate the accuracy of that perception of risk, though. If
it motivates folks to invest time & energy into providing one-obvious-way
to do cross-platform single-file distribution, lower barriers to adoption
for PyPy, and work on a Rust-style memory ownership based model for
concurrent execution of subinterpreters across multiple cores, then the
community wins regardless :)

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/c652a41f/attachment.html>

From v+python at g.nevcal.com  Thu May 28 23:09:24 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Thu, 28 May 2015 14:09:24 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
Message-ID: <55678404.40602@g.nevcal.com>

On 5/28/2015 12:26 PM, Paul Moore wrote:
> On 28 May 2015 at 19:22, Chris Angelico <rosuav at gmail.com> wrote:
>>> Unfortunately (and believe me, I've been down this road many times) on
>>> Windows *only* the exe format is a "first-class" executable.
>>> Executable scripts and shebangs are very useful, but there are always
>>> corner cases where they don't work *quite* like an exe. On Windows,
>>> you have to be prepared to ship an exe if you want to compete with
>>> languages that generate exes.
>> I'm aware of that. When I said "a tiny stub", I was thinking in terms
>> of a small executable. The idea is that its sole purpose is to locate
>> Python someplace else, and chain to it; that has to be actual
>> executable code, complete with the 512-byte "MZ" header and
>> everything, to ensure compatibility. But it should be able to be
>> small, tight, and easy to verify correctness of, so there aren't (in
>> theory!) security exploits in the header itself.
> OK, cool. I'm sort of working on that as a bit of a side project - a
> tiny stub exe that you can prepend to a Python zipapp which runs it
> via the standard embedding APIs. It's little more than an idea at the
> moment, but I don't think it'll be too hard to implement...

Paul,

I've been using zipapps for some years now, and it would be really cool 
to have what I think you are talking about here:

1) Build zipapp as normal.  It likely depends on some minimal Python 
version.
2) Prepend a stub .exe (Windows) or #! line (Unix) that declares a version 
of Python to actually use. This can be as broad as Python 2 or Python 3 
(not sure how to specify that either works, especially on Windows), or 
more detailed/restrictive by specifying 2.n or 3.n.  On Windows, it 
would find the newest qualifying installation and run it; on Unix, the 
symlinks are set up to do that already, is my understanding, so the 
proper #! line does that for you.
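The Unix half of (2) is already covered by the stdlib zipapp module, which
prepends the requested shebang for you. A minimal sketch:

```python
import os
import tempfile
import zipapp

# One-file app source tree.
src = tempfile.mkdtemp()
with open(os.path.join(src, "__main__.py"), "w") as f:
    f.write("print('hi')\n")

# interpreter= writes the requested "#!" line ahead of the archive, so
# the OS launches whatever the declared interpreter resolves to.
target = os.path.join(tempfile.mkdtemp(), "myapp.pyz")
zipapp.create_archive(src, target, interpreter="/usr/bin/env python3")

with open(target, "rb") as f:
    shebang = f.readline()
```

Pinning a minor version would just mean declaring, say, /usr/bin/env
python3.5 instead; the Windows stub equivalent remains the open problem.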

This would be something I could use and benefit from immediately upon it 
being available, so I laud your idea, and hope you have a successful 
implementation, and look forward to using it.  It would largely replace 
the need for the py.exe launcher for some classes of applications.

Of course, per other discussions, this doesn't solve the problem for:

A) machine without Python installed
B) programs that need binary extensions

Other discussions have suggested:

3) The stub could offer to download and install Python

A corollary:

4) The stub could offer to download and install the needed binary 
extensions as well as Python. This would require the installation 
uniformity of something like pip, so perhaps would be restricted to 
extensions available via pip.  And it would be much enhanced by some 
technique where the zipapp would contain metadata readable by the stub, 
that would declare the list of binary extensions required. Or, of 
course, it could even declare non-binary extension that are not packaged 
with the zipapp, if the process is smooth, the modules available via 
pip, etc., as a tradeoff.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/dba720a2/attachment.html>

From doko at ubuntu.com  Thu May 28 23:48:59 2015
From: doko at ubuntu.com (Matthias Klose)
Date: Thu, 28 May 2015 23:48:59 +0200
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <55678D4B.1020005@ubuntu.com>

On 05/28/2015 06:13 PM, Barry Warsaw wrote:
> Go seems to be popular where I work.  It is replacing Python in a number of
> places, although Python (and especially Python 3) is still a very important
> part of our language toolbox.
> 
> There are several reasons why Go is gaining popularity.  Single-file
> executables is definitely a reason; it makes deployment very easy, even if it
> increases the maintenance burden (e.g. without shared libraries, you have
> multiple copies of things so when a security fix is required for one of those
> things you have to recompile the world).

And the very same place where you are working is investing in getting shared
libraries working for Go.  Single binaries may be popular for distributing end
user applications, but definitely not for distributing a core OS or a SDK.
Sorry, you haven't arrived in distro land yet ...

Matthias


From chris.barker at noaa.gov  Thu May 28 20:11:36 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 11:11:36 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <B0CCF1CF-D6A7-4CE6-9043-37BE1A5EAE8C@gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <B0CCF1CF-D6A7-4CE6-9043-37BE1A5EAE8C@gmail.com>
Message-ID: <CALGmxEKSiNUwBJvo0NQx0DO7Jop4_biedKbt7nTJbWukBAWY+w@mail.gmail.com>

On Thu, May 28, 2015 at 10:32 AM, Ryan Gonzalez <rymg19 at gmail.com> wrote:

> py2exe tends to invoke DLL hell if you have various versions of VS or
> Office or both installed. Because Windows.
>

uh, yes -- Windows applications invoke DLL hell... nothing to be done
about that!

-Chris


>
> On May 28, 2015 11:23:57 AM CDT, Chris Barker <chris.barker at noaa.gov>
> wrote:
>
>> I'm confused:
>>
>> Doesn't py2exe (optionally) create a single file executable?
>>
>> And py2app on the Mac creates an application bundle, but that is
>> more-or-less the equivalent on OS-X (you may not even be able to have a
>> single file executable that can access the Window Manager, for instance)
>>
>> Depending on what extra packages you need, py2exe's single file doesn't
>> always work, but last I tried, it worked for a fair bit (I think all of the
>> stdlib).
>>
>> I don't know what PyInstaller or others create. And I have no idea if
>> there is a linux option -- but it seems like the standard of practice for
>> an application for linux is a bunch of files scattered over the system
>> anyway :-)
>>
>> Yes, the resulting exe is pretty big, but it does try to include only
>> those modules and packages that are used, and that kind of optimization
>> could be improved in any case.
>>
>> So is something different being asked for here?
>>
>> Barry Warsaw wrote:
>> >> I do think single-file executables are an important piece to Python's long-term
>> competitiveness.
>>
>> Really? It seems to me that desktop development is dying. What are the
>> critical use-cases for a single file executable?
>>
>> And I'd note that getting a good way to use Python to develop for iOS,
>> Android, and Mobile Windows is FAR more critical!  -- maybe that's the same
>> problem ?
>>
>> -Chris
>>
>>
>> On Thu, May 28, 2015 at 8:39 AM, Donald Stufft <donald at stufft.io> wrote:
>>
>>>
>>>
>>> On May 28, 2015 at 11:30:37 AM, Steve Dower (steve.dower at microsoft.com)
>>> wrote:
>>> > Donald Stufft wrote:
>>> > > Well Python 3.4.3 binary is 4kb for me, so you'd have that + your
>>> 1KB Python script + whatever
>>> > other pieces you need.
>>> >
>>> > For contrast, here are the things you need on Windows to be able to
>>> get to an interactive
>>> > prompt (I don't know how other platforms get this down to 4KB...):
>>> >
>>> > * python.exe (or some equivalent launcher) 39KB
>>> > * python35.dll 3,788KB
>>> > * vcruntime140.dll 87KB (the rest of the CRT is about 1MB, but is not
>>> redistributable
>>> > so doesn't count here)
>>> > * 26 files in Lib 343KB
>>> >
>>> > This gets you to ">>>", and basically everything after that is going
>>> to fail for some reason.
>>> > That's an unavoidable 4,257KB.
>>> >
>>> > The rest of the stdlib adds another ~16MB once you exclude the test
>>> suite, so a fully functioning
>>> > Python is not cheap. (Using compressed .pyc's in a zip file can make a
>>> big difference here
>>> > though, assuming you're willing to trade CPU for HDD.)
>>> >
>>> > Cheers,
>>> > Steve
>>> >
>>> >
>>>
>>> You don't need a "fully functioning Python" for a single file binary,
>>> you only
>>> need enough to actually run your application. For example, if you're
>>> making
>>> an application that can download files over HTTP, you don't need to
>>> include
>>> parts of the stdlib like xmlrpc, pickle, shelve, marshal, sqlite, csv,
>>> email,
>>> mailcap, mailbox, imaplib, nntplib, etc.
>>>
>>> Of course deciding which pieces you include in the zip file you're
>>> appending
>>> to the end of Python is up to whatever tool builds this executable which
>>> doesn't need to be part of Python itself. If Python itself gained the
>>> ability
>>> to operate in that manner, then third party tools could handle trying to
>>> do the
>>> optimizations where it only includes the things it actually needs in the
>>> stdlib
>>> and excludes things it doesn't. The key thing here is that since you're
>>> doing
>>> a single file binary, you don't need to have a Python which is suitable
>>> to
>>> execute random Python code, you only need one that is suitable to
>>> execute this
>>> particular code so you can specialize what that includes.
>>>
>>> ---
>>> Donald Stufft
>>> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>>>
>>>
>>> _______________________________________________
>>> Python-Dev mailing list
>>> Python-Dev at python.org
>>> https://mail.python.org/mailman/listinfo/python-dev
>>> Unsubscribe:
>>> https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov
>>>
>>
>>
>>
>> --
>>
>> Christopher Barker, Ph.D.
>> Oceanographer
>>
>> Emergency Response Division
>> NOAA/NOS/OR&R            (206) 526-6959   voice
>> 7600 Sand Point Way NE   (206) 526-6329   fax
>> Seattle, WA  98115       (206) 526-6317   main reception
>>
>> Chris.Barker at noaa.gov
>>
>> ------------------------------
>>
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe: https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com
>>
>>
> --
> Sent from my Android device with K-9 Mail. Please excuse my brevity.
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/a6bc2a44/attachment-0001.html>

From donald at stufft.io  Fri May 29 00:04:14 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 18:04:14 -0400
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <55678D4B.1020005@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <55678D4B.1020005@ubuntu.com>
Message-ID: <etPan.556790de.346945ff.182f2@Draupnir.home>



On May 28, 2015 at 5:50:32 PM, Matthias Klose (doko at ubuntu.com) wrote:
> On 05/28/2015 06:13 PM, Barry Warsaw wrote:
> > Go seems to be popular where I work. It is replacing Python in a number of
> > places, although Python (and especially Python 3) is still a very important
> > part of our language toolbox.
> >
> > There are several reasons why Go is gaining popularity. Single-file
> > executables is definitely a reason; it makes deployment very easy, even if it
> > increases the maintenance burden (e.g. without shared libraries, you have
> > multiple copies of things so when a security fix is required for one of those
> > things you have to recompile the world).
>  
> And the very same place where you are working is investing in getting shared
> libraries working for Go. Single binaries may be popular for distributing end
> user applications, but definitely not for distributing a core OS or a SDK.
> Sorry, you didn't yet arrive in distro land ...
>  
>

I don't think anyone is claiming that single file should be the *only* way, just
that for a sizable set of people it is a very attractive way.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From pmiscml at gmail.com  Fri May 29 00:08:32 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Fri, 29 May 2015 01:08:32 +0300
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <55678D4B.1020005@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <55678D4B.1020005@ubuntu.com>
Message-ID: <20150529010832.2ed2a25a@x230>

Hello,

On Thu, 28 May 2015 23:48:59 +0200
Matthias Klose <doko at ubuntu.com> wrote:

[]

> And the very same place where you are working is investing in getting
> shared libraries working for Go.  Single binaries may be popular for
> distributing end user applications, but definitely not for
> distributing a core OS or a SDK. Sorry, you didn't yet arrive in
> distro land ...

Of course it did. Like, Ubuntu 14.04LTS ships Go 1.2. No, it starts
with the fact that when you don't have Go installed and type "go", it
suggests installing gccgo, which just segfaults on running. Then you
figure out that you need to install "golang", and that's 1.2, and a lot
of things simply don't work with that version; for example, "go get"
reports that a package is not found when it perfectly well exists. So,
let Go stay what it is: a corporate toy lingo for press releases.
That's until Google decides it has generated enough buzz and it's time
to shut it down like their numerous other projects. (Isn't Go old
already, and doesn't "everyone" use Rust?)


-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From ncoghlan at gmail.com  Fri May 29 00:13:56 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 08:13:56 +1000
Subject: [Python-Dev] Usability of the limited API
In-Reply-To: <BY1PR03MB14662DABC5D961ECB86BF590F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
References: <CACac1F9jThsGyTT=yqJoRLgW7dHdp_Z_bzMMcs9O9iyLP0qeww@mail.gmail.com>
 <BY1PR03MB1466903FF0712CA9595A5B99F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <CACac1F-GJPQOhuett3i1G4TkuvNdOskONLZOrQcA+1quoGctZA@mail.gmail.com>
 <BY1PR03MB14662DABC5D961ECB86BF590F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
Message-ID: <CADiSq7f-idbUq9iWgKez5A0GATG_a_atmqw0VoYeLQrJy_T1yg@mail.gmail.com>

On 29 May 2015 01:04, "Steve Dower" <Steve.Dower at microsoft.com> wrote:
>
> Paul Moore wrote:
> > On 28 May 2015 at 15:28, Steve Dower <Steve.Dower at microsoft.com> wrote:
> >> I don't have the issue number handy, but it should be near the top of
> >> the recently modified list.
> >
> > I recall seeing that issue. I'm fine with that - getting the two in
sync is
> > obviously worth doing (and clearly in hand). I'm personally not sure
whether
> > automating the exposure of symbols is the correct approach, as I'm not
sure
> > people typically even consider the stable API when adding functions. Is
the
> > default (what you get if somebody just blindly adds a symbol with no
thought for
> > the stable API) to expose it or not? If the default is that it's not
exposed,
> > then automation seems reasonable, otherwise I'm not so sure.
>
> Now I'm at my desk, the issue is http://bugs.python.org/issue23903
>
> I believe new symbols are considered stable by default, so perhaps we
actually want a test that will generate a C file that imports everything
"stable" and will break the buildbots if someone adds something new without
explicitly adding it to the list of stable functions?

The stable CPython ABI is actually tracked on
http://upstream-tracker.org/versions/python_stable_api.html

Ideally we'd be running those checks automatically as part of our own QA
with http://ispras.linuxbase.org/index.php/ABI_compliance_checker, similar
to Antoine's regular refleak checks.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/a2ad1ade/attachment.html>

From graffatcolmingov at gmail.com  Fri May 29 00:18:36 2015
From: graffatcolmingov at gmail.com (Ian Cordasco)
Date: Thu, 28 May 2015 17:18:36 -0500
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150529010832.2ed2a25a@x230>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <55678D4B.1020005@ubuntu.com> <20150529010832.2ed2a25a@x230>
Message-ID: <CAN-Kwu1BYPzOnzraGLLuVM9sXXGcBt0J_AgYEopgF3Y=d8iQqw@mail.gmail.com>

On Thu, May 28, 2015 at 5:08 PM, Paul Sokolovsky <pmiscml at gmail.com> wrote:
> Hello,
>
> On Thu, 28 May 2015 23:48:59 +0200
> Matthias Klose <doko at ubuntu.com> wrote:
>
> []
>
>> And the very same place where you are working is investing in getting
>> shared libraries working for Go.  Single binaries may be popular for
>> distributing end user applications, but definitely not for
>> distributing a core OS or a SDK. Sorry, you didn't yet arrive in
>> distro land ...
>
> Of course it did. Like, Ubuntu 14.04LTS ships Go 1.2. No, it starts
> with the fact that when you don't have Go installed and type "go", it
> suggests to install gccgo, which just segfaults on running. Then you
> figure out that you need to install "golang", and that's 1.2, and a lot
> of things simply don't work with that version, like "go get" reports
> that a package not found, while it perfectly exists. So, let Go stay
> what it is - a corporate toy lingo for press-releases. That's until
> Google has thought that it generated enough buzz and it's time to shut
> it down like their numerous other projects. (Isn't Go old already and
> "everyone" uses Rust?)
>
>
> --
> Best regards,
>  Paul                          mailto:pmiscml at gmail.com
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/graffatcolmingov%40gmail.com

Note that as much as I love Rust, it still isn't the replacement for
Go. It doesn't have a stable ABI so if you distribute a binary and
that person has a different version of Rust 1.x installed, it won't be
guaranteed to work (and, at this point, probably won't work anyway).
Go is just more popular because it's been around longer and it (as far
as a single developer is concerned) gets rid of the dependency mess.
That's why developers like it.

From ncoghlan at gmail.com  Fri May 29 00:38:44 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 08:38:44 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
Message-ID: <CADiSq7fCSDQ=QN6BwuH8f9m7zuDPpdmP_z+GZgUfDyT7eo1+ew@mail.gmail.com>

On 29 May 2015 00:52, "Paul Moore" <p.f.moore at gmail.com> wrote:
>
> +1. The new embeddable Python distribution for Windows is a great step
> forward for this. It's not single-file, but it's easy to produce a
> single-directory self-contained application with it. I don't know if
> there's anything equivalent for Linux/OSX - maybe it's something we
> should look at for them as well (although the whole "static binaries"
> concept seems to be fairly frowned on in the Unix world, from what
> I've seen).

Correct - in the absence of the capacity to rebuild and redeploy the world
at the click of a button, widespread deployment of static binaries poses an
appallingly high security risk. It isn't an accident that Linux container
orchestration is co-evolving with Linux container formats.

Those efforts are mostly focused on network services & GUI applications,
though. For portable console applications, Go is still one of the nicest
options currently available, as the relatively limited ability to
interoperate with the system provided C/C++ libraries makes it much harder
to create dependencies between the app and the platform. It's similar to
Java in that respect, but without the dependency on a large language
runtime like the JVM.

In that vein, it might be interesting to see what could be done with
MicroPython in terms of providing a lightweight portable Python runtime
without CPython's extensive integration with the underlying OS.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/39986e3b/attachment.html>

From barry at python.org  Fri May 29 01:01:26 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 28 May 2015 19:01:26 -0400
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <55678D4B.1020005@ubuntu.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <55678D4B.1020005@ubuntu.com>
Message-ID: <20150528190126.38ce61e6@anarchist.wooz.org>

On May 28, 2015, at 11:48 PM, Matthias Klose wrote:

>And the very same place where you are working is investing in getting shared
>libraries working for Go.  Single binaries may be popular for distributing end
>user applications, but definitely not for distributing a core OS or a SDK.

Yep, I mentioned this in an earlier message (probably buried in the
centithread avalanche).  Both static and dynamic linking have valid use
cases.

Cheers,
-Barry


From solipsis at pitrou.net  Fri May 29 01:15:36 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 29 May 2015 01:15:36 +0200
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
Message-ID: <20150529011536.60c4a3a5@fsol>

On Thu, 28 May 2015 08:48:11 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> 
> I just remembered one of the biggest causes of pain: Windows binaries for
> projects that aren't using the stable ABI. It used to regularly take 6+
> months for the Windows ecosystem to catch up after each 2.x release.

You're right, compatibility of C extension builds under Windows is
pretty much the killer.

> After all, the real difference between the alphas and the final releases
> isn't about anything *we* do, it's about the testing *other people* do that
> picks up gaps in our test coverage. A gated trunk makes it more feasible
> for other projects to do continuous integration against it.

Long ago (before I became a core developer) we had "community
buildbots" for that. They didn't receive any attention or maintenance
from third-party projects.

Regards

Antoine.



From pmiscml at gmail.com  Fri May 29 01:01:55 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Fri, 29 May 2015 02:01:55 +0300
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CADiSq7fCSDQ=QN6BwuH8f9m7zuDPpdmP_z+GZgUfDyT7eo1+ew@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CADiSq7fCSDQ=QN6BwuH8f9m7zuDPpdmP_z+GZgUfDyT7eo1+ew@mail.gmail.com>
Message-ID: <20150529020155.33377daa@x230>

Hello,

On Fri, 29 May 2015 08:38:44 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:

[]

> In that vein, it might be interesting to see what could be done with
> MicroPython in terms of providing a lightweight portable Python
> runtime without CPython's extensive integration with the underlying
> OS.

Thanks for mentioning that. That's definitely what I have in mind;
actually, I've long wanted to do a Lua-esque hack of being able to cat
together an interpreter and a script, so the resulting executable would
just run the script. What stopped me is that it would be Lua-esquely
useless: how much useful work can one do with a bare script without
dependencies?

And MicroPython definitely has some puzzle pieces for a generic
solution, but so far not a complete picture:

1. There is frozen-module support, but it covers just that: modules;
packages aren't supported for freezing so far.

2. Frozen modules also require recompiling, and that doesn't scale in
the real world.

3. The standard library is already "distributed" (vs CPython's
monolithic one), but half of the modules are dummies so far.


That said, making a demo of a self-contained webapp server in 350-400K
is definitely on my TODO list (package support for frozen modules is
the only blocker for that).
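The "cat an interpreter and a script together" trick works because zip archives are parsed from the end, so arbitrary data can be prepended without breaking them. A sketch of the idea, using the host CPython in place of a concatenated MicroPython binary (the stub here is just a stand-in line of bytes):

```python
import os
import subprocess
import sys
import tempfile
import zipfile

# The Lua-style bundle is <stub><zip-of-the-app>. Zip readers locate the
# central directory from the end of the file, so the prefix is ignored.
with tempfile.TemporaryDirectory() as tmp:
    payload = os.path.join(tmp, "payload.zip")
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("__main__.py", 'print("running from a cat-ed bundle")\n')

    bundle = os.path.join(tmp, "bundle")
    with open(bundle, "wb") as out:
        out.write(b"#!/usr/bin/env python3\n")   # stand-in for the stub
        with open(payload, "rb") as pf:
            out.write(pf.read())                 # "cat" the app on the end

    result = subprocess.run([sys.executable, bundle],
                            capture_output=True, text=True)

print(result.stdout.strip())
```

With a real self-contained interpreter as the prefix, the same file is both the runtime and the application.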


> Cheers,
> Nick.



-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From ncoghlan at gmail.com  Fri May 29 01:38:49 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 09:38:49 +1000
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>

On 29 May 2015 2:16 am, "Barry Warsaw" <barry at python.org> wrote:
>
> Go seems to be popular where I work.  It is replacing Python in a number
of
> places, although Python (and especially Python 3) is still a very
important
> part of our language toolbox.
>
> There are several reasons why Go is gaining popularity.  Single-file
> executables is definitely a reason; it makes deployment very easy, even
if it
> increases the maintenance burden (e.g. without shared libraries, you have
> multiple copies of things so when a security fix is required for one of
those
> things you have to recompile the world).
>
> Start up times and memory footprint are also factors.  Probably not much
to be
> done about the latter, but perhaps PEP 432 can lead to improvements in the
> former.  (Hey Nick, I'm guessing you'll want to bump that one back to
3.6.)

Yep. I got the feature branch mostly working again just after PyCon (since
several folks expressed interest in helping with it), and thanks to Eric
Snow, the biggest blocker to further progress (splitting the import system
initialisation into two distinct phases) has already been addressed for 3.5
(that's not merged into the feature branch in my sandbox repo yet, though).

PEP 432 itself isn't likely to change startup time for the full interpreter
runtime very much (as it's mostly about rearranging how we call the
existing setup steps, rather than changing the steps themselves), but
having more of the C API available earlier in the bootstrapping cycle will
hopefully lay a foundation for future improvements.

The intent is also to make embedding *much* easier, and have it be trivial
to skip initialising any subsystems that a given application doesn't need.

> Certainly better support for multi-cores comes up a lot.  It should be a
SMoE
> to just get rid of the GIL once and for all <wink>.

Eric's been looking into this as well, and we think there's a plausible
path forward based on changing the way subinterpreters work such that the
GIL can be changed to a read/write lock, and each subinterpreter gets its
own Local Interpreter Lock.

Expect to hear more on that front before too long :)

> One thing I've seen more than once is that new development happens in
Python
> until the problem is understood, then the code is ported to Go.  Python's
> short path from idea to working code, along with its ability to quickly
morph
> as requirements and understanding changes, its batteries included
philosophy,
> and its "fits-your-brain" consistency are its biggest strengths!

Right, Go is displacing C/C++ in that regard (moreso than Python itself),
and now that Rust has hit 1.0, I expect we'll see it becoming another
contender for this task. Rust's big advantage over Go in that regard is
being compatible with the C/C++ ecosystem, including Python's cffi.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/fa60ca57/attachment-0001.html>

From donald at stufft.io  Fri May 29 01:48:34 2015
From: donald at stufft.io (Donald Stufft)
Date: Thu, 28 May 2015 19:48:34 -0400
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
Message-ID: <etPan.5567a952.f2533e8.18516@Draupnir.home>



On May 28, 2015 at 7:40:26 PM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> >
> > One thing I've seen more than once is that new development happens  
> in Python
> > until the problem is understood, then the code is ported to Go.  
> Python's
> > short path from idea to working code, along with its ability  
> to quickly morph
> > as requirements and understanding changes, its batteries  
> included philosophy,
> > and its "fits-your-brain" consistency are its biggest strengths!  
>  
>  
> Right, Go is displacing C/C++ in that regard (moreso than Python  
> itself), and now that Rust has hit 1.0, I expect we'll see it becoming  
> another contender for this task. Rust's big advantage over Go  
> in that regard is being compatible with the C/C++ ecosystem,  
> including Python's cffi.
>  

I'm not sure if I'm reading this right or not, but just to be clear, I've
seen a number of people express the sentiment that they are switching from
Python to Go and that the deployment story is one of the reasons. It's not
just people switching from C/C++.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From larry at hastings.org  Fri May 29 01:59:15 2015
From: larry at hastings.org (Larry Hastings)
Date: Thu, 28 May 2015 16:59:15 -0700
Subject: [Python-Dev] Can someone configure the buildbots to build the 3.5
	branch?
Message-ID: <5567ABD3.6090108@hastings.org>



The buildbots currently live in a state of denial about the 3.5 branch.  
Could someone whisper tenderly in their collective shell-like ears so 
that they start building 3.5, in addition to 3.4 and trunk?

Thank you,


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150528/f3f90228/attachment.html>

From victor.stinner at gmail.com  Fri May 29 02:58:55 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Fri, 29 May 2015 02:58:55 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
Message-ID: <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>

2015-05-28 18:07 GMT+02:00 Guido van Rossum <guido at python.org>:
> This patch could save companies like Dropbox a lot of money. We run a ton of
> Python code in large datacenters, and while we are slow in moving to Python
> 3, we're good at updating to the latest 2.7.

I'm not sure that backporting optimizations would motivate companies
like Dropbox to drop Python 2. It's already hard to convince someone
to migrate to Python 3.

Why not continue to enhance Python 3 instead of wasting our time on
Python 2? We have limited resources in terms of developers to maintain
Python.

(I'm not talking about fixing *bugs* in Python 2 which is fine with me.)

--

By the way, I just wrote sixer, a new tool to generate patches to port
OpenStack to Python 3 :-)
https://pypi.python.org/pypi/sixer

It's based on regex, so it's less reliable than 2to3, 2to6 or
modernize, but it's just enough for my specific use case. On
OpenStack, it's not possible to send one giant patch "hello, this is
python 3". Code is modified by small and incremental changes.

Come on in the Python 3 world and... always look on the bright side of
life ( https://www.youtube.com/watch?v=VOAtCOsNuVM )!

Victor

From larry at hastings.org  Fri May 29 03:09:02 2015
From: larry at hastings.org (Larry Hastings)
Date: Thu, 28 May 2015 18:09:02 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
Message-ID: <5567BC2E.8090101@hastings.org>

On 05/28/2015 05:58 PM, Victor Stinner wrote:
> Why not continue to enhance Python 3 instead of wasting our time with
> Python 2? We have limited resources in terms of developers to maintain
> Python.

Uh, guys, you might as well call off the debate.  Benjamin already 
checked it in.

    https://hg.python.org/cpython/rev/17d3bbde60d2


//arry/

From guido at python.org  Fri May 29 03:10:33 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 28 May 2015 18:10:33 -0700
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
Message-ID: <CAP7+vJKg36tOKdvjhOvbfzAQgLt9VDrW6inrYwypj1psTPg5DQ@mail.gmail.com>

On Thu, May 28, 2015 at 5:58 PM, Victor Stinner <victor.stinner at gmail.com>
wrote:

> 2015-05-28 18:07 GMT+02:00 Guido van Rossum <guido at python.org>:
> > This patch could save companies like Dropbox a lot of money. We run a
> ton of
> > Python code in large datacenters, and while we are slow in moving to
> Python
> > 3, we're good at updating to the latest 2.7.
>
> I'm not sure that backporting optimizations would motivate companies
> like Dropbox to drop Python 2. It's already hard to convince someone
> to migrate to Python 3.
>
> Why not continue to enhance Python 3 instead of wasting our time with
> Python 2? We have limited resources in terms of developers to maintain
> Python.
>

As a matter of fact (unknown to me earlier) Dropbox is already using a
private backport of this patch. So it won't affect Dropbox's decision
anyway. But it might make life easier for other, smaller companies that
don't have the same resources.

However this talk of "wasting our time with Python 2" needs to stop, and if
you think that making Python 2 less attractive will encourage people to
migrate to Python 3, think again. Companies like Intel are *contributing*
by offering this backport up publicly.

-- 
--Guido van Rossum (python.org/~guido)

From rymg19 at gmail.com  Fri May 29 03:15:51 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Thu, 28 May 2015 20:15:51 -0500
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <5567BC2E.8090101@hastings.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <5567BC2E.8090101@hastings.org>
Message-ID: <CAO41-mOwOE6qi3vx9tNniCbde=t=b8YzgxgBf_q=k9RJprHSGg@mail.gmail.com>

YESSS!!!

On Thu, May 28, 2015 at 8:09 PM, Larry Hastings <larry at hastings.org> wrote:

>  On 05/28/2015 05:58 PM, Victor Stinner wrote:
>
> Why not continue to enhance Python 3 instead of wasting our time with
> Python 2? We have limited resources in terms of developers to maintain
> Python.
>
>
> Uh, guys, you might as well call off the debate.  Benjamin already checked
> it in.
>
> https://hg.python.org/cpython/rev/17d3bbde60d2
>
>
> */arry*
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com
>
>


-- 
Ryan
[ERROR]: Your autotools build scripts are 200 lines longer than your
program. Something's wrong.
http://kirbyfan64.github.io/

From ericsnowcurrently at gmail.com  Fri May 29 03:47:32 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Thu, 28 May 2015 19:47:32 -0600
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <20150529020155.33377daa@x230>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CADiSq7fCSDQ=QN6BwuH8f9m7zuDPpdmP_z+GZgUfDyT7eo1+ew@mail.gmail.com>
 <20150529020155.33377daa@x230>
Message-ID: <CALFfu7A7M8fS2GiVto-LZ2NdWRYA0B02Qm=eR8L5acriacaa7w@mail.gmail.com>

On Thu, May 28, 2015 at 5:01 PM, Paul Sokolovsky <pmiscml at gmail.com> wrote:
> That said, making a demo of self-contained webapp server in 350-400K is
> definitely on my TODO list (package support for frozen modules is the
> only blocker for that).

It may be worth taking this over to import-sig at python.org for more discussion.

-eric

From greg at krypto.org  Fri May 29 04:01:25 2015
From: greg at krypto.org (Gregory P. Smith)
Date: Fri, 29 May 2015 02:01:25 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <5567BC2E.8090101@hastings.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <5567BC2E.8090101@hastings.org>
Message-ID: <CAGE7PNLFO6-2gDUiD8d_hJAEkHx_nR7D2k=aqYBtY95MVWFjQw@mail.gmail.com>

On Thu, May 28, 2015 at 6:10 PM Larry Hastings <larry at hastings.org> wrote:

> On 05/28/2015 05:58 PM, Victor Stinner wrote:
>
> Why not continue to enhance Python 3 instead of wasting our time with
> Python 2? We have limited resources in terms of developers to maintain
> Python.
>
>
> Uh, guys, you might as well call off the debate.  Benjamin already checked
> it in.
>
> https://hg.python.org/cpython/rev/17d3bbde60d2
>
>
Thanks Benjamin! :)


>
>
> */arry*
>  _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/greg%40krypto.org
>

From thomas at python.org  Fri May 29 04:04:52 2015
From: thomas at python.org (Thomas Wouters)
Date: Fri, 29 May 2015 04:04:52 +0200
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150528121341.74d087da@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
Message-ID: <CAPdQG2pAwZJVRNLW0=6NS1zGA0ZNPA1q6A=atGqSjHfejZ1-WA@mail.gmail.com>

On Thu, May 28, 2015 at 6:13 PM, Barry Warsaw <barry at python.org> wrote:

> Go seems to be popular where I work.  It is replacing Python in a number of
> places, although Python (and especially Python 3) is still a very important
> part of our language toolbox.
>
> There are several reasons why Go is gaining popularity.  Single-file
> executables is definitely a reason; it makes deployment very easy, even if
> it
> increases the maintenance burden (e.g. without shared libraries, you have
> multiple copies of things so when a security fix is required for one of
> those
> things you have to recompile the world).
>
> Start up times and memory footprint are also factors.  Probably not much
> to be
> done about the latter, but perhaps PEP 432 can lead to improvements in the
> former.  (Hey Nick, I'm guessing you'll want to bump that one back to 3.6.)
>
> Certainly better support for multi-cores comes up a lot.  It should be a
> SMoE
> to just get rid of the GIL once and for all <wink>.
>
> One thing I've seen more than once is that new development happens in
> Python
> until the problem is understood, then the code is ported to Go.  Python's
> short path from idea to working code, along with its ability to quickly
> morph
> as requirements and understanding changes, its batteries included
> philosophy,
> and its "fits-your-brain" consistency are its biggest strengths!
>
> On May 28, 2015, at 10:37 AM, Donald Stufft wrote:
>
> >I think docker is a pretty crummy answer to Go's static binaries. What I
> would
> >love is for Python to get:
> >
> >* The ability to import .so modules via zipimport (ideally without a
> >temporary directory, but that might require newer APIs from libc and
> such).
>
> +1 - Thomas Wouters mentioned at the language summit some work being done
> on
> glibc to add dlopen_from_memory() (sp?) which would allow for loading .so
> files directly from a zip.  Not sure what the status is of that, but it
> would
> be a great addition.
>

It's dlopen_with_offset:
https://sourceware.org/bugzilla/show_bug.cgi?id=11767. There's also a patch
that's sort-of dlopen_from_memory (dlopen_ehdr), but it requires a lot of
manual setup to map the right bits to the right places; dlopen_with_offset
is a lot simpler.

Building a Python application into a single file doesn't require
dlopen_with_offset, *iff* you build everything from source. It's not easy
to do this -- Python's setup.py and third-party uses of distutils don't
allow this -- but it's mostly possible using the old Modules/Setup file.
(Or you could do what we routinely do at Google with third-party packages
and re-implement the build in your own build system ;P)



>
> >* The ability to create a ?static? Python that links everything it needs
> into
> >the binary to do a zipimport of everything else (including the stdlib).
>
>
This is possible (with some jumping through hoops) using Modules/Setup and
some post-processing of the standard library. It would be a lot easier if
we got rid of distutils for building Python (or for everything) -- or made
it output a Modules/Setup-like file :) (For those who don't remember,
Modules/Setup was the file we used to build stdlib extension modules before
we had distutils, and it's parsed and incorporated into the regular
Makefile. It can build both static and dynamic extension modules.)
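
For readers who have never seen it, entries in Modules/Setup look roughly
like this (an illustrative fragment, not copied from any particular Python
version):

```
# A line names a module, its source files, and any compile/link flags.
# Modules listed after *static* are linked into the python binary:
*static*
math mathmodule.c -lm

# Modules listed after *shared* are built as dynamic .so extensions:
*shared*
_socket socketmodule.c
```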


> +1
>
> >*The ability to execute a zipfile that has been concat onto the end of
> the
> >Python binary.
>

This is already possible, just not with the regular 'python' binary. All it
takes is fifty lines of C or so, a tiny application that embeds Python,
sets sys.path[0] to argv[0], and uses the runpy module to execute something
from the ZIP file. There are some issues with this approach (like what
sys.executable should be :) but they're mostly cosmetic.
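
To make the mechanics concrete, the runpy half of that stub can be sketched
in pure Python (file names here are invented for the example; the real
launcher would be the fifty-line C stub described above, pointing
sys.path[0] at its own binary):

```python
import os
import runpy
import tempfile
import zipfile

# Build a tiny "application" archive containing a __main__.py; this stands
# in for the zip a real launcher would find appended to argv[0].
zip_path = os.path.join(tempfile.mkdtemp(), "app.zip")
with zipfile.ZipFile(zip_path, "w") as z:
    z.writestr("__main__.py", "RESULT = 6 * 7\nprint('hello from the zip')")

# The stub's job after setting up sys.path: execute the archive's __main__
# via runpy.  run_path() accepts zip files (and directories) directly and
# returns the resulting module globals.
globals_after = runpy.run_path(zip_path, run_name="__main__")
print(globals_after["RESULT"])  # 42
```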


>
> +1
>
> >I think that if we get all of that, we could easily create a single file
> >executable with real, native support from Python by simply compiling
> Python
> >in that static mode and then appending a zip file containing the standard
> >library and any other distributions we need to the end of it.
> >
> >We'd probably want some more quality of life improvements around accessing
> >resources from within that zip file as well, but that can be done as a
> >library easier than the above three things can.
>
> E.g. you really should be using the pkg_resources APIs for loading
> resources
> from your packages, otherwise you're gonna have problems with zip
> executables.  We've talked before about adopting some of these APIs into
> Python's stdlib.  pkgutil is a start, and the higher level APIs from
> pkg_resources should probably go there.
>
> Cheers,
> -Barry
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/thomas%40python.org
>



-- 
Thomas Wouters <thomas at python.org>

Hi! I'm an email virus! Think twice before sending your email to help me
spread!

From chris.barker at noaa.gov  Fri May 29 05:05:11 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 May 2015 20:05:11 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <55678404.40602@g.nevcal.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
Message-ID: <CALGmxELaQgNxqZiGJkVAjYLBWoddPL7r5JueLz_pVXRYcr5NfA@mail.gmail.com>

Getting lost as to what thread this belongs in...

But another tack to take toward a single executable is Cython's embedding
option:

https://github.com/cython/cython/wiki/EmbeddingCython

This is a quick and dirty way to create a C executable that will then run
the cythonized code, all linked to the python run time.

At this point, it still requires the python shared lib, and I think any
other compiled extension is shared, too. And if you run Cython on all the
python code and modules you use, you'll have a LOT of shared libs. But
perhaps one could re-do the linking step of all that and get a single
compiled exe.

and IIUC, the way Windows dll hell works, if you stuff the whole pile into
one dir -- you will get a single executable directory, if not a single file.

and about a 2X performance boost, as well, when you cythonize pure Python,
at least in my limited experience.

Just a thought.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From benno at benno.id.au  Fri May 29 06:46:52 2015
From: benno at benno.id.au (Ben Leslie)
Date: Fri, 29 May 2015 14:46:52 +1000
Subject: [Python-Dev] Obtaining stack-frames from co-routine objects
Message-ID: <CABZ0LtAPMi9nQb9ZSp4jW4-kvTte_WVjz=wtKvUWmGeSwuyqVA@mail.gmail.com>

Hi all,

Apologies in advance; I'm not a regular, and this may have been
handled already (but I couldn't find it when searching).

I've been using the new async/await functionality (congrats again to
Yury on getting that through!), and I'd like to get a stack trace
between the place at which blocking occurs and the outer co-routine.

For example, consider this code:

"""
async def a():
    await b()

async def b():
    await switch()

@types.coroutine
def switch():
    yield

coro_a = a()
coro_a.send(None)
"""

At this point I'd really like to be able to somehow get a stack trace
similar to:

test.py:2
test.py:4
test.py:9

Using the gi_frame attribute of coro_a, I can get the line number of
the outer frame (e.g.: line 2), but from there there is no way to
descend the stack to reach the actual yield point.

I thought that perhaps the switch() co-routine could yield the frame
object returned from inspect.currentframe(); however, once that
function yields, the frame object's f_back is set to None.

A hypothetical approach would be to work its way down from the
outer frame, but that requires getting access to the co-routine object
that the outer frame is currently await-ing. Some hypothetical code
could be:

"""
def show(coro):
    print("{}:{}".format(coro.gi_frame.f_code.co_filename,
                         coro.gi_frame.f_lineno))
    if dis.opname[coro.gi_code.co_code[coro.gi_frame.f_lasti + 1]] == 'YIELD_FROM':
        show(coro.gi_frame.f_stack[0])
"""

This relies on the fact that an await-ing co-routine will be executing
a YIELD_FROM instruction. The above code uses a completely
hypothetical 'f_stack' property of frame objects to pull the
co-routine object which a co-routine is currently await-ing from the
stack. I've implemented a proof-of-concept f_stack property in the
frameobject.c just to test out the above code, and it seems to work.
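
For comparison, the same walk is possible on current CPython without new
internals: coroutine and generator objects expose cr_await / gi_yieldfrom
attributes (added in 3.5) that hold the object currently being awaited, so
the chain can be followed without byte-code inspection. A runnable sketch
(awaiter_stack is a made-up helper name):

```python
import types

async def a():
    await b()

async def b():
    await switch()

@types.coroutine
def switch():
    yield

def awaiter_stack(coro):
    """Collect (filename, lineno) pairs down the await chain."""
    frames = []
    while coro is not None:
        # Native coroutines carry cr_frame; generator-based ones gi_frame.
        frame = getattr(coro, "cr_frame", None) or getattr(coro, "gi_frame", None)
        if frame is None:
            break
        frames.append((frame.f_code.co_filename, frame.f_lineno))
        # Descend to whatever this object is awaiting / yielding from.
        coro = getattr(coro, "cr_await", None) or getattr(coro, "gi_yieldfrom", None)
    return frames

coro_a = a()
coro_a.send(None)  # run until the innermost yield
stack = awaiter_stack(coro_a)  # one entry each for a(), b(), switch()
for filename, lineno in stack:
    print("{}:{}".format(filename, lineno))
coro_a.close()
```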

With all that, some questions:

1) Does anyone else see value in trying to get the stack-trace down to
the actual yield point?
2) Is there a different way of doing it that doesn't require changes
to Python internals?
3) Assuming no to #2, is there a better way of getting the information
compared to the pretty hacky byte-code/stack inspection?

Thanks,

Ben

From larry at hastings.org  Fri May 29 08:20:18 2015
From: larry at hastings.org (Larry Hastings)
Date: Thu, 28 May 2015 23:20:18 -0700
Subject: [Python-Dev] Python 3.5 schedule addendum adding a new Python 3.5.0
 beta, this weekend
Message-ID: <55680522.5010307@hastings.org>



On behalf of the Python 3.5 release team:

Due to a particularly bad bug ( http://bugs.python.org/issue24285 ), 
we're going to issue a new beta of Python 3.5 this weekend.  This will 
not change the rest of the schedule; it'll just bump the remaining beta 
numbers up by 1.  Thus the schedule is now as follows:

    - 3.5.0 beta 1: May 24, 2015
    - 3.5.0 beta 2: May 31, 2015
    - 3.5.0 beta 3: July 5, 2015
    - 3.5.0 beta 4: July 26, 2015
    - 3.5.0 candidate 1: August 9, 2015
    - 3.5.0 candidate 2: August 23, 2015
    - 3.5.0 candidate 3: September 6, 2015
    - 3.5.0 final: September 13, 2015


May you live in interesting times,


//arry/

From ncoghlan at gmail.com  Fri May 29 08:40:53 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 16:40:53 +1000
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <20150529011536.60c4a3a5@fsol>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
 <20150529011536.60c4a3a5@fsol>
Message-ID: <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>

On 29 May 2015 9:17 am, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>
> On Thu, 28 May 2015 08:48:11 +1000
> Nick Coghlan <ncoghlan at gmail.com> wrote:

> > After all, the real difference between the alphas and the final releases
> > isn't about anything *we* do, it's about the testing *other people* do
that
> > picks up gaps in our test coverage. A gated trunk makes it more feasible
> > for other projects to do continuous integration against it.
>
> Long ago (before I became a core developer) we had "community
> buildbots" for that. They didn't receive any attention or maintenance
> from third-party projects.

Right, but it's hard to integrate against trunk when trunk itself may be
broken. If we had a way of publishing "known good" commit hashes that
passed the test suite on all the stable buildbots, that could potentially
provide a basis for integration testing without needing to switch to merge
gating first.

Do we know if BuildBot offers an interface for that?

Cheers,
Nick.

>
> Regards
>
> Antoine.
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com

From ncoghlan at gmail.com  Fri May 29 08:58:25 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 16:58:25 +1000
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.5567a952.f2533e8.18516@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
 <etPan.5567a952.f2533e8.18516@Draupnir.home>
Message-ID: <CADiSq7e3e0jUV=MP=1xffL5TtUn_2d-HemkXHGrTW3cuCtK2FA@mail.gmail.com>

On 29 May 2015 9:48 am, "Donald Stufft" <donald at stufft.io> wrote:
>
>
>
> On May 28, 2015 at 7:40:26 PM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> > >
> > > One thing I've seen more than once is that new development happens
> > in Python
> > > until the problem is understood, then the code is ported to Go.
> > Python's
> > > short path from idea to working code, along with its ability
> > to quickly morph
> > > as requirements and understanding changes, its batteries
> > included philosophy,
> > > and its "fits-your-brain" consistency are its biggest strengths!
> >
> >
> > Right, Go is displacing C/C++ in that regard (moreso than Python
> > itself), and now that Rust has hit 1.0, I expect we'll see it becoming
> > another contender for this task. Rust's big advantage over Go
> > in that regard is being compatible with the C/C++ ecosystem,
> > including Python's cffi.
> >
>
> I'm not sure if I'm reading this right or not, but just to be clear, I've
> seen a number of people express the sentiment that they are switching from
> Python to Go and that the deployment story is one of the reasons. It's not
> just people switching from C/C++.

C and C++ used to be the main "second version" languages used to create
statically linked standalone binaries after an initial prototype in Python.

Folks that learned Python first understandably weren't keen on that idea,
so they tended to either use Cython (or its predecessor, Pyrex), or else
not bother doing it at all until first Go and now Rust came along (for
reasons unknown to me, D appears to have never gained any popularity
outside the ACM crowd).

If I seem blase about Go, that's the main reason why - the benefits it
offers aren't novel from the point of view of C/C++ programmers, they're
just now available without having to put up with arcane syntax, manual
memory management, an unmaintainable threading model, relatively poor
support for text manipulation, etc, etc.

There's no shortage of software needing to be written, so powerful new
additions to our collective toolkit like Go are advancements to be
celebrated and learned from, rather than feared.

Cheers,
Nick.

>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>
>

From steve at pearwood.info  Fri May 29 09:15:50 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 29 May 2015 17:15:50 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
	dispatch for Python 2)
In-Reply-To: <CACac1F_xbuFXh90Yc9Zvrv-FyfVEZc-YNi52b49diaje9_uP_w@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <20150528115834.69284cb1@anarchist.wooz.org>
 <CACac1F_xbuFXh90Yc9Zvrv-FyfVEZc-YNi52b49diaje9_uP_w@mail.gmail.com>
Message-ID: <20150529071550.GJ932@ando.pearwood.info>

On Thu, May 28, 2015 at 05:38:49PM +0100, Paul Moore wrote:

> I suspect "single file executables" just aren't viewed as a desirable
> solution on Unix. 

More of an anti-pattern than a pattern. A single file executable means 
that when you have a security update, instead of patching one library, 
you have to patch all fifty applications that include that library.


> Although Donald referred to a 4K binary, which
> probably means just a stub exe that depends on system-installed .so
> files, likely including Python (I'm just guessing here). 

The machine I'm currently on has a 5.6K Python executable:

[steve at ando ~]$ ls -lh /usr/bin/python*
-rwxr-xr-x 2 root root 5.6K Jan  9  2013 /usr/bin/python
lrwxrwxrwx 1 root root    6 Jan 22  2013 /usr/bin/python2 -> python
-rwxr-xr-x 2 root root 5.6K Jan  9  2013 /usr/bin/python2.4

but that doesn't include libpython:

[steve at ando ~]$ ls -lh /usr/lib/libpython2.4.so*
lrwxrwxrwx 1 root root   19 Jan 22  2013 /usr/lib/libpython2.4.so -> libpython2.4.so.1.0
-r-xr-xr-x 1 root root 1.1M Jan  9  2013 /usr/lib/libpython2.4.so.1.0

or the standard library.


-- 
Steve

From ncoghlan at gmail.com  Fri May 29 09:24:32 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 17:24:32 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
Message-ID: <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>

On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com> wrote:
>
> Why not continue to enhance Python 3 instead of wasting our time with
> Python 2? We have limited resources in terms of developers to maintain
> Python.
>
> (I'm not talking about fixing *bugs* in Python 2 which is fine with me.)

I'm actually OK with volunteers deciding that even fixing bugs in 2.7 isn't
inherently rewarding enough for them to be willing to do it for free on
their own time.

Stepping up to extrinsically reward activities that are beneficial for
customers but aren't intrinsically interesting enough for people to be
willing to do for free is one of the key reasons commercial open source
redistributors get paid.

That more explicitly commercial presence is a dynamic we haven't
historically had to deal with in core development, so there are going to be
some growing pains as we find an arrangement that everyone is comfortable
with (or is at least willing to tolerate, but I'm optimistic we can do
better than that).

Cheers,
Nick.

>
> --
>
> By the way, I just wrote sixer, a new tool to generate patches to port
> OpenStack to Python 3 :-)
> https://pypi.python.org/pypi/sixer
>
> It's based on regex, so it's less reliable than 2to3, 2to6 or
> modernize, but it's just enough for my specific use case. On
> OpenStack, it's not possible to send one giant patch "hello, this is
> python 3". Code is modified by small and incremental changes.
>
> Come on in the Python 3 world and... always look on the bright side of
> life ( https://www.youtube.com/watch?v=VOAtCOsNuVM )!
>
> Victor
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com

From njs at pobox.com  Fri May 29 09:36:25 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 29 May 2015 00:36:25 -0700
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
 <20150529011536.60c4a3a5@fsol>
 <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>
Message-ID: <CAPJVwBksfBhzM_6c=PubMXRHNBYU+X5u=kWx+N+OwKVqtsBDUw@mail.gmail.com>

On Thu, May 28, 2015 at 11:40 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>
> On 29 May 2015 9:17 am, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>>
>> On Thu, 28 May 2015 08:48:11 +1000
>> Nick Coghlan <ncoghlan at gmail.com> wrote:
>
>> > After all, the real difference between the alphas and the final releases
>> > isn't about anything *we* do, it's about the testing *other people* do
>> > that
>> > picks up gaps in our test coverage. A gated trunk makes it more feasible
>> > for other projects to do continuous integration against it.
>>
>> Long ago (before I became a core developer) we had "community
>> buildbots" for that. They didn't receive any attention or maintenance
>> from third-party projects.
>
> Right, but it's hard to integrate against trunk when trunk itself may be
> broken. If we had a way of publishing "known good" commit hashes that passed
> the test suite on all the stable buildbots, that could potentially provide a
> basis for integration testing without needing to switch to merge gating
> first.

ISTM the most natural way to publish a "known good" commit hash is by
updating a branch head to point at the latest good version. In fact
this is pretty much the exact use case that motivated the invention of
DVCS back in the day :-). Unfortunately hg makes this a little
trickier than it could be, because in hg the same commit can't be in
two different branches; but this just means you have to insert some
no-op merges, oh well.
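Concretely, the idea might look something like this (a hypothetical sketch using git rather than hg, since in git the same commit can be reachable from any number of branch heads, so no no-op merges are needed; repo layout and names are made up):

```shell
# Sketch: publish "known good" commits by fast-forwarding a dedicated
# ref each time a commit passes the full buildbot run.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "buildbot@example.invalid"
git config user.name "buildbot"
git commit -q --allow-empty -m "change 1"
git commit -q --allow-empty -m "change 2"   # suppose this run passed all bots
git branch -f known-good HEAD               # publish: move the pointer
git rev-parse --short known-good            # the hash third parties test against
```

Third-party projects would then run their integration tests against `known-good` instead of `tip`/`master`.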

Interestingly, this is almost identical to merge gating (at least, if
I'm correctly guessing what you mean by that -- the "not rocket
science rule"?), just with different names for the branches :-).

-n

-- 
Nathaniel J. Smith -- http://vorpus.org

From steve at pearwood.info  Fri May 29 10:36:02 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 29 May 2015 18:36:02 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
	dispatch for Python 2)
In-Reply-To: <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
Message-ID: <20150529083601.GK932@ando.pearwood.info>

On Thu, May 28, 2015 at 01:20:06PM -0400, Donald Stufft wrote:

> I think it's an issue for all platforms, even when there is a system Python
> that can be used.
> 
> Here's why:
> 
> * Even on Linux systems Python isn't always a guaranteed thing to be installed,
>   for instance Debian works just fine without any Python installed.

Donald, are you a Linux user? If so, which distro? Because in the Linux 
world that I'm familiar with, this (and the points you make below) are 
absolutely not an issue. You let the package manager worry about 
dependencies:

yum install myapp  # Red Hat based distros
apt-get install myapp  # Debian based distros

will ensure that the right version of Python is installed.

In the circles I move in, installing anything which does not go through 
the package manager is considered to be quite dubious. In order of 
preference (best to worst):

- install from official distro repositories;
- install from a third-party repo;
- install from source;
- find an alternative application;
- do without;
- install from some binary format (also known as "would you like
  a root kit with that?").


[...]
> * Even if you have Python installed already, is it the right one? What if it's
>   an ancient RHEL box that has 2.6 or (heaven forbid) 2.4? What if it's a not
>   ancient box that has Python 2.7 but you want to deploy your app in Python 3?

Most recent distros have both a python2 and python3 package, and when 
building your rpm or deb file, you specify which is your dependency in 
the normal fashion.
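For instance, the dependency declarations might look like this (illustrative fragments, not from this thread; package name and versions are made up):

```text
# RPM spec file (myapp.spec) -- Red Hat based distros:
Requires: python3 >= 3.4

# Debian control file (debian/control) -- Debian based distros:
Depends: python3 (>= 3.4), ${misc:Depends}
```

The package manager then pulls in the right interpreter automatically when the user installs myapp.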


> * What if you have Python installed already, but it's been patched by the place
>   you got it from and now the behavior is different than what you expected?

Normally you would write for the version of Python provided by the 
distros you wish to support. In practice that might mean writing hybrid 
code targeting (say) 2.6 and 2.7, which covers most recent Red Hat and 
Debian based systems, and anything else, you provide the source code and 
let the user work it out.
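Such "hybrid" code typically feature-detects rather than assuming one interpreter version, along these lines (a minimal sketch; the names here are illustrative, not from the thread, and the same pattern extends to 2/3-compatible code):

```python
# Hybrid code: one source file that runs unchanged across interpreter
# versions by using __future__ imports and version checks.
from __future__ import print_function  # harmless no-op on Python 3

import sys

if sys.version_info[0] >= 3:
    text_type = str
else:
    text_type = unicode  # noqa: F821 -- only evaluated on Python 2


def greet(name):
    # isinstance() against the aliased type works on both major versions
    if not isinstance(name, (text_type, bytes)):
        raise TypeError("expected a string")
    return "Hello, %s" % name


print(greet("world"))
```

Anything that can't be bridged this way (syntax-level differences, say) is what forces separate code paths or dropping support for the older distro.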

I suppose that in principle you could include whatever version of Python 
you like *in your application*, but that would be considered an unusual 
thing to do. I believe that LibreOffice and OpenOffice do that, so they 
can support Python as a scripting language, but they're generally 
considered (1) a special case because they're cross-platform, and (2) 
not "proper" Unix or Linux apps anyway.

The point is, in the Linux circles I move in, this idea of single file 
installation would be about as popular as a police raid at a rave club. 
Maybe you move in different circles (perhaps more enterprisey?), but I 
can already imagine the sort of derogatory comments the sys admins I 
work with would make about this idea. 



-- 
Steve

From p.f.moore at gmail.com  Fri May 29 11:38:36 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 29 May 2015 10:38:36 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150529083601.GK932@ando.pearwood.info>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
Message-ID: <CACac1F8=zj0MfyGm+r_Y+mdhFwxsK1ftNQSs+fsGE4yEAxooLw@mail.gmail.com>

On 29 May 2015 at 09:36, Steven D'Aprano <steve at pearwood.info> wrote:
> The point is, in the Linux circles I move in, this idea of single file
> installation would be about as popular as a police raid at a rave club.
> Maybe you move in different circles (perhaps more enterprisey?), but I
> can already imagine the sort of derogatory comments the sys admins I
> work with would make about this idea.

In my environments, we frequently have ancient versions of RHEL
installed, sometimes with no Python at all (IIRC) or nothing better
than 2.4. The sysadmins won't install newer versions, as Python isn't
formally needed, but we'd happily use it for ad-hoc admin-style scripts
(the alternative is typically shell scripts or nothing). It's not
exactly sanctioned, but being able to turn an admin script into a small
single-file executable in the support user's home directory would be
immensely useful.

It's hardly a core use case, and we generally just live with shell
scripts, but such environments *do* exist :-(

Paul

From stephen at xemacs.org  Fri May 29 12:24:56 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Fri, 29 May 2015 19:24:56 +0900
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <CAPJVwBksfBhzM_6c=PubMXRHNBYU+X5u=kWx+N+OwKVqtsBDUw@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
 <20150529011536.60c4a3a5@fsol>
 <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>
 <CAPJVwBksfBhzM_6c=PubMXRHNBYU+X5u=kWx+N+OwKVqtsBDUw@mail.gmail.com>
Message-ID: <87d21jvfyf.fsf@uwakimon.sk.tsukuba.ac.jp>

Nathaniel Smith writes:

 > DVCS back in the day :-). Unfortunately hg makes this a little
 > trickier than it could be, because in hg the same commit can't be in
 > two different branches; but this just means you have to insert some
 > no-op merges, oh well.

Don't use named branches ("friends don't let friends ...").  Use
bookmarks.  Theoretically that should work fine.


From p.f.moore at gmail.com  Fri May 29 12:33:58 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 29 May 2015 11:33:58 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <55678404.40602@g.nevcal.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
Message-ID: <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>

On 28 May 2015 at 22:09, Glenn Linderman <v+python at g.nevcal.com> wrote:
> This would be something I could use and benefit from immediately upon it
> being available, so I laud your idea, and hope you have a successful
> implementation, and look forward to using it.  It would largely replace the
> need for the py.exe launcher for some classes of applications.

The following proof-of-concept works as is (based on my pretty minimal
testing), and only uses the limited API, so it should work with any
version of Python 3 (I've not tested it with Python 2, but I think the
only "new" API is PySys_SetArgvEx, which could be replaced with
PySys_SetArgv at a pinch). Excuse the dreadful coding style and lack
of error handling, I hacked it up in about an hour :-)

(Actually, I just tried building on Python 2 - guess what - Unicode
:-) SetProgramName and SetArgvEx won't take Unicode values. The easy
fix is just not to use Unicode, the hard one is to do the encoding
dance, but I'm not going to bother...).

#define UNICODE
#define _UNICODE
#include <Python.h>
#include <windows.h>

int
main()
{
    TCHAR program[MAX_PATH];
    LPWSTR *argv;
    int argc;
    int status;
    PyObject *runpy;
    PyObject *ret;

    argv = CommandLineToArgvW(GetCommandLineW(), &argc);
    GetModuleFileName(NULL, program, MAX_PATH);
    Py_SetProgramName(program);  /* optional but recommended */
    Py_Initialize();
    PySys_SetArgvEx(argc, argv, 0);
    LocalFree(argv);  /* PySys_SetArgvEx copies argv, so this is safe now */
    runpy = PyImport_ImportModule("runpy");
    if (!runpy) PyErr_Print();
    /* run_path() executes the script/zipapp appended to this executable */
    ret = PyObject_CallMethod(runpy, "run_path", "u", program);
    if (!ret) PyErr_Print();
    status = ret ? 0 : 1;
    Py_XDECREF(ret);
    Py_XDECREF(runpy);
    Py_Finalize();
    return status;
}

One mildly annoying thing is that python3.dll is only installed in
<python install dir>\DLLs, which typically isn't on PATH. So actually
using the limited API from your own application fails by default.
Fixing that's mostly a user admin issue, though (and you can just link
to the full API and avoid the whole problem).

> Of course, per other disccusions, this doesn't solve the problem for:
>
> A) machine without Python installed
> B) programs that need binary extensions
>
> Other discussions have suggested:
>
> 3) The stub could offer to download and install Python
>
> A corollary:
>
> 4) The stub could offer to download and install the needed binary extensions
> as well as Python. This would require the installation uniformity of
> something like pip, so perhaps would be restricted to extensions available
> via pip.  And it would be much enhanced by some technique where the zipapp
> would contain metadata readable by the stub, that would declare the list of
> binary extensions required.  Or, of course, it could even declare non-binary
> extension that are not packaged with the zipapp, if the process is smooth,
> the modules available via pip, etc., as a tradeoff.

I'm pretty strongly against downloading interpreters or extensions.
Apart from the pretty huge added complexity, as a user I'm not sure
I'd trust a supposedly simple application I'd received if it started
asking to download stuff unexpectedly...

Paul

From stephen at xemacs.org  Fri May 29 12:35:57 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Fri, 29 May 2015 19:35:57 +0900
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F8=zj0MfyGm+r_Y+mdhFwxsK1ftNQSs+fsGE4yEAxooLw@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
 <CACac1F8=zj0MfyGm+r_Y+mdhFwxsK1ftNQSs+fsGE4yEAxooLw@mail.gmail.com>
Message-ID: <87bnh3vfg2.fsf@uwakimon.sk.tsukuba.ac.jp>

Paul Moore writes:

 > In my environments, we frequently have ancient versions of RHEL
 > installed, sometimes with no Python at all (IIRC) or nothing better
 > than 2.4.

That's pretty advanced as older Red Hat systems go.  You're lucky it
isn't 1.5.2!

Getting serious, Red Hat systems have included Python for about as
long as the youngest core committers have been alive, and I'm sure
that goes back before they called it "Enterprise" Linux.




From steve at pearwood.info  Fri May 29 12:53:53 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 29 May 2015 20:53:53 +1000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
	dispatch for Python 2)
In-Reply-To: <CADiSq7dtNX3fDbZJgsLF1tkgRaK-_O_L-YZWmBS+az3RFwbmYw@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
 <CADiSq7dtNX3fDbZJgsLF1tkgRaK-_O_L-YZWmBS+az3RFwbmYw@mail.gmail.com>
Message-ID: <20150529105352.GL932@ando.pearwood.info>

On Fri, May 29, 2015 at 07:08:43AM +1000, Nick Coghlan wrote:
> On 29 May 2015 05:25, "Chris Barker" <chris.barker at noaa.gov> wrote:
> >
> > OK, I'm really confused here:
> >
> > 1) what the heck is so special about go all of a sudden? People have been
> > writing and deploying single file executables built with C and C++, 
> > and whatever else, forever. (and indeed, it was a big sticking point 
> > for me when I introduced python in my organization)
> 
> For scientific Python folks, the equivalent conversations I have are about
> Julia.
> 
> If you're not used to thinking of Python's competitive position as "best
> orchestration language, solid competitor in any given niche", then the rise
> of niche specific competitors like Go & Julia can feel terrifying, as the
> relatively narrow user base changes the trade-offs you can make in the
> language & ecosystem design to better optimise them for that purpose.

We've been there before, with the "Ruby is the Python-killer" FUD of a 
few years ago. If Go is different and does overtake Python, I think it 
will be due to its privileged position on Android.

Personally, I don't pay a lot of attention to language popularity 
statistics. Who cares whether Python is used by 8% of projects or 10% of 
projects? Either way, it's huge. But from time to time, it might be 
useful to look at a few different measurements of popularity.

According to CodeEval, Python is still *by far* the most popular 
language, at 31.2% (second place is below 20%), with Go at #9 with 2.3%.

According to Redmonk, Python is stable at #4, while Go has jumped from 
#21 to #17 in six months.

LangPop gives various different measures of popularity, and according to 
the overall summary, Python is at #6, but Go doesn't appear to be a 
language they look at.

TIOBE has Python moving up two places to #6, and Go (which was the 2009 
"Hall Of Fame" winner) doesn't even appear in the top 100.

http://blog.codeeval.com/codeevalblog/2015
http://redmonk.com/sogrady/2015/01/14/language-rankings-1-15/
http://langpop.com/
http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html


I think there are some exciting and interesting languages coming up: 
Swift, Julia, Go, Rust and others. Why are we threatened by this? Python 
makes a wonderful glue language. It would be great for Python to glue to 
more than just C and Fortran code. For scientific users, imagine being 
able to call Julia code from Python, and vice versa. Instead of thinking 
of Go as an opponent to beat, wouldn't it be great to be able to write 
extensions in a modern language like Go, Rust or D instead of creaky old 
C with all its safety issues?
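That glue role ultimately rests on the C ABI, and any language that can export C symbols (Rust, Go via cgo, D) can be driven the same way. A minimal sketch with ctypes, loading the C standard library itself as a stand-in for such an extension (purely illustrative):

```python
# Load a shared library through the C ABI and call one of its symbols.
# A Rust/Go/D extension exporting C symbols would be loaded identically.
import ctypes
import ctypes.util

# find_library() locates the platform's libc; fall back to the common
# Linux soname if lookup fails (assumption: a glibc-style system).
libname = ctypes.util.find_library("c") or "libc.so.6"
libc = ctypes.CDLL(libname)

# Declare the signature of a symbol we know libc exports
libc.abs.restype = ctypes.c_int
libc.abs.argtypes = [ctypes.c_int]

print(libc.abs(-42))
```

cffi offers the same reach with a friendlier interface, which is why Rust's C compatibility matters here.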



-- 
Steve

From pmiscml at gmail.com  Fri May 29 13:25:09 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Fri, 29 May 2015 14:25:09 +0300
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150529105352.GL932@ando.pearwood.info>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
 <CADiSq7dtNX3fDbZJgsLF1tkgRaK-_O_L-YZWmBS+az3RFwbmYw@mail.gmail.com>
 <20150529105352.GL932@ando.pearwood.info>
Message-ID: <20150529142509.6fa315bb@x230>

Hello,

On Fri, 29 May 2015 20:53:53 +1000
Steven D'Aprano <steve at pearwood.info> wrote:

[ insightful statistics skipped ]

> I think there are some exciting and interesting languages coming up: 
> Swift, Julia, Go, Rust and others. 

Only those? Every one in a dozen university students comes up with an
exciting, interesting language - it has always been like that. It's the
follow-through of development and maintenance that separates a language
from the common crowd.

> Why are we threatened by this?

Because at least some of them are backed by media companies, who use
them as leverage for their advertisement and PR campaigns. Obviously,
media companies already have great advertisement influence, and can
fool anybody's head with their tricks.

> Python makes a wonderful glue language. It would be great for Python
> to glue to more than just C and Fortran code. For scientific users,
> imagine being able to call Julia code from Python, and vice versa.

There "always" were things like integration of Python and Lua, etc.
Did somebody use them? No, they're of interest only to their authors. 

> Instead of thinking of Go as an opponent to beat, wouldn't it be
> great to be able to write extensions in a modern language like Go,
> Rust or D instead of creaky old C with all its safety issues?

Because very few people use Go, Rust, or D at all. And those who do are
likely concentrated in a small niche. Going two levels deep, using an
experimental language L in some arbitrary area A, holds almost zero
interest for the majority of the population. Let's wait until Rust
gathers some real rust and talk again.

> -- 
> Steve


-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From ncoghlan at gmail.com  Fri May 29 13:39:55 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 29 May 2015 21:39:55 +1000
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <87d21jvfyf.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <CADiSq7co3YUt8RgAGBcSow+Zxo3-dsRC7EvUpqWiZtYXDKFEHQ@mail.gmail.com>
 <CADiSq7dw8LYkDNbmXDrno9DUm0AstNbZnkbV+ZW8aGXY-K-vfw@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
 <20150529011536.60c4a3a5@fsol>
 <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>
 <CAPJVwBksfBhzM_6c=PubMXRHNBYU+X5u=kWx+N+OwKVqtsBDUw@mail.gmail.com>
 <87d21jvfyf.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <CADiSq7e6HAiES-THOnXPcy-dOL5+hELakFZHjLopJrokK7dmag@mail.gmail.com>

On 29 May 2015 20:24, "Stephen J. Turnbull" <stephen at xemacs.org> wrote:
>
> Nathaniel Smith writes:
>
>  > DVCS back in the day :-). Unfortunately hg makes this a little
>  > trickier than it could be, because in hg the same commit can't be in
>  > two different branches; but this just means you have to insert some
>  > no-op merges, oh well.
>
> Don't use named branches ("friends don't let friends ...").  Use
> bookmarks.  Theoretically that should work fine.

The key is whether or not we can readily notify people when the "most
recent known good" hash *changes*, and less about the mechanics of how we
then record the history of which commits *were* stable, or the identity of
the most recent commit.

That said, prompted by Nathaniel's comment, I realised that having a
"post-BuildBot" stable repo is one possible way we could achieve that. That
way we could introduce merge gating without needing to change anything at
all about how we manage the current development repo, we'd just push stable
versions to a separate repo that only the BuildBot master (or a dedicated
service monitoring for successful cross-platform BuildBot runs) and the
release managers had permissions to push to.

The commit hooks on that *stable* repo could then be used to trigger third
party test suites only for builds that at least passed CPython's own test
suite.
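Such a hook could be tiny (a hypothetical sketch; the CI endpoint and ref names are made up):

```shell
# Post-receive hook for the *stable* repo: every ref update landing here
# already passed CPython's test suite, so just announce the new commit
# to third-party CI (ci.example.invalid is a placeholder endpoint).
notify_third_party_ci() {
    # A post-receive hook gets "old new ref" lines on stdin
    while read old new ref; do
        echo "trigger: https://ci.example.invalid/build?commit=$new ($ref)"
    done
}

# Simulate one ref update being pushed:
printf '%s %s %s\n' 0000000 abc1234 refs/heads/stable | notify_third_party_ci
```

A real hook would POST to the CI service instead of echoing, but the shape is the same.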

Cheers,
Nick.

From solipsis at pitrou.net  Fri May 29 13:53:23 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 29 May 2015 13:53:23 +0200
Subject: [Python-Dev] time-based releases (was Re: Preserving the
 definition order of class namespaces.)
In-Reply-To: <CADiSq7e6HAiES-THOnXPcy-dOL5+hELakFZHjLopJrokK7dmag@mail.gmail.com>
References: <CALFfu7CdzTFsZcOENZwCCGYxdXZtLpG5vx6iQvPig_89Y23xhg@mail.gmail.com>
 <55614230.5010904@hastings.org> <20150525093314.3ce18048@fsol>
 <CALFfu7CfLb4C5HvDnG7qqZufDh6W5u7ws3r-por+xwtQFgC63A@mail.gmail.com>
 <mk01cd$lp1$1@ger.gmane.org>
 <CALFfu7DZvqTgNYHGu+0zOXJ1xAOhN7ne7_JrmJnL330g4FcBGQ@mail.gmail.com>
 <5563BE8A.9070406@hastings.org> <20150527101622.10a36d75@fsol>
 <mk5c6e$ruv$1@ger.gmane.org>
 <20150527173426.3a829b78@anarchist.wooz.org>
 <CAP7+vJK=9j9N_d2RmzJ-c90FW4Uq0QqY4ep9JkvbvEiUn5pgFA@mail.gmail.com>
 <CADiSq7e_DY+3cP=oyMzOv8_GoY5wHvLYN_hXE-egPduZ7YsKYw@mail.gmail.com>
 <CADiSq7eq98tFt+fymQiE++7aTd5og0E2oDySthamHhUpv=oFsA@mail.gmail.com>
 <20150529011536.60c4a3a5@fsol>
 <CADiSq7fTiRMjDHFiVCeOKr1WWYgvVc2daxvU4opGcct7qL7Rcw@mail.gmail.com>
 <CAPJVwBksfBhzM_6c=PubMXRHNBYU+X5u=kWx+N+OwKVqtsBDUw@mail.gmail.com>
 <87d21jvfyf.fsf@uwakimon.sk.tsukuba.ac.jp>
 <CADiSq7e6HAiES-THOnXPcy-dOL5+hELakFZHjLopJrokK7dmag@mail.gmail.com>
Message-ID: <20150529135323.7def0017@fsol>

On Fri, 29 May 2015 21:39:55 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> The key is whether or not we can readily notify people when the "most
> recent known good" hash *changes*, and less about the mechanics of how we
> then record the history of which commits *were* stable, or the identity of
> the most recent commit.
> 
> That said, prompted by Nathaniel's comment, I realised that having a
> "post-BuildBot" stable repo is one possible way we could achieve that. That
> way we could introduce merge gating without needing to change anything at
> all about how we manage the current development repo, we'd just push stable
> versions to a separate repo that only the BuildBot master (or a dedicated
> service monitoring for successful cross-platform BuildBot runs) and the
> release managers had permissions to push to.

Any amount of merge gating or other automated workflow is dependent on
stabilizing our test suite and buildbot fleet, so that regressions can
be unambiguously spotted. 

Regards

Antoine.

From donald at stufft.io  Fri May 29 14:35:44 2015
From: donald at stufft.io (Donald Stufft)
Date: Fri, 29 May 2015 08:35:44 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <20150529083601.GK932@ando.pearwood.info>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
Message-ID: <etPan.55685d20.63d83f03.18516@Draupnir.home>



On May 29, 2015 at 4:37:37 AM, Steven D'Aprano (steve at pearwood.info) wrote:
> On Thu, May 28, 2015 at 01:20:06PM -0400, Donald Stufft wrote:
>  
> > I think it's an issue for all platforms, even when there is a system Python
> > that can be used.
> >
> > Here's why:
> >
> > * Even on Linux systems Python isn't always a guaranteed thing to be installed,
> > for instance Debian works just fine without any Python installed.
>  
> Donald, are you a Linux user? If so, which distro? Because in the Linux
> world that I'm familiar with, this (and the points you make below) are
> absolutely not an issue. You let the package manager worry about
> dependencies:
>  
> yum install myapp # Red Hat based distros
> apt-get install myapp # Debian based distros
>  
> will ensure that the right version of Python is installed.
>  
> In the circles I move in, installing anything which does not go through
> the package manager is considered to be quite dubious. In order of
> preference (best to worst):
>  
> - install from official distro repositories;
> - install from a third-party repo;
> - install from source;
> - find an alternative application;
> - do without;
> - install from some binary format (also known as "would you like
> a root kit with that?").
>  
>  
> [...]
> > * Even if you have Python installed already, is it the right one? What if it's
> > an ancient RHEL box that has 2.6 or (heaven forbid) 2.4? What if it's a not
> > ancient box that has Python 2.7 but you want to deploy your app in Python 3?
>  
> Most recent distros have both a python2 and python3 package, and when
> building your rpm or deb file, you specify which is your dependency in
> the normal fashion.
>  
>  
> > * What if you have Python installed already, but it's been patched by the place
> > you got it from and now the behavior is different than what you expected?
>  
> Normally you would write for the version of Python provided by the
> distros you wish to support. In practice that might mean writing hybrid
> code targeting (say) 2.6 and 2.7, which covers most recent Red Hat and
> Debian based systems, and anything else, you provide the source code and
> let the user work it out.
>  
> I suppose that in principle you could include whatever version of Python
> you like *in your application*, but that would be considered an unusual
> thing to do. I believe that LibreOffice and OpenOffice do that, so they
> can support Python as a scripting language, but they're generally
> considered (1) a special case because they're cross-platform, and (2)
> not "proper" Unix or Linux apps anyway.
>  
> The point is, in the Linux circles I move in, this idea of single file
> installation would be about as popular as a police raid at a rave club.
> Maybe you move in different circles (perhaps more enterprisey?), but I
> can already imagine the sort of derogatory comments the sys admins I
> work with would make about this idea.
>  

I use Linux for servers yes, I don't stick with a single Distribution and right
now I manage services that are running on CentOS, Ubuntu, Debian, Alpine, and
FreeBSD (not a Linux, but w/e).

Here's the thing though: when you make software that is designed for other
people to consume, you have two choices. Either you try to anticipate
every single environment they might possibly run it in and what is
available there, so that your software either runs there or you can provide
instructions on how to run it there, or you depend on as little from
the OS as possible.

An example of a product that does this is Chef, they install their own Ruby
and everything but libc into /opt/chef to completely isolate themselves from
the host system. I'm told this made things *much* easier for them as they
don't really have to worry at all about what's available on the host system,
Chef pretty much just works.

Another example is one that I personally worked on recently, where the company
I worked for wanted to distribute a CLI to our customers which would
"just work" that they could use to interact with the service we provided. We
were writing this client in Python and it was very painful to satisfy the
requirement that it work without Python installed as a single file executable.
That was a hard requirement as anything else would be more difficult to
distribute to end users and be more likely to break. The entire time I was
working on that particular piece I was feeling like I should be suggesting to
my manager that we throw away the Python code and just write it in Go.

As folks may or may not know, I'm heavily involved in pip, which is probably one
of the most widely used CLIs written in Python. A single file executable won't
help pip itself; however, through my experience there I can tell you that a
significant portion of our issues come from random weird differences in how a
particular person's Python environment is set up. If someone is distributing a
CLI app written in Python, being able to remain independent from the OS
eliminates a whole class of problems.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From donald at stufft.io  Fri May 29 14:50:15 2015
From: donald at stufft.io (Donald Stufft)
Date: Fri, 29 May 2015 08:50:15 -0400
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CADiSq7e3e0jUV=MP=1xffL5TtUn_2d-HemkXHGrTW3cuCtK2FA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
 <etPan.5567a952.f2533e8.18516@Draupnir.home>
 <CADiSq7e3e0jUV=MP=1xffL5TtUn_2d-HemkXHGrTW3cuCtK2FA@mail.gmail.com>
Message-ID: <etPan.55686087.7d9049f5.18516@Draupnir.home>



On May 29, 2015 at 2:58:28 AM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> On 29 May 2015 9:48 am, "Donald Stufft" wrote:
> >
> >
> >
> > On May 28, 2015 at 7:40:26 PM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> > > >
> > > > One thing I've seen more than once is that new development happens
> > > in Python
> > > > until the problem is understood, then the code is ported to Go.
> > > Python's
> > > > short path from idea to working code, along with its ability
> > > to quickly morph
> > > > as requirements and understanding changes, its batteries
> > > included philosophy,
> > > > and its "fits-your-brain" consistency are its biggest strengths!
> > >
> > >
> > > Right, Go is displacing C/C++ in that regard (moreso than Python
> > > itself), and now that Rust has hit 1.0, I expect we'll see it becoming
> > > another contender for this task. Rust's big advantage over Go
> > > in that regard is being compatible with the C/C++ ecosystem,
> > > including Python's cffi.
> > >
> >
> > I'm not sure if I'm reading this right or not, but just to be clear, I've
> > seen a number of people express the sentiment that they are switching from
> > Python to Go and that the deployment story is one of the reasons. It's not
> > just people switching from C/C++.
>  
> C and C++ used to be the main "second version" languages used to create
> statically linked standalone binaries after an initial prototype in Python.
>  
> Folks that learned Python first understandably weren't keen on that idea,
> so they tended to either use Cython (or its predecessor, Pyrex), or else
> not bother doing it at all until first Go and now Rust came along (for
> reasons unknown to me, D appears to have never gained any popularity
> outside the ACM crowd).
>  
> If I seem blasé about Go, that's the main reason why - the benefits it
> offers aren't novel from the point of view of C/C++ programmers, they're
> just now available without having to put up with arcane syntax, manual
> memory management, an unmaintainable threading model, relatively poor
> support for text manipulation, etc, etc.
>  

I don't think Go is going to "kill" Python or anything, but I do think that
taking a look at other languages, and at why people are picking them over
Python, is important; otherwise we will end up dying (and would deserve to)
because we'd be like the big company that didn't bother to keep up with the
times and just assumed it would be big forever. I talk to a lot of people about
the distribution story of Python applications, what works and what doesn't.
A very large majority of the people who have used both Go and Python in a
serious capacity have indicated that they've at least considered writing new
things in Go instead of Python because distributing it is much easier, and a
not insignificant number of them have in fact started to switch to Go in
situations where they are trying to distribute things to disparate boxes.

This might be something that people could have done before with C/C++, but
with a nicer language behind it... but that's kind of the point? You don't
need to be stuck with a terrible language to get a nice single-file executable
anymore; you can get that and use a good language at the same time, which
makes it a lot more compelling to a lot more people than being stuck with C.

---  
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA



From solipsis at pitrou.net  Fri May 29 14:57:40 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 29 May 2015 14:57:40 +0200
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
Message-ID: <20150529145740.6b459e6f@fsol>

On Fri, 29 May 2015 18:36:02 +1000
Steven D'Aprano <steve at pearwood.info> wrote:
> 
> The point is, in the Linux circles I move in, this idea of single file 
> installation would be about as popular as a police raid at a rave club. 

This is frankly not true. There are many programs (e.g. games) which are
not available as distribution packages, and can't rely on the user's
distribution to provide the right versions of the required pieces of
infrastructure. Those programs have to bundle the whole stack
independently.

Perhaps in *your* Linux circle you never use such programs, but they do
exist and are part of the ecosystem.

Regards

Antoine.



From souravsaket31 at gmail.com  Fri May 29 14:35:56 2015
From: souravsaket31 at gmail.com (Saket Sourav)
Date: Fri, 29 May 2015 12:35:56 +0000
Subject: [Python-Dev] Not getting the exact file to start
Message-ID: <CADPg4o4td1Wdr+-6QV5Sa4AxPhC4VJJz3SYaJafvLV9JcVRFMw@mail.gmail.com>

Hello sir.
I have just installed Python 3.4.2.
I can't find the file 'IDLE (Python GUI)'
to start programming.
Which file should I open to write code?

From brett at python.org  Fri May 29 15:38:00 2015
From: brett at python.org (Brett Cannon)
Date: Fri, 29 May 2015 13:38:00 +0000
Subject: [Python-Dev] Not getting the exact file to start
In-Reply-To: <CADPg4o4td1Wdr+-6QV5Sa4AxPhC4VJJz3SYaJafvLV9JcVRFMw@mail.gmail.com>
References: <CADPg4o4td1Wdr+-6QV5Sa4AxPhC4VJJz3SYaJafvLV9JcVRFMw@mail.gmail.com>
Message-ID: <CAP1=2W5YtD2n_u5R=nicnrB8pMmOHjiAW8DEP0Vg5j19TNfFig@mail.gmail.com>

This mailing list is for the development *of* Python, not *with* it. Your
best option for getting help like this is python-list at python.org.

On Fri, May 29, 2015 at 9:36 AM Saket Sourav <souravsaket31 at gmail.com>
wrote:

> Hello sir.
> I have just installed Python 3.4.2.
> I can't find the file 'IDLE (Python GUI)'
> to start programming.
> Which file should I open to write code?
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/brett%40python.org
>

From yselivanov.ml at gmail.com  Fri May 29 15:57:24 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 29 May 2015 09:57:24 -0400
Subject: [Python-Dev] Obtaining stack-frames from co-routine objects
In-Reply-To: <CABZ0LtAPMi9nQb9ZSp4jW4-kvTte_WVjz=wtKvUWmGeSwuyqVA@mail.gmail.com>
References: <CABZ0LtAPMi9nQb9ZSp4jW4-kvTte_WVjz=wtKvUWmGeSwuyqVA@mail.gmail.com>
Message-ID: <55687044.1090700@gmail.com>

Hi Ben,

Is there any real-world scenario where you would need this?

It looks like this can help with debugging, somehow, but the easiest
solution is to put a "if debug: log(...)" before "yield" in your
"switch()" function.  You'll have a perfect traceback there.

Thanks,
Yury

On 2015-05-29 12:46 AM, Ben Leslie wrote:
> Hi all,
>
> Apologies in advance; I'm not a regular, and this may have been
> handled already (but I couldn't find it when searching).
>
> I've been using the new async/await functionality (congrats again to
> Yury on getting that through!), and I'd like to get a stack trace
> between the place at which blocking occurs and the outer co-routine.
>
> For example, consider this code:
>
> """
> async def a():
>      await b()
>
> async def b():
>      await switch()
>
> @types.coroutine
> def switch():
>      yield
>
> coro_a = a()
> coro_a.send(None)
> """
>
> At this point I'd really like to be able to somehow get a stack trace
> similar to:
>
> test.py:2
> test.py:4
> test.py:9
>
> Using the gi_frame attribute of coro_a, I can get the line number of
> the outer frame (e.g.: line 2), but from there there is no way to
> descend the stack to reach the actual yield point.
>
> I thought that perhaps the switch() co-routine could yield the frame
> object returned from inspect.currentframe(); however, once that
> function yields, the frame object's f_back is changed to None.
>
> A hypothetical approach would be to work the way down from the
> outer-frame, but that requires getting access to the co-routine object
> that the outer-frame is currently await-ing. Some hypothetical code
> could be:
>
> """
> def show(coro):
>      print("{}:{}".format(coro.gi_frame.f_code.co_filename,
> coro.gi_frame.f_lineno))
>      if dis.opname[coro.gi_code.co_code[coro.gi_frame.f_lasti + 1]] ==
> 'YIELD_FROM':
>          show(coro.gi_frame.f_stack[0])
> """
>
> This relies on the fact that an await-ing co-routine will be executing
> a YIELD_FROM instruction. The above code uses a completely
> hypothetical 'f_stack' property of frame objects to pull the
> co-routine object which a co-routine is currently await-ing from the
> stack. I've implemented a proof-of-concept f_stack property in the
> frameobject.c just to test out the above code, and it seems to work.
>
> With all that, some questions:
>
> 1) Does anyone else see value in trying to get the stack-trace down to
> the actual yield point?
> 2) Is there a different way of doing it that doesn't require changes
> to Python internals?
> 3) Assuming no to #2 is there a better way of getting the information
> compared to the pretty hacky byte-code/stack inspection?
>
> Thanks,
>
> Ben
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com
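
For reference, CPython 3.5 exposes enough introspection to walk an await
chain without a new f_stack attribute: a suspended native coroutine carries
cr_frame and cr_await, and a generator-based coroutine carries gi_frame and
gi_yieldfrom. A minimal sketch of the traceback Ben asks for, using only
those attributes (the await_stack helper name is my own, not stdlib):

```python
import types

async def a():
    await b()

async def b():
    await switch()

@types.coroutine
def switch():
    yield

def await_stack(coro):
    """Walk from an outer coroutine down to its suspension point,
    collecting (function name, line number) pairs along the way."""
    stack = []
    while coro is not None:
        frame = getattr(coro, 'cr_frame', None) or getattr(coro, 'gi_frame', None)
        if frame is None:
            break
        stack.append((frame.f_code.co_name, frame.f_lineno))
        # cr_await (native coroutines) / gi_yieldfrom (generator-based
        # coroutines) point at the object currently being awaited, if any.
        coro = getattr(coro, 'cr_await', None) or getattr(coro, 'gi_yieldfrom', None)
    return stack

coro_a = a()
coro_a.send(None)  # run until the innermost bare yield
names = [name for name, _ in await_stack(coro_a)]
print(names)       # outer coroutine first, yield point last
coro_a.close()
```

This descends from the outer frame to the actual yield point without
byte-code inspection, answering question #2 for Python 3.5 and later.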


From p.andrefreitas at gmail.com  Fri May 29 15:41:54 2015
From: p.andrefreitas at gmail.com (=?UTF-8?Q?Andr=C3=A9_Freitas?=)
Date: Fri, 29 May 2015 13:41:54 +0000
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.55686087.7d9049f5.18516@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
 <etPan.5567a952.f2533e8.18516@Draupnir.home>
 <CADiSq7e3e0jUV=MP=1xffL5TtUn_2d-HemkXHGrTW3cuCtK2FA@mail.gmail.com>
 <etPan.55686087.7d9049f5.18516@Draupnir.home>
Message-ID: <CAMkX=YWGV2RBVz2yCqBJqDQ-YkCXp1N=+s0aBfv46Ok4QEkrYw@mail.gmail.com>

Speaking of distribution, I believe pip is the simplest way of
distributing. I have used some freezing tools in the past, such as cx_Freeze,
but with more complex projects they start being hard to manage. Now, instead
of telling people to go to a URL, download something, and put it on the path,
I just say: pip install <project>

Unfortunately, this approach only works well with products built for
developers.
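
The standard mechanism behind "pip install <project>" putting a command on
the user's PATH is a setuptools console_scripts entry point. A minimal
setup.py configuration sketch (the "mycli" project name, module, and main()
function are hypothetical placeholders):

```python
# setup.py - minimal sketch; "mycli" and its main() are hypothetical.
from setuptools import setup

setup(
    name="mycli",
    version="0.1",
    py_modules=["mycli"],
    entry_points={
        "console_scripts": [
            # After "pip install mycli", a `mycli` command is generated
            # on PATH that dispatches to the main() function in mycli.py.
            "mycli = mycli:main",
        ],
    },
)
```

This still assumes the user already has Python and pip, which is why it only
works well for developer-facing tools.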

On Fri, 29/05/2015, 13:50, Donald Stufft <donald at stufft.io> wrote:

>
>
> On May 29, 2015 at 2:58:28 AM, Nick Coghlan (ncoghlan at gmail.com) wrote:
> > On 29 May 2015 9:48 am, "Donald Stufft" wrote:
> > >
> > >
> > >
> > > On May 28, 2015 at 7:40:26 PM, Nick Coghlan (ncoghlan at gmail.com)
> wrote:
> > > > >
> > > > > One thing I've seen more than once is that new development happens
> > > > in Python
> > > > > until the problem is understood, then the code is ported to Go.
> > > > Python's
> > > > > short path from idea to working code, along with its ability
> > > > to quickly morph
> > > > > as requirements and understanding changes, its batteries
> > > > included philosophy,
> > > > > and its "fits-your-brain" consistency are its biggest strengths!
> > > >
> > > >
> > > > Right, Go is displacing C/C++ in that regard (moreso than Python
> > > > itself), and now that Rust has hit 1.0, I expect we'll see it
> becoming
> > > > another contender for this task. Rust's big advantage over Go
> > > > in that regard is being compatible with the C/C++ ecosystem,
> > > > including Python's cffi.
> > > >
> > >
> > > I'm not sure if I'm reading this right or not, but just to be clear,
> I've
> > > seen a number of people express the sentiment that they are switching
> from
> > > Python to Go and that the deployment story is one of the reasons. It's
> not
> > > just people switching from C/C++.
> >
> > C and C++ used to be the main "second version" languages used to create
> > statically linked standalone binaries after an initial prototype in
> Python.
> >
> > Folks that learned Python first understandably weren't keen on that idea,
> > so they tended to either use Cython (or its predecessor, Pyrex), or else
> > not bother doing it at all until first Go and now Rust came along (for
> > reasons unknown to me, D appears to have never gained any popularity
> > outside the ACM crowd).
> >
> > If I seem blasé about Go, that's the main reason why - the benefits it
> > offers aren't novel from the point of view of C/C++ programmers, they're
> > just now available without having to put up with arcane syntax, manual
> > memory management, an unmaintainable threading model, relatively poor
> > support for text manipulation, etc, etc.
> >
>
> I don't think Go is going to "kill" Python or anything, but I do think that
> not taking a look at other languages and why people are picking them over
> Python is important, otherwise we will end up dying (and would deserve to)
> because we'd be like the big company that didn't bother to keep up with the
> times and just assumed we'd be big forever. I talk to a lot of people about
> the distribution story of Python applications, what works and what doesn't.
> A very large majority of the people who have used both Go and Python in a
> serious capacity have indicated that they've at least considered writing
> new
> things in Go instead of Python due to the fact that distributing it is much
> easier and a not insignificant number of them have in fact started to
> switch
> to using Go in situations where they are trying to distribute things to
> disparate boxes.
>
> This might be something that people could have done before with C/C++ but
> with
> a nicer language behind it... but that's kind of the point? You don't need
> to
> be stuck with a terrible language to get a nice single file executable
> anymore,
> you can get that and use a good language at the same time which makes it a
> lot
> more compelling to a lot more people than having to be stuck with C.
>
> ---
> Donald Stufft
> PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/p.andrefreitas%40gmail.com
>

From pmiscml at gmail.com  Fri May 29 16:23:16 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Fri, 29 May 2015 17:23:16 +0300
Subject: [Python-Dev] Single-file Python executables (including case of
 self-sufficient package manager)
In-Reply-To: <etPan.55685d20.63d83f03.18516@Draupnir.home>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
 <etPan.55685d20.63d83f03.18516@Draupnir.home>
Message-ID: <20150529172316.516fce6c@x230>

Hello,

On Fri, 29 May 2015 08:35:44 -0400
Donald Stufft <donald at stufft.io> wrote:

[]
 
> Another example is one that I personally worked on recently, where
> the company I worked for wanted to distribute a CLI to our customers
> which would "just work" that they could use to interact with the
[]
> particular piece I was feeling like I should be suggesting to my
> manager that we throw away the Python code and just write it in Go.

Please consider thinking about MicroPython next time for this use case,
as that's exactly why there are people who think that MicroPython is
interesting for "desktop" systems, not just bare-metal
microcontrollers. There are too few such people so far, unfortunately,
so progress is slow.

> An example of a product that does this is Chef, they install their
> own Ruby and everything but libc into /opt/chef to completely isolate
> themselves from the host system. I'm told this made things *much*
> easier for them as they don't really have to worry at all about
> what's available on the host system, Chef pretty much just works.

[]
> As folks may or may not know, I'm heavily involved in pip which is
> probably one of the most widely used CLIs written in Python. A single
> file executable won't help pip, however through my experience there I

It's interesting you bring up this case of pip (and Chef), as I had
similar concerns/issues when developing a self-hosted package manager
for MicroPython. MicroPython doesn't come out of the box with a standard
library - beyond a few builtin modules (the "core" library), every other
module/package needs to be installed individually (the micropython-*
modules on PyPI). That makes a package manager a critical component, and
means that the package manager itself cannot rely on the standard library -
on its presence, its absence, or a specific version (or on the contents
of standard modules).

My initial idea was to write a single-file script, but we're not ready
for self-sufficiency yet anyway (no SSL support, so we have to rely on wget),
and it's a bit of a chore anyway. So instead, I made a semi-automated
"library subset package" (e.g. os renamed to upip_os), and all such
modules come packaged together with the main script:
https://github.com/micropython/micropython-lib/tree/master/upip ,
https://pypi.python.org/pypi/micropython-upip . Then we have a script
to bootstrap upip:
https://github.com/micropython/micropython/blob/master/tools/bootstrap_upip.sh ,
after which any package can be installed using upip proper.



-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From tjreedy at udel.edu  Fri May 29 16:42:43 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 29 May 2015 10:42:43 -0400
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_FpAf-9vZJGScRq_TX5o07GaCd2m4d1J=D0WB7xm9qzQ@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <etPan.55674859.543b4eb3.12a4d@Draupnir.home>
 <CALGmxEJFdyv-VF7Dfk_SAOFt+pgu5kC4YQM-Vu61jSW0gordfA@mail.gmail.com>
 <CAP1=2W7BDX6jqszQrKE4X4QPRrqF3dcMgNXqOH46zkivLnRf8A@mail.gmail.com>
 <CACac1F_FpAf-9vZJGScRq_TX5o07GaCd2m4d1J=D0WB7xm9qzQ@mail.gmail.com>
Message-ID: <mk9ttq$shr$1@ger.gmane.org>

On 5/28/2015 4:29 PM, Paul Moore wrote:
> On 28 May 2015 at 20:47, Brett Cannon <brett at python.org> wrote:
>> I think it's to have a single tool to do it for any platform, not to have
>> the technical nuts and bolts be the same necessarily. I think it's also to
>> figure out if there is anything the interpreter and/or stdlib can do to
>> facilitate this.
>
> Precisely. At the moment, the story seems to be "if you're on Windows,
> use py2exe, if you're on OSX use py2app, or on Unix, ..., or..."
>
> What would be a compelling story is "to build your app into a single
> file executable, do "python -m build <myapp>". The machinery behind
> the build can be as different as necessary - but being able to use the
> same command on every platform is the goal.

The Python-based Ren'Py visual novel development system has something
like this. When one is ready to publish, there is an easy option to
build single-file downloadable redistributables for any or all of
Windows, Linux, and Mac. I know it works as far as it goes because I
helped my wife use the system, not for a novel, but for a photo-based
tutorial. After testing the resulting files with the help of others on
Linux and Mac systems (we developed on Windows), she put the files up on
one of her university pages.

As far as I know, the build code should be Python, if anyone wants to
look at it.

-- 
Terry Jan Reedy
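
A cross-platform "python -m <tool>" build step along the lines Paul Moore
describes does exist in the stdlib for the simplest case: PEP 441's zipapp
module (new in Python 3.5) packs a package directory into a single runnable
.pyz archive. It does not bundle the interpreter, so it is only a partial
answer, but it sketches the one-command workflow:

```python
import os
import subprocess
import sys
import tempfile
import zipapp

# Build a tiny app in a temp directory, then pack it into one .pyz file.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "myapp")
    os.mkdir(src)
    with open(os.path.join(src, "__main__.py"), "w") as f:
        f.write('print("hello from a single file")\n')

    target = os.path.join(tmp, "myapp.pyz")
    zipapp.create_archive(src, target)  # same as: python -m zipapp myapp

    # The archive runs anywhere a compatible interpreter is present.
    result = subprocess.run([sys.executable, target],
                            capture_output=True, text=True)
    output = result.stdout.strip()
    print(output)
```

The equivalent CLI is `python -m zipapp myapp -o myapp.pyz`; for machines
with no Python at all, freezing tools like py2exe or py2app remain necessary.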


From status at bugs.python.org  Fri May 29 18:08:23 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 29 May 2015 18:08:23 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150529160823.3FC1156895@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2015-05-22 - 2015-05-29)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    4844 (+11)
  closed 31241 (+47)
  total  36085 (+58)

Open issues with patches: 2217 


Issues opened (38)
==================

#23970: Update distutils.msvccompiler for VC14
http://bugs.python.org/issue23970  reopened by benjamin.peterson

#23996: _PyGen_FetchStopIterationValue() crashes on unnormalised excep
http://bugs.python.org/issue23996  reopened by scoder

#24267: test_venv.EnsurePipTest.test_with_pip triggers version check o
http://bugs.python.org/issue24267  opened by vadmium

#24270: PEP 485 (math.isclose) implementation
http://bugs.python.org/issue24270  opened by ncoghlan

#24272: PEP 484 docs
http://bugs.python.org/issue24272  opened by gvanrossum

#24274: erroneous comments in dictobject.c
http://bugs.python.org/issue24274  opened by Jim.Jewett

#24277: Take the new email package features out of provisional status
http://bugs.python.org/issue24277  opened by r.david.murray

#24278: Docs on Parsing arguments should say something about mem mgmt 
http://bugs.python.org/issue24278  opened by blais

#24279: Update test_base64 to use test.support.script_helper
http://bugs.python.org/issue24279  opened by bobcatfish

#24280: Unable to install Python
http://bugs.python.org/issue24280  opened by Jeff77789

#24284: Inconsistency in startswith/endswith
http://bugs.python.org/issue24284  opened by serhiy.storchaka

#24287: Let ElementTree prolog include comments and processing instruc
http://bugs.python.org/issue24287  opened by rhettinger

#24290: c_uint32 bitfields break structures
http://bugs.python.org/issue24290  opened by Rony Batista

#24291: wsgiref.handlers.SimpleHandler truncates large output blobs
http://bugs.python.org/issue24291  opened by Jonathan Kamens

#24292: wsgiref.simple_server.WSGIRequestHandler doesn't log request t
http://bugs.python.org/issue24292  opened by Jonathan Kamens

#24294: DeprecationWarnings should be visible by default in the intera
http://bugs.python.org/issue24294  opened by njs

#24295: Backport of #17086 causes regression in setup.py
http://bugs.python.org/issue24295  opened by moritzs

#24296: Queue documentation note needed
http://bugs.python.org/issue24296  opened by Sandy Chapman

#24299: 2.7.10 test__locale.py change breaks on Solaris
http://bugs.python.org/issue24299  opened by jbeck

#24300: Code Refactoring  in function nis_mapname()
http://bugs.python.org/issue24300  opened by pankaj.s01

#24301: gzip module failing to decompress valid compressed file
http://bugs.python.org/issue24301  opened by Eric Gorr

#24302: Dead Code of Handler check in function faulthandler_fatal_erro
http://bugs.python.org/issue24302  opened by pankaj.s01

#24303: OSError 17 due to _multiprocessing/semaphore.c assuming a one-
http://bugs.python.org/issue24303  opened by Paul Hobbs

#24305: The new import system makes it impossible to correctly issue a
http://bugs.python.org/issue24305  opened by njs

#24306: Backport py.exe to 3.4
http://bugs.python.org/issue24306  opened by steve.dower

#24307: pip error on windows whose current user name contains non-asci
http://bugs.python.org/issue24307  opened by tanbro-liu

#24308: Test failure: test_with_pip (test.test_venv.EnsurePipTest in 3
http://bugs.python.org/issue24308  opened by koobs

#24309: string.Template should be using str.format and/or deprecated
http://bugs.python.org/issue24309  opened by vlth

#24310: Idle documentation -- what to do if you do not see an undersco
http://bugs.python.org/issue24310  opened by lac

#24313: json fails to serialise numpy.int64
http://bugs.python.org/issue24313  opened by thomas-arildsen

#24314: irrelevant cross-link in documentation of user-defined functio
http://bugs.python.org/issue24314  opened by july

#24317: Change installer Customize default to match quick settings
http://bugs.python.org/issue24317  opened by steve.dower

#24318: Better documentaiton of profile-opt (and release builds in gen
http://bugs.python.org/issue24318  opened by skip.montanaro

#24319: Crash during "make coverage-report"
http://bugs.python.org/issue24319  opened by skip.montanaro

#24320: Remove a now-unnecessary workaround from importlib._bootstrap.
http://bugs.python.org/issue24320  opened by eric.snow

#24322: Hundreds of linker warnings on Windows
http://bugs.python.org/issue24322  opened by BreamoreBoy

#24323: Typo in Mutable Sequence Types documentation.
http://bugs.python.org/issue24323  opened by eimista

#24324: Remove -Wunreachable-code flag
http://bugs.python.org/issue24324  opened by skip.montanaro



Most recent 15 issues with no replies (15)
==========================================

#24324: Remove -Wunreachable-code flag
http://bugs.python.org/issue24324

#24322: Hundreds of linker warnings on Windows
http://bugs.python.org/issue24322

#24319: Crash during "make coverage-report"
http://bugs.python.org/issue24319

#24317: Change installer Customize default to match quick settings
http://bugs.python.org/issue24317

#24307: pip error on windows whose current user name contains non-asci
http://bugs.python.org/issue24307

#24303: OSError 17 due to _multiprocessing/semaphore.c assuming a one-
http://bugs.python.org/issue24303

#24302: Dead Code of Handler check in function faulthandler_fatal_erro
http://bugs.python.org/issue24302

#24300: Code Refactoring  in function nis_mapname()
http://bugs.python.org/issue24300

#24292: wsgiref.simple_server.WSGIRequestHandler doesn't log request t
http://bugs.python.org/issue24292

#24280: Unable to install Python
http://bugs.python.org/issue24280

#24277: Take the new email package features out of provisional status
http://bugs.python.org/issue24277

#24274: erroneous comments in dictobject.c
http://bugs.python.org/issue24274

#24265: IDLE produces error message when run with both -s and -c.
http://bugs.python.org/issue24265

#24264: imageop Unsafe Arithmetic
http://bugs.python.org/issue24264

#24263: Why VALID_MODULE_NAME in unittest/loader.py is r'[_a-z]\w*\.py
http://bugs.python.org/issue24263



Most recent 15 issues waiting for review (15)
=============================================

#24318: Better documentaiton of profile-opt (and release builds in gen
http://bugs.python.org/issue24318

#24314: irrelevant cross-link in documentation of user-defined functio
http://bugs.python.org/issue24314

#24303: OSError 17 due to _multiprocessing/semaphore.c assuming a one-
http://bugs.python.org/issue24303

#24302: Dead Code of Handler check in function faulthandler_fatal_erro
http://bugs.python.org/issue24302

#24300: Code Refactoring  in function nis_mapname()
http://bugs.python.org/issue24300

#24299: 2.7.10 test__locale.py change breaks on Solaris
http://bugs.python.org/issue24299

#24295: Backport of #17086 causes regression in setup.py
http://bugs.python.org/issue24295

#24287: Let ElementTree prolog include comments and processing instruc
http://bugs.python.org/issue24287

#24284: Inconsistency in startswith/endswith
http://bugs.python.org/issue24284

#24279: Update test_base64 to use test.support.script_helper
http://bugs.python.org/issue24279

#24278: Docs on Parsing arguments should say something about mem mgmt 
http://bugs.python.org/issue24278

#24272: PEP 484 docs
http://bugs.python.org/issue24272

#24270: PEP 485 (math.isclose) implementation
http://bugs.python.org/issue24270

#24266: raw_input + readline: Ctrl+C during search breaks readline
http://bugs.python.org/issue24266

#24259: tar.extractall() does not recognize unexpected EOF
http://bugs.python.org/issue24259



Top 10 most discussed issues (10)
=================================

#16991: Add OrderedDict written in C
http://bugs.python.org/issue16991  28 msgs

#24244: Python exception on strftime with %f on Python 3 and Python 2 
http://bugs.python.org/issue24244  27 msgs

#24270: PEP 485 (math.isclose) implementation
http://bugs.python.org/issue24270  21 msgs

#24259: tar.extractall() does not recognize unexpected EOF
http://bugs.python.org/issue24259  14 msgs

#21998: asyncio: support fork
http://bugs.python.org/issue21998  13 msgs

#24294: DeprecationWarnings should be visible by default in the intera
http://bugs.python.org/issue24294   9 msgs

#14373: C implementation of functools.lru_cache
http://bugs.python.org/issue14373   8 msgs

#24260: TabError behavior doesn't match documentation
http://bugs.python.org/issue24260   8 msgs

#23970: Update distutils.msvccompiler for VC14
http://bugs.python.org/issue23970   7 msgs

#24254: Make class definition namespace ordered by default
http://bugs.python.org/issue24254   7 msgs



Issues closed (46)
==================

#11205: Evaluation order of dictionary display is different from refer
http://bugs.python.org/issue11205  closed by python-dev

#18032: Optimization for set/frozenset.issubset()
http://bugs.python.org/issue18032  closed by rhettinger

#18459: readline: libedit support on non-apple platforms
http://bugs.python.org/issue18459  closed by vadmium

#20035: Clean up Tcl library discovery in Tkinter on Windows
http://bugs.python.org/issue20035  closed by zach.ware

#21448: Email Parser use 100% CPU
http://bugs.python.org/issue21448  closed by rhettinger

#21961: Add What's New for Idle.
http://bugs.python.org/issue21961  closed by terry.reedy

#22189: collections.UserString missing some str methods
http://bugs.python.org/issue22189  closed by rhettinger

#22931: cookies with square brackets in value
http://bugs.python.org/issue22931  closed by python-dev

#22955: Pickling of methodcaller, attrgetter, and itemgetter
http://bugs.python.org/issue22955  closed by serhiy.storchaka

#23086: Add start and stop parameters to the Sequence.index() ABC mixi
http://bugs.python.org/issue23086  closed by rhettinger

#23270: Use the new __builtin_mul_overflow() of Clang and GCC 5 to che
http://bugs.python.org/issue23270  closed by haypo

#23359: Speed-up set_lookkey()
http://bugs.python.org/issue23359  closed by rhettinger

#23509: Speed up Counter operators
http://bugs.python.org/issue23509  closed by rhettinger

#23574: datetime: support leap seconds
http://bugs.python.org/issue23574  closed by haypo

#23648: PEP 475 meta issue
http://bugs.python.org/issue23648  closed by haypo

#23712: Experiment:  Assume that exact unicode hashes are perfect disc
http://bugs.python.org/issue23712  closed by rhettinger

#23754: Add a new os.read_into() function to avoid memory copies
http://bugs.python.org/issue23754  closed by haypo

#23840: tokenize.open() leaks an open binary file on TextIOWrapper err
http://bugs.python.org/issue23840  closed by haypo

#23955: Add python.ini file for embedded/applocal installs
http://bugs.python.org/issue23955  closed by steve.dower

#23993: Use surrogateescape error handler by default in open() if the 
http://bugs.python.org/issue23993  closed by haypo

#24199: Idle: remove idlelib.idlever.py and its use in About dialog
http://bugs.python.org/issue24199  closed by python-dev

#24204: string.strip() documentation is misleading
http://bugs.python.org/issue24204  closed by rhettinger

#24219: Repeated integer in Lexical analysis/Integer literals section
http://bugs.python.org/issue24219  closed by rhettinger

#24230: tempfile.mkdtemp() doesn't work with bytes paths
http://bugs.python.org/issue24230  closed by gregory.p.smith

#24268: PEP 489 -- Multi-phase extension module initialization
http://bugs.python.org/issue24268  closed by ncoghlan

#24269: Few improvements to the collections documentation
http://bugs.python.org/issue24269  closed by rhettinger

#24271: Python site randomly scrolls up when on mobile.
http://bugs.python.org/issue24271  closed by ned.deily

#24273: _scproxy.so causes EXC_BAD_ACCESS (SIGSEGV)
http://bugs.python.org/issue24273  closed by ned.deily

#24275: lookdict_* give up too soon
http://bugs.python.org/issue24275  closed by benjamin.peterson

#24276: Correct reuse argument tuple in property descriptor
http://bugs.python.org/issue24276  closed by serhiy.storchaka

#24281: String formatting: incorrect number of decimal places
http://bugs.python.org/issue24281  closed by r.david.murray

#24282: 3.5 gdbm extension build fails with "'clinic/_gdbmmodule.c.h' 
http://bugs.python.org/issue24282  closed by ned.deily

#24283: Print not safe in signal handlers
http://bugs.python.org/issue24283  closed by pitrou

#24285: regression for importing extensions in packages
http://bugs.python.org/issue24285  closed by berker.peksag

#24286: Should OrderedDict.viewitems compare equal to dict.viewitems w
http://bugs.python.org/issue24286  closed by rhettinger

#24288: Include/opcode.h is modified during building
http://bugs.python.org/issue24288  closed by serhiy.storchaka

#24289: can't start Python3 due to ImportError of copy_reg
http://bugs.python.org/issue24289  closed by r.david.murray

#24293: Windows installer unreadable with std/custom themes
http://bugs.python.org/issue24293  closed by steve.dower

#24297: Lib/symbol.py is out of sync with Grammar/Grammar
http://bugs.python.org/issue24297  closed by yselivanov

#24298: inspect.signature includes bound argument for wrappers around 
http://bugs.python.org/issue24298  closed by yselivanov

#24304: Documentation broken link to license
http://bugs.python.org/issue24304  closed by ned.deily

#24311: urllib2.urlopen() through proxy fails when HTTPS URL contains 
http://bugs.python.org/issue24311  closed by serhiy.storchaka

#24312: miniDOM._write_data() give a vague error message when the argu
http://bugs.python.org/issue24312  closed by serhiy.storchaka

#24315: collections.abc: Coroutine should be derived from Awaitable
http://bugs.python.org/issue24315  closed by yselivanov

#24316: Fix types.coroutine to accept objects from Cython
http://bugs.python.org/issue24316  closed by yselivanov

#24321: interaction of nonlocal and except leading to incorrect behavi
http://bugs.python.org/issue24321  closed by ned.deily

From v+python at g.nevcal.com  Fri May 29 22:49:43 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Fri, 29 May 2015 13:49:43 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
 <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
Message-ID: <5568D0E7.9030708@g.nevcal.com>

On 5/29/2015 3:33 AM, Paul Moore wrote:
> On 28 May 2015 at 22:09, Glenn Linderman <v+python at g.nevcal.com> wrote:
>> This would be something I could use and benefit from immediately upon it
>> being available, so I laud your idea, and hope you have a successful
>> implementation, and look forward to using it.  It would largely replace the
>> need for the py.exe launcher for some classes of applications.
> The following proof-of-concept works as is (based on my pretty minimal
> testing), and only uses the limited API, so it should work with any
> version of Python 3 (I've not tested it with Python 2, but I think the
> only "new" API is PySys_SetArgvEx, which could be replaced with
> PySys_SetArgv at a pinch). Excuse the dreadful coding style and lack
> of error handling, I hacked it up in about an hour :-)

I have no particular interest in Python 2, having started with Python 3, 
and only used Python 2 where dependent packages required it. I've now 
reached the nirvana of all my dependency packages being ported to Python 
3, although I have yet to port/test one remaining application to prove 
that. So I only mentioned Python 2 because it could still be useful for 
other people :)

> (Actually, I just tried building on Python 2 - guess what - Unicode
> :-) SetProgramName and SetArgvEx won't take Unicode values. The easy
> fix is just not to use Unicode, the hard one is to do the encoding
> dance, but I'm not going to bother...).

One approach would be to support Unicode arguments only for Python 3, 
but that would really be only paying lip service to Python 2 support.  
Another approach might be to not #define UNICODE for the Python 2 
version and use the 8-bit Windows APIs, letting Windows and the C 
runtime do the encoding dance for you, although I'm not sure what 
Python 2 requires in that respect.

> #define UNICODE
> #define _UNICODE
> #include <Python.h>
> #include <windows.h>
>
> int
> main()
> {
>      TCHAR program[MAX_PATH];
>      LPWSTR *argv;
>      int argc;
>      PyObject *runpy;
>      PyObject *ret;
>
>      argv = CommandLineToArgvW(GetCommandLineW(), &argc);
>      GetModuleFileName(NULL, program, MAX_PATH);
>      Py_SetProgramName(program);  /* optional but recommended */
>      Py_Initialize();
>      PySys_SetArgvEx(argc, argv, 0);
>      runpy = PyImport_ImportModule("runpy");
>      if (!runpy) PyErr_Print();
>      ret = PyObject_CallMethod(runpy, "run_path", "u", program);
>      if (!ret) PyErr_Print();
>      Py_Finalize();
>      return 0;
> }

That looks interesting; I wonder what compilation environment it would 
need.  I don't think I've even installed a C compiler on my last couple 
boxes, and the only version of a C compiler I have is, umm... M$VC++6.0, 
since I've moved to using Python for anything a 5 line batch file can't 
do...

> One mildly annoying thing is that python3.dll is only installed in
> <python install dir>\DLLs, which typically isn't on PATH.
Ah, linking.... so I guess if I figured out how to create this binary, 
it would contain a reference to python3.dll that would attempt to be 
resolved via the PATH, from what you say, and typically fail, since 
PATH seldom contains the directory holding python3.dll.  The python 
launcher gets around that by (1) being installed in %windir%, and (2) 
finding the appropriate Python (per its own configuration file and 
command line parameters) and setting up the path to that Python, which, 
when executed, knows its own directory structure and can thus find its 
own python3.dll.

The launcher, of course, adds an extra layer of process between the 
shell and the program, because it launches the "real" Python executable.

> So actually using the limited API from your own application fails by default.
> Fixing that's mostly a user admin issue, though (and you can just link
> to the full API and avoid the whole problem).

Do I understand correctly that the "user admin issue" means "add the 
appropriate <python install dir>\DLLs to the PATH"?

What I don't understand here is how linking to the full API avoids the 
problem... it must put more python library code into the stub 
executable? Enough to know how to search the registry to find the 
<python install dir> for the version of Python from which the full API 
was obtained? Or something else?

Are there other alternatives?  Assuming that the reference to the 
missing DLL is not required until the point at which a symbol from it is 
first referenced, so that the stub would have some ability to do 
something before that first call, maybe...

 1. The stub above could be enhanced to contain a "hard coded"
    directory that it adds to the PATH itself?
 2. The stub above could be enhanced to define that its first parameter
    is the <python install dir>, and tweak its PATH.
 3. These days, the Python installer does offer to optionally add itself
    to the PATH. Is that sufficient to make the stub work?
 4. The launcher could be used, assuming it is installed, but then you
    don't need a stub, and you get the extra process layer.
 5. stubpy.cmd could be created, a four line batch file below [1], which
    wouldn't require the launcher or its extra process layer, but would
    have to be placed on the PATH itself, or in the directory with the
    stub Python programs.

Only #3 could be construed as "easy" for the "dumb user"... particularly 
if the Python installer offers to add itself to the PATH on "repair" 
installs (I'm not sure if it does).  Editing the System PATH through the 
control panel is hard for the "dumb user", and isn't made easier by the 
squinchy text box M$ provides for the editing. Nor is it made less error 
prone by the whole thing being offered for editing at once, rather than 
the "GUI promoters" providing an editing interface that displays each 
item separately, with checkboxes to delete items or insert items at 
particular locations, and directory selection dialogs rather than typing 
the desired new path as text.  Hmm. Sounds like a good task for a stub 
Python program :) Except it doesn't bootstrap, unless it lives in 
<python install dir>.

[1] stubpy.cmd:
@setlocal
@PATH=<python install dir>;%PATH%
@shift
@%*
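In Python terms (purely illustrative; a real stub would do this in C before the DLL loader runs, and `DEFAULT_PYTHON_DIR` is a made-up placeholder), options #1 and #2 above both reduce to computing a PATH value with the install directory prepended:

```python
import os

# Hypothetical built-in install dir, standing in for option #1's
# "hard coded" directory; option #2 lets the first argument override it.
DEFAULT_PYTHON_DIR = r"C:\Python34"

def stub_path(argv, environ):
    """Return the PATH value a stub would export before loading
    python3.dll: the install dir (built-in, or taken from the first
    command-line argument) prepended to the existing PATH."""
    install_dir = argv[1] if len(argv) > 1 else DEFAULT_PYTHON_DIR
    return install_dir + os.pathsep + environ.get("PATH", "")
```

The stubpy.cmd batch file above does the same thing with its `PATH=<python install dir>;%PATH%` line.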

>
>> Of course, per other disccusions, this doesn't solve the problem for:
>>
>> A) machine without Python installed
>> B) programs that need binary extensions
>>
>> Other discussions have suggested:
>>
>> 3) The stub could offer to download and install Python
>>
>> A corollary:
>>
>> 4) The stub could offer to download and install the needed binary extensions
>> as well as Python. This would require the installation uniformity of
>> something like pip, so perhaps would be restricted to extensions available
>> via pip.  And it would be much enhanced by some technique where the zipapp
>> would contain metadata readable by the stub, that would declare the list of
>> binary extensions required.  Or, of course, it could even declare non-binary
>> extension that are not packaged with the zipapp, if the process is smooth,
>> the modules available via pip, etc., as a tradeoff.
> I'm pretty strongly against downloading interpreters or extensions.
> Apart from the pretty huge added complexity, as a user I'm not sure
> I'd trust a supposedly simple application I'd received if it started
> asking to download stuff unexpectedly...
>
> Paul

Yep. I mostly mentioned them for completeness.

I have no argument with this: installing Python can be a documentation 
thing.  "To use this program, you need to have Python installed, at 
least version N.M."  Maybe also: "and it should be installed on the 
System PATH, or the PATH must be set up by other means before running 
this program, or this program should be saved in the <python install 
dir>."

If a stub+zip program needs other extensions, that can be documented as 
a batch file that calls pip a sufficient number of times with the 
appropriate parameters, or the stub+zip program itself could be written 
to detect the needed but missing extensions and invoke pip to get them 
before using them.
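The detect-and-install idea can be sketched in a few lines of Python; `ensure_packages` and its `required` mapping (importable module name -> pip distribution name) are made up here for illustration, not an existing API:

```python
import importlib.util
import subprocess
import sys

def ensure_packages(required):
    """Install any missing distributions with pip before first use.

    `required` maps importable module names to pip distribution names,
    e.g. {"yaml": "PyYAML"}. Returns the list of distributions that
    had to be installed (empty if everything was already present).
    """
    missing = [dist for mod, dist in required.items()
               if importlib.util.find_spec(mod) is None]
    if missing:
        # Run pip in a subprocess under the same interpreter.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install"] + missing)
    return missing
```

The zipapp's metadata (however it ends up being declared) would supply the `required` mapping.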

From greg at krypto.org  Fri May 29 23:14:03 2015
From: greg at krypto.org (Gregory P. Smith)
Date: Fri, 29 May 2015 21:14:03 +0000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
Message-ID: <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>

On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com> wrote:

>
> On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com>
> wrote:
> >
> > Why not continue to enhance Python 3 instead of wasting our time with
> > Python 2? We have limited resources in term of developers to maintain
> > Python.
> >
> > (I'm not talking about fixing *bugs* in Python 2 which is fine with me.)
>
> I'm actually OK with volunteers deciding that even fixing bugs in 2.7
> isn't inherently rewarding enough for them to be willing to do it for free
> on their own time.
>

That is 100% okay.

What is not okay is for python-dev representatives to respond to users (in
any list/forum/channel) reporting bugs in 2.7, or asking if a fix in 3 can
be backported to 2.7, with things akin to "just use Python 3" or "sorry, 2.7
is critical fixes only. move to python 3 already." This is actively driving
our largest users away.  I bring this up because a user was recently
bemoaning how useless they feel python core devs are because of this
attitude, leading to feelings of wishing to abandon CPython, if not Python
altogether.

I'm sure I have even made some of those responses myself (sorry!). My point
here is: know it. recognize it. don't do it anymore. It harms the community.

A correct and accurate response to desires to make non-api-breaking changes
in 2.7 is "Patches that do not change any APIs for 2.7 are welcome in the
issue tracker," possibly including "I don't have the bandwidth to review
2.7 changes; find someone on python-dev to review and champion this for you
if you need it."  Finding someone may not always be easy, but that at least
keeps the "patches welcome" attitude and suggests that the work can be done
if someone is willing to do it. Let's make a concerted effort not to be
hostile and against it by default.

Ex: Is someone with a python application that is a million lines long
supposed to have everyone involved in it drop the productive work they
are doing and spend that time porting their existing application to python 3
because we have so far failed to provide the tools to make that migration
easy?  No.  Empathize with our community.  Feel their pain.  (and everyone
who is working on tools to aid the transition: keep doing that! Our users
are gonna need it unless we don't want them as users anymore.)

We committed to supporting 2.7 until 2020 in 2014 per
https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
important bug or performance fixes should at least be allowed on the table,
even if hairy, even if you won't work on them yourselves on a volunteer
basis. This is the first long term support release of Python ever. This is
what LTS means.  LTS could *also* stand for Learn To Support...

-gps

> Stepping up to extrinsically reward activities that are beneficial for
> customers but aren't intrinsically interesting enough for people to be
> willing to do for free is one of the key reasons commercial open source
> redistributors get paid.
>
> That more explicitly commercial presence is a dynamic we haven't
> historically had to deal with in core development, so there are going to be
> some growing pains as we find an arrangement that everyone is comfortable
> with (or is at least willing to tolerate, but I'm optimistic we can do
> better than that).
>
> Cheers,
> Nick.
>
> >
> > --
> >
> > By the way, I just wrote sixer, a new tool to generate patches to port
> > OpenStack to Python 3 :-)
> > https://pypi.python.org/pypi/sixer
> >
> > It's based on regex, so it's less reliable than 2to3, 2to6 or
> > modernize, but it's just enough for my specific use case. On
> > OpenStack, it's not possible to send one giant patch "hello, this is
> > python 3". Code is modified by small and incremental changes.
> >
> > Come on in the Python 3 world and... always look on the bright side of
> > life ( https://www.youtube.com/watch?v=VOAtCOsNuVM )!
> >
> > Victor
> > _______________________________________________
> > Python-Dev mailing list
> > Python-Dev at python.org
> > https://mail.python.org/mailman/listinfo/python-dev
>
> > Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com
>  _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/greg%40krypto.org
>

From p.f.moore at gmail.com  Fri May 29 23:45:47 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 29 May 2015 22:45:47 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <5568D0E7.9030708@g.nevcal.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
 <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
 <5568D0E7.9030708@g.nevcal.com>
Message-ID: <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>

On 29 May 2015 at 21:49, Glenn Linderman <v+python at g.nevcal.com> wrote:
>
> That looks interesting, I wonder what compilation environment it would need?
> I don't think I've even installed a C compiler on my last couple boxes, and
> the only version of a C compiler I have is, umm... M$VC++6.0, since I've
> moved to using Python for anything a 5 line batch file can't do...
>
>> One mildly annoying thing is that python3.dll is only installed in
>> <python install dir>\DLLs, which typically isn't on PATH.
>
> Ah, linking.... so I guess if I figured out how to create this binary, it
> would contain a reference to python3.dll that would attempt to be resolved
> via the PATH, from what you say, and typically fail, due to PATH seldom
> containing python3.dll.  The python launcher gets around that by (1) being
> installed in %windir%, and going and finding the appropriate Python (per its
> own configuration file, and command line parameters), and setting up the
> path to that Python, which, when executed, knows its own directory structure
> and can thus find its own python3.dll.
>
> The launcher, of course, adds an extra layer of process between the shell
> and the program, because it launches the "real" Python executable.
>
>> So actually using the limited API from your own application fails by
>> default.
>> Fixing that's mostly a user admin issue, though (and you can just link
>> to the full API and avoid the whole problem).
>
>
> Do I understand correctly that the "user admin issue" means "add the
> appropriate <python install dir>\DLLs to the PATH"?
>
> What I don't understand here is how linking to the full API avoids the
> problem... it must put more python library code into the stub executable?
> Enough to know how to search the registry to find the <python install dir>
> for the version of Python from which the full API was obtained? Or something
> else?

Sorry, I assumed more Windows/C knowledge than you have.

I'll work on this and produce proper binaries in due course, so you
can always wait for them. But you can build the stub with pretty much
anything, I suspect - I managed with MSVC 2010 and mingw. I'll add
some build docs and get it on github.

Using mingw

    gcc -Wall -O2 -o stub.exe stub.c -I <python home>\Include C:\Windows\system32\python34.dll
    strip -s stub.exe

Using MSVC

    cl /Festub.exe /O2 stub.c /I<python home>\Include <python home>\libs\python34.lib

Regarding the DLLs, yes the "user admin issue" is adding the right
directory to PATH. I used the phrase "admin issue" as it's the aspect
that's likely to be far harder than any of the technical issues :-)
The reason using the full API helps is that the full API references
python34.dll rather than python3.dll. And the Python installer puts
python34.dll on PATH automatically, as it's what the "python" command
uses. (For the people with more Windows knowledge, I know this is a
simplification, but it's close enough for now).

So there are two options with the code I posted.

1. Build an exe that uses a specific version of Python, but which will
"just work" in basically the same way that the "python" command works.
2. Build an exe that works with any version of Python, but requires
some setup from the user.

Either approach requires that the Python DLL is on PATH, but that's
far more likely with the version-specific one, just because of how the
installer does things.

With extra code, the stub could locate an appropriate Python DLL
dynamically, which would simplify usage at the cost of a bit of fiddly
code in the stub.
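A minimal sketch of the lookup half of that idea, in Python rather than the C the stub would actually use (the function name and candidate-list approach are assumptions; a real stub would also consult the registry's PythonCore InstallPath keys):

```python
import os

def find_python_dll(candidates, dll_name="python34.dll"):
    """Return the first directory in `candidates` containing dll_name,
    or None if no candidate has it. A stub would build `candidates`
    from likely locations (registry InstallPath entries, PATH
    components, the system directory) and then load the DLL it finds."""
    for directory in candidates:
        if os.path.isfile(os.path.join(directory, dll_name)):
            return directory
    return None
```

In the C stub, the directory returned by such a search would be handed to LoadLibrary rather than relying on the default PATH-based resolution.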

This might be a useful addition to the zipapp module for Python 3.6.

Paul

PS Current launchers (py.exe, the entry point launchers from
pip/setuptools, etc) tend to spawn the actual python program in a
subprocess. I believe there are *technically* some differences in the
runtime environment when you use an embedding approach like this, but
I don't know what they are, and they probably won't affect 99.9% of
users. Lack of support for binary extensions is likely to be way more
significant.

From graffatcolmingov at gmail.com  Fri May 29 23:52:08 2015
From: graffatcolmingov at gmail.com (Ian Cordasco)
Date: Fri, 29 May 2015 16:52:08 -0500
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
Message-ID: <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>

On Fri, May 29, 2015 at 4:14 PM, Gregory P. Smith <greg at krypto.org> wrote:
>
> On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>>
>> On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com>
>> wrote:
>> >
>> > Why not continue to enhance Python 3 instead of wasting our time with
>> > Python 2? We have limited resources in term of developers to maintain
>> > Python.
>> >
>> > (I'm not talking about fixing *bugs* in Python 2 which is fine with me.)
>>
>> I'm actually OK with volunteers deciding that even fixing bugs in 2.7
>> isn't inherently rewarding enough for them to be willing to do it for free
>> on their own time.
>
>
> That is 100% okay.
>
> What is not okay is for python-dev representatives to respond to users (in
> any list/forum/channel) reporting bugs in 2.7 or asking if a fix in 3 can be
> backported to 2.7 with things akin to "just use Python 3" or "sorry, 2.7 is
> critical fixes only. move to python 3 already." This is actively driving our
> largest users away.  I bring this up because a user was bemoaning how
> useless they feel python core devs are because of this attitude recently.
> Leading to feelings of wishing to just abandon CPython if not Python all
> together.
>
> I'm sure I have even made some of those responses myself (sorry!). My point
> here is: know it. recognize it. don't do it anymore. It harms the community.
>
> A correct and accurate response to desires to make non-api-breaking changes
> in 2.7 is "Patches that do not change any APIs for 2.7 are welcome in the
> issue tracker." possibly including "I don't have the bandwidth to review 2.7
> changes, find someone on python-dev to review and champion this for you if
> you need it."  Finding someone may not always be easy. But at least is still
> the "patches welcome" attitude and suggests that the work can be done if
> someone is willing to do it. Lets make a concerted effort to not be hostile
> and against it by default.
>
> Ex: Is someone with a python application that is a million of lines supposed
> to have everyone involved in that drop the productive work they are doing
> and spend that porting their existing application to python 3 because we
> have so far failed to provide the tools to make that migration easy?  No.
> Empathize with our community.  Feel their pain.  (and everyone who is
> working on tools to aid the transition: keep doing that! Our users are gonna
> need it unless we don't want them as users anymore.)
>
> We committed to supporting 2.7 until 2020 in 2014 per
> https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
> important bug or performance fixes should at least be allowed on the table,
> even if hairy, even if you won't work on them yourselves on a volunteer
> basis. This is the first long term support release of Python ever. This is
> what LTS means.  LTS could also stand for Learn To Support...

At the same time, they can ask for it, but if people aren't motivated
to do the work, it won't happen. We should be encouraging (and maybe
even mentoring) the people who desperately need these fixes backported
to backport the patches themselves. Once that's done, a patch can go
through review, and we can maybe get those fixes in faster if we can
also grow a larger group of reviewers.

The problem consists of a few parts:

- We're all volunteers
- Volunteers are going to work on what interests them
- Python 2.7 maintenance doesn't seem to interest many of our
volunteers currently

Perhaps we should explain this to each of the people requesting
backports to (ideally) encourage them.

From rymg19 at gmail.com  Fri May 29 23:57:28 2015
From: rymg19 at gmail.com (Ryan Gonzalez)
Date: Fri, 29 May 2015 16:57:28 -0500
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
 <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
 <5568D0E7.9030708@g.nevcal.com>
 <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>
Message-ID: <CAO41-mPf2AMewRTX_fDGF7ArTCaPpGtQgq-Gfbjy3-pL9FV5tQ@mail.gmail.com>

I did that once; it wasn't worth it. It was no smaller than what
PyInstaller would output and required manually adding in the required
modules that weren't in the stdlib, along with any extra DLLs (e.g. the Qt
DLLs).


On Fri, May 29, 2015 at 4:45 PM, Paul Moore <p.f.moore at gmail.com> wrote:

> On 29 May 2015 at 21:49, Glenn Linderman <v+python at g.nevcal.com> wrote:
> >
> > That looks interesting, I wonder what compilation environment it would
> need?
> > I don't think I've even installed a C compiler on my last couple boxes,
> and
> > the only version of a C compiler I have is, umm... M$VC++6.0, since I've
> > moved to using Python for anything a 5 line batch file can't do...
> >
> >> One mildly annoying thing is that python3.dll is only installed in
> >> <python install dir>\DLLs, which typically isn't on PATH.
> >
> > Ah, linking.... so I guess if I figured out how to create this binary, it
> > would contain a reference to python3.dll that would attempt to be
> resolved
> > via the PATH, from what you say, and typically fail, due to PATH seldom
> > containing python3.dll.  The python launcher gets around that by (1)
> being
> > installed in %windir%, and going and finding the appropriate Python (per
> its
> > own configuration file, and command line parameters), and setting up the
> > path to that Python, which, when executed, knows its own directory
> structure
> > and can thus find its own python3.dll.
> >
> > The launcher, of course, adds an extra layer of process between the shell
> > and the program, because it launches the "real" Python executable.
> >
> >> So actually using the limited API from your own application fails by
> >> default.
> >> Fixing that's mostly a user admin issue, though (and you can just link
> >> to the full API and avoid the whole problem).
> >
> >
> > Do I understand correctly that the "user admin issue" means "add the
> > appropriate <python install dir>\DLLs to the PATH"?
> >
> > What I don't understand here is how linking to the full API avoids the
> > problem... it must put more python library code into the stub executable?
> > Enough to know how to search the registry to find the <python install
> dir>
> > for the version of Python from which the full API was obtained? Or
> something
> > else?
>
> Sorry, I assumed more Windows/C knowledge than you have.
>
> I'll work on this and produce proper binaries in due course, so you
> can always wait for them. But you can build the stub with pretty much
> anything, I suspect - I managed with MSVC 2010 and mingw. I'll add
> some build docs and get it on github.
>
> Using mingw
>
>     gcc -Wall -O2 -o stub.exe stub.c -I <python home>\Include
> C:\Windows\system32\python34.dll
>     strip -s stub.exe
>
> Using MSVC
>
>     cl /Festub.exe /O2 stub.c /I<python home>\Include <python
> home>\libs\python34.lib
>
> Regarding the DLLs, yes the "user admin issue" is adding the right
> directory to PATH. I used the phrase "admin issue" as it's the aspect
> that's likely to be far harder than any of the technical issues :-)
> The reason using the full API helps is that the full API references
> python34.dll rather than python3.dll. And the Python installer puts
> python34.dll on PATH automatically, as it's what the "python" command
> uses. (For the people with more Windows knowledge, I know this is a
> simplification, but it's close enough for now).
>
> So there are two options with the code I posted.
>
> 1. Build an exe that uses a specific version of Python, but which will
> "just work" in basically the same way that the "python" command works.
> 2. Build an exe that works with any version of Python, but requires
> some setup from the user.
>
> Either approach requires that the Python DLL is on PATH, but that's
> far more likely with the version-specific one, just because of how the
> installer does things.
>
> With extra code, the stub could locate an appropriate Python DLL
> dynamically, which would simplify usage at the cost of a bit of fiddly
> code in the stub.
>
> This might be a useful addition to the zipapp module for Python 3.6.
>
> Paul
>
> PS Current launchers (py.exe, the entry point launchers from
> pip/setuptools, etc) tend to spawn the actual python program in a
> subprocess. I believe there are *technically* some differences in the
> runtime environment when you use an embedding approach like this, but
> I don't know what they are, and they probably won't affect 99.9% of
> users. Lack of support for binary extensions is likely to be way more
> significant.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com
>



-- 
Ryan
[ERROR]: Your autotools build scripts are 200 lines longer than your
program. Something's wrong.
http://kirbyfan64.github.io/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/fd3668e5/attachment.html>

From v+python at g.nevcal.com  Sat May 30 00:15:51 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Fri, 29 May 2015 15:15:51 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>	<etPan.556736c6.581308f4.12a4d@Draupnir.home>	<CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>	<CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>	<CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>	<CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>	<CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>	<55678404.40602@g.nevcal.com>	<CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>	<5568D0E7.9030708@g.nevcal.com>
 <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>
Message-ID: <5568E517.9080006@g.nevcal.com>

On 5/29/2015 2:45 PM, Paul Moore wrote:
> On 29 May 2015 at 21:49, Glenn Linderman <v+python at g.nevcal.com> wrote:
>> That looks interesting, I wonder what compilation environment it would need?
>> I don't think I've even installed a C compiler on my last couple boxes, and
>> the only version of a C compiler I have is, umm... M$VC++6.0, since I've
>> moved to using Python for anything a 5 line batch file can't do...
>>
>>> One mildly annoying thing is that python3.dll is only installed in
>>> <python install dir>\DLLs, which typically isn't on PATH.
>> Ah, linking.... so I guess if I figured out how to create this binary, it
>> would contain a reference to python3.dll that would attempt to be resolved
>> via the PATH, from what you say, and typically fail, due to PATH seldom
>> containing python3.dll.  The python launcher gets around that by (1) being
>> installed in %windir%, and going and finding the appropriate Python (per its
>> own configuration file, and command line parameters), and setting up the
>> path to that Python, which, when executed, knows its own directory structure
>> and can thus find its own python3.dll.
>>
>> The launcher, of course, adds an extra layer of process between the shell
>> and the program, because it launches the "real" Python executable.
>>
>>> So actually using the limited API from your own application fails by
>>> default.
>>> Fixing that's mostly a user admin issue, though (and you can just link
>>> to the full API and avoid the whole problem).
>>
>> Do I understand correctly that the "user admin issue" means "add the
>> appropriate <python install dir>\DLLs to the PATH"?
>>
>> What I don't understand here is how linking to the full API avoids the
>> problem... it must put more python library code into the stub executable?
>> Enough to know how to search the registry to find the <python install dir>
>> for the version of Python from which the full API was obtained? Or something
>> else?
> Sorry, I assumed more Windows/C knowledge than you have.

It is mostly the C/Python interface knowledge that I lack... although my 
Windows/C knowledge is getting rusty.

> I'll work on this and produce proper binaries in due course, so you
> can always wait for them. But you can build the stub with pretty much
> anything, I suspect - I managed with MSVC 2010 and mingw. I'll add
> some build docs and get it on github.
>
> Using mingw
>
>      gcc -Wall -O2 -o stub.exe stub.c -I <python home>\Include
> C:\Windows\system32\python34.dll
>      strip -s stub.exe
>
> Using MSVC
>
>      cl /Festub.exe /O2 stub.c /I<python home>\Include <python
> home>\libs\python34.lib

Github sounds good.  Binaries sound good.  I would have to download the 
free MSVC10 or MinGW and install and learn to use them, etc., to make 
progress... probably doable, but (1) I'm surviving at the moment with 
the launcher + zipapp, but it'd be nice for folks I code for to have 
.exe things, and (2) I'm backlogged in my other projects which don't 
need me to download a C compiler to make progress.

> Regarding the DLLs, yes the "user admin issue" is adding the right
> directory to PATH. I used the phrase "admin issue" as it's the aspect
> that's likely to be far harder than any of the technical issues :-)
> The reason using the full API helps is that the full API references
> python34.dll rather than python3.dll. And the Python installer puts
> python34.dll on PATH automatically, as it's what the "python" command
> uses. (For the people with more Windows knowledge, I know this is a
> simplification, but it's close enough for now).
>
> So there are two options with the code I posted.
>
> 1. Build an exe that uses a specific version of Python, but which will
> "just work" in basically the same way that the "python" command works.
> 2. Build an exe that works with any version of Python, but requires
> some setup from the user.
>
> Either approach requires that the Python DLL is on PATH, but that's
> far more likely with the version-specific one, just because of how the
> installer does things.

I don't presently see any C:\Python34\DLLs or C:\Python34 on my path, 
but I didn't ask the installer to put it there either. So I'm guessing 
your option 1 assumes asking the Python installer to put it there? Not 
"automatically" but "on request", I think?

In my c:\Python34\DLLs, I don't see a python34.dll, only python3.dll... 
so I'm somewhat unclear on your simplified explanation.

>
> With extra code, the stub could locate an appropriate Python DLL
> dynamically, which would simplify usage at the cost of a bit of fiddly
> code in the stub.
>
> This might be a useful addition to the zipapp module for Python 3.6.

Indeed.  If you or someone on GitHub has the time for that extra fiddly 
code, it could be a very useful addition to the zipapp module.
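For concreteness, the zipapp side that such a stub would complement can be
sketched with just the stdlib zipapp module (the application contents and
paths below are illustrative, not taken from Paul's stub code):

```python
import os
import subprocess
import sys
import tempfile
import zipapp

# Build a throwaway application directory with a __main__.py entry point.
src = tempfile.mkdtemp()
with open(os.path.join(src, "__main__.py"), "w") as f:
    f.write("print('hello from a zipapp')\n")

# Bundle it into a single-file .pyz archive; a native stub executable
# like the one discussed above would embed the interpreter and run this
# archive instead of spawning a separate python process.
target = os.path.join(tempfile.mkdtemp(), "app.pyz")
zipapp.create_archive(src, target)

# The archive runs with any Python interpreter.
result = subprocess.run([sys.executable, target],
                        capture_output=True, text=True)
print(result.stdout.strip())
```

Pairing the .pyz with a per-version stub.exe (built with the mingw or MSVC
command lines quoted above) is what would turn this into a single-file
executable on Windows.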

>
> Paul
>
> PS Current launchers (py.exe, the entry point launchers from
> pip/setuptools, etc) tend to spawn the actual python program in a
> subprocess. I believe there are *technically* some differences in the
> runtime environment when you use an embedding approach like this, but
> I don't know what they are, and they probably won't affect 99.9% of
> users. Lack of support for binary extensions is likely to be way more
> significant.

Lack of support for "zipped / bundled" binary extensions, I assume you 
mean. I have the perception this solution allows use of "normally 
installed" binary extensions.

I don't know the differences in the runtime either, would be good to 
know, if someone knows.

From p.f.moore at gmail.com  Sat May 30 00:28:15 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 29 May 2015 23:28:15 +0100
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <5568E517.9080006@g.nevcal.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
 <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
 <5568D0E7.9030708@g.nevcal.com>
 <CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>
 <5568E517.9080006@g.nevcal.com>
Message-ID: <CACac1F_mtBbqv61z54q1akS+8RK41JU0MVVUO9iZkc6RQwi8mw@mail.gmail.com>

On 29 May 2015 at 23:15, Glenn Linderman <v+python at g.nevcal.com> wrote:
> I don't presently see any C:\Python34\DLLs or C:\Python34 on my path, but I
> didn't ask the installer to put it there either. So I'm guessing your option
> 1 assumes asking the Python installer to put it there? Not "automatically"
> but "on request", I think?
>
> In my c:\Python34\DLLs, I don't see a python34.dll, only python3.dll... so
> I'm somewhat unclear on your simplified explanation.

I'm definitely batting zero today :-(

OK, let's try to be clear. I typically do "all users" installs. The
"for me only" install is slightly different, and the new install for
3.5 may be different again. But what I see is:

1. C:\Python34\DLLs contains python3.dll, which is *never* on PATH
(and doesn't need to be for normal use). Anything that wants to use
python3.dll needs that directory manually added to PATH.
2. python34.dll is in C:\Windows\System32. This is always available to
all processes, as it's in the Windows system directory.

If you say "add Python to my PATH" you get C:\Python34 added to PATH.
For a user install, I believe python34.dll may be in there rather than
in C:\Windows\system32, so technically, for an app that uses
python34.dll to work, you need *either* an admin install, *or* to have
done "add Python to PATH".

I hope that made sense. Sorry for my garbled previous version.

Paul

From ronaldoussoren at mac.com  Fri May 29 23:37:43 2015
From: ronaldoussoren at mac.com (Ronald Oussoren)
Date: Fri, 29 May 2015 23:37:43 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CALGmxELkjrFzwAyKC6gn9cp1RnvjqGSLCxp3DWL2xfADy-bf_g@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <BY1PR03MB146688630F810679972ABBBDF5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <mk7lka$uig$1@ger.gmane.org> <etPan.55675f97.4794acf7.12a4d@Draupnir.home>
 <1434330486454533457.360774sturla.molden-gmail.com@news.gmane.org>
 <CALGmxELkjrFzwAyKC6gn9cp1RnvjqGSLCxp3DWL2xfADy-bf_g@mail.gmail.com>
Message-ID: <68787220-220B-429E-82FE-20667731B653@mac.com>



Op 28 mei 2015 om 21:37 heeft Chris Barker <chris.barker at noaa.gov> het volgende geschreven:

> On Thu, May 28, 2015 at 12:25 PM, Sturla Molden <sturla.molden at gmail.com> wrote:
> 
>> The system
>> Python should be left alone as it is.
> 
> absolutely!
> 
> By the way, py2app will build an application bundle that depends on the system python, indeed, that's all it will do if you run it with the system python, as Apple has added some non-redistributable bits in there.

That's not quite the reason. It's more that I don't want to guess whether or not it is valid to bundle binaries from a system location.  Furthermore, bundling files from a base install of the OS is pretty useless, especially when those binaries won't run on earlier releases anyway due to the compilation options used.

Ronald

From ncoghlan at gmail.com  Sat May 30 00:44:41 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 08:44:41 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
Message-ID: <CADiSq7fNwzOe9v2vqNmQgyDauR9_g_4d0=1Y3sa0QSmp21Miag@mail.gmail.com>

On 30 May 2015 07:14, "Gregory P. Smith" <greg at krypto.org> wrote:
>
>
> On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com> wrote:
>>
>>
>> On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com>
wrote:
>> >
>> > Why not continue to enhance Python 3 instead of wasting our time with
>> > Python 2? We have limited resources in term of developers to maintain
>> > Python.
>> >
>> > (I'm not talking about fixing *bugs* in Python 2 which is fine with
me.)
>>
>> I'm actually OK with volunteers deciding that even fixing bugs in 2.7
isn't inherently rewarding enough for them to be willing to do it for free
on their own time.
>
>
> That is 100% okay.
>
> What is not okay is for python-dev representatives to respond to users
(in any list/forum/channel) reporting bugs in 2.7 or asking if a fix in 3
can be backported to 2.7 with things akin to "just use Python 3" or "sorry,
2.7 is critical fixes only. move to python 3 already." This is actively
driving our largest users away.  I bring this up because a user was
bemoaning how useless they feel python core devs are because of this
attitude recently. Leading to feelings of wishing to just abandon CPython
if not Python all together.
>
> I'm sure I have even made some of those responses myself (sorry!). My
point here is: know it. recognize it. don't do it anymore. It harms the
community.
>
> A correct and accurate response to desires to make non-api-breaking
changes in 2.7 is "Patches that do not change any APIs for 2.7 are welcome
in the issue tracker." possibly including "I don't have the bandwidth to
review 2.7 changes, find someone on python-dev to review and champion this
for you if you need it."  Finding someone may not always be easy. But at
least is still the "patches welcome" attitude and suggests that the work
can be done if someone is willing to do it. Lets make a concerted effort to
not be hostile and against it by default.

Better answer (and yes, I'm biased): "Have you asked your Python support
vendor to push for this change on your behalf?"

If well-funded entities expect open source software to just magically be
maintained without them paying someone to maintain it (whether that's their
own developers or a redistributor), then their long term risk management
processes are fundamentally broken and they need to reconsider their
approach.

> Ex: Is someone with a python application that is a million of lines
supposed to have everyone involved in that drop the productive work they
are doing and spend that porting their existing application to python 3
because we have so far failed to provide the tools to make that migration
easy?  No.  Empathize with our community.  Feel their pain.  (and everyone
who is working on tools to aid the transition: keep doing that! Our users
are gonna need it unless we don't want them as users anymore.)

Are they paying someone for Python support (or at least sponsoring the
Python Software Foundation)? If they're paying for support, are they
working with that vendor to figure out how they're going to manage the
transition to Python 3? If they're not paying for support, are they
actively participating in the community such that I know their developers
at a personal level and care about them as friends & colleagues?

If the answer to all three of those questions is "No", then no, I don't
have any sympathy for them. In the first case, my response is "Stop being a
freeloader on the generosity of the Python community and pay someone", in
the second case it's "Go make use of that commercial support you're paying
for (and stop breaking our core development funding signals by bypassing
it)", while in the third it's "What have you done for *me* lately that
should make me care about your inability to appropriately manage business
risk?".

> We committed to supporting 2.7 until 2020 in 2014 per
https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
important bug or performance fixes should at least be allowed on the table,
even if hairy, even if you won't work on them yourselves on a volunteer
basis. This is the first long term support release of Python ever. This is
what LTS means.  LTS could also stand for Learn To Support...

It also stands for commercial redistributors and the infrastructure teams
at large institutions actually doing what we're paid for, rather than
expecting other volunteers to pick up our slack.

The only thing we can legitimately ask volunteers to do is to not *object*
while we do this, and for them to redirect well-funded end users to paid
support options and other means of contributing back to the Python
community, rather than haranguing them to "just upgrade already".

Regards,
Nick.

>
> -gps
>>
>> Stepping up to extrinsically reward activities that are beneficial for
customers but aren't intrinsically interesting enough for people to be
willing to do for free is one of the key reasons commercial open source
redistributors get paid.
>>
>> That more explicitly commercial presence is a dynamic we haven't
historically had to deal with in core development, so there are going to be
some growing pains as we find an arrangement that everyone is comfortable
with (or is at least willing to tolerate, but I'm optimistic we can do
better than that).
>>
>> Cheers,
>> Nick.
>>
>> >
>> > --
>> >
>> > By the way, I just wrote sixer, a new tool to generate patches to port
>> > OpenStack to Python 3 :-)
>> > https://pypi.python.org/pypi/sixer
>> >
>> > It's based on regex, so it's less reliable than 2to3, 2to6 or
>> > modernize, but it's just enough for my specific use case. On
>> > OpenStack, it's not possible to send one giant patch "hello, this is
>> > python 3". Code is modified by small and incremental changes.
>> >
>> > Come on in the Python 3 world and... always look on the bright side of
>> > life ( https://www.youtube.com/watch?v=VOAtCOsNuVM )!
>> >
>> > Victor

From guido at python.org  Sat May 30 01:04:06 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 29 May 2015 16:04:06 -0700
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>
Message-ID: <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>

On Fri, May 29, 2015 at 2:52 PM, Ian Cordasco <graffatcolmingov at gmail.com>
wrote:

> On Fri, May 29, 2015 at 4:14 PM, Gregory P. Smith <greg at krypto.org> wrote:
> >
> > On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com>
> wrote:
> >>
> >>
> >> On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com>
> >> wrote:
> >> >
> >> > Why not continue to enhance Python 3 instead of wasting our time with
> >> > Python 2? We have limited resources in term of developers to maintain
> >> > Python.
> >> >
> >> > (I'm not talking about fixing *bugs* in Python 2 which is fine with
> me.)
> >>
> >> I'm actually OK with volunteers deciding that even fixing bugs in 2.7
> >> isn't inherently rewarding enough for them to be willing to do it for
> free
> >> on their own time.
> >
> >
> > That is 100% okay.
> >
> > What is not okay is for python-dev representatives to respond to users
> (in
> > any list/forum/channel) reporting bugs in 2.7 or asking if a fix in 3
> can be
> > backported to 2.7 with things akin to "just use Python 3" or "sorry, 2.7
> is
> > critical fixes only. move to python 3 already." This is actively driving
> our
> > largest users away.  I bring this up because a user was bemoaning how
> > useless they feel python core devs are because of this attitude recently.
> > Leading to feelings of wishing to just abandon CPython if not Python all
> > together.
> >
> > I'm sure I have even made some of those responses myself (sorry!). My
> point
> > here is: know it. recognize it. don't do it anymore. It harms the
> community.
> >
> > A correct and accurate response to desires to make non-api-breaking
> changes
> > in 2.7 is "Patches that do not change any APIs for 2.7 are welcome in the
> > issue tracker." possibly including "I don't have the bandwidth to review
> 2.7
> > changes, find someone on python-dev to review and champion this for you
> if
> > you need it."  Finding someone may not always be easy. But at least is
> still
> > the "patches welcome" attitude and suggests that the work can be done if
> > someone is willing to do it. Lets make a concerted effort to not be
> hostile
> > and against it by default.
> >
> > Ex: Is someone with a python application that is a million of lines
> supposed
> > to have everyone involved in that drop the productive work they are doing
> > and spend that porting their existing application to python 3 because we
> > have so far failed to provide the tools to make that migration easy?  No.
> > Empathize with our community.  Feel their pain.  (and everyone who is
> > working on tools to aid the transition: keep doing that! Our users are
> gonna
> > need it unless we don't want them as users anymore.)
> >
> > We committed to supporting 2.7 until 2020 in 2014 per
> > https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
> > important bug or performance fixes should at least be allowed on the
> table,
> > even if hairy, even if you won't work on them yourselves on a volunteer
> > basis. This is the first long term support release of Python ever. This
> is
> > what LTS means.  LTS could also stand for Learn To Support...
>
> At the same time, they can ask for it, but if people aren't motivated
> to do the work for it, it won't happen. We should be encouraging (and
> maybe even mentoring) these people who are desperately in need of the
> fixes to be backported, to backport the patches themselves. With that
> done, it can go through review and we can maybe get those fixes in
> faster if we can also get a larger group of reviews.
>
> The problem consists of a few parts:
>
> - We're all volunteers
>

Speak for yourself. There are a fair number of people on this thread whose
employer pays them to work on Python. And this thread originated when a
patch was being contributed by people who were also paid by their employer
to do all the dirty work (including benchmarks). And yet they were
(initially) given the cold shoulder by some "high and mighty" Python 3
zealots. This attitude needs to change.


> - Volunteers are going to work on what interests them
> - Python 2.7 maintenance doesn't seem to interest many of our
> volunteers currently
>
> Perhaps we should explain this to each of the people requesting
> backports to (ideally) encourage them.
>

Please let someone else do the explaining. I don't want to have to do the
damage control after you "explain" something.

-- 
--Guido van Rossum (python.org/~guido)

From Steve.Dower at microsoft.com  Sat May 30 00:33:17 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Fri, 29 May 2015 22:33:17 +0000
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>
 <CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>
 <CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>
 <CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>
 <55678404.40602@g.nevcal.com>
 <CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>
Message-ID: <BY1PR03MB146608F501CCCB858E99AD0BF5C90@BY1PR03MB1466.namprd03.prod.outlook.com>

Paul Moore wrote:
> One mildly annoying thing is that python3.dll is only installed in <python install dir>\DLLs, which
> typically isn't on PATH. So actually using the limited API from your own application fails by default.
> Fixing that's mostly a user admin issue, though (and you can just link to the full API and avoid the whole problem).

I didn't even notice that 3.4 (and earlier?) were doing that, so I changed/fixed it by accident :)

Python 3.5 installs python3.dll alongside python35.dll, so it'll go into the user's Python directory by default or into the system directory for an all-users installation. The embeddable distro includes python3.dll alongside the other DLLs as well.

Cheers,
Steve

From ncoghlan at gmail.com  Sat May 30 01:08:21 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 09:08:21 +1000
Subject: [Python-Dev] Keeping competitive with Go (was Re: Computed Goto
 dispatch for Python 2)
In-Reply-To: <etPan.55686087.7d9049f5.18516@Draupnir.home>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <20150528121341.74d087da@anarchist.wooz.org>
 <CADiSq7fj_1NdUht+6TBO6YJTKnnB-n57gHPo+7E1bXXimJNyvw@mail.gmail.com>
 <etPan.5567a952.f2533e8.18516@Draupnir.home>
 <CADiSq7e3e0jUV=MP=1xffL5TtUn_2d-HemkXHGrTW3cuCtK2FA@mail.gmail.com>
 <etPan.55686087.7d9049f5.18516@Draupnir.home>
Message-ID: <CADiSq7ckEzhg_Sq5v6-UTcvjyvruEsq252Hi-MjcQ43h0=-z=w@mail.gmail.com>

On 29 May 2015 22:50, "Donald Stufft" <donald at stufft.io> wrote:
>
> This might be something that people could have done before with C/C++ but
with
> a nicer language behind it... but that's kind of the point? You don't
need to
> be stuck with a terrible language to get a nice single file executable
anymore,
> you can get that and use a good language at the same time which makes it
a lot
> more compelling to a lot more people than having to be stuck with C.

Right, but the only things you can really write in Go are network services
and console applications - once you start looking at curses & GUI
applications on the end user side, you're back to the same kind of
distribution complexity as C/C++ (where you have to choose between external
dependency management or very large downloads), and once you start looking
at the infrastructure side, Docker, Rocket & Kubernetes are bringing this
kind of easy deployability to network services written in arbitrary
languages.

Hence my comment about MicroPython: the easiest way to make an interpreter
runtime that's lighter than CPython is to have it *do less*.

Communicating with embedded MicroPython instances via cffi could even
potentially offer a way for both CPython and PyPy to split work across
multiple cores without having to fundamentally redesign their main
interpreters.

Cheers,
Nick.

From barry at python.org  Sat May 30 01:18:23 2015
From: barry at python.org (Barry Warsaw)
Date: Fri, 29 May 2015 19:18:23 -0400
Subject: [Python-Dev] 2.7 is here until 2020,
 please don't call it a waste.
In-Reply-To: <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>
 <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>
Message-ID: <20150529191823.11385f5f@anarchist.wooz.org>

On May 29, 2015, at 04:04 PM, Guido van Rossum wrote:

>There are a fair number of people on this thread whose employer pays them to
>work on Python.

My guess is that as Python 2.7 gets longer in the tooth, and it becomes harder
to motivate volunteers to shepherd contributed patches into Python 2, folks
paid by employers who need Python 2 to remain maintained will step up and go
through the mentorship process so that they can apply such patches more
directly.  I.e. they can start to take over some of the active
maintenance of the Python 2.7 branch.

Cheers,
-Barry


From graffatcolmingov at gmail.com  Sat May 30 01:20:20 2015
From: graffatcolmingov at gmail.com (Ian Cordasco)
Date: Fri, 29 May 2015 18:20:20 -0500
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>
 <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>
Message-ID: <CAN-Kwu3_N2NQCMksh6CYgFfTJEcv1jvepG1G1Gj262kdxHpE=w@mail.gmail.com>

On Fri, May 29, 2015 at 6:04 PM, Guido van Rossum <guido at python.org> wrote:
> On Fri, May 29, 2015 at 2:52 PM, Ian Cordasco <graffatcolmingov at gmail.com>
> wrote:
>>
>> On Fri, May 29, 2015 at 4:14 PM, Gregory P. Smith <greg at krypto.org> wrote:
>> >
>> > On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com>
>> > wrote:
>> >>
>> >>
>> >> On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com>
>> >> wrote:
>> >> >
>> >> > Why not continue to enhance Python 3 instead of wasting our time with
>> >> > Python 2? We have limited resources in terms of developers to maintain
>> >> > Python.
>> >> >
>> >> > (I'm not talking about fixing *bugs* in Python 2 which is fine with
>> >> > me.)
>> >>
>> >> I'm actually OK with volunteers deciding that even fixing bugs in 2.7
>> >> isn't inherently rewarding enough for them to be willing to do it for
>> >> free
>> >> on their own time.
>> >
>> >
>> > That is 100% okay.
>> >
>> > What is not okay is for python-dev representatives to respond to users
>> > (in
>> > any list/forum/channel) reporting bugs in 2.7 or asking if a fix in 3
>> > can be
>> > backported to 2.7 with things akin to "just use Python 3" or "sorry, 2.7
>> > is
>> > critical fixes only. move to python 3 already." This is actively driving
>> > our
>> > largest users away.  I bring this up because a user was bemoaning how
>> > useless they feel python core devs are because of this attitude
>> > recently.
>> > Leading to feelings of wishing to just abandon CPython if not Python
>> > altogether.
>> >
>> > I'm sure I have even made some of those responses myself (sorry!). My
>> > point
>> > here is: know it. recognize it. don't do it anymore. It harms the
>> > community.
>> >
>> > A correct and accurate response to desires to make non-api-breaking
>> > changes
>> > in 2.7 is "Patches that do not change any APIs for 2.7 are welcome in
>> > the
>> > issue tracker." possibly including "I don't have the bandwidth to review
>> > 2.7
>> > changes, find someone on python-dev to review and champion this for you
>> > if
>> > you need it."  Finding someone may not always be easy. But at least is
>> > still
>> > the "patches welcome" attitude and suggests that the work can be done if
>> > someone is willing to do it. Lets make a concerted effort to not be
>> > hostile
>> > and against it by default.
>> >
>> > Ex: Is someone with a python application that is a million lines
>> > supposed
>> > to have everyone involved in that drop the productive work they are
>> > doing
>> > and spend that porting their existing application to python 3 because we
>> > have so far failed to provide the tools to make that migration easy?
>> > No.
>> > Empathize with our community.  Feel their pain.  (and everyone who is
>> > working on tools to aid the transition: keep doing that! Our users are
>> > gonna
>> > need it unless we don't want them as users anymore.)
>> >
>> > We committed to supporting 2.7 until 2020 in 2014 per
>> > https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
>> > important bug or performance fixes should at least be allowed on the
>> > table,
>> > even if hairy, even if you won't work on them yourselves on a volunteer
>> > basis. This is the first long term support release of Python ever. This
>> > is
>> > what LTS means.  LTS could also stand for Learn To Support...
>>
>> At the same time, they can ask for it, but if people aren't motivated
>> to do the work for it, it won't happen. We should be encouraging (and
>> maybe even mentoring) these people who are desperately in need of the
>> fixes to be backported, to backport the patches themselves. With that
>> done, it can go through review and we can maybe get those fixes in
>> faster if we can also get a larger group of reviewers.
>>
>> The problem consists of a few parts:
>>
>> - We're all volunteers
>
>
> Speak for yourself. There are a fair number of people on this thread whose
> employer pays them to work on Python. And this thread originated when a
> patch was being contributed by people who were also paid by their employer
> to do all the dirty work (including benchmarks). And yet they were
> (initially) given the cold shoulder by some "high and mighty" Python 3
> zealots. This attitude needs to change.
>
>>
>> - Volunteers are going to work on what interests them
>> - Python 2.7 maintenance doesn't seem to interest many of our
>> volunteers currently
>>
>> Perhaps we should explain this to each of the people requesting
>> backports to (ideally) encourage them.
>
>
> Please let someone else do the explaining. I don't want to have to do the
> damage control after you "explain" something.

Good to know. I'll stop trying to make spare time to review patches then.

From ncoghlan at gmail.com  Sat May 30 01:35:38 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 09:35:38 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150529191823.11385f5f@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <CAN-Kwu0ujJLfK0xBXABkeO4Fv47gtsFwmGB1MTVg8hwejBn_4g@mail.gmail.com>
 <CAP7+vJ+G7ftOoee2FaVhVvYz70G29UqQ3WO6y0FOCtbaE91SBg@mail.gmail.com>
 <20150529191823.11385f5f@anarchist.wooz.org>
Message-ID: <CADiSq7d+HG9hucv8a0xY9XG0OQsu1AHj9wPzCHAHmNK2FzuZGg@mail.gmail.com>

On 30 May 2015 09:21, "Barry Warsaw" <barry at python.org> wrote:
>
> On May 29, 2015, at 04:04 PM, Guido van Rossum wrote:
>
> >There are a fair number of people on this thread whose employer pays
them to
> >work on Python.
>
> My guess is that as Python 2.7 gets longer in the tooth, and it becomes
harder
> to motivate volunteers to shepherd contributed patches into Python 2,
folks
> getting paid by employers who need Python 2 to continue to be maintained,
will
> step up and go through the mentorship process so that they can more
directly
> apply such patches.  I.e. they can start to take over some of the active
> maintenance of the Python 2.7 branch.

Yep, I'm hoping to be able to do exactly that for Red Hat folks so we can
minimise our need to carry 2.7 patches downstream without imposing
additional work on volunteers upstream.

We have a few core committers working here now (me, Kushal, Christian,
Victor), but we're not the folks specifically working on Python 2.7
maintenance and support.

This means that while I don't believe "I'm getting paid to support Python
2.7" should be a free ride to commit access, I *do* think it's a factor we
ought to take into account.

Cheers,
Nick.

>
> Cheers,
> -Barry
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150530/b3eaa9a6/attachment.html>

From christian at python.org  Sat May 30 01:49:10 2015
From: christian at python.org (Christian Heimes)
Date: Sat, 30 May 2015 01:49:10 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
Message-ID: <5568FAF6.6040802@python.org>

On 2015-05-29 23:14, Gregory P. Smith wrote:
> 
> On Fri, May 29, 2015 at 12:24 AM Nick Coghlan <ncoghlan at gmail.com
> <mailto:ncoghlan at gmail.com>> wrote:
> 
> 
>     On 29 May 2015 11:01 am, "Victor Stinner" <victor.stinner at gmail.com
>     <mailto:victor.stinner at gmail.com>> wrote:
>     >
>     > Why not continue to enhance Python 3 instead of wasting our time with
>     > Python 2? We have limited resources in terms of developers to maintain
>     > Python.
>     >
>     > (I'm not talking about fixing *bugs* in Python 2 which is fine
>     with me.)
> 
>     I'm actually OK with volunteers deciding that even fixing bugs in
>     2.7 isn't inherently rewarding enough for them to be willing to do
>     it for free on their own time.
> 
>  
> That is 100% okay.
> 
> What is not okay is for python-dev representatives to respond to users
> (in any list/forum/channel) reporting bugs in 2.7 or asking if a fix in
> 3 can be backported to 2.7 with things akin to "just use Python 3" or
> "sorry, 2.7 is critical fixes only. move to python 3 already." This is
> actively driving our largest users away.  I bring this up because a user
> was bemoaning how useless they feel python core devs are because of this
> attitude recently. Leading to feelings of wishing to just abandon
> CPython if not Python altogether.
> 
> I'm sure I have even made some of those responses myself (sorry!). My
> point here is: know it. recognize it. don't do it anymore. It harms the
> community.
> 
> A correct and accurate response to desires to make non-api-breaking
> changes in 2.7 is "Patches that do not change any APIs for 2.7 are
> welcome in the issue tracker." possibly including "I don't have the
> bandwidth to review 2.7 changes, find someone on python-dev to review
> and champion this for you if you need it."  Finding someone may not
> always be easy. But at least it still has the "patches welcome" attitude and
> suggests that the work can be done if someone is willing to do it. Let's
> make a concerted effort to not be hostile and against it by default.
> 
> Ex: Is someone with a python application that is a million lines
> supposed to have everyone involved in that drop the productive work they
> are doing and spend that porting their existing application to python 3
> because we have so far failed to provide the tools to make that
> migration easy?  No.  Empathize with our community.  Feel their pain.
>  (and everyone who is working on tools to aid the transition: keep doing
> that! Our users are gonna need it unless we don't want them as users
> anymore.)
> 
> We committed to supporting 2.7 until 2020 in 2014 per
> https://hg.python.org/peps/rev/76d43e52d978.  That means backports of
> important bug or performance fixes should at least be allowed on the
> table, even if hairy, even if you won't work on them yourselves on a
> volunteer basis. This is the first long term support release of Python
> ever. This is what LTS means.  LTS could /also/ stand for Learn To
> Support...

Over the last years I have changed my mind a bit, too. For Python 2.7
LTS I welcome performance improving patches as well as security
improvements (SSL module) and build related fixes.

For performance patches we have to consider our responsibility for the
environment. Every improvement means more speed and less power
consumption. Python runs on hundreds of thousands of machines in the
cloud. Python 2.7 will be used for at least half a decade, probably
longer. Servers can be replaced with faster machines later and less
fossil fuel must be burned to produce power. Let's keep Python green! :)

Thanks to Benjamin, the patch has already landed.

Antoine's improved GIL may be another improvement for Python 2.7.
Servers are getting more cores every year. The new GIL helps to scale
multiple CPU bound threads on machines with more cores, e.g.
http://www.dabeaz.com/python/NewGIL.pdf
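
For context, this is the shape of workload where GIL behaviour matters:
several purely CPU-bound threads contending for the interpreter lock. A
rough sketch one could use to experiment (the elapsed time it returns is
what the old vs. new GIL affects on multicore machines):

```python
import threading
import time


def count(n):
    # purely CPU-bound loop; several of these threads contend for the GIL
    total = 0
    for i in range(n):
        total += i
    return total


def run_threads(workers, n):
    """Run `workers` CPU-bound threads and report elapsed wall time."""
    results = []
    threads = [threading.Thread(target=lambda: results.append(count(n)))
               for _ in range(workers)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start, results
```

With the pre-3.2 GIL, adding threads to such a workload could make it
*slower* on multiple cores; the new GIL mainly removes that pathological
contention rather than adding true parallelism.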

Christian


From solipsis at pitrou.net  Sat May 30 01:56:21 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 01:56:21 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org>
Message-ID: <20150530015621.518b537f@fsol>

On Sat, 30 May 2015 01:49:10 +0200
Christian Heimes <christian at python.org> wrote:
> For performance patches we have to consider our responsibility for the
> environment. Every improvement means more speed and less power
> consumption. Python runs on hundreds of thousands of machines in the
> cloud. Python 2.7 will be used for at least half a decade, probably
> longer. Servers can be replaced with faster machines later and less
> fossil fuel must be burned to produce power.

Please keep your ideology out of this.

Regards

Antoine.



From ncoghlan at gmail.com  Sat May 30 02:34:15 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 10:34:15 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530015621.518b537f@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
Message-ID: <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>

On 30 May 2015 09:57, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>
> On Sat, 30 May 2015 01:49:10 +0200
> Christian Heimes <christian at python.org> wrote:
> > For performance patches we have to consider our responsibility for the
> > environment. Every improvement means more speed and less power
> > consumption. Python runs on hundreds of thousands of machines in the
> > cloud. Python 2.7 will be used for at least half a decade, probably
> > longer. Servers can be replaced with faster machines later and less
> > fossil fuel must be burned to produce power.
>
> Please keep your ideology out of this.

I'm a qualified engineer (in computer systems engineering), so caring about
environmental sustainability is part of my professional ethical standards,
not just a matter of personal preference: http://www.wfeo.org/ethics/

Given the power draw of large data centres, the environmental impact of
performance improvements to the Python 2 series is a point well worth
considering.

Cheers,
Nick.

>
> Regards
>
> Antoine.
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150530/298e7ff1/attachment.html>

From tritium-list at sdamon.com  Sat May 30 02:40:16 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Fri, 29 May 2015 20:40:16 -0400
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
Message-ID: <556906F0.4090504@sdamon.com>

Python is a giant cache-miss generator.  A little performance boost on 
the opcode dispatch isn't going to change that much.  If we really do 
care about improving Python to do less environmental damage, then that 
is a discussion we should be having on its own merits.  It was really 
out of place, even in this tangential thread.

On 5/29/2015 20:34, Nick Coghlan wrote:
>
>
> On 30 May 2015 09:57, "Antoine Pitrou" <solipsis at pitrou.net 
> <mailto:solipsis at pitrou.net>> wrote:
> >
> > On Sat, 30 May 2015 01:49:10 +0200
> > Christian Heimes <christian at python.org 
> <mailto:christian at python.org>> wrote:
> > > For performance patches we have to consider our responsibility for the
> > > environment. Every improvement means more speed and less power
> > > consumption. Python runs on hundreds of thousands of machines in the
> > > cloud. Python 2.7 will be used for at least half a decade, probably
> > > longer. Servers can be replaced with faster machines later and less
> > > fossil fuel must be burned to produce power.
> >
> > Please keep your ideology out of this.
>
> I'm a qualified engineer (in computer systems engineering), so caring 
> about environmental sustainability is part of my professional ethical 
> standards, not just a matter of personal preference: 
> http://www.wfeo.org/ethics/
>
> Given the power draw of large data centres, the environmental impact 
> of performance improvements to the Python 2 series is a point well 
> worth considering.
>
> Cheers,
> Nick.
>
> >
> > Regards
> >
> > Antoine.
> >
> >

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/6d5f1e53/attachment.html>

From v+python at g.nevcal.com  Sat May 30 03:03:08 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Fri, 29 May 2015 18:03:08 -0700
Subject: [Python-Dev] Single-file Python executables (was: Computed Goto
 dispatch for Python 2)
In-Reply-To: <CACac1F_mtBbqv61z54q1akS+8RK41JU0MVVUO9iZkc6RQwi8mw@mail.gmail.com>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>	<etPan.556736c6.581308f4.12a4d@Draupnir.home>	<CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>	<CAD+XWwpJ_ODdy+-WDM2GwV0UgOCdSRGzEccb=ra9Y4Q6WivLpQ@mail.gmail.com>	<CACac1F-O29Yn6cP=YbFfWXeow0BVRy_W9gzqtfp8vhVBqH4RBw@mail.gmail.com>	<CAPTjJmoEJyV=XPdDGRtrnkEupCShj5QtwdkXQVkZAgxfMoN-0Q@mail.gmail.com>	<CACac1F9ZyY_1xSk9daK4-Hzqa4pBWa8_N_zTW3aMckVJLT9Rew@mail.gmail.com>	<55678404.40602@g.nevcal.com>	<CACac1F_B08XdfMBrV1OU_oHzYaXASGxATMhS5U7xe6PJf_KeEA@mail.gmail.com>	<5568D0E7.9030708@g.nevcal.com>	<CACac1F_1kq1qqGE3uQAH+H3KSgnn_pWqZNppdqOq0URzzhdwtA@mail.gmail.com>	<5568E517.9080006@g.nevcal.com>
 <CACac1F_mtBbqv61z54q1akS+8RK41JU0MVVUO9iZkc6RQwi8mw@mail.gmail.com>
Message-ID: <55690C4C.8060604@g.nevcal.com>

On 5/29/2015 3:28 PM, Paul Moore wrote:
> On 29 May 2015 at 23:15, Glenn Linderman <v+python at g.nevcal.com> wrote:
>> I don't presently see any C:\Python34\DLLs or C:\Python34 on my path, but I
>> didn't ask the installer to put it there either. So I'm guessing your option
>> 1 assumes asking the Python installer to put it there? Not "automatically"
>> but "on request", I think?
>>
>> In my c:\Python34\DLLs, I don't see a python34.dll, only python3.dll... so
>> I'm somewhat unclear on your simplified explanation.
>
> I'm definitely batting zero today :-(
>
> OK, let's try to be clear. I typically do "all users" installs. The
> "for me only" install is slightly different, and the new install for
> 3.5 may be different again. But what I see is:
>
> 1. C:\Python34\DLLs contains python3.dll, which is *never* on PATH
> (and doesn't need to be for normal use). Anything that wants to use
> python3.dll needs that directory manually adding to PATH.
> 2. python34.dll is in C:\Windows\System32. This is always available to
> all processes, as it's in the Windows system directory.
>
> If you say "add Python to my PATH" you get C:\Python34 added to PATH.
> For a user install, I believe python34.dll may be in there rather than
> in C:\Windows\system32, so technically, for an app that uses
> python34.dll to work, you need *either* an admin install, *or* to have
> done "add Python to PATH".


Interesting.

In C:\, I have directories  Python27, Python32, Python33, Python34.  I 
can't be 100% sure how I answered the install questions; even Python34 
was installed a couple of months ago.

In C:\Windows\System32, I have python27.dll, python32.dll, python33.dll, 
pythoncom32.dll, pythoncom33.dll, pythoncomloader32.dll, and 
pythoncomloader33.dll.  But not python34.dll!  I finally found that in 
c:\Windows\SysWOW64, which I guess means that I "accidentally" installed 
a 32-bit Python 3.4.  Or maybe I had a reason at the time.  But does 
that add another dimension to the picture for the stub?
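
(Incidentally, a quick way to check whether a given installation is a
32-bit or 64-bit build, rather than hunting through System32 vs SysWOW64:)

```python
import struct
import sys

# pointer size in bits: 32 for a 32-bit build, 64 for a 64-bit build
bits = struct.calcsize("P") * 8
print(sys.version)
print("%d-bit build" % bits)
```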


> I hope that made sense. Sorry for my garbled previous version.

This matches reality better... but whether it makes sense or not is 
another question.


On 5/29/2015 3:33 PM, Steve Dower wrote:
 > Paul Moore wrote:
 >> One mildly annoying thing is that python3.dll is only installed in 
<python install dir>\DLLs, which
 >> typically isn't on PATH. So actually using the limited API from your 
own application fails by default.
 >> Fixing that's mostly a user admin issue, though (and you can just 
link to the full API and avoid the whole problem).
 >
 > I didn't even notice that 3.4 (and earlier?) were doing that, so I 
changed/fixed it by accident :)


Indeed, and earlier. It apparently started in 3.2 with the definition of 
the Stable ABI.


 > Python 3.5 installs python3.dll alongside python35.dll, so it'll go 
into the user's Python directory by default or into the system directory 
for an all-users installation. The embeddable distro includes 
python3.dll alongside the other DLLs as well.
 >
 > Cheers,
 > Steve


This makes more sense... but will it cause problems with something? It 
seems to me like it was a bug to put it in the <python install dir> 
rather than %windir%\System32 back in Python 3.2 days when it was 
invented. What good is it to have a stable ABI that is hidden away where 
it cannot (easily) be used, while the full, version-specific API is 
easier to get to?

Sadly, PEP 384 is silent on the location of python3.dll.

From chris.barker at noaa.gov  Sat May 30 05:43:58 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 29 May 2015 20:43:58 -0700
Subject: [Python-Dev] [Distutils] Single-file Python executables
 (including case of self-sufficient package manager)
In-Reply-To: <20150529172316.516fce6c@x230>
References: <BY1PR03MB14667D450CA9336F0F59CC64F5CA0@BY1PR03MB1466.namprd03.prod.outlook.com>
 <etPan.556736c6.581308f4.12a4d@Draupnir.home>
 <CALGmxE+E=XB3rFZbrYpW4NOBXyOf-BSLQ79o9iz=KWjVs7cy4g@mail.gmail.com>
 <CALGmxEJSz=w4mrU7jn23DLw3TfkWGj-ueVrYCAmfv3cDR2hWoQ@mail.gmail.com>
 <CAPTjJmoVKs59+ToinAkVWNhW4FROh4HTObvASa_Gini4EgWivg@mail.gmail.com>
 <etPan.55674e46.11fdcc9d.12a4d@Draupnir.home>
 <20150529083601.GK932@ando.pearwood.info>
 <etPan.55685d20.63d83f03.18516@Draupnir.home> <20150529172316.516fce6c@x230>
Message-ID: <CALGmxEJvfvkCQ3WOJDEiLQXnMyvyZ4-qDRKzUH0su1f3XePLnw@mail.gmail.com>

On Fri, May 29, 2015 at 7:23 AM, Paul Sokolovsky <pmiscml at gmail.com> wrote:


> > An example of a product that does this is Chef, they install their
> > own Ruby and everything but libc into /opt/chef to completely isolate
> > themselves from the host system.


This sounds a bit like what conda does -- install miniconda, set up a conda
environment with a yaml file, and away you go. Not small, but quite
self-contained, and it gives you exactly what you want.
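
(For concreteness, the yaml file in question is a conda environment.yml
along these lines -- the name and packages here are purely illustrative:)

```yaml
# illustrative environment.yml; names and versions are hypothetical
name: myapp
dependencies:
  - python=2.7
  - numpy
  - pip
  - pip:
      - some-pypi-only-package
```

The environment is then created with `conda env create -f environment.yml`.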

-CHB


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150529/b7f95933/attachment.html>

From storchaka at gmail.com  Sat May 30 09:09:08 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sat, 30 May 2015 10:09:08 +0300
Subject: [Python-Dev] RM for 3.6?
Message-ID: <mkbnml$fj$1@ger.gmane.org>

Isn't it time to assign a release manager for 3.6 and 3.7?


From ncoghlan at gmail.com  Sat May 30 10:55:20 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 18:55:20 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <556906F0.4090504@sdamon.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
Message-ID: <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>

On 30 May 2015 10:46, "Alexander Walters" <tritium-list at sdamon.com> wrote:
>
> Python is a giant cache-miss generator.  A little performance boost on the opcode dispatch isn't going to change that much.  If we really do care about improving Python to do less environmental damage, then that is a discussion we should be having on its own merits.  It was really out of place, even in this tangential thread.

I think the way core development gets funded is entirely on topic for
the main core development mailing list; we just historically haven't
discussed it openly, even though some of us have been advocating for
various improvements to arrangements behind the scenes. I personally
consider becoming more transparent about how we go about that process
to be a good thing.

Intel are looking to get involved in CPython core development
*specifically* to work on performance improvements, so it's important
to offer folks in the community good reasons for why we're OK with
seeing at least some of that work applied to Python 2, rather than
restricting their contributions to Python 3.

The key is that the reason for not backporting performance
enhancements *differs* from the reasons for not backporting new
features. Rolling out new features has a broad ripple effect on the
Python ecosystem as books, training material, etc, all need to be
updated, and projects need to decide how to communicate their version
dependencies appropriately if they decide to depend on one of the
backported features. We pushed that kind of Python 2 change out to
PyPI years ago, and aside from truly exceptional cases like the
network security enhancements in PEPs 466 & 476 and the decision to
bundle pip to make it easier to access PyPI, it isn't open for
reconsideration as a general principle.

Performance improvements, by contrast, historically haven't been
backported solely due to the stability and maintainability
implications for CPython itself - they don't have a change management
ripple effect the way new language and standard library level features
do. That lack of negative ripple effects that cause work for other
people is why the proposal to contribute paid development time makes
such a big difference to the acceptability of Python 2.7 performance
patches, as it should be a pure gain for current Python 2.7 users, and
the paid development contributions should address the maintainability
concerns on the core development side (particularly since Intel are
*paying* for their coaching in core contribution practices and
politics, rather than expecting to receive that coaching from
community volunteers for free).

Backporting the computed goto patch is an easy place for them to
start, since the change is already well tested in the Python 3 branch,
but we don't expect this to be the end of the line for CPython 2 (or
3) performance enhancements.

However, we also shouldn't downplay the significance of this as a
notable policy change for the Python 2.7 maintenance branch, which
means it is useful to offer folks as many reasons as we can to help
them come to terms with the idea that Python 2 performance still
matters, and that it is only the limitations on our development and
support capacity that prevented us from further improving it
previously.

The commercially pragmatic reason is that Python 2 is where the
largest current installed base is today, so applying some of the
increased development capacity arising from sponsored contributions to
Python 2.7 performance improvements is a good way to demonstrate to
Python 2 developers that we still care about them *as Python 2 users*,
rather than only being interested in them as potential future Python 3
users. This is the rationale that's likely to get our paid
contributors (both current and future) on board with the idea, but it
isn't necessarily going to be compelling to folks that are here as
volunteers.

The first "What's in it for the volunteers?" reason is the one I
raised: giving the nod to an increased corporate developer presence in
Python 2 maintenance should eventually let volunteers stop worrying
about even Python 2.7 bug fix changes with a clear conscience,
confident that as volunteer efforts drop away redistributors and other
folks with an institutional interest will pick up the slack with paid
development time. "Do the fun stuff for free, figure out a way to get
paid for the boring-but-necessary stuff (or leave those tasks to
someone else that's getting paid to handle them)" is a good
sustainable approach to open source development, while trying to do it
*all* for free is a fast path to burnout.

Being ready, willing and able to handle the kind of situation created
by the Python 2->3 community transition is a large part of what it
means to offer commercial support for community driven open source
projects: it buys customers time, either for migration technologies
to mature to the point where the cost of migration drops dramatically,
or for the newer version of a platform to move far enough ahead of the
legacy version that there is a clear and compelling business case
for forward porting existing software, or (as is the case we're aiming
to engineer for Python) both.

The environmental argument is another one that may be appealing to
folks that have no commercial interest in improving Python 2
performance. Regardless of which programming language we use to write
our own software, we all still share the same planet, so reducing the
amount of power we collectively use is something we can all benefit
from. Even though none of us have the necessary data to even guess at
the absolute magnitude of that reduction, we can at least be confident
it's a non-trivial portion of the amount of power Python 2
applications are currently consuming.

Regards,
Nick.

From solipsis at pitrou.net  Sat May 30 12:35:10 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 12:35:10 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
Message-ID: <20150530123510.177f311c@fsol>

On Sat, 30 May 2015 10:34:15 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 30 May 2015 09:57, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
> >
> > On Sat, 30 May 2015 01:49:10 +0200
> > Christian Heimes <christian at python.org> wrote:
> > > For performance patches we have to consider our responsibility for the
> > > environment. Every improvement means more speed and less power
> > > consumption. Python runs on hundreds of thousands of machines in the
> > > cloud. Python 2.7 will be used for at least half a decade, probably
> > > longer. Servers can be replaced with faster machines later and less
> > > fossil fuel must be burned to produce power.
> >
> > Please keep your ideology out of this.
> 
> I'm a qualified engineer (in computer systems engineering), so caring about
> environmental sustainability is part of my professional ethical standards,
> not just a matter of personal preference: http://www.wfeo.org/ethics/

There is no reason to assume that a smallish performance improvement in
a single Python 2.7 release will make any difference in "environmental
sustainability" of the world's computing infrastructure, while the
problem is measured in orders of magnitude.  The onus is on you to
prove the contrary.  Otherwise, bringing it up is mere ideology.

Regards

Antoine.

From rosuav at gmail.com  Sat May 30 12:52:21 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Sat, 30 May 2015 20:52:21 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530123510.177f311c@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <20150530123510.177f311c@fsol>
Message-ID: <CAPTjJmpSgM_o_fRkAWxaJoibn_hrkOhYBTKa3D9t0QKNF1xs2Q@mail.gmail.com>

On Sat, May 30, 2015 at 8:35 PM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 30 May 2015 10:34:15 +1000
> Nick Coghlan <ncoghlan at gmail.com> wrote:
>> On 30 May 2015 09:57, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>> >
>> > On Sat, 30 May 2015 01:49:10 +0200
>> > Christian Heimes <christian at python.org> wrote:
>> > > For performance patches we have to consider our responsibility for the
>> > > environment. Every improvement means more speed and less power
>> > > consumption. Python runs on hundreds of thousands of machines in the
>> > > cloud. Python 2.7 will be used for at least half a decade, probably
>> > > longer. Servers can be replaced with faster machines later and less
>> > > fossil fuel must be burned to produce power.
>> >
>> > Please keep your ideology out of this.
>>
>> I'm a qualified engineer (in computer systems engineering), so caring about
>> environmental sustainability is part of my professional ethical standards,
>> not just a matter of personal preference: http://www.wfeo.org/ethics/
>
> There is no reason to assume that a smallish performance improvement in
> a single Python 2.7 release will make any difference in "environmental
> sustainability" of the world's computing infrastructure, while the
> problem is measured in orders of magnitude.  The onus is on you to
> prove the contrary.  Otherwise, bringing it up is mere ideology.

The magnitude of the environmental benefit of Python performance
improvements is uncertain, but we do know which direction it goes.
If there's going to be a massive maintenance nightmare, or if the
change comes at a cost of functionality or debuggability, then sure,
the onus is on the person begging for performance improvements; but if
there's no such cost (or if the cost is being carried by the same
person/people who proposed the change), then surely it's worth
something?

Suppose someone came up with a magic patch that makes the CPython core
run 25% faster. No downsides, just 25% faster across the board. I
wouldn't pay money for it on the sole basis of expecting to make that
back in reduced electricity bills, but I certainly wouldn't be sorry
to watch the load averages drop. Why is this controversial?

ChrisA

From solipsis at pitrou.net  Sat May 30 12:58:11 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 12:58:11 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
Message-ID: <20150530125811.56b5ec5f@fsol>

On Sat, 30 May 2015 18:55:20 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> On 30 May 2015 10:46, "Alexander Walters" <tritium-list at sdamon.com> wrote:
> >
> > Python is a giant cache-miss generator.  A little performance boost on the opcode dispatch isn't going to change that much.  If we really do care about improving python to do less environmental damage, then that is a discussion we should be having on its own merits.  It was really out of place, even in this tangenty thread.
> 
> I think the way core development gets funded is entirely on topic for
> the main core development mailing list, we just historically haven't
> discussed it openly, even though some of us have been advocating for
> various improvements to arrangements behind the scenes. I personally
> consider becoming more transparent about how we go about that process
> to be a good thing.

The way this so-called discussion is taking place feels much less like
an actual discussion than an aggressive push for a change in maintenance
policy. Guido has already taunted Ian Cordasco out of contributing.

Regards

Antoine.



From ncoghlan at gmail.com  Sat May 30 12:58:56 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 20:58:56 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530123510.177f311c@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <20150530123510.177f311c@fsol>
Message-ID: <CADiSq7eaVD3er0arApJ=ZGHHNVUZtkb21Rkc1R4KONxWdjSReA@mail.gmail.com>

On 30 May 2015 at 20:35, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 30 May 2015 10:34:15 +1000
> Nick Coghlan <ncoghlan at gmail.com> wrote:
>> On 30 May 2015 09:57, "Antoine Pitrou" <solipsis at pitrou.net> wrote:
>> >
>> > On Sat, 30 May 2015 01:49:10 +0200
>> > Christian Heimes <christian at python.org> wrote:
>> > > For performance patches we have to consider our responsibility for the
>> > > environment. Every improvement means more speed and less power
>> > > consumption. Python runs on hundreds of thousands of machines in the
>> > > cloud. Python 2.7 will be used for at least half a decade, probably
>> > > longer. Servers can be replaced with faster machines later and less
>> > > fossil fuel must be burned to produce power.
>> >
>> > Please keep your ideology out of this.
>>
>> I'm a qualified engineer (in computer systems engineering), so caring about
>> environmental sustainability is part of my professional ethical standards,
>> not just a matter of personal preference: http://www.wfeo.org/ethics/
>
> There is no reason to assume that a smallish performance improvement in
> a single Python 2.7 release will make any difference in "environmental
> sustainability" of the world's computing infrastructure, while the
> problem is measured in orders of magnitude.  The onus is on you to
> prove the contrary.  Otherwise, bringing it up is mere ideology.

This isn't about this one change - it's about changing the Python 2.7
maintenance policy to allow ongoing performance improvements to Python
2.7, backed by additional commercial investment in Python 2.7
maintenance to mitigate the increased risks to stability and
maintainability.

As I say in my other email, though, not all of our volunteers are
going to care about the fact that there are a lot of institutional
downstream users of Python 2.7 that will appreciate this change in
policy (e.g. all of the open government data sites running on CKAN:
http://ckan.org/instances ), as well as the sponsored contributions
that make it feasible.

If the environmental benefits (however unquantifiable) help some folks
to see the value in the change in policy, then that's a good thing,
even if it's not the actual primary motivation for the change (the
latter honor belongs to the fact that folks at Intel are interested in
working on it, and they've backed that interest up both by joining the
PSF as a sponsor member, and by hiring David Murray's firm to help
coach them through the process).

As strings go, "we want to work on improving Python 2.7 performance,
not just Python 3 performance" isn't a bad one to have attached to a
credible offer of ongoing contributions to CPython development :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From solipsis at pitrou.net  Sat May 30 13:00:37 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 13:00:37 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <20150530123510.177f311c@fsol>
 <CAPTjJmpSgM_o_fRkAWxaJoibn_hrkOhYBTKa3D9t0QKNF1xs2Q@mail.gmail.com>
Message-ID: <20150530130037.72815f1e@fsol>

On Sat, 30 May 2015 20:52:21 +1000
Chris Angelico <rosuav at gmail.com> wrote:
> 
> Suppose someone came up with a magic patch that makes the CPython core
> run 25% faster. No downsides, just 25% faster across the board. I
> wouldn't pay money for it on the sole basis of expecting to make that
> back in reduced electricity bills, but I certainly wouldn't be sorry
> to watch the load averages drop. Why is this controversial?

That was not my point. What I'm opposing is the idea that
"environmental sustainability" (or what people's ideological conception
of it is) should become part of our criteria when making maintenance
decisions.

Obviously if a patch makes CPython faster without any downsides, there
is no need to argue about environmental sustainability to make the
patch desirable. The performance improvement itself is a sufficient
reason.

Regards

Antoine.



From rosuav at gmail.com  Sat May 30 13:16:28 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Sat, 30 May 2015 21:16:28 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530130037.72815f1e@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <20150530123510.177f311c@fsol>
 <CAPTjJmpSgM_o_fRkAWxaJoibn_hrkOhYBTKa3D9t0QKNF1xs2Q@mail.gmail.com>
 <20150530130037.72815f1e@fsol>
Message-ID: <CAPTjJmpPpcYSXYiSOZc5Xw4592N2jEtNtMv2pzPWpJCwe=yizQ@mail.gmail.com>

On Sat, May 30, 2015 at 9:00 PM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 30 May 2015 20:52:21 +1000
> Chris Angelico <rosuav at gmail.com> wrote:
>>
>> Suppose someone came up with a magic patch that makes the CPython core
>> run 25% faster. No downsides, just 25% faster across the board. I
>> wouldn't pay money for it on the sole basis of expecting to make that
>> back in reduced electricity bills, but I certainly wouldn't be sorry
>> to watch the load averages drop. Why is this controversial?
>
> That was not my point. What I'm opposing is the idea that
> "environmental sustainability" (or what people's ideological conception
> of it is) should become part of our criteria when making maintenance
> decisions.
>
> Obviously if a patch makes CPython faster without any downsides, there
> is no need to argue about environmental sustainability to make the
> patch desirable. The performance improvement itself is a sufficient
> reason.

Okay. But what objection do you have to reduced electricity usage? I'm
still not understanding how this is a problem. It might not be a
priority for everyone, but surely it's a nice bonus?

ChrisA

From ncoghlan at gmail.com  Sat May 30 13:20:56 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 21:20:56 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530125811.56b5ec5f@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530125811.56b5ec5f@fsol>
Message-ID: <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>

On 30 May 2015 at 20:58, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 30 May 2015 18:55:20 +1000
> Nick Coghlan <ncoghlan at gmail.com> wrote:
>> On 30 May 2015 10:46, "Alexander Walters" <tritium-list at sdamon.com> wrote:
>> >
>> > Python is a giant cache-miss generator.  A little performance boost on the opcode dispatch isn't going to change that much.  If we really do care about improving python to do less environmental damage, then that is a discussion we should be having on its own merits.  It was really out of place, even in this tangenty thread.
>>
>> I think the way core development gets funded is entirely on topic for
>> the main core development mailing list, we just historically haven't
>> discussed it openly, even though some of us have been advocating for
>> various improvements to arrangements behind the scenes. I personally
>> consider becoming more transparent about how we go about that process
>> to be a good thing.
>
> The way this so-called discussion is taking place feels much less like
> an actual discussion than an aggressive push for a change in maintenance
> policy. Guido has already taunted Ian Cordasco out of contributing.

Ian was unfortunately responding from incomplete information. While
"we're all volunteers here" was true for a very long time, with Guido
being the main exception since the PythonLabs days, a number of folks
(both existing core contributors and members of other organisations)
have been working hard to change that, since it's an unsustainable
state of affairs given the criticality of CPython as a piece of
Internet infrastructure.

Given the extensive complaints about the lack of corporate
contribution to upstream CPython maintenance, the hostile reaction to
a concrete proposal for such ongoing contributions has been both
incredibly surprising *and* disappointing, especially when it was
deliberately aimed at tasks that most volunteers find to be an
unrewarding chore rather than an entertaining use of their free time.

The offer came with one string attached: that the Python 2.7 branch be
opened up for performance improvements in addition to bug fixes. Since
maintainability was the main concern with not backporting performance
improvements in the first place, this seemed like a straight up win to
me (and presumably to other folks aware of the offer), so it never
even occurred to us that folks might not accept "because this proposal
is backed by a credible offer of ongoing contributions to CPython
maintenance and support" as a complete answer to the question of "Why
accept this proposal to backport performance enhancements, and not
previous proposals?".

Regards,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From solipsis at pitrou.net  Sat May 30 13:37:22 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 13:37:22 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530125811.56b5ec5f@fsol>
 <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>
Message-ID: <20150530133722.52a3ed11@fsol>

On Sat, 30 May 2015 21:20:56 +1000
Nick Coghlan <ncoghlan at gmail.com> wrote:
> Given the extensive complaints about the lack of corporate
> contribution to upstream CPython maintenance, the hostile reaction to
> a concrete proposal for such ongoing contributions has been both
> incredibly surprising *and* disappointing

IMHO, the reactions were no more hostile than they would have been to
individuals' contributions of the same kind. Any patch proposal is
bound to attract controversy; that's a normal aspect of the process,
and one that contributors should usually be willing to go through.

Also, when there are rules in place, most people want to see them
upheld, because that tends to promote fairness much more than when
exceptions are granted all over the place. So people's reactions have
really been understandable, if debatable.

(FTR, Intel contacted me in private about such contributions and I said
the backport of the computed gotos sounded ok to me -- since it has
turned out entirely harmless on the 3.x branches --; that doesn't mean I
like how this public discussion has turned out)

> The offer came with one string attached: that the Python 2.7 branch be
> opened up for performance improvements in addition to bug fixes. Since
> maintainability was the main concern with not backporting performance
> improvements in the first place, this seemed like a straight up win to
> me (and presumably to other folks aware of the offer), so it never
> even occurred to us
> that folks might not accept "because this proposal
> is backed by a credible offer of ongoing contributions to CPython
> maintenance and support" as a complete answer to the question of "Why
> accept this proposal to backport performance enhancements, and not
> previous proposals?".

You're turning contribution into a kind of contractual engagement.
That's not an obvious improvement, because it has some large impacts on the
power structure (for one, volunteers can't reasonably compete with
contractual engagements).

Regards

Antoine.

From stephen at xemacs.org  Sat May 30 13:51:54 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Sat, 30 May 2015 20:51:54 +0900
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530015621.518b537f@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
Message-ID: <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp>

Antoine Pitrou writes:
 > On Sat, 30 May 2015 01:49:10 +0200
 > Christian Heimes <christian at python.org> wrote:
 > > For performance patches we have to consider our responsibility for the
 > > environment. Every improvement means more speed and less power
 > > consumption. Python runs on hundreds of thousands of machines in the
 > > cloud. Python 2.7 will be used for at least half a decade, probably
 > > longer. Servers can be replaced with faster machines later and less
 > > fossil fuel must be burned to produce power.
 > 
 > Please keep your ideology out of this.

Bad idea, unless you have benchmarks and engineering studies proving
that that effect doesn't exist and never will.

In a community of volunteers, ideology is typically a great motivator.
If it weren't for ideology (specifically, RMS's), many of us wouldn't
be here, and quite likely nothing like the current Linux and BSD
ecosystems would be available yet, and maybe not at all.  Which points
to a better idea: Harness ideology to encourage contributions that
help us all.

Hey, Christian, maybe you know some sustainability advocates who'd
like to help fund that work?  Or do the programming?

From antoine at python.org  Sat May 30 14:03:40 2015
From: antoine at python.org (Antoine Pitrou)
Date: Sat, 30 May 2015 14:03:40 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>	<CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>	<7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>	<CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>	<CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>	<mk7asm$37i$1@ger.gmane.org>	<CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>	<CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>	<CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>	<CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>	<5568FAF6.6040802@python.org>	<20150530015621.518b537f@fsol>
 <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <5569A71C.80406@python.org>


Le 30/05/2015 13:51, Stephen J. Turnbull a ?crit :
> Antoine Pitrou writes:
>  > On Sat, 30 May 2015 01:49:10 +0200
>  > Christian Heimes <christian at python.org> wrote:
>  > > For performance patches we have to consider our responsibility for the
>  > > environment. Every improvement means more speed and less power
>  > > consumption. Python runs on hundreds of thousands of machines in the
>  > > cloud. Python 2.7 will be used for at least half a decade, probably
>  > > longer. Servers can be replaced with faster machines later and less
>  > > fossil fuel must be burned to produce power.
>  > 
>  > Please keep your ideology out of this.
> 
> Bad idea, unless you have benchmarks and engineering studies proving
> that that effect doesn't exist and never will.

No, it's up to the proponent to prove that the effect exists, with a
magnitude large enough to make any interesting difference. That's part
of the process when suggesting a change.

If it doesn't, or if it's entirely cosmetic, it may be an important
part of Christian's lifestyle (as are many individual practices,
including religious, militant, dietetic...), but it certainly shouldn't
be brought up here. We don't want everyone trying to inject their
beliefs into the maintenance process.

> In a community of volunteers, ideology is typically a great motivator.

If and only if everyone agrees on it. Otherwise, it is typically a
great divider. Even adherence to RMS' writings and actions would
probably not be unanimous here...

Regards

Antoine.

From christian at python.org  Sat May 30 14:57:58 2015
From: christian at python.org (Christian Heimes)
Date: Sat, 30 May 2015 14:57:58 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <5569A71C.80406@python.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>	<CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>	<7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>	<CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>	<CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>	<mk7asm$37i$1@ger.gmane.org>	<CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>	<CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>	<CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>	<CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>	<5568FAF6.6040802@python.org>	<20150530015621.518b537f@fsol>
 <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp> <5569A71C.80406@python.org>
Message-ID: <5569B3D6.4070501@python.org>

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

On 2015-05-30 14:03, Antoine Pitrou wrote:
> No, it's up to the proponent to prove that the effect exists, with
> a magnitude large enough to make any interesting difference. That's
> part of the process when suggesting a change.
> 
> If it doesn't, or if it's entirely cosmetical, it may be an
> important part of Christian's lifestyle (as are many individual
> practices, including religious, militant, dietetic...), but it
> certainly shouldn't brought up here. We don't want everyone trying
> to inject their beliefs in the maintenance process.

Antoine,

now you are putting it over the top. You make it sound like I'm some
crazy environmentalist or eco-warrior. Well, I'm not. I merely keep
the environment in mind. Yes, I have a modern, power saving washing
machine and LED lights at home. Mostly because they save money in the
long run (Germany's electricity prices are high).

That was also the point behind my comment. Increased performance
results in better efficiency, which leads to better utilization of
hardware and less power consumption. Companies are interested in
better efficiency because they have to pay less for hardware, power
and cooling. The obvious benefits for our environment are a side effect.

A smaller CO2 footprint is not my main concern. But I wanted to bring
it up anyway. For some it is a small additional motivator for
performance improvements. For others it could be a marketing
instrument. In Germany, ads are full of crazy 'green' slogans.

Christian
-----BEGIN PGP SIGNATURE-----

iQEcBAEBCgAGBQJVabPTAAoJEIZoUkkhLbaJTF0H+wb8ciikP762qc8u586H2AjV
2xV9AAamI1Z6RwlvKRM7YHVk48coYIKk9WQ6DZODNlVSIhnijexII1dai91gbQvy
jEVkLK2P6/C7I4gz7Fp0/SoCwkpGCev2CiSJUhIoE4oIw+Mm4BRASpf5hn4n+pRI
yqXixYf7h+QWHgN0FRU3GU8RxNYRe65zB/3YeDUhKLQdkf8Gq4NVX7rlTx1gvZrq
DbaGjKtkT8uec6hnvZcXwWVODYW10VHTonhlV3ff0sReXw94sXOeQwQ3n+7uwKAb
sqvy11k0r6JejNGFxJqfMyXH557LP5ucc2g9+J8M2Sw4SOs7L6E+caaX89FY754=
=soyL
-----END PGP SIGNATURE-----

From ncoghlan at gmail.com  Sat May 30 15:16:25 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 May 2015 23:16:25 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530133722.52a3ed11@fsol>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530125811.56b5ec5f@fsol>
 <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>
 <20150530133722.52a3ed11@fsol>
Message-ID: <CADiSq7cfJjd7rvmZiAT0Upn1Nm0Z9pjJJUUOuAOnrG8VnkZqSA@mail.gmail.com>

On 30 May 2015 at 21:37, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Sat, 30 May 2015 21:20:56 +1000
> Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Given the extensive complaints about the lack of corporate
>> contribution to upstream CPython maintenance, the hostile reaction to
>> a concrete proposal for such ongoing contributions has been both
>> incredibly surprising *and* disappointing
>
> IMHO, they were not more hostile than against individuals'
> contributions of the same kind. Any patch proposal is bound to
> controversy, that's a normal aspect of the process, and one that
> contributors should usually be willing to go through.
>
> Also, when there are rules in place, most people want to see them
> upheld, because that tends to promote fairness much more than when
> exceptions are granted all over the place. So people's reactions have
> really been understandable, if debatable.

Agreed, but it's also understandable when folks forget that things
that they're taking for granted aren't necessarily common knowledge.

In this case:
* the fact that this proposal was a suggested starting point for
ongoing contributions, not a one-and-done effort
* the fact that the rationale for the prohibition on performance
enhancements was *different* from the reason for disallowing new
features (and hence requiring a PEP for *any* new Python 2.7 feature)

For folks that already knew both those facts, this *wasn't* a
controversial suggestion. We unfortunately failed to account for the
fact that not everyone was aware of that context, and that *is* a
highly regrettable mistake.

Hence my information dumps throughout the thread, attempting to provide
that context without committing folks to things they haven't committed
to, and without disclosing potentially confidential third party
information.

>> The offer came with one string attached: that the Python 2.7 branch be
>> opened up for performance improvements in addition to bug fixes. Since
>> maintainability was the main concern with not backporting performance
>> improvements in the first place, this seemed like a straight up win to
>> me (and presumably to other folks aware of the offer), so it never
>> even occurred to us
>> that folks might not accept "because this proposal
>> is backed by a credible offer of ongoing contributions to CPython
>> maintenance and support" as a complete answer to the question of "Why
>> accept this proposal to backport performance enhancements, and not
>> previous proposals?".
>
> You're making contribution some kind of contractual engagement. That's
> not an obvious improvement, because it has some large impacts on the
> power structure (for one, volunteers can't reasonably compete with
> contractual engagements).

We've long had a requirement that certain kinds of proposal come with
at least nominal support commitments from the folks proposing them
(e.g. adding modules to the standard library, supporting new
platforms). Institutions with a clear financial interest in a
particular problem area can certainly make such commitments more
credibly, so I agree with your concerns about the potential impact on
the power dynamics of core development.

That's one of the main benefits I see in attempting to guide sponsored
contributions towards Python 2.7, at least initially - that's in LTS
mode, so working on it is fairly uninteresting anyway, and it keeps
discussion of *new* features (and hence the overall direction of
language evolution) a community focused activity.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From tritium-list at sdamon.com  Sat May 30 13:49:43 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Sat, 30 May 2015 07:49:43 -0400
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>	<CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>	<7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>	<CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>	<CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>	<mk7asm$37i$1@ger.gmane.org>	<CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>	<CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>	<CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>	<CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>	<5568FAF6.6040802@python.org>	<20150530015621.518b537f@fsol>	<CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>	<556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
Message-ID: <5569A3D7.9010108@sdamon.com>

Who said anything about funding?  This is a thread about the patch Intel 
offered (and had committed).

And that's the point.  This is the thread about THAT patch.  Why are we 
hijacking this topic for an environmental debate?  If it is a legitimate 
topic (which it might be), discuss it in its own right. Otherwise it 
sounds like guilt-tripping and greenwashing.

This patch will do little to nothing statistically significant for the 
environment.  Bringing that up is ideology and politics.

On 5/30/2015 04:55, Nick Coghlan wrote:
> On 30 May 2015 10:46, "Alexander Walters" <tritium-list at sdamon.com> wrote:
>> Python is a giant cache-miss generator.  A little performance boost on the opcode dispatch isn't going to change that much.  If we really do care about improving Python to do less environmental damage, then that is a discussion we should be having on its own merits.  It was really out of place, even in this tangential thread.
> I think the way core development gets funded is entirely on topic for
> the main core development mailing list, we just historically haven't
> discussed it openly, even though some of us have been advocating for
> various improvements to arrangements behind the scenes. I personally
> consider becoming more transparent about how we go about that process
> to be a good thing.
>
> Intel are looking to get involved in CPython core development
> *specifically* to work on performance improvements, so it's important
> to offer folks in the community good reasons for why we're OK with
> seeing at least some of that work applied to Python 2, rather than
> restricting their contributions to Python 3.
>
> The key is that the reason for not backporting performance
> enhancements *differs* from the reasons for not backporting new
> features. Rolling out new features has a broad ripple effect on the
> Python ecosystem as books, training material, etc, all need to be
> updated, and projects need to decide how to communicate their version
> dependencies appropriately if they decide to depend on one of the
> backported features. We pushed that kind of Python 2 change out to
> PyPI years ago, and aside from truly exceptional cases like the
> network security enhancements in PEPs 466 & 476 and the decision to
> bundle pip to make it easier to access PyPI, it isn't open for
> reconsideration as a general principle.
>
> Performance improvements, by contrast, historically haven't been
> backported solely due to the stability and maintainability
> implications for CPython itself - they don't have a change management
> ripple effect the way new language and standard library level features
> do. That lack of negative ripple effects that cause work for other
> people is why the proposal to contribute paid development time makes
> such a big difference to the acceptability of Python 2.7 performance
> patches, as it should be a pure gain for current Python 2.7 users, and
> the paid development contributions should address the maintainability
> concerns on the core development side (particularly since Intel are
> *paying* for their coaching in core contribution practices and
> politics, rather than expecting to receive that coaching from
> community volunteers for free).
>
> Backporting the computed goto patch is an easy place for them to
> start, since the change is already well tested in the Python 3 branch,
> but we don't expect this to be the end of the line for CPython 2 (or
> 3) performance enhancements.
>
> However, we also shouldn't downplay the significance of this as a
> notable policy change for the Python 2.7 maintenance branch, which
> means it is useful to offer folks as many reasons as we can to help
> them come to terms with the idea that Python 2 performance still
> matters, and that it is only the limitations on our development and
> support capacity that prevented us from further improving it
> previously.
>
> The commercially pragmatic reason is because Python 2 is where the
> largest current installed base is today, so applying some of the
> increased development capacity arising from sponsored contributions to
> Python 2.7 performance improvements is a good way to demonstrate to
> Python 2 developers that we still care about them *as Python 2 users*,
> rather than only being interested in them as potential future Python 3
> users. This is the rationale that's likely to get our paid
> contributors (both current and future) on board with the idea, but it
> isn't necessarily going to be compelling to folks that are here as
> volunteers.
>
> The first "What's in it for the volunteers?" reason is the one I
> raised: giving the nod to an increased corporate developer presence in
> Python 2 maintenance should eventually let volunteers stop worrying
> about even Python 2.7 bug fix changes with a clear conscience,
> confident that as volunteer efforts drop away redistributors and other
> folks with an institutional interest will pick up the slack with paid
> development time. "Do the fun stuff for free, figure out a way to get
> paid for the boring-but-necessary stuff (or leave those tasks to
> someone else that's getting paid to handle them)" is a good
> sustainable approach to open source development, while trying to do it
> *all* for free is a fast path to burnout.
>
> Being ready, willing and able to handle the kind of situation created
> by the Python 2->3 community transition is a large part of what it
> means to offer commercial support for community driven open source
> projects, as it buys customers' time for either migration technologies
> to mature to a point where the cost of migration drops dramatically,
> for the newer version of a platform to move far enough ahead of the
> legacy version for there to be a clear and compelling business case
> for forward porting existing software, or (as is the case we're aiming
> to engineer for Python), both.
>
> The environmental argument is another one that may be appealing to
> folks that have no commercial interest in improving Python 2
> performance. Regardless of which programming language we use to write
> our own software, we all still share the same planet, so reducing the
> amount of power we collectively use is something we can all benefit
> from. Even though none of us have the necessary data to even guess at
> the absolute magnitude of that reduction, we can at least be confident
> it's a non-trivial portion of the amount of power Python 2
> applications are currently consuming.
>
> Regards,
> Nick.


From stephen at xemacs.org  Sat May 30 15:53:13 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Sat, 30 May 2015 22:53:13 +0900
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <5569A71C.80406@python.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp>
 <5569A71C.80406@python.org>
Message-ID: <87vbfatbna.fsf@uwakimon.sk.tsukuba.ac.jp>

Antoine Pitrou writes:

 > > In a community of volunteers, ideology is typically a great
 > > motivator.
 > 
 > If and only if everyone agrees on it.

That, my friend, is *your* ideology speaking.  Some people work on
open source to scratch technical itches -- the program doesn't do what
they want, they're able to improve it, the license allows them to
improve it, so they do, done.  Others use that same freedom to change
the software to improve the world in other ways.  We don't need to
agree on *why* we do the work we do.  We only need to agree on an
evaluation and arbitration process for determining *which* work gets
released as part of "original Python".  More on that below.
 
 > Otherwise, it is typically a great divider.

Only because some people make a point of insisting on implementing
theirs[1] -- and others insist on objecting to any mention of it.  I
think both extremes are divisive -- but nothing new there, extremes
usually are divisive.

Now, Christian did say "must" when he suggested considering the
environment, and that's obviously not right.  To the extent that folks
are volunteers and not bound by the professional ethics that Nick
professes, there's no *must* about it.  I don't think Christian really
meant to try to impose that on everybody in the project, though.  It
was more a wish on his part as I understand it, one he knows will at
best be fulfilled gradually and voluntarily as people come to be aware
of the issue and agree with him that some things need to be done to
address it.

But if people like Christian choose to work on patches because they
are "environmentally friendly", or vote +1 on them, even if that means
a clarification or even reinterpretation of maintenance policy, why
should we care whether they say what their motivation is?

On the other hand, if it *is* a change in maintenance policy to commit
the Intel patch, IMO you have right on your side to speak up about
that (as you do elsewhere).  (OTOH, it seems to me that most posters
in this thread so far agree that it's a mere clarification of
*policy*, although it's a clear reallocation of *effort* that probably
wouldn't come voluntarily from the core committers.)

You're also right to point out that the nature of the community will
change as people paid to work on commercially desirable tasks become
committers.  Definitely the natures of Linux and GUI framework
development changed (as indeed X11 did when it passed from a
commercial consortium to a more open organization) as commercial
interests started supplying more and more effort, as well as hiring
core developers.  Whether that prospective change is a good thing for
Python is a matter for debate, and (speaking only for myself, and this
may not be the appropriate channel anyway) I'm interested in hearing
your discussion on that matter.

 > Even adherence to RMS' writings and actions would probably not
 > be unanimous here...

I assure you there's absolutely no "probably" about it.  You evidently
missed the (intended though obscure) irony of *me* praising RMS's
ideology (see return address).


Footnotes: 
[1]  You could argue that "insisting on implementing" is implied by
"ideology", but then I expect that Christian would deny a desire to
*impose* his values on the project.


From solipsis at pitrou.net  Sat May 30 15:54:40 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 30 May 2015 15:54:40 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <87wpzqth9h.fsf@uwakimon.sk.tsukuba.ac.jp>
 <5569A71C.80406@python.org> <5569B3D6.4070501@python.org>
Message-ID: <20150530155440.31791de1@fsol>


Hi Christian,

> Antoine,
> 
> now you are putting it over the top. You make it sound like I'm some
> crazy environmentalist or eco-warrior. Well, I'm not.

I apologize for misrepresenting your position.
I still don't think discussing environmental matters is really
productive here, though :-)

Regards

Antoine.



From a.badger at gmail.com  Sat May 30 16:26:23 2015
From: a.badger at gmail.com (Toshio Kuratomi)
Date: Sat, 30 May 2015 07:26:23 -0700
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
Message-ID: <CABVPEKqW1qoHLSPppqtEELbLR=15cqUKMAk55CTVv+F1HqqUtw@mail.gmail.com>

On May 30, 2015 1:56 AM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
>
> Being ready, willing and able to handle the kind of situation created
> by the Python 2->3 community transition is a large part of what it
> means to offer commercial support for community driven open source
> projects, as it buys customers' time for either migration technologies
> to mature to a point where the cost of migration drops dramatically,
> for the newer version of a platform to move far enough ahead of the
> legacy version for there to be a clear and compelling business case
> for forward porting existing software, or (as is the case we're aiming
> to engineer for Python), both.
>
Earlier, you said that it had been a surprise that people were against this
change.  I'd just point out that the reason is bound up in what you say
here.  Porting performance features from Python 3 to Python 2 has the
disadvantage of cutting into a compelling business case for users to move
forward to Python 3,[1] so doing this has a cost to Python 3 adoption.
But the question is whether there is a benefit that outweighs that cost.
I think seeing more steady, reliable contributors to Python core is a very
large payment.  Sure, for now that payment is aimed at extending the legs
of the legacy version of Python, but at some point in the future Python 2's
legs will be well and truly exhausted.  When that happens, both the
developers who have gained the skill of contributing to CPython and the
companies who have invested money in training people to be CPython
contributors will have to decide whether to give up on all of that or
continue to utilize those skills and investments by bettering Python 3.
I'd hope that we can prove ourselves a welcoming enough community that
they'd choose to stay.

-Toshio

[1] In fact, performance differences are a rather safe way to build
compelling business cases for forward porting.  Safe because it is a
difference (unlike API and feature differences) that will not negatively
affect your ability to incrementally move your code to Python 3.

From barry at python.org  Sat May 30 17:42:19 2015
From: barry at python.org (Barry Warsaw)
Date: Sat, 30 May 2015 11:42:19 -0400
Subject: [Python-Dev] 2.7 is here until 2020,
 please don't call it a waste.
In-Reply-To: <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
Message-ID: <20150530114219.0b9beaa7@anarchist.wooz.org>

On May 30, 2015, at 06:55 PM, Nick Coghlan wrote:

>Intel are looking to get involved in CPython core development
>*specifically* to work on performance improvements, so it's important
>to offer folks in the community good reasons for why we're OK with
>seeing at least some of that work applied to Python 2, rather than
>restricting their contributions to Python 3.

I think that's fine, for all the reasons you, Toshio, and others mention.  For
better or worse, Python 2.7 *is* our LTS release so I think we can make life
easier for the folks stuck on it <wink>.

However, I want us to be very careful not to accept performance improvements
in Python 2.7 that haven't also been applied to Python 3, unless of course
they aren't relevant.  Python 3 also has a need for performance improvements,
perhaps more so for various reasons, so let's make sure we're pushing that
forward too.

In many cases where you have a long lived stable release and active
development releases, it's generally the policy that fixes show up in the dev
release first.  At least, this is the case with Ubuntu and SRUs, and it makes
a lot of sense.

Cheers,
-Barry

From gmludo at gmail.com  Sat May 30 19:32:41 2015
From: gmludo at gmail.com (Ludovic Gasc)
Date: Sat, 30 May 2015 19:32:41 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <20150530114219.0b9beaa7@anarchist.wooz.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530114219.0b9beaa7@anarchist.wooz.org>
Message-ID: <CAON-fpFr9daNj45nHY1kp5NaCHAvCEgdKCSz7SLMh4H-oeC46A@mail.gmail.com>

For now, I'm following the mailing lists from a distance: I don't read
most of the e-mails.
However, this thread seems to be "infected": I can sense the emotions
behind your words from here.

Why push so much emotion into a technical discussion?
What nerve has been hit in this discussion?

If you know me a little, you know I'm always interested in efficiency
improvements, especially around Python.

However, I see two parts of this discussion:

1. Python 3 must continue to be the first-class citizen for features,
bug fixing and performance improvements, as Barry explained.
Programming in Python isn't only about a language; it's also about a
spirit and a community, with strengths and weaknesses.

The main "issue" for the Python 3 adoption by the community is that Python
community is mainly composed by Late Majority and Laggards [1], contrary to
some fancy programming language like Ruby, Go, Rust, <insert your fancy
language here> where you have a majority of Early Adopters. For example,
the migration from Ruby 1.8 to 1.9 has taken time because they changed some
critical parts, but finally, now, almost nobody uses Ruby 1.8 on production.
FYI, Ruby 1.9 has been released only one year after Python 3.0, and Ruby
community has finished their migration a long time ago, where you continue
to support Python 2.7. Maybe the change was less important between Ruby 1.8
and 1.9 that between Python 2 and Python 3, however I personally think the
majority of Early Adopters in Ruby community has helped a lot for that.

Nevertheless, at least in my eyes, it's proof that, despite the fact that
from time to time somebody announces that Python is dying and that nobody
will use it in production for new projects, Python is clearly a mainstream
programming language; the Python 3 migration time is the best proof, and
you don't see that with the fancy languages.
But it also means that to accelerate Python 3 adoption, we need more
incentives: a clean migration path, having almost all important libraries
ported, and the fact that Python 3 is more newcomer-friendly [2] aren't
enough; new features and performance are a better incentive, at least to
me. Without asyncio, I would still be coding for Python 2.

2. From a strategic point of view, even if it reduces the adoption speed
of Python 3, it may be a good "move" to support this for Python 2, to
reduce the risk of a fork of Python: it's better for the Python community
to use Python 2 than not to use Python at all.
See the NodeJS community: even if the reasons seem to be more political
than technical, forking a language isn't just a myth.
If we push Python 2 users too hard to migrate to Python 3, they may
reject the language completely, and everybody will lose in this story.
Moreover, if we end up with a critical mass of Laggards on Python 2 who
have enough money/time to maintain a patch like that, and we reject it,
we will lose the dialogue and mutual enrichment: everybody is concerned
by performance improvements. Personally, only the final results matter;
I don't care about the personal motivations: economic, ecological, or
simply publishing a blog post about the fact that the Python community
has a bigger one than some others ;-)

And don't forget: almost nobody cares about our internal discussions and
our drama; people are only interested in the source code we produce,
even the Python developers who use CPython.
Even if we have different motivations, I'm sure that everybody on this
mailing-list, or at least in this thread, "believes" in Python: you
don't give up personal time during a week-end if Python isn't something
important to you, because during the time you spend writing
e-mails/source code, you aren't watching series or taking care of your
family.

[1] http://en.wikipedia.org/wiki/Early_adopter#History
[2] It's in French (Google Translate is your friend), however it's an
interesting point of view from a Python trainer who has switched to
Python 3:
http://sametmax.com/python-3-est-fait-pour-les-nouveaux-venus/ (The
website is down for now)
--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-05-30 17:42 GMT+02:00 Barry Warsaw <barry at python.org>:

> On May 30, 2015, at 06:55 PM, Nick Coghlan wrote:
>
> >Intel are looking to get involved in CPython core development
> >*specifically* to work on performance improvements, so it's important
> >to offer folks in the community good reasons for why we're OK with
> >seeing at least some of that work applied to Python 2, rather than
> >restricting their contributions to Python 3.
>
> I think that's fine, for all the reasons you, Toshio, and others mention.
> For
> better or worse, Python 2.7 *is* our LTS release so I think we can make
> life
> easier for the folks stuck on it <wink>.
>
> However, I want us to be very careful not to accept performance
> improvements
> in Python 2.7 that haven't also been applied to Python 3, unless of course
> they aren't relevant.  Python 3 also has a need for performance
> improvements,
> perhaps more so for various reasons, so let's make sure we're pushing that
> forward too.
>
> In many cases where you have a long lived stable release and active
> development releases, it's generally the policy that fixes show up in the
> dev
> release first.  At least, this is the case with Ubuntu and SRUs, and it
> makes
> a lot of sense.
>
> Cheers,
> -Barry
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/gmludo%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150530/f2b26dbe/attachment-0001.html>

From ncoghlan at gmail.com  Sun May 31 00:26:42 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 31 May 2015 08:26:42 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CAON-fpFr9daNj45nHY1kp5NaCHAvCEgdKCSz7SLMh4H-oeC46A@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530114219.0b9beaa7@anarchist.wooz.org>
 <CAON-fpFr9daNj45nHY1kp5NaCHAvCEgdKCSz7SLMh4H-oeC46A@mail.gmail.com>
Message-ID: <CADiSq7dZ73tg9tU4Zg+97edx1Q0Lx7s0W3jAf8Tbh3BzMyg1hg@mail.gmail.com>

On 31 May 2015 04:20, "Ludovic Gasc" <gmludo at gmail.com> wrote:
>
> For now, I'm following the mailing-lists from a spy-glass: I don't read
most of the e-mails.
> However, this thread seems to be "infected": I can smell from here your
emotions behind your words.
>
> Why to push a lot of emotions inside a technical discussion ?
> What's the nerves have been hit with this discussion ?

I think you answered your own question fairly well - there's a
longstanding, but rarely articulated, culture clash between the folks that
are primarily interested in the innovators and early adopters side of
things, and those of us that are most interested in bridging the gap to the
early majority, late majority and laggards.

Add in the perfectly reasonable wariness a lot of folks have regarding the
potential for commercial interests to unfairly exploit open source
contributors without an adequate return contribution of development effort,
gratis software, gratis services, or interesting employment opportunities,
and you're going to see the occasional flare-ups as we find those rough
edges where differences in motivation & background lead to differences of
opinion & behaviour.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/effeeb78/attachment.html>

From larry at hastings.org  Sun May 31 01:20:48 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 30 May 2015 16:20:48 -0700
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CABVPEKqW1qoHLSPppqtEELbLR=15cqUKMAk55CTVv+F1HqqUtw@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <CABVPEKqW1qoHLSPppqtEELbLR=15cqUKMAk55CTVv+F1HqqUtw@mail.gmail.com>
Message-ID: <556A45D0.7070606@hastings.org>

On 05/30/2015 07:26 AM, Toshio Kuratomi wrote:
>
> Porting performance features from python 3 to python 2 has the 
> disadvantage of cutting into a compelling business case for users to 
> move forward to python 3.[1]  so doing this has a cost to python 3 
> adoption.  But, the question is whether there is a benefit that 
> outweighs that cost. [...]
>

Backporting performance enhancements from 3 to 2 does seem to be 
counterproductive from the perspective of the Core Dev community. But 
certainly in this case, when Intel drops a major bundle of working code 
in our collective lap, it absolutely feels like the right thing to me to 
check it in and support it.  And happily the Python Core Dev community 
generally does the right thing.

Consider the flip side--what if we'd refused to accept it?  What sort of 
signal would that be to the Python community?  I don't know, but I'd 
guess that people would harbor ill will and distrust.  I'd rather the 
community liked and trusted us; that makes it more likely they'll listen 
when we say "honest, Python 3 is better than 2--c'mon over!"


//arry/

p.s. Supporting this patch also helps cut into PyPy's reported 
performance lead--that is, if they ever upgrade speed.pypy.org from 
comparing against Python *2.7.2*.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150530/d39a4e90/attachment.html>

From zachary.ware+pydev at gmail.com  Sun May 31 05:25:43 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Sat, 30 May 2015 22:25:43 -0500
Subject: [Python-Dev] Can someone configure the buildbots to build the
	3.5 branch?
In-Reply-To: <5567ABD3.6090108@hastings.org>
References: <5567ABD3.6090108@hastings.org>
Message-ID: <CAKJDb-OicQJQ27kN6iGjGOSb4fN6NWN_iaO493Beiz3q9mowtg@mail.gmail.com>

On Thu, May 28, 2015 at 6:59 PM, Larry Hastings <larry at hastings.org> wrote:
> The buildbots currently live in a state of denial about the 3.5 branch.
> Could someone whisper tenderly in their collective shell-like ears so that
> they start building 3.5, in addition to 3.4 and trunk?

The 3.5 branch seems to be set up on the buildbots, we'll see how it
goes when somebody commits something to 3.5.

-- 
Zach

From greg.ewing at canterbury.ac.nz  Sun May 31 00:37:55 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 31 May 2015 10:37:55 +1200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7cfJjd7rvmZiAT0Upn1Nm0Z9pjJJUUOuAOnrG8VnkZqSA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530125811.56b5ec5f@fsol>
 <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>
 <20150530133722.52a3ed11@fsol>
 <CADiSq7cfJjd7rvmZiAT0Upn1Nm0Z9pjJJUUOuAOnrG8VnkZqSA@mail.gmail.com>
Message-ID: <556A3BC3.6090207@canterbury.ac.nz>

Nick Coghlan wrote:

> We've long had a requirement that certain kinds of proposal come with
> at least nominal support commitments from the folks proposing them
> (e.g. adding modules to the standard library, supporting new
> platforms). Institutions with a clear financial interest in a
> particular problem area can certainly make such commitments more
> credibly,

Are such commitments from commercial entities really
any more reliable in the long term than anyone else's?
Such entities can be expected to drop them as soon as
they perceive them as no longer being in their financial
interests.

-- 
Greg



From ncoghlan at gmail.com  Sun May 31 07:23:49 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 31 May 2015 15:23:49 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <556A45D0.7070606@hastings.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <CABVPEKqW1qoHLSPppqtEELbLR=15cqUKMAk55CTVv+F1HqqUtw@mail.gmail.com>
 <556A45D0.7070606@hastings.org>
Message-ID: <CADiSq7dZT8+_-yrA3VKng3uArQ_MfRTa0Om3FYLEVysC5zKBZQ@mail.gmail.com>

On 31 May 2015 at 09:20, Larry Hastings <larry at hastings.org> wrote:
> On 05/30/2015 07:26 AM, Toshio Kuratomi wrote:
>
> Porting performance features from python 3 to python 2 has the disadvantage
> of cutting into a compelling business case for users to move forward to
> python 3.[1]  so doing this has a cost to python 3 adoption.  But, the
> question is whether there is a benefit that outweighs that cost. [...]
>
> Backporting performance enhancements from 3 to 2 does seem to be
> counterproductive from the perspective of the Core Dev community.  But
> certainly in this case, when Intel drops a major bundle of working code in
> our collective lap, it absolutely feels like the right thing to me to check
> it in and support it.  And happily the Python Core Dev community generally
> does the right thing.

There's another benefit that I didn't think to mention earlier, which
is that getting folks from Python 2 -> Python 3 isn't actually my
major version adoption concern at the moment: I'm more interested in
how I can persuade them to stop using Python *2.6*, which is still a
higher proportion of PyPI downloads with an identifiable client
version than Python 3 [1], and the relative proportions between them
are likely to be even worse once we start venturing inside corporate
firewalls where direct downloads from PyPI aren't permitted.

While I suspect barriers to migration at the distro level carry a fair
bit of the blame there (and we're working on those), performance
improvements in the 2.7 branch help provide an additional carrot to
assist in that process, complementing the stick of trying to educate
the community at large that it's unrealistic and exploitative [2] for
folks to expect free community support for versions of Python that are
so old that not even the core development team support them any more
(i.e. Python 2.6 and earlier).

My one consolation is that the Python community are far from alone in
struggling to win that fight against institutional inertia once folks
have widely adopted a version of a product that "works for them". My
theory is that folks will pay to be able to keep using these older
systems because our industry doesn't have very good tools for
quantifying the cost of the technical debt incurred by attempting to
maintain the status quo in the face of an evolving ecosystem. As
infrastructure change management practices improve (e.g. through ideas
like Holistic Software's hybrid dynamic management [3]), and not only
the platform level tools but also the related business models evolve
to better support those approaches, I'm hoping we'll see things change
for the better not just in terms of Python in particular, but in terms
of institutional infrastructure as a whole.

Cheers,
Nick.

[1] https://caremad.io/2015/04/a-year-of-pypi-downloads/
[2] http://www.curiousefficiency.org/posts/2015/04/stop-supporting-python26.html
[3] http://www.holistic-software.com/hybrid-dynamic-model

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Sun May 31 07:31:44 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 31 May 2015 15:31:44 +1000
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <556A3BC3.6090207@canterbury.ac.nz>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530125811.56b5ec5f@fsol>
 <CADiSq7du2T8_JK2L94PBYDJDx3U8bWZatE1ymgrCPM1ntqq3Ew@mail.gmail.com>
 <20150530133722.52a3ed11@fsol>
 <CADiSq7cfJjd7rvmZiAT0Upn1Nm0Z9pjJJUUOuAOnrG8VnkZqSA@mail.gmail.com>
 <556A3BC3.6090207@canterbury.ac.nz>
Message-ID: <CADiSq7eMFNJjWJvyR6orZbuw-QhiSYJH1B9pRXfwCqbeEbfW8w@mail.gmail.com>

On 31 May 2015 at 08:37, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>
>> We've long had a requirement that certain kinds of proposal come with
>> at least nominal support commitments from the folks proposing them
>> (e.g. adding modules to the standard library, supporting new
>> platforms). Institutions with a clear financial interest in a
>> particular problem area can certainly make such commitments more
>> credibly,
>
> Are such commitments from commercial entities really
> any more reliable in the long term than anyone else's?
> Such entities can be expected to drop them as soon as
> they perceive them as no longer being in their financial
> interests.

Yes, if the credibility stems from the market situation and the
financial incentives leading an organisation to make the offer, rather
than from the personal interest of one or two key folks at that
organisation. Structural incentives are harder to shift than personal
interests, so this is a case where institutional inertia actually
works for the community rather than against us.

It's not an ironclad guarantee (since businesses fail, divisions get
shut down, companies decide to exit markets, etc), but if we
understand the business case backing an investment decision (whether
that investment is in the form of funding, developer time, or both),
that's genuinely more reliable than commitments from individuals
(since we don't have the kind of ability to manage and distribute risk
that larger organisations do).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From xavier.combelle at gmail.com  Sun May 31 11:14:56 2015
From: xavier.combelle at gmail.com (Xavier Combelle)
Date: Sun, 31 May 2015 11:14:56 +0200
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
Message-ID: <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>

> +1. The new embeddable Python distribution for Windows is a great step
> forward for this. It's not single-file, but it's easy to produce a
> single-directory self-contained application with it. I don't know if
> there's anything equivalent for Linux/OSX - maybe it's something we
> should look at for them as well (although the whole "static binaries"
> concept seems to be fairly frowned on in the Unix world, from what
> I've seen).
>
> Just curious What is "the new embeddable Python distribution for Windows" ?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/4cc39e4d/attachment.html>

From p.f.moore at gmail.com  Sun May 31 12:41:38 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Sun, 31 May 2015 11:41:38 +0100
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
Message-ID: <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>

On 31 May 2015 at 10:14, Xavier Combelle <xavier.combelle at gmail.com> wrote:
>> +1. The new embeddable Python distribution for Windows is a great step
>> forward for this. It's not single-file, but it's easy to produce a
>> single-directory self-contained application with it. I don't know if
>> there's anything equivalent for Linux/OSX - maybe it's something we
>> should look at for them as well (although the whole "static binaries"
>> concept seems to be fairly frowned on in the Unix world, from what
>> I've seen).
>>
> Just curious What is "the new embeddable Python distribution for Windows" ?

Python 3.5 ships a zipfile which contains a self-contained Python
installation, intended for embedding. The idea is that you unzip it
into your application directory, and use it from within your
application (either via the embedding API, or using the included
python.exe/pythonw.exe). It doesn't use the registry, or any global
resources, so it's independent of any installed python that might be
present.

Paul

From p.f.moore at gmail.com  Sun May 31 12:47:50 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Sun, 31 May 2015 11:47:50 +0100
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
 <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
Message-ID: <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>

On 31 May 2015 at 11:41, Paul Moore <p.f.moore at gmail.com> wrote:
> On 31 May 2015 at 10:14, Xavier Combelle <xavier.combelle at gmail.com> wrote:
>>> +1. The new embeddable Python distribution for Windows is a great step
>>> forward for this. It's not single-file, but it's easy to produce a
>>> single-directory self-contained application with it. I don't know if
>>> there's anything equivalent for Linux/OSX - maybe it's something we
>>> should look at for them as well (although the whole "static binaries"
>>> concept seems to be fairly frowned on in the Unix world, from what
>>> I've seen).
>>>
>> Just curious What is "the new embeddable Python distribution for Windows" ?
>
> Python 3.5 ships a zipfile which contains a self-contained Python
> installation, intended for embedding. The idea is that you unzip it
> into your application directory, and use it from within your
> application (either via the embedding API, or using the included
> python.exe/pythonw.exe). It doesn't use the registry, or any global
> resources, so it's independent of any installed python that might be
> present.

By the way, IMO the new embeddable distribution is a pretty big deal
on Windows. To make sure that it doesn't end up unnoticed, can I
suggest we include a prominent "What's New" entry for it, and a
section in "Python Setup and Usage" under "Using Python on Windows"
for it?

I'd hate to find that 3 or 4 versions from now, we're still trying to
remind people that they can use the embeddable distribution, in the
same way that executable zipfiles ended up an almost unknown feature
for ages.
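For anyone who hasn't run across them, the "executable zipfiles" Paul
mentions can be sketched with the stdlib zipapp module (added in 3.5);
the file names and the printed message here are purely illustrative:

```python
import os
import subprocess
import sys
import tempfile
import zipapp

with tempfile.TemporaryDirectory() as tmp:
    # A minimal "application": a directory with a __main__.py entry point.
    src = os.path.join(tmp, "app")
    os.mkdir(src)
    with open(os.path.join(src, "__main__.py"), "w") as f:
        f.write("print('hello from a zipapp')\n")

    # Bundle the directory into a single runnable .pyz archive.
    target = os.path.join(tmp, "app.pyz")
    zipapp.create_archive(src, target)

    # The archive runs like a script: python app.pyz
    out = subprocess.run([sys.executable, target],
                         capture_output=True, text=True)
    print(out.stdout.strip())  # -> hello from a zipapp
```

The resulting app.pyz is a single file you can copy around and run with
any compatible interpreter, which is exactly the distribution story the
embeddable package complements on Windows.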

Paul

From gmludo at gmail.com  Sun May 31 11:07:58 2015
From: gmludo at gmail.com (Ludovic Gasc)
Date: Sun, 31 May 2015 11:07:58 +0200
Subject: [Python-Dev] 2.7 is here until 2020,
	please don't call it a waste.
In-Reply-To: <CADiSq7dZ73tg9tU4Zg+97edx1Q0Lx7s0W3jAf8Tbh3BzMyg1hg@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <7A4DF5A2-5D04-4F5A-9883-B5E815A14909@gmail.com>
 <CAP1=2W5cZZzwQj0kqHgKozoKhCL_2fvx2cydNqOZBVW1BNGEdw@mail.gmail.com>
 <CANc-5Uwx5rVJfYfQK6_-SxnJSSLv62t7p4NddC=mNXda=WWJDQ@mail.gmail.com>
 <mk7asm$37i$1@ger.gmane.org>
 <CAP7+vJKDqU8TakYoQ6+z+zt_vs823MGZ9YzcFOBOwusYRtyqKg@mail.gmail.com>
 <CAMpsgwY+6CPWWEr2yyNgiTYCWJ9p67UNy8_p5hR2WQ9MtXXnbQ@mail.gmail.com>
 <CADiSq7eGd-R8PjNYKnn8XmMF7BD-s9cd65E+HNnXzRi-gSa4GA@mail.gmail.com>
 <CAGE7PNJqO36PXi3P8a6Ov1vkN2KigDnaMYQGyFcophYiSe+GNw@mail.gmail.com>
 <5568FAF6.6040802@python.org> <20150530015621.518b537f@fsol>
 <CADiSq7c35TP5pfH6AuEg7-hESf7-gkbLeh40i6rHREhvfqziRA@mail.gmail.com>
 <556906F0.4090504@sdamon.com>
 <CADiSq7daKMxSoFswRaYF2_PKtRGK2nMoJ69N4weKc7E=KKreYA@mail.gmail.com>
 <20150530114219.0b9beaa7@anarchist.wooz.org>
 <CAON-fpFr9daNj45nHY1kp5NaCHAvCEgdKCSz7SLMh4H-oeC46A@mail.gmail.com>
 <CADiSq7dZ73tg9tU4Zg+97edx1Q0Lx7s0W3jAf8Tbh3BzMyg1hg@mail.gmail.com>
Message-ID: <CAON-fpGr-HViSRa173BNbsyZm4KPMXSPfL_fZf-AhUZu59Q35w@mail.gmail.com>

2015-05-31 0:26 GMT+02:00 Nick Coghlan <ncoghlan at gmail.com>:

>
> On 31 May 2015 04:20, "Ludovic Gasc" <gmludo at gmail.com> wrote:
> >
> > For now, I'm following the mailing-lists from a spy-glass: I don't read
> most of the e-mails.
> > However, this thread seems to be "infected": I can smell from here your
> emotions behind your words.
> >
> > Why to push a lot of emotions inside a technical discussion ?
> > What's the nerves have been hit with this discussion ?
>
> I think you answered your own question fairly well
>
Thanks.

> - there's a longstanding, but rarely articulated, culture clash between
> the folks that are primarily interested in the innovators and early
> adopters side of things, and those of us that are most interested in
> bridging the gap to the early majority, late majority and laggards.
>
> Add in the perfectly reasonable wariness a lot of folks have regarding the
> potential for commercial interests to unfairly exploit open source
> contributors without an adequate return contribution of development effort,
> gratis software, gratis services,
>
Based on my professional experience, the more a client pays for your
skills, the more likely he is to respect you, because he knows your
value. Conversely, the less a client pays, the more he will try to
manipulate you into doing more than was planned in the contract.

Now, for open source software, there is no monetary cost, but you still
have the knowledge cost.
If you replace money with knowledge in my two previous sentences, those
sentences remain true.

However, things aren't binary: apart from the contribution level [1] of
each member, the "good" and "bad" ideas for the future of Python can
come from anybody.
The only thing I'm sure of is that I'm incompetent at predicting the
future: I've no idea how each member of our community will react, I can
only list some possible scenarios.
But with the Internet, you know as well as I do that only a few people
can change a lot of things; look at Edward Snowden, for example.

About the Python 3 migration, I think that one of our best levers is
newcomers, and by extension, Python trainers/teachers.
If newcomers learn Python 3 first, then when they start to work
professionally, they should help rationalize the Python 3 migration
inside existing dev teams, especially because they don't have a conflict
of interest rooted in having already written plenty of code with
Python 2.
2020 is around the corner; whether 5 years will be enough to change the
community's mind, I don't know.

[1] Don't forget that contributions aren't only the source code ;-)

> or interesting employment opportunities, and you're going to see the
> occasional flare-ups as we find those rough edges where differences in
> motivation & background lead to differences of opinion & behaviour.
>
> Cheers,
> Nick.
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/2c948192/attachment.html>

From tritium-list at sdamon.com  Sun May 31 12:59:43 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Sun, 31 May 2015 06:59:43 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
 <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
 <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>
Message-ID: <556AE99F.8060801@sdamon.com>

A better course of action would be to deprecate the non-portable 
version.  Other than setting the PATH envvar, why do we need to continue 
even touching the system on install?  It is highly annoying for those of 
us that maintain several installs of python on a single windows system, 
and it really should stop.

The only use I can think of for ever touching the registry in the first 
place is to tell distutils installers where python is.  I can tell you 
right now, that design choice is a bug.  There are some mighty hacks you 
have to go through to correct that behavior when you happen to be using 
a virtualenv.

(We are calling it 'embeddable', but the rest of the world would call it 
'portable', as in, runnable from a USB stick)

On 5/31/2015 06:47, Paul Moore wrote:
> On 31 May 2015 at 11:41, Paul Moore <p.f.moore at gmail.com> wrote:
>> On 31 May 2015 at 10:14, Xavier Combelle <xavier.combelle at gmail.com> wrote:
>>>> +1. The new embeddable Python distribution for Windows is a great step
>>>> forward for this. It's not single-file, but it's easy to produce a
>>>> single-directory self-contained application with it. I don't know if
>>>> there's anything equivalent for Linux/OSX - maybe it's something we
>>>> should look at for them as well (although the whole "static binaries"
>>>> concept seems to be fairly frowned on in the Unix world, from what
>>>> I've seen).
>>>>
>>> Just curious What is "the new embeddable Python distribution for Windows" ?
>> Python 3.5 ships a zipfile which contains a self-contained Python
>> installation, intended for embedding. The idea is that you unzip it
>> into your application directory, and use it from within your
>> application (either via the embedding API, or using the included
>> python.exe/pythonw.exe). It doesn't use the registry, or any global
>> resources, so it's independent of any installed python that might be
>> present.
> By the way, IMO the new embeddable distribution is a pretty big deal
> on Windows. To make sure that it doesn't end up unnoticed, can I
> suggest we include a prominent "What's New" entry for it, and a
> section in "Python Setup and Usage" under "Using Python on Windows"
> for it?
>
> I'd hate to find that 3 or 4 versions from now, we're still trying to
> remind people that they can use the embeddable distribution, in the
> same way that executable zipfiles ended up an almost unknown feature
> for ages.
>
> Paul
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com


From benno at benno.id.au  Sun May 31 14:35:57 2015
From: benno at benno.id.au (Ben Leslie)
Date: Sun, 31 May 2015 22:35:57 +1000
Subject: [Python-Dev] Obtaining stack-frames from co-routine objects
In-Reply-To: <55687044.1090700@gmail.com>
References: <CABZ0LtAPMi9nQb9ZSp4jW4-kvTte_WVjz=wtKvUWmGeSwuyqVA@mail.gmail.com>
 <55687044.1090700@gmail.com>
Message-ID: <CABZ0LtDX8J-Uc6kBBQ93cCF1NzAcebA6oWsf03WX5VuViL0niA@mail.gmail.com>

Hi Yury,

I'm just starting my exploration into using async/await; all my
'real-world' scenarios are currently hypothetical.

One such hypothetical scenario, however: I have a server
process running, with some set of concurrent connections, each managed
by a co-routine.
some combination of reading files, reading from database, reading from
peripherals. If I notice one of those co-routines appears stuck and
not making progress, I'd very much like to debug that, and preferably
in a way that doesn't necessarily stop the rest of the server (or even
the co-routine that appears stuck).

The problem with the "if debug: log(...)" approach is that you need
foreknowledge of the fault state occurring; on a busy server you don't
want to just be logging every 'switch()'. I guess you could do
something like "switch_state[outer_coro] = get_current_stack_frames()"
on each switch. To me, double book-keeping something that the
interpreter already knows seems somewhat wasteful, but maybe it isn't
really too bad.
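
In case it helps make the trade-off concrete, the book-keeping I have in
mind might look roughly like this (just a sketch; `switch_state` and
`record_switch` are hypothetical names, standing in for whatever the
scheduler's switch point actually is):

```python
import inspect

# Hypothetical double book-keeping: snapshot the interpreter's stack on
# every switch, keyed by the outer coroutine, so a stuck coroutine can
# be inspected later without stopping the server.
switch_state = {}

def record_switch(outer_coro):
    # inspect.stack()[0] is record_switch itself; keep the caller's frames.
    switch_state[outer_coro] = inspect.stack()[1:]
```

Each entry holds `FrameInfo` tuples (frame, filename, line number,
function name), which is exactly the information a stuck-coroutine report
would want, at the cost of capturing it on every single switch.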

Cheers,

Ben

On 29 May 2015 at 23:57, Yury Selivanov <yselivanov.ml at gmail.com> wrote:
> Hi Ben,
>
> Is there any real-world scenario where you would need this?
>
> It looks like this can help with debugging, somehow, but the easiest
> solution is to put a "if debug: log(...)" before "yield" in your
> "switch()" function.  You'll have a perfect traceback there.
>
> Thanks,
> Yury
>
>
> On 2015-05-29 12:46 AM, Ben Leslie wrote:
>>
>> Hi all,
>>
>> Apologies in advance; I'm not a regular, and this may have been
>> handled already (but I couldn't find it when searching).
>>
>> I've been using the new async/await functionality (congrats again to
>> Yury on getting that through!), and I'd like to get a stack trace
>> between the place at which blocking occurs and the outer co-routine.
>>
>> For example, consider this code:
>>
>> """
>> async def a():
>>      await b()
>>
>> async def b():
>>      await switch()
>>
>> @types.coroutine
>> def switch():
>>      yield
>>
>> coro_a = a()
>> coro_a.send(None)
>> """
>>
>> At this point I'd really like to be able to somehow get a stack trace
>> similar to:
>>
>> test.py:2
>> test.py:4
>> test.py:9
>>
>> Using the gi_frame attribute of coro_a, I can get the line number of
>> the outer frame (e.g.: line 2), but from there there is no way to
>> descend the stack to reach the actual yield point.
>>
>> I thought that perhaps the switch() co-routine could yield the frame
>> object returned from inspect.currentframe(), however once that
>> function yields that frame object has f_back changed to None.
>>
>> A hypothetical approach would be to work the way down from the
>> outer-frame, but that requires getting access to the co-routine object
>> that the outer-frame is currently await-ing. Some hypothetical code
>> could be:
>>
>> """
>> def show(coro):
>>     print("{}:{}".format(coro.gi_frame.f_code.co_filename,
>>                          coro.gi_frame.f_lineno))
>>     if dis.opname[coro.gi_code.co_code[coro.gi_frame.f_lasti + 1]] == 'YIELD_FROM':
>>         show(coro.gi_frame.f_stack[0])
>> """
>>
>> This relies on the fact that an await-ing co-routine will be executing
>> a YIELD_FROM instruction. The above code uses a completely
>> hypothetical 'f_stack' property of frame objects to pull the
>> co-routine object which a co-routine is currently await-ing from the
>> stack. I've implemented a proof-of-concept f_stack property in the
>> frameobject.c just to test out the above code, and it seems to work.
>>
>> With all that, some questions:
>>
>> 1) Does anyone else see value in trying to get the stack-trace down to
>> the actual yield point?
>> 2) Is there a different way of doing it that doesn't require changes
>> to Python internals?
>> 3) Assuming no to #2 is there a better way of getting the information
>> compared to the pretty hacky byte-code/stack inspection?
>>
>> Thanks,
>>
>> Ben
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>> https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/benno%40benno.id.au

From ncoghlan at gmail.com  Sun May 31 16:15:01 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 1 Jun 2015 00:15:01 +1000
Subject: [Python-Dev] Python 3 migration status update across some key
 subcommunities (was Re: 2.7 is here until 2020,
 please don't call it a waste.)
Message-ID: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>

On 31 May 2015 at 19:07, Ludovic Gasc <gmludo at gmail.com> wrote:
> About Python 3 migration, I think that one of our best control stick is
> newcomers, and by extension, Python trainers/teachers.
> If newcomers learn first Python 3, when they will start to work
> professionally, they should help to rationalize the Python 3 migration
> inside existing dev teams, especially because they don't have an interest
> conflict based on the fact that they haven't written plenty of code with
> Python 2.
> 2020 is around the corner, 5 years shouldn't be enough to change the
> community mind, I don't know.

The education community started switching a while back - if you watch
Carrie-Anne Philbin's PyCon UK 2014 keynote, one of her requests for
the broader Python community was for everyone else to just catch up
already in order to reduce student's confusion (she phrased it more
politely than that, though). Educators need to tweak examples and
exercises to account for a version switch, but that's substantially
easier than migrating hundreds of thousands or even millions of lines
of production code.

And yes, if you learn Python 3 first, subsequently encountering Python
2's quirks and cruft is likely to encourage folks that know both
versions of the language to start advocating for a version upgrade :)

After accounting for the "Wow, the existing Python 2 install base is
even larger than we realised" factor, the migration is actually in a
pretty good place overall these days. The "enterprise" crowd really
are likely to be the only ones that might need the full remaining 5
years of migration time (and they may potentially have even more time,
if they're relying on a commercial redistributor).

Web frameworks have allowed Python 3 development for a while now, and
with Django switching their tutorial to Python 3 by default, Django
downloads via pip show one of the highest proportions of Python 3
adoption on PyPI. www.python.org itself is now a production Python 3
Django web service, and the next generation of pypi.python.org will be
a Pyramid application that's also running on Python 3.

The dedicated async/await syntax in 3.5 represents a decent carrot to
encourage migration for anyone currently using yield (or yield from)
based coroutines, since the distinct syntax not only allows for easier
local reasoning about whether something is an iterator or a coroutine,
but also provides a much improved user experience for asynchronous
iterators and context managers (including finally handling the
"asynchronous database transaction as a context manager" case, which
previous versions of Python couldn't really do at all).
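
To illustrate that last case, an asynchronous transaction as a context
manager looks roughly like this under PEP 492 (a sketch only; the
Transaction class and its awaits are illustrative, not any real database
API):

```python
import asyncio

# Sketch of "asynchronous database transaction as a context manager".
# The awaits stand in for async BEGIN / COMMIT / ROLLBACK calls.
class Transaction:
    def __init__(self):
        self.state = "new"

    async def __aenter__(self):
        await asyncio.sleep(0)  # stand-in for an asynchronous BEGIN
        self.state = "begun"
        return self

    async def __aexit__(self, exc_type, exc, tb):
        await asyncio.sleep(0)  # stand-in for COMMIT or ROLLBACK
        self.state = "committed" if exc_type is None else "rolled back"

async def main():
    tx = Transaction()
    async with tx:
        pass  # database work would happen here
    return tx.state

loop = asyncio.new_event_loop()
print(loop.run_until_complete(main()))  # committed
loop.close()
```

Yield-based coroutines couldn't express this, because __exit__ had no way
to suspend; __aexit__ being awaitable is the new piece.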

The matrix multiplication operator is similarly a major improvement
for the science and data analysis part of the Python community.

In terms of reducing *barriers* to adoption, after inviting them to
speak at the 2014 language summit, we spent a fair bit of time with
the Twisted and Mercurial folks over the past year or so working
through "What's still missing from Python 3 for your use cases?", as
Python 3.4 was still missing some features for binary data
manipulation where we'd been a bit too ruthless in pruning back the
binary side of things when deciding what counted as text-only
features, and what was applicable to binary data as well. So 3.5
brings back binary interpolation, adds a hex() method to bytes, and
adds binary data support directly to a couple of standard library
modules (tempfile, difflib).
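
Concretely, the two bytes-level additions look like this in 3.5:

```python
# Binary interpolation (PEP 461) and the new bytes.hex() method.
payload = b"\x00\x01\xff"
print(payload.hex())                             # 0001ff
print(b"len=%d body=%s" % (len(payload), payload))
```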

If I understand the situation correctly, the work Guido et al have
been doing on PEP 484 and type hinting standardisation is also aimed
at reducing barriers to Python 3 adoption, by making it possible to
develop better migration tools that are more semantically aware than
the existing syntax focused tools. The type hinting actually acts as a
carrot as well, since it's a feature that mainly shows its value when
attempting to scale a *team* to larger sizes (as it lets you delegate
more of the code review process to an automated tool, letting the
human reviewers spend more time focusing on higher level semantic
concerns).
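
For anyone who hasn't seen the PEP 484 syntax yet, a minimal example of
the annotations a checker would consume (the interpreter itself ignores
them at runtime):

```python
# PEP 484 style hints, using the typing module shipped with Python 3.5.
from typing import List

def mean(values: List[float]) -> float:
    return sum(values) / len(values)

print(mean([1.0, 2.0, 3.0]))  # 2.0
```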

Finally, both Debian/Ubuntu and Fedora are well advanced in their
efforts to replace Python 2 with Python 3 in their respective default
images (but keeping Py2 available in their package repos). That work
is close to finished now (myself, Slavek Kabrda, Barry Warsaw, and
Matthias Klose had some good opportunities to discuss that at PyCon),
although there are still some significant rough edges to figure out
(such as coming up with a coherent cross-platform story for what we're
going to do with the Python symlink), as well as a few more key
projects to either migrate entirely, or at least finish porting to the
source compatible subset of Python 2 & 3 (e.g. Samba).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From tjreedy at udel.edu  Sun May 31 16:44:33 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sun, 31 May 2015 10:44:33 -0400
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <556AE99F.8060801@sdamon.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
 <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
 <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>
 <556AE99F.8060801@sdamon.com>
Message-ID: <mkf6pb$6v0$1@ger.gmane.org>

On 5/31/2015 6:59 AM, Alexander Walters wrote:
> A better course of action would be to deprecate the non-portable
> version.  Other than setting the PATH envvar, why do we need to continue
> even touching the system on install?  It is highly annoying for those of
> us that maintain several installs of python on a single windows system,
> and it really should stop.

Some people want the right-click context menu entries -- Run (also 
double click) and Edit with Idle, which should be Edit with Idle x.y.

> The only use I can think of for ever touching the registry in the first
> place is to tell distutils installers where python is.  I can tell you
> right now, that design choice is a bug.  There are some mighty hacks you
> have to go through to correct that behavior when you happen to be using
> a virtualenv.
>
> (We are calling it 'embedable', but the rest of the world would call it
> 'portable', as in, runable from a usb stick)

-- 
Terry Jan Reedy


From me at the-compiler.org  Sun May 31 16:44:30 2015
From: me at the-compiler.org (Florian Bruhin)
Date: Sun, 31 May 2015 16:44:30 +0200
Subject: [Python-Dev] Python 3 migration status update across some key
 subcommunities (was Re: 2.7 is here until 2020,
 please don't call it a waste.)
In-Reply-To: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
References: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
Message-ID: <20150531144430.GD469@tonks>

* Nick Coghlan <ncoghlan at gmail.com> [2015-06-01 00:15:01 +1000]:
> On 31 May 2015 at 19:07, Ludovic Gasc <gmludo at gmail.com> wrote:
> > About Python 3 migration, I think that one of our best control stick is
> > newcomers, and by extension, Python trainers/teachers.
> > If newcomers learn first Python 3, when they will start to work
> > professionally, they should help to rationalize the Python 3 migration
> > inside existing dev teams, especially because they don't have an interest
> > conflict based on the fact that they haven't written plenty of code with
> > Python 2.
> > 2020 is around the corner, 5 years shouldn't be enough to change the
> > community mind, I don't know.
> 
> The education community started switching a while back - if you watch
> Carrie-Anne Philbin's PyCon UK 2014 keynote, one of her requests for
> the broader Python community was for everyone else to just catch up
> already in order to reduce student's confusion (she phrased it more
> politely than that, though). Educators need to tweak examples and
> exercises to account for a version switch, but that's substantially
> easier than migrating hundreds of thousands or even millions of lines
> of production code.
> 
> And yes, if you learn Python 3 first, subsequently encountering Python
> 2's quirks and cruft is likely to encourage folks that know both
> versions of the language to start advocating for a version upgrade :)

I think a big issue here is the lack of good newcomer tutorials for
Python 3.

In the #python IRC channel, "learn Python the hard way"[1] is often
recommended, and the common consensus seems to be that all other
tutorials (other than the official one[2], which is clearly not aimed
at newcomers to programming) fall short in some way.

LPTHW is Python 2 only, so at least from what I see in #python, many
newcomers are recommended to learn Python 2 rather than 3 because of
that.

I agree migrating large existing codebases (and developers) from 2 to
3 can be quite an issue, and a lot of energy went into making this
easier (which is good!). But I also think nobody fresh to Python
should start learning Python 2 now, except when there's a compelling
reason (such as unported libraries without alternatives).

Florian

[1] http://learnpythonthehardway.org/book/
[2] https://docs.python.org/3/tutorial/index.html

-- 
http://www.the-compiler.org | me at the-compiler.org (Mail/XMPP)
   GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc
         I love long mails! | http://email.is-not-s.ms/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 819 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/e57fab94/attachment-0001.sig>

From tjreedy at udel.edu  Sun May 31 17:10:31 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sun, 31 May 2015 11:10:31 -0400
Subject: [Python-Dev] Python 3 migration status update across some key
 subcommunities (was Re: 2.7 is here until 2020,
 please don't call it a waste.)
In-Reply-To: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
References: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
Message-ID: <mkf8a1$tkk$1@ger.gmane.org>

On 5/31/2015 10:15 AM, Nick Coghlan wrote:

> The education community started switching a while back - if you watch
> Carrie-Anne Philbin's PyCon UK 2014 keynote, one of her requests for
> the broader Python community was for everyone else to just catch up
> already in order to reduce student's confusion (she phrased it more
> politely than that, though). Educators need to tweak examples and
> exercises to account for a version switch, but that's substantially
> easier than migrating hundreds of thousands or even millions of lines
> of production code.

There is another somewhat invisible but real aspect of migration that 
tends to get ignored: the Python embedded in applications.  LibreOffice 
4.0, for instance, upgraded from 2.6 to 3.3 (around Jan 14 I think). It 
is currently in lo4dir/program/python-core-3.3.1.  I presume unicode 
everywhere plus the new-in-3.3 efficient, cross-platform unicode 
implementation had something to do with this.  lo4dir/program/wizards is 
a package with subpackages and over 100 .py files.  There are now 
perhaps 20 million LO4 users (and indirect 3.3 users) around the world 
(my guess from Wikipedia article). A few will use the PyUNO bridge for 
scripting.  Installations are from CDs, direct downloads, torrents, and 
linux distributions, but not from pypi.  In a few years, the number 
might grow to 100 million as more LO3 users upgrade and new users start 
with LO4.

[...]

> In terms of reducing *barriers* to adoption, after inviting them to
> speak at the 2014 language summit, we spent a fair bit of time with
> the Twisted and Mercurial folks over the past year or so working
> through "What's still missing from Python 3 for your use cases?", as
> Python 3.4 was still missing some features for binary data
> manipulation where we'd been a bit too ruthless in pruning back the
> binary side of things when deciding what counted as text-only
> features, and what was applicable to binary data as well. So 3.5
> brings back binary interpolation, adds a hex() method to bytes, and
> adds binary data support directly to a couple of standard library
> modules (tempfile, difflib).

Perhaps we should investigate whether other apps with an embedded but 
user-accessible Python have migrated and, if not, ask why not 
(dependencies?) and whether a migration is planned.

-- 
Terry Jan Reedy


From ncoghlan at gmail.com  Sun May 31 17:16:15 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 1 Jun 2015 01:16:15 +1000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <mkf6pb$6v0$1@ger.gmane.org>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
 <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
 <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>
 <556AE99F.8060801@sdamon.com> <mkf6pb$6v0$1@ger.gmane.org>
Message-ID: <CADiSq7fFbpQPzqYPYj0aXHVG+ArBD0jmgSVmRBHLEPy2QLkR2w@mail.gmail.com>

On 1 June 2015 at 00:44, Terry Reedy <tjreedy at udel.edu> wrote:
> On 5/31/2015 6:59 AM, Alexander Walters wrote:
>>
>> A better course of action would be to deprecate the non-portable
>> version.  Other than setting the PATH envvar, why do we need to continue
>> even touching the system on install?  It is highly annoying for those of
>> us that maintain several installs of python on a single windows system,
>> and it really should stop.
>
>
> Some people want the right-click context menu entries -- Run (also double
> click) and Edit with Idle, which should be Edit with Idle x.y.

And system administrators responsible for deploying and maintaining
Standard Operating Environments want the MSI integration. In that
regard, the default behaviour of the python.org installers is the
rough equivalent of the system Python on Linux distributions (with the
added complexity of needing to deal with the Windows registry).

Portable installations are often good for developers, but they come at
the cost of failing to integrate properly with the underlying
operating system.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From stephen at xemacs.org  Sun May 31 17:39:14 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Mon, 01 Jun 2015 00:39:14 +0900
Subject: [Python-Dev] Python 3 migration status update across some key
 subcommunities (was Re: 2.7 is here until 2020,
 please don't call it a waste.)
In-Reply-To: <20150531144430.GD469@tonks>
References: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
 <20150531144430.GD469@tonks>
Message-ID: <87k2vou57h.fsf@uwakimon.sk.tsukuba.ac.jp>

Florian Bruhin writes:

 > I think a big issue here is the lack of good newcomer tutorials for
 > Python 3.

My business students (who are hardly advanced programmers) don't take
tutorials seriously.  They're way too focused on getting results.  And
there it's the "Doing <SOME PRACTICAL THING> with Python" books that
are the killer.  They just cargo cult those books, which are almost
all still Python-2-focused in my experience.

I don't think there's much we can do about those books except hope
they're popular enough to justify new editions in short order, but I
did want to point out that tutorials are not the only way beginners
are introduced to Python, and a lot of those entry ports remain
Python-2-oriented.

What I would really like to see is a Python 3 (and if you really need
Python 2, here's how it differs) version of Python: Essential
Reference.

BTW, for my students the main thing that trips them is not Unicode,
but rather things like the print function (vs. statement in Python 2).

 > But I also think nobody fresh to Python should start learning
 > Python 2 now, except when there's a compelling reason (such as
 > unported libraries without alternatives).

I agree, but the cargo cult thing is big for people coming to Python
because somebody told them it's a good way to do something practical.
(Fortunately my students have to deal with the insane proliferation of
encodings in Japan, so "less mojibake" is a compelling reason for
Python 3.  I get no backtalk.<wink/>)

From Steve.Dower at microsoft.com  Sun May 31 20:32:35 2015
From: Steve.Dower at microsoft.com (Steve Dower)
Date: Sun, 31 May 2015 18:32:35 +0000
Subject: [Python-Dev] Computed Goto dispatch for Python 2
In-Reply-To: <556AE99F.8060801@sdamon.com>
References: <34384DEB0F607E42BD61D446586AD4E86B51172C@ORSMSX103.amr.corp.intel.com>
 <CAF4280+-RDftrwUtM4Ro6E7NcMAaQsOvg0MUcNMTXaUo8v9LZg@mail.gmail.com>
 <CAK5idxQfCVvJ9tBcN=nZ5EfW+7LS68sg0uqLd84=2W2TL4P_sA@mail.gmail.com>
 <CADiSq7fgC+Z=00qO6YLPqtwQ4MvHV8WNo1WdOsndmJ_gNtVQnQ@mail.gmail.com>
 <etPan.55672824.4d4993ef.12a4d@Draupnir.home>
 <CACac1F_XxiDo7=uGvZwEUKHX3PfUWjr0Dzri5uUTKj74mAKbdA@mail.gmail.com>
 <CAEQcUJR-sYzAKUDGj1UjD+ZxnrOQBT1WXYqaSQWT+=o=S2cYjA@mail.gmail.com>
 <CACac1F99FgEiKUZ23__mN745QdPbB2k5ZVYUm+SHSAOJ6A-WQA@mail.gmail.com>
 <CACac1F8D492073iycPeX1sSvjGGwsYqaNiuEvQ-=wmLpzA2G7g@mail.gmail.com>,
 <556AE99F.8060801@sdamon.com>
Message-ID: <BY1PR03MB146656E77D8C96CDECC55FECF5B70@BY1PR03MB1466.namprd03.prod.outlook.com>

"We are calling it 'embedable', but the rest of the world would call it
'portable', as in, runable from a usb stick"

I called it embeddable because it's not intended for direct use and is not complete. There's no test suite, no documentation, no tkinter (pending high demand), no pip, no site-packages, and no folder structure. It really is meant to be a component in another application that provides the rest of the layout for its own needs. (I probably ought to blog about it so there's at least one detailed example of what it's for...)

A nice side-effect is that you can make a regular per-user install portable by adding a pyvenv.cfg with "applocal = True", which disables regular path resolution (and also ignores PYTHONPATH, which is a feature or a bug, depending on your point of view). This only works on Windows right now, but could probably be ported from getpathp.c into getpath.c easily.
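
For reference, that applocal switch is just a pyvenv.cfg placed next to
python.exe with the single line Steve describes (Windows-only behaviour
at present):

```ini
applocal = True
```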

Cheers,
Steve

Top-posted from my Windows Phone
________________________________
From: Alexander Walters<mailto:tritium-list at sdamon.com>
Sent: ?5/?31/?2015 6:39
To: python-dev at python.org<mailto:python-dev at python.org>
Subject: Re: [Python-Dev] Computed Goto dispatch for Python 2

A better course of action would be to deprecate the non-portable
version.  Other than setting the PATH envvar, why do we need to continue
even touching the system on install?  It is highly annoying for those of
us that maintain several installs of python on a single windows system,
and it really should stop.

The only use I can think of for ever touching the registry in the first
place is to tell distutils installers where python is.  I can tell you
right now, that design choice is a bug.  There are some mighty hacks you
have to go through to correct that behavior when you happen to be using
a virtualenv.

(We are calling it 'embedable', but the rest of the world would call it
'portable', as in, runable from a usb stick)

On 5/31/2015 06:47, Paul Moore wrote:
> On 31 May 2015 at 11:41, Paul Moore <p.f.moore at gmail.com> wrote:
>> On 31 May 2015 at 10:14, Xavier Combelle <xavier.combelle at gmail.com> wrote:
>>>> +1. The new embeddable Python distribution for Windows is a great step
>>>> forward for this. It's not single-file, but it's easy to produce a
>>>> single-directory self-contained application with it. I don't know if
>>>> there's anything equivalent for Linux/OSX - maybe it's something we
>>>> should look at for them as well (although the whole "static binaries"
>>>> concept seems to be fairly frowned on in the Unix world, from what
>>>> I've seen).
>>>>
>>> Just curious What is "the new embeddable Python distribution for Windows" ?
>> Python 3.5 ships a zipfile which contains a self-contained Python
>> installation, intended for embedding. The idea is that you unzip it
>> into your application directory, and use it from within your
>> application (either via the embedding API, or using the included
>> python.exe/pythonw.exe). It doesn't use the registry, or any global
>> resources, so it's independent of any installed python that might be
>> present.
> By the way, IMO the new embeddable distribution is a pretty big deal
> on Windows. To make sure that it doesn't end up unnoticed, can I
> suggest we include a prominent "What's New" entry for it, and a
> section in "Python Setup and Usage" under "Using Python on Windows"
> for it?
>
> I'd hate to find that 3 or 4 versions from now, we're still trying to
> remind people that they can use the embeddable distribution, in the
> same way that executable zipfiles ended up an almost unknown feature
> for ages.
>
> Paul
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com

_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/40036676/attachment.html>

From willingc at willingconsulting.com  Sun May 31 20:26:28 2015
From: willingc at willingconsulting.com (Carol Willing)
Date: Sun, 31 May 2015 11:26:28 -0700
Subject: [Python-Dev] Python 3 migration status update across some key
 subcommunities (was Re: 2.7 is here until 2020,
 please don't call it a waste.)
In-Reply-To: <87k2vou57h.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <CADiSq7fjMQhMzwpp=VS4+0mhM=s25BaKBJ-bRBSBuGOkjAug8g@mail.gmail.com>
 <20150531144430.GD469@tonks> <87k2vou57h.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <556B5254.30609@willingconsulting.com>

On 5/31/15 8:39 AM, Stephen J. Turnbull wrote:
> What I would really like to see is a Python 3 (and if you really need
> Python 2, here's how it differs) version of Python: Essential
> Reference.
Agreed.  If anyone has Python 3 books, talks, or resources that they 
find helpful and of high quality, please send me an email and I will 
happily curate a cheatsheet, document, or website with the results. For 
example, Harry Percival's TDD book and tutorials on PyVideo.org are well 
done with a Python 3 focus.

If you have other favorite Python 2 books that you wish were 
revised/rewritten to have a Python 3 focus, please email me that as well.
> I agree, but the cargo cult thing is big for people coming to Python
> because somebody told them it's a good way to do something practical.
For our user group attendees (whether novice or experienced, teens or 
post-docs), "practical and simple" trumps "shiny and complex". Search 
gives them a mountain of resources. Yet, these users are looking for 
guidance on a reasonable approach to do the practical things that 
interest them. These creators, innovators, and experimenters care less 
about programming language or version than they do about building their 
ideas. Fortunately, the Python language, especially when combined with 
the Python community and its outreach, enables building these 
ideas...when we are not tripping all over our own perspectives of which 
version "should" suit the use case. Practically, use whichever version 
is best suited to the use case.

Warmly,
Carol

P.S. Whether you develop for version 2, version 3, or both, thank you 
for doing so :-)
-- 
*Carol Willing*
Developer | Willing Consulting
https://willingconsulting.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-dev/attachments/20150531/42884be4/attachment.html>