
On Fri, Mar 13, 2020 at 4:04 AM Paul Moore <p.f.moore@gmail.com> wrote:
On Thu, 12 Mar 2020 at 16:42, Eric Wieser <wieser.eric+numpy@gmail.com> wrote:
Do you have an example of an application where this sort of micro-optimisation is significant enough to justify a fairly major language change?
As an example of these optimizations being valuable, see https://github.com/numpy/numpy/pull/7997, which claims the optimization I described at the beginning resulted in a 1.5x-2x speedup.
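For concreteness, here is a minimal, runnable sketch of the kind of optimization that PR exploits: numpy's temporary elision, which reuses an operand's buffer when its refcount is 1. This assumes CPython plus a reasonably recent numpy; the array size and timings are illustrative, and elision only kicks in for arrays above an internal size threshold.

    import timeit
    import numpy as np

    a = np.ones(1_000_000)
    b = np.ones(1_000_000)
    c = np.ones(1_000_000)

    # In "a + b + c", the intermediate (a + b) is an unnamed temporary
    # with refcount 1, so numpy may write the second addition straight
    # into its buffer instead of allocating a new array.
    print(timeit.timeit("a + b + c", globals=globals(), number=200))

    # Naming the intermediate gives it an extra reference, which
    # defeats the elision and forces a fresh allocation for each step.
    print(timeit.timeit("t = a + b; r = t + c", globals=globals(), number=200))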
I was amused by the first comment in that thread: "First, let me say that this is terrifying and this comment should not be taken as condoning it."
but that didn't need a language change to implement...
The language change I'm proposing isn't about _implementing_ these optimizations; it's about letting users exploit them without giving up the ability to name their variables.
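A sketch of the trade-off as it stands today (numpy assumed; the function names are just illustrative):

    import numpy as np

    def chained(data):
        # Exploits the refcount-1 optimization: every intermediate
        # here is an unnamed temporary, so numpy can reuse buffers.
        return (data * 2 + 1) / 3

    def named(data):
        # Clearer to read, but each name holds a reference, so each
        # step must allocate a fresh output buffer.
        doubled = data * 2
        shifted = doubled + 1
        return shifted / 3

    # Under the proposal one could keep both the names and the
    # optimization, e.g.:
    #     shifted = (del doubled) + 1   # proposed syntax, not valid today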
Fair point. But I still question the cost vs benefit of this. After all, this is a *language* change you're proposing - not just limited to CPython with its particular implementation of GC. Consider
    x = some_complex_object()
    y = (del x)
What is to stop a Python implementation (not CPython, maybe, but some other one) from discarding the object in x as soon as the (del x) expression unreferences it? Then y wouldn't be able to hold that value. After all, in theory, after the first line our object has a refcount of 1, the (del x) drops that refcount to 0, and the assignment to y brings it back to 1. The object could be discarded during the window in which its refcount is 0.
No, it wouldn't - the use of the value as a return value counts as a reference. It's exactly the same as any other function that returns a brand-new value.

ChrisA
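A small CPython-specific demonstration of that point: a value returned from a function is held by the return reference the whole time, so its refcount never touches 0 on the way to the assignment. (sys.getrefcount is CPython-only; the exact numbers are implementation details.)

    import sys

    def make():
        x = object()   # the local name x holds one reference
        return x       # returning transfers that reference to the
                       # caller; the count never drops to 0 in between

    y = make()
    # getrefcount adds one temporary reference for its own argument,
    # so 2 here means: the name y, plus the argument reference.
    print(sys.getrefcount(y))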