> And as I understand it (from a quick scan), the reason it can’t tell isn’t that the refcount isn’t 1 (which is something CPython could maybe fix if it were a problem, but it isn’t). Rather, it is already 1, but a refcount of 1 doesn’t actually prove the value is a temporary.
This is at odds with my understanding, which is that the reason it can’t tell _is_ that the refcount isn’t 1. When you do `(b * c) + d`, `(b * c).__add__(d)` is entered with `self` at a refcount of 1 (the only reference is on the stack). I believe this hits the optimization in NumPy today. When you do `bc + d`, `bc.__add__(d)` is entered with `self` at a refcount of 2 (one on the stack, and one in `locals()`).
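You can observe the gap from pure Python. The sketch below is a toy stand-in for NumPy’s C-level refcount check (`Tracked` is a hypothetical class, not NumPy): the absolute numbers from `sys.getrefcount` include interpreter bookkeeping (the argument reference, the `self` local, the evaluation stack), so only the *difference* between the two calls is meaningful.

```python
import sys

counts = []

class Tracked:
    """Toy operand that records sys.getrefcount(self) inside __add__,
    a Python-level stand-in for NumPy's C-level refcount check."""
    def __add__(self, other):
        counts.append(sys.getrefcount(self))
        return other

bc = Tracked()
bc + None          # named operand: stack reference plus the `bc` binding
Tracked() + None   # pure temporary: the stack reference only

print(counts[0] - counts[1])  # → 1: the extra reference held by the name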
Under my `del` proposal, `(del bc) + d` would enter `bc.__add__(d)` with `self` at a refcount of 1.
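The effect can be approximated today by popping the name out of its namespace in the same expression that consumes the value, so the reference travels from the namespace straight to the evaluation stack. A sketch, assuming module-level names (so `globals()` holds the binding) and the same hypothetical `Tracked` class as a stand-in for an array type:

```python
import sys

counts = []

class Tracked:
    """Toy operand that records sys.getrefcount(self) inside __add__."""
    def __add__(self, other):
        counts.append(sys.getrefcount(self))
        return other

bc = Tracked()
bc + None  # named operand: stack reference plus the `bc` binding

bc = Tracked()
# Rough emulation of the proposed `(del bc) + None`: pop() removes the
# binding and hands the sole remaining reference to the expression.
globals().pop('bc') + None

print(counts[0] - counts[1])  # → 1: the popped operand has one fewer reference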
> Even if it could, surely the solution to “`a+b+c+d` is slow” can’t be “just write `(del (del a+b)+c)+d` and it’ll be fast”.
That would be a syntax error under my proposal: `del` remains an operator only on names, not on expressions.