Having thought about it some, I propose it'd be acceptable to do
dead store optimization if-and-only-if optimizations are
explicitly enabled, e.g. with "-O". Allowing explicitly-enabled
optimizations to observably affect runtime behavior does have some
precedent, e.g. "-OO" which breaks doctest, docopt, etc. It'd be
a shame if the existence of locals() et al meant Python could
never ever perform dead store optimization.
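To make the observability concrete, here's a minimal sketch of why `locals()` rules out dead store elimination today (the function and values are illustrative):

```python
def f():
    _ = 42          # looks like a dead store: `_` is never read directly...
    return locals() # ...but the binding is observable through locals()

print(f())  # -> {'_': 42}; eliminating the store would change this output
```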
Assuming you're still talking about how to implement wildcards, it really sounds like you're willing to add a lot of complexity just to have a "consistent" treatment of `_`. But why would you care so much about that consistency? When I write `for x, _, _ in pts` the main point is not that I can write `print(_)` and get the z coordinate. The main point is that I am not interested in the y or the z coordinates (and showing this to the reader up front). The value assigned to `_` is uninteresting (even in a debug session, unless you're debugging Python itself).
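For contrast, a quick sketch of the existing idiom: in a `for` target, `_` is an ordinary binding, so the value is technically retrievable afterwards even though nobody cares about it (the data here is made up):

```python
pts = [(1, 4, 9), (2, 5, 10)]

# `_` signals "don't care" to the reader, but it is still a real assignment
for x, _, _ in pts:
    print(x)  # only x is interesting

print(_)  # -> 10: the last z coordinate is still there, just uninteresting
```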
Using the same character in patterns makes intuitive sense to anyone who is familiar with this convention in Python. It also makes sense to anyone who is familiar with patterns in other languages: *all* languages with structural pattern matching that we looked at use `_` -- C#, Elixir, Erlang, Scala, Rust, F#, Haskell, Mathematica, OCaml, Ruby, and Swift. (That's a much stronger precedent than the use of `?` in shell and regular expressions IMO. :-)
The need for a wildcard pattern has already been explained -- we really want to disallow `Point(x, y, y)` but we really need to allow `Point(z, _, _)`. Generating code to assign the value to `_` seems odd given the clear intent to *ignore* the value.
Using `?` as the wildcard has mostly disadvantages: it requires changes to the tokenizer, it could conflict with other future uses of `?` (it's been proposed for type annotations as a shorter version of Optional, and there's PEP 505, which I think isn't quite dead yet), and Python users have no pre-existing intuition for its meaning.
A note about i18n: it would be unfortunate if we had to teach users they couldn't use `_` as a wildcard in patterns in code that also uses `_` as part of the i18n stack (`from gettext import gettext as _` -- see the gettext stdlib docs). This is a known limitation of the `for x, _, _ in ...` idiom, which I've seen people work around by writing things like `for x, __, __ in ...`. But for patterns we can't easily use that workaround, because the pattern code generation needs to know about wildcards. However, the solution of never assigning to `_` (by definition, rather than through dead store optimization) solves this case as well.
So can we please lay this one to rest?