
On 2021-09-30 10:08 a.m., Chris Angelico wrote:
On Thu, Sep 30, 2021 at 8:43 PM Soni L. <fakedme+py@gmail.com> wrote:
You misunderstand exception hygiene. It isn't about "do the least stuff in try blocks", but about "don't let unrelated exceptions leak into your public API".
For example, generators don't allow you to manually raise StopIteration anymore:
>>> next((next(iter([])) for x in [1, 2, 3]))
Traceback (most recent call last):
  File "<stdin>", line 1, in <genexpr>
StopIteration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: generator raised StopIteration
This is a (limited) form of exception hygiene. Can we generalize it? Can we do better? This effectively means all generators *are* wrapped in a try/except, so your point about "too much stuff inside a try block" goes directly against accepted practice and even existing Python features as they're implemented.
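(For concreteness, a minimal sketch of that implicit wrapping under post-PEP-479 Python; first_of is just an illustrative name:)

# A generator whose body leaks a StopIteration (here, from a bare next()).
def first_of(it):
    yield next(it)   # next() raises StopIteration if `it` is already empty

# Under PEP 479 the leaked StopIteration is replaced with RuntimeError,
# roughly as if the body were wrapped in:
#
#     try:
#         <generator body>
#     except StopIteration as exc:
#         raise RuntimeError("generator raised StopIteration") from exc

try:
    list(first_of(iter([])))
except RuntimeError as exc:
    print(exc)                            # generator raised StopIteration
    print(type(exc.__cause__).__name__)   # StopIteration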
The reason for this is that StopIteration is nothing more than an *implementation detail* of generators. Look at this code: where is StopIteration?
def gen():
    yield 5
    x = (yield 7)
    yield 9
    if x: return 11
    yield 1
    return 3
The code doesn't raise StopIteration other than because that's the way that iterables are implemented. As a function, it simply does its work, with yield points and the ability to return a value. That's why a leaking StopIteration can and should be turned into RuntimeError.
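(A small illustration of that point, driving the gen() above by hand: StopIteration only shows up at the iterator-protocol boundary, carrying the return value.)

g = gen()
print(next(g))        # 5
print(next(g))        # 7
print(g.send(False))  # 9   (the `yield 7` expression evaluates to False, so x is False)
print(next(g))        # 1   (x is falsy, so the `return 11` branch is skipped)
try:
    next(g)
except StopIteration as exc:
    print(exc.value)  # 3 -- the `return 3` travels inside the StopIteration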
But what you're talking about doesn't have this clear distinction, other than in *your own definitions*. You have deemed that, in some areas, a certain exception should be turned into a RuntimeError; but in other areas, it shouldn't. To me, that sounds like a job for a context manager, not a function-level declaration.
But generators *are* iterators. By definition. In fact this had to be a breaking change *because there was code in the wild that relied on it*! Imagine if that code could be changed to be:

def gen() with StopIteration:
    try:
        yield next(foo)
    except StopIteration:
        raise

and have the StopIteration propagate as a StopIteration instead of RuntimeError! (Although supporting this *specific* use-case would probably be painful, given that this is mostly a purely syntactic transformation.)
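(For context, a hedged reconstruction of what such code in the wild looked like, with foo as whatever inner iterator is being delegated to; today the leak has to be caught and turned into a plain return, or it becomes RuntimeError:)

# Pre-PEP-479 idiom: let next()'s StopIteration escape and end the generator.
def wrapper_old(foo):
    while True:
        yield next(foo)   # today this leak is rewrapped as RuntimeError

# Post-PEP-479 equivalent: catch it and return explicitly.
def wrapper_new(foo):
    while True:
        try:
            yield next(foo)
        except StopIteration:
            return        # ends the generator cleanly; callers still see StopIteration

print(list(wrapper_new(iter([1, 2]))))   # [1, 2]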
My comments asking how the compiler is supposed to know which part of the code needs to be guarded with a "re-raise the exception" flag still apply, regardless of whether I have misunderstood your API or not.
Your syntax has:
def a_potentially_recursive_function(some, args) with ExceptionWeCareAbout:
    some.user_code()
    code_we_assume_is_safe()
    if args.something and some_condition:
        raise ExceptionWeCareAbout  # Line (A)
How does the compiler know that *only* ExceptionWeCareAbout originating in Line (A) should be re-raised, and any other location turned into RuntimeError?
Same way Rust decides whether to propagate or unwrap a Result: you *must* tell the compiler.
Please elaborate. We can already write this:
def foo():
    with fail_on_exception(ExceptionWeCareAbout):
        some.user_code()
        if some_condition:
            raise ExceptionWeCareAbout
Does that count as telling the compiler? If not, what is it you're trying to do, and how is the compiler supposed to know which ones to permit and which to wrap in RuntimeError?
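(fail_on_exception above isn't a stdlib helper; a minimal sketch of one plausible reading -- "if this exception escapes the block, treat it as a bug" -- could look like this:)

from contextlib import contextmanager

@contextmanager
def fail_on_exception(*exc_types):
    # Hypothetical helper: rewrap the named exceptions as RuntimeError
    # if they escape the guarded block.
    try:
        yield
    except exc_types as exc:
        raise RuntimeError("unexpected " + type(exc).__name__) from exc

Under that reading, the explicit raise at the end of foo() gets wrapped as well, which is essentially the ambiguity the question above is pointing at.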
With a source transformation, really. That is:

def foo() with exceptions:
    something
    raise ...

always transforms into:

def foo():
    set_to_True_to_pass_through_instead_of_wrapping_in_RuntimeError = False
    try:
        something
        set_to_True_to_pass_through_instead_of_wrapping_in_RuntimeError = True
        raise ...
    except exceptions as exc:
        if set_to_True_to_pass_through_instead_of_wrapping_in_RuntimeError:
            raise
        else:
            raise RuntimeError from exc

That is: the "with exceptions" becomes "except exceptions", and every "raise" gains a "set_to_True_to_pass_through_instead_of_wrapping_in_RuntimeError = True" immediately before it (mostly - this glosses over the detail that the expression of the raise doesn't get to, itself, raise its own exceptions, but anyway).

It gets clearer if you have a more complex function that isn't a tiny wrapper. A tiny wrapper with 3 different exceptional exit conditions is inherently going to look a little busy, but a larger wrapper with only one or two would actually look clearer! For example, this (real code):

def get_property_values(self, prop):
    try:
        factory = self.get_supported_properties()[prop]
    except KeyError as exc:
        raise PropertyError from exc
    iterator = factory(self._obj)
    try:
        first = next(iterator)
    except StopIteration:
        return (x for x in ())
    except abdl.exceptions.ValidationError as exc:
        raise LookupError from exc
    except LookupError as exc:
        raise RuntimeError from exc  # don't accidentally swallow bugs in the iterator
    return itertools.chain([first], iterator)

vs:

def get_property_values(self, prop) with PropertyError, LookupError:
    try:
        factory = self.get_supported_properties()[prop]
    except KeyError as exc:
        raise PropertyError from exc
    iterator = factory(self._obj)
    try:
        first = next(iterator)
    except StopIteration:
        return (x for x in ())
    except abdl.exceptions.ValidationError as exc:
        raise LookupError from exc
    return itertools.chain([first], iterator)

(Arguably the call to get_supported_properties should also be moved outside the try, but that doesn't change the fact that a whole line got removed!)

In this case, not only does it clean stuff up, it also solves potential maintainability issues. Without this feature, this would need a bunch more blocks to get the correct exception hygiene.
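(The runtime half of that transformation can be approximated in today's Python with a decorator plus a sentinel; a rough sketch, where declared_exceptions and expose are hypothetical names and expose(exc) stands in for the flag-then-raise the syntax would generate:)

import functools

class _Exposed(BaseException):
    # Control-flow marker: derives from BaseException so it can't be
    # caught by accident as one of the declared exceptions.
    def __init__(self, exc):
        self.exc = exc

def expose(exc):
    # Stand-in for an explicit `raise` inside the decorated function.
    raise _Exposed(exc)

def declared_exceptions(*exc_types):
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except _Exposed as marker:
                raise marker.exc from None       # explicit: part of the API
            except exc_types as exc:
                raise RuntimeError from exc      # leaked from a callee: a bug
        return wrapper
    return decorate

@declared_exceptions(KeyError)
def lookup(table, key):
    inner = table["config"]      # a KeyError here would be a bug -> RuntimeError
    if key not in inner:
        expose(KeyError(key))    # deliberate: callers may rely on this KeyError
    return inner[key]

print(lookup({"config": {"a": 1}}, "a"))   # 1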
What if I factor out those last two lines and make it:
def a_potentially_recursive_function(some, args) with ExceptionWeCareAbout:
    some.user_code()
    code_we_assume_is_safe()
    check_condition_or_raise(args.something, some_condition)
How does the compiler decide to re-raise exceptions originating in the last line but not the first two?
In this case, it explicitly doesn't. By not wrapping the call, you've told it that the last line doesn't raise any exceptions that contribute to your API's exception surface.
You *must* use:

try:
    check_condition_or_raise(args.something, some_condition)
except ExceptionWeCareAbout:
    raise
(Verbosity can be improved if this feature gets widely used, but it's beside the point.)
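(Spelled out with the transformation above -- _pass_through abbreviating the long flag name -- the factored-out version would expand to roughly this, which is why the inner try/except is what marks the call as part of the exception surface:)

def a_potentially_recursive_function(some, args):
    _pass_through = False
    try:
        some.user_code()
        code_we_assume_is_safe()
        try:
            check_condition_or_raise(args.something, some_condition)
        except ExceptionWeCareAbout:
            _pass_through = True
            raise
    except ExceptionWeCareAbout as exc:
        if _pass_through:
            raise
        raise RuntimeError from exc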
Ewww eww ewww. I have seen horrific Java code that exists solely to satisfy arbitrary function exception declarations. It does not improve the code.
This is explicitly NOT checked exceptions. Do not mistake these. If anything this is the *direct opposite* (direct antithesis?) of checked exceptions.
This works fine because any explicit raise will always poke through the generated try/except.
So what you're saying is that the raise statement will always raise an exception, but that any exception raised from any other function won't make it out as itself (it gets wrapped in RuntimeError instead). Okay. So you basically want exceptions to... not be exceptions. You want to use exceptions as if they're return values.
Why not just use return values?
Because they're "unpythonic". Yes, they're a thing in Rust, and Rust is the inspiration for this idea (and a good part of the reason we're rewriting our code in Rust), but we do think it's possible to have a pythonic solution to a pythonic problem. We've seen how many exceptions are accidentally swallowed by Python web frameworks and how much of a debugging nightmare that can be. That's why we write code that guards against unexpected exceptions, like the (real) code we pasted above. (It's saved us plenty of trouble *in practice*, so this is a real issue.)
ChrisA