
Hello there! This is a rehash of something that I wrote on stackless-dev recently. Perhaps this has been suggested in the past. If so, please excuse my ignorance.

It irks me sometimes how inflexible context managers can be. For example, wouldn't it be nice to be able to write:

    with as_subprocess():
        do_stuff()

or in Stackless Python:

    with different_tasklet():
        do_stuff()

This is currently impossible because context managers are implemented as __enter__()/__exit__() methods running in the current scope. There is no callable function site to pass to a subprocess module, or a tasklet scheduler.

I have another favorite pet-peeve, which is that for some things, a pair of context managers is needed, since a single context manager cannot silence its own exception:

    with IgnoreError, LockResourceOrRaiseIfBusy(resource):
        do_stuff()

cannot be collapsed into:

    with LockResourceOrPassIfBusy(resource):
        do_stuff()

But another thing is also interesting: even though context managers are an __enter__()/__exit__() pair, the most common idiom these days is to write:

    @contextlib.contextmanager
    def mycontextmanager():
        setup()
        try:
            yield
        finally:
            teardown()

or similar. There are a million reasons for this. Mostly it is because this layout is easier to figure out and plays nicer in the head. It also simplifies error handling, because regular try/except clauses can be used. If you are writing a "raw" context manager, you have to explicitly maintain some state between the __enter__() and __exit__() methods to know what to clean up and how, depending on the error conditions. This quickly becomes tedious.

And what does the above code look like? Well, the place of the "yield" could just as well be a call site. I mean, the decorated "contextmanager" function simply looks like a wrapper function around a function call. You write it exactly as you would write a wrapper function, except where you would call the function, you use the "yield" statement (and you _have_ to call yield; you can't skip it for whatever reason).

So, if this is the way people like to think about context managers, like writing wrapper functions, why don't we turn them into proper wrapper functions? What if a context manager were given a _callable_, representing the code? Like this:

    class NewContextManager(object):
        # A context manager that locks a resource, then executes
        # the code only if it is not recursing
        def __init__(self, lock):
            self.lock = lock

        def __contextcall__(self, code):
            with self.lock:
                if self.lock.active:
                    # This is where @contextmanager will stop you:
                    # you can't skip the 'yield'
                    return
                self.lock.active = True
            try:
                # optionally pass a value to the code, as in "with foo() as X"
                return code(None)
            finally:
                self.lock.active = False

The cool thing here, though, is that "code" could, for example, be run on a different tasklet. Or a different thread. Or a different universe:

    class TaskletContextManager(object):
        def __contextcall__(self, code):
            return stacklesslib.run_on_tasklet(code)

    class ThreadContextManager(object):
        def __contextcall__(self, code):
            result = []
            def helper():
                result.append(code())
            t = threading.Thread(target=helper)
            t.start()
            t.join()
            return result[0]

This sort of thing would need compiler and syntax support, of course. The compiler would need to create an anonymous function object. The return value out of "code" would be some token that could be special if the code returned early.
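For concreteness, the thread-based variant above can be run today if the hypothetical __contextcall__ protocol is invoked by hand (the method name and the hand-written "block" function are both assumptions of the proposal; no interpreter support exists):

```python
import threading

class ThreadContextManager(object):
    """Hypothetical new-style context manager: runs the block on a thread."""
    def __contextcall__(self, code):
        result = []
        def helper():
            # capture the block's return value from the worker thread
            result.append(code())
        t = threading.Thread(target=helper)
        t.start()
        t.join()
        return result[0]

# With no compiler support, the "block" must be written out explicitly:
def block():
    return threading.current_thread().name

name = ThreadContextManager().__contextcall__(block)  # runs on a worker thread
```

The value the proposal adds is precisely that the compiler would generate `block` from the with-statement body, so the user never sees the function.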
To illustrate, let's see how this could be done manually. This code here:

    with foo() as bar:
        if condition:
            return stuff
        do_stuff(bar)

can be re-written like this:

    def _code(_arg):
        bar = _arg
        if condition:
            return True, stuff   # early return
        do_stuff(bar)
        return False, None       # no return

    token, value = foo().__contextcall__(_code)
    if token is True:
        return value

where:

    class foo(object):
        def __init__(self, arg=None):
            self.arg = arg

        def __contextcall__(self, _code):
            set_up()
            try:
                return _code(None)   # pass it some value
            finally:
                tear_down()

Compiler support for this sort of thing would entail the automatic creation of the "_code" function as an anonymous function with special semantics for a "return" value. This function is then passed to the __contextcall__() method of the "new" context manager, where the context manager treats it as any other callable, whose return value it must return. The "early return" can also be done as a special kind of exception, ContextManagerReturn(value).

So, anyway. Context manager syntax is really nice for so many reasons, which is why we have it in the language, instead of wrapper functions. But if it _were_ just syntactic sugar for actual wrapper functions, they would be even awesomer.

K
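The manual transformation above runs as written once the pieces are filled in (a sketch: `set_up`, `tear_down`, the `(token, value)` convention, and `__contextcall__` itself are all part of the hypothetical protocol, invoked by hand here):

```python
def set_up():
    pass

def tear_down():
    pass

class foo(object):
    """Hypothetical new-style context manager with hand-written plumbing."""
    def __init__(self, arg=None):
        self.arg = arg
    def __contextcall__(self, _code):
        set_up()
        try:
            return _code(self.arg)   # pass a value, as in "with foo(x) as bar"
        finally:
            tear_down()

def wrapped(condition):
    # What the compiler would generate from the with-block body:
    def _code(_arg):
        bar = _arg
        if condition:
            return True, "stuff"     # early return from the enclosing function
        return False, None           # fell off the end: no early return
    token, value = foo(42).__contextcall__(_code)
    if token:
        return value
    return "fell through"
```

Calling `wrapped(True)` takes the early-return path; `wrapped(False)` falls through.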

On Tue, Oct 22, 2013 at 12:55 AM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Possible problem: a function creates a new scope, a with block doesn't. Imagine this:

    with different_tasklet():
        foo = 1
    print(foo)

In current Python, whatever different_tasklet does, those two foos are the same foo. If the body becomes a callable, that could get messy. Do you have to declare 'nonlocal foo' when you enter the with block? That'd be a nasty backward-compatibility break. Or is this callable somehow part of the previous scope? Could work in theory, but would need a fair amount of magic.

ChrisA
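Chris's concern is demonstrable in today's Python by writing the block-to-callable translation by hand (the translation itself is hypothetical; the scoping behaviour shown is real):

```python
def current_with_semantics():
    foo = 0
    # Today a with-block body shares the enclosing scope, so this
    # assignment rebinds the same 'foo':
    foo = 1
    return foo            # 1

def naive_callable_translation():
    foo = 0
    def block():
        foo = 1           # binds a *new* local inside block()
    block()
    return foo            # still 0: the binding was lost

def callable_with_nonlocal():
    foo = 0
    def block():
        nonlocal foo      # the declaration Chris says users would now need
        foo = 1
    block()
    return foo            # 1
```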

Well, yes, like I said, it could be a new kind of callable if necessary. But the scope problem is easily solved using "cell" variables, the same way as closures are implemented today. The compiler, which is building the anonymous function, makes sure to bind local variables to the parent's scope, using cells. K
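The "cell" machinery Kristján refers to is already visible on ordinary closures today; how the compiler would apply it automatically to with-blocks is the open question:

```python
def outer():
    x = 10
    def inner():
        nonlocal x        # compiled as a cell shared with outer()
        x += 1
        return x
    return inner

f = outer()
# f.__closure__ is a tuple of cell objects; both calls mutate the same cell.
```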

On Oct 21, 2013, at 7:50, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
A paradigm case for with is creating a new variable in a context to use outside of it:

    with open(path) as f:
        rows = list(reader(f))
    for row in rows[1:]:
        ...

    with lock:
        connections = self.connections
    for connection in connections:
        ...

The compiler would not make rows or connections into a closure variable, but a local. So you'd have to write nonlocal in most with statements. And if you changed the rules so everything was nonlocal by default in a context function, we'd need a new keyword to declare local variables, which would be (a) very different from the rest of python, and (b) hard to come up with a name for that didn't conflict with thousands of different programs.

> And if you changed the rules so everything was nonlocal by default in a context function, we'd need a new keyword to declare local variables, which would be (a) very different from the rest of python, and (b) hard to come up with a name for that didn't conflict with thousands of different programs.
Well no. You are not defining a function, so all variables are in the same scope as the existing code. Binding does not change. “nonlocal” has exactly the same meaning, meaning outside the function you are writing. The “anonymous callable” exists only technically, not syntactically. It has no local variables, no (visible) arguments. Semantics stay exactly the same, only the method of invoking the executable code changes. K -----Original Message----- From: Andrew Barnert [mailto:abarnert@yahoo.com] Sent: 21. október 2013 16:26 To: Kristján Valur Jónsson Cc: Chris Angelico; python-ideas Subject: Re: [Python-ideas] A different kind of context manager A paradigm case for with is creating a new variable in a context to use outside of it: with open(path) as f: rows = list(reader(f)) for row in rows[1:]: with lock: connections = self.connections for connection in connections: The compiler would not make rows or connections into a closure variable, but a local. So you'd have to write nonlocal in most with statements. And if you changed the rules so everything was nonlocal by default in a context function, we'd need a new keyword to declare local variables, which would be (a) very different from the rest of python, and (b) hard to come up with a name for that didn't conflict with thousands of different programs.
K
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org<mailto:Python-ideas@python.org>

On Oct 21, 2013, at 14:05, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Obviously semantics don't stay exactly the same or there would be no benefits to the change. The whole point is that you're creating a function with closure and visibly passing it to a method of the context manager. It's either one or the other: either every variable is implicitly nonlocal whether you want it to be or not, or every variable is implicitly local and you have to nonlocal them to perform common context manager idioms.

Compare with comprehensions. Changing them to use functions under the covers had no effect (other than breaking a rare use case with StopIteration passing, which I believe has been fixed), but that's only because the comprehension variable(s) were already explicitly prevented from replacing existing bindings. (The fact that there's no way to explicitly bind a variable in a comprehension helps too--no potential surprises about what "x=2" might do when statements aren't allowed in the first place.) That's obviously not true for with statements.

If you think that every variable being implicitly nonlocal is a good thing, that's certainly arguable (maybe no existing code would ever notice the difference, and new code that did wouldn't be surprised by it?), but only if you make that case instead of trying to argue that there isn't an issue in the first place.

Syntax semantics stay the same.

> It's either one or the other: either every variable is implicitly nonlocal whether you want it to be or not, or every variable is implicitly local and you have to nonlocal them to perform common context manager idioms.

"Local" or "nonlocal" only has meaning within a function definition, one that starts with "def function():". I'm not suggesting new syntax. The code remains just a regular block inside the "with" keyword, and the variables there have the same binding as if the "with" statement were removed. You are not defining a function, but the compiler _is_ producing a new kind of callable. A "block" object, perhaps. It might not _need_ to be technically a new kind of callable; perhaps such a block is implementable within the existing "function" type. But that is merely an implementation detail.

My proposal thus has no changes to syntax, merely to how the block is invoked. I suggest the code block be invoked explicitly by a "new-style context manager" rather than implicitly by the interpreter, inside the frame of __enter__/__exit__.

K

No, it wouldn't. Read PEP 340 (the one Guido wrote *before* descoping it in PEP 343). The problem with offering true blocks is that they immediately create multiple ways to do a lot of different things, and this callback-based variant also plays merry hell with the scoping rules.

That said, something I *have* been thinking might work better than the status quo is permitting a variant of tuple unpacking that enters each context manager as it is produced and provides a tuple of the results. So this would work properly:

    with *(open(name) for name in names) as files:
        ...

And you could factor out with statement skipping as a function returning a 2-tuple (although unpacking the value would be a little annoying in that case).

Cheers,
Nick.
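The multiple-file case Nick's proposed unpacking targets can be written today with the real stdlib `contextlib.ExitStack` (available since Python 3.3); a minimal sketch using temporary files:

```python
import contextlib
import os
import tempfile

# Create a few real files to open.
names = []
for i in range(3):
    fd, path = tempfile.mkstemp()
    os.close(fd)
    names.append(path)

with contextlib.ExitStack() as stack:
    # Each file is entered as it is produced; all are closed on exit.
    files = [stack.enter_context(open(name)) for name in names]
    assert all(not f.closed for f in files)

# The stack has unwound: every file is closed, even if one open() had failed.
assert all(f.closed for f in files)

for path in names:
    os.remove(path)
```

The difference from Nick's sketch is purely syntactic: `ExitStack` gives the same entry/exit guarantees without new unpacking syntax.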

On Mon, Oct 21, 2013 at 6:55 AM, Kristján Valur Jónsson < kristjan@ccpgames.com> wrote:
Cool, sure. But what are the use cases that need this and can't be done easily with the existing design? You can do that with current context managers:

    with lock_once(x) as lock_acquired:
        if lock_acquired:  # was not already locked
            do_stuff_once()

    @contextmanager
    def lock_once(lock):
        if lock.active:
            yield False
        else:
            lock.active = True
            try:
                yield True
            finally:
                lock.active = False

Note that I'm mimicking your lock/unlock code, which of course is not the proper way to acquire/release a lock, but it gets the idea across. How would making the code inside the with block a callable improve this? I think this code is easier to read than yours, as the logic of whether or not the do_stuff_once block is executed is where it belongs -- not hidden in the context manager.

Note that my version also allows me to do this, which I can't easily do with your context manager:

    with lock_once(x) as lock_acquired:
        if lock_acquired:  # was not already locked
            do_stuff_once()
        else:
            log('Lock %r was already acquired', x)
        do_stuff_every_time()

--- Bruce
I'm hiring: http://www.cadencemd.com/info/jobs
Latest blog post: Alice's Puzzle Page http://www.vroospeak.com
Learn how hackers think: http://j.mp/gruyere-security
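Bruce's `lock_once` runs as-is once a lock object with an `active` flag is supplied; a runnable sketch (the `ToyLock` class is invented here, mirroring his deliberately simplified lock, not a real threading lock):

```python
from contextlib import contextmanager

class ToyLock(object):
    def __init__(self):
        self.active = False

@contextmanager
def lock_once(lock):
    if lock.active:
        yield False
    else:
        lock.active = True
        try:
            yield True
        finally:
            lock.active = False

x = ToyLock()
runs = []

def body():
    with lock_once(x) as lock_acquired:
        if lock_acquired:          # was not already locked
            runs.append("once")
            body()                 # recurse: the inner call sees the active lock

body()
# The recursive call was skipped, and the lock was released afterwards.
```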

> Cool, sure. But what are the use cases that need this and can't be done easily with the existing design?

Exactly those I listed. Any form of execution that requires a "function" to be run. This includes all existing threading/multiprocessing designs.

> How would making the code inside the with block a callable improve this?

It allows the "if" statement to be part of the context manager.

> I think this code is easier to read than yours as the logic of whether or not the do_stuff_once block is executed is where it belongs -- not hidden in the context manager.

That is a matter of taste. Maybe it would make sense to pull other things out of the context manager too? But taste should not, IMHO, limit our options. If this is a common idiom, locking and executing once, why shouldn't we be able to write a clever macro/context manager/syntactic sugar to help us with that? Why insist on this verbosity?

Also, consider my argument that most context managers are written using the @contextmanager paradigm. We like this idiom so much because this is what we really want to do: call the code from a wrapper function. If you look at the design of that context manager, it is not exactly straightforward. This suggests to me that maybe we took a wrong turn deciding on a context manager design. Maybe we should have selected one in which this sort of coding is its native, natural form, rather than having this intermarriage kludge which turns an imperative-looking generator into the traditional context manager.

K

I agree with this 100%. Unfortunately, Python picked the current style of `with` statements a while ago, and this would be a pretty huge change/additional feature.

FWIW, other languages with easy anonymous functions (e.g. Ruby, Scala) have this, and it does provide all the benefits that you describe. For example, spinning off parallel tasklets inline is just a matter of `async{ ... }` where `async` is just a context manager. As much as I'd like to see Python follow suit, getting it into Python would be a pretty large upheaval and not something that I expect to happen in the near future.

And the last option, if you're crazy, is to use MacroPy and write your own tasklet/forking context managers! It's surprisingly easy (< 50 lines).

On 22 Oct 2013 07:27, "Haoyi Li" <haoyi.sg@gmail.com> wrote:
> Maybe we should have selected one in which this sort of coding is its native, natural, form, rather than having this intermarriage kludge which turns an imperative-looking generator into the traditional context manager.

> FWIW, other languages with easy anonymous functions (e.g. ruby, scala) have this, and it does provide all the benefits that you describe. For example, spinning off parallel tasklets inline is just a matter of `async{ ... }` where `async` is just a context manager.

It's not a coincidence that Ruby (at least - I don't know Scala) just treats for loops and context management as special cases of anonymous callbacks - the latter is a powerful, more general construct. By contrast, Python chose the path of providing dedicated syntax for both iteration *and* context management, and hence requires that callbacks that don't fit in a single expression be defined prior to use. I think this makes those constructs easier to understand in many ways, but it *also* means that we *don't* currently have a clean syntax for single use callbacks. Hence the time I've put into PEP 403 and 3150 over the years - a key objective for both of them is providing a cleaner solution for the problem of single use callbacks (including those that modify local variables of the containing function).

In addition to scoping, the other problem single use callbacks need to handle sensibly is the behaviour of the simple flow control statements: return, yield, break, continue, and raise. Building single use callbacks on top of the "def" statement has the advantage of *not* needing to define any new scoping or control flow semantics (as they're just ordinary nested scopes). Defining them any other way makes things far more complicated. It would certainly be close to impossible to repurpose any of the other existing compound statements without breaking backwards compatibility. A completely new keyword is also a possibility, but then it's necessary to find a good one, and explain its use cases in a fashion similar to PEP 403.

Cheers,
Nick.
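The flow-control problem Nick describes (an early `return` inside a def-based block only returns from the block) can be emulated today with a sentinel exception, at the cost of verbosity. A sketch; the `BlockReturn` class and `run_block` helper are inventions for illustration, not any existing API:

```python
import contextlib

class BlockReturn(Exception):
    """Carries an early-return value out of a single-use callback."""
    def __init__(self, value):
        self.value = value

def run_block(cm, block):
    """Run 'block' under a classic context manager; translate BlockReturn."""
    with cm:
        try:
            block()
        except BlockReturn as e:
            return e.value
    return None

@contextlib.contextmanager
def noisy():
    yield   # stand-in for any setup/teardown

def caller(flag):
    def block():
        if flag:
            raise BlockReturn("early")   # stands in for: return "early"
        # falling off the end means no early return
    result = run_block(noisy(), block)
    if result is not None:
        return result
    return "normal exit"
```

This is exactly the kind of boilerplate the PEP 403/3150 work aims to avoid; `break` and `continue` would need similar sentinels.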

Thanks for your detailed reply, Nick. Good to see that I am not completely bonkers and not swimming entirely alone against the flow. I realise of course that we are Python and not Ruby (I got to learn Ruby before Python, btw) and that this is not particularly likely to come to anything. But remember how Python retro-fitted Ruby's object model into its "new-style" classes? Ruby was written that way all along. (CS buffs out there will likely point out to me that this was not an original Matz invention.) Perhaps, with persistent dripping, we can slowly hollow the proverbial stone.

Cheers,
K

On 2013-10-23, at 16:55 , Kristján Valur Jónsson wrote:
(CS buffs out there will likely point out to me that this was not an original Matz invention).
You don't need CS buffs to point it out, it was an implementation detail leaking into semantic incompatibility between types implemented in C and classes implemented in Python: http://python-history.blogspot.be/2010/06/new-style-classes.html Python was fairly unique in having this dichotomy between built-in and user-defined types[0]. [0] but not anymore, Go has repeated this mistake, amongst others.

That's not what I was referring to, rather the class model that blew my mind when learning Ruby back in 2000. I'm not a CS person, so this was new to me (knowing OOP only from C++):

- Classes, metaclasses, method resolution order.
- All classes are subclasses of "object".
- object itself is an instance of "type", type being a "metaclass".
- type, too, is a subclass of object. type is its own metaclass, so "type" is an instance of "type".

Ruby was designed with this model in mind; it only arrived later into Python.

K
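Every point in that list is directly checkable in today's Python:

```python
class Spam(object):
    pass

assert issubclass(Spam, object)     # all classes are subclasses of object
assert isinstance(Spam, type)       # a class is an instance of type
assert issubclass(type, object)     # type, too, is a subclass of object
assert type(type) is type           # type is its own metaclass
assert isinstance(type, type)       # ...so type is an instance of type
```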

On Thu, Oct 24, 2013 at 3:36 AM, Kristján Valur Jónsson < kristjan@ccpgames.com> wrote:
Are you sure? I wrote about metaclasses in Python in 1998: http://www.python.org/doc/essays/metaclasses/ New-style classes were just the second or third iteration of the idea. -- --Guido van Rossum (python.org/~guido)

I'm not sure about anything :). In particular, I don't know where Ruby's object model originates. And Ruby 1.0 came out in 1996. I'm sure that the model of "object" and "type" (or other equivalent names) is older, though. Could be a simplification of Smalltalk's object model, for example. Well, looking this up, this is what Wikipedia says, in fact. But I recall someone, somewhere, mentioning that this system is based on a proper paper by someone :)

But Python's and Ruby's models are quite similar in structure. I don't know if Python's new-style classes were inspired by Ruby or not; perhaps it is a case of convergent evolution.

Cheers,
K

As long as we are speculating about the origins of language features, I feel the need to set the record straight. I was not inspired by Ruby at that point (or ever :-). Ruby was in fact inspired by Python. Matz once told me that his inspiration was 20% Python, 80% Perl, and that Larry Wall is his hero.

I was inspired to implement new-style classes by a very specific book, "Putting Metaclasses to Work" by Ira Forman and Scott Danforth (http://www.amazon.com/Putting-Metaclasses-Work-Ira-Forman/dp/0201433052). But even Python's original design (in 1990, published in 1991) had the notion that 'type' was itself an object. The type pointer in any object has always been a pointer to a special object, whose "data" was a bunch of C function pointers implementing the behavior of other objects, similar to a C++ vtable. The type of a type was always a special type object, which you could call a meta-type, to be recognized because it was its own type.

I was only vaguely aware of Smalltalk at the time; I remember being surprised by its use of metaclasses when I read about them much later. Smalltalk's bytecode was a bigger influence on Python's bytecode though. I'd read about it in a book by Adele Goldberg and others, I believe "Smalltalk-80: The Language and its Implementation" (http://www.amazon.com/Smalltalk-80-The-Language-its-Implementation/dp/020111...).
-- --Guido van Rossum (python.org/~guido)

Thanks, Guido. This is in fact very interesting. I'll be sure to not wildly speculate out of my posterior on these matters again, but refer to the facts :) Another data point indicating that convergent evolution does, in fact, exist.

K

Kristján,

Your replies would be much easier to read if you trimmed the previous email. Thanks.

--
~Ethan~

You can get the desired behavior by (ab)using function decorators, by rewriting

    with as_subprocess():
        do_stuff()

as

    @as_subprocess
    def _():
        <do stuff>

Yes, it's not very elegant syntactically but it gets the work done (and this technique is generalizable to most uses of Ruby-style blocks, I believe).

Antony
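The decorator trick is runnable today. In this sketch `as_subprocess` is faked as a plain "run it now" decorator just to show the shape; a real one could hand the function to `multiprocessing` or a thread instead:

```python
calls = []

def as_subprocess(fn):
    # A real version might ship fn off to a worker process; here we
    # simply record the call and invoke fn immediately.
    calls.append(fn.__name__)
    fn()
    return fn

@as_subprocess
def _():
    calls.append("body ran")

# Decoration runs the body right away: calls is ["_", "body ran"].
```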

Nice suggestion. And I can also do:

    def _():
        do_stuff()
    result = execute_subprocess(_)

i.e. just do it manually, rather than with the decorator. Your suggestion, though, is actually not such a bad pattern, but it has a few drawbacks:

1) You need the "nonlocal" qualifier to pass values out of it.
2) It has the side effect of setting _.
3) It is a bit non-intuitive, particularly when decorators start taking arguments. When is the decorator run? This is not always immediately clear. Well, it is simpler than a regular decorator, since it will invoke the target function itself...
4) The syntax is not nice. Decorators themselves were invented as syntactic sugar to get rid of the "def foo(): ... foo = bar(foo)" pattern.

Maybe I should revise my suggestion then? A new syntax that does the above, i.e.:

    new_with cm as bar:   # or whatever keyword is deemed appropriate
        do_stuff()

compiles to:

    @cm
    def _(_bar):
        pragma("nonlocal", 1)  # moves binding one step upwards
        bar = _bar
        do_stuff()

but with _ and _bar magically hidden. The only thing really needed, then, is support for 'pragma("nonlocal", 1)' or an equivalent way of changing the default binding of variables, and compiler magic for the syntax.

K
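Drawback (1) in action: without `nonlocal`, the decorated block cannot rebind a variable of the enclosing function (the `run_now` decorator is a stand-in invented here for any block-running decorator):

```python
def run_now(fn):
    # Invoke the decorated "block" immediately, like the decorator trick.
    fn()
    return fn

def without_nonlocal():
    result = None
    @run_now
    def _():
        result = 42        # a new local of _, silently discarded
    return result          # still None

def with_nonlocal():
    result = None
    @run_now
    def _():
        nonlocal result    # explicitly pass the value out
        result = 42
    return result          # 42
```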

Ah, I forgot about the changed flow control with “return”, “break”, “continue”. The decorator way does not allow that. A special “block” callable would be needed, with special return opcodes.

K

On Tue, Oct 22, 2013 at 12:55 AM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Possible problem: A function creates a new scope, a with block doesn't. Imagine this:

    with different_tasklet():
        foo = 1
    print(foo)

In current Python, whatever different_tasklet does, those two foos are the same foo. If the body becomes a callable, that could get messy. Do you have to declare 'nonlocal foo' when you enter the with block? That'd be a nasty backward-compatibility break. Or is this callable somehow part of the previous scope? Could work in theory, but would need a fair amount of magic.

ChrisA
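The current scoping behaviour Chris describes is easy to check: a with block binds names in the enclosing function scope, so the assignment is visible after the block. A sketch, with `different_tasklet` here just a hypothetical no-op context manager borrowed from the discussion:

```python
from contextlib import contextmanager

@contextmanager
def different_tasklet():
    # Hypothetical name from the thread; a do-nothing placeholder here.
    yield

def demo():
    with different_tasklet():
        foo = 1  # binds in the *enclosing* function scope
    return foo   # same foo: the with block created no new scope
```

If the body were compiled into an ordinary nested function instead, `foo` would become a local of that function and `demo` would raise `NameError` — which is exactly the compatibility problem being discussed.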

Well, yes, like I said, it could be a new kind of callable if necessary. But the scope problem is easily solved using "cell" variables, the same way as closures are implemented today. The compiler, which is building the anonymous function, makes sure to bind local variables to the parent's scope, using cells. K
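This is in fact how closures already work today: a variable rebound by a nested function is stored in a cell object shared between the scopes, visible through the function's `__closure__` attribute. A small illustration:

```python
def make_counter():
    count = 0  # stored in a cell, because the nested function rebinds it

    def bump():
        nonlocal count  # both scopes share the same cell
        count += 1
        return count

    return bump

bump = make_counter()
```

Both `make_counter`'s frame and `bump` reference the same cell, so `bump.__closure__[0].cell_contents` tracks the current count — the same mechanism the compiler could reuse to bind a block's variables to the parent scope.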

On Oct 21, 2013, at 7:50, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
A paradigm case for with is creating a new variable in a context to use outside of it:

    with open(path) as f:
        rows = list(reader(f))
    for row in rows[1:]:
        ...

    with lock:
        connections = self.connections
    for connection in connections:
        ...

The compiler would not make rows or connections into a closure variable, but a local. So you'd have to write nonlocal in most with statements. And if you changed the rules so everything was nonlocal by default in a context function, we'd need a new keyword to declare local variables, which would be (a) very different from the rest of Python, and (b) hard to come up with a name for that didn't conflict with thousands of different programs.

> And if you changed the rules so everything was nonlocal by default in a context function, we'd need a new keyword to declare local variables, which would be (a) very different from the rest of Python, and (b) hard to come up with a name for that didn't conflict with thousands of different programs.
Well no. You are not defining a function, so all variables are in the same scope as the existing code. Binding does not change. “nonlocal” has exactly the same meaning, meaning outside the function you are writing. The “anonymous callable” exists only technically, not syntactically. It has no local variables, no (visible) arguments. Semantics stay exactly the same, only the method of invoking the executable code changes.

K

On Oct 21, 2013, at 14:05, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Obviously semantics don't stay exactly the same or there would be no benefits to the change. The whole point is that you're creating a function with closure and visibly passing it to a method of the context manager. It's either one or the other: either every variable is implicitly nonlocal whether you want it to be or not, or every variable is implicitly local and you have to nonlocal them to perform common context manager idioms.

Compare with comprehensions. Changing them to use functions under the covers had no effect (other than breaking a rare use case with StopIteration passing, which I believe has been fixed), but that's only because the comprehension variable(s) were already explicitly prevented from replacing existing bindings. (The fact that there's no way to explicitly bind a variable in a comprehension helps too -- no potential surprises about what "x=2" might do when statements aren't allowed in the first place.) That's obviously not true for with statements.

If you think that every variable being implicitly nonlocal is a good thing, that's certainly arguable (maybe no existing code would ever notice the difference, and new code that did wouldn't be surprised by it?), but only if you make that case instead of trying to argue that there isn't an issue in the first place.

Syntax semantics stay the same.

> It's either one or the other: either every variable is implicitly nonlocal whether you want it to be or not, or every variable is implicitly local and you have to nonlocal them to perform common context manager idioms.

“Local” or “nonlocal” only has meaning within a function definition, one that starts with “def function():”. I’m not suggesting new syntax. The code remains just a regular block, inside the “with” keyword, and the variables there have the same binding as if the “with” statement were removed. You are not defining a function, but the compiler _is_ producing a new kind of callable. A “block” object, perhaps. It might not _need_ to be technically a new kind of callable; perhaps such a block is implementable within the existing “function” type. But that is merely an implementation detail. My proposal thus has no changes to syntax, merely to how the block is invoked. I suggest the code block be invoked explicitly by a “new-style context manager” rather than implicitly by the interpreter, inside the frame of __enter__/__exit__.

K

No, it wouldn't. Read PEP 340 (the one Guido wrote *before* descoping it in PEP 343). The problem with offering true blocks is that they immediately create multiple ways to do a lot of different things, and this callback-based variant also plays merry hell with the scoping rules.

That said, something I *have* been thinking might work better than the status quo is permitting a variant of tuple unpacking that enters each context manager as it is produced and provides a tuple of the results. So this would work properly:

    with *(open(name) for name in names) as files:
        ...

And you could factor out with statement skipping as a function returning a 2-tuple (although unpacking the value would be a little annoying in that case).

Cheers, Nick.
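For what it's worth, contextlib.ExitStack (added in Python 3.3) already approximates the effect Nick describes without new syntax: each manager is entered as it is produced, the results are collected, and everything is unwound in reverse order on exit. A sketch with a toy logging context manager:

```python
from contextlib import ExitStack, contextmanager

@contextmanager
def tag(name, log):
    # Toy context manager that records its enter/exit order.
    log.append("enter " + name)
    try:
        yield name
    finally:
        log.append("exit " + name)

log = []
with ExitStack() as stack:
    # Enters each manager as the generator produces it,
    # roughly what `with *(...) as files:` would spell.
    results = tuple(stack.enter_context(tag(n, log)) for n in ("a", "b"))
```

On leaving the with block, the stack unwinds LIFO, so the log ends up as enter a, enter b, exit b, exit a.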

On Mon, Oct 21, 2013 at 6:55 AM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Cool, sure. But what are the use cases that need this and can't be done easily with the existing design?

    class NewContextManager(object):
        # A context manager that locks a resource, then executes
        # the code only if it is not recursing
        ...

You can do that with current context managers:

    with lock_once(x) as lock_acquired:
        if lock_acquired:  # was not already locked
            do_stuff_once()

    @contextmanager
    def lock_once(lock):
        if lock.active:
            yield False
        else:
            lock.active = True
            try:
                yield True
            finally:
                lock.active = False

Note that I'm mimicking your lock/unlock code which of course is not the proper way to acquire/release a lock, but it gets the idea across. How would making the code inside the with block a callable improve this? I think this code is easier to read than yours, as the logic of whether or not the do_stuff_once block is executed is where it belongs -- not hidden in the context manager. Note that my version also allows me to do this, which I can't easily do with your context manager:

    with lock_once(x) as lock_acquired:
        if lock_acquired:  # was not already locked
            do_stuff_once()
        else:
            log('Lock %r was already acquired', x)
        do_stuff_every_time()

--- Bruce
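Bruce's lock_once can be run end-to-end with a toy lock object (a SimpleNamespace with an `active` flag, standing in for the hypothetical lock in the thread); the recursion guard then behaves as described:

```python
from contextlib import contextmanager
from types import SimpleNamespace

@contextmanager
def lock_once(lock):
    # Toy recursion guard from the thread -- not a real thread-safe lock.
    if lock.active:
        yield False
    else:
        lock.active = True
        try:
            yield True
        finally:
            lock.active = False

x = SimpleNamespace(active=False)  # stand-in for the resource lock
ran = []

def do_stuff_once():
    ran.append("once")
    # Re-entering while held: acquired is False, so no recursion happens.
    with lock_once(x) as acquired:
        if acquired:
            ran.append("recursed")

with lock_once(x) as acquired:
    if acquired:
        do_stuff_once()
```

The outer with acquires the guard and runs the body once; the nested attempt sees `acquired` as False and skips, so `ran` holds a single entry.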

> Cool, sure. But what are the use cases that need this and can't be done easily with the existing design?

Exactly those I listed. Any form of execution that requires a "function" to be run. This includes all existing threading/multiprocessing designs.

> How would making the code inside the with block a callable improve this?

It allows the "if" statement to be part of the context manager.

> I think this code is easier to read than yours as the logic of whether or not the do_stuff_once block is executed is where it belongs -- not hidden in the context manager.

That is a matter of taste. Maybe it would make sense to pull other things out of the context manager too? But taste should not, IMHO, limit our options. If this is a common idiom, locking and executing once, why shouldn't we be able to write a clever macro/context manager/syntactic sugar to help us with that? Why insist on this verbosity?

Also, consider my argument that most context managers are written using the @contextmanager paradigm. We like this idiom so much because this is what we really want to do, call the code from a wrapper function. If you look at the design of that context manager, it is not exactly straightforward. This suggests to me that maybe we took a wrong turn deciding on a context manager design. Maybe we should have selected one in which this sort of coding is its native, natural form, rather than having this intermarriage kludge which turns an imperative-looking generator into the traditional context manager.

K

I agree with this 100%. Unfortunately, Python picked the current style of `with` statements a while ago, and this would be a pretty huge change/additional feature.

FWIW, other languages with easy anonymous functions (e.g. Ruby, Scala) have this, and it does provide all the benefits that you describe. For example, spinning off parallel tasklets inline is just a matter of `async{ ... }` where `async` is just a context manager. As much as I'd like to see Python follow suit, getting it into Python would be a pretty large upheaval and not something that I expect to happen in the near future.

And the last option, if you're crazy, is to use MacroPy and write your own tasklet/forking context managers! It's surprisingly easy (< 50 lines).

On Mon, Oct 21, 2013 at 2:16 PM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:

On 22 Oct 2013 07:27, "Haoyi Li" <haoyi.sg@gmail.com> wrote:
> Maybe we should have selected one in which this sort of coding is its native, natural, form, rather than having this intermarriage kludge which turns an imperative-looking generator into the traditional context manager.
>
> I agree with this 100%. Unfortunately, Python picked the current style of `with` statements a while ago, and this would be a pretty huge change/additional feature.
>
> FWIW, other languages with easy anonymous functions (e.g. ruby, scala) have this, and it does provide all the benefits that you describe. For example, spinning off parallel tasklets inline is just a matter of `async{ ... }` where `async` is just a context manager.

It's not a coincidence that Ruby (at least - I don't know Scala) just treats for loops and context management as special cases of anonymous callbacks - the latter is a powerful, more general construct. By contrast, Python chose the path of providing dedicated syntax for both iteration *and* context management, and hence requires that callbacks that don't fit in a single expression be defined prior to use. I think this makes those constructs easier to understand in many ways, but it *also* means that we *don't* currently have a clean syntax for single-use callbacks. Hence the time I've put into PEP 403 and PEP 3150 over the years - a key objective for both of them is providing a cleaner solution for the problem of single-use callbacks (including those that modify local variables of the containing function).

In addition to scoping, the other problem single-use callbacks need to handle sensibly is the behaviour of the simple flow control statements: return, yield, break, continue, and raise. Building single-use callbacks on top of the "def" statement has the advantage of *not* needing to define any new scoping or control flow semantics (as they're just ordinary nested scopes). Defining them any other way makes things far more complicated. It would certainly be close to impossible to repurpose any of the other existing compound statements without breaking backwards compatibility. A completely new keyword is also a possibility, but then it's necessary to find a good one, and explain its use cases in a fashion similar to PEP 403.

Cheers, Nick.
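Nick's flow-control point can be made concrete: inside a def-based block, `return` only exits the block itself, so breaking out of an enclosing loop has to be signalled manually. A sketch, with `run_block` as a hypothetical stand-in for a block-invoking context manager:

```python
def run_block(block):
    # Hypothetical block-invoking context manager: calls the block.
    block()

def search(items, target):
    found = None
    for item in items:
        def _block():
            nonlocal found
            if item == target:
                found = item
            # A plain `return` here only leaves _block; it cannot
            # `break` the enclosing for loop or return from search().
        run_block(_block)
        if found is not None:  # manual signalling is needed instead
            break
    return found
```

This is exactly the "special return opcodes" problem Kristján ran into earlier in the thread: without compiler support, the block and its enclosing function have separate flow control.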

Thanks for your detailed reply, Nick. Good to see that I am not completely bonkers, and not swimming entirely alone against the flow. I realise of course that we are Python and not Ruby (I got to learn Ruby before Python, btw) and that this is not particularly likely to come to anything. But remember how Python retro-fitted Ruby's object model into its "new-style" classes? Ruby was written that way all along. (CS buffs out there will likely point out to me that this was not an original Matz invention). Perhaps, with persistent dripping, we can slowly hollow the proverbial stone.

Cheers, K

On 2013-10-23, at 16:55 , Kristján Valur Jónsson wrote:
> (CS buffs out there will likely point out to me that this was not an original Matz invention).
You don't need CS buffs to point it out, it was an implementation detail leaking into semantic incompatibility between types implemented in C and classes implemented in Python: http://python-history.blogspot.be/2010/06/new-style-classes.html Python was fairly unique in having this dichotomy between built-in and user-defined types[0]. [0] but not anymore, Go has repeated this mistake, amongst others.

That's not what I was referring to, rather the class model that blew my mind when learning Ruby back in 2000. I'm not a CS person, so this was new to me (knowing OOP only from C++):

- Classes, metaclasses, method resolution order.
- All classes are subclasses of "object".
- object itself is an instance of "type", type being a "metaclass".
- type, too, is a subclass of object. Type is its own metaclass, so "type" is an instance of "type".

Ruby was designed with this model in mind; it only arrived later into Python.

K
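The relationships in that list are directly checkable in today's Python:

```python
# object / type are mutually entangled exactly as described:
assert isinstance(object, type)   # object is an instance of type
assert issubclass(type, object)   # type is a subclass of object
assert isinstance(type, type)     # type is its own metaclass

# User-defined metaclasses slot into the same model:
class Meta(type):
    pass

class Spam(metaclass=Meta):
    pass

assert type(Spam) is Meta         # Spam is an instance of its metaclass
assert issubclass(Spam, object)   # and still a subclass of object
```

The `metaclass=` keyword shown here is the Python 3 spelling; Python 2 used the `__metaclass__` class attribute.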

On Thu, Oct 24, 2013 at 3:36 AM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
Are you sure? I wrote about metaclasses in Python in 1998: http://www.python.org/doc/essays/metaclasses/ New-style classes were just the second or third iteration of the idea. -- --Guido van Rossum (python.org/~guido)

I'm not sure about anything :). In particular, I don't know where Ruby's object model originates. And Ruby 1.0 came out in 1996. I'm sure that the model of "object" and "type" (or other equivalent names) is older, though. Could be a simplification of Smalltalk's object model, for example. Well, looking this up, this is what Wikipedia says, in fact. But I recall someone, somewhere, mentioning that this system is based on a proper paper by someone :) But Python and Ruby's models are quite similar in structure. I don't know if Python's new-style classes were inspired by Ruby or not; perhaps it is a case of convergent evolution.

Cheers, K

As long as we are speculating about the origins of language features, I feel the need to set the record straight. I was not inspired by Ruby at that point (or ever :-). Ruby was in fact inspired by Python. Matz once told me that his inspiration was 20% Python, 80% Perl, and that Larry Wall is his hero.

I was inspired to implement new-style classes by a very specific book, "Putting Metaclasses to Work" by Ira Forman and Scott Danforth (http://www.amazon.com/Putting-Metaclasses-Work-Ira-Forman/dp/0201433052). But even Python's original design (in 1990, published in 1991) had the notion that 'type' was itself an object. The type pointer in any object has always been a pointer to a special object, whose "data" was a bunch of C function pointers implementing the behavior of other objects, similar to a C++ vtable. The type of a type was always a special type object, which you could call a meta-type, to be recognized because it was its own type.

I was only vaguely aware of Smalltalk at the time; I remember being surprised by its use of metaclasses when I read about them much later. Smalltalk's bytecode was a bigger influence on Python's bytecode though. I'd read about it in a book by Adele Goldberg and others, I believe "Smalltalk-80: The Language and its Implementation" (http://www.amazon.com/Smalltalk-80-The-Language-its-Implementation/dp/020111...).

On Thu, Oct 24, 2013 at 8:59 AM, Kristján Valur Jónsson <kristjan@ccpgames.com> wrote:
-- --Guido van Rossum (python.org/~guido)

Thanks, Guido. This is in fact very interesting. I'll be sure to not wildly speculate out of my posterior on these matters again, but refer to the facts :) Another data point indicating that convergent evolution does, in fact, exist.

K

Kristján,

Your replies would be much easier to read if you trimmed the previous email. Thanks.

-- ~Ethan~

participants (11)

- Andrew Barnert
- Antony Lee
- Bruce Leban
- Chris Angelico
- Ethan Furman
- Guido van Rossum
- Haoyi Li
- Kristján Valur Jónsson
- Masklinn
- Nick Coghlan
- Philipp A.