First of all, I'm new to this. I tried to find out whether an inquiry like the one below had already been posted, but I couldn't find one, partly because I don't know what to search for; I'm not sure this idea has a name. I'm not convinced the idea is solid, so apologies if it's naive, but I figured I'd post it anyway.

It has to do with the possibility of folding typical opcode pairs by introducing a language construct. The idea is to be able to write this code:

    myobject.a
    myobject.b
    myobject.c()
    myobject.d = 1

like this:

    using myobject:
        .a
        .b
        .c()
        .d = 1

The first version compiles to these instructions (the disassembly below assumes myobject is the first local, e.g. self in a method):

    LOAD_FAST     0 (self)
    LOAD_ATTR     0 (a)
    POP_TOP
    LOAD_FAST     0 (self)
    LOAD_ATTR     1 (b)
    POP_TOP
    LOAD_FAST     0 (self)
    LOAD_ATTR     2 (c)
    CALL_FUNCTION 0
    POP_TOP
    LOAD_CONST    1 (1)
    LOAD_FAST     0 (self)
    STORE_ATTR    3 (d)
    LOAD_CONST    0 (None)

The using keyword would grab an internal handle to the object, making it possible to reduce the number of opcodes by introducing folded ones:

    PUSH_HANDLE   0 (myobject)
    LOAD_HATTR    0 (a)
    POP_TOP
    LOAD_HATTR    1 (b)
    POP_TOP
    LOAD_HATTR    2 (c)
    CALL_FUNCTION 0
    POP_TOP
    LOAD_CONST    1 (1)
    STORE_HATTR   3 (d)
    POP_HANDLE
    LOAD_CONST    0 (None)

The rationale is twofold. Typographically, this is more elegant for the programmer than typing the variable name over and over again. For the Python internals, it would reduce the number of opcodes executed, at the cost of a handful of new ones:

    PUSH_HANDLE => pushes the object onto a 'direct handle stack'
    LOAD_HATTR  =  LOAD_FAST + LOAD_ATTR
    STORE_HATTR =  LOAD_FAST + STORE_ATTR
    POP_HANDLE  => pops the object off the 'direct handle stack'

Since these pairs are quite numerous, this could lead to a speed gain for code that uses a lot of "self." or "object." invocations.

Shoot me.

Robert
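
P.S. For anyone who wants to reproduce the first listing, here is a minimal sketch using the dis module. The class and function names are just placeholders I made up for illustration, and the exact opcode names vary between CPython versions (e.g. 3.11+ shows CALL rather than CALL_FUNCTION), but the repeated load/attribute pairs show up either way:

    import dis

    class MyObject:
        def __init__(self):
            self.a = 1
            self.b = 2
            self.d = 0

        def c(self):
            pass

    def demo(myobject):
        # each line below emits a LOAD_FAST / LOAD_ATTR (or STORE_ATTR) pair
        myobject.a
        myobject.b
        myobject.c()
        myobject.d = 1

    # print the bytecode; note how often the same local is reloaded
    dis.dis(demo)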
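
P.P.S. As a very rough proxy for the cost the folded opcodes would try to remove, one can time repeated attribute access against access through a hoisted local. This is just the classic existing workaround, not the proposed construct, and the numbers obviously depend on the CPython version and machine:

    import timeit

    class Obj:
        def __init__(self):
            self.x = 0

    obj = Obj()

    def attr_access(obj=obj):
        total = 0
        for _ in range(10_000):
            total += obj.x      # LOAD_FAST obj + LOAD_ATTR x every iteration
        return total

    def hoisted_local(obj=obj):
        x = obj.x               # hoist the attribute into a local once
        total = 0
        for _ in range(10_000):
            total += x          # a single LOAD_FAST every iteration
        return total

    print("attribute access:", min(timeit.repeat(attr_access, number=100)))
    print("hoisted local:   ", min(timeit.repeat(hoisted_local, number=100)))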