[Python-ideas] Adding __getter__ to compliment __iter__.
Steven D'Aprano
steve at pearwood.info
Thu Jul 18 04:27:51 CEST 2013
On 18/07/13 06:51, Ron Adam wrote:
> I played around with trying to find something that would work like the example Nick put up and found out that the different python types are not similar enough in how they do things to make a function that takes a method or other operator work well.
I don't understand this paragraph. Functions that take methods/operators/other functions work perfectly well; they're called higher-order functions. Decorators, factory functions, map, filter, and functools.reduce are all good examples of this.
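A couple of everyday examples (note that reduce moved to functools in Python 3; it wasn't removed):

```python
from functools import reduce

def twice(f, x):
    # a trivial higher-order function: it takes another function as input
    return f(f(x))

print(twice(lambda n: n + 3, 10))                 # 16
print(list(map(str.upper, ["a", "b"])))           # ['A', 'B']
print(reduce(lambda a, b: a + b, [1, 2, 3, 4]))   # 10
```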
> What happens is you either end up with widely varying results depending on how the methods are implemented on each type, or an error because only a few methods are very common on all types. Mostly introspection methods.
Yes. If you call a function f on arbitrary objects, some of those objects may be appropriate arguments to f, some may not. What's your point?
> I believe this to be a stronger underlying reason why functions like reduce and map were removed. And it's also a good reason not to recommend functions like sum() for things other than numbers.
reduce and map have not been removed. map hasn't even been moved out of the builtins.
> To use functions similar to that, you really have to think about what will happen in each case because the gears of the functions and methods are not visible in the same way a comprehension or generator expression is.
I don't understand this sentence.
> It's too late to change how a lot of those methods work and I'm not sure it will still work very well.
>
> One of the most consistent protocols python has is the iterator and generator protocols. The reason they work so well is that they need to interface with for-loops and nearly all containers support that.
>
> examples...
>
>>>> a = [1,2,3]
>>>> iter(a)
> <list_iterator object at 0x7f3bc9306e90>
What point are you trying to make? Builtins have custom iterator types. And? That's an implementation choice. One might make different choices:
py> type(iter(set([]))) is type(iter(frozenset([])))
True
Sets and frozen sets, despite being different types, share the same iterator type.
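For contrast, lists and tuples get distinct iterator types on CPython; none of this is a language guarantee, it's all implementation detail:

```python
# CPython implementation detail: each builtin container has its own
# iterator type, except sets and frozensets, which share one.
print(type(iter([])))     # <class 'list_iterator'>
print(type(iter(())))     # <class 'tuple_iterator'>
print(type(iter(set())) is type(iter(frozenset())))   # True (on CPython)
```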
> And that is why chain is the recommended method of joining multiple containers. This really only addresses getting stuff OUT of containers.
>
> PEP 448's * unpacking in comprehensions helps with the problem of putting things into containers. But that isn't the PEP's main point.
>
Now we come to your actual proposal:
> What I'm thinking of is the inverse operation of an iter. Let's call it a "getter".
> You would get a getter the same way you get an iter.
>
> g = getter(obj)
>
> But instead of __next__ or send() methods, it would have an iter_send(), or isend() method. The isend method would take an iter object, or an object that iter() can be called on.
>
> The getter would return either the object it came from, or a new object, depending on whether or not it was created from a mutable or immutable obj.
>
>
> Mutable objects...
>
> g = getter(A) # A needs a __getter__ method.
> A = g.isend(B)
What's B? Why is it needed as an argument, when g was fully specified by A alone? That is:
g.isend(B)
g.isend(None)
g.isend(42)
etc. should all return the same A, so what is the purpose of passing the argument?
> A += B # extend
Since we don't know what A is, we cannot know in advance that it has an __iadd__ method that is the same as extend.
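And even when += does exist, it only sometimes matches extend:

```python
a = [1, 2]
a += (3, 4)        # list.__iadd__ accepts any iterable, much like extend
print(a)           # [1, 2, 3, 4]

try:
    [1, 2] + (3, 4)   # but + insists on matching types
except TypeError as e:
    print(e)

t = (1, 2)
t += (3, 4)        # on a tuple, += is not in-place at all: it builds a new tuple
print(t)           # (1, 2, 3, 4)
```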
I don't really understand why I would want to do this:
start with an object A
call getter(A) to create a "getter" object g
call g.isend() to get A back again
call some method on A
when I could just do this:
start with an object A
call some method on A
Nor do I understand what this has to do with iterators and generators, or why the method is called "isend" (iter_send). As far as I can tell, the only similarity between your getter and the built-in iter is that they are both functions that take a single argument.
> Immutable objects...
>
> g = getter(A)
> C = g.isend(B)
>
> C = A + B # join
>
> The point is to have something that works on many types and is as consistent in how it's defined as the iter protocol. Having a strict and clear definition is very important!
This last sentence is very true. Would you like to give us a strict and clear definition of your getter proposal?
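For concreteness, here is my best guess at what you might mean. Everything in this sketch -- the dispatch on list vs tuple, the default argument, the return rules -- is invented by me, because the proposal doesn't specify any of it:

```python
def getter(obj):
    # A toy guess at the proposed protocol (all names invented here):
    # mutable containers get extended in place and returned,
    # immutable ones get rebuilt as a new object.
    class _Getter:
        def isend(self, items=()):
            if isinstance(obj, list):        # "mutable" branch
                obj.extend(items)
                return obj                   # the same object comes back
            if isinstance(obj, tuple):       # "immutable" branch
                return obj + tuple(items)    # a brand-new object
            raise TypeError("no __getter__ support for %s" % type(obj).__name__)
    return _Getter()

A = [1, 2]
assert getter(A).isend([3]) is A    # mutable: modified in place
assert A == [1, 2, 3]
B = (1, 2)
assert getter(B).isend([3]) == (1, 2, 3) and B == (1, 2)
```

Even spelled out like that, I don't see what it buys over calling extend or + directly.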
> The internal implementation of a getter could do a direct copy to make it faster, like slicing does, but that would be a private implementation detail.
A direct copy of what? A? Then why not spell it like this?
A = copy.copy(A)
instead of
A = getter(A).isend(23)
> They don't replace generator expressions or comprehensions. Those generally will do something with each item.
>
> Functions like extend() and concat() could be implemented with *getter-iters*, and work with a larger variety of objects with much less work and special handling.
>
> def extend(A, B):
> return getter(A).isend(B)
>
> def concat(A, B):
> """ Extend A with multiple containers from B. """
> g = getter(A)
> if g.isend() is not A:
> raise TypeError("can't concat immutable arg, use merge()")
> for x in B:
> g.isend(x)
> return A
How is that better than this?
def concat(A, B):
""" Extend A with multiple containers from B. """
for x in B:
A.extend(x)
return A
(But note, that's not the definition of concat() I would expect. I would expect concat to return a new object, not modify A in place.)
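For the record, a concat that returns a new object needs no new protocol at all; the existing chain already covers it (the type(A) constructor trick below is my own assumption -- it works for list and tuple, but it is not a universal rule):

```python
from itertools import chain

def concat(A, B):
    """Return a NEW container of A's type: A's items followed by
    the items of every container in B."""
    # Assumes type(A) accepts an iterable, as list and tuple do.
    return type(A)(chain(A, *B))

print(concat([1, 2], [[3], (4, 5)]))   # [1, 2, 3, 4, 5]
print(concat((1,), [[2, 3]]))          # (1, 2, 3)
```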
> Expecting many holes to be punched in this idea ...
> But hope not too many. ;-)
I'm afraid that to me the idea seems too incoherent to punch holes in it.
--
Steven