On 29.05.20 20:38, David Mertz wrote:

On Fri, May 29, 2020 at 1:56 PM Rhodri James <rhodri@kynesim.co.uk> wrote:
Presumably "delayed" is something that would be automatically applied to
the actual parameter given, otherwise your call graphs might or might
not actually be call graphs depending on how the function was called.
What happens if I call "foo(y=0)" for instance?

I am slightly hijacking the thread.  I think the "solution" to the narrow "problem" of mutable default arguments is not at all worth having.  So yes, if that was the only, or even main, purpose of a hypothetical 'delayed' keyword and 'DelayedType', it would absolutely not be worthwhile.  It would just happen to solve that problem as a side effect.

Where I think it is valuable is the idea of letting all the normal operations work on EITHER a DelayedType or whatever type the operation would otherwise operate on.  So no, `foo(y=0)` would pass in a concrete type and do greedy calculations, nothing delayed, no in-memory call graph (other than whatever is implicit in the bytecode).
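
For what it's worth, here is roughly how I read that idea, written as an ordinary class instead of new syntax. Everything below (`Delayed`, `concretize`, `foo`) is only my illustrative sketch of the semantics being described, not anything that exists or is being proposed verbatim:

```python
class Delayed:
    """Records a function call instead of performing it."""
    def __init__(self, func, *args):
        self._func = func
        self._args = args

    # One operator as an example of "normal operations build the graph".
    def __add__(self, other):
        return Delayed(lambda a, b: a + b, self, other)


def concretize(value):
    """Evaluate a Delayed graph; pass concrete values through untouched."""
    if isinstance(value, Delayed):
        return value._func(*(concretize(a) for a in value._args))
    return value


def foo(x, y=Delayed(list)):   # stand-in for a "delayed list()" default
    y = concretize(y)          # evaluates list() freshly on every call
    y.append(x)
    return y


print(foo(1))          # [1]
print(foo(2))          # [2] -- no shared mutable default
print(foo(3, y=[9]))   # [9, 3] -- a concrete argument stays greedy
print(concretize(Delayed(int) + 5))   # 5 -- operators accept either kind
```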

I'm still struggling to imagine a real use case that can't already be solved by generators, as sketched below. Usually the purpose of such computation graphs is either to execute them on specialized hardware or to backpropagate through the graph (e.g. TensorFlow, PyTorch). Dask seems similar in the sense that the user can choose different execution models for the graph.
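
For the plain "build lazily, run later" case, generators already give you that without any new machinery. A minimal sketch (the names `numbers`, `squared`, `pipeline` are mine):

```python
def numbers():
    print("producing")            # visible side effect to show laziness
    yield from range(5)

def squared(xs):
    return (x * x for x in xs)    # generator expression: still lazy

pipeline = squared(numbers())     # nothing has run yet, no "producing"
result = sum(pipeline)            # consumption forces the whole chain
print(result)                     # prints "producing", then 30
```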

With generators you also don't have the problem of "concretizing" the result, since any function that consumes an iterable naturally does this. If you really want to delay such a computation, it's easy to write a custom type or a generator function to do so and then force it with `next` (or even define `concretize = next` beforehand).
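
Something along these lines, for example (the `delay` helper is just an illustrative name for that idiom):

```python
def delay(func, *args, **kwargs):
    def gen():
        yield func(*args, **kwargs)    # body runs only on the first next()
    return gen()

concretize = next                      # purely cosmetic alias

expensive = delay(sum, range(10**6))  # nothing computed yet
print(concretize(expensive))           # 499999500000, computed here
```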