Adding a Par construct to Python?
Steven D'Aprano
steve at REMOVE-THIS-cybersource.com.au
Sun May 17 20:04:45 EDT 2009
On Sun, 17 May 2009 20:36:36 +0200, Diez B. Roggisch wrote:
>> But reduce() can't tell whether the function being applied is
>> commutative or not. I suppose it could special-case a handful of
>> special cases (e.g. operator.add for int arguments -- but not floats!)
>> or take a caller-supplied argument that tells it whether the function
>> is commutative or not. But in general, you can't assume the function
>> being applied is commutative or associative, so unless you're willing
>> to accept undefined behaviour, I don't see any practical way of
>> parallelizing reduce().
>
> def reduce(operation, sequence, startitem=None, parallelize=False)
>
> should be enough. Approaches such as OpenMP also don't guess, they use
> explicit annotations.
It would be nice if the OP would speak up and tell us what he intended,
so we didn't have to guess what he meant. We're getting further and
further away from his original suggestion of a "par" loop.
If you pass parallelize=True, then what? Does it assume that operation is
associative, or take some steps to ensure that it is? Does it guarantee
to perform the operations in a specific order, or will it potentially
give non-deterministic results depending on the order that individual
calculations come back?
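To make the concern concrete, here is a small sketch (my own illustration,
not from the proposal) showing that naively splitting a reduce into chunks
changes the answer when the operation is not associative:

```python
from functools import reduce
import operator

data = [100, 10, 5, 3]

# Sequential left fold: ((100 - 10) - 5) - 3
sequential = reduce(operator.sub, data)  # 82

# A naive "parallel" split: reduce each half, then combine the partials.
left = reduce(operator.sub, data[:2])   # 100 - 10 = 90
right = reduce(operator.sub, data[2:])  # 5 - 3 = 2
combined = operator.sub(left, right)    # 90 - 2 = 88

print(sequential, combined)  # 82 88 -- subtraction is not associative
```

So unless parallelize=True is documented to mean "the caller promises the
operation is associative", the results are at the mercy of how the sequence
gets partitioned.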
As I said earlier, parallelizing map() sounds very plausible to me, but
the approaches that people have talked about for parallelizing reduce()
so far sound awfully fragile and magical to me. But at least I've
learned one thing: given an associative function, you *can* parallelize
reduce using a tree. (Thanks Roy!)
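For anyone following along, the tree idea looks something like this (a
minimal sketch of my own; the recursive halves are evaluated serially here,
but for an associative operation each half could be dispatched to a
separate worker, since the grouping cannot change the result):

```python
from functools import reduce
import operator

def tree_reduce(op, seq):
    """Reduce by recursively halving the sequence -- a tree of ops.

    For an associative op, op(reduce(left), reduce(right)) equals the
    plain left-to-right fold, so the two halves are independent and
    could run in parallel (e.g. on separate processes).
    """
    if len(seq) == 1:
        return seq[0]
    mid = len(seq) // 2
    return op(tree_reduce(op, seq[:mid]), tree_reduce(op, seq[mid:]))

data = list(range(1, 101))
assert tree_reduce(operator.add, data) == reduce(operator.add, data) == 5050
```

The depth of the tree is O(log n) instead of the O(n) chain of a
left fold, which is where the parallel speed-up would come from.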
--
Steven