Parallelization of Python on GPU?
Steven D'Aprano
steve+comp.lang.python at pearwood.info
Wed Feb 25 22:02:56 EST 2015
John Ladasky wrote:
> What I would REALLY like to do is to take advantage of my GPU.
I can't help you with that, but I would like to point out that GPUs
typically don't support IEEE-754 maths, which means that while they are
likely significantly faster, they're also likely significantly less
accurate. And any two different brands/models of GPU are likely to give
different results. (Possibly not *very* different, but considering the mess
that floating point maths was prior to IEEE-754, possibly *very* different.)
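Here's a rough illustration with numpy, no GPU required: doing a big
reduction in single precision (the common GPU default) instead of double
precision (the usual CPU/Python default) already shifts the low-order
digits. The exact numbers you see will depend on your numpy version and
platform, so treat this as a sketch rather than a benchmark.

    import numpy as np

    # Sum a million values in double precision (the usual CPU/Python
    # default) and again in single precision (the common GPU default).
    x = np.random.RandomState(0).uniform(size=10**6)

    sum64 = x.sum()                     # float64 reduction
    sum32 = x.astype(np.float32).sum()  # float32 reduction

    print(sum64)
    print(sum32)                 # agrees with sum64 only in the leading digits
    print(abs(sum64 - sum32))    # the accumulated rounding difference

And that's two reductions on the *same* hardware; different GPUs can
round differently again.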
Personally, I wouldn't trust GPU floating point for serious work. Maybe for
quick-and-dirty exploration of the data, but I'd then want to repeat any
calculations on the main CPU before using the numbers anywhere :-)
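If I were doing that, the cross-check might look something like the sketch
below. Here `gpu_result` is a placeholder for whatever array your GPU
library of choice (PyCUDA, PyOpenCL, ...) hands back, `cpu_result` is the
same computation redone in numpy, and `looks_sane` and its tolerances are
my own invention, not anybody's API:

    import numpy as np

    def looks_sane(gpu_result, cpu_result, rtol=1e-4, atol=1e-6):
        """Crude sanity check: do the GPU numbers agree with a CPU
        recomputation to within single-precision-ish tolerances?"""
        return np.allclose(gpu_result, cpu_result, rtol=rtol, atol=atol)

The loose rtol reflects float32's roughly seven significant decimal
digits; tighten it if your GPU genuinely computes in float64.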
--
Steve