[scikit-learn] Using GPU in scikit learn
Sebastian Raschka
mail at sebastianraschka.com
Wed Aug 8 21:00:44 EDT 2018
Hi,
scikit-learn doesn't support computations on the GPU, unfortunately. Specifically for random forests, there's CudaTree, which implements a GPU version of scikit-learn's random forests. It doesn't look like the library is actively developed (hard to tell whether that's a good thing or a bad thing -- it may simply be stable enough that it hasn't needed updates). Anyway, it may be worth a try: https://github.com/EasonLiao/CudaTree
Otherwise, there are probably alternative implementations out there.
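(Not from the original thread, just a sketch: before reaching for a GPU port, scikit-learn's own RandomForestClassifier can at least parallelize tree building across all CPU cores via n_jobs, which often gives a worthwhile speedup on its own.)

```python
# Sketch: multicore random forest training in plain scikit-learn.
# n_jobs=-1 uses all available CPU cores to build trees in parallel.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for image-derived feature vectors.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```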
Best,
Sebastian
> On Aug 8, 2018, at 7:50 PM, hoang trung Ta <tahoangtrung at gmail.com> wrote:
>
> Dear all members,
>
> I am using random forests for classifying satellite images. I have a bunch of images, so the processing is quite slow. I searched on the Internet and read that a GPU can accelerate the process.
>
> I have an NVIDIA GeForce GTX 1080 Ti GPU installed in the computer.
>
> Do you know how to use the GPU in scikit-learn? I mean, which packages to use, and is there sample code that uses the GPU for random forest classification?
>
> Thank you very much
>
> --
> Ta Hoang Trung (Mr)
>
> Master student
> Graduate School of Life and Environmental Sciences
> University of Tsukuba, Japan
>
> Mobile: +81 70 3846 2993
> Email : ta.hoang-trung.xm at alumni.tsukuba.ac.jp
> tahoangtrung at gmail.com
> s1626066 at u.tsukuba.ac.jp
> ----
> Mapping Technician
> Department of Surveying and Mapping Vietnam
> No 2, Dang Thuy Tram street, Hanoi, Viet Nam
>
> Mobile: +84 1255151344
> Email : tahoangtrung at gmail.com
> _______________________________________________
> scikit-learn mailing list
> scikit-learn at python.org
> https://mail.python.org/mailman/listinfo/scikit-learn