[IPython-dev] IPython + R + SparkR

Brian Granger ellisonbg at gmail.com
Sun Aug 9 22:36:31 EDT 2015


This should not require a custom kernel. Unlike PySpark, I think
SparkR is installed as a regular R package, so the only thing you
should have to do is set the SPARK_HOME environment variable. If
there is a problem, it is probably the R kernel's event loop
conflicting with how R talks to the JVM, though I don't know the
details of that interaction. You might also want to post this on the
Jupyter Google group, as more of the people interested in
Jupyter + Spark are there.
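For reference, something like the following should work from the R
kernel. This is an untested sketch: the /opt/spark path and the
local[2] master are placeholders for your own setup, and the
sparkR.init/sparkRSQL.init calls are the Spark 1.4/1.5 API.

    # Point R at the local Spark installation (adjust the path)
    Sys.setenv(SPARK_HOME = "/opt/spark")

    # SparkR ships inside the Spark distribution, not on CRAN;
    # add its R library directory to the search path
    .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"),
                .libPaths()))

    library(SparkR)

    # Start the connection to the JVM and a SQL context
    sc <- sparkR.init(master = "local[2]", appName = "SparkR-notebook")
    sqlContext <- sparkRSQL.init(sc)

    # Quick smoke test: distribute a built-in data frame
    df <- createDataFrame(sqlContext, faithful)
    head(df)

If the kernel crashes at sparkR.init or on the first operation, that
would point at the event-loop/JVM interaction mentioned above.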

On Sat, Aug 8, 2015 at 5:42 AM, Daniel Dean <kesmier84 at gmail.com> wrote:
> Hi,
>
> We are interested in running an IPython notebook that supports SparkR. This is possible with PySpark, as illustrated by this well-written article:
>
> http://ramhiser.com/2015/02/01/configuring-ipython-notebook-support-for-pyspark/
>
> However, when we tried a similar approach with R and SparkR, we found that the shell does not load. We can load the SparkR library manually, but the R kernel crashes as soon as we perform any Spark operation.
>
> Is this type of functionality possible or would it require a custom kernel?
>
> Regards,
> Daniel
>
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-dev



-- 
Brian E. Granger
Cal Poly State University, San Luis Obispo
@ellisonbg on Twitter and GitHub
bgranger at calpoly.edu and ellisonbg at gmail.com
