From nathan12343 at gmail.com  Tue Jan  8 18:12:37 2019
From: nathan12343 at gmail.com (Nathan Goldbaum)
Date: Tue, 8 Jan 2019 17:12:37 -0600
Subject: [IPython-dev] [ANN] 2019 Scipy Conference: Call for Proposals
Message-ID: <CAJXewOm2UUH-AG-XO9QxwUg_xaw7eDNT+FH37=384Hak-XPi9g@mail.gmail.com>

SciPy 2019, the 18th annual Scientific Computing with Python conference,
will be held July 8-14, 2019 in Austin, Texas. The annual SciPy Conference
brings together over 800 participants from industry, academia, and
government to showcase their latest projects, learn from skilled users and
developers, and collaborate on code development. The call for abstracts for
SciPy 2019 talks, posters, and tutorials is now open. The deadline for
submissions is February 10, 2019.

Conference Website: https://www.scipy2019.scipy.org/

Submission Website: https://easychair.org/conferences/?conf=scipy2019

*Talks and Posters (July 10-12, 2019)*
In addition to the general track, this year's conference will have
specialized tracks focused on:


   - Data Driven Discoveries (including Machine Learning and Data Science)
   - Open Source Communities (Sustainability)


*Mini Symposia*

   - Science Communication through Visualization
   - Neuroscience and Cognitive Science
   - Image Processing
   - Earth, Ocean, Geo and Atmospheric Science


There will also be a SciPy Tools Plenary Session each day with 2- to
5-minute updates on tools and libraries.

*Tutorials (July 8-9, 2019)*

Tutorials should focus on covering a well-defined topic in a hands-on
manner. We are looking for useful techniques or packages that help new or
advanced Python programmers develop better or faster scientific
applications. We encourage submissions designed to allow at least 50% of
the time for hands-on exercises, even if this means the subject matter
needs to be limited. Tutorials will be 4 hours in duration. In your
tutorial application, you can indicate what prerequisite skills and
knowledge will be needed for your tutorial, and the approximate expected
level of knowledge of your students (i.e., beginner, intermediate,
advanced). Instructors of accepted tutorials will receive a stipend.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190108/02903971/attachment.html>

From andreas at yank.to  Sat Jan 12 18:07:20 2019
From: andreas at yank.to (Andreas Yankopolus)
Date: Sat, 12 Jan 2019 18:07:20 -0500
Subject: [IPython-dev] Changing functions in a running program?
Message-ID: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>

Is it possible to run a program to a breakpoint in IPython and then change functions within that program? This would be for interactively experimenting with program logic without having to restart the program from scratch. I'm running IPython3 from Emacs, using Elpy to interact with the IPython3 process.

Consider foo.py as follows:

#!/usr/bin/env ipython3

import sys
from IPython.core.debugger import Pdb


def print_name():
    print ("Alice")

def name():
    print_name()

def main(argv):
    print ("In main.")
    Pdb().set_trace()
    name()

if __name__ == "__main__":
    main(sys.argv[1:])

Running it gives me:

ayank at snorri:~/Documents$ ./foo.py 
In main.
> /home/ayank/Documents/foo.py(16)main()
     14     print ("In main.")
     15     Pdb().set_trace()
---> 16     name()
     17 
     18 if __name__ == "__main__":


ipdb> 

I now want to change the definition of print_name() to print "Bob" instead of "Alice". I can make this change in Emacs and send the new function to IPython with C-c C-y f, but when I then type name(), I get "Alice". The reference to print_name() in name() is not getting updated to point to the new definition of print_name(). Likely I'm going about this process incorrectly, as I can make this kind of change at an IPython3 prompt but not at an ipdb one.

Thanks,

Andreas
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190112/ebf994bb/attachment.html>

From wes.turner at gmail.com  Sat Jan 12 20:37:52 2019
From: wes.turner at gmail.com (Wes Turner)
Date: Sat, 12 Jan 2019 20:37:52 -0500
Subject: [IPython-dev] Changing functions in a running program?
In-Reply-To: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>
References: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>
Message-ID: <CACfEFw_QhB24j+NDkWjqgymcFxhV1i+YmVrQ2QNCpeQCxaVoFg@mail.gmail.com>

https://en.wikipedia.org/wiki/Monkey_patch#Pitfalls

https://pypi.org/search/?q=monkeypatch

https://docs.pytest.org/en/latest/monkeypatch.html#monkeypatching-mocking-modules-and-environments

https://stackoverflow.com/questions/48112226/update-a-function-during-debugging-pdb-or-ipdb
- Does this work from just a plain ipdb shell (when not sending the new
function definition from emacs)?
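
For instance, pytest's monkeypatch fixture does this kind of rebinding for
the duration of a test. A sketch against the foo.py example below (it
assumes foo.py is importable as a module; the test name is made up):

def test_name_prints_bob(monkeypatch, capsys):
    import foo
    # swap the module-level attribute; pytest restores it after the test
    monkeypatch.setattr(foo, "print_name", lambda: print("Bob"))
    foo.name()
    assert capsys.readouterr().out.strip() == "Bob"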


On Sat, Jan 12, 2019 at 6:11 PM Andreas Yankopolus <andreas at yank.to> wrote:

> Is it possible to run a program to a breakpoint in IPython and then change
> functions within that program? This would be for interactively
> experimenting with program logic without having to restart the program from
> scratch. I'm running IPython3 from Emacs using Elpy for interacting with
> the IPython3 process.
>
> Consider foo.py as follows:
>
> #!/usr/bin/env ipython3
>
> import sys
> from IPython.core.debugger import Pdb
>
>
> def print_name():
>     print ("Alice")
>
> def name():
>     print_name()
>
> def main(argv):
>     print ("In main.")
>     Pdb().set_trace()
>     name()
>
> if __name__ == "__main__":
>     main(sys.argv[1:])
>
> Running it gives me:
>
> *ayank at snorri*:*~/Documents*$ ./foo.py
> In main.
> > /home/ayank/Documents/foo.py(16)main()
>      14     print ("In main.")
>      15     Pdb().set_trace()
> ---> 16     name()
>      17
>      18 if __name__ == "__main__":
>
>
> ipdb>
>
> I now want to change the definition of print_name() to print "Bob"
> instead of "Alice". I can make this change in Emacs, send the new function
> to IPython with C-c C-y f, but when I then type name(), I get "Alice".
> The reference to print_name() in name() is not getting updated to point
> to the new definition of print_name(). Likely I'm going about this
> process incorrectly, as I can make this kind of change at an IPython3
> prompt but not at an ipdb one.
>
> Thanks,
>
> Andreas
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at python.org
> https://mail.python.org/mailman/listinfo/ipython-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190112/9f18bd69/attachment-0001.html>

From andreas at yank.to  Sat Jan 12 20:56:54 2019
From: andreas at yank.to (Andreas Yankopolus)
Date: Sat, 12 Jan 2019 20:56:54 -0500
Subject: [IPython-dev] IPython-dev Digest, Vol 174, Issue 2
In-Reply-To: <mailman.3460.1547343485.4818.ipython-dev@python.org>
References: <mailman.3460.1547343485.4818.ipython-dev@python.org>
Message-ID: <B8590D2C-0843-4717-B581-827F52D36191@yank.to>

Wes,

> https://stackoverflow.com/questions/48112226/update-a-function-during-debugging-pdb-or-ipdb
> - Does this work from just a plain ipdb shell (when not sending the new function definition from emacs)?

This is getting close, but updating the function as proposed doesn't work. Using the example code from my previous post:

ayank at snorri:~/Documents$ ./foo.py 
In main.
> /home/ayank/Documents/foo.py(16)main()
     14     print ("In main.")
     15     Pdb().set_trace()
---> 16     name()
     17 
     18 if __name__ == "__main__":

ipdb> print_name()
Alice
ipdb> name()
Alice
ipdb> !def print_name(): print ("Bob")
ipdb> print_name()
Bob
ipdb> name()
Alice

We can see that the function is updated, but functions that call it do not get updated to use the new definition. Any idea what's going on here?

This kind of thing is no problem in Lisp, but its macro facility means that support for changing code at runtime is baked deep into the language. Perhaps there's something missing in the IPython REPL?

Thanks,

Andreas


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190112/c58cb6ad/attachment.html>

From wes.turner at gmail.com  Sat Jan 12 21:42:18 2019
From: wes.turner at gmail.com (Wes Turner)
Date: Sat, 12 Jan 2019 21:42:18 -0500
Subject: [IPython-dev] Changing functions in a running program?
In-Reply-To: <CACfEFw_QhB24j+NDkWjqgymcFxhV1i+YmVrQ2QNCpeQCxaVoFg@mail.gmail.com>
References: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>
 <CACfEFw_QhB24j+NDkWjqgymcFxhV1i+YmVrQ2QNCpeQCxaVoFg@mail.gmail.com>
Message-ID: <CACfEFw9X4dkWYh+TmH9FHfPOBQdnGAGCogOtOBR9=39EuAsOmg@mail.gmail.com>

In searching for a solution, I found your question here:
https://stackoverflow.com/questions/54153888/how-to-update-nested-functions-in-ipdb

It's funny: I've been coding in Python for many years and I've never
encountered this need.
With TDD, you basically never do this: you write a different function and
test that;
if it needs to be parametrized ("dependency injection"),
you just pass a function reference as an argument.
(If the code is designed to be changed at runtime (polymorphically),
that should be part of the callable's parameter spec.)
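
A tiny self-contained sketch of that parametrization (the names and the
default are just illustrative):

def process(data, transform=abs):
    # the behavior is injected as an argument; abs is a stand-in default
    return [transform(x) for x in data]

print(process([-1, 2, -3]))                            # default behavior
print(process([-1, 2, -3], transform=lambda x: x*10))  # injected behavior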

But that doesn't help with this simple example
(or probably the code you're actually working with).

IIUC, this is occurring because
print_name in the main() function is already bound to the previous instance.

In order to change that reference,
we need to update __self__/__module__.print_name.
Coincidentally, I also have no idea how to do this without importing
inspect.
If it were a class method, a simple assignment would accomplish your
objective.

This works:

ipdb> !import inspect; inspect.currentframe().f_globals['print_name'] = lambda: print("Bob")

But there may be a simpler solution?
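
One possibly simpler, inspect-free variant (a sketch; untested in this
exact session): every function carries a reference to its defining
module's namespace in __globals__, so rebinding through the caller should
also work:

ipdb> !name.__globals__['print_name'] = lambda: print("Bob")
ipdb> name()
Bob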


On Sat, Jan 12, 2019 at 8:57 PM Andreas Yankopolus <andreas at yank.to> wrote:

> Wes,
>
>
> https://stackoverflow.com/questions/48112226/update-a-function-during-debugging-pdb-or-ipdb
> - Does this work from just a plain ipdb shell (when not sending the
> new function definition from emacs)?
>
>
> This is getting close, but updating the function as proposed doesn't work.
> Using the example code from my previous post:
>
> *ayank at snorri*:*~/Documents*$ ./foo.py
> In main.
> > /home/ayank/Documents/foo.py(16)main()
>      14     print ("In main.")
>      15     Pdb().set_trace()
> ---> 16     name()
>      17
>      18 if __name__ == "__main__":
>
> ipdb> print_name()
> Alice
> ipdb> name()
> Alice
> ipdb> !def print_name(): print ("Bob")
> ipdb> print_name()
> Bob
> ipdb> name()
> Alice
>
> We can see that the function is updated, but functions that call it do not
> get updated to use the new definition. Any idea what's going on here?
>
> This kind of thing is no problem in Lisp, but its macro facility means
> that support for changing code at runtime is baked deep into the language.
> Perhaps there's something missing in the IPython REPL?
>
> Thanks,
>
> Andreas
>
>
>
On Sat, Jan 12, 2019 at 8:37 PM Wes Turner <wes.turner at gmail.com> wrote:

> https://en.wikipedia.org/wiki/Monkey_patch#Pitfalls
>
> https://pypi.org/search/?q=monkeypatch
>
>
> https://docs.pytest.org/en/latest/monkeypatch.html#monkeypatching-mocking-modules-and-environments
>
>
> https://stackoverflow.com/questions/48112226/update-a-function-during-debugging-pdb-or-ipdb
> - Does this work from just a plain ipdb shell (when not sending the new
> function definition from emacs)?
>
>
> On Sat, Jan 12, 2019 at 6:11 PM Andreas Yankopolus <andreas at yank.to>
> wrote:
>
>> Is it possible to run a program to a breakpoint in IPython and then
>> change functions within that program? This would be for interactively
>> experimenting with program logic without having to restart the program from
>> scratch. I'm running IPython3 from Emacs using Elpy for interacting with
>> the IPython3 process.
>>
>> Consider foo.py as follows:
>>
>> #!/usr/bin/env ipython3
>>
>> import sys
>> from IPython.core.debugger import Pdb
>>
>>
>> def print_name():
>>     print ("Alice")
>>
>> def name():
>>     print_name()
>>
>> def main(argv):
>>     print ("In main.")
>>     Pdb().set_trace()
>>     name()
>>
>> if __name__ == "__main__":
>>     main(sys.argv[1:])
>>
>> Running it gives me:
>>
>> *ayank at snorri*:*~/Documents*$ ./foo.py
>> In main.
>> > /home/ayank/Documents/foo.py(16)main()
>>      14     print ("In main.")
>>      15     Pdb().set_trace()
>> ---> 16     name()
>>      17
>>      18 if __name__ == "__main__":
>>
>>
>> ipdb>
>>
>> I now want to change the definition of print_name() to print "Bob"
>> instead of "Alice". I can make this change in Emacs, send the new function
>> to IPython with C-c C-y f, but when I then type name(), I get "Alice".
>> The reference to print_name() in name() is not getting updated to point
>> to the new definition of print_name(). Likely I'm going about this
>> process incorrectly, as I can make this kind of change at an IPython3
>> prompt but not at an ipdb one.
>>
>> Thanks,
>>
>> Andreas
>> _______________________________________________
>> IPython-dev mailing list
>> IPython-dev at python.org
>> https://mail.python.org/mailman/listinfo/ipython-dev
>>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190112/842018b0/attachment-0001.html>

From takowl at gmail.com  Sun Jan 13 03:13:04 2019
From: takowl at gmail.com (Thomas Kluyver)
Date: Sun, 13 Jan 2019 08:13:04 +0000
Subject: [IPython-dev] Changing functions in a running program?
In-Reply-To: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>
References: <ED156154-49C9-4FCA-BCB5-E6F8F8BD4A22@yank.to>
Message-ID: <CAOvn4qi8ESV9r1Qnw9UsCNL97q5tG8zxh9eQ7KUgYGO=K3gcCQ@mail.gmail.com>

Hi Andreas,

If you define a function or variable at the breakpoint, it's probably
making it a local variable inside main(), so it's not in scope for name().

You may be able to get round this by defining the new function and then
explicitly making it a global variable, something like this:

globals()['print_name'] = print_name

Not exactly elegant, but hopefully it works.
Thomas

On Sun, 13 Jan 2019 at 00:10, Andreas Yankopolus <andreas at yank.to> wrote:

> Is it possible to run a program to a breakpoint in IPython and then change
> functions within that program? This would be for interactively
> experimenting with program logic without having to restart the program from
> scratch. I'm running IPython3 from Emacs using Elpy for interacting with
> the IPython3 process.
>
> Consider foo.py as follows:
>
> #!/usr/bin/env ipython3
>
> import sys
> from IPython.core.debugger import Pdb
>
>
> def print_name():
>     print ("Alice")
>
> def name():
>     print_name()
>
> def main(argv):
>     print ("In main.")
>     Pdb().set_trace()
>     name()
>
> if __name__ == "__main__":
>     main(sys.argv[1:])
>
> Running it gives me:
>
> *ayank at snorri*:*~/Documents*$ ./foo.py
> In main.
> > /home/ayank/Documents/foo.py(16)main()
>      14     print ("In main.")
>      15     Pdb().set_trace()
> ---> 16     name()
>      17
>      18 if __name__ == "__main__":
>
>
> ipdb>
>
> I now want to change the definition of print_name() to print "Bob"
> instead of "Alice". I can make this change in Emacs, send the new function
> to IPython with C-c C-y f, but when I then type name(), I get "Alice".
> The reference to print_name() in name() is not getting updated to point
> to the new definition of print_name(). Likely I'm going about this
> process incorrectly, as I can make this kind of change at an IPython3
> prompt but not at an ipdb one.
>
> Thanks,
>
> Andreas
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at python.org
> https://mail.python.org/mailman/listinfo/ipython-dev
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190113/06d1bcdb/attachment.html>

From andreas at yank.to  Mon Jan 14 10:16:53 2019
From: andreas at yank.to (Andreas Yankopolus)
Date: Mon, 14 Jan 2019 10:16:53 -0500
Subject: [IPython-dev] Changing functions in a running program?
In-Reply-To: <mailman.1.1547398802.9508.ipython-dev@python.org>
References: <mailman.1.1547398802.9508.ipython-dev@python.org>
Message-ID: <50B9EB6C-CBCB-46E3-AE8E-31FDD54CD568@yank.to>

Thomas,

> If you define a function or variable at the breakpoint, it's probably making it a local variable inside main(), so it's not in scope for name().
> 
> You may be able to get round this by defining the new function and then explicitly making it a global variable, something like this:
> 
> globals()['print_name'] = print_name
> 
> Not exactly elegant, but hopefully it works.

Yes, that makes sense and does the trick! I'll have to figure out an Emacs hook to automatically update globals in that manner. My output:

In main.
> /Users/ayank/Documents/programming/python/bar.py(13)main()
     12     import ipdb; ipdb.set_trace()
---> 13     name()
     14 

ipdb> name()
Alice
ipdb> !def print_name(): print ("Bob")
ipdb> name()
Alice
ipdb> globals()['print_name'] = print_name
ipdb> name()
Bob
ipdb> 

To follow up on Wes's comment regarding TDD: I'm writing signal processing code and using Python + SciPy like Matlab. There are some calculations when the code starts that take a few minutes. I have a breakpoint there and am experimenting with the next steps of the processing chain.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190114/a0149695/attachment.html>

From wes.turner at gmail.com  Mon Jan 14 11:02:38 2019
From: wes.turner at gmail.com (Wes Turner)
Date: Mon, 14 Jan 2019 11:02:38 -0500
Subject: [IPython-dev] Changing functions in a running program?
In-Reply-To: <50B9EB6C-CBCB-46E3-AE8E-31FDD54CD568@yank.to>
References: <mailman.1.1547398802.9508.ipython-dev@python.org>
 <50B9EB6C-CBCB-46E3-AE8E-31FDD54CD568@yank.to>
Message-ID: <CACfEFw8CbR7eSHjrCO2S22ZWDSCS01UBXDwhWLVnhKdPU7TEDw@mail.gmail.com>

There are likely more convenient patterns for a functionally composed
signal processing pipeline, though ipdb may be good enough.
https://pypi.org/project/pdbpp/ supports tab completion, among other
features.

Decomposing to functions that accept state through a standard interface like
{data: [], kwargs: {}} may have advantages.
You can assign the output of one step of the pipeline to a global or, more
ideally, to a key of a dict (or an attribute of a class instance), so you
don't need to modify globals() from a different scope; see the sketch below.
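
A minimal sketch of that shape (the two "stages" are placeholders):

state = {'data': [3, -1, 4, -1, 5], 'kwargs': {}}
state['filtered'] = [x for x in state['data'] if x > 0]  # stage 1, run once
state['total'] = sum(state['filtered'])                  # stage 2, cheap to redo
print(state['total'])
# re-running stage 2 only touches keys of state, never globals()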

The Jupyter notebook way to do this would be to put the first processing
stage in one cell and then work with it in the next. E.g., Spyder and
VSCode support cell markers in Python sources so that you can execute one
top-level block of code at a time.
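
For example, using the "# %%" cell-marker convention those editors
recognize (the computations here are placeholders):

# %% Stage 1: expensive setup, execute once
data = [x * 0.1 for x in range(1000)]  # stands in for the slow startup work

# %% Stage 2: experiment here, re-execute freely
print(sum(data) / len(data))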

I'm not too familiar with signal processing. Someone has likely already
distilled the workflow into some standard interfaces like
sklearn.pipeline.Pipeline.steps?
https://scikit-learn.org/stable/modules/compose.html#pipeline
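
Something in this direction, as a sketch (the steps are generic
placeholders and assume scikit-learn is installed):

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

pipe = Pipeline(steps=[
    ('scale', StandardScaler()),     # each step is a (name, transformer) pair
    ('reduce', PCA(n_components=2)),
])
pipe.set_params(reduce=PCA(n_components=3))  # steps stay swappable by name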

/q="scikit signal"
- scipy.signal
- scikit-signal

/q="python signal processing"
- http://greenteapress.com/thinkdsp/html/
- https://github.com/unpingco/Python-for-Signal-Processing (ipynb)



# This logs IPython input and output to a file;
# but not ipdb i/o AFAIU
%logstart -o example.py
%logstart -h

To avoid logging code that modifies globals(),
it's probably better and more convenient to pass print_name as an argument
to name():

def name(print_name=print_name):
    # the default is bound at definition time; pass a replacement to override
    print_name()

With a config dict/object:

def name(conf):
    # look up the behavior (and its data) in a state dict instead of globals
    print_name = conf['print_name']
    print_name(conf['data'])


On Monday, January 14, 2019, Andreas Yankopolus <andreas at yank.to> wrote:

> Thomas,
>
> If you define a function or variable at the breakpoint, it's
> probably making it a local variable inside main(), so it's not in scope for
> name().
>
> You may be able to get round this by defining the new function and
> then explicitly making it a global variable, something like this:
>
> globals()['print_name'] = print_name
>
> Not exactly elegant, but hopefully it works.
>
>
> Yes, that makes sense and does the trick! I'll have to figure out an Emacs hook
> to automatically update globals in that manner. My output:
>
> In main.
> > /Users/ayank/Documents/programming/python/bar.py(13)main()
>      12     import ipdb; ipdb.set_trace()
> ---> 13     name()
>      14
>
> ipdb> name()
> Alice
> ipdb> !def print_name(): print ("Bob")
> ipdb> name()
> Alice
> ipdb> globals()['print_name'] = print_name
> ipdb> name()
> Bob
> ipdb>
>
> To follow up on Wes's comment regarding TDD: I'm writing signal processing
> code and using Python + SciPy like Matlab. There are some calculations when
> the code starts that take a few minutes. I have a breakpoint there and am
> experimenting with the next steps of the processing chain.
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190114/e61d86ef/attachment.html>

From ToTheDude at zoho.com  Sat Jan 26 23:22:12 2019
From: ToTheDude at zoho.com (TheDude)
Date: Sat, 26 Jan 2019 20:22:12 -0800
Subject: [IPython-dev] pixiedebugger: can't install
Message-ID: <5729286F-E250-412C-8776-BB8B4B6A3ABB@zoho.com>

Hello,
	I would really like to have the pixiedebugger working in my notebook. Unfortunately I ran into problems with the installation.

I am following the instructions here: <https://pixiedust.github.io/pixiedust/install.html>
For what it's worth, I am using Python 3.7 in Anaconda on macOS 10.12.

The first part of the installation:
	pip install pixiedust

is successful, and it ends with:
	Successfully built pixiedust mpld3
	Installing collected packages: mpld3, geojson, astunparse, markdown, colour, pixiedust
	Successfully installed astunparse-1.6.2 colour-0.1.5 geojson-2.4.1 markdown-3.0.1 mpld3-0.3 pixiedust-1.1.15

The next step, installing a new Jupyter kernel, fails:
	jupyter pixiedust install

Here's what I get:

(py37ana) adula-5:~ dude$ jupyter pixiedust install
Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
	Keep y/n [y]? y
Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
	Keep y/n [y]? y
Select an existing spark install or create a new one
1. spark-2.2.0-bin-hadoop2.7
2. Create a new spark Install
	Enter your selection: 1
Traceback (most recent call last):
 File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11, in <module>
   sys.exit(main())
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py", line 41, in main
   PixiedustJupyterApp.launch_instance()
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 657, in launch_instance
   app.initialize(argv)
 File "<decorator-gen-2>", line 2, in initialize
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
   return method(app, *args, **kwargs)
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 296, in initialize
   self.parse_command_line(argv)
 File "<decorator-gen-4>", line 2, in parse_command_line
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
   return method(app, *args, **kwargs)
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 514, in parse_command_line
   return self.initialize_subcommand(subc, subargv)
 File "<decorator-gen-3>", line 2, in initialize_subcommand
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
   return method(app, *args, **kwargs)
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 452, in initialize_subcommand
   self.subapp.initialize(argv)
 File "<decorator-gen-6>", line 2, in initialize
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
   return method(app, *args, **kwargs)
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py", line 238, in initialize
   self.parse_command_line(argv)
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 154, in parse_command_line
   spark_version = self.get_spark_version()
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 379, in get_spark_version
   pyspark_out = subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT).decode("utf-8")
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 389, in check_output
   **kwargs).stdout
 File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 481, in run
   output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version']' returned non-zero exit status 1.


Does anybody have an idea how to proceed?

TIA.

TheDude



From ToTheDude at zoho.com  Sun Jan 27 23:55:44 2019
From: ToTheDude at zoho.com (The Dude)
Date: Sun, 27 Jan 2019 20:55:44 -0800
Subject: [IPython-dev] pixiedebugger: can't install
In-Reply-To: <CANADyYmj8Eng18g_qP1oi4M70nhdWzAADSn8E49CEpMLsGoKWQ@mail.gmail.com>
References: <5729286F-E250-412C-8776-BB8B4B6A3ABB@zoho.com>
 <CANADyYmj8Eng18g_qP1oi4M70nhdWzAADSn8E49CEpMLsGoKWQ@mail.gmail.com>
Message-ID: <FD042477-AC19-4337-9C6D-4E406B523E33@zoho.com>

Hi Lisa,
	thanks for your help. 

Unfortunately, I get the same error after installing spark with conda. 
Here's the output from my terminal:


(py37ana) adula-5:~ dude$ jupyter pixiedust install
Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
	Keep y/n [y]? y
Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
	Keep y/n [y]? y
Select an existing spark install or create a new one
1. spark-2.2.0-bin-hadoop2.7
2. Create a new spark Install
	Enter your selection: 1
Traceback (most recent call last):
  File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11, in <module>
    sys.exit(main())
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py", line 41, in main
    PixiedustJupyterApp.launch_instance()
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 657, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-2>", line 2, in initialize
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 296, in initialize
    self.parse_command_line(argv)
  File "<decorator-gen-4>", line 2, in parse_command_line
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 514, in parse_command_line
    return self.initialize_subcommand(subc, subargv)
  File "<decorator-gen-3>", line 2, in initialize_subcommand
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 452, in initialize_subcommand
    self.subapp.initialize(argv)
  File "<decorator-gen-6>", line 2, in initialize
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py", line 238, in initialize
    self.parse_command_line(argv)
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 154, in parse_command_line
    spark_version = self.get_spark_version()
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 379, in get_spark_version
    pyspark_out = subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT).decode("utf-8")
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 389, in check_output
    **kwargs).stdout
  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 481, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version']' returned non-zero exit status 1.



It seems that the pyspark executable doesn't run properly.

TheDude





> On Jan 27, 2019, at 11:26:25, Lisa Bang <lisagbang at gmail.com> wrote:
> 
> 
> Hi,
> 
> Looks like you don't have pyspark installed. I reproduced your error, and after running "conda install pyspark", "jupyter pixiedust install" finished just fine.
> 
> Best,
> Lisa
> 
> 
> On Sat, Jan 26, 2019 at 11:22 PM TheDude <ToTheDude at zoho.com> wrote:
> Hello,
>         I would really like to have the pixiedebugger working in my notebook. Unfortunately I ran into problems with the installation.
> 
> I am following the instructions here: <https://pixiedust.github.io/pixiedust/install.html>
> For what it's worth, I am using Python 3.7 in Anaconda on macOS 10.12.
> 
> The first part of the installation:
>         pip install pixiedust
> 
> is successful, and it ends with:
>         Successfully built pixiedust mpld3
>         Installing collected packages: mpld3, geojson, astunparse, markdown, colour, pixiedust
>         Successfully installed astunparse-1.6.2 colour-0.1.5 geojson-2.4.1 markdown-3.0.1 mpld3-0.3 pixiedust-1.1.15
> 
> The next step, installing a new Jupyter kernel, fails:
>         jupyter pixiedust install
> 
> Here's what I get:
> 
> (py37ana) adula-5:~ dude$ jupyter pixiedust install
> Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
>         Keep y/n [y]? y
> Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
>         Keep y/n [y]? y
> Select an existing spark install or create a new one
> 1. spark-2.2.0-bin-hadoop2.7
> 2. Create a new spark Install
>         Enter your selection: 1
> Traceback (most recent call last):
>  File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11, in <module>
>    sys.exit(main())
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py", line 41, in main
>    PixiedustJupyterApp.launch_instance()
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 657, in launch_instance
>    app.initialize(argv)
>  File "<decorator-gen-2>", line 2, in initialize
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
>    return method(app, *args, **kwargs)
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 296, in initialize
>    self.parse_command_line(argv)
>  File "<decorator-gen-4>", line 2, in parse_command_line
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
>    return method(app, *args, **kwargs)
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 514, in parse_command_line
>    return self.initialize_subcommand(subc, subargv)
>  File "<decorator-gen-3>", line 2, in initialize_subcommand
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
>    return method(app, *args, **kwargs)
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 452, in initialize_subcommand
>    self.subapp.initialize(argv)
>  File "<decorator-gen-6>", line 2, in initialize
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py", line 87, in catch_config_error
>    return method(app, *args, **kwargs)
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py", line 238, in initialize
>    self.parse_command_line(argv)
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 154, in parse_command_line
>    spark_version = self.get_spark_version()
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py", line 379, in get_spark_version
>    pyspark_out = subprocess.check_output([pyspark, "--version"], stderr=subprocess.STDOUT).decode("utf-8")
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 389, in check_output
>    **kwargs).stdout
>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py", line 481, in run
>    output=stdout, stderr=stderr)
> subprocess.CalledProcessError: Command '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version']' returned non-zero exit status 1.
> 
> 
> Does anybody have an idea how to proceed?
> 
> TIA.
> 
> TheDude
> 
> 
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at python.org
> https://mail.python.org/mailman/listinfo/ipython-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190127/d0a7fa64/attachment.html>

From lisagbang at gmail.com  Mon Jan 28 10:08:59 2019
From: lisagbang at gmail.com (Lisa Bang)
Date: Mon, 28 Jan 2019 10:08:59 -0500
Subject: [IPython-dev] pixiedebugger: can't install
In-Reply-To: <FD042477-AC19-4337-9C6D-4E406B523E33@zoho.com>
References: <5729286F-E250-412C-8776-BB8B4B6A3ABB@zoho.com>
 <CANADyYmj8Eng18g_qP1oi4M70nhdWzAADSn8E49CEpMLsGoKWQ@mail.gmail.com>
 <FD042477-AC19-4337-9C6D-4E406B523E33@zoho.com>
Message-ID: <CANADyYkLGdhoj7ubb=RfqF4NK0JUcNb88H2Mgd=hSJHgmQSHsw@mail.gmail.com>

Hi Dude,

Yes, it looks like when subprocess calls pyspark to get its version,
pyspark returns no version. I see that this problem has happened recently
for others <https://github.com/pixiedust/pixiedust/issues/741> as well.
It might be solved by editing the kernel.json for the pixiedust kernel in
your Jupyter directory, typically at
/Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22/kernel.json,
to include pyspark, but this kind of seems like a pain.
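
If you do want to try that route, the kernelspec is just JSON; here is a
sketch of patching it from Python (the PYTHONPATH value is illustrative
and depends on where your Spark actually lives):

import json

path = '/Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22/kernel.json'
with open(path) as f:
    spec = json.load(f)
# point the kernel at the pyspark libraries shipped inside SPARK_HOME
spec.setdefault('env', {})['PYTHONPATH'] = \
    '/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/python'
with open(path, 'w') as f:
    json.dump(spec, f, indent=1)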

For reference, I have included the output from my session below, showing
the same error and then a working install. Perhaps also ensuring that the
installed py4j and pyspark versions are compatible with each other would help.
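
One quick way to check what is actually installed (pkg_resources ships
with setuptools):

from pkg_resources import get_distribution
for pkg in ('pyspark', 'py4j'):
    print(pkg, get_distribution(pkg).version)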

Best,
Lisa


XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*

     Keep y/n [y]? y

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? y

Select an existing spark install or create a new one

*1. spark-2.3.2-bin-hadoop2.7*

*2. Create a new spark Install*

     Enter your selection: 1

Traceback (most recent call last):

  File "/Users/xxxx/anaconda3/bin/jupyter-pixiedust", line 11, in <module>

    sys.exit(main())

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/pixiedustapp.py",
line 41, in main

    PixiedustJupyterApp.launch_instance()

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 657, in launch_instance

    app.initialize(argv)

  File "<decorator-gen-2>", line 2, in initialize

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 296, in initialize

    self.parse_command_line(argv)

  File "<decorator-gen-4>", line 2, in parse_command_line

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 514, in parse_command_line

    return self.initialize_subcommand(subc, subargv)

  File "<decorator-gen-3>", line 2, in initialize_subcommand

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 452, in initialize_subcommand

    self.subapp.initialize(argv)

  File "<decorator-gen-6>", line 2, in initialize

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/jupyter_core/application.py",
line 238, in initialize

    self.parse_command_line(argv)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py",
line 154, in parse_command_line

    spark_version = self.get_spark_version()

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py",
line 379, in get_spark_version

    pyspark_out = subprocess.check_output([pyspark, "--version"],
stderr=subprocess.STDOUT).decode("utf-8")

  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 389, in
check_output

    **kwargs).stdout

  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 481, in run

    output=stdout, stderr=stderr)

subprocess.CalledProcessError: Command
'['/Users/xxxx/pixiedust/bin/spark/spark-2.3.2-bin-hadoop2.7/bin/pyspark',
'--version']' returned non-zero exit status 1.

XXXX:Documents xxxx$ pip install pyspark

Collecting pyspark

  Downloading
https://files.pythonhosted.org/packages/88/01/a37e827c2d80c6a754e40e99b9826d978b55254cc6c6672b5b08f2e18a7f/pyspark-2.4.0.tar.gz
(213.4MB)

    100% |################################| 213.4MB 135kB/s

Collecting py4j==0.10.7 (from pyspark)

  Downloading https://files.pythonhosted.org/packages/e3/53/c737818eb9a7dc32a7cd4f1396e787bd94200c3997c72c1dbe028587bd76/py4j-0.10.7-py2.py3-none-any.whl
(197kB)

    100% |################################| 204kB 10.8MB/s

Building wheels for collected packages: pyspark

  Running setup.py bdist_wheel for pyspark ... done

  Stored in directory:
/Users/xxxx/Library/Caches/pip/wheels/cd/54/c2/abfcc942eddeaa7101228ebd6127a30dbdf903c72db4235b23

Successfully built pyspark

Installing collected packages: py4j, pyspark

Successfully installed py4j-0.10.7 pyspark-2.4.0

XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*

     Keep y/n [y]? y

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? y

Select an existing spark install or create a new one

*1. spark-2.3.2-bin-hadoop2.7*

*2. Create a new spark Install*

     Enter your selection: 2

*What version would you like to download? 1.6.3, 2.0.2, 2.1.0, 2.2.0, 2.3.2
[2.3.2]: *2.2.0

SPARK_HOME will be set to
/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7

Downloading Spark 2.2.0

Extracting Spark 2.2.0 to /Users/xxxx/pixiedust/bin/spark

Traceback (most recent call last):

  File "/Users/xxxx/anaconda3/bin/jupyter-pixiedust", line 11, in <module>

    sys.exit(main())

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/pixiedustapp.py",
line 41, in main

    PixiedustJupyterApp.launch_instance()

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 657, in launch_instance

    app.initialize(argv)

  File "<decorator-gen-2>", line 2, in initialize

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 296, in initialize

    self.parse_command_line(argv)

  File "<decorator-gen-4>", line 2, in parse_command_line

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 514, in parse_command_line

    return self.initialize_subcommand(subc, subargv)

  File "<decorator-gen-3>", line 2, in initialize_subcommand

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 452, in initialize_subcommand

    self.subapp.initialize(argv)

  File "<decorator-gen-6>", line 2, in initialize

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/traitlets/config/application.py",
line 87, in catch_config_error

    return method(app, *args, **kwargs)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/jupyter_core/application.py",
line 238, in initialize

    self.parse_command_line(argv)

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py",
line 154, in parse_command_line

    spark_version = self.get_spark_version()

  File
"/Users/xxxx/anaconda3/lib/python3.7/site-packages/install/createKernel.py",
line 379, in get_spark_version

    pyspark_out = subprocess.check_output([pyspark, "--version"],
stderr=subprocess.STDOUT).decode("utf-8")

  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 389, in
check_output

    **kwargs).stdout

  File "/Users/xxxx/anaconda3/lib/python3.7/subprocess.py", line 481, in run

    output=stdout, stderr=stderr)

subprocess.CalledProcessError: Command
'['/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark',
'--version']' returned non-zero exit status 1.

XXXX:Documents xxxx$ /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark', '--version'

> ,

> '

-bash: /Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark, --version,: No such file or directory

XXXX:Documents xxxx$
/Users/xxxx/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark

No Java runtime present, requesting install.

XXXX:Documents xxxx$ conda install pyspark

Solving environment: done



## Package Plan ##



  environment location: /Users/xxxx/anaconda3



  added / updated specs:

    - pyspark





The following packages will be downloaded:



    package                    |            build

    ---------------------------|-----------------

    py4j-0.10.7                |           py37_0         251 KB

    pyspark-2.4.0              |           py37_0       203.4 MB

    conda-4.6.1                |           py37_0         1.7 MB

    ------------------------------------------------------------

                                           Total:       205.4 MB



The following NEW packages will be INSTALLED:



    py4j:    0.10.7-py37_0

    pyspark: 2.4.0-py37_0



The following packages will be UPDATED:



    conda:   4.5.12-py37_0 --> 4.6.1-py37_0



Proceed ([y]/n)? y





Downloading and Extracting Packages

py4j-0.10.7          | 251 KB    |
#################################################################### | 100%

pyspark-2.4.0        | 203.4 MB  |
#################################################################### | 100%

conda-4.6.1          | 1.7 MB    |
#################################################################### | 100%

Preparing transaction: done

Verifying transaction: done

Executing transaction: done

(base) XXXX:Documents xxxx$ jupyter pixiedust install

*Step 1: PIXIEDUST_HOME: /Users/xxxx/pixiedust*

     Keep y/n [y]? y

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? n

*Step 2: Please enter a SPARK_HOME location: *
/Users/xxxx/spark-2.3.2-bin-hadoop2.7

*Directory /Users/xxxx/sprk-2.3.2-bin-hadoop2.7 does not exist*

     Create y/n [y]? n

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? n

*Step 2: Please enter a SPARK_HOME location: *
/Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7

*Directory /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7 does not exist*

     Create y/n [y]? n

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? n

*Step 2: Please enter a SPARK_HOME location: *
/Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7/bin/pyspark

*Directory /Users/xxxx/Downloads/spark-2.3.4-bin-hadoop2.7/bin/pyspark does
not exist*

     Create y/n [y]? n

*Step 2: SPARK_HOME: /Users/xxxx/pixiedust/bin/spark*

     Keep y/n [y]? y

Select an existing spark install or create a new one

*1. spark-2.3.2-bin-hadoop2.7*

*2. spark-2.2.0-bin-hadoop2.7*

*3. Create a new spark Install*

     Enter your selection: 2

downloaded spark cloudant jar:
/Users/xxxx/pixiedust/bin/cloudant-spark-v2.0.0-185.jar

*Step 3: SCALA_HOME: /Users/xxxx/pixiedust/bin/scala*

     Keep y/n [y]? y

*Directory /Users/xxxx/pixiedust/bin/scala does not contain a valid scala
install*

     Download Scala y/n [y]? y

SCALA_HOME will be set to /Users/xxxx/pixiedust/bin/scala/scala-2.11.8

Downloading Scala 2.11

Extracting Scala 2.11 to /Users/xxxx/pixiedust/bin/scala

*Step 4: Kernel Name: Python-with-Pixiedust_Spark-2.2*

     Keep y/n [y]? y

self.kernelInternalName pythonwithpixiedustspark22

[PixiedustInstall] Installed kernelspec pythonwithpixiedustspark22 in
/Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22

Downloading intro notebooks into /Users/xxxx/pixiedust/notebooks

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
1 - Easy Visualizations.ipynb : *done*

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
2 - Working with External Data.ipynb : *done*

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
3 - Scala and Python.ipynb : *done*

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
4 - Add External Spark Packages.ipynb : *done*

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
5 - Stash to Cloudant.ipynb : *done*

...
https://github.com/ibm-watson-data-lab/pixiedust/raw/master/notebook/PixieDust
Contribute.ipynb : *done*





####################################################################################################

#    Congratulations: Kernel Python-with-Pixiedust_Spark-2.2 was
successfully created in
/Users/xxxx/Library/Jupyter/kernels/pythonwithpixiedustspark22

#    You can start the Notebook server with the following command:

#        *jupyter notebook /Users/xxxx/pixiedust/notebooks*

####################################################################################################


On Sun, Jan 27, 2019, 11:55 PM The Dude <ToTheDude at zoho.com> wrote:

> Hi Lisa,
> thanks for your help.
>
> Unfortunately, I get the same error after installing spark with conda.
> Here's the output from my terminal:
>
>
> (py37ana) adula-5:~ dude$ jupyter pixiedust install
> Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
> Keep y/n [y]? y
> Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
> Keep y/n [y]? y
> Select an existing spark install or create a new one
> 1. spark-2.2.0-bin-hadoop2.7
> 2. Create a new spark Install
> Enter your selection: 1
> Traceback (most recent call last):
>   File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11,
> in <module>
>     sys.exit(main())
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py",
> line 41, in main
>     PixiedustJupyterApp.launch_instance()
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 657, in launch_instance
>     app.initialize(argv)
>   File "<decorator-gen-2>", line 2, in initialize
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 296, in initialize
>     self.parse_command_line(argv)
>   File "<decorator-gen-4>", line 2, in parse_command_line
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 514, in parse_command_line
>     return self.initialize_subcommand(subc, subargv)
>   File "<decorator-gen-3>", line 2, in initialize_subcommand
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 452, in initialize_subcommand
>     self.subapp.initialize(argv)
>   File "<decorator-gen-6>", line 2, in initialize
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
> line 87, in catch_config_error
>     return method(app, *args, **kwargs)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py",
> line 238, in initialize
>     self.parse_command_line(argv)
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
> line 154, in parse_command_line
>     spark_version = self.get_spark_version()
>   File
> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
> line 379, in get_spark_version
>     pyspark_out = subprocess.check_output([pyspark, "--version"],
> stderr=subprocess.STDOUT).decode("utf-8")
>   File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
> line 389, in check_output
>     **kwargs).stdout
>   File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
> line 481, in run
>     output=stdout, stderr=stderr)
> subprocess.CalledProcessError: Command
> '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark',
> '--version']' returned non-zero exit status 1.
>
>
>
> It seems that the pyspark executable doesn't run properly.
>
> TheDude
>
>
>
>
>
> On Jan 27, 2019, at 11:26:25, Lisa Bang <lisagbang at gmail.com> wrote:
>
>
> Hi,
>
> Looks like you don't have pyspark installed. I reproduced your error, and
> after running "conda install pyspark", "jupyter pixiedust install"
> finished just fine.
>
> Best,
> Lisa
>
>
> On Sat, Jan 26, 2019 at 11:22 PM TheDude <ToTheDude at zoho.com> wrote:
>
>> Hello,
>>         I would really like to have the pixiedebugger working in my
>> notebook. Unfortunately I ran into problems with the installation.
>>
>> I am following the instructions here:
>> <https://pixiedust.github.io/pixiedust/install.html>
>> For what it's worth, I am using Python 3.7 in Anaconda on macOS 10.12.
>>
>> The first part of the installation:
>>         pip install pixiedust
>>
>> is successful, and it ends with:
>>         Successfully built pixiedust mpld3
>>         Installing collected packages: mpld3, geojson, astunparse,
>> markdown, colour, pixiedust
>>         Successfully installed astunparse-1.6.2 colour-0.1.5
>> geojson-2.4.1 markdown-3.0.1 mpld3-0.3 pixiedust-1.1.15
>>
>> The next step, installing a new Jupyter kernel, fails:
>>         jupyter pixiedust install
>>
>> Here's what I get:
>>
>> (py37ana) adula-5:~ dude$ jupyter pixiedust install
>> Step 1: PIXIEDUST_HOME: /Users/dude/pixiedust
>>         Keep y/n [y]? y
>> Step 2: SPARK_HOME: /Users/dude/pixiedust/bin/spark
>>         Keep y/n [y]? y
>> Select an existing spark install or create a new one
>> 1. spark-2.2.0-bin-hadoop2.7
>> 2. Create a new spark Install
>>         Enter your selection: 1
>> Traceback (most recent call last):
>>  File "/Users/dude/anaconda/envs/py37ana/bin/jupyter-pixiedust", line 11,
>> in <module>
>>    sys.exit(main())
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/pixiedustapp.py",
>> line 41, in main
>>    PixiedustJupyterApp.launch_instance()
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 657, in launch_instance
>>    app.initialize(argv)
>>  File "<decorator-gen-2>", line 2, in initialize
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 296, in initialize
>>    self.parse_command_line(argv)
>>  File "<decorator-gen-4>", line 2, in parse_command_line
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 514, in parse_command_line
>>    return self.initialize_subcommand(subc, subargv)
>>  File "<decorator-gen-3>", line 2, in initialize_subcommand
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 452, in initialize_subcommand
>>    self.subapp.initialize(argv)
>>  File "<decorator-gen-6>", line 2, in initialize
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/traitlets/config/application.py",
>> line 87, in catch_config_error
>>    return method(app, *args, **kwargs)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/jupyter_core/application.py",
>> line 238, in initialize
>>    self.parse_command_line(argv)
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
>> line 154, in parse_command_line
>>    spark_version = self.get_spark_version()
>>  File
>> "/Users/dude/anaconda/envs/py37ana/lib/python3.7/site-packages/install/createKernel.py",
>> line 379, in get_spark_version
>>    pyspark_out = subprocess.check_output([pyspark, "--version"],
>> stderr=subprocess.STDOUT).decode("utf-8")
>>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
>> line 389, in check_output
>>    **kwargs).stdout
>>  File "/Users/dude/anaconda/envs/py37ana/lib/python3.7/subprocess.py",
>> line 481, in run
>>    output=stdout, stderr=stderr)
>> subprocess.CalledProcessError: Command
>> '['/Users/dude/pixiedust/bin/spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark',
>> '--version']' returned non-zero exit status 1.
>>
>>
>> Does anybody have an idea how to proceed?
>>
>> TIA.
>>
>> TheDude
>>
>>
>> _______________________________________________
>> IPython-dev mailing list
>> IPython-dev at python.org
>> https://mail.python.org/mailman/listinfo/ipython-dev
>>
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/ipython-dev/attachments/20190128/e558a1aa/attachment-0001.html>