Hi folks,
The current default colormap that yt uses is called "Algae," and it's been
with us for a while. Recently some of us on the yt-dev mailing list (which
is open to join!) have begun to look at alternate colormaps that have
better characteristics, particularly from an accessibility perspective.
If you have the time, it would be really, *really* helpful if you could
take this anonymous, two-question (only one of which is mandatory!) poll
about which of the four candidates we've come up with you like best.
http://goo.gl/forms/D4qQqkPIMq
Thanks very much! The poll will be open at least a week, and I'll report
back the results here sometime after that.
-Matt
Hi everyone,
The last couple weeks I’ve been thinking a lot about the future of yt.
What I’d like to propose is that we shift investment in yt as a
single, monolithic codebase into yt as a project, or an ecosystem of
projects.
This came out of the discussion of the extension/affiliated packages,
analysis modules, and so on. Britton has for a while been pitching
the idea [which I will poorly paraphrase here] that yt can be the
framework on top of which killer apps can be built. I think this is
great.
What’s holding us back in some ways from this is that yt is currently
structured as a monolithic code base, with little to no discovery of
other packages and apps and whatnot. We tried for a while to change
this with The Barn, but it ended up not quite taking off. I think the
time is right to try to change the way we think about yt to be more
about yt the Project, rather than yt the Codebase; the core codebase
is an important component of this, but not the whole of it.
Encouraging an ecosystem of packages can have a few very important benefits:
* External packages will confer greater individual credit to the
folks who develop them.
* External packages can be versioned and developed independently; the
review process can be different.
* yt’s core can be emphasized as a generic package, on top of which
astronomy analysis can be built.
* Packages can be maintained wherever, including alternate locations
such as github.
On the other hand, having packages inside the main distribution makes
discoverability much, much easier. It also enables everything to be
In The Box. And, the continuous integration and testing system is
already set up for yt. But, these are all possible to overcome -- we
can devise a strategy for adding packages to the CI system (and if
they are externally managed, they can also rely on yt as a dependency
and use whatever CI system they like!) and we can improve
discoverability by refocusing the website to enable this. I've asked
Kacper about adding new packages, and it's not as easy as it might
seem, so we may need to be careful about how that process occurs; one
possibility would be to provide servers and ready-made setups, but
have individuals do the heavy lifting. We could even have something
in the codebase that describes some packages that are available.
External packages could have much looser dependency rules, which means
they can be free to take advantage of things like OpenCL, numba, etc,
without having to add them to the primary codebase.
Synchronizing APIs and versions across extension packages may be
difficult in some particular cases, but I suspect in practice will not
be an issue, as long as we continue to have a reasonably stable
*public* API, and graduate a few things (such as .blocks) into a
public API from semi-private.
To this end, to really encourage an ecosystem of packages, I’d like
to propose two things, in increasing order of disruptiveness.
First: Encourage extension packages. This would mean:
* Reorganize website to allow for extension packages to be displayed
prominently
* Add support for namespace packages in yt
* (possible) split out some packages from analysis_modules, including
halo finding
* Codify the process of extension package creation, including how to
set up CI and a build system for them.
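To make the namespace-package idea above concrete, here is a minimal, self-contained sketch of the PEP 420 mechanism that independently installed extension packages could rely on. The namespace name `yt_ext` and the module names `halo_finding` and `volume_rendering` are hypothetical placeholders, not actual yt names; in practice each distribution would ship its own portion of the shared namespace and pip would put them on `sys.path`.

```python
# Sketch (assumed layout, hypothetical names): two separately-installed
# distributions each contribute one module to a shared "yt_ext" namespace.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for dist, mod in [("dist_a", "halo_finding"), ("dist_b", "volume_rendering")]:
    pkg_dir = os.path.join(root, dist, "yt_ext")
    os.makedirs(pkg_dir)
    # Deliberately no __init__.py: under PEP 420 the bare directory makes
    # "yt_ext" an implicit namespace package spanning both sys.path entries.
    with open(os.path.join(pkg_dir, mod + ".py"), "w") as f:
        f.write("NAME = %r\n" % mod)
    sys.path.insert(0, os.path.join(root, dist))

# Both submodules import from the single merged namespace:
from yt_ext import halo_finding, volume_rendering
print(halo_finding.NAME, volume_rendering.NAME)
```

The point of the sketch is that neither distribution needs to know about the other, which is exactly the loose coupling the proposal is after.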
The second, more disruptive proposal:
* Split yt into subprojects. This would include spinning out the
volume rendering and some or all of the frontends, and probably the
testing infrastructure as well.
* Split further astro-specific routines into an astro extension, and
begin the process of doing this with other domains as well. (As in
the long-simmering domain context YTEP.)
I’ll invite comments from everyone, but particularly from folks who
have either not contributed to an analysis module or extension package
because of concerns that would be addressed by this, as well as from
core developers this would impact. If the thread gets too unwieldy we
may also want to table this for the next yt team meeting.
Thanks,
Matt
_______________________________________________
yt-dev mailing list
yt-dev(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org
Hi folks,
In the past we've explored doing Google Summer of Code, but without much
success. I think this year the time is right, and we should get set up to
do so. The timeline is here:
https://developers.google.com/open-source/gsoc/timeline?hl=en
We have until the 19th of February to prep our application; I don't think
we should apply as a standalone mentoring organization, but instead go in
under a larger umbrella organization such as NumFOCUS.
Is anyone willing to be a mentor? This would involve (virtual) check-ins
with students and spending a fair amount of time each week working with the
student. It can be a gamble, but the program can be quite successful for
both students and projects, and I personally think it's worth it. If
you're interested, ping me off list in the next few days and let's start
organizing.
-Matt
Hi everyone,
The NumFOCUS organization is a 501(c)(3) nonprofit dedicated to both open source
scientific software and computational science education ( numfocus.org ).
I have recently been named to their board of directors.
NumFOCUS supports many projects familiar to us here, including NumPy,
Matplotlib, Jupyter, Astropy, and Sympy, as well as rOpenSci, Data
Carpentry, Software Carpentry and Julia.
After talking this over with Britton, I would like to propose that we join
NumFOCUS through a comprehensive Fiscal Sponsorship Agreement. This
process is outlined here:
http://www.numfocus.org/apply-for-fiscal-sponsorship.html
Organizations such as NumPy and rOpenSci have detailed their reasons for
participating in this program here:
https://mail.scipy.org/pipermail/numpy-discussion/2015-October/073926.html
https://ropensci.org/blog/2014/10/01/numfocus-partnership/
Primarily, I think that this would help with our ability to exist
independently of a single investigator; grants for programs such as
workshops, project infrastructure, and so on can be managed by NumFOCUS
(which has low overhead) and can be affiliated with the project.
I think this is something that warrants discussion, and perhaps should be
talked over in person during a team meeting, but I believe that this would
be a strong step forward for us as a project and a community.
-Matt
New issue 1166: YTFieldNotFound for angular momentum
https://bitbucket.org/yt_analysis/yt/issues/1166/ytfieldnotfound-for-angula…
Joyce Lee:
I'm working with 3D simulation data (ORION) to find angular momentum vector.
http://use.yt/upload/f1926f08
http://use.yt/upload/272030ab
http://use.yt/upload/5e496e7c
I run the following:
```
#!python
import yt
file = "data.0200.3d.hdf5"
ds = yt.load(file)
sp = ds.sphere("center", (0.1, 'pc'))
L = sp.quantities.angular_momentum_vector()
```
and get the following traceback
```
#!python
/Users/joycelee/yt-x86_64/src/yt-hg/yt/data_objects/derived_quantities.pyc in __call__(self, *args, **kwargs)
66 storage = {}
67 for sto, ds in parallel_objects(chunks, -1, storage = storage):
---> 68 sto.result = self.process_chunk(ds, *args, **kwargs)
69 # Now storage will have everything, and will be done via pickling, so
70 # the units will be preserved. (Credit to Nathan for this
/Users/joycelee/yt-x86_64/src/yt-hg/yt/data_objects/derived_quantities.pyc in process_chunk(self, data, use_gas, use_particles)
458 rvals.extend([(data["all", "particle_specific_angular_momentum_%s" % axis] *
459 data["all", "particle_mass"]).sum(dtype=np.float64) \
--> 460 for axis in "xyz"])
461 rvals.append(data["all", "particle_mass"].sum(dtype=np.float64))
462 return rvals
/Users/joycelee/yt-x86_64/src/yt-hg/yt/data_objects/data_containers.pyc in __getitem__(self, key)
244 Returns a single field. Will add if necessary.
245 """
--> 246 f = self._determine_fields([key])[0]
247 if f not in self.field_data and key not in self.field_data:
248 if f in self._container_fields:
/Users/joycelee/yt-x86_64/src/yt-hg/yt/data_objects/data_containers.pyc in _determine_fields(self, fields)
513 raise YTFieldNotParseable(field)
514 ftype, fname = field
--> 515 finfo = self.ds._get_field_info(ftype, fname)
516 else:
517 fname = field
/Users/joycelee/yt-x86_64/src/yt-hg/yt/data_objects/static_output.pyc in _get_field_info(self, ftype, fname)
541 self._last_finfo = self.field_info[(ftype, fname)]
542 return self._last_finfo
--> 543 raise YTFieldNotFound((ftype, fname), self)
544
545 def _setup_classes(self):
YTFieldNotFound: Could not find field '('all', 'particle_specific_angular_momentum_x')' in data.0200.3d.hdf5.
```
New issue 1165: angular_momentum_vector derived quantity fails for an ORION dataset
https://bitbucket.org/yt_analysis/yt/issues/1165/angular_momentum_vector-de…
Nathan Goldbaum:
This is a weird one.
With the following dataset (spread over three files):
http://use.yt/upload/5e496e7c
http://use.yt/upload/99731ae6
http://use.yt/upload/ac7409e1
and the following test script:
```
#!python
import yt
file = "data.0100.3d.hdf5"
ds = yt.load(file)
sp = ds.sphere("center", (0.1, 'pc'))
L = sp.quantities.angular_momentum_vector()
```
you will get the following error:
```
#!python
Traceback (most recent call last):
File "test.py", line 7, in <module>
L = sp.quantities.angular_momentum_vector()
File "/Users/goldbaum/Documents/yt-hg/yt/data_objects/derived_quantities.py", line 72, in __call__
values[i].append(storage[key][i])
IndexError: list index out of range
```
When I step through in a debugger, it seems that the `parallel_objects` inside of the `DerivedQuantity` base class is not properly combining data from the two chunks that get processed. More specifically, the `parallel_passthrough` decorator on the `par_combine_objects` method is only returning data from the first chunk, not combining data from both chunks.
This is totally weird because it's very well tested, often-exercised code. I'm not terribly familiar with the internals of the parallel analysis interface and would appreciate another set of eyes here.
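For anyone stepping through this, here is a yt-free sketch of the pattern the traceback exercises: each chunk produces a list of partial sums, the storage dict is supposed to end up with one entry per chunk, and the reduction must fold all entries together. The data and the serial stand-in for `parallel_objects` below are made up for illustration; only the shape of the combine step mirrors the real code.

```python
# Hypothetical two-chunk dataset: each chunk carries angular momentum
# components and a mass (made-up numbers, not the ORION data).
chunks = [
    {"L": (1.0, 0.0, 0.0), "mass": 2.0},
    {"L": (0.0, 3.0, 0.0), "mass": 4.0},
]

def process_chunk(chunk):
    # Per-chunk partial sums, analogous to rvals: [Lx, Ly, Lz, total mass]
    return [chunk["L"][0], chunk["L"][1], chunk["L"][2], chunk["mass"]]

# Serial stand-in for parallel_objects: storage must hold EVERY chunk.
storage = {i: process_chunk(c) for i, c in enumerate(chunks)}

# The reduction the traceback points at: sum the i-th partial value
# across all chunks. If the combine step dropped the second chunk,
# storage would hold one entry and these sums would be wrong.
n_vals = len(storage[0])
values = [sum(storage[key][i] for key in storage) for i in range(n_vals)]
Lx, Ly, Lz, total_mass = values
print(values)
```

The reported `IndexError` is consistent with `storage` coming back with fewer (or shorter) entries than expected after the `par_combine_objects` pass, which is why the passthrough decorator is the suspicious piece.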
@brittonsmith @MatthewTurk maybe?
Hi folks,
For a long time we've used "algae," which was designed by Britton
about eight years ago, as the default colormap. This has been really
nice for "branding" yt -- if you see an algae plot, it's probably (not
definitely) made with yt. But it's also not accessible from a
colorblindness perspective. Stefan van der Walt has been giving some
really great talks lately about building a better colormap for
matplotlib (e.g., https://www.youtube.com/watch?v=xAoljeRJ3lU ) which
culminated in viridis, which is shipping in recent versions of
matplotlib and will become the default.
In support of this, he built a tool called viscm which can generate
reduced versions of colormaps to show what they would be like with
varying degrees of insensitivity to color. I've generated outputs
from viscm of three of the custom colormaps we ship with yt:
Algae: https://images.hub.yt/u/fido/m/d275d5e1-png/
Cubehelix: https://images.hub.yt/u/fido/m/8e698928-png/ (I believe
this is now also shipped with MPL)
Kamae: https://images.hub.yt/u/fido/m/e0e40efa-png/
I love algae, but it's not the best from an accessibility perspective.
I'd like to propose that we use a new default colormap. If we do
this, I see two options:
* Retain a "branding" by developing a new one either by using the
techniques used by matplotlib (or one of the maps they opted not to
use) or by modifying algae to be more accessible; looking at the
response functions, I suspect it would be reasonably possible to
modify it. (Modifying algae is my preference.)
* Use viridis (which we may then have to ship if we have older
versions of matplotlib to support)
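One of the properties viscm visualizes is that a good colormap's perceived lightness changes monotonically, which is what survives desaturation and most forms of colorblindness. As a rough illustration (using plain relative luminance rather than viscm's proper CAM-space lightness, and made-up control points rather than the real algae or viridis data):

```python
# Approximate perceived lightness with Rec. 709 relative luminance.
def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A colormap passes this rough check if lightness never decreases
# along its control points.
def is_monotonic(stops):
    lums = [luminance(c) for c in stops]
    return all(a <= b for a, b in zip(lums, lums[1:]))

# Hypothetical dark-purple -> teal -> yellow stops, viridis-like in spirit.
viridis_like = [(0.27, 0.00, 0.33), (0.13, 0.57, 0.55), (0.99, 0.91, 0.14)]
print(is_monotonic(viridis_like))
```

This is the kind of check we could run on a modified algae to see whether tweaking its response functions gets us most of the accessibility benefit while keeping the branding.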
-Matt