Over the last little while, Kacper, Andrew and I have been picking up
on some work started by Chuck Rozhon to implement OpenGL-based volume
rendering of AMR data. Kacper recorded a demo about a week ago,
although it has improved considerably even since then:
As of right now, it can do these things:
* Load up a "data source" (which can be all_data, but doesn't need to be)
* Utilize orthographic and perspective cameras
* Maximum intensity projection
* Integrated projection
* Apply colormaps to these two things, using two-pass rendering
* Trackball camera with keyboard shortcuts for moving around the domain
* Output camera information that is compatible with the software
renderer (i.e., it can be used to get a sequence of camera positions)
* Save images out
* Preliminary support for color transfer function-based VR. At
present this only supports isosurfaces that are manually encoded. It
will soon take 1D textures from the CTF object.
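The two projection modes in the list above differ in how each ray reduces its samples to a pixel value. Here is a toy NumPy sketch of the two reductions (purely illustrative data; the actual renderer does this in GLSL on the GPU):

```python
import numpy as np

# Toy 3D scalar field: a 4x4x4 grid with one bright voxel
# and a dim slab at the near plane.
field = np.zeros((4, 4, 4))
field[1, 2, 3] = 10.0
field[:, :, 0] = 1.0

# Maximum intensity projection along z: each pixel takes the
# brightest value encountered along its ray.
mip = field.max(axis=2)

# Integrated projection: each pixel accumulates the values along
# its ray, weighted by the path length per cell.
dz = 0.25  # cell width along the ray (assumed uniform)
integrated = field.sum(axis=2) * dz

print(mip[1, 2])         # 10.0 -- the bright voxel dominates
print(integrated[1, 2])  # (10.0 + 1.0) * 0.25 = 2.75
```

The contrast is the interesting part: MIP highlights the single brightest structure along each ray, while the integrated projection is sensitive to everything the ray passes through.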
The system has been designed to be very modular, with extensible
keyboard and mouse shortcuts. Kacper has even been able to build a
very lightweight Qt-based GUI around it (on BB as xarthisius/reason )
without changing much/any of the internal-to-yt code. Also, it works
reasonably well even on fairly old graphics cards for reasonably sized
data. (And since it'll accept data objects that are cutouts, this
means you could pull a sphere or block out of a gigantic dataset and
explore just that subset interactively.)
Anyway, the reason I'm writing is that I'd like to bring it to
people's attention sooner rather than later. It'll need some testing,
and we're also working to get it into a readily usable state.
As of right now, before WIP gets removed from the pull request, we're
going to add documentation (with notes that it is likely an unstable
API) and hopefully a short screencast. But before then, I would like
to invite folks to either review the PR or to test it out.
Note that this requires cyglfw3, which is accessible via pip.
I'm pretty excited about this, and the design we have been aiming for
with the way it accepts objects and shaders should enable a lot of
cool things to be done -- especially with respect to selecting data,
presenting it, etc etc.
I'd really like to see this be a part of 3.3.
Great news!! I cannot type for the excitement I have in me!!!
This year OpenAstronomy is serving as the umbrella for all these projects (and
maybe more that want to take part) for the Google Summer of Code.
Thanks to all the admins of the different organisations who helped with
the proposal, the website, and the mailing lists, and of course, to the mentors.
The current default colormap that yt uses is called "Algae," and it's been
with us for a while. Recently some of us on the yt-dev mailing list (which
is open to join!) have begun to look at alternate colormaps that have
better characteristics, particularly from an accessibility perspective.
If you have the time, it would be really, *really* helpful if you could
take this anonymous, two-question poll (only one question is mandatory!)
about which of the four candidates we've come up with you like best.
Thanks very much! The poll will be open at least a week, and I'll report
back the results here sometime after that.
The last couple weeks I’ve been thinking a lot about the future of yt.
What I’d like to propose is that we shift investment in yt as a
single, monolithic codebase into yt as a project: an ecosystem of packages.
This came out of the discussion of the extension/affiliated packages,
analysis modules, and so on. Britton has for a while been pitching
the idea [which I will poorly paraphrase here] that yt can be the
framework on top of which killer apps can be built. I think this is exactly right.
What’s holding us back in some ways from this is that yt is currently
structured as a monolithic code base, with little to no discovery of
other packages and apps and whatnot. We tried for a while to change
this with The Barn, but it ended up not quite taking off. I think the
time is right to try to change the way we think about yt to be more
about yt the Project, rather than yt the Codebase; the core codebase
is an important component of this, but not the whole of it.
Encouraging an ecosystem of packages can have a few very important benefits:
* External packages will confer greater individual credit to the
folks who develop them.
* External packages can be versioned and developed independently; the
review process can be different.
* yt’s core can be emphasized as a generic package, on top of which
astronomy analysis can be built.
* Packages can be maintained wherever, including alternate locations
such as github.
On the other hand, having packages inside the main distribution makes
discoverability much, much easier. It also enables everything to be
In The Box. And, the continuous integration and testing system is
already set up for yt. But, these are all possible to overcome -- we
can devise a strategy for adding packages to the CI system (and if
they are externally managed, they can also rely on yt as a dependency
and use whatever CI system they like!) and we can improve
discoverability by refocusing the website to enable this. I've asked
Kacper about adding new packages, and it's not as easy as it might
seem, so we may need to be careful about how that process occurs; one
possibility would be to provide servers and ready-made setups, but
have individuals do the heavy lifting. We could even have something
in the codebase that describes some packages that are available.
External packages could have much looser dependency rules, which means
they can be free to take advantage of things like OpenCL, numba, etc,
without having to add them to the primary codebase.
Synchronizing APIs and versions across extension packages may be
difficult in some particular cases, but I suspect in practice will not
be an issue, as long as we continue to have a reasonably stable
*public* API, and graduate a few things (such as .blocks) into a
public API from semi-private.
To this end of really encouraging an ecosystem of packages, I’d like
to propose two things, in increasing order of disruptiveness.
First: Encourage extension packages. This would mean:
* Reorganize website to allow for extension packages to be displayed
* Add support for namespace packages in yt
* (possible) split out some packages from analysis_modules
* Codify the process of extension package creation, including how to
set up CI and a build system for them.
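For the namespace-package item, here is a small runnable sketch of how two separately distributed extensions could share one namespace under PEP 420 implicit namespace packages (the `ytext` name and module names are purely hypothetical, not an actual yt convention):

```python
import os
import sys
import tempfile

# Two "distributions", each contributing one module to a shared
# "ytext" namespace. Crucially, neither directory contains an
# ytext/__init__.py, so Python 3.3+ treats ytext as an implicit
# namespace package spanning both sys.path entries.
root = tempfile.mkdtemp()
for dist, mod in [("dist_a", "vr"), ("dist_b", "astro")]:
    pkg = os.path.join(root, dist, "ytext")
    os.makedirs(pkg)
    with open(os.path.join(pkg, mod + ".py"), "w") as f:
        f.write("NAME = %r\n" % mod)
    sys.path.insert(0, os.path.join(root, dist))

# Both submodules import from the single merged namespace, even
# though they were "installed" by different distributions.
from ytext import vr, astro
print(vr.NAME, astro.NAME)  # vr astro
```

The point for yt is that extension authors could ship packages independently (on GitHub, PyPI, wherever) while still presenting a unified import namespace to users.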
The second, more disruptive proposal:
* Split yt into subprojects. This would include spinning out the
volume rendering and some or all of the frontends, and probably the
testing infrastructure as well.
* Split further astro-specific routines into an astro extension, and
begin the process of doing this with other domains as well. (As in
the long-simmering domain context YTEP.)
I’ll invite comments from everyone, but particularly from folks who
have either not contributed to an analysis module or extension package
because of concerns that would be addressed by this, as well as from
core developers this would impact. If the thread gets too unwieldy we
may also want to table this for the next yt team meeting.
yt-dev mailing list
Recently we refactored our setup.py file to no longer use numpy.distutils.
One of the major benefits of this change is that we can now use the
`setup_requires` and `install_requires` argument of `setup` to
automatically install yt's dependencies when someone does "pip install yt"
or "python setup.py develop" in the yt repo.
Unfortunately, that doesn't quite work right now. In particular, if cython
isn't in install_requires, the installation will fail with a somewhat
confusing error (http://paste.yt-project.org/show/6266/).
I can fix this error (but unfortunately only for "pip install -e .
/path/to/yt-repository", not for "python setup.py develop") by including
cython in our install_requires, as in my open pull request:
On slack, Kacper objected to this approach, since we should only need
cython to generate the C source for our extensions for development, not
unconditionally. I see where he is coming from, but think that making it so
"pip install yt" installs cython isn't so bad, especially given that up
until recently our "setup.py" unconditionally imported cython and would
simply bail if it wasn't installed.
I see a few paths forward here:
1. Accept pull request 2003 and be ok with "pip install yt" installing
cython and "python setup.py develop" failing when cython isn't installed or
figure out how to fix that.
2. Stop using setuptools' support for building cython extensions and
instead rely on cython to do that. This means going back to unconditionally
depending on cython in our setup.py. This will also mean we are no longer
depending on setuptools>=18.0, which was released in June 2015 and which
many users do not have installed in their python installations.
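For option 2, the common pattern is to cythonize when Cython is available and fall back to the pre-generated C sources shipped in the sdist otherwise. This is a sketch of the idea with made-up package and module names, not yt's actual setup.py:

```python
# setup.py sketch (illustrative names, not yt's real layout)
from setuptools import setup, Extension

try:
    # If Cython is installed, regenerate the C sources from .pyx files.
    from Cython.Build import cythonize
    extensions = cythonize([Extension("mypkg.fast", ["mypkg/fast.pyx"])])
except ImportError:
    # Otherwise, compile the C sources shipped in the source release.
    extensions = [Extension("mypkg.fast", ["mypkg/fast.c"])]

setup(
    name="mypkg",
    version="0.1",
    packages=["mypkg"],
    ext_modules=extensions,
)
```

The tradeoff is exactly the one described above: this makes Cython a build-time-only dependency for releases, at the cost of having to ensure the generated C files are always present and current in the sdist.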
I'd love to hear about alternate approaches, if people have strong opinions
either way, or if my understanding of the situation is incorrect.
We should come to a consensus about this before releasing yt 3.3.
Thanks for your time,
New issue 1175: fcoords and tcoords not the same shape for rays from particle datasets
This script: http://paste.yt-project.org/show/6268/
gives the following error:
File "test2.py", line 12, in <module>
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 268, in __getitem__
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 1181, in get_data
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 1201, in _generate_fields
fd = self._generate_field(field)
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 305, in _generate_field
tr = self._generate_fluid_field(field)
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 325, in _generate_fluid_field
rv = finfo(gen_obj)
File "/Users/britton/Documents/work/yt/yt-hg/yt/fields/derived_field.py", line 184, in __call__
dd = self._function(self, data)
File "/Users/britton/Documents/work/yt/yt-hg/yt/geometry/coordinates/coordinate_handler.py", line 40, in _coords
rv = data.ds.arr(data.fcoords[...,axi].copy(), units)
File "/Users/britton/Documents/work/yt/yt-hg/yt/data_objects/data_containers.py", line 1282, in fcoords
File "/Users/britton/Documents/work/yt/yt-hg/yt/geometry/geometry_handler.py", line 271, in cached_func
tr = func(self)
File "/Users/britton/Documents/work/yt/yt-hg/yt/geometry/geometry_handler.py", line 322, in fcoords
ci[ind:ind+c.shape[0], :] = c
ValueError: could not broadcast input array from shape (81,3) into shape (40,3)
This seems to be because the fcoords and tcoords for the ray are coming back with different shapes.
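The broadcast failure is easy to reproduce in isolation: an output buffer sized from one count cannot accept rows from an array with a different count. The shapes below mirror the traceback; no yt internals are involved:

```python
import numpy as np

# One code path reports 40 cells (used to preallocate the output),
# while another returns 81 rows of coordinates.
ci = np.empty((40, 3))
fcoords = np.zeros((81, 3))

try:
    # The slice silently clamps to 40 rows, so 81 rows won't fit.
    ci[0:81, :] = fcoords
except ValueError as e:
    print(e)
```

This suggests the fix is to make the ray's fcoords and tcoords report consistent cell counts for particle datasets, rather than to change the filling loop itself.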
New issue 1174: yt install script Centos 6.6
On Centos 6.6, the yt stable install_script fails with the following error
gcc -O3 -D_LARGEFILE64_SOURCE=1 -DHAVE_HIDDEN intel64 -c -o adler32.o adler32.c
gcc: intel64: No such file or directory
Error 1
Removing the stray "intel64" argument from the gcc invocation just
before the ZLIB make install makes it go away.
Adding this in case anyone has the same problem.