Very interesting, thanks for the link. I think the notion of computational cost should be included here, particularly with 7. A numerical simulation analysis is, fundamentally, something that takes a random seed and a code revision number and produces a plot. You can store intermediate data products, but each additional step introduces the possibility of error, version skew, etc., and incurs the costs of data storage, archival, and migration. Perhaps a high-level caching layer would be useful in an analysis tool that also acted as a workflow manager...
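To make the idea concrete, here is a minimal sketch of what such a caching layer might look like: a decorator that keys each stage's output on the stage name, the random seed, and the code revision, so a stale intermediate product is never silently reused. All names here (`cached_stage`, `reduce_data`, the cache directory) are hypothetical, not part of yt or any existing tool.

```python
import hashlib
import os
import pickle

CACHE_DIR = "analysis_cache"  # hypothetical cache location

def cached_stage(func):
    """Cache a pipeline stage's result on disk, keyed on the stage
    name, the random seed, the code revision, and any extra args."""
    def wrapper(seed, revision, *args):
        key = hashlib.sha1(
            repr((func.__name__, seed, revision, args)).encode()
        ).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".pkl")
        if os.path.exists(path):
            # Same seed + same revision: safe to reuse the product.
            with open(path, "rb") as f:
                return pickle.load(f)
        result = func(seed, revision, *args)
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "wb") as f:
            pickle.dump(result, f)
        return result
    return wrapper

@cached_stage
def reduce_data(seed, revision):
    # Stand-in for an expensive reduction step.
    return {"seed": seed, "revision": revision, "mean": 3.14}
```

A change in either the seed or the revision produces a new key, which addresses the version-skew concern: old products stay on disk but are never matched against new code.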
Only those stages

Douglas Rudd
Scientific Computing Consultant
Research Computing Center
email@example.com
On Oct 25, 2013, at 8:54 AM, Matthew Turk firstname.lastname@example.org wrote:
Titus Brown just posted this to the SWCarpentry discussion list:
I thought people here might be interested. It occurs to me that in yt, these rules are things we have attempted to support without codifying them, and we could do a better job of supporting them. In particular, I think rules 5 and 7 are where we have the most room to improve.
As an example:
- FRBs are difficult to store
- Underlying slices/projections/etc are difficult to store (the raw data is not, but the intermediate products are)
- Profiles are not easily saved
I think these will improve over time as we split the viz layer further from the data and make the data more easily accessible through the viz. But it's something to think about. And the article is a good read, too!
-Matt
_______________________________________________
yt-dev mailing list
email@example.com
http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org