[AstroPy] Tests for AstroPy (was POLL: vision for a common Astronomy package)

Erik Tollerud erik.tollerud at gmail.com
Wed Jul 6 15:29:56 EDT 2011


We (the Coordinating Committee) are finalizing a draft coding
standards document that we will soon send out to the list.  This does
indeed include unit tests/regression tests, although the way we've
phrased it is that tests are "encouraged for all public
classes/methods/functions".  The idea is that we don't want a strict
requirement, but we (the committee) reserve the right not to accept
packages that are not tested enough (where "enough" of course depends
on how critical a given component is).

As for the testing framework, we were thinking of nose
(http://somethingaboutorange.com/mrl/projects/nose/1.0.0/).  Nose is
very nice for a project like this because it's easy to run and easy to
write tests with, while remaining compatible with the stdlib unittest
module.
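
To make that concrete, here is a minimal sketch of what a nose-discoverable test module could look like (the module and function names are illustrative, not from any actual astropy package):

```python
# test_angles.py -- a sketch of a nose-style test module.
# nose collects plain functions named test_*, so no class
# boilerplate is required, but stdlib TestCases work too.
import math
import unittest

def test_degrees_to_radians():
    # A bare function test: nose finds and runs this automatically.
    assert abs(math.radians(180.0) - math.pi) < 1e-12

class TestAngleRoundTrip(unittest.TestCase):
    # A stdlib unittest.TestCase is picked up unchanged by nose.
    def test_round_trip(self):
        self.assertAlmostEqual(math.degrees(math.radians(57.3)), 57.3)
```

Both styles can live in the same file, which is part of why nose lowers the barrier for contributors coming from either background.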

I also think that the dependency requirements for the tests need not
be as strict as for the package itself (the vision does *not* restrict
the requirements for running tests).  Requiring an additional
dependency or two is not a huge burden (especially with the setuptools
'tests_require' option Erik points out).  And if we end up using
external libraries for comparison (e.g. SOFA, as the other thread
mentions), which is probably a very good idea, that will be a much
bigger pain anyway.  That's another reason to go with nose: it's easy
to switch tests or groups of tests on and off (e.g. if the user
doesn't have SOFA installed).
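
A sketch of how such a conditional skip could look, using the unittest-style skip decorators that nose honors (the "pysofa" module name here is hypothetical, standing in for whatever SOFA wrapper we end up with):

```python
# Sketch: gate comparison tests on an optional dependency, so they
# are reported as skipped (not failed) where it is not installed.
import importlib
import unittest

def optional_module(name):
    """Import and return a module, or None if it is not installed."""
    try:
        return importlib.import_module(name)
    except ImportError:
        return None

# "pysofa" is a hypothetical name for a SOFA wrapper module.
pysofa = optional_module("pysofa")

class TestAgainstSOFA(unittest.TestCase):
    @unittest.skipUnless(pysofa is not None, "SOFA wrapper not installed")
    def test_matches_reference(self):
        pass  # compare our results against SOFA's here
```

With this pattern, users without the extra dependency still get a clean test run, and the skip shows up explicitly in the report rather than silently vanishing.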


On Wed, Jul 6, 2011 at 8:35 AM, Victoria G. Laidler <laidler at stsci.edu> wrote:
> Ole Streicher wrote:
>> BTW, are there ideas for regression tests of astropy? Especially in the
>> case you mentioned, this would be *very* helpful to check that
>> everything works in the target system.
> I was thinking about this as well. I suggest that there be minimum
> testing standards for astropy packages. Packages should come with tests
> that:
> - can be run using a standard syntax (what?)
> - verify that all significant elements of the package can successfully
> execute
> - verify that all significant elements of the package, for some minimum
> set of common use cases, give the right answer when run
> - come with documentation that in some way describes test coverage (how
> much of the system the tests cover) and the source of the test answers
> (i.e., how the "right answer" used in the tests was determined)
>
> This last point is especially important if we want to get people to
> adopt astropy libraries in place of their current favorite tool. In this
> case, testing serves as a confidence builder: astropy package X is at
> least as good as OtherPackage Y.
>
> I'm using two phrases in the above that are meant to have a basic
> definition but also allow some wiggle room for package developers:
> - all significant elements of the package: the basic parts that most
> people who use the package at all will use (similar to the dependency
> discussion)
> - minimum set of common use cases: the typical use cases that most
> people will encounter
>
> This allows a package to be included in astropy without requiring the
> developer to provide an exhaustive set of tests that verify correct
> behavior in every possible corner of the domain space. This is
> especially important for software whose behavior is data-dependent. Of
> course, more tests are better, and would produce better coverage.
>
> Vicki Laidler
>
>
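
On Vicki's "standard syntax (what?)" question: if we adopt nose, a uniform invocation falls out of it for free. A sketch of what that could look like (this assumes nose 1.x is installed; the subpackage names are illustrative):

```shell
# Run the whole astropy test suite from the package root:
nosetests astropy

# Run only one subpackage's tests:
nosetests astropy.coords

# Verbose mode shows skipped tests (e.g. when SOFA is absent):
nosetests -v astropy
```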



-- 
Erik Tollerud


