[Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy
rsoklaski at gmail.com
Sun Apr 18 12:10:13 EDT 2021
I am excited to announce the release of MyGrad 2.0.
MyGrad's primary goal is to make automatic differentiation accessible and
easy to use across the NumPy ecosystem (see the note below for more
detailed comments).
MyGrad's only dependency is NumPy, and as of version 2.0 it makes keen use
of NumPy's excellent protocols for overriding functions and ufuncs
(__array_function__ and __array_ufunc__). Thus you can drop a MyGrad tensor
into your pure NumPy code and compute derivatives through it.
Ultimately, MyGrad could be extended to bring autodiff to other array-based
libraries like CuPy, Sparse, and Dask.
For full release notes see . Feedback, critiques, and ideas are welcome!
Note: MyGrad is not meant to "compete" with the likes of PyTorch and JAX,
which are fantastically fast and powerful autodiff libraries. Rather, its
emphasis is on being lightweight and seamless to use in NumPy-centric
workflows.