Several issues and PRs have proposed enhancements to the multivariate normal distribution (e.g. gh-8999, gh-9921, gh-9942, gh-11053, gh-11772, gh-12184, gh-15675), but the functionality has remained essentially untouched for at least six years.
gh-16002 begins to make progress toward these issues by introducing a `Covariance` class and public subclasses (e.g. `CovViaPrecision`) that allow the user to provide the desired decomposition of the covariance matrix instead of the covariance matrix itself. After this and planned follow-up PRs are merged, the user will be able to specify a covariance matrix via its diagonal elements, Cholesky decomposition, LDL decomposition, eigenvalue decomposition, or inverse. (Of course, many other representations are possible - e.g. sparse decompositions, tridiagonal matrices, etc.)

Advantages include better treatment of singular covariance matrices, avoiding repeated decomposition of the same matrix, and more efficient computation of the Mahalanobis distance. The `Covariance` classes implement the essential methods and properties to separate the logic of these computations from the `multivariate_normal` class, so they can be reused in other multivariate distributions (e.g. `multivariate_t`, `wishart`, `invwishart`) and directly in user code. This also sets the stage for efficient broadcasting over n-dimensional shape parameter arrays (e.g. stacks of covariance matrices and means).
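As an illustration of the kind of computation such classes can factor out (this is only a NumPy sketch of the underlying idea, not the API proposed in the PR), note that when the user supplies the precision matrix, the squared Mahalanobis distance can be evaluated from a single factorization of it, without ever forming or decomposing the covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
C = A @ A.T + 3 * np.eye(3)   # a well-conditioned covariance matrix
P = np.linalg.inv(C)          # precision matrix, as the user might supply it

# Factor P once (P = L @ L.T); reuse L for every subsequent evaluation.
L = np.linalg.cholesky(P)

def mahalanobis_sq(x, mu):
    """Squared Mahalanobis distance (x - mu)^T P (x - mu) via the factor of P."""
    y = (x - mu) @ L          # whitened residual
    return y @ y

x = rng.standard_normal(3)
mu = np.zeros(3)

# Agrees with the direct (and more expensive, if repeated) formula.
direct = (x - mu) @ P @ (x - mu)
assert np.isclose(mahalanobis_sq(x, mu), direct)
```

Caching the factor like this is what lets a `Covariance`-style object avoid repeated decomposition when the same matrix is used for many density evaluations.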
Please join the discussion in gh-16002 (https://github.com/scipy/scipy/pull/16002) if you're interested. We'll leave the PR open for feedback for at least a week.