I wanted to give everyone an update on what's going on with the NumPy
grant. As you may have noticed, things have been moving a bit
slower than originally hoped -- unfortunately, my health, while
improving, has continued to be rocky.
Fortunately, I have awesome co-workers, and BIDS has an institutional
interest/mandate for figuring out how to make these things happen, so
after thinking it over we've decided to reorganize how we're doing
things internally and split up the work to let me focus on the core
technical/community aspects without getting overloaded. Specifically,
Fernando Pérez and Jonathan Dugan are taking on PI/administration
duties, Stéfan van der Walt will focus on handling day-to-day
management of the incoming hires, and Nelle Varoquaux & Jarrod Millman
will also be joining the team (exact details TBD).
This shouldn't really affect any of you, except that you might see
some familiar faces with @berkeley.edu emails becoming more engaged.
I'm still leading the Berkeley effort, and in any case it's still
ultimately the community and NumPy steering council who will be making
decisions about the project – this is just some internal details about
how we're planning to manage our contributions. But in the interest of
full transparency I figured I'd let you know what's happening.
In other news, the job ad to start the official hiring process has now
been submitted for HR review, so it should hopefully be up soon --
depending on how efficient the bureaucracy is. I'll definitely let
everyone know as soon as it's posted.
I'll also be giving a lunch talk at BIDS tomorrow to let folks locally
know about what's going on, which I think will be recorded – I'll send
around a link after in case others are interested.
Nathaniel J. Smith -- https://vorpus.org
Thank you all kindly for your responses! Based on your encouragement, I
will pursue an ndarray subclass / __array_ufunc__ implementation. I had
been toying with np.set_numeric_ops, which is less than ideal (for example,
np.ndarray.around segfaults if I use set_numeric_ops in any way).
A second question: very broadly speaking, how much 'pain' can I expect
trying to use an np.ndarray subclass in the broader python scientific
computing ecosystem, and is there general consensus that projects 'should'
support ndarray subclasses?
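For what it's worth, a minimal `__array_ufunc__` override on an ndarray subclass looks something like the sketch below (the class name ``LoggedArray`` and the pass-through behaviour are purely illustrative assumptions, not a recommendation):

```python
import numpy as np

class LoggedArray(np.ndarray):
    """Toy ndarray subclass (hypothetical) that intercepts every ufunc call."""

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        # Strip the subclass off the inputs, defer to the plain ndarray
        # implementation, then re-wrap so the subclass propagates.
        args = [np.asarray(x) if isinstance(x, LoggedArray) else x
                for x in inputs]
        result = getattr(ufunc, method)(*args, **kwargs)
        if isinstance(result, np.ndarray):
            return result.view(LoggedArray)
        return result

a = np.arange(3.0).view(LoggedArray)
b = np.sin(a) + 1   # both np.sin and '+' route through __array_ufunc__
print(type(b).__name__)  # LoggedArray
```

Both explicit ufunc calls and arithmetic operators dispatch through the same hook, which is the main point of the protocol.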
> We spent a *long time* sorting out the messy details of
__array_ufunc__, especially for handling interactions between different types, e.g.,
between numpy arrays, non-numpy array-like objects, builtin Python objects,
objects that override arithmetic to act in non-numpy-like ways, and of
course subclasses of all the above.
> We hope that we have it right this time, but as we wrote in the NumPy
1.13 release notes "The API is provisional, we do not yet guarantee
backward compatibility as modifications may be made pending feedback." That
said, let's give it a try!
> If any changes are necessary, I expect it would likely relate to how we
handle interactions between different types. That's where we spent the
majority of the design effort, but debate is a poor substitute for
experience. I would be very surprised if the basic cases (one argument or
two arguments of the same type) need any changes.
> Date: Fri, 27 Oct 2017 17:52:23 -0400
> From: Marten van Kerkwijk
> Hi Peter,
> When using units, if `a` is not angular (or dimensionless), I don't
> see how one could write code in which your example wouldn't fail...
> But I may be missing something, since for your example one would just
> realize that cos(ka)+i sin(ka) = exp(ika), in which case the log is
> just ika and one can avoid the whole complexity...
Sorry, I thought I replied to you but somehow it didn’t go through.
Yes, that example was a bit contrived, but it was just an example
where something like sin(x) can be meaningful even if x is dimensional
(though you much more typically see these things with log or exp).
Right before 1.12, I designed an API around an np.ndarray subclass, making
use of __array_ufunc__ to customize behavior based on structured dtype (we
come from C++ and really like operator overloading). Having seen
__array_ufunc__ featured in Travis Oliphant's Guide to NumPy: 2nd Edition,
I assumed this was the way to go. But it was removed in 1.12. Now that 1.13
has reintroduced __array_ufunc__, can I now rely on its continued
availability? I am considering using it as a base-level component in
several libraries... is this a dangerous idea?
William H. Sheffler Ph.D.
Institute for Protein Design
University of Washington
I am new to Numpy, and would like to start by translating a (badly written?) piece of MATLAB code.
What I have come up with so far is this:
j = -1  # index of the last element written so far
px = np.zeros_like(tmp_px); py = np.zeros_like(tmp_py); pz = np.zeros_like(tmp_pz)
w = np.zeros_like(tmp_w)
x = np.zeros_like(tmp_x); y = np.zeros_like(tmp_y); z = np.zeros_like(tmp_z)
for i in range(tmp_px.size):
    if tmp_px[i] > 2:
        j += 1
        px[j] = tmp_px[i]
        py[j] = tmp_py[i]
        pz[j] = tmp_pz[i]
        w[j] = tmp_w[i]
        x[j] = tmp_x[i]
        y[j] = tmp_y[i]
        z[j] = tmp_z[i]
px = px[:j+1]; py = py[:j+1]; pz = pz[:j+1]
w = w[:j+1]
x = x[:j+1]; y = y[:j+1]; z = z[:j+1]
It works, but I'm sure it's probably the most inefficient way of doing it. What would be a decent rewrite?
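One idiomatic rewrite is a boolean mask, which replaces the whole loop and the trailing slicing in one step (the sample data below is made up just to keep the snippet self-contained):

```python
import numpy as np

# Hypothetical data with the same names as in the question.
tmp_px = np.array([1.0, 3.0, 5.0, 0.5])
tmp_py = tmp_px * 2; tmp_pz = tmp_px * 3
tmp_w = tmp_px + 1
tmp_x = tmp_px - 1; tmp_y = tmp_px + 2; tmp_z = tmp_px + 3

# The mask selects the rows where the condition holds, all at once.
keep = tmp_px > 2
px, py, pz = tmp_px[keep], tmp_py[keep], tmp_pz[keep]
w = tmp_w[keep]
x, y, z = tmp_x[keep], tmp_y[keep], tmp_z[keep]

print(px)  # [3. 5.]
```

Boolean indexing copies only the selected elements, so no preallocation or index bookkeeping is needed.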
Thank you so much,
> Date: Thu, 26 Oct 2017 17:27:33 -0400
> From: Marten van Kerkwijk
> That sounds somewhat puzzling as units cannot really propagate without
> them somehow telling how they would change! (e.g., the outcome of
> sin(a) is possible only for angular units and then depends on that
> unit). But in any case, the mailing list is probably not the best case
> to discuss this - rather, I look forward to -- and will most happily
> give feedback on -- a NEP or other more detailed explanation!
So whilst it’s true that trigonometric functions only make sense for
dimensionless quantities, you might still want to compute them for
dimensional quantities for reasons of computational efficiency. Taking
your example of sin(a) in a spectral density identity:
log(cos(ka) + i sin(ka)) = k log(cos(a) + i sin(a))
so if you are computing the LHS for many k and a single a (i.e. k the
wavenumber and ka dimensionless) then you might prefer the RHS, which
actually uses sin(a).
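For the curious, the identity is easy to check numerically (the values of k and a below are arbitrary, chosen so that ka stays within the principal branch of the complex log):

```python
import numpy as np

a = 0.3                 # "angle" value; keep k*a within (-pi, pi]
k = np.arange(1, 10)    # wavenumbers; k*a <= 2.7 < pi here

# LHS: log(cos(ka) + i sin(ka)), computed for every k.
lhs = np.log(np.cos(k * a) + 1j * np.sin(k * a))
# RHS: k * log(cos(a) + i sin(a)) -- sin/cos of a evaluated only once.
rhs = k * np.log(np.cos(a) + 1j * np.sin(a))

print(np.allclose(lhs, rhs))  # True
```

Away from the principal branch the two sides differ by multiples of 2πi, which is the usual complex-log caveat.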
> > On Thu, Oct 26, 2017 at 12:11 PM, Daniele Nicolodi <daniele(a)grinta.net>
> > wrote:
> >> is there a better way to write the dot product between a stack of
> >> matrices? In my case I need to compute
> >> y = A.T @ inv(B) @ A
> >> with A a 3x1 matrix and B a 3x3 matrix, N times, with N in the few
> >> hundred thousands range. I thus "vectorize" the thing using stack of
> >> matrices, so that A is a Nx3x1 matrix and B is Nx3x3 and I can write:
> >> y = np.matmul(np.transpose(A, (0, 2, 1)), np.matmul(inv(B), A))
If you only ever multiply your matrix inverse by a single vector then
you may also wish to consider
y = np.matmul(np.transpose(A, (0, 2, 1)), np.linalg.solve(B, A))
which usually has a better prefactor (although for 3x3 it's pretty
marginal; your hardware may vary).
is there a better way to write the dot product between a stack of
matrices? In my case I need to compute
y = A.T @ inv(B) @ A
with A a 3x1 matrix and B a 3x3 matrix, N times, with N in the few
hundred thousands range. I thus "vectorize" the thing using stack of
matrices, so that A is a Nx3x1 matrix and B is Nx3x3 and I can write:
y = np.matmul(np.transpose(A, (0, 2, 1)), np.matmul(inv(B), A))
which I guess could be also written (in Python 3.6 and later):
y = np.transpose(A, (0, 2, 1)) @ inv(B) @ A
and I obtain a Nx1x1 y matrix which I can collapse to the vector I need
However, the need for the second argument of np.transpose() seems odd to
me, because all other functions handle the matrix stacking transparently.
Am I missing something? Is there a more natural matrix arrangement that
I could use to obtain the same results more naturally?
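One way to avoid spelling out the axes is ``swapaxes(-2, -1)``, which transposes the last two axes no matter how many stack dimensions precede them. The sketch below also uses ``np.linalg.solve`` instead of forming the inverse explicitly (the test data is made up; for 3x3 blocks the performance difference may be marginal):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
A = rng.standard_normal((N, 3, 1))
B = rng.standard_normal((N, 3, 3)) + 3 * np.eye(3)  # keep B well-conditioned

# swapaxes(-2, -1) transposes only the last two axes, so it works for any
# number of leading "stack" dimensions, unlike a hard-coded (0, 2, 1).
y1 = A.swapaxes(-2, -1) @ np.linalg.inv(B) @ A

# Numerically preferable: solve B x = A rather than forming inv(B).
y2 = A.swapaxes(-2, -1) @ np.linalg.solve(B, A)

print(np.allclose(y1, y2))  # True
print(y1.shape)             # (5, 1, 1)
```

Both ``inv`` and ``solve`` broadcast over the leading stack dimension, so no Python-level loop over N is needed.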
We are extremely pleased to announce the release of SciPy 1.0, 16 years
after version 0.1 saw the light of day. It has been a long, productive
journey to get here, and we anticipate many more exciting new features and
releases in the future.
Why 1.0 now?
A version number should reflect the maturity of a project - and SciPy was a
mature and stable library that has been heavily used in production settings
for a long time already. From that perspective, the 1.0 version number is
long overdue.
Some key project goals, both technical (e.g. Windows wheels and continuous
integration) and organisational (a governance structure, code of conduct and
a roadmap), have been achieved recently.
Many of us are a bit perfectionist, and therefore are reluctant to call
something "1.0" because it may imply that it's "finished" or "we are 100%
happy with it". This is normal for many open source projects; however, that
doesn't make it right. We acknowledge to ourselves that it's not perfect, and
there are some dusty corners left (that will probably always be the case).
Despite that, SciPy is extremely useful to its users, on average has high
quality code and documentation, and gives the stability and backwards
compatibility guarantees that a 1.0 label implies.
Some history and perspectives
- 2001: the first SciPy release
- 2005: transition to NumPy
- 2007: creation of scikits
- 2008: scipy.spatial module and first Cython code added
- 2010: moving to a 6-monthly release cycle
- 2011: SciPy development moves to GitHub
- 2011: Python 3 support
- 2012: adding a sparse graph module and unified optimization interface
- 2012: removal of scipy.maxentropy
- 2013: continuous integration with TravisCI
- 2015: adding Cython interface for BLAS/LAPACK and a benchmark suite
- 2017: adding a unified C API with scipy.LowLevelCallable; removal of ``scipy.weave``
- 2017: SciPy 1.0 release
**Pauli Virtanen** is SciPy's Benevolent Dictator For Life (BDFL). He says:
*Truthfully speaking, we could have released a SciPy 1.0 a long time ago, so
I'm happy we do it now at long last. The project has a long history, and
during the years it has matured also as a software project. I believe it has
well earned its merit to warrant a version number starting with unity.*
*Since its conception 15+ years ago, SciPy has largely been written by and
for scientists, to provide a box of basic tools that they need. Over time,
the set of people active in its development has undergone some rotation, and
we have evolved towards a somewhat more systematic approach to development.
Regardless, this underlying drive has stayed the same, and I think it will
also continue propelling the project forward in the future. This is all good,
since not long after 1.0 comes 1.1.*
**Travis Oliphant** is one of SciPy's creators. He says:
*I'm honored to write a note of congratulations to the SciPy developers and
the entire SciPy community for the release of SciPy 1.0. This release
represents a dream of many that has been patiently pursued by a stalwart
group of developers for nearly 2 decades. Efforts have been broad and
consistent over that time from many hundreds of people. From initial
discussions to efforts coding and packaging to documentation efforts to
extensive conference and community building, the SciPy effort has been a
global phenomenon that it has been a privilege to participate in.*
*The idea of SciPy was already in multiple people’s minds in 1997 when I
joined the Python community as a young graduate student who had just fallen
in love with the expressibility and extensibility of Python. The internet
was just starting to bring together like-minded mathematicians and
scientists in nascent electronically-connected communities. In 1998, there
was a discussion on the matrix-SIG Python mailing list with people like Paul
Barrett, Joe Harrington, Perry Greenfield, Paul Dubois, Konrad Hinsen, David
Ascher, and others. This discussion encouraged me in 1998 and 1999 to
procrastinate my PhD and spend a lot of time writing extension modules to
Python that mostly wrapped battle-tested Fortran and C-code, making it
accessible to the Python user. This work attracted the help of others like
Robert Kern, Pearu Peterson and Eric Jones who joined their efforts with
mine in 2000 so that by 2001, the first SciPy release was ready. This was
long before GitHub simplified collaboration and input from others, and the
"patch" command and email was how you helped a project improve.*
*Since that time, hundreds of people have spent an enormous amount of time
improving the SciPy library, and the community surrounding this library has
dramatically grown. I stopped being able to participate actively in
developing the SciPy library around 2010. Fortunately, at that time, Pauli
Virtanen and Ralf Gommers picked up the pace of development supported by
dozens of other contributors such as David Cournapeau, Evgeni Burovski,
Josef Perktold, and Warren Weckesser. While I have only been able to admire
the development of
SciPy from a distance for the past 7 years, I have never lost my love of the
project and the concept of community-driven development. I remain driven
even now by a desire to help sustain the development of not only the SciPy
library but many other affiliated and related open-source projects. I am
extremely pleased that SciPy is in the hands of a world-wide community of
talented developers who will ensure that SciPy remains an example of how
grass-roots, community-driven development can succeed.*
**Fernando Perez** offers a wider community perspective:
*The existence of a nascent Scipy library, and the incredible --if tiny by
today's standards-- community surrounding it, is what drew me into the
scientific Python world while still a physics graduate student in 2001.
Today, I am awed when I see these tools power everything from high school
education to the research that led to the 2017 Nobel Prize in physics.*
*Don't be fooled by the 1.0 number: this project is a mature cornerstone of
the modern scientific computing ecosystem. I am grateful for the many who
have made it possible, and hope to be able to contribute again to it in the
future. My sincere congratulations to the whole team!*
Highlights of this release
Some of the highlights of this release are:
- Major build improvements. Windows wheels are available on PyPI for the
first time, and continuous integration has been set up on Windows and OS X
in addition to Linux.
- A set of new ODE solvers and a unified interface to them
(`scipy.integrate.solve_ivp`).
- Two new trust region optimizers and a new linear programming method, with
improved performance compared to what `scipy.optimize` offered previously.
- Many new BLAS and LAPACK functions were wrapped. The BLAS wrappers are now
complete.
Upgrading and compatibility
There have been a number of deprecations and API changes in this release,
which are documented below. Before upgrading, we recommend that users check
that their own code does not use deprecated SciPy functionality (to do so,
run your code with ``python -Wd`` and check for ``DeprecationWarning`` s).
This release requires Python 2.7 or >=3.4 and NumPy 1.8.2 or greater.
This is also the last release to support LAPACK 3.1.x - 3.3.x. Moving the
lowest supported LAPACK version to >3.2.x was long blocked by Apple
Accelerate providing the LAPACK 3.2.1 API. We have decided that it's time to
either drop Accelerate or, if there is enough interest, provide shims for
functions added in more recent LAPACK versions so it can still be used.
`scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a
linkage matrix to minimize distances between adjacent leaves, was added.
N-dimensional versions of the discrete sine and cosine transforms and their
inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``.
A set of new ODE solvers have been added to `scipy.integrate`. The
convenience function `scipy.integrate.solve_ivp` allows uniform access to
all solvers. The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF``
and ``LSODA``) can also be used directly.
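As a quick illustration (a made-up toy problem, not from the release notes), the unified interface can be exercised like this:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Exponential decay dy/dt = -y with y(0) = 1; exact solution is exp(-t).
sol = solve_ivp(lambda t, y: -y, t_span=(0.0, 1.0), y0=[1.0],
                method="RK45", rtol=1e-8, atol=1e-10)

# The solver choice is just a string, so swapping e.g. "BDF" in is trivial.
print(abs(sol.y[0, -1] - np.exp(-1.0)) < 1e-6)  # True
```

The same call with ``method="Radau"`` or ``method="BDF"`` switches to an implicit solver suited for stiff problems.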
The BLAS wrappers in `scipy.linalg.blas` have been completed. Added
are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``,
``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv`` and
``*sbmv``.
Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``,
``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``,
``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added.
The function `scipy.linalg.subspace_angles` has been added to compute the
subspace angles between two matrices.
The function `scipy.linalg.clarkson_woodruff_transform` has been added.
It finds low-rank matrix approximation via the Clarkson-Woodruff Transform.
The functions `scipy.linalg.eigh_tridiagonal` and
`scipy.linalg.eigvalsh_tridiagonal`, which find the eigenvalues and
eigenvectors of tridiagonal hermitian/symmetric matrices, were added.
Support for homogeneous coordinate transforms has been added to
`scipy.ndimage.affine_transform`.
The ``ndimage`` C code underwent a significant refactoring, and is now
a lot easier to understand and maintain.
The methods ``trust-region-exact`` and ``trust-krylov`` have been added to
the function `scipy.optimize.minimize`. These new trust-region methods solve
the subproblem with higher accuracy at the cost of more Hessian
factorizations (compared to dogleg) or more matrix vector products (compared
to ncg), but usually require fewer nonlinear iterations and are able to deal
with indefinite Hessians. They seem very competitive against the other
Newton methods implemented in scipy.
`scipy.optimize.linprog` gained an interior point method. Its performance is
superior (both in accuracy and speed) to the older simplex method.
An argument ``fs`` (sampling frequency) was added to the following
functions: ``firwin``, ``firwin2``, ``firls``, and ``remez``. This makes
these functions consistent with many other functions in `scipy.signal` in
which the sampling frequency can be specified.
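For example (a hypothetical 100 Hz low-pass at fs = 1000 Hz; the equivalence below is just a sanity check of the new argument):

```python
import numpy as np
from scipy.signal import firwin

fs = 1000.0  # sampling frequency in Hz

# Before: the cutoff had to be normalized to the Nyquist frequency by hand.
taps_old = firwin(101, 100.0 / (fs / 2))

# Now: pass fs and give the cutoff directly in Hz.
taps_new = firwin(101, 100.0, fs=fs)

print(np.allclose(taps_old, taps_new))  # True
```

Passing physical units directly removes a common source of off-by-a-factor-of-two mistakes in filter design.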
`scipy.signal.freqz` has been sped up significantly for FIR filters.
Iterating over and slicing of CSC and CSR matrices is now faster.
The ``tocsr`` method of COO matrices is now several times faster.
The ``diagonal`` method of sparse matrices now takes a parameter, indicating
which diagonal to return.
A new iterative solver for large-scale nonsymmetric sparse linear systems,
`scipy.sparse.linalg.gcrotmk`, was added. It implements ``GCROT(m,k)``, a
flexible variant of ``GCROT``.
`scipy.sparse.linalg.lsmr` now accepts an initial guess, yielding
potentially faster convergence.
SuperLU was updated to version 5.2.1.
Many distance metrics in `scipy.spatial.distance` gained support for
weights.
The signatures of `scipy.spatial.distance.pdist` and
`scipy.spatial.distance.cdist` were changed to ``*args, **kwargs`` in order
to support a wider range of metrics (e.g. string-based metrics that need
extra keywords). Also, an optional ``out`` parameter was added to ``pdist``
and ``cdist`` allowing the user to specify where the resulting distance
matrix is to be stored.
The methods ``cdf`` and ``logcdf`` were added to
`scipy.stats.multivariate_normal`, providing the cumulative distribution
function of the multivariate normal distribution.
New statistical distance functions were added, namely
`scipy.stats.wasserstein_distance` for the first Wasserstein distance and
`scipy.stats.energy_distance` for the energy distance.
The following functions in `scipy.misc` are deprecated: ``bytescale``,
``fromimage``, ``imfilter``, ``imread``, ``imresize``, ``imrotate``,
``imsave``, ``imshow`` and ``toimage``. Most of those functions have
unexpected behavior (like rescaling and type casting image data without the
user asking for that). Other functions simply have better alternatives.
``scipy.interpolate.interpolate_wrapper`` and all functions in that
submodule are deprecated. This was a never finished set of wrapper functions
which is not relevant anymore.
In the future, the ``fillvalue`` of `scipy.signal.convolve2d` will be cast
directly to the dtypes of the input arrays, and it will be checked that it
is a scalar or an array with a single element.
``scipy.spatial.distance.matching`` is deprecated. It is an alias of
`scipy.spatial.distance.hamming`, which should be used instead.
The implementation of `scipy.spatial.distance.wminkowski` was based on a
wrong interpretation of the metric definition. In scipy 1.0 it has been just
deprecated in the documentation to keep retro-compatibility, but it is
recommended to use the new version of `scipy.spatial.distance.minkowski`
that implements the correct behaviour.
Positional arguments of `scipy.spatial.distance.pdist` and
`scipy.spatial.distance.cdist` should be replaced with their keyword
versions.
Backwards incompatible changes
The following deprecated functions have been removed from `scipy.stats`:
``betai``, ``chisqprob``, ``f_value``, ``histogram``, ``histogram2``,
``pdf_fromgamma``, ``signaltonoise``, ``square_of_sums``, ``ss`` and
``threshold``.
The following deprecated functions have been removed from
`scipy.stats.mstats`: ``betai``, ``f_value_wilks_lambda``,
``signaltonoise`` and ``threshold``.
The deprecated ``a`` and ``reta`` keywords have been removed from
`scipy.linalg.hessenberg`.
The deprecated functions ``sparse.csgraph.cs_graph_components`` and
``sparse.linalg.symeig`` have been removed from `scipy.sparse`.
The following deprecated keywords have been removed in
`scipy.sparse.linalg`:
``drop_tol`` from ``splu``, and ``xtype`` from ``bicg``, ``bicgstab``,
``cgs``, ``gmres``, ``qmr`` and ``minres``.
The deprecated functions ``expm2`` and ``expm3`` have been removed from
`scipy.linalg`. The deprecated keyword ``q`` was removed from
`scipy.linalg.expm`. And the deprecated submodule ``linalg.calc_lwork`` was
removed.
The deprecated functions ``C2K``, ``K2C``, ``F2C``, ``C2F``, ``F2K`` and
``K2F`` have been removed from `scipy.constants`.
The deprecated ``ppform`` class was removed from `scipy.interpolate`.
The deprecated keyword ``iprint`` was removed from
`scipy.optimize.fmin_cobyla`.
The default value for the ``zero_phase`` keyword of `scipy.signal.decimate`
has been changed to True.
The ``kmeans`` and ``kmeans2`` functions in `scipy.cluster.vq` changed the
method used for random initialization, so using a fixed random seed will
not necessarily produce the same results as in previous versions.
`scipy.special.gammaln` does not accept complex arguments anymore.
The deprecated functions ``sph_jn``, ``sph_yn``, ``sph_jnyn``, ``sph_in``,
``sph_kn``, and ``sph_inkn`` have been removed. Users should instead use
the functions ``spherical_jn``, ``spherical_yn``, ``spherical_in``, and
``spherical_kn``. Be aware that the new functions have different
signatures.
The cross-class properties of `scipy.signal.lti` systems have been removed.
The following properties/setters have been removed:
Name - (accessing/setting has been removed) - (setting has been removed)
* StateSpace - (``num``, ``den``, ``gain``) - (``zeros``, ``poles``)
* TransferFunction - (``A``, ``B``, ``C``, ``D``, ``gain``) - (``zeros``, ``poles``)
* ZerosPolesGain - (``A``, ``B``, ``C``, ``D``, ``num``, ``den``) - ()
``signal.freqz(b, a)`` with ``b`` or ``a`` >1-D raises a ``ValueError``.
This was a corner case for which it was unclear that the behavior was well
defined.
The method ``var`` of `scipy.stats.dirichlet` now returns a scalar rather
than an ndarray when the length of alpha is 1.
SciPy now has a formal governance structure. It consists of a BDFL (Pauli
Virtanen) and a Steering Committee. See the governance document for details.
It is now possible to build SciPy on Windows with MSVC + gfortran!
Continuous integration has been set up for this build configuration on
Appveyor, building against OpenBLAS.
Continuous integration for OS X has been set up on TravisCI.
The SciPy test suite has been migrated from ``nose`` to ``pytest``.
``scipy/_distributor_init.py`` was added to allow redistributors of SciPy to
add custom code that needs to run when importing SciPy (e.g. checks for
hardware, DLL search paths, etc.).
Support for PEP 518 (specifying build system requirements) was added - see
``pyproject.toml`` in the root of the SciPy repository.
In order to have consistent function names, the function
``scipy.linalg.solve_lyapunov`` is renamed to
`scipy.linalg.solve_continuous_lyapunov`. The old name is kept for
backwards compatibility.
* @arcady +
* @xoviat +
* Anton Akhmerov
* Dominic Antonacci +
* Alessandro Pietro Bardelli
* Ved Basu +
* Michael James Bedford +
* Ray Bell +
* Juan M. Bello-Rivas +
* Sebastian Berg
* Felix Berkenkamp
* Jyotirmoy Bhattacharya +
* Matthew Brett
* Jonathan Bright
* Bruno Jiménez +
* Evgeni Burovski
* Patrick Callier
* Mark Campanelli +
* CJ Carey
* Robert Cimrman
* Adam Cox +
* Michael Danilov +
* David Haberthür +
* Andras Deak +
* Philip DeBoer
* Anne-Sylvie Deutsch
* Cathy Douglass +
* Dominic Else +
* Guo Fei +
* Roman Feldbauer +
* Yu Feng
* Jaime Fernandez del Rio
* Orestis Floros +
* David Freese +
* Adam Geitgey +
* James Gerity +
* Dezmond Goff +
* Christoph Gohlke
* Ralf Gommers
* Dirk Gorissen +
* Matt Haberland +
* David Hagen +
* Charles Harris
* Lam Yuen Hei +
* Jean Helie +
* Gaute Hope +
* Guillaume Horel +
* Franziska Horn +
* Yevhenii Hyzyla +
* Vladislav Iakovlev +
* Marvin Kastner +
* Mher Kazandjian
* Thomas Keck
* Adam Kurkiewicz +
* Ronan Lamy +
* J.L. Lanfranchi +
* Eric Larson
* Denis Laxalde
* Gregory R. Lee
* Felix Lenders +
* Evan Limanto
* Julian Lukwata +
* François Magimel
* Syrtis Major +
* Charles Masson +
* Nikolay Mayorov
* Tobias Megies
* Markus Meister +
* Roman Mirochnik +
* Jordi Montes +
* Nathan Musoke +
* Andrew Nelson
* M.J. Nichol
* Juan Nunez-Iglesias
* Arno Onken +
* Nick Papior +
* Dima Pasechnik +
* Ashwin Pathak +
* Oleksandr Pavlyk +
* Stefan Peterson
* Ilhan Polat
* Andrey Portnoy +
* Ravi Kumar Prasad +
* Aman Pratik
* Eric Quintero
* Vedant Rathore +
* Tyler Reddy
* Joscha Reimer
* Philipp Rentzsch +
* Antonio Horta Ribeiro
* Ned Richards +
* Kevin Rose +
* Benoit Rostykus +
* Matt Ruffalo +
* Eli Sadoff +
* Pim Schellart
* Nico Schlömer +
* Klaus Sembritzki +
* Nikolay Shebanov +
* Jonathan Tammo Siebert
* Scott Sievert
* Max Silbiger +
* Mandeep Singh +
* Michael Stewart +
* Jonathan Sutton +
* Deep Tavker +
* Martin Thoma
* James Tocknell +
* Aleksandar Trifunovic +
* Paul van Mulbregt +
* Jacob Vanderplas
* Aditya Vijaykumar
* Pauli Virtanen
* James Webber
* Warren Weckesser
* Eric Wieser +
* Josh Wilson
* Zhiqing Xiao +
* Evgeny Zhurko
* Nikolay Zinov +
* Zé Vinícius +
A total of 121 people contributed to this release.
People with a "+" by their names contributed a patch for the first time.
This list of names is automatically generated, and may not be fully
complete.