[Numpy-discussion] how much does binary size matter?
jtaylor.debian at googlemail.com
Fri Apr 26 15:13:22 EDT 2019
We understand that it can be a burden; a larger binary is of course bad,
but that downside usually comes with a benefit, like better performance
or more features. How much of a burden is it, and where is the line
between "it takes twice as long to download, which is just annoying" and
"I cannot use it anymore because, for example, it no longer fits on my
device"?
Are there actual environments, or do you know of any, where the size of
the NumPy binary determines whether it can be deployed at all, or where
it is preferable for NumPy to be small rather than fast or full of
features?
This is of interest to us mainly for judging how to handle marginal
improvements that come with relatively large increases in binary size.
With some use-case information we can better estimate where it is
worthwhile to consider alternatives, or to spend more benchmarking work
determining the most important cases, and where it is not.
If such environments exist, there are options other than blocking or
complicating future enhancements: for example, adding a compile-time
option to make the binary smaller again, e.g. by stripping out
hardware-specific code or avoiding size-expensive optimizations.
But without concrete use cases this does not appear to be something
worth spending time on.
On 26.04.19 11:47, Éric Depagne wrote:
> Le vendredi 26 avril 2019, 11:10:56 SAST Ralf Gommers a écrit :
> Hi Ralf,
>> Right now a wheel is 16 MB. If we increase that by 10%/50%/100% - are we
>> causing a real problem for someone?
> Access to large bandwidth is not universal at all, and in many countries (I'd
> even say in most countries around the world), 16 MB is a significant
> amount of data, so increasing it is a burden.
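For anyone who wants to check the footprint on their own machine, here is a
minimal sketch that sums the on-disk size of the installed NumPy package
(the exact figure will differ from the compressed wheel size and depends on
platform and NumPy version):

```python
import os
import numpy

# Locate the installed numpy package directory.
pkg_dir = os.path.dirname(numpy.__file__)

# Sum the sizes of all files under the package tree.
total = 0
for root, _dirs, files in os.walk(pkg_dir):
    for name in files:
        total += os.path.getsize(os.path.join(root, name))

print("numpy %s installed size: %.1f MB" % (numpy.__version__, total / 1e6))
```

Note this measures the unpacked install, which is typically larger than the
wheel you download, since the wheel is zip-compressed.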