[SciPy-User] Large memory usage while doing median filter

Joe P Ninan indiajoe at gmail.com
Sun May 10 15:26:29 EDT 2015


Hi,
I was trying median_filter from scipy.ndimage.filters
on a 1024x1024 array.

What I noticed is that the memory requirement grows really fast as the
size of the median filter increases.
On a machine with 6 GB of RAM I could only run a filter of size (150, 150).
Anything above that gives a MemoryError.

On a bigger server I could see it take about 16 GB of RAM with a filter
size of (200, 200).
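
For reference, here is a minimal sketch of what I am running (the random
array below just stands in for my actual image data):

import numpy as np
from scipy.ndimage.filters import median_filter

# Random data standing in for the actual 1024x1024 image.
image = np.random.random((1024, 1024))

# A (150, 150) filter is roughly the largest that runs in 6 GB of RAM here;
# a (200, 200) filter needed about 16 GB on a bigger server.
filtered = median_filter(image, size=(150, 150))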

I can understand the computation time increasing with the size of the
filter, but why does the memory usage explode with the size of the median
filter? Is this expected behaviour?

-cheers
joe
-- 
/---------------------------------------------------------------
"GNU/Linux: because a PC is a terrible thing to waste" -  GNU Generation

************************************************
Joe Philip Ninan
Research Scholar
DAA,  TIFR,
Mumbai, India.
Ph: +917738438212
------------------------------------------------------------
Website: www.tifr.res.in/~ninan/
My GnuPG Public Key: www.tifr.res.in/~ninan/JPN_public.key