[Numpy-discussion] Loading large NIfTI file -> MemoryError
jtaylor.debian at googlemail.com
Tue Dec 31 08:29:42 EST 2013
On 31.12.2013 14:13, Amira Chekir wrote:
> Hello everyone,
> I am trying to load a large NIfTI file (dMRI from the Human Connectome
> Project, about 1 GB) with NiBabel:
> import nibabel as nib
> img = nib.load("dmri.nii.gz")
> data = img.get_data()
> The program crashes during "img.get_data()" with a "MemoryError"
> (my machine has 4 GB of RAM).
> Any suggestions?
Are you using a 64-bit operating system?
Which version of numpy?
Assuming nibabel uses numpy's file loading under the hood, you could try
numpy 1.8, which reduces excess memory usage when loading compressed files.
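A quick way to check both of the questions above — a minimal sketch, assuming a standard CPython install alongside numpy:

```python
# Diagnostics for the MemoryError: is this Python build 64-bit,
# and which numpy version is installed?
import struct

import numpy as np

# Pointer size in bits: 64 on a 64-bit interpreter, 32 on a 32-bit one.
# A 32-bit process is typically limited to 2-4 GB of address space,
# which a ~1 GB compressed volume can easily exhaust once decompressed.
bits = struct.calcsize("P") * 8
print("Python interpreter: %d-bit" % bits)
print("numpy version:", np.__version__)
```

If the interpreter turns out to be 64-bit and memory is still tight, decompressing the file to a plain .nii first may also help, since nibabel can memory-map uncompressed NIfTI images instead of reading the whole array into RAM.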