[Neuroimaging] Nibabel: Slowdown in traversing object using dataobj
szorowi1 at gmail.com
Wed Nov 30 09:10:48 EST 2016
Uncompressing absolutely did the trick. Thank you!
On Wed, Nov 30, 2016 at 6:16 AM, paul mccarthy <pauldmccarthy at gmail.com> wrote:
> If you need to stick with ``.nii.gz``, you could use my indexed_gzip module.
> The first access will be slow, but subsequent accesses much faster.
>> On 30 November 2016 at 00:57, Matthew Brett <matthew.brett at gmail.com> wrote:
>> On Tue, Nov 29, 2016 at 4:52 PM, Sam Zorowitz <szorowi1 at gmail.com> wrote:
>> > Hi all,
>> > Hopefully a quick question: assuming a 4D volume image, can someone
>> > explain why it takes longer to load from memory the 100th acquisition
>> > than the 1st acquisition?
>> > For example, I have an image that is (110, 110, 63, 977). When I run:
>> >>> %timeit obj.dataobj[..., 0]
>> > I get: 10 loops, best of 3: 37.8 ms per loop
>> >>> %timeit obj.dataobj[..., 100]
>> > I get: 1 loop, best of 3: 5.99 s per loop
>> > Why is this? Can someone recommend an alternative?
>> I'm guessing this is a ``.nii.gz`` file? If so, then the difference
>> is just because nibabel has to gunzip 100 volumes' worth of data in the
>> latter case.
>> If you can get away with an uncompressed file, my prediction is that
>> you'll find the difference will go away.
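A minimal stdlib sketch (not from the thread; the file name and sizes are made up for illustration) of why this happens: a gzip stream can only be decompressed sequentially, so reading data at a large offset forces decompression of everything before it, while reading near the start is nearly free.

```python
import gzip
import os
import tempfile

# ~1 MB of dummy data standing in for image volumes.
payload = bytes(range(256)) * 4096

# Write it out compressed, as a .nii.gz would be.
with tempfile.NamedTemporaryFile(suffix=".gz", delete=False) as tmp:
    path = tmp.name
with gzip.open(path, "wb") as f:
    f.write(payload)

with gzip.open(path, "rb") as f:
    # Seeking forward in a gzip stream is not random access: the
    # gzip module must decompress everything up to the target
    # offset, just as nibabel must when slicing a late volume.
    f.seek(len(payload) - 16)
    tail = f.read(16)

assert tail == payload[-16:]
os.remove(path)
```

The same cost model explains the timings above: ``dataobj[..., 0]`` only needs the first volume decompressed, while ``dataobj[..., 100]`` must decompress the preceding 100 volumes first.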
>> Neuroimaging mailing list
>> Neuroimaging at python.org