[Neuroimaging] noise estimation in non-diffusion datasets
Samuel St-Jean
stjeansam at gmail.com
Tue Aug 1 07:33:31 EDT 2017
I unfortunately don't think it would work here, since it seems you need a
good idea of what to build the linear model against (such as the large
signal drops in DWI due to vibration, or the fMRI HRF). Another issue is
that linear modelling will push into the error term anything that does not
fit your model, which includes many more sources of error than just the
(thermal) noise arising from the raw magnitude measurements. Simple linear
models (without the generalised part), like the ones I planned to use, also
assume normally distributed residuals, so they would likely overestimate
the noise level on magnitude data.
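
To illustrate the magnitude data part (a purely synthetic numpy toy, not
tied to any real scan or to any particular estimator): the magnitude of
complex Gaussian noise follows a Rician distribution (Rayleigh in the
background), so the residuals are neither zero-mean nor Gaussian, and
estimators built on that assumption end up biased.

import numpy as np

# Purely synthetic background voxels with complex Gaussian noise; nothing
# here is tied to a real scan or to any specific estimator.
rng = np.random.default_rng(42)
sigma = 10.0
real = rng.normal(0, sigma, size=100_000)
imag = rng.normal(0, sigma, size=100_000)
magnitude = np.sqrt(real**2 + imag**2)  # Rayleigh (Rician with zero signal)

print("true sigma:", sigma)
# The magnitude has a non-zero mean of sigma * sqrt(pi / 2), so residuals
# around a zero-signal model are not zero-mean.
print("mean of magnitude:", magnitude.mean())
# The naive standard deviation is off as well...
print("naive std of magnitude:", magnitude.std())
# ...unless the Rayleigh correction factor sqrt(2 - pi/2) is applied.
print("corrected sigma:", magnitude.std() / np.sqrt(2 - np.pi / 2))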
There is also the fact that I actually want to use a linear model to
predict my data, and I was planning to use the noise estimate as part of a
regulariser while fitting the coefficients, so it would lead to a kind of
circular problem in my case.
Well anyway, if you (or anyone else reading this) have a suggestion for a
model-based approach to use on non-fMRI data (I have no idea if one even
exists, as plain structural something-weighted MRI data might not follow
any particular experimental model), it could also give me ideas; so far I
was just going for a plain good old matrix of random image patches, as
usual.
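
For reference, here is roughly what I mean by the matrix of random patches,
as a minimal numpy sketch: it relies on the usual assumption that the
weakest principal component of the patch matrix is dominated by noise
rather than structure, and the patch size, the number of patches and the
synthetic test volume are arbitrary choices rather than a reference
implementation.

import numpy as np

def random_patch_matrix(data, patch_size=3, n_patches=1000, seed=0):
    """Stack randomly located 3D patches as rows of a 2D matrix."""
    rng = np.random.default_rng(seed)
    ps = patch_size
    x = rng.integers(0, data.shape[0] - ps, n_patches)
    y = rng.integers(0, data.shape[1] - ps, n_patches)
    z = rng.integers(0, data.shape[2] - ps, n_patches)
    return np.array([data[i:i + ps, j:j + ps, k:k + ps].ravel()
                     for i, j, k in zip(x, y, z)])

def sigma_from_patches(patches):
    """Rough noise level from the weakest principal component of the patches."""
    centered = patches - patches.mean(axis=0)
    # Smallest singular value of the centered patch matrix, assumed to be
    # dominated by noise rather than anatomical structure.
    s = np.linalg.svd(centered, compute_uv=False)
    return s[-1] / np.sqrt(patches.shape[0] - 1)

# Arbitrary synthetic volume just to show the call, not real data.
data = np.random.default_rng(1).normal(100, 10, size=(64, 64, 30))
patches = random_patch_matrix(data)
# This will slightly underestimate sigma for a finite number of patches.
print(sigma_from_patches(patches))

In practice I would restrict the patches to background or low-variance
regions and correct for the Rician bias as above, but that part depends on
the dataset.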
2017-07-31 13:39 GMT+02:00 Ariel Rokem <arokem at gmail.com>:
> Hi Samuel,
>
> On Sat, Jul 29, 2017 at 10:10 PM, Samuel St-Jean <stjeansam at gmail.com>
> wrote:
>
>> Hello,
>>
>> I've been trying out some stuff recently with another guy on 'legacy'
>> datasets, which comprise CT scans and things like T1w at 0.8x0.8x8 mm.
>> Unsurprisingly, they also have huge intensity gradients, since these
>> things date back to when I started high school. Anyway, they also have
>> diffusion data, and noise estimation works kind of OK there, but the
>> other weightings and modalities are mostly a no go (or could perhaps be
>> done much better) using the diffusion tools we have now.
>>
>> So far I've also tried an AONLM-like noise estimator, but I was wondering
>> if people working with all types of datasets (I am mostly a diffusion MRI
>> person in the first place) had suggestions about good or commonly used
>> noise estimators and where to find them? The most likely candidates for
>> that would be the fMRI folks, I'd guess, and of course those scans did
>> not come with noise maps back in the day.
>>
>>
> You may be interested in this model-based approach:
>
> http://journal.frontiersin.org/article/10.3389/fnins.2013.00247/full
>
> It's somewhat related to this DWI method:
> https://pdfs.semanticscholar.org/8e89/03f8092c27d159879a3a2429a771d21b7be8.pdf
>
> Cheers,
>
> Ariel
>
>
>> Thanks for the help and pointers,
>>
>> Samuel
>>
>>