From badar.almarri at uconn.edu Sat Apr 2 22:50:02 2016
From: badar.almarri at uconn.edu (Badar Almarri)
Date: Sat, 2 Apr 2016 22:50:02 -0400
Subject: [Neuroimaging] nibabel Installation
Message-ID:

Hello,

I am trying to install nibabel on OS X with Python 2.7.6 and numpy 1.8. I installed it via pip, and it only works if I sudo. Although I can see it in the list of modules, when I import it, it says there is no module with this name. Any advice? Thanks

Best,
Badar

From matthew.brett at gmail.com Sat Apr 2 23:03:02 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sat, 2 Apr 2016 20:03:02 -0700
Subject: [Neuroimaging] nibabel Installation
In-Reply-To: References: Message-ID:

Hi,

On Sat, Apr 2, 2016 at 7:50 PM, Badar Almarri wrote:
> Hello,
>
> I am trying to install nibabel on OS X with Python 2.7.6 and numpy 1.8. I
> installed it via pip, and it only works if I sudo. Although I can see it in
> the list of modules, when I import it, it says there is no module with this
> name. Any advice? Thanks

It sounds like you are using the Python that comes with OS X - I'd strongly advise you to use homebrew Python or Python downloaded from Python.org instead - see [1].

Either way, I would also suggest you always use the ``--user`` flag to pip, to install into your user directories instead of the system directories. Something like:

    pip install --user numpy nibabel

Best,

Matthew

[1] https://github.com/MacPython/wiki/wiki/Which-Python

From badar.almarri at uconn.edu Sun Apr 3 00:45:37 2016
From: badar.almarri at uconn.edu (Badar Almarri)
Date: Sun, 3 Apr 2016 00:45:37 -0400
Subject: [Neuroimaging] nibabel Installation
In-Reply-To: References: Message-ID:

Hi Matthew,

I went right away with the --user flag suggestion because it made a lot of sense to me. And it worked. Thanks

Best,
Badar

Badar Almarri
Graduate Student
Dept. of Computer Science and Engr.
University of Connecticut
badar.almarri at uconn.edu

On Sat, Apr 2, 2016 at 11:03 PM, Matthew Brett wrote:
> Hi,
>
> On Sat, Apr 2, 2016 at 7:50 PM, Badar Almarri wrote:
> > Hello,
> >
> > I am trying to install nibabel on OS X with Python 2.7.6 and numpy 1.8. I
> > installed it via pip, and it only works if I sudo. Although I can see it in
> > the list of modules, when I import it, it says there is no module with this
> > name. Any advice? Thanks
>
> It sounds like you are using the Python that comes with OS X - I'd
> strongly advise you to use homebrew Python or Python downloaded from
> Python.org instead - see [1].
>
> Either way, I would also suggest you always use the ``--user`` flag to
> pip, to install into your user directories instead of the system
> directories.
>
> Something like:
>
>     pip install --user numpy nibabel
>
> Best,
>
> Matthew
>
> [1] https://github.com/MacPython/wiki/wiki/Which-Python
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging

From matthew.brett at gmail.com Wed Apr 6 22:04:05 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Wed, 6 Apr 2016 19:04:05 -0700
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
Message-ID:

Hi,

Yarik currently has a PR into nibabel that needs `collections.Counter`, which is only available in Python >= 2.7 [1]:

https://github.com/nipy/nibabel/pull/437

At the moment, nibabel supports Python 2.6, but Python 2.6 was released in 2008 [2], isn't maintained any more [3], and numpy is dropping Python 2.6 in the next release [4]. So, I propose that we drop Python 2.6 support for nibabel as well. Any objections?
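For reference, `collections.Counter` was added to the standard library in Python 2.7, which is what ties the PR above to Python >= 2.7. A quick illustration (the counted values here are made up):

```python
# collections.Counter is stdlib in Python >= 2.7; it does not exist in 2.6.
from collections import Counter

# Count occurrences of some (made-up) file extensions.
counts = Counter(["nii", "nii.gz", "nii", "mgz"])
print(counts.most_common(1))  # [('nii', 2)]
print(counts["nii.gz"])       # 1
```

On 2.6 the nearest stdlib equivalent would be a `defaultdict(int)` plus a manual loop, which is the kind of workaround dropping 2.6 avoids.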
Cheers,

Matthew

[1] https://github.com/nipy/nibabel/pull/437
[2] https://www.python.org/download/releases/2.6/
[3] https://www.python.org/download/releases/2.6.9/
[4] https://github.com/numpy/numpy/blob/master/doc/release/1.12.0-notes.rst#dropped-support

From jbpoline at gmail.com Wed Apr 6 22:45:33 2016
From: jbpoline at gmail.com (JB Poline)
Date: Wed, 6 Apr 2016 19:45:33 -0700
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
In-Reply-To: References: Message-ID:

Sounds most reasonable to me.
JB

On Wed, Apr 6, 2016 at 7:04 PM, Matthew Brett wrote:
> Hi,
>
> Yarik currently has a PR into nibabel that needs `collections.Counter`,
> available in Python >= 2.7 [1]
>
> https://github.com/nipy/nibabel/pull/437
>
> At the moment, nibabel supports Python 2.6, but Python 2.6 was
> released in 2008 [2], isn't maintained any more [3], and numpy is
> dropping Python 2.6 in the next release [4]. So, I propose that we
> drop Python 2.6 support for nibabel as well. Any objections?
>
> Cheers,
>
> Matthew
>
> [1] https://github.com/nipy/nibabel/pull/437
> [2] https://www.python.org/download/releases/2.6/
> [3] https://www.python.org/download/releases/2.6.9/
> [4] https://github.com/numpy/numpy/blob/master/doc/release/1.12.0-notes.rst#dropped-support

From dagutman at gmail.com Wed Apr 6 22:53:09 2016
From: dagutman at gmail.com (David Gutman)
Date: Thu, 07 Apr 2016 02:53:09 +0000
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
In-Reply-To: References: Message-ID:

Agreed -- I think 8 years is a fairly good support period...
I haven't made the switch to the Python 3.0+ series, but I'd be surprised if there were a lot of systems still using 2.6. I think the main challenge might be people using very old versions of RedHat/CentOS that may still be dependent on 2.6, since yum depends on it.

On Wed, Apr 6, 2016 at 10:46 PM JB Poline wrote:
> Sounds most reasonable to me.
> JB
>
> On Wed, Apr 6, 2016 at 7:04 PM, Matthew Brett wrote:
>
>> Hi,
>>
>> Yarik currently has a PR into nibabel that needs `collections.Counter`,
>> available in Python >= 2.7 [1]
>>
>> https://github.com/nipy/nibabel/pull/437
>>
>> At the moment, nibabel supports Python 2.6, but Python 2.6 was
>> released in 2008 [2], isn't maintained any more [3], and numpy is
>> dropping Python 2.6 in the next release [4]. So, I propose that we
>> drop Python 2.6 support for nibabel as well. Any objections?
>>
>> Cheers,
>>
>> Matthew
>>
>> [1] https://github.com/nipy/nibabel/pull/437
>> [2] https://www.python.org/download/releases/2.6/
>> [3] https://www.python.org/download/releases/2.6.9/
>> [4] https://github.com/numpy/numpy/blob/master/doc/release/1.12.0-notes.rst#dropped-support

From arokem at gmail.com Wed Apr 6 23:04:14 2016
From: arokem at gmail.com (Ariel Rokem)
Date: Wed, 6 Apr 2016 20:04:14 -0700
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
In-Reply-To: References: Message-ID:

Besides, (at least one of) the core Python dev team asks nicely:
http://www.snarky.ca/stop-using-python-2-6

On Wed, Apr 6, 2016 at 7:53 PM, David Gutman wrote:
> Agreed -- I think 8 years is a fairly good support period...
I haven't > made the switch to the python 3.0+ series, but I'd be surprised if there > were a lot of systems still using 2.6 I think the main challenge might > be people using very old versions of REDHAT/Centos that may still? be > dependent on 2.6 since YUM is dependent on it. > > > > On Wed, Apr 6, 2016 at 10:46 PM JB Poline wrote: > >> Sounds most reasonable to me. >> JB >> >> On Wed, Apr 6, 2016 at 7:04 PM, Matthew Brett >> wrote: >> >>> Hi, >>> >>> Yarik current has a PR into nibabel that needs `collections.Counter`, >>> available in Python >= 2.7 [1] >>> >>> https://github.com/nipy/nibabel/pull/437 >>> >>> At the moment, nibabel supports Python 2.6, but Python 2.6 was >>> released in 2008 [2] isn't maintained any more [3], and numpy is >>> dropping Python 2.6 in the next release [4]. So, I propose that we >>> drop Python 2.6 support for nibabel as well. Any objections? >>> >>> Cheers, >>> >>> Matthew >>> >>> [1] https://github.com/nipy/nibabel/pull/437 >>> [2] https://www.python.org/download/releases/2.6/ >>> [3] https://www.python.org/download/releases/2.6.9/ >>> [4] >>> https://github.com/numpy/numpy/blob/master/doc/release/1.12.0-notes.rst#dropped-support >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Thu Apr 7 01:50:27 2016 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 7 Apr 2016 07:50:27 +0200 Subject: [Neuroimaging] Drop Python 2.6 for nibabel? 
In-Reply-To: References: Message-ID: <20160407055027.GO1204357@phare.normalesup.org>

Redhat/Centos is indeed the problem. However, they are holding back the whole ecosystem, and it's coming at quite a cost.

+1 for dropping Py 2.6.

Gaël

On Thu, Apr 07, 2016 at 02:53:09AM +0000, David Gutman wrote:
> Agreed -- I think 8 years is a fairly good support period... I haven't made
> the switch to the Python 3.0+ series, but I'd be surprised if there were a lot
> of systems still using 2.6. I think the main challenge might be people using
> very old versions of RedHat/CentOS that may still be dependent on 2.6, since
> yum depends on it.
> On Wed, Apr 6, 2016 at 10:46 PM JB Poline wrote:
> Sounds most reasonable to me.
> JB
> On Wed, Apr 6, 2016 at 7:04 PM, Matthew Brett wrote:
> Hi,
> Yarik currently has a PR into nibabel that needs `collections.Counter`,
> available in Python >= 2.7 [1]
> https://github.com/nipy/nibabel/pull/437
> At the moment, nibabel supports Python 2.6, but Python 2.6 was
> released in 2008 [2], isn't maintained any more [3], and numpy is
> dropping Python 2.6 in the next release [4]. So, I propose that we
> drop Python 2.6 support for nibabel as well. Any objections?
> Cheers,
> Matthew
> [1] https://github.com/nipy/nibabel/pull/437
> [2] https://www.python.org/download/releases/2.6/
> [3] https://www.python.org/download/releases/2.6.9/
> [4] https://github.com/numpy/numpy/blob/master/doc/release/1.12.0-notes.rst#dropped-support

--
Gael Varoquaux
Researcher, INRIA Parietal
NeuroSpin/CEA Saclay, Bat 145, 91191 Gif-sur-Yvette France
Phone: ++ 33-1-69-08-79-68
http://gael-varoquaux.info            http://twitter.com/GaelVaroquaux

From njs at pobox.com Thu Apr 7 02:15:26 2016
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 6 Apr 2016 23:15:26 -0700
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
In-Reply-To: <20160407055027.GO1204357@phare.normalesup.org>
References: <20160407055027.GO1204357@phare.normalesup.org>
Message-ID:

On Wed, Apr 6, 2016 at 10:50 PM, Gael Varoquaux wrote:
> Redhat/Centos is indeed the problem. However, they are holding back the
> whole ecosystem, and it's coming at quite a cost.

See also a Red Hat employee's opinion on this:
http://www.curiousefficiency.org/posts/2015/04/stop-supporting-python26.html

-n

--
Nathaniel J. Smith -- https://vorpus.org

From gael.varoquaux at normalesup.org Thu Apr 7 02:22:53 2016
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Thu, 7 Apr 2016 08:22:53 +0200
Subject: [Neuroimaging] Drop Python 2.6 for nibabel?
In-Reply-To: References: <20160407055027.GO1204357@phare.normalesup.org>
Message-ID: <20160407062253.GJ1291128@phare.normalesup.org>

On Wed, Apr 06, 2016 at 11:15:26PM -0700, Nathaniel Smith wrote:
> See also a Red Hat employee's opinion on this:
> http://www.curiousefficiency.org/posts/2015/04/stop-supporting-python26.html

Despite all the technical solutions listed in that blog post, most cluster sysadmins don't support Python 2.7 (because they run a vanilla CentOS/RH), and most cluster users suffer. So I find parts of the blog post a bit hypocritical. Users are held hostage in a tug-of-war between sysadmins and developers.

Gael

From garyfallidis at gmail.com Sat Apr 9 18:08:07 2016
From: garyfallidis at gmail.com (Eleftherios Garyfallidis)
Date: Sat, 09 Apr 2016 22:08:07 +0000
Subject: [Neuroimaging] [dipy] Fitting diffusion models in the absence of S0 signal
In-Reply-To: References: Message-ID:

Hi Rafael and Ariel,

Apologies for the delay in answering here. I think we need to set up a hangout to discuss this.

One thing that may be important for this discussion is that the function

from dipy.core.gradients import gradient_table

has a parameter called b0_threshold. This can be set to 300 or higher, in which case the b=300 volumes will be treated as the b0s. So, if the datasets don't have b=0 but do have b=300, those can be used instead.

This means that just by changing b0_threshold the datasets can be fit in different ways. Could it be that the easier solution is to call the gradient table in a different way (with a different b0 threshold) rather than changing the API?

I will look now at the free water implementation to understand the different issues better.

Cheers,
Eleftherios

p.s. Please give me your availability for a design hangout during the week.

On Fri, Mar 25, 2016 at 11:14 AM Ariel Rokem wrote:
> Hi Rafael,
>
> On Thu, Mar 24, 2016 at 4:12 AM, Rafael Henriques wrote:
>> Hi Eleftherios,
>>
>> What can we do if the data don't have b0s?
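A minimal numpy sketch of the b0_threshold behaviour Eleftherios describes above (this is not dipy's implementation, and the b-values are made up for illustration):

```python
import numpy as np

def b0s_mask(bvals, b0_threshold=50):
    """Treat every volume with b <= b0_threshold as a 'b0' volume."""
    return np.asarray(bvals) <= b0_threshold

# Made-up acquisition: no true b=0 volumes, lowest shell acquired at b=300.
bvals = np.array([300, 300, 1000, 1000, 2000])

print(b0s_mask(bvals))                    # default threshold finds no b0s
print(b0s_mask(bvals, b0_threshold=300))  # b=300 volumes now act as b0s
```

With the threshold raised to 300, any code that keys off the b0 mask would use the b=300 volumes in place of the missing b=0 ones, which is the workaround being discussed.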
>> In the last years, everyone was including b0 data in their DWI
>> acquisitions. However, nowadays some groups are starting to acquire
>> diffusion image volumes with low b-values (e.g. 300 s.mm-2) instead
>> of the b0 volumes. They are doing this to ensure that when fitting
>> diffusion models they do not take perfusion confounding effects into
>> account. So my question is - what can we do to generalize Dipy for
>> these cases? My suggestion is to always include S0 as a model parameter,
>> so even if users do not have b0 data, the model can easily give the
>> extrapolated non-perfusion-affected S0 signal.
>
> My example code was not really that great for demonstrating this point. I
> have now updated the notebook so that it works with data that has a b=0
> measurement, but also with data that doesn't (you'll need to change the
> commented-out line in cell 3 to see both options).
>
> I also have two alternative implementations, following Eleftherios'
> suggestions (I think):
>
> https://gist.github.com/arokem/508dc1b22bdbd0bdd748
>
> In one implementation an estimate of S0 (`S0_hat`) is part of the
> TensorFit object (I think that's what Eleftherios is suggesting). In the
> other implementation, the estimate is part of the TensorModel.fit function
> (as you suggest).
>
> The main disadvantage of alternative 1 is that we would have to pass the
> data again into a method of the `TensorFit` object. The main disadvantage
> of alternative 2 is that it requires a change to the `TensorFit.__init__`
> API. My own tendency is to prefer the change to the `TensorFit.__init__`
> API, because I don't think that people are using that API on its own; they
> are typically getting their `TensorFit` objects from the `TensorModel.fit`
> function.
>
> I think that passing the data into the `TensorFit` object again would
> not only be error-prone, but also less efficient.
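The extrapolated S0 (`S0_hat`) being discussed can be sketched as the intercept of a log-linear fit to the signal-by-b-value curve. This is a toy mono-exponential example with made-up numbers, not dipy's tensor fit:

```python
import numpy as np

# Made-up ground truth; note no b=0 volume is ever "acquired".
true_S0, true_adc = 1000.0, 1.5e-3
bvals = np.array([300.0, 500.0, 1000.0, 2000.0])
signal = true_S0 * np.exp(-bvals * true_adc)

# Log-linear least squares: log(S) = log(S0) - b * ADC.
design = np.column_stack([np.ones_like(bvals), -bvals])
coef, *_ = np.linalg.lstsq(design, np.log(signal), rcond=None)

S0_hat = np.exp(coef[0])  # extrapolated S0, although b=0 was never measured
adc_hat = coef[1]
print(S0_hat, adc_hat)
```

In the noise-free case the fit recovers `true_S0` and `true_adc` exactly; with real (noisy, perfusion-contaminated) data the intercept is only an estimate, which is why the thread debates where that estimate should live in the API.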
> > Importantly, this is not just a matter for people who use the prediction > API to see that the model fits the data, but also an issue for fitting > models that depend on the DTI model, such as the new FWE DTI model. > > Cheers, > > Ariel > > > > >> Also, how can you recover the S0 information using the line that you >> are suggested? If params only have the diffusion tensor information, >> that line will always be equal to 1, right? Am I missing something >> here? > > Best, >> Rafael >> >> >> > Hi Ariel, >> > >> > Apologies for delaying to answer. >> > >> > What I understand is that now the fit_model is doing the prediction for >> the >> > S0. Am I correct? >> > You recreate a predicted S0 inside fit_model but fit_model is about >> fitting >> > and not about predicting. >> > >> > I am not comfortable to changing fit_model to generate two parameters >> > (params and S0). >> > >> > This command can be called inside the predict method >> > S0 = np.mean(np.exp(np.dot(dm, params))[..., gtab.b0s_mask]) >> > >> > So, for me there is no reason of changing the init method of TensorFit. >> > >> > I hope I am not missing something. >> > Let me know if this suggestion is helpful. >> > >> > Cheers, >> > Eleftherios >> > >> > On Sun, Mar 20, 2016 at 12:04 PM, Ariel Rokem >> wrote: >> > >> >> Hi everyone, >> >> >> >> Thought I would re-raise this. Anyone have any thoughts here? Would a >> PR >> >> against the DTI and DKI modules be more helpful to clarify? >> >> >> >> Cheers, >> >> >> >> Ariel >> >> >> >> On Sat, Mar 5, 2016 at 3:04 AM, Ariel Rokem >> wrote: >> >> >> >>> >> >>> On Thu, Mar 3, 2016 at 7:28 AM, Eleftherios Garyfallidis < >> >>> garyfallidis at gmail.com> wrote: >> >>> >> >>>> Sorry your suggestion is not exactly clear. Can you give show us how >> the >> >>>> code will look with your proposal? Also, apart from DTI and DKI what >> other >> >>>> models will be affected from this changes. 
Is this a change >> suggested only >> >>>> for DTI and DKI or will affect all or most reconstruction models? >> >>>> >> >>>> >> >>> First of all, to answer your last question: this will certainly affect >> >>> DTI and DKI, and there will be other models to follow. For example the >> >>> FWDTI that Rafael is currently proposing in that PR. The idea would >> be to >> >>> also more tightly integrate these three models (and future >> extensions... >> >>> !), so that we can remove some of the redundancies that currently >> exist. We >> >>> could make this a part of the base.Reconst* methods - it might apply >> to >> >>> other models as well (e.g. CSD, SFM, etc). But that's part of what I >> would >> >>> like to discuss here. >> >>> >> >>> As for code, for now, here's a sketch of what this would look like for >> >>> the tensor model: >> >>> >> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >> >>> >> >>> Note that though it changes the prediction API a bit, not much else >> would >> >>> have to change. In particular, all the code that relies on there >> being 12 >> >>> model parameters will still be intact, because S0 doesn't go into the >> model >> >>> parameters. >> >>> >> >>> What do you think? Am I missing something big here? Or should I go >> ahead >> >>> and start working on a PR implementing this? >> >>> >> >>> Thanks! >> >>> >> >>> Ariel >> >>> >> >>> >> >>> >> >>>> On Mon, Feb 29, 2016 at 11:53 AM, Ariel Rokem >> wrote: >> >>>> >> >>>>> Hi everyone, >> >>>>> >> >>>>> In Rafael's recent PR implementing free-water-eliminated DTI ( >> >>>>> https://github.com/nipy/dipy/pull/835), we had a little bit of a >> >>>>> discussion about the use of the non-diffusion weighted signal (S0). >> As >> >>>>> pointed out by Rafael, in the absence of an S0 in the measured >> data, for >> >>>>> some models, that can be derived from the model fit ( >> >>>>> https://github.com/nipy/dipy/pull/835#issuecomment-183060855). 
>> >>>>> >> >>>>> I think that we would like to support using data both with and >> without >> >>>>> S0. On the other hand, I don't think that we should treat the >> derived S0 as >> >>>>> a model parameter, because in some cases, we want to provide S0 as >> an input >> >>>>> (for example, when predicting back the signal for another >> measurement, with >> >>>>> a different ). In addition, it would be hard to incorporate that >> into the >> >>>>> model_params variable of the TensorFit object, while maintaining >> backwards >> >>>>> compatibility of the TensorModel/TensorFit and derived classes >> (e.g., DKI). >> >>>>> >> >>>>> My proposal is to have an S0 property for ReconstFit objects. When >> this >> >>>>> is calculated from the model (e.g. in DTI), it gets set by the >> `fit` method >> >>>>> of the ReconstModel object. When it isn't, it can be set from the >> data. >> >>>>> Either way, it can be over-ridden by the user (e.g., for the >> purpose of >> >>>>> predicting on a new data-set). This might change the behavior of the >> >>>>> prediction code slightly, but maybe that is something we can live >> with? >> >>>>> >> >>>>> Happy to hear what everyone thinks, before we move ahead with this. 
>> >>>>> >> >>>>> Cheers, >> >>>>> >> >>>>> Ariel >> >>>>> >> >>>>> >> >>>>> >> >>>>> >> >>>>> >> >>>>> >> >>>>> _______________________________________________ >> >>>>> Neuroimaging mailing list >> >>>>> Neuroimaging at python.org >> >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >> >>>>> >> >>>>> >> >>>>> >> >>>> _______________________________________________ >> >>>> Neuroimaging mailing list >> >>>> Neuroimaging at python.org >> >>>> https://mail.python.org/mailman/listinfo/neuroimaging >> >>>> >> >>>> >> >>> >> >> >> >> _______________________________________________ >> >> Neuroimaging mailing list >> >> Neuroimaging at python.org >> >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sun Apr 10 12:05:30 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sun, 10 Apr 2016 09:05:30 -0700 Subject: [Neuroimaging] [dipy]Fitting diffusion models in the absence of S0 signal In-Reply-To: References: Message-ID: Hi Eleftherios, On Sat, Apr 9, 2016 at 3:08 PM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > > Hi Rafael and Ariel, > > Apologies for delaying to answer here. I think we need to set a hangout to > discuss about this. > > One thing that maybe important for this discussion is that the function > > from dipy.core.gradients import gradient_table > > has a parameter called b0_threshold. > > This can be set to be 300 or higher and then the b0 will be considered > as the one at 300. So, if the datasets don't have b=0 but b=300 these can > be used instead. 
> > This means that just by changing the b0_threshold the datasets can be fit > in a different ways. > Could be that the actual easier solution is to call the gradient table in > a different way (different > b0 threshold) rather than changing the API? > > No - unfortunately I don't think this would be a solution to this particular concern. That is because what we need is really the value of the intercept of the signal-by-b-value curve. If we have an S0 measurement, we take that, but if we don't we need to *estimate* it. I will look now to the free water implementation to understand better the > different issues. > Yeah - some of the recent commits are about this particular issue (e.g., https://github.com/nipy/dipy/pull/835/commits/89c09c7e4095309c9d7ae42eee313a4fd1f9c880), but note changes that follow (!). Maybe Rafael can also help explain. > Cheers, > Eleftherios > > p.s. Please give me your availability for a design hangout during the week. > You can (always) find my availability here: https://www.google.com/calendar/embed?src=arokem%40gmail.com&ctz=America/Los_Angeles Cheers, Ariel > > > On Fri, Mar 25, 2016 at 11:14 AM Ariel Rokem wrote: > >> Hi Rafael, >> >> On Thu, Mar 24, 2016 at 4:12 AM, Rafael Henriques >> wrote: >> >>> Hi Eleftherios, >>> >>> What can we do if the data don't have b0s? >>> In the last years, everyone was including the b0 data in their DWI >>> acquisitions. However, nowadays some groups are starting to acquire >>> diffusion volume of images with low b-values (e.g. 300 s.mm-2) instead >>> of the b0 volumes. They are doing this to insure that when fitting >>> diffusion models they do not take into account Perfusion confounding >>> effects. So my question is - what can we do to generalize Dipy for >>> these cases? My suggestion is to include S0 always as model parameter, >>> so even if users do not have b0 data, the model can easily give the >>> extrapolated non-perfusion effected S0 signal. 
>>> >> >> My example code was not really that great to demonstrate this point. I >> have now updated the notebook so that it works with data that has a b=0 >> measurement, but also with data that doesn't (you'll need to change the >> commented out line in cell 3 to see both options). >> >> I also have two alternative implementations, following Eleftherios' >> suggestions (I think): >> >> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >> >> In one implementation an estimate of S0 (`S0_hat`) is part of the >> TensorFit object (I think that's what Eleftherios is suggesting). In the >> other implementation, the estimate is part of the TensorModel.fit function >> (as you suggest). >> >> The main disadvantage of alternative 1 is that we would have to pass the >> data again into a method of the `TensorFit` object. The main disadvantage >> of alternative 2 is that it requires a change to the `TensorFit.__init__` >> API. My own tendency is to prefer this change to the `TensorFit.__init__` >> API, because I don't think that people are using that API on its own, but >> are typically getting their `TensorFit` objects from the `TensorModel.fit` >> function. >> >> I think that passing the data in again into the `TensorFit` object will >> not only be error-prone, but is also not as efficient. >> >> Importantly, this is not just a matter for people who use the prediction >> API to see that the model fits the data, but also an issue for fitting >> models that depend on the DTI model, such as the new FWE DTI model. >> >> Cheers, >> >> Ariel >> >> >> >> >>> Also, how can you recover the S0 information using the line that you >>> are suggested? If params only have the diffusion tensor information, >>> that line will always be equal to 1, right? Am I missing something >>> here? >> >> Best, >>> Rafael >>> >>> >>> > Hi Ariel, >>> > >>> > Apologies for delaying to answer. >>> > >>> > What I understand is that now the fit_model is doing the prediction >>> for the >>> > S0. 
Am I correct? >>> > You recreate a predicted S0 inside fit_model but fit_model is about >>> fitting >>> > and not about predicting. >>> > >>> > I am not comfortable to changing fit_model to generate two parameters >>> > (params and S0). >>> > >>> > This command can be called inside the predict method >>> > S0 = np.mean(np.exp(np.dot(dm, params))[..., gtab.b0s_mask]) >>> > >>> > So, for me there is no reason of changing the init method of TensorFit. >>> > >>> > I hope I am not missing something. >>> > Let me know if this suggestion is helpful. >>> > >>> > Cheers, >>> > Eleftherios >>> > >>> > On Sun, Mar 20, 2016 at 12:04 PM, Ariel Rokem >>> wrote: >>> > >>> >> Hi everyone, >>> >> >>> >> Thought I would re-raise this. Anyone have any thoughts here? Would a >>> PR >>> >> against the DTI and DKI modules be more helpful to clarify? >>> >> >>> >> Cheers, >>> >> >>> >> Ariel >>> >> >>> >> On Sat, Mar 5, 2016 at 3:04 AM, Ariel Rokem >>> wrote: >>> >> >>> >>> >>> >>> On Thu, Mar 3, 2016 at 7:28 AM, Eleftherios Garyfallidis < >>> >>> garyfallidis at gmail.com> wrote: >>> >>> >>> >>>> Sorry your suggestion is not exactly clear. Can you give show us >>> how the >>> >>>> code will look with your proposal? Also, apart from DTI and DKI >>> what other >>> >>>> models will be affected from this changes. Is this a change >>> suggested only >>> >>>> for DTI and DKI or will affect all or most reconstruction models? >>> >>>> >>> >>>> >>> >>> First of all, to answer your last question: this will certainly >>> affect >>> >>> DTI and DKI, and there will be other models to follow. For example >>> the >>> >>> FWDTI that Rafael is currently proposing in that PR. The idea would >>> be to >>> >>> also more tightly integrate these three models (and future >>> extensions... >>> >>> !), so that we can remove some of the redundancies that currently >>> exist. We >>> >>> could make this a part of the base.Reconst* methods - it might apply >>> to >>> >>> other models as well (e.g. CSD, SFM, etc). 
But that's part of what I >>> would >>> >>> like to discuss here. >>> >>> >>> >>> As for code, for now, here's a sketch of what this would look like >>> for >>> >>> the tensor model: >>> >>> >>> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>> >>> >>> >>> Note that though it changes the prediction API a bit, not much else >>> would >>> >>> have to change. In particular, all the code that relies on there >>> being 12 >>> >>> model parameters will still be intact, because S0 doesn't go into >>> the model >>> >>> parameters. >>> >>> >>> >>> What do you think? Am I missing something big here? Or should I go >>> ahead >>> >>> and start working on a PR implementing this? >>> >>> >>> >>> Thanks! >>> >>> >>> >>> Ariel >>> >>> >>> >>> >>> >>> >>> >>>> On Mon, Feb 29, 2016 at 11:53 AM, Ariel Rokem >>> wrote: >>> >>>> >>> >>>>> Hi everyone, >>> >>>>> >>> >>>>> In Rafael's recent PR implementing free-water-eliminated DTI ( >>> >>>>> https://github.com/nipy/dipy/pull/835), we had a little bit of a >>> >>>>> discussion about the use of the non-diffusion weighted signal >>> (S0). As >>> >>>>> pointed out by Rafael, in the absence of an S0 in the measured >>> data, for >>> >>>>> some models, that can be derived from the model fit ( >>> >>>>> https://github.com/nipy/dipy/pull/835#issuecomment-183060855). >>> >>>>> >>> >>>>> I think that we would like to support using data both with and >>> without >>> >>>>> S0. On the other hand, I don't think that we should treat the >>> derived S0 as >>> >>>>> a model parameter, because in some cases, we want to provide S0 as >>> an input >>> >>>>> (for example, when predicting back the signal for another >>> measurement, with >>> >>>>> a different ). In addition, it would be hard to incorporate that >>> into the >>> >>>>> model_params variable of the TensorFit object, while maintaining >>> backwards >>> >>>>> compatibility of the TensorModel/TensorFit and derived classes >>> (e.g., DKI). 
>>> >>>>> >>> >>>>> My proposal is to have an S0 property for ReconstFit objects. When >>> this >>> >>>>> is calculated from the model (e.g. in DTI), it gets set by the >>> `fit` method >>> >>>>> of the ReconstModel object. When it isn't, it can be set from the >>> data. >>> >>>>> Either way, it can be over-ridden by the user (e.g., for the >>> purpose of >>> >>>>> predicting on a new data-set). This might change the behavior of >>> the >>> >>>>> prediction code slightly, but maybe that is something we can live >>> with? >>> >>>>> >>> >>>>> Happy to hear what everyone thinks, before we move ahead with this. >>> >>>>> >>> >>>>> Cheers, >>> >>>>> >>> >>>>> Ariel >>> >>>>> >>> >>>>> >>> >>>>> >>> >>>>> >>> >>>>> >>> >>>>> >>> >>>>> _______________________________________________ >>> >>>>> Neuroimaging mailing list >>> >>>>> Neuroimaging at python.org >>> >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>>>> >>> >>>>> >>> >>>>> >>> >>>> _______________________________________________ >>> >>>> Neuroimaging mailing list >>> >>>> Neuroimaging at python.org >>> >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>>> >>> >>>> >>> >>> >>> >> >>> >> _______________________________________________ >>> >> Neuroimaging mailing list >>> >> Neuroimaging at python.org >>> >> https://mail.python.org/mailman/listinfo/neuroimaging >>> >> >>> >> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nkowalczyk2 at gmail.com Sun Apr 10 12:21:53 2016 From: nkowalczyk2 at gmail.com (Natalia Kowalczyk) Date: Sun, 10 Apr 2016 18:21:53 +0200 Subject: [Neuroimaging] [dipy] Message-ID: Hi, I am writing to you because I am having problems installing DIPY. I have followed your instructions on the website http://nipy.org/dipy/installation.html. I have: Windows 7, 64-bit; MinGW 32-bit (also tried 64-bit); Anaconda 32-bit (also tried 64-bit). I have also tried the solution of creating a file called 'pydistutils.cfg' in Notepad and giving it the suggested contents. Here are the logs: Natalia at Natalia-rf711PC MINGW32 ~ $ pip install dipy Collecting dipy Using cached dipy-0.11.0.tar.gz Requirement already satisfied (use --upgrade to upgrade): scipy>=0.9 in c:\users\natalia\anaconda3\lib\site-packages (from dipy) Requirement already satisfied (use --upgrade to upgrade): nibabel>=1.2.0 in c:\users\natalia\anaconda3\lib\site-packages (from dipy) Building wheels for collected packages: dipy Running setup.py bdist_wheel for dipy: started Running setup.py bdist_wheel for dipy: finished with status 'error' Complete output from command C:\Users\Natalia\Anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" bdist_wheel -d C:\Users\Natalia\AppData\Local\Temp\tmp5gfkjdjvpip-wheel- --python-tag cp35: running bdist_wheel running build running build_py creating build creating build\lib.win32-3.5 creating build\lib.win32-3.5\dipy copying dipy\info.py -> build\lib.win32-3.5\dipy copying dipy\pkg_info.py -> build\lib.win32-3.5\dipy copying dipy\__init__.py -> build\lib.win32-3.5\dipy creating build\lib.win32-3.5\dipy\tests copying dipy\tests\scriptrunner.py -> build\lib.win32-3.5\dipy\tests copying dipy\tests\test_scripts.py -> build\lib.win32-3.5\dipy\tests copying dipy\tests\__init__.py ->
build\lib.win32-3.5\dipy\tests creating build\lib.win32-3.5\dipy\align copying dipy\align\imaffine.py -> build\lib.win32-3.5\dipy\align copying dipy\align\imwarp.py -> build\lib.win32-3.5\dipy\align copying dipy\align\metrics.py -> build\lib.win32-3.5\dipy\align copying dipy\align\reslice.py -> build\lib.win32-3.5\dipy\align copying dipy\align\scalespace.py -> build\lib.win32-3.5\dipy\align copying dipy\align\streamlinear.py -> build\lib.win32-3.5\dipy\align copying dipy\align\__init__.py -> build\lib.win32-3.5\dipy\align creating build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_crosscorr.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_expectmax.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_imaffine.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_imwarp.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_metrics.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_parzenhist.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_reslice.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_scalespace.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_streamlinear.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_sumsqdiff.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_transforms.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\test_vector_fields.py -> build\lib.win32-3.5\dipy\align\tests copying dipy\align\tests\__init__.py -> build\lib.win32-3.5\dipy\align\tests creating build\lib.win32-3.5\dipy\core copying dipy\core\geometry.py -> build\lib.win32-3.5\dipy\core copying dipy\core\gradients.py -> build\lib.win32-3.5\dipy\core copying dipy\core\graph.py -> build\lib.win32-3.5\dipy\core copying dipy\core\histeq.py -> build\lib.win32-3.5\dipy\core copying dipy\core\ndindex.py -> build\lib.win32-3.5\dipy\core 
copying dipy\core\onetime.py -> build\lib.win32-3.5\dipy\core copying dipy\core\optimize.py -> build\lib.win32-3.5\dipy\core copying dipy\core\profile.py -> build\lib.win32-3.5\dipy\core copying dipy\core\rng.py -> build\lib.win32-3.5\dipy\core copying dipy\core\sphere.py -> build\lib.win32-3.5\dipy\core copying dipy\core\sphere_stats.py -> build\lib.win32-3.5\dipy\core copying dipy\core\subdivide_octahedron.py -> build\lib.win32-3.5\dipy\core copying dipy\core\__init__.py -> build\lib.win32-3.5\dipy\core creating build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_geometry.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_gradients.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_graph.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_ndindex.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_optimize.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_sphere.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\test_subdivide_octahedron.py -> build\lib.win32-3.5\dipy\core\tests copying dipy\core\tests\__init__.py -> build\lib.win32-3.5\dipy\core\tests creating build\lib.win32-3.5\dipy\direction copying dipy\direction\peaks.py -> build\lib.win32-3.5\dipy\direction copying dipy\direction\probabilistic_direction_getter.py -> build\lib.win32-3.5\dipy\direction copying dipy\direction\__init__.py -> build\lib.win32-3.5\dipy\direction creating build\lib.win32-3.5\dipy\direction\tests copying dipy\direction\tests\test_peaks.py -> build\lib.win32-3.5\dipy\direction\tests copying dipy\direction\tests\test_prob_direction_getter.py -> build\lib.win32-3.5\dipy\direction\tests copying dipy\direction\tests\__init__.py -> build\lib.win32-3.5\dipy\direction\tests creating build\lib.win32-3.5\dipy\tracking copying dipy\tracking\eudx.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\gui_tools.py -> build\lib.win32-3.5\dipy\tracking copying 
dipy\tracking\interfaces.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\learning.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\life.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\markov.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\metrics.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\streamline.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\utils.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\_utils.py -> build\lib.win32-3.5\dipy\tracking copying dipy\tracking\__init__.py -> build\lib.win32-3.5\dipy\tracking creating build\lib.win32-3.5\dipy\tracking\local copying dipy\tracking\local\localtracking.py -> build\lib.win32-3.5\dipy\tracking\local copying dipy\tracking\local\__init__.py -> build\lib.win32-3.5\dipy\tracking\local creating build\lib.win32-3.5\dipy\tracking\local\tests copying dipy\tracking\local\tests\test_local_tracking.py -> build\lib.win32-3.5\dipy\tracking\local\tests copying dipy\tracking\local\tests\test_tissue_classifier.py -> build\lib.win32-3.5\dipy\tracking\local\tests copying dipy\tracking\local\tests\__init__.py -> build\lib.win32-3.5\dipy\tracking\local\tests creating build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_distances.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_learning.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_life.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_localtrack.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_markov.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_metrics.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_propagation.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_streamline.py -> build\lib.win32-3.5\dipy\tracking\tests copying 
dipy\tracking\tests\test_track_volumes.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\test_utils.py -> build\lib.win32-3.5\dipy\tracking\tests copying dipy\tracking\tests\__init__.py -> build\lib.win32-3.5\dipy\tracking\tests creating build\lib.win32-3.5\dipy\tracking\benchmarks copying dipy\tracking\benchmarks\bench_streamline.py -> build\lib.win32-3.5\dipy\tracking\benchmarks copying dipy\tracking\benchmarks\__init__.py -> build\lib.win32-3.5\dipy\tracking\benchmarks creating build\lib.win32-3.5\dipy\reconst copying dipy\reconst\base.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\cache.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\cross_validation.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\csdeconv.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\dki.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\dsi.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\dti.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\gqi.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\interpolate.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\mapmri.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\multi_voxel.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\odf.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\peaks.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\sfm.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\shm.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\shore.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\utils.py -> build\lib.win32-3.5\dipy\reconst copying dipy\reconst\__init__.py -> build\lib.win32-3.5\dipy\reconst creating build\lib.win32-3.5\dipy\reconst\benchmarks copying dipy\reconst\benchmarks\bench_bounding_box.py -> build\lib.win32-3.5\dipy\reconst\benchmarks copying dipy\reconst\benchmarks\bench_csd.py -> build\lib.win32-3.5\dipy\reconst\benchmarks copying 
dipy\reconst\benchmarks\bench_peaks.py -> build\lib.win32-3.5\dipy\reconst\benchmarks copying dipy\reconst\benchmarks\bench_squash.py -> build\lib.win32-3.5\dipy\reconst\benchmarks copying dipy\reconst\benchmarks\bench_vec_val_sum.py -> build\lib.win32-3.5\dipy\reconst\benchmarks copying dipy\reconst\benchmarks\__init__.py -> build\lib.win32-3.5\dipy\reconst\benchmarks creating build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_cache.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_cross_validation.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_csdeconv.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_dki.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_dsi.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_dsi_deconv.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_dsi_metrics.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_dti.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_gqi.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_interpolate.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_mapmri.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_mapmri_odf.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_multi_voxel.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_odf.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_peakdf.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_peak_finding.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_reco_utils.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_sfm.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_shm.py -> 
build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_shore_fitting.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_shore_metrics.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_shore_odf.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\test_vec_val_vect.py -> build\lib.win32-3.5\dipy\reconst\tests copying dipy\reconst\tests\__init__.py -> build\lib.win32-3.5\dipy\reconst\tests creating build\lib.win32-3.5\dipy\io copying dipy\io\bvectxt.py -> build\lib.win32-3.5\dipy\io copying dipy\io\dpy.py -> build\lib.win32-3.5\dipy\io copying dipy\io\gradients.py -> build\lib.win32-3.5\dipy\io copying dipy\io\pickles.py -> build\lib.win32-3.5\dipy\io copying dipy\io\trackvis.py -> build\lib.win32-3.5\dipy\io copying dipy\io\utils.py -> build\lib.win32-3.5\dipy\io copying dipy\io\__init__.py -> build\lib.win32-3.5\dipy\io creating build\lib.win32-3.5\dipy\io\tests copying dipy\io\tests\test_bvectxt.py -> build\lib.win32-3.5\dipy\io\tests copying dipy\io\tests\test_dpy.py -> build\lib.win32-3.5\dipy\io\tests copying dipy\io\tests\test_io.py -> build\lib.win32-3.5\dipy\io\tests copying dipy\io\tests\test_io_gradients.py -> build\lib.win32-3.5\dipy\io\tests copying dipy\io\tests\__init__.py -> build\lib.win32-3.5\dipy\io\tests creating build\lib.win32-3.5\dipy\viz copying dipy\viz\actor.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\colormap.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\fvtk.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\projections.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\regtools.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\utils.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\widget.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\window.py -> build\lib.win32-3.5\dipy\viz copying dipy\viz\__init__.py -> build\lib.win32-3.5\dipy\viz creating build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_fvtk.py -> 
build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_fvtk_actors.py -> build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_fvtk_utils.py -> build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_fvtk_widgets.py -> build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_fvtk_window.py -> build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\test_regtools.py -> build\lib.win32-3.5\dipy\viz\tests copying dipy\viz\tests\__init__.py -> build\lib.win32-3.5\dipy\viz\tests creating build\lib.win32-3.5\dipy\testing copying dipy\testing\decorators.py -> build\lib.win32-3.5\dipy\testing copying dipy\testing\memory.py -> build\lib.win32-3.5\dipy\testing copying dipy\testing\spherepoints.py -> build\lib.win32-3.5\dipy\testing copying dipy\testing\__init__.py -> build\lib.win32-3.5\dipy\testing creating build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\test_decorators.py -> build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\test_memory.py -> build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\__init__.py -> build\lib.win32-3.5\dipy\testing\tests creating build\lib.win32-3.5\dipy\boots copying dipy\boots\resampling.py -> build\lib.win32-3.5\dipy\boots copying dipy\boots\__init__.py -> build\lib.win32-3.5\dipy\boots creating build\lib.win32-3.5\dipy\data copying dipy\data\fetcher.py -> build\lib.win32-3.5\dipy\data copying dipy\data\__init__.py -> build\lib.win32-3.5\dipy\data creating build\lib.win32-3.5\dipy\utils copying dipy\utils\arrfuncs.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\optpkg.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\six.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\tripwire.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\_importlib.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\__init__.py -> build\lib.win32-3.5\dipy\utils creating build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\test_arrfuncs.py -> 
build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\test_tripwire.py -> build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\__init__.py -> build\lib.win32-3.5\dipy\utils\tests creating build\lib.win32-3.5\dipy\fixes copying dipy\fixes\argparse.py -> build\lib.win32-3.5\dipy\fixes copying dipy\fixes\scipy.py -> build\lib.win32-3.5\dipy\fixes copying dipy\fixes\__init__.py -> build\lib.win32-3.5\dipy\fixes creating build\lib.win32-3.5\dipy\external copying dipy\external\fsl.py -> build\lib.win32-3.5\dipy\external copying dipy\external\__init__.py -> build\lib.win32-3.5\dipy\external creating build\lib.win32-3.5\dipy\external\tests copying dipy\external\tests\__init__.py -> build\lib.win32-3.5\dipy\external\tests creating build\lib.win32-3.5\dipy\segment copying dipy\segment\clustering.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\mask.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\metric.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\quickbundles.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\threshold.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\__init__.py -> build\lib.win32-3.5\dipy\segment creating build\lib.win32-3.5\dipy\segment\benchmarks copying dipy\segment\benchmarks\bench_quickbundles.py -> build\lib.win32-3.5\dipy\segment\benchmarks copying dipy\segment\benchmarks\__init__.py -> build\lib.win32-3.5\dipy\segment\benchmarks creating build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_clustering.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_feature.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_mask.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_metric.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_qb.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_quickbundles.py -> build\lib.win32-3.5\dipy\segment\tests 
copying dipy\segment\tests\__init__.py -> build\lib.win32-3.5\dipy\segment\tests creating build\lib.win32-3.5\dipy\sims copying dipy\sims\phantom.py -> build\lib.win32-3.5\dipy\sims copying dipy\sims\voxel.py -> build\lib.win32-3.5\dipy\sims copying dipy\sims\__init__.py -> build\lib.win32-3.5\dipy\sims creating build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\test_phantom.py -> build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\test_voxel.py -> build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\__init__.py -> build\lib.win32-3.5\dipy\sims\tests creating build\lib.win32-3.5\dipy\denoise copying dipy\denoise\nlmeans.py -> build\lib.win32-3.5\dipy\denoise copying dipy\denoise\noise_estimate.py -> build\lib.win32-3.5\dipy\denoise copying dipy\denoise\__init__.py -> build\lib.win32-3.5\dipy\denoise creating build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_denoise.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_kernel.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_nlmeans.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_noise_estimate.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\__init__.py -> build\lib.win32-3.5\dipy\denoise\tests creating build\lib.win32-3.5\dipy\workflows copying dipy\workflows\base.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\docstring_parser.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\reconst.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\segment.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\tracking.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\utils.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\__init__.py -> build\lib.win32-3.5\dipy\workflows creating build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_docstring_parser.py -> 
build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_iap.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_segment.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_utils.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\__init__.py -> build\lib.win32-3.5\dipy\workflows\tests creating build\lib.win32-3.5\dipy\data\files copying dipy\data\files\55dir_grad.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\55dir_grad.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\aniso_vox.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\C.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\C1.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\C3.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\cb_2.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\circle.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dipy_colormaps.json -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dsi4169_b_table.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dsi515_b_table.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\eg_3voxels.pkl -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_362.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_642.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_724.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib0.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib1.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib2.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\func_coef.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\func_discrete.nii.gz 
-> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\grad_514.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_3shell.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_isbi2013_2shell.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_taiwan_dsi.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\life_matlab_rmse.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\life_matlab_weights.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\repulsion100.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\repulsion724.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\S0_10slices.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\ScannerVectors_GQI101.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.bvals.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.gradients.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.nii -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\sphere_grad.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\t1_coronal_slice.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\test_piesno.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\tracks300.trk -> build\lib.win32-3.5\dipy\data\files running build_ext error: Unable to find vcvarsall.bat 
---------------------------------------- Failed building wheel for dipy Running setup.py clean for dipy Failed to build dipy Installing collected packages: dipy Running setup.py install for dipy: started Running setup.py install for dipy: finished with status 'error' Complete output from command C:\Users\Natalia\Anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\Natalia\AppData\Local\Temp\pip-4nxumgxb-record\install-record.txt --single-version-externally-managed --compile: running install running build running build_py [... the same "creating"/"copying" build_py output as in the first attempt, repeated verbatim ...]
copying dipy\testing\__init__.py -> build\lib.win32-3.5\dipy\testing creating build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\test_decorators.py -> build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\test_memory.py -> build\lib.win32-3.5\dipy\testing\tests copying dipy\testing\tests\__init__.py -> build\lib.win32-3.5\dipy\testing\tests creating build\lib.win32-3.5\dipy\boots copying dipy\boots\resampling.py -> build\lib.win32-3.5\dipy\boots copying dipy\boots\__init__.py -> build\lib.win32-3.5\dipy\boots creating build\lib.win32-3.5\dipy\data copying dipy\data\fetcher.py -> build\lib.win32-3.5\dipy\data copying dipy\data\__init__.py -> build\lib.win32-3.5\dipy\data creating build\lib.win32-3.5\dipy\utils copying dipy\utils\arrfuncs.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\optpkg.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\six.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\tripwire.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\_importlib.py -> build\lib.win32-3.5\dipy\utils copying dipy\utils\__init__.py -> build\lib.win32-3.5\dipy\utils creating build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\test_arrfuncs.py -> build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\test_tripwire.py -> build\lib.win32-3.5\dipy\utils\tests copying dipy\utils\tests\__init__.py -> build\lib.win32-3.5\dipy\utils\tests creating build\lib.win32-3.5\dipy\fixes copying dipy\fixes\argparse.py -> build\lib.win32-3.5\dipy\fixes copying dipy\fixes\scipy.py -> build\lib.win32-3.5\dipy\fixes copying dipy\fixes\__init__.py -> build\lib.win32-3.5\dipy\fixes creating build\lib.win32-3.5\dipy\external copying dipy\external\fsl.py -> build\lib.win32-3.5\dipy\external copying dipy\external\__init__.py -> build\lib.win32-3.5\dipy\external creating build\lib.win32-3.5\dipy\external\tests copying dipy\external\tests\__init__.py -> build\lib.win32-3.5\dipy\external\tests creating 
build\lib.win32-3.5\dipy\segment copying dipy\segment\clustering.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\mask.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\metric.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\quickbundles.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\threshold.py -> build\lib.win32-3.5\dipy\segment copying dipy\segment\__init__.py -> build\lib.win32-3.5\dipy\segment creating build\lib.win32-3.5\dipy\segment\benchmarks copying dipy\segment\benchmarks\bench_quickbundles.py -> build\lib.win32-3.5\dipy\segment\benchmarks copying dipy\segment\benchmarks\__init__.py -> build\lib.win32-3.5\dipy\segment\benchmarks creating build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_clustering.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_feature.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_mask.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_metric.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_qb.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\test_quickbundles.py -> build\lib.win32-3.5\dipy\segment\tests copying dipy\segment\tests\__init__.py -> build\lib.win32-3.5\dipy\segment\tests creating build\lib.win32-3.5\dipy\sims copying dipy\sims\phantom.py -> build\lib.win32-3.5\dipy\sims copying dipy\sims\voxel.py -> build\lib.win32-3.5\dipy\sims copying dipy\sims\__init__.py -> build\lib.win32-3.5\dipy\sims creating build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\test_phantom.py -> build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\test_voxel.py -> build\lib.win32-3.5\dipy\sims\tests copying dipy\sims\tests\__init__.py -> build\lib.win32-3.5\dipy\sims\tests creating build\lib.win32-3.5\dipy\denoise copying dipy\denoise\nlmeans.py -> build\lib.win32-3.5\dipy\denoise copying dipy\denoise\noise_estimate.py -> 
build\lib.win32-3.5\dipy\denoise copying dipy\denoise\__init__.py -> build\lib.win32-3.5\dipy\denoise creating build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_denoise.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_kernel.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_nlmeans.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\test_noise_estimate.py -> build\lib.win32-3.5\dipy\denoise\tests copying dipy\denoise\tests\__init__.py -> build\lib.win32-3.5\dipy\denoise\tests creating build\lib.win32-3.5\dipy\workflows copying dipy\workflows\base.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\docstring_parser.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\reconst.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\segment.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\tracking.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\utils.py -> build\lib.win32-3.5\dipy\workflows copying dipy\workflows\__init__.py -> build\lib.win32-3.5\dipy\workflows creating build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_docstring_parser.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_iap.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_segment.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\test_utils.py -> build\lib.win32-3.5\dipy\workflows\tests copying dipy\workflows\tests\__init__.py -> build\lib.win32-3.5\dipy\workflows\tests creating build\lib.win32-3.5\dipy\data\files copying dipy\data\files\55dir_grad.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\55dir_grad.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\aniso_vox.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\C.npy -> build\lib.win32-3.5\dipy\data\files copying 
dipy\data\files\C1.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\C3.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\cb_2.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\circle.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dipy_colormaps.json -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dsi4169_b_table.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\dsi515_b_table.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\eg_3voxels.pkl -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_362.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_642.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\evenly_distributed_sphere_724.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib0.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib1.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\fib2.pkl.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\func_coef.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\func_discrete.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\grad_514.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_3shell.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_isbi2013_2shell.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\gtab_taiwan_dsi.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\life_matlab_rmse.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\life_matlab_weights.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\repulsion100.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\repulsion724.npz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\S0_10slices.nii.gz -> 
build\lib.win32-3.5\dipy\data\files copying dipy\data\files\ScannerVectors_GQI101.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_101D.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.bval -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.bvec -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_25.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.bvals.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.gradients.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\small_64D.nii -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\sphere_grad.txt -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\t1_coronal_slice.npy -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\test_piesno.nii.gz -> build\lib.win32-3.5\dipy\data\files copying dipy\data\files\tracks300.trk -> build\lib.win32-3.5\dipy\data\files running build_ext error: Unable to find vcvarsall.bat ---------------------------------------- Command "C:\Users\Natalia\Anaconda3\python.exe -u -c "import setuptools, tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\Natalia\AppData\Local\Temp\pip-4nxumgxb-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in C:\Users\Natalia\AppData\Local\Temp\pip-build-57lh1tho\dipy\ Thank you in advice for your help, Bests, Natalia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stjeansam at gmail.com Sun Apr 10 15:24:10 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Sun, 10 Apr 2016 21:24:10 +0200 Subject: [Neuroimaging] [dipy] In-Reply-To: <570AA79C.5010009@gmail.com> References: <570AA79C.5010009@gmail.com> Message-ID: <570AA85A.50900@gmail.com> Regarding the vcvarsall.bat error: it points to a missing Visual Studio installation. While mingw works well as a compiler, the official Python distribution on Windows is unfortunately built with Visual Studio, which means that any Python extension also needs to be built with it. I see from the log that you are using Python 3.5, but if the purpose is only to use dipy, I would suggest using Python 2.7 instead, as Microsoft has released a special compiler package for Python 2.7 that works as-is: https://www.microsoft.com/en-us/download/details.aspx?id=44266. In any case, I could build dipy on Windows without too much hassle (I think it just worked directly with pip install dipy) using this and a 2.7 build of Python from Anaconda, at least. I would also suggest the 64-bit build, as it allows you to use more than 4 GB of RAM, a limit that is unfortunately easy to hit when processing diffusion MRI datasets. Anyway, if you still have questions, feel free to ask back. For more info on building on Windows, maybe Omar can pitch in if he reads the mailing list; if I recall correctly, he also had some issues (and perhaps solved them) with building on Windows and mingw/Visual Studio. Samuel P.S. I had to cut the logs, as they became too big for the mailing list (> 100 KB needs moderator approval, apparently); anyone wanting to reply, please see the original post. > > On 2016-04-10 18:21, Natalia Kowalczyk wrote: >> Hi, >> I am writing to you because I have problems installing DIPY. >> I have followed the instructions at >> http://nipy.org/dipy/installation.html. >> >> I have: >> Windows 7, 64-bit.
>> Mingw 32 bit (also tried 64 bit) >> Anaconda 32 bit (also tried 64 bit) >> >> I have also tried the suggested workaround of creating a file called >> 'pydistutils.cfg' in Notepad and giving it the documented contents. >> >> Here are the logs: >> >> error: Unable to find vcvarsall.bat >> >> ---------------------------------------- >> Command "C:\Users\Natalia\Anaconda3\python.exe -u -c "import >> setuptools, >> tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, >> 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, >> 'exec'))" install --record >> C:\Users\Natalia\AppData\Local\Temp\pip-4nxumgxb-record\install-record.txt >> --single-version-externally-managed --compile" failed with error code >> 1 in C:\Users\Natalia\AppData\Local\Temp\pip-build-57lh1tho\dipy\ >> >> Thank you in advance for your help, >> Best, >> Natalia >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Mon Apr 11 11:58:09 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Mon, 11 Apr 2016 15:58:09 +0000 Subject: [Neuroimaging] [dipy] Fitting diffusion models in the absence of S0 signal In-Reply-To: References: Message-ID: Rafael, can you please pick a time for us to meet and discuss this design change? You should pick the time, as you are in a very different timezone than us; Ariel and I are only 3 hours apart. On Sun, Apr 10, 2016 at 12:13 PM Ariel Rokem wrote: > Hi Eleftherios, > > On Sat, Apr 9, 2016 at 3:08 PM, Eleftherios Garyfallidis < > garyfallidis at gmail.com> wrote: > >> >> Hi Rafael and Ariel, >> >> Apologies for the delay in answering here. I think we need to set up a hangout >> to discuss this.
>> >> One thing that may be important for this discussion is that the function >> >> from dipy.core.gradients import gradient_table >> >> has a parameter called b0_threshold. >> >> This can be set to 300 or higher, and then the volumes at b=300 will be >> treated as the b0s. So, if the datasets don't have b=0 but have b=300, those volumes can >> be used instead. >> > >> This means that just by changing the b0_threshold, the datasets can be fit >> in different ways. >> Could it be that the easier solution is to call the gradient table in >> a different way (different >> b0 threshold) rather than changing the API? >> >> > No - unfortunately I don't think this would be a solution to this > particular concern. That is because what we need is really the value of the > intercept of the signal-by-b-value curve. If we have an S0 measurement, we > take that, but if we don't, we need to *estimate* it. > > I will now look at the free water implementation to better understand the >> different issues. >> > > Yeah - some of the recent commits are about this particular issue (e.g., > https://github.com/nipy/dipy/pull/835/commits/89c09c7e4095309c9d7ae42eee313a4fd1f9c880), > but note the changes that follow (!). Maybe Rafael can also help explain. > > >> Cheers, >> Eleftherios >> >> p.s. Please give me your availability for a design hangout during the >> week. >> > > You can (always) find my availability here: > https://www.google.com/calendar/embed?src=arokem%40gmail.com&ctz=America/Los_Angeles > > Cheers, > > Ariel > > >> >> >> On Fri, Mar 25, 2016 at 11:14 AM Ariel Rokem wrote: >> >>> Hi Rafael, >>> >>> On Thu, Mar 24, 2016 at 4:12 AM, Rafael Henriques >>> wrote: >>> >>>> Hi Eleftherios, >>>> >>>> What can we do if the data don't have b0s? >>>> In recent years, everyone has been including b0 data in their DWI >>>> acquisitions. However, nowadays some groups are starting to acquire >>>> diffusion volumes with low b-values (e.g. 300 s.mm-2) instead >>>> of the b0 volumes.
They are doing this to ensure that when fitting >>>> diffusion models they do not take into account perfusion confounding >>>> effects. So my question is - what can we do to generalize Dipy for >>>> these cases? My suggestion is to always include S0 as a model parameter, >>>> so even if users do not have b0 data, the model can easily give the >>>> extrapolated, non-perfusion-affected S0 signal. >>>> >>> >>> My example code was not really that great at demonstrating this point. I >>> have now updated the notebook so that it works with data that has a b=0 >>> measurement, but also with data that doesn't (you'll need to change the >>> commented-out line in cell 3 to see both options). >>> >>> I also have two alternative implementations, following Eleftherios' >>> suggestions (I think): >>> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>> >>> In one implementation, an estimate of S0 (`S0_hat`) is part of the >>> TensorFit object (I think that's what Eleftherios is suggesting). In the >>> other implementation, the estimate is part of the TensorModel.fit function >>> (as you suggest). >>> >>> The main disadvantage of alternative 1 is that we would have to pass the >>> data again into a method of the `TensorFit` object. The main disadvantage >>> of alternative 2 is that it requires a change to the `TensorFit.__init__` >>> API. My own tendency is to prefer the change to the `TensorFit.__init__` >>> API, because I don't think that people are using that API on its own; they >>> are typically getting their `TensorFit` objects from the `TensorModel.fit` >>> function. >>> >>> I think that passing the data again into the `TensorFit` object would >>> not only be error-prone, but also not as efficient. >>> >>> Importantly, this is not just a matter for people who use the prediction >>> API to check that the model fits the data, but also an issue for fitting >>> models that depend on the DTI model, such as the new FWE DTI model.
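The b0_threshold trick described above comes down to which volumes the gradient table flags as b0s (its b0s_mask). A minimal numpy-only sketch of that masking idea — illustrative only, with a made-up default threshold, not dipy's actual implementation:

```python
import numpy as np

# Hypothetical acquisition with no true b=0 volume: lowest shell is b=300
bvals = np.array([300, 300, 1000, 1000, 2000])

def b0s_mask(bvals, b0_threshold=50):
    """Treat every volume at or below the threshold as a 'b0' volume."""
    return bvals <= b0_threshold

# With a low threshold, nothing qualifies as a b0 ...
print(b0s_mask(bvals).any())                      # -> False
# ... but raising the threshold lets the b=300 volumes stand in for b0s
print(b0s_mask(bvals, b0_threshold=300).sum())    # -> 2
```

With dipy itself, the analogous call would be gradient_table(bvals, bvecs, b0_threshold=300), after which fitting code that consults gtab.b0s_mask picks up the b=300 volumes.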
>>> >>> Cheers, >>> >>> Ariel >>> >>> >>>> Also, how can you recover the S0 information using the line that you >>>> suggested? If params only contains the diffusion tensor information, >>>> that line will always be equal to 1, right? Am I missing something >>>> here? >>>> Best, >>>> Rafael >>>> >>>> > Hi Ariel, >>>> > >>>> > Apologies for the delay in answering. >>>> > >>>> > What I understand is that now fit_model is doing the prediction >>>> for >>>> > S0. Am I correct? >>>> > You recreate a predicted S0 inside fit_model, but fit_model is about >>>> fitting, >>>> > not about predicting. >>>> > >>>> > I am not comfortable with changing fit_model to generate two parameters >>>> > (params and S0). >>>> > >>>> > This command can be called inside the predict method: >>>> > S0 = np.mean(np.exp(np.dot(dm, params))[..., gtab.b0s_mask]) >>>> > >>>> > So, for me there is no reason to change the init method of >>>> TensorFit. >>>> > >>>> > I hope I am not missing something. >>>> > Let me know if this suggestion is helpful. >>>> > >>>> > Cheers, >>>> > Eleftherios >>>> > >>>> > On Sun, Mar 20, 2016 at 12:04 PM, Ariel Rokem >>>> wrote: >>>> > >>>> >> Hi everyone, >>>> >> >>>> >> Thought I would re-raise this. Anyone have any thoughts here? Would >>>> a PR >>>> >> against the DTI and DKI modules be more helpful to clarify? >>>> >> >>>> >> Cheers, >>>> >> >>>> >> Ariel >>>> >> >>>> >> On Sat, Mar 5, 2016 at 3:04 AM, Ariel Rokem >>>> wrote: >>>> >> >>>> >>> On Thu, Mar 3, 2016 at 7:28 AM, Eleftherios Garyfallidis < >>>> >>> garyfallidis at gmail.com> wrote: >>>> >>> >>>> >>>> Sorry, your suggestion is not exactly clear. Can you show us >>>> how the >>>> >>>> code will look with your proposal? Also, apart from DTI and DKI, >>>> what other >>>> >>>> models will be affected by these changes? Is this a change >>>> suggested only >>>> >>>> for DTI and DKI, or will it affect all or most reconstruction models?
>>>> >>>> >>>> >>>> >>>> >>> First of all, to answer your last question: this will certainly >>>> affect >>>> >>> DTI and DKI, and there will be other models to follow. For example >>>> the >>>> >>> FWDTI that Rafael is currently proposing in that PR. The idea would >>>> be to >>>> >>> also more tightly integrate these three models (and future >>>> extensions... >>>> >>> !), so that we can remove some of the redundancies that currently >>>> exist. We >>>> >>> could make this a part of the base.Reconst* methods - it might >>>> apply to >>>> >>> other models as well (e.g. CSD, SFM, etc). But that's part of what >>>> I would >>>> >>> like to discuss here. >>>> >>> >>>> >>> As for code, for now, here's a sketch of what this would look like >>>> for >>>> >>> the tensor model: >>>> >>> >>>> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>>> >>> >>>> >>> Note that though it changes the prediction API a bit, not much else >>>> would >>>> >>> have to change. In particular, all the code that relies on there >>>> being 12 >>>> >>> model parameters will still be intact, because S0 doesn't go into >>>> the model >>>> >>> parameters. >>>> >>> >>>> >>> What do you think? Am I missing something big here? Or should I go >>>> ahead >>>> >>> and start working on a PR implementing this? >>>> >>> >>>> >>> Thanks! >>>> >>> >>>> >>> Ariel >>>> >>> >>>> >>> >>>> >>> >>>> >>>> On Mon, Feb 29, 2016 at 11:53 AM, Ariel Rokem >>>> wrote: >>>> >>>> >>>> >>>>> Hi everyone, >>>> >>>>> >>>> >>>>> In Rafael's recent PR implementing free-water-eliminated DTI ( >>>> >>>>> https://github.com/nipy/dipy/pull/835), we had a little bit of a >>>> >>>>> discussion about the use of the non-diffusion weighted signal >>>> (S0). As >>>> >>>>> pointed out by Rafael, in the absence of an S0 in the measured >>>> data, for >>>> >>>>> some models, that can be derived from the model fit ( >>>> >>>>> https://github.com/nipy/dipy/pull/835#issuecomment-183060855). 
>>>> >>>>> >>>> >>>>> I think that we would like to support using data both with and >>>> without >>>> >>>>> S0. On the other hand, I don't think that we should treat the >>>> derived S0 as >>>> >>>>> a model parameter, because in some cases, we want to provide S0 >>>> as an input >>>> >>>>> (for example, when predicting back the signal for another >>>> measurement, with >>>> >>>>> a different ). In addition, it would be hard to incorporate that >>>> into the >>>> >>>>> model_params variable of the TensorFit object, while maintaining >>>> backwards >>>> >>>>> compatibility of the TensorModel/TensorFit and derived classes >>>> (e.g., DKI). >>>> >>>>> >>>> >>>>> My proposal is to have an S0 property for ReconstFit objects. >>>> When this >>>> >>>>> is calculated from the model (e.g. in DTI), it gets set by the >>>> `fit` method >>>> >>>>> of the ReconstModel object. When it isn't, it can be set from the >>>> data. >>>> >>>>> Either way, it can be over-ridden by the user (e.g., for the >>>> purpose of >>>> >>>>> predicting on a new data-set). This might change the behavior of >>>> the >>>> >>>>> prediction code slightly, but maybe that is something we can live >>>> with? >>>> >>>>> >>>> >>>>> Happy to hear what everyone thinks, before we move ahead with >>>> this. 
>>>> >>>>> Cheers, >>>> >>>>> Ariel >>>> >>>>> _______________________________________________ >>>> >>>>> Neuroimaging mailing list >>>> >>>>> Neuroimaging at python.org >>>> >>>>> https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nkowalczyk2 at gmail.com Sun Apr 10 17:27:39 2016 From: nkowalczyk2 at gmail.com (Natalia Kowalczyk) Date: Sun, 10 Apr 2016 23:27:39 +0200 Subject: [Neuroimaging] [dipy] In-Reply-To: References: Message-ID: SOLVED! I had to install Visual Studio 2015 Community edition (with the additional C++ tools checked). 2016-04-10 18:21 GMT+02:00 Natalia Kowalczyk : > Hi, > I am writing to you because I have problems installing DIPY.
I > have followed the instructions at > http://nipy.org/dipy/installation.html. > > I have: > Windows 7, 64-bit > Mingw 32 bit (also tried 64 bit) > Anaconda 32 bit (also tried 64 bit) > > I have also tried the suggested workaround of creating a file called > 'pydistutils.cfg' in Notepad and giving it the documented contents. > > Here are the logs: > > Natalia at Natalia-rf711PC MINGW32 ~ > $ pip install dipy > Collecting dipy > Using cached dipy-0.11.0.tar.gz > Requirement already satisfied (use --upgrade to upgrade): scipy>=0.9 in > c:\users\natalia\anaconda3\lib\site-packages (from dipy) > Requirement already satisfied (use --upgrade to upgrade): nibabel>=1.2.0 > in c:\users\natalia\anaconda3\lib\site-packages (from dipy) > Building wheels for collected packages: dipy > Running setup.py bdist_wheel for dipy: started > Running setup.py bdist_wheel for dipy: finished with status 'error' > Complete output from command C:\Users\Natalia\Anaconda3\python.exe -u -c > "import setuptools, > tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, > 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" > bdist_wheel -d C:\Users\Natalia\AppData\Local\Temp\tmp5gfkjdjvpip-wheel- > --python-tag cp35: > running bdist_wheel > running build > running build_py > [... quoted "creating"/"copying" build log lines trimmed; this is the same build log shown earlier in the thread ...]
-> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_track_volumes.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_utils.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\__init__.py -> > build\lib.win32-3.5\dipy\tracking\tests > creating build\lib.win32-3.5\dipy\tracking\benchmarks > copying dipy\tracking\benchmarks\bench_streamline.py -> > build\lib.win32-3.5\dipy\tracking\benchmarks > copying dipy\tracking\benchmarks\__init__.py -> > build\lib.win32-3.5\dipy\tracking\benchmarks > creating build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\base.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\cache.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\cross_validation.py -> > build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\csdeconv.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dki.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dsi.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dti.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\gqi.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\interpolate.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\mapmri.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\multi_voxel.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\odf.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\peaks.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\sfm.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\shm.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\shore.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\utils.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\__init__.py -> build\lib.win32-3.5\dipy\reconst > creating build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_bounding_box.py -> > 
build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_csd.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_peaks.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_squash.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_vec_val_sum.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\__init__.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > creating build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_cache.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_cross_validation.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_csdeconv.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dki.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi_deconv.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi_metrics.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dti.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_gqi.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_interpolate.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_mapmri.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_mapmri_odf.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_multi_voxel.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_odf.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_peakdf.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_peak_finding.py -> > 
build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_reco_utils.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_sfm.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_shm.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_shore_fitting.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_shore_metrics.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_shore_odf.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_vec_val_vect.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\__init__.py -> > build\lib.win32-3.5\dipy\reconst\tests > creating build\lib.win32-3.5\dipy\io > copying dipy\io\bvectxt.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\dpy.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\gradients.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\pickles.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\trackvis.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\utils.py -> build\lib.win32-3.5\dipy\io > copying dipy\io\__init__.py -> build\lib.win32-3.5\dipy\io > creating build\lib.win32-3.5\dipy\io\tests > copying dipy\io\tests\test_bvectxt.py -> > build\lib.win32-3.5\dipy\io\tests > copying dipy\io\tests\test_dpy.py -> build\lib.win32-3.5\dipy\io\tests > copying dipy\io\tests\test_io.py -> build\lib.win32-3.5\dipy\io\tests > copying dipy\io\tests\test_io_gradients.py -> > build\lib.win32-3.5\dipy\io\tests > copying dipy\io\tests\__init__.py -> build\lib.win32-3.5\dipy\io\tests > creating build\lib.win32-3.5\dipy\viz > copying dipy\viz\actor.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\colormap.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\fvtk.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\projections.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\regtools.py -> build\lib.win32-3.5\dipy\viz > 
copying dipy\viz\utils.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\widget.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\window.py -> build\lib.win32-3.5\dipy\viz > copying dipy\viz\__init__.py -> build\lib.win32-3.5\dipy\viz > creating build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_fvtk.py -> build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_fvtk_actors.py -> > build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_fvtk_utils.py -> > build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_fvtk_widgets.py -> > build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_fvtk_window.py -> > build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\test_regtools.py -> > build\lib.win32-3.5\dipy\viz\tests > copying dipy\viz\tests\__init__.py -> build\lib.win32-3.5\dipy\viz\tests > creating build\lib.win32-3.5\dipy\testing > copying dipy\testing\decorators.py -> build\lib.win32-3.5\dipy\testing > copying dipy\testing\memory.py -> build\lib.win32-3.5\dipy\testing > copying dipy\testing\spherepoints.py -> build\lib.win32-3.5\dipy\testing > copying dipy\testing\__init__.py -> build\lib.win32-3.5\dipy\testing > creating build\lib.win32-3.5\dipy\testing\tests > copying dipy\testing\tests\test_decorators.py -> > build\lib.win32-3.5\dipy\testing\tests > copying dipy\testing\tests\test_memory.py -> > build\lib.win32-3.5\dipy\testing\tests > copying dipy\testing\tests\__init__.py -> > build\lib.win32-3.5\dipy\testing\tests > creating build\lib.win32-3.5\dipy\boots > copying dipy\boots\resampling.py -> build\lib.win32-3.5\dipy\boots > copying dipy\boots\__init__.py -> build\lib.win32-3.5\dipy\boots > creating build\lib.win32-3.5\dipy\data > copying dipy\data\fetcher.py -> build\lib.win32-3.5\dipy\data > copying dipy\data\__init__.py -> build\lib.win32-3.5\dipy\data > creating build\lib.win32-3.5\dipy\utils > copying dipy\utils\arrfuncs.py -> build\lib.win32-3.5\dipy\utils > copying dipy\utils\optpkg.py -> 
build\lib.win32-3.5\dipy\utils > copying dipy\utils\six.py -> build\lib.win32-3.5\dipy\utils > copying dipy\utils\tripwire.py -> build\lib.win32-3.5\dipy\utils > copying dipy\utils\_importlib.py -> build\lib.win32-3.5\dipy\utils > copying dipy\utils\__init__.py -> build\lib.win32-3.5\dipy\utils > creating build\lib.win32-3.5\dipy\utils\tests > copying dipy\utils\tests\test_arrfuncs.py -> > build\lib.win32-3.5\dipy\utils\tests > copying dipy\utils\tests\test_tripwire.py -> > build\lib.win32-3.5\dipy\utils\tests > copying dipy\utils\tests\__init__.py -> > build\lib.win32-3.5\dipy\utils\tests > creating build\lib.win32-3.5\dipy\fixes > copying dipy\fixes\argparse.py -> build\lib.win32-3.5\dipy\fixes > copying dipy\fixes\scipy.py -> build\lib.win32-3.5\dipy\fixes > copying dipy\fixes\__init__.py -> build\lib.win32-3.5\dipy\fixes > creating build\lib.win32-3.5\dipy\external > copying dipy\external\fsl.py -> build\lib.win32-3.5\dipy\external > copying dipy\external\__init__.py -> build\lib.win32-3.5\dipy\external > creating build\lib.win32-3.5\dipy\external\tests > copying dipy\external\tests\__init__.py -> > build\lib.win32-3.5\dipy\external\tests > creating build\lib.win32-3.5\dipy\segment > copying dipy\segment\clustering.py -> build\lib.win32-3.5\dipy\segment > copying dipy\segment\mask.py -> build\lib.win32-3.5\dipy\segment > copying dipy\segment\metric.py -> build\lib.win32-3.5\dipy\segment > copying dipy\segment\quickbundles.py -> build\lib.win32-3.5\dipy\segment > copying dipy\segment\threshold.py -> build\lib.win32-3.5\dipy\segment > copying dipy\segment\__init__.py -> build\lib.win32-3.5\dipy\segment > creating build\lib.win32-3.5\dipy\segment\benchmarks > copying dipy\segment\benchmarks\bench_quickbundles.py -> > build\lib.win32-3.5\dipy\segment\benchmarks > copying dipy\segment\benchmarks\__init__.py -> > build\lib.win32-3.5\dipy\segment\benchmarks > creating build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_clustering.py -> > 
build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_feature.py -> > build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_mask.py -> > build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_metric.py -> > build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_qb.py -> > build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\test_quickbundles.py -> > build\lib.win32-3.5\dipy\segment\tests > copying dipy\segment\tests\__init__.py -> > build\lib.win32-3.5\dipy\segment\tests > creating build\lib.win32-3.5\dipy\sims > copying dipy\sims\phantom.py -> build\lib.win32-3.5\dipy\sims > copying dipy\sims\voxel.py -> build\lib.win32-3.5\dipy\sims > copying dipy\sims\__init__.py -> build\lib.win32-3.5\dipy\sims > creating build\lib.win32-3.5\dipy\sims\tests > copying dipy\sims\tests\test_phantom.py -> > build\lib.win32-3.5\dipy\sims\tests > copying dipy\sims\tests\test_voxel.py -> > build\lib.win32-3.5\dipy\sims\tests > copying dipy\sims\tests\__init__.py -> > build\lib.win32-3.5\dipy\sims\tests > creating build\lib.win32-3.5\dipy\denoise > copying dipy\denoise\nlmeans.py -> build\lib.win32-3.5\dipy\denoise > copying dipy\denoise\noise_estimate.py -> > build\lib.win32-3.5\dipy\denoise > copying dipy\denoise\__init__.py -> build\lib.win32-3.5\dipy\denoise > creating build\lib.win32-3.5\dipy\denoise\tests > copying dipy\denoise\tests\test_denoise.py -> > build\lib.win32-3.5\dipy\denoise\tests > copying dipy\denoise\tests\test_kernel.py -> > build\lib.win32-3.5\dipy\denoise\tests > copying dipy\denoise\tests\test_nlmeans.py -> > build\lib.win32-3.5\dipy\denoise\tests > copying dipy\denoise\tests\test_noise_estimate.py -> > build\lib.win32-3.5\dipy\denoise\tests > copying dipy\denoise\tests\__init__.py -> > build\lib.win32-3.5\dipy\denoise\tests > creating build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\base.py -> build\lib.win32-3.5\dipy\workflows > copying 
dipy\workflows\docstring_parser.py -> > build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\reconst.py -> build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\segment.py -> build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\tracking.py -> build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\utils.py -> build\lib.win32-3.5\dipy\workflows > copying dipy\workflows\__init__.py -> build\lib.win32-3.5\dipy\workflows > creating build\lib.win32-3.5\dipy\workflows\tests > copying dipy\workflows\tests\test_docstring_parser.py -> > build\lib.win32-3.5\dipy\workflows\tests > copying dipy\workflows\tests\test_iap.py -> > build\lib.win32-3.5\dipy\workflows\tests > copying dipy\workflows\tests\test_segment.py -> > build\lib.win32-3.5\dipy\workflows\tests > copying dipy\workflows\tests\test_utils.py -> > build\lib.win32-3.5\dipy\workflows\tests > copying dipy\workflows\tests\__init__.py -> > build\lib.win32-3.5\dipy\workflows\tests > creating build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\55dir_grad.bval -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\55dir_grad.bvec -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\aniso_vox.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\C.npy -> build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\C1.pkl.gz -> build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\C3.pkl.gz -> build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\cb_2.npz -> build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\circle.npy -> build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\dipy_colormaps.json -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\dsi4169_b_table.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\dsi515_b_table.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\eg_3voxels.pkl -> > build\lib.win32-3.5\dipy\data\files > copying 
dipy\data\files\evenly_distributed_sphere_362.npz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\evenly_distributed_sphere_642.npz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\evenly_distributed_sphere_724.npz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\fib0.pkl.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\fib1.pkl.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\fib2.pkl.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\func_coef.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\func_discrete.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\grad_514.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\gtab_3shell.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\gtab_isbi2013_2shell.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\gtab_taiwan_dsi.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\life_matlab_rmse.npy -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\life_matlab_weights.npy -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\repulsion100.npz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\repulsion724.npz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\S0_10slices.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\ScannerVectors_GQI101.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_101D.bval -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_101D.bvec -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_101D.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_25.bval -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_25.bvec -> > build\lib.win32-3.5\dipy\data\files > copying 
dipy\data\files\small_25.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_64D.bvals.npy -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_64D.gradients.npy -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\small_64D.nii -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\sphere_grad.txt -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\t1_coronal_slice.npy -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\test_piesno.nii.gz -> > build\lib.win32-3.5\dipy\data\files > copying dipy\data\files\tracks300.trk -> > build\lib.win32-3.5\dipy\data\files > running build_ext > error: Unable to find vcvarsall.bat > > ---------------------------------------- > Failed building wheel for dipy > Running setup.py clean for dipy > Failed to build dipy > Installing collected packages: dipy > Running setup.py install for dipy: started > Running setup.py install for dipy: finished with status 'error' > Complete output from command C:\Users\Natalia\Anaconda3\python.exe -u > -c "import setuptools, > tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, > 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" > install --record > C:\Users\Natalia\AppData\Local\Temp\pip-4nxumgxb-record\install-record.txt > --single-version-externally-managed --compile: > running install > running build > running build_py > creating build > creating build\lib.win32-3.5 > creating build\lib.win32-3.5\dipy > copying dipy\info.py -> build\lib.win32-3.5\dipy > copying dipy\pkg_info.py -> build\lib.win32-3.5\dipy > copying dipy\__init__.py -> build\lib.win32-3.5\dipy > creating build\lib.win32-3.5\dipy\tests > copying dipy\tests\scriptrunner.py -> build\lib.win32-3.5\dipy\tests > copying dipy\tests\test_scripts.py -> build\lib.win32-3.5\dipy\tests > copying dipy\tests\__init__.py -> 
build\lib.win32-3.5\dipy\tests > creating build\lib.win32-3.5\dipy\align > copying dipy\align\imaffine.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\imwarp.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\metrics.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\reslice.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\scalespace.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\streamlinear.py -> build\lib.win32-3.5\dipy\align > copying dipy\align\__init__.py -> build\lib.win32-3.5\dipy\align > creating build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_crosscorr.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_expectmax.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_imaffine.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_imwarp.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_metrics.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_parzenhist.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_reslice.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_scalespace.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_streamlinear.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_sumsqdiff.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_transforms.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\test_vector_fields.py -> > build\lib.win32-3.5\dipy\align\tests > copying dipy\align\tests\__init__.py -> > build\lib.win32-3.5\dipy\align\tests > creating build\lib.win32-3.5\dipy\core > copying dipy\core\geometry.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\gradients.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\graph.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\histeq.py -> 
build\lib.win32-3.5\dipy\core > copying dipy\core\ndindex.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\onetime.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\optimize.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\profile.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\rng.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\sphere.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\sphere_stats.py -> build\lib.win32-3.5\dipy\core > copying dipy\core\subdivide_octahedron.py -> > build\lib.win32-3.5\dipy\core > copying dipy\core\__init__.py -> build\lib.win32-3.5\dipy\core > creating build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_geometry.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_gradients.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_graph.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_ndindex.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_optimize.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_sphere.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\test_subdivide_octahedron.py -> > build\lib.win32-3.5\dipy\core\tests > copying dipy\core\tests\__init__.py -> > build\lib.win32-3.5\dipy\core\tests > creating build\lib.win32-3.5\dipy\direction > copying dipy\direction\peaks.py -> build\lib.win32-3.5\dipy\direction > copying dipy\direction\probabilistic_direction_getter.py -> > build\lib.win32-3.5\dipy\direction > copying dipy\direction\__init__.py -> > build\lib.win32-3.5\dipy\direction > creating build\lib.win32-3.5\dipy\direction\tests > copying dipy\direction\tests\test_peaks.py -> > build\lib.win32-3.5\dipy\direction\tests > copying dipy\direction\tests\test_prob_direction_getter.py -> > build\lib.win32-3.5\dipy\direction\tests > copying dipy\direction\tests\__init__.py -> > build\lib.win32-3.5\dipy\direction\tests > creating 
build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\eudx.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\gui_tools.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\interfaces.py -> > build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\learning.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\life.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\markov.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\metrics.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\streamline.py -> > build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\utils.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\_utils.py -> build\lib.win32-3.5\dipy\tracking > copying dipy\tracking\__init__.py -> build\lib.win32-3.5\dipy\tracking > creating build\lib.win32-3.5\dipy\tracking\local > copying dipy\tracking\local\localtracking.py -> > build\lib.win32-3.5\dipy\tracking\local > copying dipy\tracking\local\__init__.py -> > build\lib.win32-3.5\dipy\tracking\local > creating build\lib.win32-3.5\dipy\tracking\local\tests > copying dipy\tracking\local\tests\test_local_tracking.py -> > build\lib.win32-3.5\dipy\tracking\local\tests > copying dipy\tracking\local\tests\test_tissue_classifier.py -> > build\lib.win32-3.5\dipy\tracking\local\tests > copying dipy\tracking\local\tests\__init__.py -> > build\lib.win32-3.5\dipy\tracking\local\tests > creating build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_distances.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_learning.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_life.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_localtrack.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_markov.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_metrics.py -> > 
build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_propagation.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_streamline.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_track_volumes.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\test_utils.py -> > build\lib.win32-3.5\dipy\tracking\tests > copying dipy\tracking\tests\__init__.py -> > build\lib.win32-3.5\dipy\tracking\tests > creating build\lib.win32-3.5\dipy\tracking\benchmarks > copying dipy\tracking\benchmarks\bench_streamline.py -> > build\lib.win32-3.5\dipy\tracking\benchmarks > copying dipy\tracking\benchmarks\__init__.py -> > build\lib.win32-3.5\dipy\tracking\benchmarks > creating build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\base.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\cache.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\cross_validation.py -> > build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\csdeconv.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dki.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dsi.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\dti.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\gqi.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\interpolate.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\mapmri.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\multi_voxel.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\odf.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\peaks.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\sfm.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\shm.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\shore.py -> build\lib.win32-3.5\dipy\reconst > copying dipy\reconst\utils.py -> build\lib.win32-3.5\dipy\reconst > copying 
dipy\reconst\__init__.py -> build\lib.win32-3.5\dipy\reconst > creating build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_bounding_box.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_csd.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_peaks.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_squash.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\bench_vec_val_sum.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > copying dipy\reconst\benchmarks\__init__.py -> > build\lib.win32-3.5\dipy\reconst\benchmarks > creating build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_cache.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_cross_validation.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_csdeconv.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dki.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi_deconv.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dsi_metrics.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_dti.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_gqi.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_interpolate.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_mapmri.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_mapmri_odf.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_multi_voxel.py -> > build\lib.win32-3.5\dipy\reconst\tests > copying dipy\reconst\tests\test_odf.py -> > 
build\lib.win32-3.5\dipy\reconst\tests > [... several hundred similar "copying dipy\... -> build\lib.win32-3.5\dipy\..." lines trimmed ...] > copying
dipy\data\files\small_25.bval -> > build\lib.win32-3.5\dipy\data\files > [... remaining dipy\data\files entries copied ...] > copying dipy\data\files\tracks300.trk -> > build\lib.win32-3.5\dipy\data\files > running build_ext > error: Unable to find vcvarsall.bat > > ---------------------------------------- > Command "C:\Users\Natalia\Anaconda3\python.exe -u -c "import setuptools, > tokenize;__file__='C:\\Users\\Natalia\\AppData\\Local\\Temp\\pip-build-57lh1tho\\dipy\\setup.py';exec(compile(getattr(tokenize, > 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" > install --record > C:\Users\Natalia\AppData\Local\Temp\pip-4nxumgxb-record\install-record.txt > --single-version-externally-managed --compile" failed with error code 1 in > C:\Users\Natalia\AppData\Local\Temp\pip-build-57lh1tho\dipy\ > > > Thank you in advance for your help, > Best, > Natalia > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Mon Apr 11 13:13:16 2016 From: arokem at gmail.com (Ariel Rokem) Date: Mon, 11 Apr 2016 10:13:16 -0700 Subject: [Neuroimaging] [dipy]Fitting diffusion models in the absence of S0 signal In-Reply-To: References: Message-ID: Just talked with Rafael about this: next Monday at 10 AM PST works for both of us. Does that work for you? I sent you a calendar invite.
Anyone else who wants to join, please let me know and I will add you to the hangout invite as well. Ariel On Mon, Apr 11, 2016 at 8:58 AM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > > Rafael, can you please make a decision on when to meet to discuss this design change? You should pick the time, as you are in a very different timezone than us. Ariel and I have only a 3-hour difference. > > On Sun, Apr 10, 2016 at 12:13 PM Ariel Rokem wrote: > >> Hi Eleftherios, >> >> On Sat, Apr 9, 2016 at 3:08 PM, Eleftherios Garyfallidis < >> garyfallidis at gmail.com> wrote: >>> >>> Hi Rafael and Ariel, >>> >>> Apologies for the delay in answering here. I think we need to set up a hangout to discuss this. >>> >>> One thing that may be important for this discussion is that the function >>> >>> from dipy.core.gradients import gradient_table >>> >>> has a parameter called b0_threshold. >>> >>> This can be set to 300 or higher, and then the b=300 volumes will be treated as the b0s. So, if the datasets don't have b=0 but do have b=300, those can be used instead. >>> >>> This means that just by changing the b0_threshold the datasets can be fit in different ways. Could it be that the easier solution is to call the gradient table with a different b0 threshold rather than changing the API? >>> >> No - unfortunately I don't think this would be a solution to this particular concern. That is because what we need is really the value of the intercept of the signal-by-b-value curve. If we have an S0 measurement, we take that, but if we don't, we need to *estimate* it. >> >>> I will look now at the free water implementation to understand the different issues better. >>> >> Yeah - some of the recent commits are about this particular issue (e.g., >> https://github.com/nipy/dipy/pull/835/commits/89c09c7e4095309c9d7ae42eee313a4fd1f9c880), >> but note changes that follow (!).
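[Editor's note: the b0_threshold behaviour Eleftherios describes can be sketched with a few lines of plain numpy. This is a simplified stand-in for dipy's gradient_table, which does considerably more; the threshold value and b-values here are illustrative only.]

```python
import numpy as np

# Illustrative b-values: no true b=0 volumes, lowest shell at b=300
bvals = np.array([300, 300, 1000, 1000, 2000, 2000])

# With a low threshold, no volume is classified as a "b0"
low_mask = bvals <= 50

# Raising the threshold to 300 treats the b=300 volumes as the
# non-diffusion-weighted reference, as suggested in the thread
raised_mask = bvals <= 300

print(low_mask.sum(), raised_mask.sum())  # 0 2
```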
Maybe Rafael can also help explain. >> >>> Cheers, >>> Eleftherios >>> >>> p.s. Please give me your availability for a design hangout during the week. >> >> You can (always) find my availability here: >> https://www.google.com/calendar/embed?src=arokem%40gmail.com&ctz=America/Los_Angeles >> >> Cheers, >> >> Ariel >>> >>> On Fri, Mar 25, 2016 at 11:14 AM Ariel Rokem wrote: >>>> Hi Rafael, >>>> >>>> On Thu, Mar 24, 2016 at 4:12 AM, Rafael Henriques >>> > wrote: >>>>> Hi Eleftherios, >>>>> >>>>> What can we do if the data don't have b0s? >>>>> In recent years, everyone was including b0 data in their DWI acquisitions. However, nowadays some groups are starting to acquire diffusion-weighted volumes with low b-values (e.g. 300 s/mm^2) instead of the b0 volumes. They are doing this to ensure that, when fitting diffusion models, they do not take perfusion confounding effects into account. So my question is - what can we do to generalize Dipy for these cases? My suggestion is to always include S0 as a model parameter, so even if users do not have b0 data, the model can easily give the extrapolated, non-perfusion-affected S0 signal. >>>>> >>>> My example code was not really that great for demonstrating this point. I have now updated the notebook so that it works with data that has a b=0 measurement, but also with data that doesn't (you'll need to change the commented-out line in cell 3 to see both options). >>>> >>>> I also have two alternative implementations, following Eleftherios' suggestions (I think): >>>> >>>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>>> >>>> In one implementation an estimate of S0 (`S0_hat`) is part of the TensorFit object (I think that's what Eleftherios is suggesting). In the other implementation, the estimate is part of the TensorModel.fit function (as you suggest).
>>>> >>>> The main disadvantage of alternative 1 is that we would have to pass the data again into a method of the `TensorFit` object. The main disadvantage of alternative 2 is that it requires a change to the `TensorFit.__init__` API. My own tendency is to prefer the change to the `TensorFit.__init__` API, because I don't think that people are using that API on its own; they are typically getting their `TensorFit` objects from the `TensorModel.fit` function. >>>> >>>> I think that passing the data back into the `TensorFit` object would not only be error-prone, but also less efficient. >>>> >>>> Importantly, this is not just a matter for people who use the prediction API to see that the model fits the data, but also an issue for fitting models that depend on the DTI model, such as the new FWE DTI model. >>>> >>>> Cheers, >>>> >>>> Ariel >>>> >>>>> Also, how can you recover the S0 information using the line that you suggested? If params only have the diffusion tensor information, that line will always be equal to 1, right? Am I missing something here? >>>>> Best, >>>>> Rafael >>>>> >>>>> > Hi Ariel, >>>>> > >>>>> > Apologies for the delay in answering. >>>>> > >>>>> > What I understand is that now the fit_model is doing the prediction for the S0. Am I correct? >>>>> > You recreate a predicted S0 inside fit_model, but fit_model is about fitting and not about predicting. >>>>> > >>>>> > I am not comfortable with changing fit_model to generate two parameters (params and S0). >>>>> > >>>>> > This command can be called inside the predict method: >>>>> > S0 = np.mean(np.exp(np.dot(dm, params))[..., gtab.b0s_mask]) >>>>> > >>>>> > So, for me there is no reason to change the init method of TensorFit. >>>>> > >>>>> > I hope I am not missing something. >>>>> > Let me know if this suggestion is helpful.
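[Editor's note: Ariel's point that, without an S0 measurement, S0 must be *estimated* as the intercept of the signal-by-b-value curve can be illustrated with a deliberately simplified, isotropic toy in plain numpy. The mono-exponential model and the name `S0_hat` below are illustrative; dipy's actual tensor fit works with the full design matrix.]

```python
import numpy as np

# Toy mono-exponential signal S(b) = S0 * exp(-b * D), sampled only
# at b > 0, i.e. with no non-diffusion-weighted (b=0) measurement
S0_true = 100.0
D = 1.5e-3                        # diffusivity, on an mm^2/s scale
bvals = np.array([300.0, 1000.0, 2000.0])
signal = S0_true * np.exp(-bvals * D)

# log(S) = log(S0) - D * b, so a log-linear least-squares fit
# recovers S0 as the exponential of the intercept
slope, intercept = np.polyfit(bvals, np.log(signal), 1)
S0_hat = np.exp(intercept)

print(round(S0_hat, 6))  # ~100.0 on this noiseless toy data
```

With noisy, multi-directional data the same intercept would come out of a (weighted) least-squares fit over all measurements, which is roughly the role the estimated S0 plays in the alternatives discussed above.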
>>>>> > Cheers, >>>>> > Eleftherios >>>>> > >>>>> > On Sun, Mar 20, 2016 at 12:04 PM, Ariel Rokem wrote: >>>>> > >>>>> >> Hi everyone, >>>>> >> >>>>> >> Thought I would re-raise this. Anyone have any thoughts here? Would a PR against the DTI and DKI modules be more helpful to clarify? >>>>> >> >>>>> >> Cheers, >>>>> >> >>>>> >> Ariel >>>>> >> >>>>> >> On Sat, Mar 5, 2016 at 3:04 AM, Ariel Rokem wrote: >>>>> >>> >>>>> >>> On Thu, Mar 3, 2016 at 7:28 AM, Eleftherios Garyfallidis < >>>>> >>> garyfallidis at gmail.com> wrote: >>>>> >>>> Sorry, your suggestion is not exactly clear. Can you show us how the code will look with your proposal? Also, apart from DTI and DKI, what other models will be affected by these changes? Is this a change suggested only for DTI and DKI, or will it affect all or most reconstruction models? >>>>> >>>> >>>>> >>> First of all, to answer your last question: this will certainly affect DTI and DKI, and there will be other models to follow, for example the FWDTI that Rafael is currently proposing in that PR. The idea would be to also more tightly integrate these three models (and future extensions!), so that we can remove some of the redundancies that currently exist. We could make this a part of the base.Reconst* methods - it might apply to other models as well (e.g. CSD, SFM, etc.). But that's part of what I would like to discuss here. >>>>> >>> >>>>> >>> As for code, for now, here's a sketch of what this would look like for the tensor model: >>>>> >>> >>>>> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>>>> >>> >>>>> >>> Note that though it changes the prediction API a bit, not much else would have to change.
In particular, all the code that relies on there >>>>> being 12 >>>>> >>> model parameters will still be intact, because S0 doesn't go into >>>>> the model >>>>> >>> parameters. >>>>> >>> >>>>> >>> What do you think? Am I missing something big here? Or should I go >>>>> ahead >>>>> >>> and start working on a PR implementing this? >>>>> >>> >>>>> >>> Thanks! >>>>> >>> >>>>> >>> Ariel >>>>> >>> >>>>> >>> >>>>> >>> >>>>> >>>> On Mon, Feb 29, 2016 at 11:53 AM, Ariel Rokem >>>> gmail.com> wrote: >>>>> >>>> >>>>> >>>>> Hi everyone, >>>>> >>>>> >>>>> >>>>> In Rafael's recent PR implementing free-water-eliminated DTI ( >>>>> >>>>> https://github.com/nipy/dipy/pull/835), we had a little bit of a >>>>> >>>>> discussion about the use of the non-diffusion weighted signal >>>>> (S0). As >>>>> >>>>> pointed out by Rafael, in the absence of an S0 in the measured >>>>> data, for >>>>> >>>>> some models, that can be derived from the model fit ( >>>>> >>>>> https://github.com/nipy/dipy/pull/835#issuecomment-183060855). >>>>> >>>>> >>>>> >>>>> I think that we would like to support using data both with and >>>>> without >>>>> >>>>> S0. On the other hand, I don't think that we should treat the >>>>> derived S0 as >>>>> >>>>> a model parameter, because in some cases, we want to provide S0 >>>>> as an input >>>>> >>>>> (for example, when predicting back the signal for another >>>>> measurement, with >>>>> >>>>> a different ). In addition, it would be hard to incorporate that >>>>> into the >>>>> >>>>> model_params variable of the TensorFit object, while >>>>> maintaining backwards >>>>> >>>>> compatibility of the TensorModel/TensorFit and derived classes >>>>> (e.g., DKI). >>>>> >>>>> >>>>> >>>>> My proposal is to have an S0 property for ReconstFit objects. >>>>> When this >>>>> >>>>> is calculated from the model (e.g. in DTI), it gets set by the >>>>> `fit` method >>>>> >>>>> of the ReconstModel object. When it isn't, it can be set from >>>>> the data. 
>>>>> >>>>> Either way, it can be overridden by the user (e.g., for the purpose of predicting on a new dataset). This might change the behavior of the prediction code slightly, but maybe that is something we can live with? >>>>> >>>>> >>>>> >>>>> Happy to hear what everyone thinks, before we move ahead with this. >>>>> >>>>> >>>>> >>>>> Cheers, >>>>> >>>>> >>>>> >>>>> Ariel >>>>> >>>>> >>>>> >>>>> _______________________________________________ >>>>> >>>>> Neuroimaging mailing list >>>>> >>>>> Neuroimaging at python.org >>>>> >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>> [... repeated quoted list footers trimmed ...] > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at
python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Mon Apr 11 13:21:24 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Mon, 11 Apr 2016 17:21:24 +0000 Subject: [Neuroimaging] [dipy]Fitting diffusion models in the absence of S0 signal In-Reply-To: References: Message-ID: Cool, thank you. That time is good for me. On Mon, Apr 11, 2016 at 1:15 PM Ariel Rokem wrote: > Just talked with Rafael about this: next Monday at 10 AM PST works for > both of us. Does that work for you? I sent you a calendar invite. Anyone > else who wants to join, please let me know and I will add you to the > hangout invite as well. > > Ariel > > On Mon, Apr 11, 2016 at 8:58 AM, Eleftherios Garyfallidis < > garyfallidis at gmail.com> wrote: > >> >> Rafael can you please make a decision for when to meet to discussion >> about this design change? You should pick the time as you are in a very >> different timezone than us. Me and Ariel have only 3 hours of difference. >> >> >> On Sun, Apr 10, 2016 at 12:13 PM Ariel Rokem wrote: >> >>> Hi Eleftherios, >>> >>> On Sat, Apr 9, 2016 at 3:08 PM, Eleftherios Garyfallidis < >>> garyfallidis at gmail.com> wrote: >>> >>>> >>>> Hi Rafael and Ariel, >>>> >>>> Apologies for delaying to answer here. I think we need to set a hangout >>>> to discuss about this. >>>> >>>> One thing that maybe important for this discussion is that the function >>>> >>>> from dipy.core.gradients import gradient_table >>>> >>>> has a parameter called b0_threshold. >>>> >>>> This can be set to be 300 or higher and then the b0 will be considered >>>> as the one at 300. So, if the datasets don't have b=0 but b=300 these >>>> can be used instead. >>>> >>> >>>> This means that just by changing the b0_threshold the datasets can be >>>> fit in a different ways. 
>>>> [... rest of quoted thread trimmed ...]
_______________________________________________
Neuroimaging mailing list
Neuroimaging at python.org
https://mail.python.org/mailman/listinfo/neuroimaging
-------------- next part -------------- An HTML attachment was
scrubbed... URL: From jevillalonr at gmail.com Mon Apr 11 16:16:50 2016 From: jevillalonr at gmail.com (Julio Villalon) Date: Mon, 11 Apr 2016 13:16:50 -0700 Subject: [Neuroimaging] [dipy]Fitting diffusion models in the absence of S0 signal In-Reply-To: References: Message-ID: Hi guys, I was wondering if you guys could share some literature about the reason behind acquiring volumes with low b-values instead of B0s. Thanks! Julio 2016-04-11 10:21 GMT-07:00 Eleftherios Garyfallidis : > Cool, thank you. That time is good for me. > > On Mon, Apr 11, 2016 at 1:15 PM Ariel Rokem wrote: > >> Just talked with Rafael about this: next Monday at 10 AM PST works for >> both of us. Does that work for you? I sent you a calendar invite. Anyone >> else who wants to join, please let me know and I will add you to the >> hangout invite as well. >> >> Ariel >> >> On Mon, Apr 11, 2016 at 8:58 AM, Eleftherios Garyfallidis < >> garyfallidis at gmail.com> wrote: >> >>> >>> Rafael can you please make a decision for when to meet to discussion >>> about this design change? You should pick the time as you are in a very >>> different timezone than us. Me and Ariel have only 3 hours of difference. >>> >>> >>> On Sun, Apr 10, 2016 at 12:13 PM Ariel Rokem wrote: >>> >>>> Hi Eleftherios, >>>> >>>> On Sat, Apr 9, 2016 at 3:08 PM, Eleftherios Garyfallidis < >>>> garyfallidis at gmail.com> wrote: >>>> >>>>> >>>>> Hi Rafael and Ariel, >>>>> >>>>> Apologies for delaying to answer here. I think we need to set a >>>>> hangout to discuss about this. >>>>> >>>>> One thing that maybe important for this discussion is that the >>>>> function >>>>> >>>>> from dipy.core.gradients import gradient_table >>>>> >>>>> has a parameter called b0_threshold. >>>>> >>>>> This can be set to be 300 or higher and then the b0 will be considered >>>>> as the one at 300. So, if the datasets don't have b=0 but b=300 these >>>>> can be used instead. 
>>>>> >>>> >>>>> This means that just by changing the b0_threshold the datasets can be >>>>> fit in different ways. Could it be that the easier solution is to call the gradient table >>>>> in a different way (with a different >>>>> b0 threshold) rather than changing the API? >>>>> >>>>> >>>> No - unfortunately I don't think this would be a solution to this >>>> particular concern. That is because what we need is really the value of the >>>> intercept of the signal-by-b-value curve. If we have an S0 measurement, we >>>> take that, but if we don't we need to *estimate* it. >>>> >>>> I will now look at the free water implementation to understand better >>>>> the different issues. >>>>> >>>> >>>> Yeah - some of the recent commits are about this particular issue >>>> (e.g., >>>> https://github.com/nipy/dipy/pull/835/commits/89c09c7e4095309c9d7ae42eee313a4fd1f9c880), >>>> but note the changes that follow (!). Maybe Rafael can also help explain. >>>> >>>>> Cheers, >>>>> Eleftherios >>>>> >>>>> p.s. Please give me your availability for a design hangout during the >>>>> week. >>>> >>>> You can (always) find my availability here: >>>> https://www.google.com/calendar/embed?src=arokem%40gmail.com&ctz=America/Los_Angeles >>>> >>>> Cheers, >>>> >>>> Ariel >>>> >>>>> >>>>> On Fri, Mar 25, 2016 at 11:14 AM Ariel Rokem wrote: >>>>>> Hi Rafael, >>>>>> >>>>>> On Thu, Mar 24, 2016 at 4:12 AM, Rafael Henriques < >>>>>> rafaelnh21 at gmail.com> wrote: >>>>>>> Hi Eleftherios, >>>>>>> >>>>>>> What can we do if the data don't have b0s? >>>>>>> In recent years, everyone was including b0 data in their DWI >>>>>>> acquisitions. However, nowadays some groups are starting to acquire >>>>>>> diffusion volumes with low b-values (e.g. 300 s/mm2) >>>>>>> instead >>>>>>> of the b0 volumes. They are doing this to ensure that when fitting >>>>>>> diffusion models they do not take into account perfusion confounding >>>>>>> effects.
So my question is - what can we do to generalize Dipy for >>>>>>> these cases? My suggestion is to always include S0 as a model >>>>>>> parameter, >>>>>>> so even if users do not have b0 data, the model can easily give the >>>>>>> extrapolated, non-perfusion-affected S0 signal. >>>>>>> >>>>>> >>>>>> My example code was not really that great to demonstrate this point. >>>>>> I have now updated the notebook so that it works with data that has a b=0 >>>>>> measurement, but also with data that doesn't (you'll need to change the >>>>>> commented-out line in cell 3 to see both options). >>>>>> >>>>>> I also have two alternative implementations, following Eleftherios' >>>>>> suggestions (I think): >>>>>> >>>>>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>>>>> >>>>>> In one implementation an estimate of S0 (`S0_hat`) is part of the >>>>>> TensorFit object (I think that's what Eleftherios is suggesting). In the >>>>>> other implementation, the estimate is part of the TensorModel.fit function >>>>>> (as you suggest). >>>>>> >>>>>> The main disadvantage of alternative 1 is that we would have to pass >>>>>> the data again into a method of the `TensorFit` object. The main >>>>>> disadvantage of alternative 2 is that it requires a change to the >>>>>> `TensorFit.__init__` API. My own tendency is to prefer the change to the >>>>>> `TensorFit.__init__` API, because I don't think that people are using that >>>>>> API on its own, but are typically getting their `TensorFit` objects from >>>>>> the `TensorModel.fit` function. >>>>>> >>>>>> I think that passing the data again into the `TensorFit` object >>>>>> will not only be error-prone, but is also not as efficient. >>>>>> >>>>>> Importantly, this is not just a matter for people who use the >>>>>> prediction API to see that the model fits the data, but also an issue for >>>>>> fitting models that depend on the DTI model, such as the new FWE DTI model.
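The "intercept of the signal-by-b-value curve" point can be made concrete with a small standalone sketch. This is not Dipy code: the b-values, `S0_true`, and `D_true` below are illustrative assumptions, and the sketch only demonstrates the general idea that, for a log-linear decay model, S0 is the exponentiated intercept of the log(signal)-by-b-value fit, so it can be estimated even when no b=0 volume was acquired:

```python
# Minimal NumPy sketch (not Dipy code): estimating S0 as the intercept of a
# log-linear fit when no b=0 volume is available. All values are illustrative.
import numpy as np

S0_true = 1500.0      # signal at b=0 (what we want to recover)
D_true = 0.7e-3       # diffusivity, mm^2/s
bvals = np.array([300.0, 1000.0, 2000.0])  # note: no b=0 measurement

# Noiseless mono-exponential decay: S(b) = S0 * exp(-b * D)
signal = S0_true * np.exp(-bvals * D_true)

# Log-linear model: log(S) = log(S0) - b * D. Fit slope and intercept.
X = np.column_stack([-bvals, np.ones_like(bvals)])
coefs, *_ = np.linalg.lstsq(X, np.log(signal), rcond=None)
D_hat, log_S0_hat = coefs
S0_hat = np.exp(log_S0_hat)  # the extrapolated S0 estimate
```

For noiseless data this recovers S0_true and D_true exactly (up to floating point); with real data, the same intercept estimate is what stands in for the missing b=0 measurement.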
>>>>>> >>>>>> Cheers, >>>>>> >>>>>> Ariel >>>>>> >>>>>> >>>>>> >>>>>> >>>>>>> Also, how can you recover the S0 information using the line that you >>>>>>> are suggested? If params only have the diffusion tensor information, >>>>>>> that line will always be equal to 1, right? Am I missing something >>>>>>> here? >>>>>> >>>>>> Best, >>>>>>> Rafael >>>>>>> >>>>>>> >>>>>>> > Hi Ariel, >>>>>>> > >>>>>>> > Apologies for delaying to answer. >>>>>>> > >>>>>>> > What I understand is that now the fit_model is doing the >>>>>>> prediction for the >>>>>>> > S0. Am I correct? >>>>>>> > You recreate a predicted S0 inside fit_model but fit_model is >>>>>>> about fitting >>>>>>> > and not about predicting. >>>>>>> > >>>>>>> > I am not comfortable to changing fit_model to generate two >>>>>>> parameters >>>>>>> > (params and S0). >>>>>>> > >>>>>>> > This command can be called inside the predict method >>>>>>> > S0 = np.mean(np.exp(np.dot(dm, params))[..., gtab.b0s_mask]) >>>>>>> > >>>>>>> > So, for me there is no reason of changing the init method of >>>>>>> TensorFit. >>>>>>> > >>>>>>> > I hope I am not missing something. >>>>>>> > Let me know if this suggestion is helpful. >>>>>>> > >>>>>>> > Cheers, >>>>>>> > Eleftherios >>>>>>> > >>>>>>> > On Sun, Mar 20, 2016 at 12:04 PM, Ariel Rokem >>>>>>> wrote: >>>>>>> > >>>>>>> >> Hi everyone, >>>>>>> >> >>>>>>> >> Thought I would re-raise this. Anyone have any thoughts here? >>>>>>> Would a PR >>>>>>> >> against the DTI and DKI modules be more helpful to clarify? >>>>>>> >> >>>>>>> >> Cheers, >>>>>>> >> >>>>>>> >> Ariel >>>>>>> >> >>>>>>> >> On Sat, Mar 5, 2016 at 3:04 AM, Ariel Rokem >>>>>>> wrote: >>>>>>> >> >>>>>>> >>> >>>>>>> >>> On Thu, Mar 3, 2016 at 7:28 AM, Eleftherios Garyfallidis < >>>>>>> >>> garyfallidis at gmail.com> wrote: >>>>>>> >>> >>>>>>> >>>> Sorry your suggestion is not exactly clear. Can you give show >>>>>>> us how the >>>>>>> >>>> code will look with your proposal? 
Also, apart from DTI and DKI >>>>>>> what other >>>>>>> >>>> models will be affected from this changes. Is this a change >>>>>>> suggested only >>>>>>> >>>> for DTI and DKI or will affect all or most reconstruction >>>>>>> models? >>>>>>> >>>> >>>>>>> >>>> >>>>>>> >>> First of all, to answer your last question: this will certainly >>>>>>> affect >>>>>>> >>> DTI and DKI, and there will be other models to follow. For >>>>>>> example the >>>>>>> >>> FWDTI that Rafael is currently proposing in that PR. The idea >>>>>>> would be to >>>>>>> >>> also more tightly integrate these three models (and future >>>>>>> extensions... >>>>>>> >>> !), so that we can remove some of the redundancies that >>>>>>> currently exist. We >>>>>>> >>> could make this a part of the base.Reconst* methods - it might >>>>>>> apply to >>>>>>> >>> other models as well (e.g. CSD, SFM, etc). But that's part of >>>>>>> what I would >>>>>>> >>> like to discuss here. >>>>>>> >>> >>>>>>> >>> As for code, for now, here's a sketch of what this would look >>>>>>> like for >>>>>>> >>> the tensor model: >>>>>>> >>> >>>>>>> >>> https://gist.github.com/arokem/508dc1b22bdbd0bdd748 >>>>>>> >>> >>>>>>> >>> Note that though it changes the prediction API a bit, not much >>>>>>> else would >>>>>>> >>> have to change. In particular, all the code that relies on there >>>>>>> being 12 >>>>>>> >>> model parameters will still be intact, because S0 doesn't go >>>>>>> into the model >>>>>>> >>> parameters. >>>>>>> >>> >>>>>>> >>> What do you think? Am I missing something big here? Or should I >>>>>>> go ahead >>>>>>> >>> and start working on a PR implementing this? >>>>>>> >>> >>>>>>> >>> Thanks! 
>>>>>>> >>> >>>>>>> >>> Ariel >>>>>>> >>> >>>>>>> >>> >>>>>>> >>> >>>>>>> >>>> On Mon, Feb 29, 2016 at 11:53 AM, Ariel Rokem >>>>>> gmail.com> wrote: >>>>>>> >>>> >>>>>>> >>>>> Hi everyone, >>>>>>> >>>>> >>>>>>> >>>>> In Rafael's recent PR implementing free-water-eliminated DTI ( >>>>>>> >>>>> https://github.com/nipy/dipy/pull/835), we had a little bit >>>>>>> of a >>>>>>> >>>>> discussion about the use of the non-diffusion weighted signal >>>>>>> (S0). As >>>>>>> >>>>> pointed out by Rafael, in the absence of an S0 in the measured >>>>>>> data, for >>>>>>> >>>>> some models, that can be derived from the model fit ( >>>>>>> >>>>> https://github.com/nipy/dipy/pull/835#issuecomment-183060855). >>>>>>> >>>>> >>>>>>> >>>>> I think that we would like to support using data both with and >>>>>>> without >>>>>>> >>>>> S0. On the other hand, I don't think that we should treat the >>>>>>> derived S0 as >>>>>>> >>>>> a model parameter, because in some cases, we want to provide >>>>>>> S0 as an input >>>>>>> >>>>> (for example, when predicting back the signal for another >>>>>>> measurement, with >>>>>>> >>>>> a different ). In addition, it would be hard to incorporate >>>>>>> that into the >>>>>>> >>>>> model_params variable of the TensorFit object, while >>>>>>> maintaining backwards >>>>>>> >>>>> compatibility of the TensorModel/TensorFit and derived classes >>>>>>> (e.g., DKI). >>>>>>> >>>>> >>>>>>> >>>>> My proposal is to have an S0 property for ReconstFit objects. >>>>>>> When this >>>>>>> >>>>> is calculated from the model (e.g. in DTI), it gets set by the >>>>>>> `fit` method >>>>>>> >>>>> of the ReconstModel object. When it isn't, it can be set from >>>>>>> the data. >>>>>>> >>>>> Either way, it can be over-ridden by the user (e.g., for the >>>>>>> purpose of >>>>>>> >>>>> predicting on a new data-set). This might change the behavior >>>>>>> of the >>>>>>> >>>>> prediction code slightly, but maybe that is something we can >>>>>>> live with? 
>>>>>>> >>>>> >>>>>>> >>>>> Happy to hear what everyone thinks, before we move ahead with >>>>>>> this. >>>>>>> >>>>> >>>>>>> >>>>> Cheers, >>>>>>> >>>>> >>>>>>> >>>>> Ariel >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> _______________________________________________ >>>>>>> >>>>> Neuroimaging mailing list >>>>>>> >>>>> Neuroimaging at python.org >>>>>>> >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>>> >>>>>>> >>>> _______________________________________________ >>>>>>> >>>> Neuroimaging mailing list >>>>>>> >>>> Neuroimaging at python.org >>>>>>> >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>> >>>> >>>>>>> >>>> >>>>>>> >>> >>>>>>> >> >>>>>>> >> _______________________________________________ >>>>>>> >> Neuroimaging mailing list >>>>>>> >> Neuroimaging at python.org >>>>>>> >> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>> >> >>>>>>> >> >>>>>>> _______________________________________________ >>>>>>> Neuroimaging mailing list >>>>>>> Neuroimaging at python.org >>>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>> >>>>>> _______________________________________________ >>>>>> Neuroimaging mailing list >>>>>> Neuroimaging at python.org >>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>> >>>>> >>>>> _______________________________________________ >>>>> Neuroimaging mailing list >>>>> Neuroimaging at python.org >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>> >>>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> _______________________________________________ >> Neuroimaging 
mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Tue Apr 12 11:26:28 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 12 Apr 2016 08:26:28 -0700 Subject: [Neuroimaging] Journal articles based on PRs Message-ID: Hi everyone, In a conversation I had with Rafael recently, he mentioned to me the Journal of Open Research Software (http://openresearchsoftware.metajnl.com/) that publishes articles about open-source research software, and proposed this as a good place to publish software contributions in our community. This is a good thing because it provides a venue for articles specifically focused on software implementations, even in cases where the methods have previously been published as scientific articles. This provides a standard reference for the software, and an opportunity for researchers who spend time writing open-source software to get credit for the work they are doing. I propose to submit articles to JORS, based on newer additions to libraries (particularly Dipy, but maybe others as well?), a PR, or series of PRs that contribute substantial new features, or a substantial upgrade to previous features. This addresses two major challenges: The first is the challenge we face in incentivizing new contributors to join us. This is because if a standard reference article has already been published for the software, their newer contributions might not get them credit when this standard reference is cited. For example, Dipy contributors who joined the project after 2014 get no credit when that paper is cited. 
Two recent examples from Dipy are the work that Stephan Meesters has done on contextual enhancement and fiber-to-bundle coherence measures (still in progress in #828), and the work Rutger Fick is doing implementing Laplacian regularization for MAP (#740). These are both implementations of previously published scientific work (in these cases, work that these contributors have been involved in). As you can all appreciate, the effort of implementing these methods in Dipy is substantial, and we want to incentivize these efforts and reward them. A journal article that other researchers can cite is common currency for that. Another challenge we face is incentivizing code review. This is a serious bottleneck for progress. I propose to add code reviewers as authors to these papers. This will incentivize the substantial effort that goes into reviewing code. JORS allows author contributions to be specified and we would clearly designate these kinds of contributions, so as not to diminish the effort made by the primary author of the code. But I would like to include the people doing code review (if only because I have spent a lot of my own time in code review...). I hope that this will allow people to justify spending time doing this crucial part of the work, and energize our code review process a bit. I'd be happy to hear what people think about this idea. Cheers, Ariel -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Tue Apr 12 16:13:36 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 12 Apr 2016 13:13:36 -0700 Subject: [Neuroimaging] Conda-forge. (was "Re: [dipy]") Message-ID: Hi Natalia, Great to hear that you've resolved that. Indeed, I believe Python 3.5 requires the 2015 Visual Studio to compile our Cython extensions. You (and others here) might be interested to hear that I am also making an effort to make it possible to install Dipy on different operating systems with 'conda'.
This will be enabled by conda-forge, which is a community-based approach to creating, maintaining and building conda recipes on continuous integration systems (see: https://conda-forge.github.io/). When I finally succeed in making this work, installing Dipy will be a matter of typing the following two commands into your shell (and should work smoothly across different platforms...): conda config --add channels conda-forge conda install dipy In the meantime, I am still working on getting the nibabel dependency to build on all systems (almost there! https://github.com/conda-forge/staged-recipes/pull/284), and will then proceed to continue work on the dipy recipe ( https://github.com/conda-forge/staged-recipes/pull/308). If anyone else wants to get involved in maintaining these recipes, that would be great. Cheers, Ariel On Sun, Apr 10, 2016 at 2:27 PM, Natalia Kowalczyk wrote: > SOLVED! > > I had to install Visual Studio 2015 community edition (with C++ additional > tools checked) > >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From krzysztof.gorgolewski at gmail.com Tue Apr 12 16:47:57 2016 From: krzysztof.gorgolewski at gmail.com (Chris Filo Gorgolewski) Date: Tue, 12 Apr 2016 13:47:57 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: Hi Ariel, I have recently been thinking about the problem of giving academic credit to contributors. In Nipype we are facing exactly the same problem - people joining the project after the paper was published do not benefit from citations. The project is as strong as its community, so we wanted to give people credit for their hard work. After some deliberations we have decided to go with a different approach than the one you are proposing. Beginning from the next release we will switch the main recommended citation from the Frontiers paper to a Zenodo handle.
In the past weeks we have been reaching out to Nipype contributors to get their permission and details to become coauthors of this Zenodo entry (please get in touch if you have contributed to Nipype and did not receive an email about this). Zenodo is a non-profit organisation that provides citable DOIs for software (free of charge). It is indexed by Google Scholar and is easy to set up and update. Thanks to this feature, with each release we will keep adding new contributors as coauthors. This solution has the following advantages:
- Everyone contributing to nipype gets credit in the form of citations
- There is one publication, which makes citing easy (compare to Freesurfer https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation; also consider that some journals limit the number of references)
- There is no overhead of writing, submitting, and revising a manuscript on top of developing and revising code
There is, however, one big drawback - there is only one first and one senior author on the Zenodo handle. I think this calls for a hybrid solution: an always up-to-date Zenodo entry combined with individual papers written by developers who feel they need such a publication and are willing to put in the extra effort of writing the manuscript. I would stay away from making a "deal" or explicitly recommending one particular commercial publisher. There are many outlets which publish software or software extensions (for example, F1000Research has a "Software Tool" category that often publishes small contributions: see http://f1000research.com/articles/2-192/v2 for example). I would leave it open to the developers to choose the venue where they want to publish, using their best judgement. Why is it important? Most publishers are commercial entities, and strongly recommending one over another benefits them financially. I think that as an open community, we should stay away from such decisions to avoid being accused of some shady internal deals.
Let developers (and potential authors of such papers) decide by themselves. I hope this helps! Best, Chris On Tue, Apr 12, 2016 at 8:26 AM, Ariel Rokem wrote: > Hi everyone, > > In a conversation I had with Rafael recently, he mentioned to me the > Journal of Open Research Software ( > http://openresearchsoftware.metajnl.com/) that publishes articles about > open-source research software, and proposed this as a good place to publish > software contributions in our community. This is a good thing because it > provides a venue for articles specifically focused on software > implementations, even in cases where the methods have previously been > published as scientific articles. This provides a standard reference for > the software, and an opportunity for researchers who spend time writing > open-source software to get credit for the work they are doing. > > I propose to submit articles to JORS, based on newer additions to > libraries (particularly Dipy, but maybe others as well?), a PR, or series > of PRs that contribute substantial new features, or a substantial upgrade > to previous features. This addresses two major challenges: > > The first is the challenge we face in incentivizing new contributors to > join us. This is because if a standard reference article has already been > published for the software, their newer contributions might not get them > credit when this standard reference is cited. For example, Dipy > contributors who joined the project after 2014 get no credit when that > paper is cited. > > Two recent examples from Dipy are the work that Stephan Meesters has done > on contextual enhancement and fiber-to-bundle coherence measures (still in > progress in #828), and the work Rutger Fick is doing implementing Laplacian > regularization for MAP (#740). These are both implementations of previously > published scientific work (in these cases, work that these contributors > have been involved in). 
As you can all appreciate, the effort of > implementing these methods in Dipy is substantial, and we want to > incentivize these efforts and reward them. A journal article that other > researchers can cite is common currency for that. > > Another challenge we face is incentivizing code review. This is a serious > bottle-neck for progress. I propose to add code reviewers as authors to > these papers. This will incentivize the substantial effort that goes into > reviewing code. JORS allows author contributions to be specified and we > would clearly designate these kinds of contributions, so as to not diminish > from the effort made by the primary author of the code. But I would like to > include the people doing code review (if only because I have spent a lot my > own time in code review...). I hope that this will allow people to justify > spending time doing this crucial part of the work, and energize our code > review process a bit. > > I'd be happy to hear what people think about this idea. > > Cheers, > > Ariel > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Tue Apr 12 17:49:17 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 12 Apr 2016 14:49:17 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: Hi Chris, Very helpful points. Full disclosure up-front: I have now also volunteered to serve as an associate editor on JORS. Mostly because I realize that in advocating for this, I am pushing the burden further down the stack to the people who will now need to edit/review our papers... This is not a paid position, so I really don't feel like I have a very strong conflict of interest here, but it's worth mentioning. They also might not take me, making this even less relevant. 
On Tue, Apr 12, 2016 at 1:47 PM, Chris Filo Gorgolewski < krzysztof.gorgolewski at gmail.com> wrote: > Hi Ariel, > I have been recently thinking about the problem of giving academic credit > to contributors. In Nipype we are facing exactly the same problem - people > joining the project after the paper was published do not benefit from > citations. The project is as strong as its community so we wanted to give > people credit for their hard work. After some deliberations we have decided > to go with a different approach then you are proposing. Beginning from the > next release we will switch the main recommended citation from the > Frontiers paper to a Zenodo handle. In the past > weeks we have been reaching out to Nipype contributors to get their > permission and details to become coauthors of this Zenodo entry (please get > in touch if you have contributed to Nipype and did not receive an email > about this). > > Zenodo is a non profit organisation that provides citable DOIs for > software (free of charge). It is indexed by Google scholar and is easy to > set up and update. Thanks to this feature, with each release we will keep > adding new contributors as coauthors. This solution has the following > advantages: > - Everyone contributing to nipype gets credit in the form of citations > - There is one publication which makes citing easy (compare to Freesurfer > https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation, also > consider that some journals limit the number of references) > - There is no overhead of writing, submitting, and revising a manuscript > on top of developing and revising code > Yep - I like the Zenodo solution. We should consider going that route for Dipy as well. I think it's a good idea, on top of writing more papers. 
For large-ish contributions to Dipy, much of the paper will already get written during the PR process - we usually require rather well fleshed-out examples, so there are already Figures to include in the paper, and some explanations of the methods and their specific benefits and issues. A lot of that can be redirected into the potential software contribution paper. There is, however, one big drawback - there is only one first and one > senior author on the Zenodo handle. I think this calls for a hybrid > solution: an always up to date Zenodo entry combined with individual papers > written by developers who feel they need such publication and are willing > to put the extra effort of writing the manuscript. I would stay away from > making a "deal" or explicitly recommending one particular commercial > publisher. There are many outlets which publish software or software > extensions (for example F1000Research has "Software Tool" category that > often publishes small contributions: see > http://f1000research.com/articles/2-192/v2 for example). I would leave it > open to the developers to choose the venue where they want to publish using > their best judgement. Why is it important? Most publishers are commercial > entities and strongly recommending one over another benefits them > financially. I think that as an open community, we should stay away from > such decisions to avoid being accused of some shady internal deals. Let > developers (and potential authors of such papers) decide by themselves. > Agreed! I don't think any one here can effectively endorse, let alone enforce, choice of any journal, and authors should definitely choose according to their needs and interests. 
Here's a more complete list to choose from, compiled by Neil Chue Hong, who happens to be the editor-in-chief of JORS, but is also the founding director of the UK-based Software Sustainability Institute: http://www.software.ac.uk/resources/guides/which-journals-should-i-publish-my-software We should publish in all of these! :-) That said, I don't mind discussing the merits and drawbacks of different journals. For example, there have been recent discussions about this topic among the vision science community (on email lists that do not have public archives, and contact me off-list if you have). One of the things that emerged is that publishers respond when researchers can agree on what they want. Ultimately, if we need to, and no publisher from that list responds to our specific needs as a community, why not start our own journal? But you're right that this is a discussion that the authors of specific contributions (the authors of the code in the PR, and the code reviewers that reviewed the PR) should have for each contribution. We would want to make sure before we submit that the journal is willing to review and publish a paper that covers part of an existing software package (JORS would -- I checked). Cheers, Ariel > I hope this helps! > > Best, > Chris > > > > On Tue, Apr 12, 2016 at 8:26 AM, Ariel Rokem wrote: >> Hi everyone, >> >> In a conversation I had with Rafael recently, he mentioned to me the >> Journal of Open Research Software ( >> http://openresearchsoftware.metajnl.com/), which publishes articles about >> open-source research software, and proposed this as a good place to publish >> software contributions in our community. This is a good thing because it >> provides a venue for articles specifically focused on software >> implementations, even in cases where the methods have previously been >> published as scientific articles.
This provides a standard reference for >> the software, and an opportunity for researchers who spend time writing >> open-source software to get credit for the work they are doing. >> >> I propose to submit articles to JORS, based on newer additions to >> libraries (particularly Dipy, but maybe others as well?): a PR or series >> of PRs that contribute substantial new features, or a substantial upgrade >> to previous features. This addresses two major challenges: >> >> The first is the challenge we face in incentivizing new contributors to >> join us. This is because if a standard reference article has already been >> published for the software, their newer contributions might not get them >> credit when this standard reference is cited. For example, Dipy >> contributors who joined the project after 2014 get no credit when that >> paper is cited. >> >> Two recent examples from Dipy are the work that Stephan Meesters has done >> on contextual enhancement and fiber-to-bundle coherence measures (still in >> progress in #828), and the work Rutger Fick is doing implementing Laplacian >> regularization for MAP (#740). These are both implementations of previously >> published scientific work (in these cases, work that these contributors >> have been involved in). As you can all appreciate, the effort of >> implementing these methods in Dipy is substantial, and we want to >> incentivize these efforts and reward them. A journal article that other >> researchers can cite is common currency for that. >> >> Another challenge we face is incentivizing code review. This is a serious >> bottleneck for progress. I propose to add code reviewers as authors to >> these papers. This will incentivize the substantial effort that goes into >> reviewing code. JORS allows author contributions to be specified, and we >> would clearly designate these kinds of contributions, so as not to diminish >> the effort made by the primary author of the code.
But I would like to >> include the people doing code review (if only because I have spent a lot of my >> own time in code review...). I hope that this will allow people to justify >> spending time doing this crucial part of the work, and energize our code >> review process a bit. >> >> I'd be happy to hear what people think about this idea. >> >> Cheers, >> >> Ariel > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging From arokem at gmail.com Tue Apr 12 20:30:59 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 12 Apr 2016 17:30:59 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: Just noticed an omission: On Tue, Apr 12, 2016 at 2:49 PM, Ariel Rokem wrote: > Hi Chris, > > Very helpful points. > > Full disclosure up-front: I have now also volunteered to serve as an > associate editor on JORS. Mostly because I realize that in advocating for > this, I am pushing the burden further down the stack to the people who will > now need to edit/review our papers... This is not a paid position, so I > really don't feel like I have a very strong conflict of interest here, but > it's worth mentioning. They also might not take me, making this even less > relevant. > > [...]
> Ultimately, if we need to, and no publisher from that list responds > to our specific needs as a community, why not start our own journal? > Should have said => "contact me off-list if you want to learn more about these conversations" (about the discussions on the vision science lists). > [...]
From arokem at gmail.com Tue Apr 12 14:24:51 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 12 Apr 2016 11:24:51 -0700 Subject: [Neuroimaging] Reminder: applications for Neurohackweek (Seattle, September 5th-9th) close on April 18th Message-ID: This is a reminder that you can still apply to participate in the first installment of Neurohackweek, a 5-day hands-on workshop, unconference, and hackathon held at the University of Washington eScience Institute in Seattle (September 5th-9th, 2016). Neurohackweek will focus on technologies used to analyze human neuroimaging data, on methods used to extract information from large datasets of publicly available data (such as the NKI Enhanced, Human Connectome Project, OpenfMRI, etc.), and on tools for making neuroimaging research open and reproducible. Morning sessions will be devoted to lectures and hands-on guided tutorials, and afternoon sessions will be devoted to participant-directed (unconference) activities: work on team projects, hackathon sessions, and breakout sessions on topics of interest. To apply: http://neurohackweek.github.io/ Accepted applicants will be asked to pay a fee of $200 upon final registration (June 1st). This fee will cover participation in the course, accommodation in the UW dorms, and two meals a day (breakfast and lunch) for the duration of the course. A limited number of fee waivers and travel grants will be available.
We encourage students with financial need and students from groups that are underrepresented in neuroimaging and data science to apply for these grants (email a statement of interest to: arokem at uw.edu). Thanks to funding from the OHBM open science special interest group, participants will be able to submit peer-reviewed progress reports on their projects, to be published as part of the Brainhack Proceedings series at GigaScience: https://gigascience.biomedcentral.com/new-content-item Important dates: April 18th: Deadline for applications to participate May 6th: Notification of acceptance June 1st: Final registration deadline From bnucon at gmail.com Wed Apr 13 08:46:23 2016 From: bnucon at gmail.com (Xiangzhen Kong) Date: Wed, 13 Apr 2016 20:46:23 +0800 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: Hi, Chris. The Zenodo solution is so cool. I have not received the email about this event. Please help me check it. My GitHub ID is Conxz. Thank you! Best, Xiangzhen On Wed, Apr 13, 2016 at 4:47 AM, Chris Filo Gorgolewski < krzysztof.gorgolewski at gmail.com> wrote: > [...]
-- ----------------------------------------------------------------------------------------- Kong Xiangzhen State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China, 100875. From etienne.roesch at gmail.com Wed Apr 13 10:02:15 2016 From: etienne.roesch at gmail.com (Etienne B.
Roesch) Date: Wed, 13 Apr 2016 14:02:15 +0000 Subject: [Neuroimaging] Summer School "Advanced Scientific Programming in Python" in Reading, UK, September 5--11, 2016 Message-ID: *Advanced Scientific Programming in Python* a Summer School by the G-Node, and the Centre for Integrative Neuroscience and Neurodynamics, School of Psychology and Clinical Language Sciences, University of Reading, UK Scientists spend more and more time writing, maintaining, and debugging software. While techniques for doing this efficiently have evolved, only a few scientists have been trained to use them. As a result, instead of doing their research, they spend far too much time writing deficient code and reinventing the wheel. In this course we will present a selection of advanced programming techniques and best practices which are standard in the industry, but especially tailored to the needs of a programming scientist. Lectures are devised to be interactive and to give the students enough time to acquire direct hands-on experience with the materials. Students will work in pairs throughout the school and will team up to practice the newly learned skills in a real programming project - an entertaining computer game. We use the Python programming language for the entire course. Python works as a simple programming language for beginners, but more importantly, it also works great in scientific simulations and data analysis. We show how clean language design, ease of extensibility, and the great wealth of open source libraries for scientific computing and data visualization are driving Python to become a standard tool for the programming scientist. This school is targeted at Master's or PhD students and post-docs from all areas of science. Competence in Python or in another language such as Java, C/C++, MATLAB, or Mathematica is absolutely required. Basic knowledge of Python and of a version control system such as git, subversion, mercurial, or bazaar is assumed.
Participants without any prior experience with Python and/or git should work through the proposed introductory material before the course. We are striving hard to get a pool of students which is international and gender-balanced. You can apply online: https://python.g-node.org *Application deadline: 23:59 UTC, May 15, 2016. Be sure to read the FAQ before applying. * Participation is for free, i.e. no fee is charged! Participants however should take care of travel, living, and accommodation expenses by themselves. Travel grants may be available. *Date & Location * September 5-11, 2016. Reading, UK *Program * - Best Programming Practices • Best practices for scientific programming • Version control with git and how to contribute to open source projects with GitHub • Best practices in data visualization - Software Carpentry • Test-driven development • Debugging with a debugger • Profiling code - Scientific Tools for Python • Advanced NumPy - Advanced Python • Decorators • Context managers • Generators - The Quest for Speed • Writing parallel applications • Interfacing to C with Cython • Memory-bound problems and memory profiling • Data containers: storage and fast access to large data - Practical Software Development • Group project *Preliminary Faculty * • Francesc Alted, freelance consultant, author of PyTables, Spain • Pietro Berkes, Enthought Inc., Cambridge, UK • Zbigniew Jędrzejewski-Szmek, Krasnow Institute, George Mason University, Fairfax, VA, USA • Eilif Muller, Blue Brain Project, École Polytechnique Fédérale de Lausanne, Switzerland • Juan Nunez-Iglesias, Victorian Life Sciences Computation Initiative, University of Melbourne, Australia • Rike-Benjamin Schuppner, Institute for Theoretical Biology, Humboldt-Universität zu Berlin, Germany • Bartosz Teleńczuk, European Institute for Theoretical Neuroscience, CNRS, Paris, France • Stéfan van der Walt, Berkeley Institute for Data Science, UC Berkeley, CA, USA •
Nelle Varoquaux, Centre for Computational Biology Mines ParisTech, Institut Curie, U900 INSERM, Paris, France • Tiziano Zito, freelance consultant, Germany *Organizers * For the German Neuroinformatics Node of the INCF (G-Node), Germany: • Tiziano Zito, freelance consultant, Germany • Zbigniew Jędrzejewski-Szmek, Krasnow Institute, George Mason University, Fairfax, USA • Jakob Jordan, Institute of Neuroscience and Medicine (INM-6), Forschungszentrum Jülich GmbH, Germany For the Centre for Integrative Neuroscience and Neurodynamics, School of Psychology and Clinical Language Sciences, University of Reading, UK: • Etienne Roesch, Centre for Integrative Neuroscience and Neurodynamics, University of Reading, UK *Website*: https://python.g-node.org *Contact*: python-info at g-node.org Kind regards, Etienne ----- Dr. Etienne B. Roesch Lecturer in Cognitive Science University of Reading From soft.join at gmail.com Thu Apr 14 08:17:12 2016 From: soft.join at gmail.com (soft.join Huang) Date: Thu, 14 Apr 2016 20:17:12 +0800 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: Hi, Chris, Xiangzhen shared the mail, and I like the Zenodo solution very much. I don't know what information I should provide (my GitHub ID is sealhuang). Best, Lijie Huang On Wed, Apr 13, 2016 at 8:46 PM, Xiangzhen Kong wrote: > Hi, Chris. > The Zenodo solution is so cool. > I have not received the email about this event. Please help me to > check it. My GitHub ID is Conxz. Thank you! > > Best, > Xiangzhen > > On Wed, Apr 13, 2016 at 4:47 AM, Chris Filo Gorgolewski < > krzysztof.gorgolewski at gmail.com> wrote: >> [...]
The project is as strong as its community so we wanted to give >> people credit for their hard work. After some deliberations we have decided >> to go with a different approach then you are proposing. Beginning from the >> next release we will switch the main recommended citation from the >> Frontiers paper to a Zenodo handle. In the past >> weeks we have been reaching out to Nipype contributors to get their >> permission and details to become coauthors of this Zenodo entry (please get >> in touch if you have contributed to Nipype and did not receive an email >> about this). >> >> Zenodo is a non profit organisation that provides citable DOIs for >> software (free of charge). It is indexed by Google scholar and is easy to >> set up and update. Thanks to this feature, with each release we will keep >> adding new contributors as coauthors. This solution has the following >> advantages: >> - Everyone contributing to nipype gets credit in the form of citations >> - There is one publication which makes citing easy (compare to Freesurfer >> https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation, >> also consider that some journals limit the number of references) >> - There is no overhead of writing, submitting, and revising a manuscript >> on top of developing and revising code >> >> There is, however, one big drawback - there is only one first and one >> senior author on the Zenodo handle. I think this calls for a hybrid >> solution: an always up to date Zenodo entry combined with individual papers >> written by developers who feel they need such publication and are willing >> to put the extra effort of writing the manuscript. I would stay away from >> making a "deal" or explicitly recommending one particular commercial >> publisher. 
There are many outlets which publish software or software >> extensions (for example F1000Research has "Software Tool" category that >> often publishes small contributions: see >> http://f1000research.com/articles/2-192/v2 for example). I would leave >> it open to the developers to choose the venue where they want to publish >> using their best judgement. Why is it important? Most publishers are >> commercial entities and strongly recommending one over another benefits >> them financially. I think that as an open community, we should stay away >> from such decisions to avoid being accused of some shady internal deals. >> Let developers (and potential authors of such papers) decide by themselves. >> >> I hope this helps! >> >> Best, >> Chris >> >> >> >> On Tue, Apr 12, 2016 at 8:26 AM, Ariel Rokem wrote: >> >>> Hi everyone, >>> >>> In a conversation I had with Rafael recently, he mentioned to me the >>> Journal of Open Research Software ( >>> http://openresearchsoftware.metajnl.com/) that publishes articles about >>> open-source research software, and proposed this as a good place to publish >>> software contributions in our community. This is a good thing because it >>> provides a venue for articles specifically focused on software >>> implementations, even in cases where the methods have previously been >>> published as scientific articles. This provides a standard reference for >>> the software, and an opportunity for researchers who spend time writing >>> open-source software to get credit for the work they are doing. >>> >>> I propose to submit articles to JORS, based on newer additions to >>> libraries (particularly Dipy, but maybe others as well?), a PR, or series >>> of PRs that contribute substantial new features, or a substantial upgrade >>> to previous features. This addresses two major challenges: >>> >>> The first is the challenge we face in incentivizing new contributors to >>> join us. 
This is because if a standard reference article has already been >>> published for the software, their newer contributions might not get them >>> credit when this standard reference is cited. For example, Dipy >>> contributors who joined the project after 2014 get no credit when that >>> paper is cited. >>> >>> Two recent examples from Dipy are the work that Stephan Meesters has >>> done on contextual enhancement and fiber-to-bundle coherence measures >>> (still in progress in #828), and the work Rutger Fick is doing implementing >>> Laplacian regularization for MAP (#740). These are both implementations of >>> previously published scientific work (in these cases, work that these >>> contributors have been involved in). As you can all appreciate, the effort >>> of implementing these methods in Dipy is substantial, and we want to >>> incentivize these efforts and reward them. A journal article that other >>> researchers can cite is common currency for that. >>> >>> Another challenge we face is incentivizing code review. This is a >>> serious bottleneck for progress. I propose to add code reviewers as >>> authors to these papers. This will incentivize the substantial effort that >>> goes into reviewing code. JORS allows author contributions to be specified >>> and we would clearly designate these kinds of contributions, so as not to >>> diminish the effort made by the primary author of the code. But I >>> would like to include the people doing code review (if only because I have >>> spent a lot of my own time in code review...). I hope that this will allow >>> people to justify spending time doing this crucial part of the work, and >>> energize our code review process a bit. >>> >>> I'd be happy to hear what people think about this idea.
>>> >>> Cheers, >>> >>> Ariel >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > > -- > > ----------------------------------------------------------------------------------------- > Kong Xiangzhen > State Key Laboratory of Cognitive Neuroscience and Learning, > Beijing Normal University, > Beijing, China, 100875. > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kw401 at cam.ac.uk Thu Apr 14 17:57:07 2016 From: kw401 at cam.ac.uk (Kirstie Whitaker) Date: Thu, 14 Apr 2016 22:57:07 +0100 Subject: [Neuroimaging] [PySurfer][Nibabel] Write annot from roi display Message-ID: Hi Pysurfer and Nibabel teams, I have some code that is heavily stolen from the Display ROI values pysurfer example. I'd like to be able to overlay the same ROI information in BrainNet (as they allow plotting a network on top of a freesurfer image and because my colleague already uses it!) and so I tried to use the write_annot command in nibabel. The command was (at the end of the example linked above) nib.freesurfer.io.write_annot('output.annot', vtx_data, ctab, names) Unfortunately this file does not load up in freeview. I have a feeling that's because I'm passing the wrong information to the write_annot command. Any suggestions on what to do? Thank you! 
Kx -- Kirstie Whitaker, PhD Research Associate Department of Psychiatry University of Cambridge *Mailing Address* Brain Mapping Unit Department of Psychiatry Sir William Hardy Building Downing Street Cambridge CB2 3EB *Phone: *+44 7583 535 307 *Website:* www.kirstiewhitaker.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From mwaskom at stanford.edu Thu Apr 14 18:19:48 2016 From: mwaskom at stanford.edu (Michael Waskom) Date: Thu, 14 Apr 2016 15:19:48 -0700 Subject: [Neuroimaging] [PySurfer][Nibabel] Write annot from roi display In-Reply-To: References: Message-ID: Hi Kirstie, What do the vtx_data, ctab, and names that you are passing look like? Best, Michael On Thu, Apr 14, 2016 at 2:57 PM, Kirstie Whitaker wrote: > Hi Pysurfer and Nibabel teams, > > I have some code that is heavily stolen from the Display ROI values > pysurfer > example. > > I'd like to be able to overlay the same ROI information in BrainNet > (as they allow plotting a network > on top of a freesurfer image and because my colleague already uses it!) and > so I tried to use the write_annot > > command in nibabel. > > The command was (at the end of the example linked above) > > nib.freesurfer.io.write_annot('output.annot', vtx_data, ctab, names) > > Unfortunately this file does not load up in freeview. I have a feeling > that's because I'm passing the wrong information to the write_annot command. > > Any suggestions on what to do? > > Thank you! 
> Kx > > -- > Kirstie Whitaker, PhD > Research Associate > > Department of Psychiatry > University of Cambridge > > *Mailing Address* > Brain Mapping Unit > Department of Psychiatry > Sir William Hardy Building > Downing Street > Cambridge CB2 3EB > > *Phone: *+44 7583 535 307 > *Website:* www.kirstiewhitaker.com > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kw401 at cam.ac.uk Thu Apr 14 18:27:02 2016 From: kw401 at cam.ac.uk (Kirstie Whitaker) Date: Thu, 14 Apr 2016 23:27:02 +0100 Subject: [Neuroimaging] [PySurfer][Nibabel] Write annot from roi display In-Reply-To: References: Message-ID: <9D75177F-D69A-477B-8B3B-7C413993B3B3@cam.ac.uk> Hi Michael, Sorry for not being clear. They're exactly as defined in the Display ROIs example that I linked to. There aren't any error messages when writing out the file, I just can't load it up in freeview. I'll send a screenshot from freeview tomorrow morning. (I'm on my iPhone now.) Is there anything else that would be good to know? Kx Sent from my iPhone, please excuse any typos or excessive brevity > On 14 Apr 2016, at 23:19, Michael Waskom wrote: > > Hi Kirstie, > > What do the vtx_data, ctab, and names that you are passing look like? > > Best, > Michael > >> On Thu, Apr 14, 2016 at 2:57 PM, Kirstie Whitaker wrote: >> Hi Pysurfer and Nibabel teams, >> >> I have some code that is heavily stolen from the Display ROI values pysurfer example. >> >> I'd like to be able to overlay the same ROI information in BrainNet (as they allow plotting a network on top of a freesurfer image and because my colleague already uses it!) and so I tried to use the write_annot command in nibabel. 
>> >> The command was (at the end of the example linked above) >> >> nib.freesurfer.io.write_annot('output.annot', vtx_data, ctab, names) >> >> Unfortunately this file does not load up in freeview. I have a feeling that's because I'm passing the wrong information to the write_annot command. >> >> Any suggestions on what to do? >> >> Thank you! >> Kx >> >> -- >> Kirstie Whitaker, PhD >> Research Associate >> >> Department of Psychiatry >> University of Cambridge >> >> Mailing Address >> Brain Mapping Unit >> Department of Psychiatry >> Sir William Hardy Building >> Downing Street >> Cambridge CB2 3EB >> >> Phone: +44 7583 535 307 >> Website: www.kirstiewhitaker.com >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From mwaskom at stanford.edu Thu Apr 14 19:19:26 2016 From: mwaskom at stanford.edu (Michael Waskom) Date: Thu, 14 Apr 2016 16:19:26 -0700 Subject: [Neuroimaging] [PySurfer][Nibabel] Write annot from roi display In-Reply-To: <9D75177F-D69A-477B-8B3B-7C413993B3B3@cam.ac.uk> References: <9D75177F-D69A-477B-8B3B-7C413993B3B3@cam.ac.uk> Message-ID: One thought is that the annot labels are likely expected to be integers, whereas if you're adapting the script to show statistical values in each parcel, they're probably floats. That could be causing issues. On Thu, Apr 14, 2016 at 3:27 PM, Kirstie Whitaker wrote: > Hi Michael, > > Sorry for not being clear. > > They're exactly as defined in the Display ROIs example that I linked to. > > There aren't any error messages when writing out the file, I just can't > load it up in freeview. I'll send a screenshot from freeview tomorrow > morning. 
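A minimal sketch of that integer-conversion idea (purely illustrative: the variable names, the 5-bin scheme, and the color ramp are assumptions, not part of the original pysurfer example, which is assumed here to leave per-vertex float statistics in vtx_data):

```python
import numpy as np

# write_annot expects one integer label code per vertex, indexing into the
# color table, so bin the per-vertex floats into discrete labels first.
rng = np.random.RandomState(0)
vtx_data = rng.rand(1000)                    # stand-in for per-vertex stats
n_bins = 5
edges = np.linspace(0, 1, n_bins + 1)[1:-1]  # interior bin edges
labels = np.digitize(vtx_data, edges).astype(np.int32)

# Color table: one row per label holding R, G, B, transparency, and the
# packed annotation value (R + G*2**8 + B*2**16), which must be unique.
ctab = np.zeros((n_bins, 5), dtype=np.int32)
ctab[:, 0] = np.linspace(0, 255, n_bins).astype(np.int32)  # red ramp
ctab[:, 4] = ctab[:, 0] + ctab[:, 1] * 2**8 + ctab[:, 2] * 2**16
names = [('bin-%d' % i).encode() for i in range(n_bins)]

# import nibabel as nib
# nib.freesurfer.io.write_annot('output.annot', labels, ctab, names)
```

If the floats in vtx_data are already discrete ROI indices, a plain `vtx_data.astype(np.int32)` may be all that is needed.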
(I'm on my iPhone now.) Is there anything else that would be good > to know? > > Kx > > Sent from my iPhone, please excuse any typos or excessive brevity > > On 14 Apr 2016, at 23:19, Michael Waskom wrote: > > Hi Kirstie, > > What do the vtx_data, ctab, and names that you are passing look like? > > Best, > Michael > > On Thu, Apr 14, 2016 at 2:57 PM, Kirstie Whitaker wrote: > >> Hi Pysurfer and Nibabel teams, >> >> I have some code that is heavily stolen from the Display ROI values >> pysurfer >> example. >> >> I'd like to be able to overlay the same ROI information in BrainNet >> (as they allow plotting a network >> on top of a freesurfer image and because my colleague already uses it!) and >> so I tried to use the write_annot >> >> command in nibabel. >> >> The command was (at the end of the example linked above) >> >> nib.freesurfer.io.write_annot('output.annot', vtx_data, ctab, names) >> >> Unfortunately this file does not load up in freeview. I have a feeling >> that's because I'm passing the wrong information to the write_annot command. >> >> Any suggestions on what to do? >> >> Thank you! 
>> Kx >> >> -- >> Kirstie Whitaker, PhD >> Research Associate >> >> Department of Psychiatry >> University of Cambridge >> >> *Mailing Address* >> Brain Mapping Unit >> Department of Psychiatry >> Sir William Hardy Building >> Downing Street >> Cambridge CB2 3EB >> >> *Phone: *+44 7583 535 307 >> *Website:* www.kirstiewhitaker.com >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vsochat at stanford.edu Fri Apr 15 12:07:01 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Fri, 15 Apr 2016 09:07:01 -0700 Subject: [Neuroimaging] Fwd: @vsoch has been invited to Tweet from @nipydev. In-Reply-To: <14.07.25799.0AA01175@twitter.com> References: <14.07.25799.0AA01175@twitter.com> Message-ID: Any reason to make a second twitter handle? I made nipyorg last year (linked to nipy.org, respectively). Did you not like the name? ---------- Forwarded message ---------- From: Twitter Date: Fri, Apr 15, 2016 at 8:37 AM Subject: @vsoch has been invited to Tweet from @nipydev. To: Vanessa Hi, Vanessa. You have been given permission to Tweet from @nipydev . <#m_8251679740162604275_> <#m_8251679740162604275_> *nipy community* @nipydev This feature makes TweetDeck more secure for teams by avoiding the need to share passwords. Learn more about teams . Start by visiting tweetdeck.twitter.com . Log in to TweetDeck Help | Update who can invite me to their team Twitter, Inc. 
1355 Market St., Suite 900 San Francisco, CA 94103 <#m_8251679740162604275_> -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Fri Apr 15 12:42:33 2016 From: arokem at gmail.com (Ariel Rokem) Date: Fri, 15 Apr 2016 09:42:33 -0700 Subject: [Neuroimaging] Fwd: @vsoch has been invited to Tweet from @nipydev. In-Reply-To: References: <14.07.25799.0AA01175@twitter.com> Message-ID: Hi Vanessa, That was my (somewhat awkward) doing. I actually created the nipydev twitter account a while back (Jan 2015?), but did fairly poorly at actually using it. I am a bit of a luddite, and just discovered this morning how I might give others permission to tweet from that account, and added you. I'm happy to stick to nipyorg, though. It seems to be doing a much better job :-) Cheers, Ariel On Fri, Apr 15, 2016 at 9:07 AM, vanessa sochat wrote: > Any reason to make a second twitter handle? I made nipyorg > last year (linked to nipy.org, > respectively). Did you not like the name? > > > ---------- Forwarded message ---------- > From: Twitter > Date: Fri, Apr 15, 2016 at 8:37 AM > Subject: @vsoch has been invited to Tweet from @nipydev. > To: Vanessa > > > > > Hi, > Vanessa. > > You have been given permission to Tweet from @nipydev > . > > <#m_9024479373722167915_m_8251679740162604275_> > > <#m_9024479373722167915_m_8251679740162604275_> > > > *nipy community* > > @nipydev > > This feature makes TweetDeck > > more secure for teams by avoiding the need to share passwords. Learn more > about teams > . > > Start by visiting tweetdeck.twitter.com > . > > Log in to TweetDeck > > > Help > > | Update who can invite me to their team > > Twitter, Inc. 
1355 Market St., Suite 900 San Francisco, CA 94103 > > > -- > Vanessa Villamia Sochat > Stanford University > (603) 321-0676 > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vsochat at stanford.edu Fri Apr 15 13:37:58 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Fri, 15 Apr 2016 10:37:58 -0700 Subject: [Neuroimaging] Fwd: @vsoch has been invited to Tweet from @nipydev. In-Reply-To: References: <14.07.25799.0AA01175@twitter.com> Message-ID: Hi Ariel, I'm indifferent between the two, but I do think we should be consistent, and perhaps just have one. Neither is well established, so I think we have two options. 1) I can share the credentials for nipyorg, and you can do the equivalent "tweet permissions" for it, or 2) we can decide on one of the accounts, and adjust the branding / logo for the other. Since nipyorg is already hooked up to nipy.org and has a few more followers and tweets, I am thinking the first is best. Let me know what you think. Best, Vanessa On Fri, Apr 15, 2016 at 9:42 AM, Ariel Rokem wrote: > Hi Vanessa, > > That was my (somewhat awkward) doing. I actually created the nipydev > twitter account a while back (Jan 2015?), but did fairly poorly at actually > using it. I am a bit of a luddite, and just discovered this morning how I > might give others permission to tweet from that account, and added you. I'm > happy to stick to nipyorg, though. It seems to be doing a much better job > :-) > > Cheers, > > Ariel > > On Fri, Apr 15, 2016 at 9:07 AM, vanessa sochat > wrote: > >> Any reason to make a second twitter handle? I made nipyorg >> last year (linked to nipy.org, >> respectively). Did you not like the name?
>> >> >> ---------- Forwarded message ---------- >> From: Twitter >> Date: Fri, Apr 15, 2016 at 8:37 AM >> Subject: @vsoch has been invited to Tweet from @nipydev. >> To: Vanessa >> >> >> >> >> Hi, >> Vanessa. >> >> You have been given permission to Tweet from @nipydev >> . >> >> <#m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >> >> <#m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >> >> >> *nipy community* >> >> @nipydev >> >> This feature makes TweetDeck >> >> more secure for teams by avoiding the need to share passwords. Learn >> more about teams >> . >> >> Start by visiting tweetdeck.twitter.com >> . >> >> Log in to TweetDeck >> >> >> Help >> >> | Update who can invite me to their team >> >> Twitter, Inc. 1355 Market St., Suite 900 San Francisco, CA 94103 >> <#m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >> >> >> >> -- >> Vanessa Villamia Sochat >> Stanford University >> (603) 321-0676 >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Fri Apr 15 13:47:49 2016 From: arokem at gmail.com (Ariel Rokem) Date: Fri, 15 Apr 2016 10:47:49 -0700 Subject: [Neuroimaging] Fwd: @vsoch has been invited to Tweet from @nipydev. In-Reply-To: References: <14.07.25799.0AA01175@twitter.com> Message-ID: On Fri, Apr 15, 2016 at 10:37 AM, vanessa sochat wrote: > Hi Ariel, > > I'm indifferent between the two, but I do think we should be consistent, > and perhaps just have one. Neither is well established, so I think we have > two options. 
1) I can share the credentials for nipyorg, and you can do the > equivalent "tweet permissions" for it, or 2) we can decide on one of the > accounts, and adjust the branding / logo for the other. Since nipyorg is > already hooked up to nipy.org and has a few more followers and tweets, I > am thinking the first is best. Let me know you what you think. > > +1. I'll give nipydev a proper twitter sea burial. > Best, > > Vanessa > > On Fri, Apr 15, 2016 at 9:42 AM, Ariel Rokem wrote: > >> Hi Vanessa, >> >> That was my (somewhat awkward) doing. I actually created the nipydev >> twitter account a while back (Jan 2015?), but did fairly poorly at actually >> using it. I am a bit of a luddite, and just discovered this morning how I >> might give others permission to tweet from that account, and added you. I'm >> happy to stick to nipyorg, though. It seems to be doing a much better job >> :-) >> >> Cheers, >> >> Ariel >> >> On Fri, Apr 15, 2016 at 9:07 AM, vanessa sochat >> wrote: >> >>> Any reason to make a second twitter handle? I made nipyorg >>> last year (linked to nipy.org, >>> respectively). Did you not like the name? >>> >>> >>> ---------- Forwarded message ---------- >>> From: Twitter >>> Date: Fri, Apr 15, 2016 at 8:37 AM >>> Subject: @vsoch has been invited to Tweet from @nipydev. >>> To: Vanessa >>> >>> >>> >>> >>> Hi, >>> Vanessa. >>> >>> You have been given permission to Tweet from @nipydev >>> . >>> >>> >>> <#m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>> >>> >>> <#m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>> >>> >>> *nipy community* >>> >>> @nipydev >>> >>> This feature makes TweetDeck >>> >>> more secure for teams by avoiding the need to share passwords. Learn >>> more about teams >>> . >>> >>> Start by visiting tweetdeck.twitter.com >>> . >>> >>> Log in to TweetDeck >>> >>> >>> Help >>> >>> | Update who can invite me to their team >>> >>> Twitter, Inc. 
1355 Market St., Suite 900 San Francisco, CA 94103 >>> <#m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>> >>> >>> >>> -- >>> Vanessa Villamia Sochat >>> Stanford University >>> (603) 321-0676 >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > > -- > Vanessa Villamia Sochat > Stanford University > (603) 321-0676 > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sun Apr 17 23:19:09 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sun, 17 Apr 2016 20:19:09 -0700 Subject: [Neuroimaging] Nitime 0.6 In-Reply-To: <20160208005204.GB7904@onerussian.com> References: <20160208005204.GB7904@onerussian.com> Message-ID: On Sun, Feb 7, 2016 at 4:52 PM, Yaroslav Halchenko wrote: > > On Sun, 07 Feb 2016, Ariel Rokem wrote: > > > Hi everyone, > > I am happy to announce the release of nitime version 0.6, now > available on > > PyPI. > Update: you can now install nitime using conda : conda config --add channels conda-forge conda install nitime Best, Ariel > > This is a maintenance release, supporting newer versions of numpy and > > matplotlib. > > Congrats. And now 0.6 was uploaded to (Neuro)Debian > > Cheers! > -- > Yaroslav O. 
Halchenko > Center for Open Neuroscience http://centerforopenneuroscience.org > Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 > Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 > WWW: http://www.linkedin.com/in/yarik > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Mon Apr 18 10:17:43 2016 From: arokem at gmail.com (Ariel Rokem) Date: Mon, 18 Apr 2016 07:17:43 -0700 Subject: [Neuroimaging] Travis turned off Message-ID: Hi everyone, Just a quick note to let you know that Travis seems to have reset all the github.com/nipy/* repos to "off" (they do that occasionally; at an increasing rate, it seems?). I just switched nibabel and dipy back on, and please flip the switch (or ask me to) on other repos, as needed. Cheers, Ariel -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Mon Apr 18 11:08:11 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Mon, 18 Apr 2016 11:08:11 -0400 Subject: [Neuroimaging] Travis turned off In-Reply-To: References: Message-ID: thanks ariel. cheers, satra On Mon, Apr 18, 2016 at 10:17 AM, Ariel Rokem wrote: > Hi everyone, > > Just a quick note to let you know that Travis seems to have reset all the > github.com/nipy/* repos to "off" (they do that occasionally; at an > increasing rate, it seems?). I just switched nibabel and dipy back on, and > please flip the switch (or ask me to) on other repos, as needed. > > Cheers, > > Ariel > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From lists at onerussian.com Mon Apr 18 10:36:00 2016 From: lists at onerussian.com (Yaroslav Halchenko) Date: Mon, 18 Apr 2016 10:36:00 -0400 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: <20160418143600.GC23764@onerussian.com> On Tue, 12 Apr 2016, Chris Filo Gorgolewski wrote: > - There is one publication which makes citing easy (compare to > Freesurfer https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation, > also consider that some journals limit the number of references) > - There is no overhead of writing, submitting, and revising a manuscript > on top of developing and revising code brief follow-up. IMHO Zenodo solution is indeed great and I hope to do the same later on for our projects. BUT I still think that ideally we should support "FreeSurfer"'s approach for citing relevant methods papers for specific algorithms/implementations. That is why I would strongly encourage you to look/join our slowly moving http://duecredit.org effort. As for the number of references, I hope that such archaic demands would be relaxed soon(ish) and there are ongoing efforts: https://twitter.com/figgyjam/status/721755759449493505 > There is, however, one big drawback - there is only one first and one > senior author on the Zenodo handle. I think this calls for a hybrid > solution: an always up to date Zenodo entry combined with individual > papers written by developers who feel they need such publication and are > willing to put the extra effort of writing the manuscript. I would stay > away from making a "deal" or explicitly recommending one particular > commercial publisher. BTW -- have you looked into some Zenodo API to be able to modify the record automagically? then may be for each release the order of authors could be generated automagically e.g. 
by sorting according to some metric since previous release (as bad as # of commits and/or lines of code touched), followed by the rest of the authors in the order as in the previous release. So the order would then be quite dynamic... may be the last X contributors (seniors) could be selected from those based on overall (not just between releases) value of the metric, also joggling/looping through the releases. not an ideal, but imho viable way... what do you think? -- Yaroslav O. Halchenko Center for Open Neuroscience http://centerforopenneuroscience.org Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 WWW: http://www.linkedin.com/in/yarik From vsochat at stanford.edu Fri Apr 15 13:49:58 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Fri, 15 Apr 2016 10:49:58 -0700 Subject: [Neuroimaging] Fwd: @vsoch has been invited to Tweet from @nipydev. In-Reply-To: References: <14.07.25799.0AA01175@twitter.com> Message-ID: The proper way to do that, these days :) [image: Inline image 1] On Fri, Apr 15, 2016 at 10:47 AM, Ariel Rokem wrote: > > > On Fri, Apr 15, 2016 at 10:37 AM, vanessa sochat > wrote: > >> Hi Ariel, >> >> I'm indifferent between the two, but I do think we should be consistent, >> and perhaps just have one. Neither is well established, so I think we have >> two options. 1) I can share the credentials for nipyorg, and you can do the >> equivalent "tweet permissions" for it, or 2) we can decide on one of the >> accounts, and adjust the branding / logo for the other. Since nipyorg is >> already hooked up to nipy.org and has a few more followers and tweets, I >> am thinking the first is best. Let me know you what you think. >> >> > +1. I'll give nipydev a proper twitter sea burial. > > > >> Best, >> >> Vanessa >> >> On Fri, Apr 15, 2016 at 9:42 AM, Ariel Rokem wrote: >> >>> Hi Vanessa, >>> >>> That was my (somewhat awkward) doing. 
I actually created the nipydev >>> twitter account a while back (Jan 2015?), but did fairly poorly at actually >>> using it. I am a bit of a luddite, and just discovered this morning how I >>> might give others permission to tweet from that account, and added you. I'm >>> happy to stick to nipyorg, though. It seems to be doing a much better job >>> :-) >>> >>> Cheers, >>> >>> Ariel >>> >>> On Fri, Apr 15, 2016 at 9:07 AM, vanessa sochat >>> wrote: >>> >>>> Any reason to make a second twitter handle? I made nipyorg >>>> last year (linked to nipy.org, >>>> respectively). Did you not like the name? >>>> >>>> >>>> ---------- Forwarded message ---------- >>>> From: Twitter >>>> Date: Fri, Apr 15, 2016 at 8:37 AM >>>> Subject: @vsoch has been invited to Tweet from @nipydev. >>>> To: Vanessa >>>> >>>> >>>> >>>> >>>> Hi, >>>> Vanessa. >>>> >>>> You have been given permission to Tweet from @nipydev >>>> . >>>> >>>> >>>> <#m_-6022243588946031282_m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>>> >>>> >>>> <#m_-6022243588946031282_m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>>> >>>> >>>> *nipy community* >>>> >>>> @nipydev >>>> >>>> This feature makes TweetDeck >>>> >>>> more secure for teams by avoiding the need to share passwords. Learn >>>> more about teams >>>> . >>>> >>>> Start by visiting tweetdeck.twitter.com >>>> . >>>> >>>> Log in to TweetDeck >>>> >>>> >>>> Help >>>> >>>> | Update who can invite me to their team >>>> >>>> Twitter, Inc. 
1355 Market St., Suite 900 San Francisco, CA 94103 >>>> <#m_-6022243588946031282_m_3071064693756941791_m_6266249404085409615_m_9024479373722167915_m_8251679740162604275_> >>>> >>>> >>>> >>>> -- >>>> Vanessa Villamia Sochat >>>> Stanford University >>>> (603) 321-0676 >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> >> -- >> Vanessa Villamia Sochat >> Stanford University >> (603) 321-0676 >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image.png Type: image/png Size: 83728 bytes Desc: not available URL: From krzysztof.gorgolewski at gmail.com Thu Apr 21 10:55:52 2016 From: krzysztof.gorgolewski at gmail.com (Chris Filo Gorgolewski) Date: Thu, 21 Apr 2016 07:55:52 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: <20160418143600.GC23764@onerussian.com> References: <20160418143600.GC23764@onerussian.com> Message-ID: On Mon, Apr 18, 2016 at 7:36 AM, Yaroslav Halchenko wrote: > > On Tue, 12 Apr 2016, Chris Filo Gorgolewski wrote: > > - There is one publication which makes citing easy (compare to > > Freesurfer > https://surfer.nmr.mgh.harvard.edu/fswiki/FreeSurferMethodsCitation, > > also consider that some journals limit the number of references) > > - There is no overhead of writing, submitting, and revising a > manuscript > > on top of developing and revising code > > brief follow-up. IMHO Zenodo solution is indeed great and I hope > to do the same later on for our projects. BUT I still think that > ideally we should support "FreeSurfer"'s approach for citing relevant > methods papers for specific algorithms/implementations. That is why I > would strongly encourage you to look/join our slowly moving > http://duecredit.org effort. > > As for the number of references, I hope that such archaic demands would > be relaxed soon(ish) and there are ongoing efforts: > https://twitter.com/figgyjam/status/721755759449493505 Totally agree! > > > > There is, however, one big drawback - there is only one first and one > > senior author on the Zenodo handle. I think this calls for a hybrid > > solution: an always up to date Zenodo entry combined with individual > > papers written by developers who feel they need such publication and > are > > willing to put the extra effort of writing the manuscript. I would > stay > > away from making a "deal" or explicitly recommending one particular > > commercial publisher. 
> > BTW -- have you looked into some Zenodo API to be able to modify the > record automagically? then may be for each release the order of authors > could be generated automagically e.g. by sorting according to some > metric since previous release (as bad as # of commits and/or lines > of code touched), followed by the rest of the authors in the order as in > the previous release. So the order would then be quite dynamic... may > be the last X contributors (seniors) could be selected from those based > on overall (not just between releases) value of the metric, also > joggling/looping through the releases. > > not an ideal, but imho viable way... what do you think? > Yup - this is exactly what we are planning to do. It did require some manual curation (we have over 100 contributors) to figure out real names, affiliations and ORCID for everyone. Best, Chris > > -- > Yaroslav O. Halchenko > Center for Open Neuroscience http://centerforopenneuroscience.org > Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 > Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 > WWW: http://www.linkedin.com/in/yarik > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at onerussian.com Thu Apr 21 11:21:59 2016 From: lists at onerussian.com (Yaroslav Halchenko) Date: Thu, 21 Apr 2016 11:21:59 -0400 Subject: [Neuroimaging] scivatar? Re: Journal articles based on PRs In-Reply-To: References: <20160418143600.GC23764@onerussian.com> Message-ID: <20160421152159.GF23764@onerussian.com> On Thu, 21 Apr 2016, Chris Filo Gorgolewski wrote: > BTW -- have you looked into some Zenodo API to be able to modify the > record automagically? then may be for each release the order of > authors > could be generated automagically e.g.
by sorting according to some > metric since previous release (as bad as # of commits and/or lines > of code touched), followed by the rest of the authors in the order as in > the previous release. So the order would then be quite dynamic... > may > be the last X contributors (seniors) could be selected from those based > on overall (not just between releases) value of the metric, also > joggling/looping through the releases. > not an ideal, but imho viable way... what do you think? > Yup - this is exactly what we are planning to do. awesome! > It did require some manual > curation (we have over 100 contributors) to figure out real names, > affiliations and ORCID for everyone. I guess it might/should be somehow "integrated" with .mailmap you already carry. I guess, ideally, there should be some helper project/script (may be you have started already smth, don't see mentioning of orcid in nipype git repo) which would take some definition like [ { 'name': 'Full Name', 'email': 'email at example.com', 'git_ids': [ 'bla ', 'asdf' ], 'orcid': 'XXXX', 'senior': True, 'homepage': 'http:///', 'twitter': '...', 'gravatar': 'email', ... }, ... ] NB 1. senior -- probably to still maintain some dedication for senior authors to rotate at the end ;) 2. social media pointers etc might come handy if such a list later used to generate a sphinx page for the project's webpage. and then it to be used for adjusting/generating zenodo entry and updating/generating .mailmap file for the repository. FURTHER IDEA: Moreover, may be it is worth maintaining such a DB shared among multiple projects! ;) and/or providing a service later which given a list of git_ids would provide relevant selection and/or end file (e.g. .mailmap, sphinx page, ...). Should we initiate such a beast if you haven't done so yet? ;) Prospective names: - scidevs -- problematic since taken already https://github.com/scidevs - scivatar - scimune (scientific commune) ?
(meanwhile reserved gh organizations for the last two ;) ) -- Yaroslav O. Halchenko Center for Open Neuroscience http://centerforopenneuroscience.org Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 WWW: http://www.linkedin.com/in/yarik From krzysztof.gorgolewski at gmail.com Thu Apr 21 12:36:18 2016 From: krzysztof.gorgolewski at gmail.com (Chris Filo Gorgolewski) Date: Thu, 21 Apr 2016 09:36:18 -0700 Subject: [Neuroimaging] scivatar? Re: Journal articles based on PRs In-Reply-To: <20160421152159.GF23764@onerussian.com> References: <20160418143600.GC23764@onerussian.com> <20160421152159.GF23764@onerussian.com> Message-ID: This seems a little bit of an overkill to me... We just used Google forms and sheets to collect and organize authors data. I'll transfer the mappings to mailmap at some point. Best, Chris On Apr 21, 2016 8:28 AM, "Yaroslav Halchenko" wrote: > > On Thu, 21 Apr 2016, Chris Filo Gorgolewski wrote: > > > BTW -- have you looked into some Zenodo API to be able to modify the > > record automagically?A then may be for each release the order of > > authors > > could be generated automagically e.g. by sorting according to some > > metric since previous release (as bad as # of commits and/or lines > > of code touched), followed by the rest of the authors in the order > as in > > the previous release.A A So the order would then be quite > dynamic... > > may > > be the last X contributors (seniors) could be selected from those > based > > on overall (not just between releases) value of the metric, also > > joggling/looping through the releases. > > > not an ideal, but imho viable way... what do you think? > > > Yup - this exactly what we are planning to do. > > awesome! > > > IT did require some manual > > curation (we have over 100 contributors) to figure out real names, > > affiliations and ORCID for everyone. 
> > I guess it might/should be somehow "integrated" with .mailmap you > already carry. I guess, ideally, there should be some helper > project/script (may be you have started already smth, don't see > mentioning of orcid in nipype git repo) which would take some > definition like > > [ > { > 'name': 'Full Name', > 'email': 'email at example.com', > 'git_ids': [ > 'bla ', > 'asdf' > ], > 'orcid': 'XXXX', > 'senior': True, > 'homepage': 'http:///', > 'twitter': '...', > 'gravatar': 'email', > ... > }, > ... > ] > > NB > 1. senior -- probably to still maintain some dedication for senior > authors to rotate at the end ;) > 2. social media pointers etc might come handy if such a list later used > to generate a sphinx page for the project's webpage. > > and then it to be used for adjusting/generating zenodo entry and > updating/generataing .mailmap file for the repository. > > FURTHER IDEA: Moreover, may be it is worth maintaining such a DB > shared among multiple projects! ;) and/or providing a service later > which given a list of git_ids would provide relevant selection and/or > end file (e.g. .mailmap, sphinx page, ...). > > Should we initiate such a beast if you haven't done so yet? ;) > > Perspective names: > > - scidevs -- problematic since taken already https://github.com/scidevs > - scivatar > - scimune (scientific commune) > > ? (meanwhile reserved gh organizations for the last two ;) ) > > -- > Yaroslav O. Halchenko > Center for Open Neuroscience http://centerforopenneuroscience.org > Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 > Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 > WWW: http://www.linkedin.com/in/yarik > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... 
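As an aside on the ordering scheme discussed in this thread: the "commits since the previous release" metric can be pulled straight from git, which also respects `.mailmap` as mentioned above. A rough sketch, not code from nipype or duecredit; the tag name and repository path in the usage comment are hypothetical:

```python
import subprocess

def contributors_since(tag, repo="."):
    """Return (n_commits, author_name) pairs since `tag`, most active first."""
    out = subprocess.run(
        ["git", "shortlog", "-sn", "--no-merges", "%s..HEAD" % tag],
        cwd=repo, capture_output=True, text=True, check=True,
    ).stdout
    pairs = []
    for line in out.splitlines():
        # Each line looks like "    42\tJane Doe"
        count, _, name = line.strip().partition("\t")
        pairs.append((int(count), name))
    return pairs  # git shortlog already sorts by commit count

# e.g. contributors_since("0.11.0", "/path/to/repo")
```

Since `git shortlog -sn` folds identities through `.mailmap`, keeping that file curated would make the resulting author ordering consistent across releases.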
URL: From RushtonSK at cardiff.ac.uk Thu Apr 21 12:36:14 2016 From: RushtonSK at cardiff.ac.uk (Simon Rushton) Date: Thu, 21 Apr 2016 16:36:14 +0000 Subject: [Neuroimaging] Example code for NiBabel Message-ID: Hi All I'm new to doing analysis of fmri data using Python and NiBabel. Could someone point me to some very simple code to get me started. I'm thinking of a program just a few lines long that loads a 4D NIfTI file, thresholds each of the individual 3D volumes, and then saves the result out as a new 4D NIfTI file. Some simple code like that would get me past all things like accessing data via memory pointers etc so I can start learning rather than just scratching my head! And yes, I know that most of this is in the NiBabel documentation but I'm obviously not putting the snippets together correctly as my code doesn't work... simon -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Thu Apr 21 14:12:59 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 21 Apr 2016 11:12:59 -0700 Subject: [Neuroimaging] Example code for NiBabel In-Reply-To: References: Message-ID: Hi, On Thu, Apr 21, 2016 at 9:36 AM, Simon Rushton wrote: > Hi All > > > > I'm new to doing analysis of fmri data using Python and NiBabel. Could > someone point me to some very simple code to get me started. I'm thinking > of a program just a few lines long that loads a 4D NIfTI file, thresholds > each of the individual 3D volumes, and then saves the result out as a new 4D > NIfTI file. Some simple code like that would get me past all things like > accessing data via memory pointers etc so I can start learning rather than > just scratching my head! And yes, I know that most of this is in the > NiBabel documentation but I'm obviously not putting the snippets together > correctly as my code doesn't work... Does this doc help? http://nipy.org/nibabel/gettingstarted.html Can you post what you have, and the error you are getting?
Thanks a lot, Matthew From gael.varoquaux at normalesup.org Thu Apr 21 14:23:30 2016 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 21 Apr 2016 20:23:30 +0200 Subject: [Neuroimaging] Example code for NiBabel In-Reply-To: References: Message-ID: <20160421182330.GJ455010@phare.normalesup.org> Hi, I know that this is probably not the answer that you are looking for, but in nilearn (which relies on nibabel) it's a couple of lines of code: from nilearn import image image.threshold_img('input.nii', threshold=3).to_filename('output.nii') Reference documentation: http://nilearn.github.io/modules/generated/nilearn.image.threshold_img.html Cheers, Gaël On Thu, Apr 21, 2016 at 04:36:14PM +0000, Simon Rushton wrote: > Hi All > I'm new to doing analysis of fmri data using Python and NiBabel. Could someone > point me to some very simple code to get me started. I'm thinking of a program > just a few lines long that loads a 4D NIfTI file, thresholds each of the > individual 3D volumes, and then saves the result out as a new 4D NIfTI file. > Some simple code like that would get me past all things like accessing data via > memory pointers etc so I can start learning rather than just scratching my > head! And yes, I know that most of this is in the NiBabel documentation but > I'm obviously not putting the snippets together > correctly as my code doesn't > work...
simon > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -- Gael Varoquaux Researcher, INRIA Parietal NeuroSpin/CEA Saclay , Bat 145, 91191 Gif-sur-Yvette France Phone: ++ 33-1-69-08-79-68 http://gael-varoquaux.info http://twitter.com/GaelVaroquaux From garyfallidis at gmail.com Sat Apr 23 20:23:32 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Sun, 24 Apr 2016 00:23:32 +0000 Subject: [Neuroimaging] [DIPY] Selection results for GSoC 2016 Message-ID: Dear all, Shahnawaz Ahmed, Ranveer Aggarwal, Bishakh Ghosh and Riddhish Bhalodia are the 4 students selected for participating at GSoC with DIPY. This is really great news as we moved from 2 funded projects (last year) to 4 (this year). We would like to thank everyone who applied and especially those who submitted fixes and PRs and welcome them to continue contributing beyond GSoC. Now for the 4 students (and their mentors) there is plenty of very cool but also very hard work to do during this summer!!! So, RISE and GRIND!!! Best regards, Eleftherios p.s. As the students move forward with their projects they will be reporting their progress in their blogs. You may also keep an eye on their upcoming PRs. The more eyes reviewing the better :) -------------- next part -------------- An HTML attachment was scrubbed...
URL: From stefanv at berkeley.edu Mon Apr 25 19:33:15 2016 From: stefanv at berkeley.edu (=?UTF-8?Q?St=C3=A9fan_van_der_Walt?=) Date: Mon, 25 Apr 2016 23:33:15 +0000 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: On Tue, 12 Apr 2016 at 08:27 Ariel Rokem wrote: > In a conversation I had with Rafael recently, he mentioned to me the > Journal of Open Research Software ( > http://openresearchsoftware.metajnl.com/) that publishes articles about > open-source research software, and proposed this as a good place to publish > software contributions in our community. > Karthik Ram recently told me about http://joss.theoj.org/about I would like to hear what others think of this journal (tl;dr: it takes about 1-3 hrs to prepare a paper for publication in their peer-reviewed journal). Stéfan -------------- next part -------------- An HTML attachment was scrubbed... URL: From vsochat at stanford.edu Mon Apr 25 19:48:57 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Mon, 25 Apr 2016 16:48:57 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: I would be skeptical about any journal that opens with: The Journal of Open Source Software (JOSS) is a legitimate academic > journal. :) On Mon, Apr 25, 2016 at 4:33 PM, Stéfan van der Walt wrote: > On Tue, 12 Apr 2016 at 08:27 Ariel Rokem wrote: > >> In a conversation I had with Rafael recently, he mentioned to me the >> Journal of Open Research Software ( >> http://openresearchsoftware.metajnl.com/) that publishes articles about >> open-source research software, and proposed this as a good place to publish >> software contributions in our community. >> > > Karthik Ram recently told me about > > http://joss.theoj.org/about > > I would like to hear what others think of this journal (tl;dr: it takes > about 1-3 hrs to prepare a paper for publication in their peer-reviewed > journal).
> Stéfan > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Mon Apr 25 19:49:13 2016 From: arokem at gmail.com (Ariel Rokem) Date: Mon, 25 Apr 2016 16:49:13 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: On Mon, Apr 25, 2016 at 4:33 PM, Stéfan van der Walt wrote: > On Tue, 12 Apr 2016 at 08:27 Ariel Rokem wrote: > >> In a conversation I had with Rafael recently, he mentioned to me the >> Journal of Open Research Software ( >> http://openresearchsoftware.metajnl.com/) that publishes articles about >> open-source research software, and proposed this as a good place to publish >> software contributions in our community. >> > > Karthik Ram recently told me about > > http://joss.theoj.org/about > > I would like to hear what others think of this journal (tl;dr: it takes > about 1-3 hrs to prepare a paper for publication in their peer-reviewed > journal). > Interesting. Looks like it's not quite up and running yet. Do you know if they are planning a mechanism whereby PRs could be counted as distinct contributions? > Stéfan > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gael.varoquaux at normalesup.org Tue Apr 26 01:27:39 2016 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 26 Apr 2016 07:27:39 +0200 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: References: Message-ID: <20160426052739.GP3689071@phare.normalesup.org> On Mon, Apr 25, 2016 at 11:33:15PM +0000, Stéfan van der Walt wrote: > I would like to hear what others think of this journal (tl;dr: it takes about > 1-3 hrs to prepare a paper for publication in their peer-reviewed journal). 1-3 hrs to prepare a paper! Is that a good thing? It takes me more than 3 hours to peer review a publication. It takes me at least an hour to read one. Don't we already have too many publications of low quality? G From vsochat at stanford.edu Tue Apr 26 01:58:44 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Mon, 25 Apr 2016 22:58:44 -0700 Subject: [Neuroimaging] Journal articles based on PRs In-Reply-To: <20160426052739.GP3689071@phare.normalesup.org> References: <20160426052739.GP3689071@phare.normalesup.org> Message-ID: However, I do like the idea of an alternative to peer reviewed publication to get attention for something like a small piece of quasi-finished software. There are already options for this - namely "git push origin master" or spending a few hours to write a blog post, and the missing component is hooking those things into an organized feed to be viewed by the appropriate folks to give feedback (e.g., something like 1 ,2 ,3 ). Maybe this "JOSS" is something akin to that, however it seems like a mistake to advertise as a journal, because it's just not. I think advertising as "something else" would be desired, and the challenge is making that something else appealing to researchers to offer quick, high volume (likely harsh but useful) feedback. Given our current incentives and culture, methinks it would be named something like "The Scoop Shoot."
:) On Mon, Apr 25, 2016 at 10:27 PM, Gael Varoquaux < gael.varoquaux at normalesup.org> wrote: > On Mon, Apr 25, 2016 at 11:33:15PM +0000, Stéfan van der Walt wrote: > > I would like to hear what others think of this journal (tl;dr: it takes > about > > 1-3 hrs to prepare a paper for publication in their peer-reviewed > journal). > > 1-3 hrs to prepare a paper! Is that a good thing? It takes me more than 3 > hours to peer review a publication. It takes me at least an hour to read > one. Don't we already have too many publications of low quality? > > G > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From RushtonSK at cardiff.ac.uk Tue Apr 26 10:33:21 2016 From: RushtonSK at cardiff.ac.uk (Simon Rushton) Date: Tue, 26 Apr 2016 14:33:21 +0000 Subject: [Neuroimaging] Example code for NiBabel In-Reply-To: References: Message-ID: Thanks all for the help! I found the solution. The problem wasn't with Nibabel but my (lack of) familiarity with Python. I've learnt that the meaning of '=' depends on what is being assigned.. simon > On 21 Apr 2016, at 19:12, Matthew Brett wrote: > > Hi, > > On Thu, Apr 21, 2016 at 9:36 AM, Simon Rushton wrote: >> Hi All >> >> >> >> I'm new to doing analysis of fmri data using Python and NiBabel. Could >> someone point me to some very simple code to get me started. I'm thinking >> of a program just a few lines long that loads a 4D NIfTI file, thresholds >> each of the individual 3D volumes, and then saves the result out as a new 4D >> NIfTI file. Some simple code like that would get me past all things like >> accessing data via memory pointers etc so I can start learning rather than >> just scratching my head!
And yes, I know that most of this is in the >> NiBabel documentation but I'm obviously not putting the snippets together >> correctly as my code doesn't work... > > Does this doc help? http://nipy.org/nibabel/gettingstarted.html > > Can you post what you have, and the error you are getting? > > Thanks a lot, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging From brahim.belaoucha at inria.fr Wed Apr 27 04:49:46 2016 From: brahim.belaoucha at inria.fr (Brahim Belaoucha) Date: Wed, 27 Apr 2016 10:49:46 +0200 (CEST) Subject: [Neuroimaging] Contribute to cortical surface parcellation In-Reply-To: <94572555.27492865.1461746652160.JavaMail.zimbra@inria.fr> Message-ID: <1021409098.27496734.1461746986759.JavaMail.zimbra@inria.fr> Good morning, I am sending you this email to ask if it is possible to include my code that uses the probabilistic tractography results to parcellate the cortical surface using different metrics. My code now uses the results of FSL probabilistic tractography (.nii.gz images). -- --- Sincerely Brahim Belaoucha PhD Student Athena Project Team Inria Sophia Antipolis - Méditerranée http://www-sop.inria.fr/athena/ Phone: (+33) 4-9238-7557 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: inr_logo_corpo_UK_coul.png Type: image/png Size: 6535 bytes Desc: not available URL: From arokem at gmail.com Wed Apr 27 10:49:16 2016 From: arokem at gmail.com (Ariel Rokem) Date: Wed, 27 Apr 2016 07:49:16 -0700 Subject: [Neuroimaging] Contribute to cortical surface parcellation In-Reply-To: <1021409098.27496734.1461746986759.JavaMail.zimbra@inria.fr> References: <94572555.27492865.1461746652160.JavaMail.zimbra@inria.fr> <1021409098.27496734.1461746986759.JavaMail.zimbra@inria.fr> Message-ID: Hi Brahim, Is the code already publicly available (e.g. on Github)? I think that we can include projects that are in line with our code of conduct ( http://nipy.org/conduct.html) in the nipy.org list of projects (on the front page of nipy.org). Cheers, Ariel On Wed, Apr 27, 2016 at 1:49 AM, Brahim Belaoucha wrote: > Good morning, > I am sending you this email to ask if it is possible to include my code > that uses the probabilistic tractography results to parcellate the cortical > surface using different metrics. > My code now uses the results of FSL probabilistic tractography (.nii.gz > images). > > > -- > --- > Sincerely > Brahim Belaoucha > > PhD Student > Athena Project Team > Inria Sophia Antipolis - Méditerranée > http://www-sop.inria.fr/athena/ > Phone: (+33) 4-9238-7557 > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: inr_logo_corpo_UK_coul.png Type: image/png Size: 6535 bytes Desc: not available URL: From garyfallidis at gmail.com Wed Apr 27 13:24:23 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Wed, 27 Apr 2016 17:24:23 +0000 Subject: [Neuroimaging] Contribute to cortical surface parcellation In-Reply-To: <1021409098.27496734.1461746986759.JavaMail.zimbra@inria.fr> References: <94572555.27492865.1461746652160.JavaMail.zimbra@inria.fr> <1021409098.27496734.1461746986759.JavaMail.zimbra@inria.fr> Message-ID: Hi Brahim, Apart from the information that Ariel asked I would like to tell you that there is no problem adding your algorithm in DIPY. But it would be great if you could generate probabilistic NIfTI files from our existing probabilistic tracking algorithms rather than always requiring FSL files. This would make your method available to a much larger range of input methods. But let's go step by step. Please send us your reference paper if you have already something submitted. It is also important for us to know if you are willing to help maintaining your code and fixing bugs let's say for at least 2 years after your code is merged. Maintenance is time consuming and we need to be sure that you can help with that. Let us know if you have other questions. Best regards, Eleftherios On Wed, Apr 27, 2016 at 10:42 AM Brahim Belaoucha wrote: > Good morning, > I am sending you this email to ask if it is possible to include my code > that uses the probabilistic tractography results to parcellate the cortical > surface using different metrics. > My code now uses the results of FSL probabilistic tractography (.nii.gz > images).
> > > -- > --- > Sincerely > Brahim Belaoucha > > PhD Student > Athena Project Team > Inria Sophia Antipolis - M?diterran?e > http://www-sop.inria.fr/athena/ > Phone: (+33) 4-9238-7557 > [image: inr_logo_corpo_UK_coul.png] > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: inr_logo_corpo_UK_coul.png Type: image/png Size: 6535 bytes Desc: not available URL: From hucheng at indiana.edu Wed Apr 27 16:06:51 2016 From: hucheng at indiana.edu (Cheng, Hu) Date: Wed, 27 Apr 2016 20:06:51 +0000 Subject: [Neuroimaging] post-doc position at Indiana University Message-ID: <54bc58c54cbb42138fb408d133159baa@in-cci-exch08.ads.iu.edu> Dear all, We are seeking a motivated individual to work on the methodology development in neuroimaging as a post-doc/research scientist. The position starts in July 2016. The description of the position is attached. Hu Cheng, Ph.D., DABMP MRI Physicist, Imaging Research Facility Department of Psychological and Brain Sciences Adjunct Professor, Department of Physics Indiana University Bloomington, IN 47405 Tel. 812-856-2518 Fax. 812-855-4691 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: Postdoctoral position in Neuroimaging.docx Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document Size: 14895 bytes Desc: Postdoctoral position in Neuroimaging.docx URL: From vivekjoshi1894 at gmail.com Thu Apr 28 03:44:33 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Thu, 28 Apr 2016 13:14:33 +0530 Subject: [Neuroimaging] Regarding PIESNO Paper by Koay Message-ID: Respected Sir, My name is Vivek Joshi and I am doing a project on PIESNO by Koay for my final year project. We implemented the PIESNO code in Python. But our Professor is asking us to identify the noise (whether Rician or Gaussian) present in MRI images. So it would help me if you have any idea on how to identify noise. Please think about it. THANK YOU!! -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Thu Apr 28 07:36:06 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Thu, 28 Apr 2016 13:36:06 +0200 Subject: [Neuroimaging] Regarding PIESNO Paper by Koay In-Reply-To: References: Message-ID: As a first point, PIESNO actually works only (based on the math) for Rician or non-central chi distributed noise (which is what is commonly found in magnitude images in MRI) and corrects for that case to get back an estimate of the standard deviation based on a single Gaussian distribution. You could have a look at [1] to get a feeling for the various noise distributions encountered in most cases, as the Gaussian case would only apply if you also collect phase data I'd say (or do fancy pre-processing scanner side, which is not commonly done by everyone). Anyway, to get closer to your question, you can have a look at [2], which gives methods to estimate the number of coils of the acquisition. Although these require knowledge of the type of acquisition you have, it does lay some groundwork for ideas on how to estimate the noise from any acquisition.
I also tried to implement the method of [3] (probably have some code lying around if I look hard enough if you would want it) but found the estimation to not be in line with my simulations (could also be my quick implementation, I had nothing to compare against). Talking about implementation, if you want to validate/add features to it, there is also another PIESNO implementation over here : https://github.com/nipy/dipy/blob/master/dipy/denoise/noise_estimate.py This was validated against the original closed source code, but as you can see I used the precomputed tables from the article. If you also have the original equations implemented instead of the precomputed values it would be a great addition (since it requires 1D optimisation, I thought it would be too slow to compute it each time, so you might also want to use these tables instead in that case.) Anyway, if you need more help feel free to ask back. Samuel [1] Dietrich et al. 2008 Measurement of signal-to-noise ratios in MR images: Influence of multichannel coils, parallel imaging, and reconstruction filters [2] Aja-Fernández, S., Vegas-Sánchez-Ferrero, G., Tristán-Vega, A., 2014. Noise estimation in parallel MRI: GRAPPA and SENSE. Magnetic Resonance Imaging 32, 281-90. [3] Veraart et al. 2013 Comprehensive framework for accurate diffusion MRI parameter estimation 2016-04-28 9:44 GMT+02:00 Vivek Joshi : > Respected Sir, > My name is Vivek Joshi and i am doing a project on > PIESNO by Koay for my final year project. We implemented the PIESNO code in > python. But our Professor is asking us to identify the noise (whether > rician or gaussian) present in mri images. So it would help me if you have > any idea on how to identify noise. Please think about it. > > THANK YOU!! > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed...
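[To make Samuel's first point concrete: in the background of a magnitude image, the modulus of complex Gaussian noise follows a Rayleigh distribution, the zero-signal special case of the Rician, so the Gaussian intuition of zero mean and std equal to sigma no longer holds. A quick numpy simulation, purely illustrative and not taken from any of the packages discussed:]

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 5.0      # noise std of each real/imaginary channel
n = 1_000_000

# Zero-signal voxels: independent Gaussian noise in the real and imaginary channels.
real = rng.normal(0.0, sigma, n)
imag = rng.normal(0.0, sigma, n)

# A magnitude image stores |real + i*imag|, which is Rayleigh-distributed.
mag = np.hypot(real, imag)

print(mag.mean())   # ~ sigma * sqrt(pi/2)    (about 6.27 here, not 0)
print(mag.std())    # ~ sigma * sqrt(2 - pi/2) (about 3.28, not 5)
```

This is why a naive "mean of the background" estimate of sigma is biased, and why PIESNO builds the Rician/non-central chi correction into its estimate.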
URL: From garyfallidis at gmail.com Thu Apr 28 12:10:54 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Thu, 28 Apr 2016 16:10:54 +0000 Subject: [Neuroimaging] [Neuroimaing][DIPY] Preparations for OHBM 2016 - workshop, hackathon, stand and releasing Message-ID: Dear DIPY developers, We need to start planning for our workshop, hackathon and exhibition stand for OHBM 2016 in Geneva. First I need a list of all DIPY developers/contributors who will attend OHBM and they are eager to help in any possible way. So far I know that Francois, Jasmeen, Julio, Rafael and Stephan Meesters will attend OHBM. With me included, we are 6 people in total. If we can coordinate we can share the load and promote DIPY and Neuroimaging in Python very well. If other people join too that is even better. Please contact me asap if you are planning to attend OHBM and your name was not stated above. a) Exhibition stand We need to buy new t-shirts. Me, Francois and Jasmeen will work on that. The cost will be around $20-25 (USD). Get back to me if you need one (even if you do not attend the conference). Perhaps we can send it to you. We need a banner. Julio and me can work on designing a nice banner. We need leaflets. Number to be discussed later. We need a monitor. Maybe we can bring a monitor from a nearby affiliated center. Otherwise usually renting a screen can be very expensive. I will investigate if the friends from EPFL can help with that. b) Hackathon Okay we should definitely participate in the hackathon. Think of projects that can take place in a couple of days. I know Julio has already a nice idea for a sprint. We should try to register that for a sprint during the Hackathon. Any ideas are much welcome too! Let's brainstorm! c) Workshop We have a DIPY workshop during OHBM. This is on the morning of Tuesday. The workshop is for two hours. 
I will take the first 45' to give an overview of DIPY and speak about some of the state-of-the-art and then I would like that, for the rest of the time, the other contributors can show hands-on examples related (or not) to their work. d) Release To support the hackathon and workshop we need to move on with merging some very important PRs. I will sent an e-mail later with what I think is important to merge first. Absolutely critical is the new streamlines API. I have already talked with Matthew Brett who is now helping to merge the new improved streamlines API in Nibabel. After we have this merged, we release Nibabel, update DIPY with the new Nibabel and release DIPY before OHBM!!! We will need the help from all developers with reviewing PRs to make sure we can meet the deadline at a high quality and impress our potential new users visiting OHBM. Please e-mail me or Ariel (or even better express interest in gitter) if you want to help with reviewing. I hope you are all excited about this! Crack on! Best regards, Eleftherios p.s. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vivekjoshi1894 at gmail.com Thu Apr 28 13:00:05 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Thu, 28 Apr 2016 22:30:05 +0530 Subject: [Neuroimaging] Neuroimaging Digest, Vol 11, Issue 21 In-Reply-To: References: Message-ID: Thank You Samuel St-Jean! I will Try the steps you mentioned.... On Thu, Apr 28, 2016 at 9:30 PM, wrote: > Send Neuroimaging mailing list submissions to > neuroimaging at python.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.python.org/mailman/listinfo/neuroimaging > or, via email, send a message with subject or body 'help' to > neuroimaging-request at python.org > > You can reach the person managing the list at > neuroimaging-owner at python.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Neuroimaging digest..." 
> > > Today's Topics: > > 1. Re: Contribute to cortical surface parcellation > (Eleftherios Garyfallidis) > 2. post-doc position at Indiana University (Cheng, Hu) > 3. Regarding PIESNO Paper by Koay (Vivek Joshi) > 4. Re: Regarding PIESNO Paper by Koay (Samuel St-Jean) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Wed, 27 Apr 2016 17:24:23 +0000 > From: Eleftherios Garyfallidis > To: Neuroimaging analysis in Python > Subject: Re: [Neuroimaging] Contribute to cortical surface > parcellation > Message-ID: > KqqPHo1F0jcnOMNZ1nuBNMEpjE7Q at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > Hi Brahim, > > Apart from the information that Ariel asked for, I would like to tell you that > there is no problem adding your algorithm to DIPY. But it would be great if > you could generate probabilistic nifti files from our existing > probabilistic tracking algorithms rather than always requiring FSL files. > This would make your method available to a much larger range of input > methods. But let's go step by step. Please send us your reference paper if > you already have something submitted. > > It is also important for us to know if you are willing to help maintain > your code and fix bugs, let's say for at least 2 years after your code is > merged. Maintenance is time-consuming and we need to be sure that you can > help with that. > > Let us know if you have other questions. > > Best regards, > Eleftherios > > > On Wed, Apr 27, 2016 at 10:42 AM Brahim Belaoucha < > brahim.belaoucha at inria.fr> > wrote: > > > Good morning, > > I am sending you this email to ask if it is possible to include my code > > that uses the probabilistic tractography results to parcellate the > cortical > > surface using different metrics. > > My code now uses the results of FSL probabilistic tractography (.nii.gz > > images). 
> > > > > > -- > > --- > > Sincerely > > Brahim Belaoucha > > > > PhD Student > > Athena Project Team > > Inria Sophia Antipolis - Méditerranée > > http://www-sop.inria.fr/athena/ > > Phone: (+33) 4-9238-7557 > > [image: inr_logo_corpo_UK_coul.png] > > > > _______________________________________________ > > Neuroimaging mailing list > > Neuroimaging at python.org > > https://mail.python.org/mailman/listinfo/neuroimaging > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160427/b2e61289/attachment-0001.html > > > -------------- next part -------------- > A non-text attachment was scrubbed... > Name: inr_logo_corpo_UK_coul.png > Type: image/png > Size: 6535 bytes > Desc: not available > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160427/b2e61289/attachment-0001.png > > > > ------------------------------ > > Message: 2 > Date: Wed, 27 Apr 2016 20:06:51 +0000 > From: "Cheng, Hu" > To: "neuroimaging at python.org" > Subject: [Neuroimaging] post-doc position at Indiana University > Message-ID: > <54bc58c54cbb42138fb408d133159baa at in-cci-exch08.ads.iu.edu> > Content-Type: text/plain; charset="us-ascii" > > Dear all, > > We are seeking a motivated individual to work on the methodology > development in neuroimaging as a post-doc/research scientist. The position > starts in July 2016. The description of the position is attached. > > Hu Cheng, Ph.D., DABMP > MRI Physicist, Imaging Research Facility > Department of Psychological and Brain Sciences > Adjunct Professor, Department of Physics > Indiana University > Bloomington, IN 47405 > Tel. 812-856-2518 > Fax. 812-855-4691 > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160427/b97bb6ec/attachment-0001.html > > > -------------- next part -------------- > A non-text attachment was scrubbed... 
> Name: Postdoctoral position in Neuroimaging.docx > Type: > application/vnd.openxmlformats-officedocument.wordprocessingml.document > Size: 14895 bytes > Desc: Postdoctoral position in Neuroimaging.docx > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160427/b97bb6ec/attachment-0001.docx > > > > ------------------------------ > > Message: 3 > Date: Thu, 28 Apr 2016 13:14:33 +0530 > From: Vivek Joshi > To: neuroimaging at python.org > Subject: [Neuroimaging] Regarding PIESNO Paper by Koay > Message-ID: > A at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > Respected Sir, > My name is Vivek Joshi and I am doing a project on > PIESNO by Koay for my final-year project. We implemented the PIESNO code in > Python. But our professor is asking us to identify the noise (whether > Rician or Gaussian) present in MRI images. So it would help me if you have > any ideas on how to identify noise. Please think about it. > > THANK YOU!! > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160428/c6843c36/attachment-0001.html > > > > ------------------------------ > > Message: 4 > Date: Thu, 28 Apr 2016 13:36:06 +0200 > From: Samuel St-Jean > To: Neuroimaging analysis in Python > Subject: Re: [Neuroimaging] Regarding PIESNO Paper by Koay > Message-ID: > < > CAKADGuqbvubeL+BtkXeLuBB-7YLtz3fn1wxi2MOB59fi6uAhhA at mail.gmail.com> > Content-Type: text/plain; charset="utf-8" > > As a first point, PIESNO actually works only (based on the math) for Rician > or noncentral chi distributed noise (which is what is commonly found in > magnitude images in MRI) and corrects for that case to get back an > estimate of the standard deviation based on a single Gaussian > distribution. 
You could have a look at [1] to get a feeling for the various > noise distributions encountered in most cases; the Gaussian case would > only apply if you also collect phase data, I'd say (or do fancy > pre-processing scanner-side, which is not commonly done by everyone). > > Anyway, to go more toward your question, you can have a look at [2], which gives > methods to estimate the number of coils of the acquisition. Although these > require knowledge of the type of acquisition you have, they do lay some > groundwork for ideas on how to estimate the noise from any acquisition. I > also tried to implement the method of [3] (I probably have some code lying > around if I look hard enough, if you want it) but found the estimation > not to be in line with my simulations (it could also be my quick > implementation; I had nothing to compare against). > > Talking about implementation, if you want to validate/add features to it, > there is also another PIESNO implementation over here: > https://github.com/nipy/dipy/blob/master/dipy/denoise/noise_estimate.py > > This was validated against the original closed-source code, but as you can > see I used the precomputed tables from the article. If you also have the > original equations implemented instead of the precomputed values, it would be > a great addition (since it requires 1D optimisation, I thought it would be > too slow to compute each time, so you might also want to use these > tables instead in that case). > > Anyway, if you need more help feel free to ask back. > > Samuel > > [1] Dietrich et al. 2008 Measurement of signal-to-noise ratios in MR > images: Influence of multichannel coils, parallel imaging, and > reconstruction filters > [2] Aja-Fernández, S., Vegas-Sánchez-Ferrero, G., Tristán-Vega, A., 2014. > Noise estimation in parallel MRI: GRAPPA and SENSE. Magnetic resonance > imaging 32, 281-90. > [3] Veraart et al. 
2013 Comprehensive framework for accurate diffusion MRI > parameter estimation > > 2016-04-28 9:44 GMT+02:00 Vivek Joshi : > > > Respected Sir, > > My name is Vivek Joshi and I am doing a project on > > PIESNO by Koay for my final-year project. We implemented the PIESNO code > in > > Python. But our professor is asking us to identify the noise (whether > > Rician or Gaussian) present in MRI images. So it would help me if you > have > > any ideas on how to identify noise. Please think about it. > > > > THANK YOU!! > > > > _______________________________________________ > > Neuroimaging mailing list > > Neuroimaging at python.org > > https://mail.python.org/mailman/listinfo/neuroimaging > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: < > http://mail.python.org/pipermail/neuroimaging/attachments/20160428/967f4b71/attachment-0001.html > > > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > ------------------------------ > > End of Neuroimaging Digest, Vol 11, Issue 21 > ******************************************** > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sw742 at cam.ac.uk Sat Apr 30 11:52:51 2016 From: sw742 at cam.ac.uk (S. Winzeck) Date: Sat, 30 Apr 2016 16:52:51 +0100 Subject: [Neuroimaging] mTOP: Call for Participation In-Reply-To: References: Message-ID: <690d101228cc0291548be8e8d89a1695@cam.ac.uk> Dear All, This year I am organizing a challenge for analysis of traumatic brain injury MRI data; please find further details below. 
Thank you Stefan =================================================== mTOP: Call for Participation Mild Traumatic Brain Injury Outcome Prediction =================================================== What: MRI Feature Extraction / Unsupervised Classification When: 17 October 2016 Where: MICCAI 2016, Athens, Greece, http://www.miccai2016.org/en/ Web: https://tbichallenge.wordpress.com/ Traumatic brain injury (TBI) is a leading cause of death and long-term disability across demographics. Predicting the outcome of patients suffering from TBI could bring many benefits for clinical decision-making and the development of new therapeutic concepts. However, the strong heterogeneity of the injury pattern and the complex change of pathology complicate the prognosis. This especially holds true for mild TBI (mTBI), where lesions are non-prevalent and conventional MRI often appears normal, but injury can cause post-concussional symptoms and neurocognitive dysfunction. This challenge asks for methods that focus on finding differences between healthy subjects and TBI patients and sorting the given data into distinct categories in an unsupervised manner. Researchers are invited to submit novel methods or their existing ones. mTOP will be held in conjunction with the Brain Lesion Segmentation Workshop at the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) 2016. We look forward to welcoming you as a participant. Best Regards, Stefan Winzeck, Marta Correia, Ben Glocker & Bjoern Menze
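The "unsupervised manner" of sorting described in the call (grouping subjects into distinct categories without any labels) can be sketched with a minimal k-means run on toy two-dimensional features. Everything below is invented for illustration: the two synthetic groups stand in for MRI-derived feature vectors and have nothing to do with the actual mTOP data or any submitted method.

```python
# Minimal k-means sketch of unsupervised classification: alternate between
# assigning each point to its nearest center and moving each center to the
# mean of its assigned points. Pure standard library, toy data only.
import random
import math

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centers from the data
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Update step: each center moves to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(dim) / len(cl) for dim in zip(*cl))
    return centers, clusters

# Two well-separated synthetic groups of 30 "subjects" each.
rng = random.Random(1)
group_a = [(rng.gauss(0.0, 0.1), rng.gauss(0.0, 0.1)) for _ in range(30)]
group_b = [(rng.gauss(3.0, 0.1), rng.gauss(3.0, 0.1)) for _ in range(30)]
centers, clusters = kmeans(group_a + group_b, k=2)
```

The point is only the label-free assign/update loop; a real entry would replace the toy tuples with per-subject imaging features and a more careful choice of k and distance.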