[Neuroimaging] iterating a workflow over inputs

Satrajit Ghosh satra at mit.edu
Wed Jan 11 10:47:37 EST 2017


hi ian,

in the current API, there are a few ways to do this, but all involve
wrapping the subworkflow in something.

option 1: create a function node
option 2: create a workflow interface

in both cases, some code will have to take the node/interface inputs,
map them to the inputs of the subworkflow, and map the outputs of the
subworkflow back to the outputs of the node/interface. however, unless
your cluster allows job submission from arbitrary nodes, you may need to
preallocate resources.

a function node example:

from nipype.pipeline.engine import Node
from nipype.interfaces.utility import Function

def run_subwf(input1, input2, plugin='MultiProc',
              plugin_args={'n_procs': 2}):
    import os
    from myscripts import create_workflow_func

    # build the subworkflow and map the node inputs onto its inputnode
    wf = create_workflow_func()
    wf.inputs.inputnode.input1 = input1
    wf.inputs.inputnode.input2 = input2
    wf.base_dir = os.getcwd()

    # wf.run returns the execution graph of the finished run
    egraph = wf.run(plugin=plugin, plugin_args=plugin_args)

    # locate the output node in the execution graph and map its
    # results back to the function node's outputs
    outputnode = [node for node in egraph.nodes()
                  if 'outputnode' in node.name][0]
    return outputnode.result.outputs.out1, outputnode.result.outputs.out2

subnode = Node(Function(input_names=['input1', 'input2'],
                        output_names=['out1', 'out2'],
                        function=run_subwf),
               name='subwf')
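
since the wrapped subworkflow is now an ordinary node, iterating it over
inputs is just a matter of putting the same function in a MapNode (a
minimal sketch, untested; the input list here is made up):

from nipype.pipeline.engine import MapNode
from nipype.interfaces.utility import Function

# map run_subwf over a list of input1 values; input2 stays fixed
mapped = MapNode(Function(input_names=['input1', 'input2'],
                          output_names=['out1', 'out2'],
                          function=run_subwf),
                 iterfield=['input1'],
                 name='subwf_mapped')
mapped.inputs.input1 = ['sub-01', 'sub-02']  # hypothetical input list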

one could probably optimize a few things automatically given a workflow.
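
for option 2, a sketch along the same lines (untested; the class and
spec names are placeholders, not an existing interface) could look like:

from nipype.interfaces.base import (BaseInterface, BaseInterfaceInputSpec,
                                    TraitedSpec, traits)

class SubWFInputSpec(BaseInterfaceInputSpec):
    input1 = traits.Any(mandatory=True)
    input2 = traits.Any(mandatory=True)

class SubWFOutputSpec(TraitedSpec):
    out1 = traits.Any()
    out2 = traits.Any()

class SubWF(BaseInterface):
    input_spec = SubWFInputSpec
    output_spec = SubWFOutputSpec

    def _run_interface(self, runtime):
        import os
        from myscripts import create_workflow_func
        # same input/output mapping as the function node version
        wf = create_workflow_func()
        wf.inputs.inputnode.input1 = self.inputs.input1
        wf.inputs.inputnode.input2 = self.inputs.input2
        wf.base_dir = os.getcwd()
        egraph = wf.run(plugin='MultiProc', plugin_args={'n_procs': 2})
        outputnode = [node for node in egraph.nodes()
                      if 'outputnode' in node.name][0]
        self._out1 = outputnode.result.outputs.out1
        self._out2 = outputnode.result.outputs.out2
        return runtime

    def _list_outputs(self):
        outputs = self._outputs().get()
        outputs['out1'] = self._out1
        outputs['out2'] = self._out2
        return outputs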

in the next generation API, this will be doable without creating these
special nodes/interfaces.

cheers,

satra

On Tue, Jan 10, 2017 at 11:47 AM, Ian Malone <ibmalone at gmail.com> wrote:

> On 9 January 2017 at 18:59, Ian Malone <ibmalone at gmail.com> wrote:
> > Hi,
> >
> > I've got a relatively complex workflow that I'd like to use as a
> > sub-workflow of another one, however it needs to be iterated over some
> > of the inputs. I suppose I could replace the appropriate pe.Node()s in
> > it with MapNode()s, but there are a fair number of them, and quite a
> > few connections. (I also think that this would prevent it being used
> > on single instance inputs without first packing them into a list,
> > though I could be wrong.)
> >
>
> > Is this at all possible, or should I bite the bullet and start
> > MapNode-ing the sub-workflow?
>
> This has turned out to be doubly interesting as I forgot my
> sub-workflow already had its own sub-workflow, which is already used
> elsewhere with a single set of inputs. I suppose I can use
> interfaces.utility.Split() to extract the single output again in that
> case, but the bigger workflow (which I'd also like to use elsewhere)
> has quite a few outputs, and connecting a split to each one seems a
> bit unwieldy. Any good solutions to this?
>
> --
> imalone