On Tuesday, August 30, 2016 at 11:39:11 AM UTC+2, Robert Cimrman wrote:
On 08/29/2016 03:04 PM, Ronghai Wu wrote:
> Thanks, and sorry for the delayed feedback. It took me some time to learn MPI,
> and the parallel examples work for me. However, since I couple sfepy with
> fipy, fipy automatically partitions the mesh when running with MPI, so I
> have to figure out a way to make the subdomains the same for sfepy and fipy.
>
> Regards
> Ronghai

You can replace the call to pl.partition_mesh() with a function that returns
the fipy partitioning - it should return cell_tasks: an array containing a
task (partition) number (starting from 0) for each element in the mesh.

r.
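
For concreteness, a minimal sketch of such a replacement function; the
contiguous block split below is only a placeholder illustrating the expected
return format, standing in for the actual fipy-derived cell-to-task mapping:

import numpy as nm

def partition_mesh_from_fipy(mesh, size, **kwargs):
    # Drop-in replacement for pl.partition_mesh(): return one task
    # (partition) number in 0 .. size - 1 for each cell of the sfepy mesh.
    # The contiguous block split is just a placeholder - in practice,
    # fill cell_tasks with the fipy subdomain id of each cell.
    cell_tasks = nm.empty(mesh.n_el, dtype=nm.int32)
    block = (mesh.n_el + size - 1) // size
    for it in range(size):
        cell_tasks[it * block:(it + 1) * block] = it
    return cell_tasks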


Thanks for your suggestion. I tried several times and think it is better to pass the whole (global) mesh and values from fipy to sfepy, then let sfepy do its own partitioning and solve the mechanical equilibrium equation, and finally pass the whole (global) stress and strain back to fipy. But I have some questions about parallel solving in sfepy:
(1) In the interactive linear elasticity example [1], the residual is printed out during solving, but not in the parallel example [2]. In that case, how can we judge whether the solution has converged and whether the precision is good enough?
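
One possibility, assuming the parallel solve in [2] goes through a petsc4py
KSP object (the name ksp below is an assumption), is to query the solver
state after the solve - these are standard petsc4py calls:

from petsc4py import PETSc

# `ksp` stands for the petsc4py Krylov solver used for the parallel solve.
reason = ksp.getConvergedReason()  # > 0 means converged, < 0 diverged
n_iter = ksp.getIterationNumber()  # iterations performed
rnorm = ksp.getResidualNorm()      # final residual norm
if PETSc.COMM_WORLD.getRank() == 0:
    print('converged: %d, iterations: %d, residual: %e'
          % (reason, n_iter, rnorm))

Alternatively, the standard PETSc command-line option -ksp_monitor makes
PETSc print the residual norm at every iteration.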
(2) In the interactive linear elasticity example [1], we can access the displacement, stress and strain arrays by

vec = pb.solve()
strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')
stress = pb.evaluate('ev_cauchy_stress.2.Omega(u)', mode='el_avg')

Can I access the whole (global) displacement, stress and strain arrays in the parallel example?
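
For the displacement, a sketch of one way to do it, assuming psol is the
distributed PETSc solution vector from [2] (the name is an assumption);
PETSc.Scatter.toZero() is standard petsc4py API for gathering a distributed
vector on rank 0:

from petsc4py import PETSc

# Gather the distributed solution vector onto rank 0.
scatter, psol_full = PETSc.Scatter.toZero(psol)
scatter.scatter(psol, psol_full, PETSc.InsertMode.INSERT,
                PETSc.ScatterMode.FORWARD)
if PETSc.COMM_WORLD.getRank() == 0:
    u_global = psol_full.getArray()  # full DOF vector as a numpy array

Note that the gathered vector is in the PETSc global DOF ordering, which need
not coincide with sfepy's ordering, and el_avg stress/strain evaluations are
local to each subdomain, so they would need an analogous gather over cells.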

[1] http://sfepy.org/doc/tutorial.html#interactive-example-linear-elasticity
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive.html#multi-physics-biot-parallel-interactive

Thanks
Ronghai