Do I generate mesh in the wrong way?
Hi Robert,
I am using sfepy for a static linear elasticity problem in 2D. The mesh size is 200*200 (mesh file is attached) and the equations are set up as follows:
def D_function(self, ts, coors, mode, term=None, **kwargs):
    if mode != 'qp': return
    _, weights = term.integral.get_qp('2_4')
    self.n_qp = weights.shape[0]
    self.val_D = np.repeat(self.stiffness_tensor, self.n_qp, axis=0)
    return {'function' : self.val_D}

def prestress_function(self, ts, coors, mode, term=None, **kwargs):
    if mode != 'qp': return
    _, weights = term.integral.get_qp('2_4')
    self.n_qp = weights.shape[0]
    self.val_pre = np.repeat(self.prestress_value, self.n_qp, axis=0)
    return {'function' : self.val_pre}

self.m = Material('m', function=self.D_function)
self.prestress_value = np.copy(self.stress_0)
(self.prestress_value).shape = (1, 3, 1)
self.prestress = Material('prestress', function=self.prestress_function)

integral = Integral('i', order=2)
self.t1 = Term.new('dw_lin_elastic(m.function, v, u)', integral, self.omega, m=self.m, v=self.v, u=self.u)
self.t2 = Term.new('dw_lin_prestress(prestress.function, v)', integral, self.omega, prestress=self.prestress, v=self.v)
self.t3 = Term.new('dw_surface_ltr(traction_lef.val, v)', integral, self.lef, traction_lef=self.traction_lef, v=self.v)
self.t4 = Term.new('dw_surface_ltr(traction_rig.val, v)', integral, self.rig, traction_rig=self.traction_rig, v=self.v)
self.t5 = Term.new('dw_surface_ltr(traction_top.val, v)', integral, self.top, traction_top=self.traction_top, v=self.v)
self.t6 = Term.new('dw_surface_ltr(traction_bot.val, v)', integral, self.bot, traction_bot=self.traction_bot, v=self.v)
self.fem_eq = Equation('balance', self.t1 - self.t2 - self.t3 - self.t4 - self.t5 - self.t6)
self.eqs = Equations([self.fem_eq])
self.ls = ScipyDirect({})
self.nls_status = IndexedStruct()
self.nls = Newton({'i_max' : 1, 'eps_a' : 1e-8, 'problem' : 'nonlinear'}, lin_solver=self.ls, status=self.nls_status)
self.pb = Problem('elasticity', equations=self.eqs, nls=self.nls, ls=self.ls)
self.pb.time_update(ebcs=Conditions([fix_u_lef_corn]))
self.pb.update_materials()
self.vec = self.pb.solve()
When I run the simulation, the output below is printed. I have several questions about it; I tried to find the answers in the source code but failed. Could you please give me some help? Thanks.
(1) I do not understand the "matrix shape: (79999, 79999)" and "matrix structural nonzeros: 1439965" lines. What exactly is this "matrix"? What is its relation to the mesh size (200*200)?
(2) I do not understand why "updating materials..." happened twice.
(3) "solve: 6.18 [s]". I guess it should not take such a long time for a 200*200 mesh. Is it because I generated the mesh file in the wrong way, or for some other reason?
(4) In which source code file is the following information printed?
sfepy: updating variables...
sfepy: ...done
sfepy: setting up dof connectivities...
sfepy: ...done in 0.00 s
sfepy: matrix shape: (79999, 79999)
sfepy: assembling matrix graph...
sfepy: ...done in 0.05 s
sfepy: matrix structural nonzeros: 1439965 (2.25e-04% fill)
sfepy: updating materials...
sfepy: m
sfepy: traction_bot
sfepy: traction_rig
sfepy: prestress
sfepy: traction_top
sfepy: traction_lef
sfepy: ...done in 0.05 s
sfepy: updating materials...
sfepy: m
sfepy: traction_bot
sfepy: traction_rig
sfepy: prestress
sfepy: traction_top
sfepy: traction_lef
sfepy: ...done in 0.05 s
sfepy: nls: iter: 0, residual: 4.741758e+01 (rel: 1.000000e+00)
sfepy: rezidual: 0.04 [s]
sfepy: solve: 6.18 [s]
sfepy: matrix: 0.10 [s]
sfepy: nls: iter: 1, residual: 5.936719e-14 (rel: 1.252008e-15)
sfepy: equation "tmp":
sfepy: ev_cauchy_strain.2.Omega(u)
sfepy: updating materials...
sfepy: ...done in 0.00 s
Regards,
Ronghai
Hi Ronghai,
On 08/19/2016 07:23 PM, Ronghai Wu wrote:
When I run the simulation, the output below is printed. I have several questions about it. (1) I do not understand the "matrix shape: (79999, 79999)" and "matrix structural nonzeros: 1439965" lines. What exactly is this "matrix"? What is its relation to the mesh size (200*200)?
The "matrix" is the global stiffness matrix of the assembled linear system. You have 201 * 201 * 2 = 80802 degrees of freedom (DOFs), two displacement components at each vertex, and some DOFs are probably fixed by the fix_u_lef_corn boundary condition (what are your regions?), so the shape is (79999, 79999). Concerning the number of nonzeros: it corresponds to the mesh connectivity (how many nodes are connected to a node) and indicates how much memory the matrix storage takes.
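These counts can be checked with plain Python (a rough sketch; the 18-nonzeros-per-row bound assumes bilinear quads, and boundary vertices couple to fewer neighbours, hence the slight overestimate):

```python
n = 200                          # 200x200 quad elements
nodes = (n + 1) ** 2             # grid vertices
dofs = 2 * nodes                 # two displacement DOFs per vertex
print(dofs)                      # 80802; EBCs remove a few -> 79999

# an interior vertex couples to 9 vertices (itself + 8 neighbours),
# each contributing a 2x2 block, so up to 18 nonzeros per matrix row
max_nnz = dofs * 18
print(max_nnz)                   # 1454436, close to the reported 1439965
fill = 1439965 / 79999 ** 2      # nonzero fraction of the reduced matrix
print(fill)                      # ~2.25e-04, matching the log
```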
(2) I do not understand why "updating materials..." happened twice.
Because you called:
self.pb.update_materials()
Problem.update_materials() is called automatically in Problem.solve().
(3) "solve: 6.18 [s]". I guess it should not take such a long time for a 200*200 mesh. Is it because I generated the mesh file in the wrong way, or for some other reason?
This depends on the linear solver used. You use ScipyDirect, which is either SuperLU (rather slow) or UMFPACK (faster), provided scikit-umfpack and the UMFPACK libraries are installed.
You can try another solver (e.g. pyamg, or an iterative solver from petsc) to speed things up.
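For reference, these alternatives can be set up in the interactive setting roughly as follows (a sketch only; the option keys 'method', 'precond', 'eps_r', 'i_max' are taken from sfepy's linear solver docs and may differ between versions):

```python
# sketch; check sfepy.solvers.ls for the options valid in your version
from sfepy.solvers.ls import ScipyDirect, PyAMGSolver, PETScKrylovSolver

ls = ScipyDirect({'method' : 'umfpack'})   # direct; needs scikit-umfpack
# ls = PyAMGSolver({})                     # algebraic multigrid
# ls = PETScKrylovSolver({'method' : 'cg', 'precond' : 'gamg',
#                         'eps_r' : 1e-10, 'i_max' : 1000})
```

The chosen ls is then passed to Newton and Problem as in the snippet above.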
(4) In which source code file is the following information printed?
You can use 'git grep' to find the patterns if you have the git version of sfepy.
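For example (demonstrated here in a throwaway repository with a hypothetical file name; in a real sfepy clone you would run only the git grep line):

```shell
# in a real sfepy clone: git grep -n "structural nonzeros"
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
# stand-in for the sfepy source file that prints the log line
printf 'output("matrix structural nonzeros: %%d" %% nnz)\n' > sfepy_like.py
git add sfepy_like.py
git grep -n "structural nonzeros"    # prints the file and line number
```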
r.
On Sunday, August 21, 2016, 10:37:37 PM UTC+2, Robert Cimrman wrote:
(2) I do not understand why "updating materials..." happened twice.
Because you called:
self.pb.update_materials()
Problem.update_materials() is called automatically in Problem.solve().
Thanks, that clarifies a lot.
(3) "solve: 6.18 [s]". I guess it should not take such a long time for a 200*200 mesh. Is it because I generated the mesh file in the wrong way, or for some other reason?
This depends on the linear solver used. You use ScipyDirect, which is either SuperLU (rather slow) or UMFPACK (faster), provided scikit-umfpack and the UMFPACK libraries are installed.
You can try another solver (e.g. pyamg, or an iterative solver from petsc) to speed things up.
I tried the following solvers (2D mesh 512*512); the solve times and residuals are:

ScipyDirect({'method' : 'superlu'})  42 s   3e-12
ScipyDirect({'method' : 'umfpack'})  7.1 s  3e-12
PyAMGSolver({})                      12 s   1.e0
PETScKrylovSolver({})                2.5 s  5.e-1

The PETScKrylovSolver({}) is the fastest, but the solution precision is low. If I try to improve the precision by solving iteratively with "nls = Newton({'i_max' : 3, 'problem' : 'nonlinear'}, lin_solver=ls, status=nls_status)", I get the following warning message and the residual remains the same:

sfepy: warning: linear system solution precision is lower
sfepy: then the value set in solver options! (err = 5.188237e-01 < 1.000000e-10)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-01)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-02)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-03)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-04)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-05)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-06)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-07)
sfepy: linesearch failed, continuing anyway
sfepy: nls: iter: 2, residual: 5.188237e-01 (rel: 2.155866e-02)
But I do not understand why the iteration does not work. An additional question: is it possible to solve my case in parallel, following the example "diffusion/poisson_parallel_interactive.py"?
Regards,
Ronghai
On 08/22/2016 05:12 PM, Ronghai Wu wrote:
I tried the following solvers (2D mesh 512*512); the solve times and residuals are:

ScipyDirect({'method' : 'superlu'})  42 s   3e-12
ScipyDirect({'method' : 'umfpack'})  7.1 s  3e-12
PyAMGSolver({})                      12 s   1.e0
PETScKrylovSolver({})                2.5 s  5.e-1

The PETScKrylovSolver({}) is the fastest, but the solution precision is low. If I try to improve the precision by solving iteratively with "nls = Newton({'i_max' : 3, 'problem' : 'nonlinear'}, lin_solver=ls, status=nls_status)", I get the following warning message and the residual remains the same:
You could try setting different tolerances and a max. number of iterations, see [1] for all the options. Also, using a suitable preconditioner is key to good convergence, but this depends on the particulars of your problem. Maybe try bcgsl or gmres with gamg preconditioning. Then, if the linear solver converges, the Newton solver should converge in just one iteration. What you did corresponds to restarting the linear solver three times with the previous solution as the initial guess. The problem was that the linear solver did not converge, so nothing was gained.
[1] http://sfepy.org/doc-devel/src/sfepy/solvers/ls.html?highlight=petsc-krylov#s...
sfepy: warning: linear system solution precision is lower
sfepy: then the value set in solver options! (err = 5.188237e-01 < 1.000000e-10)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-01)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-02)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-03)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-04)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-05)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-06)
sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-07)
sfepy: linesearch failed, continuing anyway
sfepy: nls: iter: 2, residual: 5.188237e-01 (rel: 2.155866e-02)
But I do not understand why the iteration does not work.
See above.
An additional question: is it possible to solve my case in parallel, following the example "diffusion/poisson_parallel_interactive.py"?
Yes (provided the parallel examples work for you: all the prerequisites are installed etc.). Just replace the contents of create_local_problem() to define linear elasticity instead of the Poisson equation. See also the other parallel example [2], which shows how to deal with several unknown fields and includes linear elasticity.
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
r.
On Monday, August 22, 2016, 7:40:03 PM UTC+2, Robert Cimrman wrote:
Yes (provided the parallel examples work for you: all the prerequisites are installed etc.). Just replace the contents of create_local_problem() to define linear elasticity instead of the Poisson equation. See also the other parallel example [2], which shows how to deal with several unknown fields and includes linear elasticity.
Thanks, and sorry for the delayed feedback. It took me some time to learn MPI, and the parallel examples work for me. However, since I couple sfepy with fipy, fipy will automatically partition the mesh once running with MPI. I have to figure out a way to make the subdomains the same for sfepy and fipy.
Regards,
Ronghai
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
r.
On 08/29/2016 03:04 PM, Ronghai Wu wrote:
Thanks, and sorry for the delayed feedback. It took me some time to learn MPI, and the parallel examples work for me. However, since I couple sfepy with fipy, fipy will automatically partition the mesh once running with MPI. I have to figure out a way to make the subdomains the same for sfepy and fipy.
Regards Ronghai
You can replace the call to pl.partition_mesh() with a function that returns the fipy partitioning. It should return cell_tasks: an array containing a task (partition) number (starting from 0) for each element in the mesh.
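A minimal sketch of such an array (a hypothetical strip partitioning of an n*n element grid; the row-major element numbering is my assumption, not fipy's):

```python
import numpy as np

def strip_cell_tasks(n_elems_x, n_elems_y, n_tasks):
    """Assign each element of an n_elems_x * n_elems_y grid to a task
    by vertical strips; row-major element numbering is assumed."""
    cols = np.tile(np.arange(n_elems_x), n_elems_y)  # column index per element
    return (cols * n_tasks) // n_elems_x             # task ids 0 .. n_tasks-1

cell_tasks = strip_cell_tasks(8, 8, 4)
print(cell_tasks.shape)        # (64,) -> one entry per element
print(np.unique(cell_tasks))   # [0 1 2 3]
```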
r.
On Tuesday, August 30, 2016, 11:39:11 AM UTC+2, Robert Cimrman wrote:
You can replace the call to pl.partition_mesh() with a function that returns the fipy partitioning. It should return cell_tasks: an array containing a task (partition) number (starting from 0) for each element in the mesh.
r.
Thanks for your suggestion. I tried several times and think it is better to pass the whole (global) mesh and values from fipy to sfepy, then let sfepy do its own partitioning and solve the mechanical equilibrium equation, and then pass the whole (global) stress and strain back to fipy. But I have some questions about parallel solving in sfepy:
(1) In the interactive linear elasticity example [1], the residual is printed out during solving, but not in the parallel example [2]. In this case, how can we judge whether the solution converges and whether the precision is good enough?
(2) In the interactive linear elasticity example [1], we can get access to the displacement, stress and strain arrays by
vec = pb.solve()
strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')
stress = pb.evaluate('ev_cauchy_stress.2.Omega(u)', mode='el_avg')
Can I get access to the whole (global) displacement, stress and strain arrays in the parallel example?
[1] http://sfepy.org/doc/tutorial.html#interactive-example-linear-elasticity
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
Thanks,
Ronghai
On 08/30/2016 03:21 PM, Ronghai Wu wrote:
Thanks for your suggestion. I tried several times and think it is better to pass the whole (global) mesh and values from fipy to sfepy, then let sfepy do its own partitioning and solve the mechanical equilibrium equation, and then pass the whole (global) stress and strain back to fipy. But I have some questions about parallel solving in sfepy:
(1) In the interactive linear elasticity example [1], the residual is printed out during solving, but not in the parallel example [2]. In this case, how can we judge whether the solution converges and whether the precision is good enough?
You can pass any petsc options to the parallel examples. To see the convergence, pass
-snes_monitor -snes_converged_reason -ksp_monitor
- check the docstring of [2] for other useful options.
(2) In the interactive linear elasticity example [1], we can get access to the displacement, stress and strain arrays by
vec = pb.solve()
strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')
stress = pb.evaluate('ev_cauchy_stress.2.Omega(u)', mode='el_avg')
Can I get access to the whole (global) displacement, stress and strain arrays in parallel example?
Yes, after the call to pl.create_gather_to_zero() you have the whole solution vector on task 0 (sol = psol_full[...].copy(), ...). Then you can evaluate the global strain and stress as usual.
r.
[1] http://sfepy.org/doc/tutorial.html#interactive-example-linear-elasticity
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
Thanks Ronghai
On Tuesday, August 30, 2016, 9:05:43 PM UTC+2, Robert Cimrman wrote:
You can pass any petsc options to the parallel examples. To see the convergence, pass
-snes_monitor -snes_converged_reason -ksp_monitor
- check the docstring of [2] for other useful options.
Thanks, it works.
Yes, after the call to pl.create_gather_to_zero() you have the whole solution vector on task 0 (sol = psol_full[...].copy(), ...). Then you can evaluate the global strain and stress as usual.
I did the following with a 2D 50*50 mesh in [2]:
sol = psol_full[...].copy()
print 'sol is:', sol
print 'sol shape is:', sol.shape
u = FieldVariable('u', 'parameter', field1, primary_var_name='(set-to-None)')
remap = gfds[0].id_map
ug = sol[remap]
print 'ug is:', ug
print 'ug shape is:', ug.shape
strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')
print 'strain is:', strain
print 'strain shape is:', strain.shape
I got {sol shape is: (7500,)} and {ug shape is: (5000,)}. Is {ug} the displacement at the element centers? And is {ug[0]} the displacement along the x-axis? But {strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')} does not work, failing with ValueError: argument u not found! This works for the interactive linear elasticity example, why does it not work here?
[1] http://sfepy.org/doc/tutorial.html#interactive-example-linear-elasticity
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
Thanks Ronghai
On 08/31/2016 04:45 PM, Ronghai Wu wrote:
I got {sol shape is: (7500,)} and {ug shape is: (5000,)}. Is {ug} the displacement at the element centers? And is {ug[0]} the displacement along the x-axis?
sol contains both the displacements (2 components) and the pressure (it is the Biot problem example, right?), that is why it is larger by a factor of 3/2. ug contains the usual displacements at the nodes, in the correct sfepy ordering.
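The 3/2 factor can be sanity-checked from the DOF counts (assuming the 50*50 mesh means 2500 vertices, which matches the reported shapes):

```python
n_nodes = 50 * 50                    # vertices of the mesh
u_dofs = 2 * n_nodes                 # displacement: 2 components per vertex
p_dofs = 1 * n_nodes                 # pressure: 1 component per vertex
print(u_dofs)                        # 5000 -> the shape of ug
print(u_dofs + p_dofs)               # 7500 -> the shape of sol
print((u_dofs + p_dofs) / u_dofs)    # 1.5
```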
But {strain = pb.evaluate('ev_cauchy_strain.2.Omega(u)', mode='el_avg')} does not work, failing with ValueError: argument u not found! This works for the interactive linear elasticity example, why does it not work here?
It does not work because pb is a task-local Problem, with variables named u_i, p_i instead of u, p.
In order to evaluate a term globally, do something like this:
# Set the actual global solution vector to u.
u.set_data(ug)
integral = Integral('i', order=2*(max(order_u, order_p)))
term = Term.new('ev_cauchy_strain(u)', integral, u.field.region, u=u)
term.setup()
strain = term.evaluate(mode='el_avg')
print 'strain is:', strain
print 'strain shape is:', strain.shape
Does that work for you?
r.
[1]
http://sfepy.org/doc/tutorial.html#interactiveexamplelinearelasticity
[2]
http://sfepy.org/docdevel/examples/multi_physics/biot_parallel_interactive....
Thanks Ronghai
On Wednesday, August 31, 2016, 10:40:22 PM UTC+2, Robert Cimrman wrote:
Does that work for you?
Yes, it works. I will try to fit parallel running into my code based on [2]. If I run into further problems, I may need more help. Thank you as always.
[1] http://sfepy.org/doc/tutorial.html#interactive-example-linear-elasticity
[2] http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive....
Thanks,
Ronghai
participants (2)
- Robert Cimrman
- Ronghai Wu