On Monday, August 22, 2016 at 7:40:03 PM UTC+2, Robert Cimrman wrote:

On 08/22/2016 05:12 PM, Ronghai Wu wrote:

>
> On Sunday, August 21, 2016 at 10:37:37 PM UTC+2, Robert Cimrman wrote:
>

>>> (3) "solve: 6.18 [s]". I guess it should not take such a long time for a
>>> mesh size (200*200) problem. Is it because I generate the mesh file in
>>> the wrong way, or because of other reasons?

>>
>> This depends on the linear solver used. You use ScipyDirect, which is
>> either superLU (rather slow) or umfpack (faster), provided scikit-umfpack
>> and the umfpack libraries are installed.
>>
>> You can try another solver (e.g. pyamg, or an iterative solver from
>> petsc) to speed things up.
>>

>
> I tried the following solvers (2D mesh 512*512); the solve times and
> residuals are:
>
> ScipyDirect({'method': 'superlu'})   42 s    3e-12
> ScipyDirect({'method': 'umfpack'})   7.1 s   3e-12
> PyAMGSolver({})                      12 s    1.e0
> PETScKrylovSolver({})                2.5 s   5.e-1
>
> The PETScKrylovSolver({}) is the fastest, but the solution precision is
> low. If I try to improve the precision by solving iteratively with
> "nls = Newton({'i_max' : 3, 'problem' : 'nonlinear'}, lin_solver=ls,
> status=nls_status)", I get the following warning message and the residual
> remains the same:
>

You could try setting different tolerances and a maximum number of iterations;
see [1] for all the options. Also, using a suitable preconditioner is key to
good convergence - but this depends on the particulars of your problem. Maybe
try bcgsl or gmres with gamg preconditioning. Then, if the linear solver
converges, the Newton solver should converge in just one iteration. What you
did corresponds to restarting the linear solver three times with the previous
solution as the initial guess. The problem was that the linear solver did not
converge, so nothing was gained.

[1]
http://sfepy.org/doc-devel/src/sfepy/solvers/ls.html?highlight=petsckrylov#sfepy.solvers.ls.PETScKrylovSolver
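For instance, a solver setup along these lines could be tried in a problem
description file. This is only a sketch: the option names follow the solver
documentation in [1], but the particular tolerance and iteration values are
illustrative guesses, not tuned settings.

```python
# Sketch of a sfepy problem-description solver section, assuming the
# standard ls.petsc / nls.newton option names; the numeric values are
# illustrative, not tuned.
solver_0 = {
    'name': 'ls',
    'kind': 'ls.petsc',
    'method': 'gmres',   # KSP type; 'bcgsl' is another candidate
    'precond': 'gamg',   # algebraic multigrid preconditioning
    'i_max': 1000,       # max. number of linear iterations
    'eps_r': 1e-10,      # relative tolerance
}

solver_1 = {
    'name': 'newton',
    'kind': 'nls.newton',
    'i_max': 1,          # one iteration should suffice for a linear problem
    'eps_a': 1e-10,      # absolute residual tolerance
}
```

If the linear solver then converges to the requested tolerance, the Newton
solver should report a small residual after a single iteration.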

>
> sfepy: warning: linear system solution precision is lower
> sfepy: then the value set in solver options! (err = 5.188237e-01 < 1.000000e-10)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-01)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-02)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-03)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-04)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-05)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-06)
> sfepy: linesearch: iter 2, (5.18824e-01 < 5.18819e-01) (new ls: 1.000000e-07)
> sfepy: linesearch failed, continuing anyway
> sfepy: nls: iter: 2, residual: 5.188237e-01 (rel: 2.155866e-02)
>

>
> But I do not understand why the iterative approach does not work.

See above.

> An additional question: is it possible to solve my case in parallel,
> following the example "diffusion/poisson_parallel_interactive.py"?

Yes (provided the parallel examples work for you - all the prerequisites are
installed, etc.) - just replace the contents of create_local_problem() to
define linear elasticity instead of the Poisson equation. See also the other
parallel example [2], which shows how to deal with several unknown fields and
has linear elasticity as a part.

Thanks, and sorry for the delayed feedback. It took me some time to learn MPI, and the parallel examples work for me. However, since I couple sfepy with fipy, fipy automatically partitions the mesh once running with MPI. I have to figure out a way to make the subdomains the same for sfepy and fipy.

Regards

Ronghai

[2]
http://sfepy.org/doc-devel/examples/multi_physics/biot_parallel_interactive.html

r.