Execution Time Problem on a 3D uniaxial compressive simulation #649 (from GitHub)
Hi guys,
Currently, I am using SfePy to simulate a 3D geometry model under a uniaxial displacement to find the von Mises stress. The total execution time of the simulation is about *40 minutes with 8 GB of RAM*, and about 94% of the RAM is occupied while the program is running. I am not sure what makes the execution take that long, or what I could do to shorten it. The geometry model was imported from an Abaqus input file with 5514 vertices and 4417 cells. I have also checked the execution time line by line and found that the most time-consuming step is: state = pb.solve(status=status). So I suppose the solving process uses most of the RAM and causes the timing issue.
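For reference, a minimal version of that timing check looks roughly like this (a sketch; pb and status are set up as in the attached script):

import time

t0 = time.perf_counter()
state = pb.solve(status=status)  # the step that dominates the run time
print('pb.solve() wall time: %.1f s' % (time.perf_counter() - t0))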
The mesh file and our code are attached.
Thank you and best regards!
Kedi Yang
Hi Kedi,
Yes, the solution process caused your issues.
Note that you are using approximation order 2 in your field, so the linear system does not have just 15242 active DOFs (which would be OK for the direct solver you use), but 113933 active DOFs. I tried the MUMPS solver, and it required about 3.5 GB:
sfepy: nls: iter: 0, residual: 5.867625e+04 (rel: 1.000000e+00)
sfepy: residual: 0.05 [s]
sfepy: matrix: 2.28 [s]
sfepy: solve: 14.80 [s]
sfepy: warning: linear system solution precision is lower
sfepy: then the value set in solver options! (err = 1.537797e-09 < 1.000000e-10)
sfepy: nls: iter: 1, residual: 1.396787e-09 (rel: 2.380498e-14)
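The MUMPS solver can be selected in the same interactive way, e.g. (a sketch; this assumes the MUMPS bindings are installed):

from sfepy.base.base import Struct
from sfepy.solvers import Solver

# select the MUMPS direct solver instead of the default direct solver
ls = Solver.any_from_conf(Struct(**{'kind' : 'ls.mumps'}))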
With the iterative PETSc Krylov solver, the solution can be obtained with a peak memory requirement of about 800 MB:
from sfepy.base.base import Struct
from sfepy.solvers import Solver

ls = Solver.any_from_conf(Struct(**{
    'kind' : 'ls.petsc',
    'method' : 'cg',
    'precond' : 'icc',
    'i_max' : 200,
}))
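The solver can then be attached to the problem in the interactive way (a sketch, assuming pb is the Problem instance from your script):

from sfepy.base.base import IndexedStruct
from sfepy.solvers.nls import Newton

# wrap the linear solver in the Newton nonlinear solver and attach it to the problem
nls_status = IndexedStruct()
nls = Newton({}, lin_solver=ls, status=nls_status)
pb.set_solver(nls)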
sfepy: nls: iter: 0, residual: 5.867625e+04 (rel: 1.000000e+00)
sfepy: residual: 0.05 [s]
sfepy: matrix: 2.28 [s]
sfepy: solve: 17.64 [s]
sfepy: warning: linear system solution precision is lower
sfepy: then the value set in solver options! (err = 3.613957e+00 < 1.000000e-10)
sfepy: nls: iter: 1, residual: 3.613957e+00 (rel: 6.159148e-05)
-> try using an iterative solver (and a suitable preconditioner - the 'icc' used above is not very good, but I am not an expert in this, so I cannot give better advice).
BTW, for approximation order 3, the 375578 DOFs require about 4.5 GB with PETSc CG/ICC and 17.5 GB with MUMPS (both took about 1.5 minutes, but the direct solver was much more precise - note that the iterative solver's maximum number of iterations was limited to 200).
Last note: the integral order should be 2 * approximation order (for constant material data).
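For example, an order-2 displacement field goes with an order-4 integral; in the declarative format this could look roughly like (the field/region names are illustrative):

fields = {
    'displacement' : ('real', 'vector', 'Omega', 2),  # approximation order 2
}

integrals = {
    'i' : 4,  # 2 * approximation order
}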
Best regards, r.