Hi Robert,
I am running a linear elastic simulation, as described in my examples some weeks ago.
The current problem is as follows: after running the simulation with some small test meshes (3D, tetrahedra), I changed to bigger meshes.
The first example, with 71759 nodes and 310106 tetrahedra, worked fine. Now I switched to a mesh with 105579 nodes and 460442 tetrahedra and get the following output:
sfepy: saving regions as groups...
sfepy:   Omega
sfepy:   Surface
sfepy:   Bottom
sfepy: ...done
sfepy: updating variables...
sfepy: ...done
sfepy: setting up dof connectivities...
sfepy: ...done in 0.83 s
sfepy: matrix shape: (313416, 313416)
sfepy: assembling matrix graph...
sfepy: ...done in 3.52 s
sfepy: matrix structural nonzeros: 11941254 (1.22e-04% fill)
sfepy: updating materials...
sfepy:     solid
sfepy:     f
sfepy:     load
sfepy: ...done in 0.80 s
sfepy: nls: iter: 0, residual: 2.878127e+07 (rel: 1.000000e+00)
Can't expand MemType 1: jcol 311926
Do you have a suggestion what the problem could be?
Thanks for your help. Kathrin
Hi Kathrin,
you are using a direct linear solver from scipy (ls.scipy_direct), right? This error seems to be caused by the SuperLU solver running out of memory.
To solve large problems, you will need to use a preconditioned iterative solver (preferably a multigrid one, for example PyAMG, or one of the iterative solvers from petsc). This is not well documented in sfepy, but the latest version has decent support for such solvers, see [1]. Unfortunately, there is no linear-elasticity-only example yet (I struggle with choosing the correct preconditioning options myself).
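For illustration only, a minimal sketch of what the solvers section of a declarative sfepy problem file could look like when switching from the direct solver to PyAMG. The option names follow the sfepy solver documentation as I recall it, and the tolerance values are generic starting points, not tuned settings:

```python
# Sketch of the 'solvers' section of a declarative sfepy problem file,
# replacing ls.scipy_direct by the PyAMG wrapper. Values are
# illustrative, not tuned for any particular mesh.
solvers = {
    'ls': ('ls.pyamg', {
        'method': 'smoothed_aggregation_solver',
        'eps_r': 1e-8,   # relative residual tolerance
    }),
    'newton': ('nls.newton', {
        'i_max': 1,      # the problem is linear: one "Newton" step suffices
        'eps_a': 1e-8,
    }),
}
```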
r.
[1] http://sfepy.org/doc-devel/examples/multi_physics/biot_short_syntax.html
On 10/06/2017 01:32 PM, Kathrin Sobe wrote:
Hi Robert,
yes, currently I am using the direct solver.
Changing to another solver worked: I tried ScipyIterative and PyAMGSolver, and both helped.
Is there a known limitation on the matrix size for these solvers? The sfepy output above shows my current matrix shape, (313416, 313416). It might increase by a factor of 10. Is that a problem from your point of view?
Thank you and regards, Kathrin
2017-10-06 14:40 GMT+02:00 Robert Cimrman cimrman3@ntc.zcu.cz:
Hi Kathrin,
On 10/06/2017 03:37 PM, Kathrin Sobe wrote:
Is there a known limitation on the matrix size for these solvers? The sfepy output above shows my current matrix shape, (313416, 313416). It might increase by a factor of 10. Is that a problem from your point of view?
A factor of about 10 should not be a (big) problem for a desktop computer, provided it has enough RAM.
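As a back-of-the-envelope illustration (my numbers, not from sfepy): the assembled sparse matrix itself is quite small in CSR storage, and for a finite element matrix the nonzeros grow roughly linearly with the number of DOFs:

```python
# Rough CSR memory estimate for the matrix from the log above:
# nnz float64 values + nnz int32 column indices + (nrows + 1) int32 row pointers.
nnz, nrows = 11941254, 313416
bytes_csr = nnz * 8 + nnz * 4 + (nrows + 1) * 4
print(bytes_csr / 1e6)  # ~144 MB for the current mesh
# With ~10x the DOFs, nnz grows roughly linearly, so the matrix stays
# around 1.5 GB. A direct factorization, however, can need far more
# memory than this due to fill-in, which is why SuperLU gave up.
```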
But be aware that without correct (problem-dependent) preconditioning, the number of iterations required for a given precision increases with the size of the problem. Good preconditioning (in terms of scalability) should give (almost) the same number of iterations independently of the problem size. For elasticity, check [1] (search for linear elasticity on the page).
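The iteration growth is easy to see on a toy problem (an illustration of the point above, not taken from the thread): unpreconditioned CG on 1D Poisson matrices of increasing size needs proportionally more iterations for the same tolerance.

```python
# Illustration: unpreconditioned CG iteration counts grow with problem size.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

def cg_iters(n):
    # standard 1D Poisson (tridiagonal) stiffness matrix
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format='csr')
    b = np.ones(n)
    count = [0]
    def cb(xk):
        count[0] += 1
    x, info = cg(A, b, callback=cb)
    return count[0]

# the larger problem needs many more iterations at the same tolerance
print(cg_iters(100), cg_iters(400))
```

A scalable preconditioner (such as the smoothed aggregation AMG in PyAMG, see [1] above) aims to make these counts nearly flat as n grows.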
r.

[1] https://github.com/pyamg/pyamg/wiki/Examples
SfePy mailing list sfepy@python.org https://mail.python.org/mm3/mailman3/lists/sfepy.python.org/
participants (2)
- Kathrin Sobe
- Robert Cimrman