On 08/07/2014 01:48 AM, Geoff Wright wrote:
> Hi Robert,
>
> Thanks for the response -- and good catch! The first order results are
> a lot better now with the bugfix in the material. But I am still losing
> about 20% of the flux with first order. I tried running it with
> approx_order=2 and the petsc solver, but it always converges to this
> trivial/invalid solution:
>
> Source flux: 4.69775273941
> Sink flux: -5.7976979887e-25
> Block flux: 8.34388481281e-06
> Entire surf: 4.6977610833
> Sum of partial fluxes == total: 4.6977610833 == 4.6977610833
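(As an aside: partial fluxes like those above can be computed with the
d_surface_flux term, e.g. in a post-processing hook - a rough sketch,
where the region, material and variable names are assumed, so replace
them with those from your definitions file:

def post_process(out, pb, state, extend=False):
    # Assumed region names - replace with the actual ones.
    for region in ['Source', 'Sink', 'Block', 'Surface']:
        flux = pb.evaluate('d_surface_flux.2.%s(m.K, t)' % region,
                           mode='eval')
        print('%s flux: %s' % (region, flux))
    return out

- but you probably do something similar already.)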
I could not get the petsc solvers to converge either. But I had some success with pyamg and a finer geometry (with a smaller domain and a coarser outer sphere - probably not needed):
lc = 0.015;
...
// Define the outer sphere
lc = 10.0;
R = 20;
solver_0 = {
    'name' : 'ls',
    'kind' : 'ls.pyamg',
    'method' : 'smoothed_aggregation_solver',
    'accel' : 'gmres',
    'eps_r' : 1e-18,  # rtol - not met anyway
}
This results, for approx_order = 1, in:

(two years old pyamg)
Source flux: 0.129920699311
Sink flux: -0.120377402218
Block flux: -0.0479851466075
Entire surf: -0.0384418495146

(current pyamg from github)
Source flux: 0.12990612732
Sink flux: -0.120381187623
Block flux: -0.0480000264776
Entire surf: -0.0384750867815
It reduces the residual only by about two orders of magnitude, though - I guess that is enough.
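FYI, the config above corresponds roughly to calling pyamg directly as
follows (a sketch with a stand-in Poisson matrix - the real A, b are the
matrix and right-hand side assembled by sfepy):

import numpy as np
import pyamg

# Stand-in system; in reality A, b come from the assembled problem.
A = pyamg.gallery.poisson((100, 100), format='csr')
b = np.random.rand(A.shape[0])

ml = pyamg.smoothed_aggregation_solver(A)

residuals = []
x = ml.solve(b, tol=1e-18, accel='gmres', residuals=residuals)

# Check how far the residual actually dropped.
print('residual reduction: %e' % (residuals[-1] / residuals[0]))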
> I'm hopeful that if we could get the second order field working, the
> errors would be small enough to be tolerable. Any ideas how to make it
> converge to the right solution? I attached the definitions file and two
> screenshots showing the 1st order and the invalid 2nd order results.
> It's the same mesh as in my previous post.
Finding an iterative solver + preconditioner combination that works for a given problem is often difficult. Even pyamg does not converge for the second order field. Also note that I had to use gmres as an accelerator, because cg failed after a while, saying that the matrix is indefinite - there is some instability, as this should not happen for a Laplacian. Maybe some scaling of the matrix is needed; I am no expert in this.
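If you want to try the scaling, a simple symmetric Jacobi (diagonal)
scaling could look something like this (an untested sketch; the real A, b
are the assembled sfepy matrix and right-hand side):

import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import cg

def scale_system(A, b):
    # Symmetric Jacobi scaling: (D^-1/2 A D^-1/2) y = D^-1/2 b, with
    # x = D^-1/2 y. The scaled matrix stays symmetric, so cg still applies.
    di = sps.diags(1.0 / np.sqrt(A.diagonal()))
    return (di * A * di).tocsr(), di * b, di

# Stand-in SPD system with a badly scaled diagonal.
n = 5
A = sps.diags([np.full(n - 1, -1.0), np.logspace(0, 4, n),
               np.full(n - 1, -1.0)], [-1, 0, 1]).tocsr()
b = np.ones(n)

As, bs, di = scale_system(A, b)
y, info = cg(As, bs)
x = di * y  # solution of the original system A x = b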
There are multigrid solvers in PETSc too, but I have never tried them - I am not sure if the sfepy interface is flexible enough to use them.
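If someone wants to experiment, the ls.petsc solver accepts a 'precond'
option, so selecting PETSc's algebraic multigrid might look like this
(untested - the 'gamg' choice in particular is just my guess):

solver_0 = {
    'name' : 'ls',
    'kind' : 'ls.petsc',
    'method' : 'cg',     # ksp_type
    'precond' : 'gamg',  # pc_type - PETSc algebraic multigrid, untested!
    'eps_r' : 1e-12,     # rtol
    'i_max' : 1000,      # maxits
}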
r.