FEAP User Forum
FEAP => Parallel FEAP => Topic started by: tgross on November 25, 2013, 01:57:35 AM
-
Hello!
We are currently running parallel FEAP 8.4 and the latest Petsc version (3.4.3).
Using the following solver and preconditioner options:
-ksp_type cg -pc_type jacobi
our test case runs without problems.
However, if we try the Prometheus alternative (Prometheus was removed in the new PETSc version):
-ksp_type cg -pc_type gamg -pc_gamg_type agg
we get the following error:
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Floating point exception!
[0]PETSC ERROR: Inserting nan at matrix entry (505,7)!
Does anyone have experience with gamg in parallel FEAP?
For partitioning, we used:
OUTDomains,AIJ,1
with standard nsbk=nfd
(You can find a small test case attached.)
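For reference, these options are passed straight through to PETSc on the parallel run command, roughly like this (the mpiexec wrapper, process count, and path to the feap executable are placeholders and will differ per installation):
mpiexec -np 2 ./feap -ksp_type cg -pc_type gamg -pc_gamg_type agg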
I am grateful for any advice!
Best Regards,
Thomas
-
I did a test on a very basic problem and am getting similar insertion errors (with petsc 3.4.0). I'll try to debug the issue.
-
Dear Prof. Govindjee,
Did you get parallel FEAP 8.4 with "-ksp_type cg -pc_type gamg -pc_gamg_type agg" running (petsc 3.4.0)?
If not, are there any other alternatives for Prometheus in parallel FEAP 8.4 for large scale plasticity problems?
Best regards,
Thomas
-
not yet. just managed to build 3.4.3 on my machine. seems that i can no longer use the intel compiler to build petsc due to some metis errors that i will have to try to chase down another day. now to see if it plays nice with FEAP.
-
Finally got a working PETSc build (version 3.4.3).
I ran your problem using the distributed parfeap/makefile as:
make -f parfeap/makefile feaprun
and
make -f parfeap/makefile feaprun-gamg
both cases appear to function just fine. Here is the tail of the L-file:
SOLUTION SUMMARY
----------------
Load   Total        Solution    Time        Residual Norm          Energy Norm           CPU Time
Step   Tang+Forms   Time        Incr.       Initial     Final      Initial     Final     (Seconds)
-------------------------------------------------------------------------------------------------
   1     2    0     5.000E-01   5.00E-01    6.36E+02    1.44E-06   2.73E+02    8.83E-15     0.23
   2     2    0     1.000E+00   5.00E-01    6.36E+02    2.10E-06   2.73E+02    2.06E-14     0.39
Total    4    0
---------------------------------- END OF FEAP LOG -----------------------------------
Interestingly, I originally saw similar errors, but in this case I did a complete re-build -- both of my petsc installation and my FEAP installation. It could be that you have some hysteresis in your build environment. Try a fresh installation.
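A rough sketch of such a clean rebuild, assuming PETSC_DIR/PETSC_ARCH are already set, FEAPHOME8_4 points at the FEAP tree, and the usual install targets apply (target names may differ in a customized setup):
cd $PETSC_DIR && ./configure <your options> && make all
cd $FEAPHOME8_4 && make install            # serial library/archive first
cd $FEAPHOME8_4/parfeap && make install    # then the parallel executable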
-sg
-
This is great news! I will try it with a fresh install!
Did you end up using the Intel compiler for your re-build?
Did you also use:
OUTDomains,AIJ,1
for partitioning?
Many thanks!
Thomas
-
graph node 2
outd aij 1
also make sure that you have set the number of processors correctly at run time
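In context, those two commands go in the usual FEAP solution-command block of the serial input file; a minimal sketch for a 2-processor partition (BATCh/END is the standard FEAP wrapper; adjust the count to your run) would be:
BATCh
  GRAPh NODE 2
  OUTDomains AIJ 1
END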
-
Dear Prof. Govindjee,
I built a completely new parallel FEAP version using your updated "pform.f" from: http://feap.berkeley.edu/forum/index.php?topic=617.0 and a fresh petsc 3.4.3 build.
Now "-ksp_type:cg -pc_type:gamg" works for the 8 element cube I posted earlier. However, when increasing the number of elements to 512 in a very basic test, i get the following petsc error again:
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Floating point exception!
[0]PETSC ERROR: Inserting nan at matrix entry (556,4)!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013
The same input file runs when using "-ksp_type cg -pc_type jacobi".
Could you please try to run the input file in the attachment (on 2 processors)? I am curious to see whether this problem occurs only in my parallel FEAP installation.
Many thanks!
Thomas
-
I ran your problem; it worked with the following 5 methods:
-ksp_type cg -pc_type jacobi
-ksp_type gmres -pc_type bjacobi
-ksp_type cg -pc_type hypre -pc_hypre_type boomeramg -pc_hypre_boomeramg_strong_threshold 0.25 -pc_hypre_boomeramg_relax_type_all symmetric-SOR/Jacobi
-ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
It failed with various segmentation violation errors for the following 2 methods:
-ksp_type cg -pc_type ml -mat_no_inode
-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
Stack traces show the MUMPS error is related to a dylib issue (which is probably associated with the build environment). The ML issue appears to be related to the PCSetCoordinates call, but that is hard to figure out since it works fine for gamg.
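For anyone hitting the same dylib problem on a Mac build, a quick way to see which dynamic libraries the executable actually resolves is something like:
otool -L ./feap | grep -i mumps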
-
Followup: I was able to make ML work also.
In usolve.F there is a line
if(pfeap_bcin) call MatSetBlockSize(Kmat,nsbk,ierr)
which is commented out. If you uncomment it, then ML will also work (and the other solvers seem to be ok too with the exception of MUMPS).
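To locate the line quickly (assuming the usual FEAPHOME8_4 environment variable points at the source tree):
grep -n MatSetBlockSize $FEAPHOME8_4/parfeap/usolve.F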
-
Dear Prof. Govindjee,
There seems to be something wrong with my PETSc build. However, it is strange that gamg works for the 8-element cube but not for the 512-element cube.
I will try a fresh install again, this time with the FEAP 8.4 updates from January 9th.
Could you please post the options and compiler you used to build your petsc 3.4.3 version?
Many thanks and best regards,
Thomas
-
Here is my ./configure call; note the { } syntax, which does not seem to be documented that well.
When I build with the intel compiler:
./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort --download-{parmetis,superlu_dist,mpich,ml,hypre,metis,mumps,scalapack,blacs}
When I build with the GNU compiler system:
./configure --download-{parmetis,superlu_dist,mpich,ml,hypre,metis,mumps,scalapack,blacs}
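The { } part is just bash brace expansion, so the single --download-{...} argument expands into one --download flag per package before ./configure ever sees it; for example:
echo --download-{metis,parmetis}    # prints: --download-metis --download-parmetis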
-
Dear Prof. Govindjee,
Thanks for your configure script! Now GAMG also works for larger problems. I changed the MPI version from OpenMPI to MPICH. However, I am not sure whether OpenMPI was responsible for the problem that caused the crash in my earlier build...
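One way to double-check which MPI stack is actually being picked up (with --download-mpich, the MPICH wrappers are built under the PETSc arch directory):
which mpiexec mpif90            # shows which wrappers are first on the PATH
ls $PETSC_DIR/$PETSC_ARCH/bin   # the downloaded MPICH wrappers live here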
Regarding parFEAP84 vs parFEAP83 performance:
I solely use parFEAP for large scale 3d plasticity problems
When comparing parFEAP83 and parFEAP84 using the same PETSc options (-ksp_type cg -pc_type jacobi), both versions reach the same level of performance.
When switching to the more efficient preconditioners (Prometheus for parFEAP83 and its successor GAMG for parFEAP84), parFEAP83 is 3 times faster than parFEAP84.
The PETSc options for parFEAP83: -ksp_type cg -pc_type prometheus -options_left
The PETSc options for parFEAP84: -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -options_left
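For reference, a side-by-side timing of the two setups might look like the lines below; the mpiexec wrapper, executable path, and process count are placeholders, -log_summary is PETSc's built-in profiler, and the prometheus line of course requires the older PETSc/parFEAP83 build:
mpiexec -np 4 ./feap -ksp_type cg -pc_type prometheus -options_left -log_summary
mpiexec -np 4 ./feap -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -options_left -log_summary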
Do you have any suggestions on how to make parFEAP84 with CG and the GAMG preconditioner as efficient as parFEAP83 with CG and Prometheus?
Many Thxs!
Thomas
-
It is unfortunate that Prometheus is no longer available -- it performed well in the past.
I would suggest making a post to the PETSc user forum indicating your Prometheus options and then your GAMG options. Dr. M. Adams, the author of both, regularly responds to posts there.