Author Topic: parFEAP84-petsc343 gamg Error  (Read 14381 times)

tgross

  • Jr. Member
  • Posts: 18
parFEAP84-petsc343 gamg Error
« on: November 25, 2013, 01:57:35 AM »
Hello!

We are currently running parallel FEAP 8.4 and the latest PETSc version (3.4.3).

Using the solver and preconditioner options:
-ksp_type cg -pc_type jacobi
our test case runs without problems.

However, if we want to use the replacement for Prometheus: -ksp_type cg -pc_type gamg -pc_gamg_type agg (Prometheus was removed in the new PETSc version), we get the following error:
  • PETSC ERROR: --------------------- Error Message ------------------------------------
  • [0]PETSC ERROR: Floating point exception!
  • PETSC ERROR: Inserting nan at matrix entry (505,7)!


Does anyone have experience with gamg in parallel FEAP?

For partitioning, we used:
OUTDomains,AIJ,1
with standard nsbk=nfd
(You can find a small test case attached.)

I am grateful for any advice!
Best Regards,
Thomas

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #1 on: November 26, 2013, 09:50:14 PM »
I did a test on a very basic problem and am getting similar insertion errors (with PETSc 3.4.0).  I'll try to debug the issue.

tgross

  • Jr. Member
  • Posts: 18
Re: parFEAP84-petsc343 gamg Error
« Reply #2 on: December 09, 2013, 12:53:07 AM »
Dear Prof. Govindjee,

Did you get parallel FEAP 8.4 running with "-ksp_type cg -pc_type gamg -pc_gamg_type agg" (PETSc 3.4.0)?

If not, are there any other alternatives to Prometheus in parallel FEAP 8.4 for large-scale plasticity problems?

Best regards,
Thomas

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #3 on: December 09, 2013, 02:04:04 PM »
Not yet.  I just managed to build 3.4.3 on my machine.  It seems that I can no longer use the Intel compiler to build PETSc due to some METIS errors that I will have to chase down another day.  Now to see if it plays nicely with FEAP.

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #4 on: December 12, 2013, 12:10:54 AM »
Finally got a working PETSc build (version 3.4.3).

I ran your problem using the distributed parfeap/makefile as:

make -f parfeap/makefile feaprun

and

make -f parfeap/makefile feaprun-gamg

both cases appear to function just fine.   Here is the tail of the L-file:

  SOLUTION SUMMARY
  ----------------

  Load    Total     Solution      Time     Residual Norm        Energy Norm     CPU Time
  Step  Tang+Forms      Time      Incr.  Initial     Final   Initial     Final (Seconds)
  --------------------------------------------------------------------------------------
     1     2    0  5.000E-01  5.00E-01  6.36E+02  1.44E-06  2.73E+02  8.83E-15      0.23
     2     2    0  1.000E+00  5.00E-01  6.36E+02  2.10E-06  2.73E+02  2.06E-14      0.39

 Total     4    0
  ---------------------------------- END OF FEAP LOG -----------------------------------

Interestingly, I originally saw similar errors, but this time I did a complete rebuild -- both of my PETSc installation and my FEAP installation.  It could be that you have some hysteresis in your build environment.  Try a fresh installation.

-sg

tgross

  • Jr. Member
  • Posts: 18
Re: parFEAP84-petsc343 gamg Error
« Reply #5 on: December 12, 2013, 01:22:01 AM »
This is great news! I will try it with a fresh install!
Did you end up using the Intel compiler for your rebuild?

Did you also use:
OUTDomains,AIJ,1
for partitioning?

Many thanks!
Thomas

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #6 on: December 12, 2013, 01:40:32 AM »
graph node 2
outd aij 1

Also make sure that you have set the number of processors correctly at run time.
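As a minimal sketch of what a matching run line could look like: the processor count given to mpirun has to equal the number of partitions written by OUTDomains. The count of 2 and the executable name "./feap" below are assumptions, not taken from any particular build.

```shell
# Sketch only: NP must equal the number of partitions written by
# OUTDomains (2 is assumed here), and "./feap" is a placeholder for
# the actual parFEAP executable path.
NP=2
CMD="mpirun -np $NP ./feap -ksp_type cg -pc_type gamg -pc_gamg_type agg"
echo "$CMD"
```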

tgross

  • Jr. Member
  • Posts: 18
Re: parFEAP84-petsc343 gamg Error
« Reply #7 on: December 19, 2013, 02:38:49 AM »
Dear Prof. Govindjee,

I built a completely new parallel FEAP version using your updated "pform.f" from http://feap.berkeley.edu/forum/index.php?topic=617.0 and a fresh PETSc 3.4.3 build.

Now "-ksp_type cg -pc_type gamg" works for the 8-element cube I posted earlier. However, when increasing the number of elements to 512 in a very basic test, I get the following PETSc error again:

  • PETSC ERROR: --------------------- Error Message ------------------------------------
  • [0]PETSC ERROR: Floating point exception!
  • PETSC ERROR: Inserting nan at matrix entry (556,4)!
  • [0]PETSC ERROR: ------------------------------------------------------------------------
  • PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013


The same input file runs when using "-ksp_type cg -pc_type jacobi".

Could you please try to run the input file in the attachment (2 processors)? I am curious to see if this problem just occurs in my parallel feap installation.

Many thanks!
Thomas

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #8 on: December 19, 2013, 01:38:04 PM »
I ran your problem; it worked with the following 5 methods:

-ksp_type cg  -pc_type jacobi

-ksp_type gmres  -pc_type bjacobi

-ksp_type cg  -pc_type hypre -pc_hypre_type boomeramg  -pc_hypre_boomeramg_strong_threshold 0.25 -pc_hypre_boomeramg_relax_type_all symmetric-SOR/Jacobi

-ksp_type cg  -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1

-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist

It failed to run with various segmentation violation errors with the following 2 methods:

-ksp_type cg -pc_type ml  -mat_no_inode

-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

Stack traces show the MUMPS error is related to a dylib issue (which is probably associated with the build environment).  The ML issue appears to be related to the PCSetCoordinates call, but that is hard to figure out since it works fine for gamg.
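When comparing methods like this, a small driver loop can save retyping; this is only a sketch, with the actual run line commented out because the executable path and processor count are assumptions:

```shell
# The five option sets reported to work above; each could be fed to
# the same run in turn.
optsets=(
  "-ksp_type cg -pc_type jacobi"
  "-ksp_type gmres -pc_type bjacobi"
  "-ksp_type cg -pc_type hypre -pc_hypre_type boomeramg"
  "-ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1"
  "-ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist"
)
for opts in "${optsets[@]}"; do
  echo "would run: mpirun -np 2 ./feap $opts"
  # mpirun -np 2 ./feap $opts   # actual run: executable path is assumed
done
```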

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #9 on: December 19, 2013, 01:59:06 PM »
Followup:  I was able to make ML work also.
In usolve.F there is a line
Code: [Select]
if(pfeap_bcin) call MatSetBlockSize(Kmat,nsbk,ierr)
which is commented out.  If you uncomment it, then ML will also work (and the other solvers seem to be ok too, with the exception of MUMPS).

tgross

  • Jr. Member
  • Posts: 18
Re: parFEAP84-petsc343 gamg Error
« Reply #10 on: January 19, 2014, 04:33:47 AM »
Dear Prof. Govindjee,

There seems to be something wrong with my PETSc build. However, it is strange that gamg works for the 8-element cube but not for the 512-element cube.

I will try again a fresh install with the FEAP 8.4 updates from January 9th.

Could you please post the options and the compiler you used to build your PETSc 3.4.3 version?

Many thanks and best regards,
Thomas

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • Posts: 1164
Re: parFEAP84-petsc343 gamg Error
« Reply #11 on: January 19, 2014, 01:10:53 PM »
Here is my ./configure line; note the { } brace-expansion syntax, which does not seem to be documented that well.

When I build with the Intel compiler:

./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort --download-{parmetis,superlu_dist,mpich,ml,hypre,metis,mumps,scalapack,blacs}

When I build with the GNU compiler system:

./configure --download-{parmetis,superlu_dist,mpich,ml,hypre,metis,mumps,scalapack,blacs}
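For anyone unfamiliar with it, the { } in those configure lines is ordinary shell brace expansion: the shell turns the single --download-{...} token into one --download-X flag per package before ./configure ever sees it. A quick check of the expansion (no configure is actually run here):

```shell
# Expand the same brace list used in the configure lines above and
# print one flag per line.
flags=(--download-{parmetis,superlu_dist,mpich,ml,hypre,metis,mumps,scalapack,blacs})
printf '%s\n' "${flags[@]}"
```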

tgross

  • Jr. Member
  • Posts: 18
Re: parFEAP84-petsc343 gamg Error
« Reply #12 on: January 24, 2014, 03:30:05 AM »
Dear Prof. Govindjee,

Thanks for your configure script! Now gamg also works for larger problems. I changed the MPI implementation from OpenMPI to MPICH, though I am not sure whether OpenMPI was responsible for the crash in my earlier build...

Regarding parFEAP84 vs. parFEAP83 performance:
I use parFEAP solely for large-scale 3D plasticity problems.

When comparing parFEAP83 vs. parFEAP84 using the same PETSc options (-ksp_type cg -pc_type jacobi), both parFEAP versions reach the same level of performance.

When changing to the more efficient Prometheus preconditioner and its successor gamg, parFEAP83 is 3 times faster than parFEAP84.
The PETSc options for parFEAP83: -ksp_type cg -pc_type prometheus -options_left
The PETSc options for parFEAP84: -ksp_type cg -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 -options_left

Do you have any suggestions on how to make parFEAP84 with cg and the gamg preconditioner as efficient as parFEAP83 with cg and Prometheus?

Many thanks!
Thomas

FEAP_Admin

  • Administrator
  • FEAP Guru
  • Posts: 993
Re: parFEAP84-petsc343 gamg Error
« Reply #13 on: January 24, 2014, 10:55:56 AM »
It is unfortunate that Prometheus is no longer available -- it performed well in the past.
I would suggest making a post to the PETSc user forum indicating your Prometheus options
and then your gamg options.  Dr. M. Adams, the author of both, regularly responds to posts there.