Author Topic: NO CONVERGENCE REASON: Iterations exceeded  (Read 16221 times)

Shu

  • Jr. Member
  • **
  • Posts: 25
NO CONVERGENCE REASON: Iterations exceeded
« on: December 16, 2014, 03:02:23 PM »
Dear Colleagues,

I am solving a dynamic problem with a hyperelastic material. While solving with parallel FEAP, "NO CONVERGENCE REASON: Iterations exceeded" is printed to the screen and to the output files. But when I check the output files, there are no warnings or errors.

The commands I used in solve.sample2 are:

loop time 10
  time
  loop newton 10
    tang,,1
  next
  ....
next

My guess is this: during one Newton iteration, when the KSP solver cannot produce a solution that reduces the current energy residual below 1.0e-16, "NO CONVERGENCE REASON: Iterations exceeded" is printed. But this is not an error or warning, since the solution proceeds to the next Newton iteration.

Am I understanding this correctly? Thanks very much.

Shu


FEAP_Admin

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 993
Re: NO CONVERGENCE REASON: Iterations exceeded
« Reply #1 on: December 16, 2014, 09:24:54 PM »
The error you are seeing is coming from PETSc. It is telling you that it cannot solve the linear equations (within the Newton step). Probably your tangent matrix is singular or indefinite.
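One way to see more detail is to run with PETSc's standard monitoring options, which print the residual history and the reason the Krylov solver stopped. A minimal sketch (these are generic PETSc runtime options, not FEAP-specific; check them against your PETSc version):

parfeap -ksp_monitor -ksp_converged_reason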

Prof. R.L. Taylor

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 2649
Re: NO CONVERGENCE REASON: Iterations exceeded
« Reply #2 on: December 17, 2014, 07:43:57 AM »
You may check for sufficient boundary conditions (for quasistatic-type problems) using the CHECk command.  Often it can also tell you if elements are numbered incorrectly.  Primarily, however, it counts the number of restraints imposed for each degree of freedom.  If the problem has no restraint in one of the displacement (not rotational) dof directions, the problem is singular.  Other situations can arise in which the material properties are too different, rendering the problem difficult to solve with an iterative solution scheme.
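As an illustration, CHECk can be issued from a batch solution block before the time loop; a minimal sketch in standard FEAP command language (adapt it to your own solve file):

BATCh
  CHECk
END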

Shu

  • Jr. Member
  • **
  • Posts: 25
Re: NO CONVERGENCE REASON: Iterations exceeded
« Reply #3 on: December 17, 2014, 09:21:33 AM »
Dear Prof. Taylor,

Thanks for your detailed analysis. My situation is that I modified fld3d1.f to add the piezoelectrically induced stress to S_n, where S_{n+1} = S_n + dS. So in the nonlinear case this adds extra terms to the RHS as well as to the stiffness matrix. What I am simulating is a case in which the piezoelectric effect bends a beam (one end fixed, no other applied force). If the ratio of the beam length to the beam width (same as the height) is somewhat large, the NO CONVERGENCE message shows up, even though the element aspect ratio is good. By the way, I am using 8-node brick elements.

From my point of view, it seems the structure itself is causing this problem. However, if I apply an even larger external load to the piezo-effect case that shows "NO CONVERGENCE", the simulation runs smoothly. In this case, do you have any suggestions for overcoming it?

Another question: when the Newton iterations continue despite the "NO CONVERGENCE" print, does that mean the result is no longer reliable? If the result is not correct, how does FEAP use the PETSc solution and move further? And why can the Newton iteration still give a convergent energy norm after some iterations?

Again, thanks very much for your helpful suggestion!

Shu


luc

  • Full Member
  • ***
  • Posts: 53
Re: NO CONVERGENCE REASON: Iterations exceeded
« Reply #4 on: December 18, 2014, 09:10:41 AM »
PETSc is telling you that the iterative solver you are using did not converge within 10,000 iterations (the default, unless you have set -ksp_max_it). So from that point onward your solution may be inaccurate or plain wrong.
FEAP prints the warning from PETSc but continues the simulation as if the iterative solver had reached convergence. This is dangerous behavior, and it is up to you to verify that your solution still makes sense.
You can choose to use an LU solver instead of an iterative one by adding the following arguments when you run parfeap:

parfeap -ksp_type preonly -pc_type lu

If you run simulations using MPI, you also need to add -pc_factor_mat_solver_package <mumps,pastix,superlu_dist> to select one of these packages. These are the only packages that will perform a parallel LU decomposition with PETSc. To use them you need to have them installed and linked correctly to PETSc. The simplest way to get these solvers is to reconfigure PETSc with the following arguments: --download-mumps --download-superlu_dist --download-pastix
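Putting it together, a parallel run with MUMPS could then look like the line below (the mpirun launcher and process count are placeholders for your own setup):

mpirun -np 4 parfeap -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps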

Shu

  • Jr. Member
  • **
  • Posts: 25
Re: NO CONVERGENCE REASON: Iterations exceeded
« Reply #5 on: December 18, 2014, 12:32:47 PM »
Thanks for your detailed explanation!

Shu
