Author Topic: Parallel error  (Read 7040 times)

shenrilin

  • Full Member
  • ***
  • Posts: 67
Parallel error
« on: August 14, 2017, 01:11:52 PM »
Dear all,

When I use serial feap with petsc, I get the correct results: displacement, strain, stress, etc.

However, when I run parallel feap with petsc on 2 processors, I get correct strain and stress but wrong displacements.

The nodes, elements and boundary conditions in the parfeap input file match the original input file.

It's very weird. Where is the problem?

Best,
« Last Edit: August 14, 2017, 05:21:49 PM by shenrilin »

blackbird

  • Full Member
  • ***
  • Posts: 100
Re: Parallel error
« Reply #1 on: August 15, 2017, 05:57:20 AM »
Dear all,

I have a similar problem. I am attaching a small benchmark, a parallel version of the "Kingpost" example given in the manual, with every truss in the original problem divided into 20 elements. I get very weird displacements for the top node, as you can see in the contour plot (the second contour plot shows the nodal displacements magnified by a factor of 100). The same wrong displacements also appear in the O-files via a TPLO command.

Are you aware of this issue? Is it fixed in 8.5 (I am using 8.4 with petsc-3.3-p7)?

Best Regards,
Christian

Prof. R.L. Taylor

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 2647
Re: Parallel error
« Reply #2 on: August 15, 2017, 06:03:23 AM »
Where do you see the incorrect displacements, in printed output or in contour plots?  Do you have a file you can post?

Prof. R.L. Taylor

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 2647
Re: Parallel error
« Reply #3 on: August 15, 2017, 06:04:46 AM »
If you subdivide a truss then the initial stiffness is singular at the nodes between the truss joints.  Is this what you are doing???
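To illustrate (a minimal sketch of my own, assuming standard linear 2-node truss elements; not part of the original example):
Code: [Select]
Two collinear truss elements along the x-axis:  node 1 ---- node 2 ---- node 3
Each element contributes only axial stiffness, k_e = (EA/L) [ 1 -1; -1 1 ],
along the member axis.  Nothing resists the transverse displacement u_y of the
interior node 2, so its diagonal in the assembled stiffness K is zero and K is
singular until, e.g., geometric stiffness from the axial force supplies it.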

Prof. R.L. Taylor

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 2647
Re: Parallel error
« Reply #4 on: August 15, 2017, 06:14:19 AM »
GENERAL REMARK TO USERS:  Before posting on the Forum about a problem that does not work, please check the output file (the "O" file). Search for "WARN" or "Warn", especially after a "TANG" or "UTAN". We flag cases where the solution is singular or very sensitive, BUT DO NOT STOP THE EXECUTION. Try to understand why this happened; it is the best way to learn about solving FEA problems and to understand the theory behind what you are trying to solve.
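As a concrete example of such a check (a sketch only; the file name "Oexample" is a placeholder for whatever your O-file is called), from the command line:
Code: [Select]
grep -in "warn" Oexample     # lists every WARN/Warn line with its line number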

blackbird

  • Full Member
  • ***
  • Posts: 100
Re: Parallel error
« Reply #5 on: August 15, 2017, 08:35:54 AM »
Dear Prof. Taylor,

Yes, I was subdividing a truss. The only warning I obtained was
Code: [Select]
     W a r n i n g s   &   E r r o r s
          Material density is zero.
The parallel simulation converged fine, and the stress values are reasonable compared to the original example. But of course you are right: the subdivision evidently causes the error here. However, the warning obtained in the serial run,
Code: [Select]

*D4TRI WARNING* Lost at least 7 digits in reducing diagonals of        41 equations.
                 Step:      1 Iteration:     0

does not appear in the parallel computation. Is there a similar way to check for such things in parallel?

Prof. R.L. Taylor

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 2647
Re: Parallel error
« Reply #6 on: August 15, 2017, 08:38:47 AM »
What solver do you use in the parallel run?

blackbird

  • Full Member
  • ***
  • Posts: 100
Re: Parallel error
« Reply #7 on: August 15, 2017, 09:25:47 AM »
My petsc options are -ksp_type cg -pc_type jacobi, as I assume the matrix is symmetric. In other cases, where I suspect the matrices are unsymmetric, I use -ksp_type bcgs.
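For context, a minimal sketch of how such options are typically supplied when launching the parallel run (the executable name, launcher and processor count here are assumptions, not specifics from this thread):
Code: [Select]
# symmetric system: conjugate gradients with a Jacobi preconditioner
mpirun -np 2 ./feap -ksp_type cg -pc_type jacobi

# suspected unsymmetric system: stabilized bi-conjugate gradients
mpirun -np 2 ./feap -ksp_type bcgs -pc_type jacobi
The same options can also be collected in a petscrc options file so they do not have to be retyped for every run.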

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1160
Re: Parallel error
« Reply #8 on: August 15, 2017, 04:36:53 PM »
If you look on the PETSc web site you will find a large number of -ksp options that you can set to get further information. For example, there is an option to monitor the extreme singular values, which may be helpful for monitoring the state of your linear solve.
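For example, the following KSP options (names taken from the PETSc documentation; exact availability may depend on the PETSc version) can be appended to the run command or placed in an options file:
Code: [Select]
-ksp_monitor_singular_value   # print estimates of the extreme singular values each iteration
-ksp_converged_reason         # report why the Krylov solve converged or diverged
-ksp_monitor_true_residual    # print the true (unpreconditioned) residual norm
A rapidly growing condition-number estimate from the singular-value monitor is one sign that the assembled system is nearly singular, mirroring the D4TRI warning seen in the serial run.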