FEAP User Forum
FEAP => Parallel FEAP => Topic started by: jbruss on November 02, 2017, 06:40:56 PM
-
Hello -
I would like to place the solution from the previous solve into the PETSc Vec "rhs" (for an algorithm I am working on). It seems that the array U = hr(np(40)) includes the solution values at boundaries with Dirichlet conditions (e.g., zero displacements), whereas the Vec "rhs" does not include these degrees of freedom. Is there some way for me to know the proper correspondence between the U array and the Vec "rhs" (and "sol" for that matter, though I assume their ordering is consistent)?
I assume the U array has the following ordering for nodes with 2 dof:
node   dof   offset_into_U
  1     1         0
  1     2         1
  2     1         2
  2     2         3
  3     1         4
  3     2         5
...etc...
up to a value of offset_into_U = (numpn*ndf - 1)
where offset_into_U is such that the value of U = hr(np(40) + offset_into_U).
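For concreteness, here is a minimal sketch of the kind of access I have in mind (assuming the standard blank-common access through hr/np and the include file names from the serial programmer manual):

      subroutine show_u(node,idof,ndf)
c     Sketch only: read the solution value for dof idof of a node
c     from the U array, using the offset convention described above.
c     Assumes the usual FEAP includes for the np pointer array and
c     the blank common hr/mr (per the serial programmer manual).
      implicit  none
      include  'pointer.h'
      include  'comblk.h'
      integer   node, idof, ndf, offset
c     offset_into_U = (node-1)*ndf + (idof-1), matching the table above
      offset = (node-1)*ndf + (idof-1)
      write(*,*) 'u(',idof,',',node,') =', hr(np(40)+offset)
      end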
Is the ordering in "rhs"/"sol" the same except with the Dirichlet nodal dofs removed? If so, is there an array in parfeap that I could search to see whether a particular nodal dof has a prescribed value?
Thank you very much for your help.
Jonathan
-
Jonathan,
The answer to your question depends on the type of partitioning you have asked for. In particular, do you use outd,aij,1 to include the Dirichlet boundary dofs, or outd,aij?
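That is, in the partitioning (solve) input, something along the lines of

BATCh
GRAP,,2
OUTD,AIJ
END

versus

BATCh
GRAP,,2
OUTD,AIJ,1
END

(with the number of partitions adjusted to your run); the trailing 1 is what keeps the Dirichlet dofs in the partitioned system.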
-
Hi Professor Govindjee -
I tried the following statements, but they did not seem to change anything, so I am clearly doing something incorrectly.
BATCh
GRAP,,2
OUTD,AIJ,1
END
I'm running this on a small test problem to try to understand what is happening. The test problem has 65 nodes with 3 dofs per node. I receive the following information:
Rank: 0, numpn: 32
Rank: 0, numnp: 38
Rank: 0, numel: 26
Rank: 0, rhs loc size: 86
Rank: 0, sol loc size: 86
Rank: 1, numpn: 33
Rank: 1, numnp: 39
Rank: 1, numel: 27
Rank: 1, rhs loc size: 99
Rank: 1, sol loc size: 99
Rank 0 contains 10 Dirichlet constraints, so it is clear that these dofs are not included in the rhs/sol vectors (numpn*ndf = 32*3 = 96, and 96 - 10 = 86, which matches the local rhs size). Is there something else I need to have in the BATCh block in order to tell parfeap to keep these dofs? I looked through the Parallel Manual and it only says to do exactly as you described. Does it matter that I am using FEAP 8.3?
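For reference, a sketch of how one could count the restrained dofs on the owned nodes (this assumes the ID array at np(31) stores equation numbers for the owned nodes in its leading ndf*numpn block, with non-positive entries marking restrained dofs; I have not verified that layout for parfeap, so please correct me if it differs):

      subroutine countbc(ndf,numpn,nbc)
c     Sketch: count restrained (Dirichlet) dofs on the owned nodes.
c     ASSUMPTION: np(31) points at the ID array and its leading
c     ndf*numpn entries are equation numbers, with values .le. 0
c     marking restrained dofs.  Verify against the programmer
c     manual / parfeap sources before relying on this.
      implicit  none
      include  'pointer.h'
      include  'comblk.h'
      integer   ndf, numpn, nbc, n, j
      nbc = 0
      do n = 1,numpn
        do j = 1,ndf
          if(mr(np(31)+(n-1)*ndf+(j-1)).le.0) nbc = nbc + 1
        end do ! j
      end do ! n
c     Expect numpn*ndf - nbc to equal the local size of rhs/sol
c     (96 - 10 = 86 on rank 0 above, if the assumption holds).
      end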
Thank you again,
Jonathan
-
Hi Professor -
Unfortunately, I see that it does matter that I am using FEAP 8.3. Do you know of a way to include these dofs in 8.3?
Thank you,
Jonathan
-
Using 8.3 is going to be problematic unless you have an old version of PETSc. Notwithstanding, I believe that in version 8.3 the graph partitioner will include the equations for the Dirichlet boundary conditions if you partition with the BAIJ option. Look in parfeap/pmacr7.F to understand the needed syntax for OUTDomain.
I do recommend that you upgrade to version 8.5.
-
Okay, thank you. I am in the process of converting my code to work with 8.4. In FEAP 8.4 the serial version works with my code, but the parallel version does not; I assume this is because I was using the older version of PETSc. Is there a particular version of PETSc recommended for 8.4, or is the latest stable version acceptable?
Thank you again,
Jonathan
-
Unfortunately, I am unable to get FEAP 8.4 working with METIS. The serial version works, but the parallel version does not (the partitioning runs for 1 processor but gives an incorrect answer, and it seg-faults for multiple processors).
Is there some way to know the ordering in version 8.3 if I allow the degrees of freedom associated with Dirichlet boundaries to be removed? I see that equation numbers are written to the parallel files, but I am uncertain where this information goes.
Thank you,
Jonathan
-
The parallel manual explains the data in the partitioned files. I recommend updating to version 8.5, which works with the most recent releases of PETSc, METIS, etc.