Hi,
I'm trying to set some Master/Slave conditions, and that information is not being passed to the parallel input. I'm using a 4-node example with 2 displacement dofs and one temperature dof. I want to make the X displacement of node 2 a slave dof of the X displacement of node 3. The mesh and the MASTer input are:
4 -------- 3
|          |
di         |
|          |
1 -- di -- 2
MASTer !Mx,My,Sx,Sy,rl(x,y,t)
Node di,di,di, 0, 0,1,1
In the initial serial set-up run, if I print the id31 array just before the uoutdom subroutine is called in pmacr7, I have the following:
Node | DispX | DispY | Temp
   1 |    -1 |     1 |    2
   2 |     5 |     3 |    4
   3 |     5 |     6 |    7
   4 |   -17 |     8 |    9
As we can see, the X displacement has the same equation number for nodes 2 and 3, which means the Master/Slave routine was successful.
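For reference, the print is just a loop over the array (a minimal sketch; the routine name and format are mine, not FEAP's, and id31 is the ndf x numnp array at pointer np(31)):

c     Minimal sketch of the debug print (routine name and format
c     are arbitrary); id is the ndf x numnp array at np(31).
      subroutine iddump(id,ndf,numnp)
      implicit  none
      integer   ndf,numnp, n,i
      integer   id(ndf,numnp)

      do n = 1,numnp
        write(*,'(i5,3i8)') n,(id(i,n),i=1,ndf)
      end do ! n

      end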
However, after uoutdom the array id31 becomes:
Node |  Dx |  Dy |  T
   1 |   0 |   1 |  2
   2 |   3 |   4 |  5
   3 |   6 |   7 |  8
   4 |   0 |   9 | 10
which is the same matrix that is exported to the "EQUAtion numbers" list in the parallel input when OUTD,UNBL is used (with only one processor). Here the array no longer shows the same number for DX at nodes 2 and 3. If I print the array again right before the set-up run finishes, I get the first array back, which means id31 was restored after uoutdom changed it.
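Comparing the two arrays, uoutdom (or the renumbering around it) appears to do something like the following; this is only my reading of the numbers above, not the actual source:

c     Guess at the renumbering, inferred from the two arrays above
c     (not the actual FEAP source): every positive entry gets a
c     fresh sequential equation number, so the duplicated
c     Master/Slave number is lost, and restrained dofs (negative
c     entries) are reset to zero.
      neq = 0
      do n = 1,numnp
        do i = 1,ndf
          if(id(i,n).gt.0) then
            neq     = neq + 1
            id(i,n) = neq          ! unique number: slave link dropped
          else
            id(i,n) = 0            ! restrained dof
          endif
        end do ! i
      end do ! n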
Finally, if I print that same array in the parallel run, right before usolve, I see that the array is now:
Node |  Dx |  Dy |  T
   1 |  -1 |   1 |  2
   2 |   3 |   4 |  5
   3 |   6 |   7 |  8
   4 | -17 |   9 | 10
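So the parallel run looks like it restores the boundary-code flags on top of the uoutdom numbering, something like the sketch below (again, only inferred from the three arrays; id0 stands for a hypothetical copy of the pre-uoutdom array):

c     Inferred behaviour only: restrained dofs get their original
c     negative flags back (id0 is a hypothetical copy of the array
c     before uoutdom), while active dofs keep the sequential
c     numbers, so the Master/Slave link is still missing.
      do n = 1,numnp
        do i = 1,ndf
          if(id0(i,n).lt.0) then
            id(i,n) = id0(i,n)     ! restore boundary-code flag
          endif
        end do ! i
      end do ! n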
Is this behaviour normal? When does the parallel run generate id31, and how can I fix it so that it includes the desired Master/Slave behaviour?
Thank you,
Miguel