Author Topic: Master/Slave  (Read 6257 times)

miguelarriaga

  • New Member
  • *
  • Posts: 7
Master/Slave
« on: July 18, 2014, 09:32:37 AM »
Hi,

I'm trying to set some Master/Slave conditions, and that information is not being passed to the parallel input. I'm using a 4-node example with two displacement dof and one temperature dof. I want to make the X displacement of node 2 a slave of the X displacement of node 3.

4 -------- 3
|          |
di         |
|          |
1 -- di -- 2

MASTer  !Mx,My,Sx,Sy,rl(x,y,t)
  Node   di,di,di, 0,   0,1,1
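
For reference, I believe the same tie could also be imposed in serial with a LINK command, where a zero entry in the dof list means that dof is linked (if I'm reading the manual correctly):

LINK             !n1,n2,inc1,inc2, rl(x,y,t): 0 = link that dof
  2,3,0,0, 0,1,1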


In the initial serial set-up run, if I print the id31 array just before the uoutdom subroutine in pmacr7, I get the following:
Node  DispX  DispY  Temp
 1     -1     1      2
 2      5     3      4
 3      5     6      7
 4    -17     8      9
As we can see, the X-direction displacement has the same equation number for nodes 2 and 3, which means the Master/Slave routine was successful.

However, after uoutdom the array id31 becomes:
Node  Dx  Dy   T
 1     0   1   2
 2     3   4   5
 3     6   7   8
 4     0   9  10
which is the same matrix that is exported to the "EQUAtion numbers" list in the parallel input when OUTD,UNBL is used (with only one processor). Here the array no longer shows the same number for DX2 and DX3. If I print this array again right before the set-up run finishes, I get the first array back, which means that id31 was restored after uoutdom changed it.
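
In case it matters, this is roughly the debug print I use, in FEAP user-routine style. It assumes the equation numbers sit in the first ndf*numnp block of the array at np(31), stored as id(ndf,numnp); a patched 8.3 source may lay this out differently, and the routine name pid31 is just something I made up:

c     Sketch: dump the equation numbers held in the id31 array.
c     Assumes layout id(ndf,numnp) starting at mr(np(31)).
      subroutine pid31
      implicit  none
      include  'cdata.h'     ! numnp: number of nodes
      include  'sdata.h'     ! ndf:   dofs per node
      include  'pointer.h'   ! np:    array pointers
      include  'comblk.h'    ! mr:    integer blank common
      integer   n, j
      do n = 1, numnp
        write(*,'(i6,2x,10i8)')
     &        n, (mr(np(31)+ndf*(n-1)+j-1), j = 1,ndf)
      end do
      end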

Finally, if I print that same array in the parallel run, right before usolve, I see that it is now:
Node  Dx  Dy   T
 1    -1   1   2
 2     3   4   5
 3     6   7   8
 4   -17   9  10


Is this behaviour normal? When does the parallel run generate id31, and how can I fix it to include the desired Master/Slave behaviour?

Thank you,
Miguel


miguelarriaga

  • New Member
  • *
  • Posts: 7
Re: Master/Slave
« Reply #1 on: July 18, 2014, 10:39:53 AM »
PS1: I'm using version 8.3 (switching to 8.4 would be a last-resort solution because the code has been heavily patched).
PS2: The number -17 appears because I'm actually using a more complicated element with 5 masked dof.

FEAP_Admin

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 993
Re: Master/Slave
« Reply #2 on: July 18, 2014, 11:03:02 AM »
8.3 does not have parallel contact. You will need 8.4.

miguelarriaga

  • New Member
  • *
  • Posts: 7
Re: Master/Slave
« Reply #3 on: July 18, 2014, 11:31:50 AM »
I was under the impression that the MASTer command fell under the scope of Rigid Body Analysis for Small Displacements.

FEAP_Admin

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 993
Re: Master/Slave
« Reply #4 on: July 18, 2014, 01:37:09 PM »
I misunderstood.  I thought you were trying to do contact.

Notwithstanding, what you are trying to do is not part of parFEAP (not even in version 8.4). If you want to do this in parallel, you will have to adjust the graph partitioning to handle it properly. We do have this on our to-do list but have not done it yet.

miguelarriaga

  • New Member
  • *
  • Posts: 7
Re: Master/Slave
« Reply #5 on: July 18, 2014, 02:12:20 PM »
Thank you for your answer.
Could you explain what the problem is with implementing this? I was thinking that something like the following could work:

1. In the set-up run, for each partition, loop over the nodes of that partition and get the master nodes with mr(np(100)).
2. If a master node is not in that partition, add it as a ghost node and create a local array equivalent to mr(np(100)).
3. Finally, make uoutdom export a LINK command for all local master/slave pairs (a rough sketch follows below).
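
For the export step, I'm imagining something like the sketch below. The routine and array names (slnkout, msmap, g2l, ldof) are made up for illustration, not existing parFEAP code; the real data would come from mr(np(100)) and the partition's node lists:

c     Sketch: write a LINK record for each master/slave pair
c     whose slave node is owned by this partition.
      subroutine slnkout(iu, numnp, ndf, msmap, g2l, ldof)
      implicit  none
      integer   iu          ! unit of the partition input file
      integer   numnp       ! global number of nodes
      integer   ndf         ! dofs per node
      integer   msmap(*)    ! master of slave n, 0 if none
      integer   g2l(*)      ! local no. of node n, 0 if absent
      integer   ldof(ndf)   ! 0 = link this dof, nonzero = free
      integer   n, j

      do n = 1, numnp
        if (msmap(n).gt.0 .and. g2l(n).gt.0) then
          if (g2l(msmap(n)).eq.0) then
c           Master lives on another partition: it would first
c           need a ghost-node entry here (step 2 above)
            write(*,*) ' Master ',msmap(n),' needs a ghost node'
          else
            write(iu,'(a)') 'LINK'
            write(iu,'(14i8)')  g2l(n), g2l(msmap(n)), 0, 0,
     &                         (ldof(j), j = 1,ndf)
            write(iu,'(a)') ' '
          end if
        end if
      end do
      end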

Would this work?