Author Topic: Error when using parallel feap  (Read 6747 times)

javijv4

  • New Member
  • *
  • Posts: 5
Error when using parallel feap
« on: July 30, 2018, 09:46:47 AM »
Dear all,

I'm running a 3D mesh with 420 second-order serendipity 20-node hexahedron elements with one dof per node in FEAP 8.4. I programmed this as a user element, and it works fine with serial FEAP. However, when I try to run the exact same problem with parallel FEAP (no matter the number of processors), I get the following error:

Code: [Select]
*** Error in `/home/grojas/codes/feap/ver84/parfeap/feap': free(): invalid next size (normal): 0x0000000002c256e0 ***

Program received signal SIGABRT: Process abort signal.

Backtrace for this error:
#0  0x7FF26F8D6777
#1  0x7FF26F8D6D7E
#2  0x7FF26EC42CAF
#3  0x7FF26EC42C37
#4  0x7FF26EC46027
#5  0x7FF26EC7F2A3
#6  0x7FF26EC8B82D
#7  0x524C78 in freefn at cmem.c:?
#8  0x52508B in ffreefn_
#9  0x524639 in setmem_
#10  0x49B8B0 in usetmem_
#11  0x45501C in ualloc_
#12  0x466C25 in palloc_
#13  0x41D0DD in pmacr7_ at pmacr7.F:267
#14  0x478A60 in pmacr_
#15  0x46BA33 in pcontr_
Aborted (core dumped)

If you have any idea why this is happening, I would appreciate any help.

Best,
Javiera

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1164
Re: Error when using parallel feap
« Reply #1 on: July 30, 2018, 03:53:28 PM »
Hard to say.  Maybe you can detail the steps you have taken up to the point of the crash?
Also, did you start your partitioning with a flat file?  If your serial input file is not a flat file, you can
make one with serial FEAP using the macro command OUTMesh.
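For example (a minimal sketch; the rest of the solution commands depend on your problem), a solution block in the serial input file along these lines will write the flat mesh:

Code: [Select]
BATCH
  OUTMesh
END
STOP

OUTMesh writes the flat mesh to a file named after the input file with an .opt extension (e.g. input file Ipar produces Ipar.opt), which is the file the partitioning step should start from.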

javijv4

  • New Member
  • *
  • Posts: 5
Re: Error when using parallel feap
« Reply #2 on: August 01, 2018, 07:53:25 AM »
Thank you for the answer, professor. Attached are the serial input file and the parallel input file I used. The problem happens when I run
Code: [Select]
$FEAPHOME8_4/parfeap/feap

FEAP_Admin

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 993
Re: Error when using parallel feap
« Reply #3 on: August 01, 2018, 11:33:41 AM »
You do not have a flat file; first execute an OUTMesh.

javijv4

  • New Member
  • *
  • Posts: 5
Re: Error when using parallel feap
« Reply #4 on: August 02, 2018, 01:41:22 PM »
I have always run parallel FEAP using input files like the one attached before; I don't know how to start the partitioning from a flat file. I followed the parallel manual to create the flat file Ipar.opt using OUTMesh, and then used partition to create a graph.opt.par file. I attached both. Then I try to run it using
Code: [Select]
BATCH
        OUTMesh
        GRAPh par.opt
        OUTDomains
END

but I get a PETSc error:
Code: [Select]
$FEAPHOME8_4/parfeap/feap 

    F I N I T E   E L E M E N T   A N A L Y S I S   P R O G R A M

           FEAP (C) Regents of the University of California
                         All Rights Reserved.
                       VERSION: Release 8.4.1d     
                          DATE: 01 January 2014     

         Files are set as:   Status    Filename

           Input   (read ) : Exists  Ipar                           
           Output  (write) : Exists  Opar                           
           Restart (read ) : New     Rpar                           
           Restart (write) : New     Rpar                           
           Plots   (write) : New     Ppar                           

         Caution, existing write files will be overwritten.

         Are filenames correct?( y or n; r = redefine all, s = stop) :y

         R U N N I N G    F E A P    P R O B L E M    N O W

          --> Please report errors by e-mail to:
              feap@ce.berkeley.edu


     3 - D   E l e c t r o p h y s i o l o g i c a l       

 Elmt 16: Monodomain model 3D
          MATERIAL MODEL FOR FEAP -------------
          [  ] [      ] Aliev-Panfilov Electrophysiology Model     
          [  ] [      ] UMAT2..................  Version 23/04/2014
          [  ] [    n1] material history variables ...........           1
          [ 1] [   dli]  e: intra. long. cond.     ........... 0.12143E+00
          [ 2] [   dti]  e: intra. tran. cond.     ........... 0.13571E-01
          [ 3] [   dle]  e: extra. long. cond.     ........... 0.44286E+00
          [ 4] [   dte]  e: extra. tran. cond.     ........... 0.17143E+00
          [ 5] [   ani]  e: anisotropic            ........... 0.10000E+01
          [ 6] [ alpha]  e: alpha parameter        ........... 0.50000E-01
          [ 7] [ gamma]  e: gamma parameter        ........... 0.20000E-02
          [ 8] [     b]  e: b parameter            ........... 0.25000E+00
          [ 9] [    c1]  e: c1 parameter           ........... 0.52000E+02
          [10] [    c2]  e: c2 parameter           ........... 0.80000E+01
          [11] [   mu1]  e: mu1 parameter          ........... 0.10000E+00
          [12] [   mu2]  e: mu2 parameter          ........... 0.30000E+00
          [13] [  sour]  e: mu2 parameter          ........... 0.00000E+00
          [14] [   tin]  e: mu2 parameter          ........... 0.00000E+00
          [15] [  tdur]  e: mu2 parameter          ........... 0.00000E+00
          [16] [   per]  e: mu2 parameter          ........... 0.00000E+00

     3 - D   E l e c t r o p h y s i o l o g i c a l       

 Elmt 16: Monodomain model 3D
          MATERIAL MODEL FOR FEAP -------------
          [  ] [      ] Aliev-Panfilov Electrophysiology Model     
          [  ] [      ] UMAT2..................  Version 23/04/2014
          [  ] [    n1] material history variables ...........           1
          [ 1] [   dli]  e: intra. long. cond.     ........... 0.12143E+00
          [ 2] [   dti]  e: intra. tran. cond.     ........... 0.13571E-01
          [ 3] [   dle]  e: extra. long. cond.     ........... 0.44286E+00
          [ 4] [   dte]  e: extra. tran. cond.     ........... 0.17143E+00
          [ 5] [   ani]  e: anisotropic            ........... 0.10000E+01
          [ 6] [ alpha]  e: alpha parameter        ........... 0.50000E-01
          [ 7] [ gamma]  e: gamma parameter        ........... 0.20000E-02
          [ 8] [     b]  e: b parameter            ........... 0.25000E+00
          [ 9] [    c1]  e: c1 parameter           ........... 0.52000E+02
          [10] [    c2]  e: c2 parameter           ........... 0.80000E+01
          [11] [   mu1]  e: mu1 parameter          ........... 0.10000E+00
          [12] [   mu2]  e: mu2 parameter          ........... 0.30000E+00
          [13] [  sour]  e: mu2 parameter          ........... 0.35700E+00
          [14] [   tin]  e: mu2 parameter          ........... 0.00000E+00
          [15] [  tdur]  e: mu2 parameter          ........... 0.20000E+01
          [16] [   per]  e: mu2 parameter          ........... 0.60000E+03
  Text mode of mesh output: filename = Ipar.opt
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: /home/grojas/codes/feap/ver84/parfeap/feap on a gnu−opt named cbmlab10 by grojas Thu Aug  2 16:33:09 2018
[0]PETSC ERROR: Libraries linked from /home/grojas/codes/petsc-3.4.3/gnu−opt/lib
[0]PETSC ERROR: Configure run at Thu Oct 29 15:19:33 2015
[0]PETSC ERROR: Configure options --download-parmetis --download-superlu_dist --download-mpich --download-ml --download-hypre --download-metis --download-mumps --download-scalapack --download-blacs
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

Am I doing something wrong?

Thank you in advance.