Author Topic: Why so many memory implementations?  (Read 4541 times)

JStorm

  • Sr. Member
  • ****
  • Posts: 250
Why so many memory implementations?
« on: August 02, 2017, 02:41:32 AM »
Hello,

I'm confused about the number of implementations for allocating dynamic arrays (FEAP 8.4). Obviously, the source code distinguishes between Windows and Unix. Both folders contain a folder "memory", with an implementation using F95 features or C code, respectively.

My first question is: why does the Unix code need to do the allocation in C instead of also using F95?

The Unix code also allows using the C API in the folder "largemem" to address huge arrays, while the Windows code handles that automatically.

Is there a reason why "ipr" isn't used as a switch to choose the interface, or is that just "grown" code?

Finally, there is also an implementation in "program/memory", which is identical to the Windows one.

Which implementation should be used? If it is the Windows version ("windows/memory", "program/memory"), why still keep code in "windows" and "unix" instead of staying with "program/memory" only?

Kind regards,
johannes
« Last Edit: August 02, 2017, 03:48:33 AM by JStorm »

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1160
Re: Why so many memory implementations?
« Reply #1 on: August 02, 2017, 10:17:29 AM »
Q1: The C implementation of the memory that is used on unix machines is not necessary now that Fortran provides dynamic allocation natively.  Up to now we have chosen to keep it because it provides additional functionality that we have not programmed into the straight Fortran memory allocation routines.  At some stage this will likely change.

Q2:  ipr and large memory were historically two separate issues.  ipr has been part of the code for a long time to be able to deal with different computer systems (99% of which no longer exist and if they exist they are in museums).  Notwithstanding, ipr can now be used (via compiler options) to effect access to longer arrays.  The largemem routines are needed in this case due to the way that C handles long ints.

Q3:  So, yes, in a way -- see Q2.

Q4: program/memory is as you note virtually the same as windows/memory and is there if one wishes to use a strictly Fortran based memory allocation system.

Q5:  Stick with the defaults unless you have a strong reason to change.  Linux/Unix -- unix/memory;  Windows -- windows/memory; both with ipr =2.

JStorm

  • Sr. Member
  • ****
  • Posts: 250
Re: Why so many memory implementations?
« Reply #2 on: August 02, 2017, 09:37:34 PM »
Dear Prof. Govindjee,

thank you very much for that detailed answer.

I'm using big integers ("-fdefault-integer-8" or "-i8") with "ipr=1" and the API in "unix/largemem" in order to allow huge FEM models.
Doing a quick comparison of unix/memory/setmem.f and program/memory/setmem.f, I have not found any difference in the provided features.
Do you think it is safe to switch to the cleaner F95 implementation?

Prof. S. Govindjee

  • Administrator
  • FEAP Guru
  • *****
  • Posts: 1160
Re: Why so many memory implementations?
« Reply #3 on: August 03, 2017, 06:43:50 AM »
I have not done this in a long time, but both ways, unix/largemem and program/memory/, should work with -fdefault-integer-8/-i8.  The advantage of unix/largemem is the additional record keeping that it does, in case there are memory errors that you have to track down.  Otherwise program/memory should work.

JStorm

  • Sr. Member
  • ****
  • Posts: 250
Re: Why so many memory implementations?
« Reply #4 on: August 03, 2017, 06:51:22 AM »
Thank you for your advice, Prof. Govindjee.