Hello,
I am using FEAP 8.5 and PETSc 3.7.7. A parallel calculation seems to work with the command
mpirun -n 2 $FEAPHOME8_5/parfeap/feap -ksp_type cg -pc_type jacobi
However, varying the PETSc options has no influence on the solution process. For example, adding -ksp_monitor or -ksp_view does not create a file with the corresponding output.
If I instead request a direct solver, e.g. with -ksp_monitor_short -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps,
this is apparently ignored as well.
Even if I pass no options at all, a calculation is performed (presumably with an iterative solver).
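In case it helps to narrow this down: besides the command line, PETSc also reads options from the PETSC_OPTIONS environment variable and from a ~/.petscrc file, and the -options_left flag makes PETSc report any options that were set but never queried. A sketch of what I have tried via the options file (assuming parfeap's PetscInitialize call picks it up; the option names match PETSc 3.7, where the MUMPS selector is still -pc_factor_mat_solver_package):

```
# ~/.petscrc -- read automatically at PetscInitialize
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_package mumps
-ksp_monitor_short
# report options that were set but never used:
-options_left
```

If -options_left reports these options as unused, that would suggest the KSP/PC objects in parfeap are configured without calling KSPSetFromOptions, rather than a problem with how the options are passed.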
For your information: I configured PETSc with ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-blas-lapack-dir=$MKLROOT --download-parmetis=1 --download-metis=1 --with-debugging=0 --download-mumps=1 --download-scalapack=1
Furthermore, I use intel/17.0 and intelmpi/2019.5.281.