preCICE uses MPI for communication between different participants (and also between ranks of the same participant). So, what happens if the solver you intend to couple already uses MPI itself, e.g. for parallelization? Who should initialize MPI? Who should finalize it? This is what we discuss here.
Updated 22 Dec 23

It is not complicated. There are four rules that preCICE follows:

  • preCICE only initializes MPI if it is not yet initialized (by e.g. the solver you want to couple).
  • preCICE finalizes MPI if and only if it was also initialized by preCICE.
  • preCICE only initializes MPI if it needs MPI.
  • preCICE uses MPI_COMM_WORLD if no custom communicator is provided.

So, what does this mean for your adapter code?

  • Initialize preCICE after you initialize MPI.
  • Finalize preCICE before you finalize MPI.
#include <mpi.h>
#include <precice/precice.hpp>

[...] // start up your solver

MPI_Init(NULL, NULL);
int world_rank, world_size;
MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
MPI_Comm_size(MPI_COMM_WORLD, &world_size);

[...] // maybe more initialization

precice::Participant precice("SolverName", "precice-config.xml", world_rank, world_size);

[...] // define meshes, set vertices, etc.

precice.initialize();

[...] // solving and coupling

precice.finalize();

[...] // more finalization

MPI_Finalize();

If you need to provide a custom communicator to preCICE, you can do so as follows:

[...] // Initialize MPI and start your solver

MPI_Comm my_comm = /* ... */;

// The rank and size passed to preCICE must refer to the provided
// communicator, not necessarily to MPI_COMM_WORLD.
int comm_rank, comm_size;
MPI_Comm_rank(my_comm, &comm_rank);
MPI_Comm_size(my_comm, &comm_size);

precice::Participant precice("SolverName", "precice-config.xml", comm_rank, comm_size, &my_comm);

[...]