Modify FEniCS tutorials to enable parallel runs#120
Conversation
For testing purposes, the tutorial …
BenjaminRodenberg
left a comment
While reviewing precice/fenics-adapter#71 I used the content of this PR. I tried to run the CHT tutorial and ran into the following error (it looks like an issue with the parallelization):
```
benjamin@benjamin-ThinkPad-X1-Yoga-2nd:~/tutorials/CHT/flow-over-plate/buoyantPimpleFoam-fenics$ mpirun -np 2 python3 Solid/heat.py
---[precice] ERROR: For a parallel participant, only the mapping combinations read-consistent and write-conservative are allowed
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[38575,1],0]
  Exit code:    255
--------------------------------------------------------------------------
```
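For a parallel participant, preCICE only accepts the read-consistent / write-conservative mapping combinations named in the error above. As a sketch, the mapping section of a `precice-config.xml` for such a participant could look like this (participant, mesh, and data names are illustrative, not the tutorial's actual configuration):

```xml
<!-- Sketch only: participant/mesh/data names are illustrative.      -->
<!-- Parallel participants must map incoming data with               -->
<!-- constraint="consistent" and outgoing data with "conservative".  -->
<participant name="Solid">
  <use-mesh name="Fluid-Mesh" from="Fluid"/>
  <use-mesh name="Solid-Mesh" provide="yes"/>
  <read-data  name="Heat-Flux"   mesh="Solid-Mesh"/>
  <write-data name="Temperature" mesh="Solid-Mesh"/>
  <mapping:nearest-neighbor direction="read"  from="Fluid-Mesh" to="Solid-Mesh"
                            constraint="consistent"/>
  <mapping:nearest-neighbor direction="write" from="Solid-Mesh" to="Fluid-Mesh"
                            constraint="conservative"/>
</participant>
```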
For the FSI test cases I also ran into errors (they look like issues with the adapter):
```
benjamin@benjamin-ThinkPad-X1-Yoga-2nd:~/tutorials/FSI/flap_perp/OpenFOAM-FEniCS$ python3 Solid/perp-flap.py
---[precice] This is preCICE version 2.1.1
---[precice] Revision info: v2.1.1-54-g25b9f155
---[precice] Configuring preCICE with configuration "/home/benjamin/tutorials/FSI/flap_perp/OpenFOAM-FEniCS/Solid/../precice-config.xml"
---[precice] I am participant "fenics"
Traceback (most recent call last):
  File "Solid/perp-flap.py", line 76, in <module>
    precice_dt = precice.initialize(coupling_boundary, read_function_space=V, write_object=V,
  File "/home/benjamin/.local/lib/python3.8/site-packages/fenicsprecice/fenicsprecice.py", line 363, in initialize
    raise Exception("Dimension of preCICE setup and FEniCS do not match")
Exception: Dimension of preCICE setup and FEniCS do not match
---[precice] Implicitly finalizing in destructor
```
and
```
benjamin@benjamin-ThinkPad-X1-Yoga-2nd:~/tutorials/FSI/cylinderFlap/OpenFOAM-FEniCS$ python3 Solid/cyl-flap.py
---[precice] This is preCICE version 2.1.1
---[precice] Revision info: v2.1.1-54-g25b9f155
---[precice] Configuring preCICE with configuration "/home/benjamin/tutorials/FSI/cylinderFlap/OpenFOAM-FEniCS/Solid/../precice-config.xml"
---[precice] I am participant "fenics"
Traceback (most recent call last):
  File "Solid/cyl-flap.py", line 92, in <module>
    precice_dt = precice.initialize(coupling_boundary, read_function_space=V, write_object=V,
  File "/home/benjamin/.local/lib/python3.8/site-packages/fenicsprecice/fenicsprecice.py", line 363, in initialize
    raise Exception("Dimension of preCICE setup and FEniCS do not match")
Exception: Dimension of preCICE setup and FEniCS do not match
---[precice] Implicitly finalizing in destructor
```
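Both FSI tracebacks end in the same dimension check inside the adapter's `initialize()`. As a simplified, hypothetical sketch of that kind of guard (the function and parameter names here are illustrative, not the adapter's actual internals — only the exception message is taken from the log above):

```python
def check_coupling_dimensions(precice_dim, fenics_dim):
    """Raise if the dimension of the preCICE setup (e.g. 3 for a 3D
    precice-config.xml) does not match the dimension of the FEniCS
    function space (e.g. 2 for a 2D mesh). Illustrative sketch only."""
    if precice_dim != fenics_dim:
        raise Exception("Dimension of preCICE setup and FEniCS do not match")

# A 2D FEniCS case coupled against a 3D preCICE setup triggers the error:
try:
    check_coupling_dimensions(precice_dim=3, fenics_dim=2)
except Exception as e:
    print(e)  # Dimension of preCICE setup and FEniCS do not match
```

This matches the 2D-3D mismatch discussed below: the FSI tutorials couple a 2D FEniCS solid against a 3D preCICE setup, which the parallel adapter rejects.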
This PR currently only tests the …
Feel free to merge this PR first.
All FSI cases are currently also 2D-3D and hence will not work with the current parallel design. Just changing the precice-config by setting …
Merged #127 into this PR. Note that the CHT case is still causing problems. I think there is a bug in the configuration of the mapping. If I …
55d7579 uses this configuration file. Results look good (for 1 and 2 ranks). However, I got some confusing messages from the adapter for this case when running …
BenjaminRodenberg
left a comment
Intermediate summary
- HT and CHT are working nicely
- FSI cases fail due to a deadlock in the adapter that occurs if point sources are used. @IshaanDesai is aware of this issue and will fix it in precice/fenics-adapter#71.
FSI cases with FEniCS as the structure participant do not work in parallel because … Note: the changes to the preCICE configurations for the FSI cases can be retained, as these configurations work in serial.
BenjaminRodenberg
left a comment
Minor comment on dead code; the rest looks good to me. I did not try running the CHT and HT cases again. Do you want me to run them?
The last clean-up is ongoing. A final run from you would make this foolproof, so yes, one run check would be good.
* Modifying all FEniCS-based tutorials to be compatible with the parallel design of the FEniCS-Adapter (Co-authored-by: BenjaminRueth <benjamin.rueth@tum.de>)

The FEniCS-Adapter now supports parallel runs: precice/fenics-adapter#71.
Correspondingly, minor modifications to the FEniCS scripts and preCICE configurations of all FEniCS tutorials are necessary to enable parallel runs.
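With those modifications in place, a FEniCS participant can be launched on multiple ranks via `mpirun`, as in the CHT run attempted above (the case path is the one from that log and is illustrative for other tutorials):

```shell
# Launch the FEniCS solid participant of the CHT tutorial on 2 ranks
mpirun -np 2 python3 Solid/heat.py
```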