Run MPAS-A with multi-block decomposition per MPI process?

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled; if you have follow-up questions related to this post, please start a new thread from the forum home page.

yvonqq

New member
Hello,

Has anyone successfully run MPAS-A with a multi-block decomposition, i.e., more than one block per MPI process?

In namelist.atmosphere, I set the following options:
config_block_decomp_file_prefix = 'x1.4002.graph.info.part.'
config_number_of_blocks = 4
config_explicit_proc_decomp = true
config_proc_decomp_file_prefix = 'x1.4002.mapping.'

x1.4002.graph.info.part.4 decomposes the domain into 4 blocks, and x1.4002.mapping.2 maps those 4 blocks onto 2 MPI processes as follows:
0
1
0
1
That is, the first and third blocks are mapped to the MPI process of rank 0, and the second and fourth blocks to the MPI process of rank 1.
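
In case it helps anyone reproduce the setup: the mapping file is just one MPI rank number per block, one line per block, in block order. A minimal Python sketch that writes such a file with a round-robin block-to-rank assignment (file name and counts as above; this is only an illustration of the format) could look like this:

# Write an MPAS block-to-process mapping file: one MPI rank per block,
# one line per block, listed in block order (round-robin assignment).
n_blocks = 4   # must match config_number_of_blocks
n_ranks = 2    # number of MPI processes (mpirun -np 2)

with open(f"x1.4002.mapping.{n_ranks}", "w") as f:
    for block in range(n_blocks):
        f.write(f"{block % n_ranks}\n")

For 4 blocks and 2 ranks this writes the 0/1/0/1 mapping shown above. The 4-block partition file itself is the standard METIS output (e.g. from gpmetis x1.4002.graph.info 4), with one block ID per cell.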

When I run MPAS-A with 2 MPI processes, the following error occurs:
mpirun -np 2 ./atmosphere_model
forrtl: severe (408): fort: (2): Subscript #1 of the array RBUFFER has value 12816 which is greater than the upper bound of 12815

Routine              Line  Source
mpas_dmpar_mp_mpa    5462  mpas_dmpar.F
mpas_atmphys_tody     489  mpas_atmphys_todynamics.F
mpas_atmphys_tody     331  mpas_atmphys_todynamics.F
mpas_atmphys_tody     207  mpas_atmphys_todynamics.F
atm_time_integrat     435  mpas_atm_time_integration.F
atm_time_integrat     121  mpas_atm_time_integration.F
atm_core_mp_atm_d     873  mpas_atm_core.F
atm_core_mp_atm_c     664  mpas_atm_core.F
mpas_subdriver_mp     347  mpas_subdriver.F
MAIN__                 16  mpas.F

Thanks all.