
MPI halo comms bug?


microted

New member
Hi!

I started working on adding a higher-order advection scheme to WRF (4.1.3), and I ran into an issue while testing different advection orders. In solve_em.F, h_mom_adv_order is used in various spots to select the HALO calls for moist/scalar/tke. I changed the default namelist for em_quarter_ss to use h_mom_adv_order = 3 and left the rest alone:

h_mom_adv_order = 3,
v_mom_adv_order = 3,
h_sca_adv_order = 5,
v_sca_adv_order = 3,

When run on 4 MPI tasks vs. 1, the netCDF files do not match (checked with diffwrf). If h_mom_adv_order = 5 (i.e., the same as h_sca_adv_order), then the files do match as expected. (I have not yet tested h_mom_adv_order = 5 with h_sca_adv_order = 3.)

I assume this should be a valid namelist configuration. I have tried changing some of the code from h_mom to h_sca, but I haven't gotten the runs to match yet. Maybe some of the halo calls should check both orders?
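For reference, the selection logic in solve_em.F looks roughly like the sketch below (paraphrased from memory, and the include names are just illustrative): the halo include is keyed off h_mom_adv_order alone, even though the same halo carries the moist/scalar/tke fields, so a wider h_sca_adv_order never gets the extra halo rows it needs.

#ifdef DM_PARALLEL
! Sketch only: stencil width chosen from the momentum order,
! even for halos that also exchange moist/scalar/tke fields.
      IF      ( config_flags%h_mom_adv_order <= 4 ) THEN
#include "HALO_EM_E_3.inc"
      ELSE IF ( config_flags%h_mom_adv_order <= 6 ) THEN
#include "HALO_EM_E_5.inc"
      ELSE
        WRITE(wrf_err_message,*) 'solve_em: invalid h_mom_adv_order = ', &
                                 config_flags%h_mom_adv_order
        CALL wrf_error_fatal ( TRIM(wrf_err_message) )
      ENDIF
#endif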

I am eventually looking to run, say, 7th order for momentum plus 5th or 7th order for scalars. Along those lines, I have tried to set up the 7th- and 9th-order halo registry entries plus the calls in solve_em, module_first_rk_step_part1, and module_first_rk_step_part2, but I'll follow up on that separately after figuring out how to fix the issue above.
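On the registry side, I'm picturing new halo specs with wider stencils, something like the lines below for Registry.EM_COMMON (the names, point counts, and field lists are just my guess at this stage, and I still need to confirm the registry/gen_comms tooling accepts stencils this wide). A halo of n rows is a (2n+1)^2 - 1 point stencil, so 4 rows for a 7th/8th-order scheme would be an 80-point spec and 5 rows for 9th/10th order a 120-point spec.

# Hypothetical wider-stencil halo entries (illustrative only, not stock WRF)
halo      HALO_EM_MOIST_E_7  dyn_em 80:moist
halo      HALO_EM_SCALAR_E_7 dyn_em 80:scalar
halo      HALO_EM_MOIST_E_9  dyn_em 120:moist
halo      HALO_EM_SCALAR_E_9 dyn_em 120:scalar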

Thanks for the help!

-- Ted
 
After a little more digging, I found that HALO_EM_D and HALO_EM_E carry both momentum and scalar fields, so they need to check *both* h_mom_adv_order and h_sca_adv_order. Now I can get identical results from both

h_mom_adv_order = 3,
h_sca_adv_order = 5,

and

h_mom_adv_order = 5,
h_sca_adv_order = 3,

for the 4-patch vs. 1-patch MPI runs. I'll put a git update together for consideration.
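Roughly, the fix is to pick the stencil from whichever of the two orders needs the wider halo, something like this sketch (my working version, not necessarily the exact patch I'll submit, and the include names are from memory):

#ifdef DM_PARALLEL
! Sketch only: choose the halo width from the wider of the
! momentum and scalar advection orders.
      IF      ( MAX( config_flags%h_mom_adv_order,              &
                     config_flags%h_sca_adv_order ) <= 4 ) THEN
#include "HALO_EM_D_3.inc"
      ELSE IF ( MAX( config_flags%h_mom_adv_order,              &
                     config_flags%h_sca_adv_order ) <= 6 ) THEN
#include "HALO_EM_D_5.inc"
      ELSE
        WRITE(wrf_err_message,*) 'solve_em: invalid advection order'
        CALL wrf_error_fatal ( TRIM(wrf_err_message) )
      ENDIF
#endif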

-- Ted M.
 