MPI halo comms bug?

Topics related to the dynamics/numerics of the WRF model
microted
Posts: 9
Joined: Thu Oct 04, 2018 7:34 pm

MPI halo comms bug?

Post by microted » Fri Oct 18, 2019 5:54 pm

Hi!

I started working on adding a higher-order advection scheme to WRF (4.1.3), and I ran into an issue while testing different advection orders. In solve_em.F, h_mom_adv_order is used in several places to select the HALO calls for moist/scalar/tke. I changed the default namelist for em_quarter_ss to use h_mom_adv_order = 3 and left the rest alone:

h_mom_adv_order = 3,
v_mom_adv_order = 3,
h_sca_adv_order = 5,
v_sca_adv_order = 3,

When run on 4 vs. 1 MPI tasks, the netcdf output files do not match (checked with diffwrf). If h_mom_adv_order = 5 (i.e., the same as h_sca_adv_order), then the files do match as expected. (I have not yet tested h_mom_adv_order = 5 with h_sca_adv_order = 3.)

I assume this should be a valid namelist configuration. I have tried changing some of the code from h_mom to h_sca, but I haven't gotten the results to match yet. Maybe some of the halo calls should check both?

Eventually I want to be able to run, say, 7th order for momentum with 5th or 7th order for scalars. On that track I have tried to set up the 7th- and 9th-order halo registry entries plus the calls in solve_em, module_first_rk_step_part1, and module_first_rk_step_part2, but I'll follow up on that separately after figuring out how to fix the issue above.

Thanks for any help!

-- Ted


Re: MPI halo comms bug?

Post by microted » Fri Oct 18, 2019 8:01 pm

A little more digging and I found that HALO_EM_D and HALO_EM_E contain both momentum and scalar fields, so they need to check *both* h_mom_adv_order and h_sca_adv_order. Now I get identical results from both

h_mom_adv_order = 3,
h_sca_adv_order = 5,

and

h_mom_adv_order = 5,
h_sca_adv_order = 3,

for 4 vs. 1 MPI tasks. I'll put a git update together for consideration.
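For anyone hitting the same thing, the shape of the fix can be sketched like this. In solve_em.F the registry-generated halo exchanges are pulled in via CPP includes guarded by the advection order; since the HALO_EM_D-style exchanges carry both momentum and scalar fields, the guard has to be driven by the wider of the two orders. This is a hedged sketch only, not the exact WRF source (the include names and guard style are illustrative, following the pattern of the existing guards in solve_em.F):

```fortran
! Sketch: pick the halo width from the WIDER of the momentum and
! scalar advection orders, because this exchange carries both field
! sets. Guarding on h_mom_adv_order alone gives a too-narrow halo
! when h_sca_adv_order is higher, hence the 4-vs-1 task mismatch.
IF ( MAX( config_flags%h_mom_adv_order,              &
          config_flags%h_sca_adv_order ) <= 4 ) THEN
#  include "HALO_EM_D2_3.inc"   ! narrower (3rd/4th-order) halo suffices
ELSE
#  include "HALO_EM_D2_5.inc"   ! wider halo covers 5th/6th order
END IF
```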

-- Ted M.

