AFWA with dm+sm

Tim Raupach

Hi all,

I am using the AFWA weather diagnostics in my WRF output by enabling e.g. afwa_diag_opt and afwa_buoy_opt in my namelist. The model is compiled with distributed plus shared memory (dm+sm); it runs fine and produces the AFWA outputs. However, I saw in the user's guide that the AFWA diagnostics "cannot be used with OpenMP". Does this restriction apply only to the smpar compile option, or to dm+sm as well? Given that my model runs fine with dm+sm, can I have confidence in the AFWA diagnostics it generates?
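
For reference, the relevant part of my namelist.input looks roughly like this (just a sketch of my setup; I am assuming the AFWA switches sit in the &afwa record, and values are shown for a single domain only):

&afwa
 afwa_diag_opt = 1,
 afwa_buoy_opt = 1,
/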

Thanks for your help,
Tim
 
Tim
This restriction applies to both sm and dm+sm. If you look inside the code, you will find that the loops there run over ims:ime instead of its:ite, so the calculations overlap between threads whenever its:ite is smaller than ims:ime and OpenMP threads are in use.
I would suggest running WRF in pure MPI (dmpar) mode with the AFWA options on, even if you can run the case in sm mode.
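
To illustrate what goes wrong, here is a small standalone sketch (not the actual AFWA source) that mimics WRF tiling in one dimension: its:ite are the per-thread tile bounds, ims:ime the full patch. Summing over the tile bounds gives each point to exactly one thread; summing over the memory bounds makes every thread walk the whole patch, so the work overlaps just as it does in the AFWA loops:

program tile_vs_memory_bounds
  use omp_lib
  implicit none
  integer, parameter :: ims = 1, ime = 100   ! memory (patch) bounds
  integer :: its, ite                        ! per-thread tile bounds
  integer :: i, tid, nthreads, chunk
  real :: field(ims:ime)
  real :: sum_tile, sum_mem

  field    = 1.0
  sum_tile = 0.0
  sum_mem  = 0.0

  !$omp parallel private(tid, nthreads, chunk, its, ite, i) &
  !$omp          reduction(+:sum_tile, sum_mem)
  nthreads = omp_get_num_threads()
  tid      = omp_get_thread_num()
  chunk    = (ime - ims + nthreads) / nthreads
  its      = ims + tid * chunk
  ite      = min(ime, its + chunk - 1)

  ! Tile bounds: each point is handled by exactly one thread
  do i = its, ite
     sum_tile = sum_tile + field(i)
  end do

  ! Memory bounds: every thread repeats the work over the whole patch
  do i = ims, ime
     sum_mem = sum_mem + field(i)
  end do
  !$omp end parallel

  print *, 'tile-bounds sum (correct):   ', sum_tile   ! 100
  print *, 'memory-bounds sum (overlap): ', sum_mem    ! 100 * number of threads
end program tile_vs_memory_bounds

When the diagnostic is an accumulation or is written back to a shared array, this overlap corrupts the result, and it does so whether the build is smpar or dm+sm, since both use OpenMP threads.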
 
Hi Ming, hi all,

A follow-up on this question. I have some large simulations that are only partially complete and I want to finish them. The problem is that they were run using dm+sm with the AFWA diagnostics enabled. Based on Ming's reply earlier, I understand that the AFWA calculations are not thread-safe, so I should not use the AFWA diagnostics from these runs as they may be corrupted.

As I see it, I have two options for continuing the runs: either I recompile for pure MPI (dmpar) and rerun from scratch, or I turn off the AFWA diagnostics, continue the runs from a restart (still in dm+sm mode), and discard the AFWA variables from the initial runs. In the latter case, can I trust the other, non-AFWA variables from my initial runs?
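
For the second option, the change I have in mind for the continuation run is roughly the following (a sketch only; restart_interval and the rest of &time_control would stay as in the original runs, and I am assuming the AFWA switches can simply be zeroed at restart):

&time_control
 restart = .true.,
/
&afwa
 afwa_diag_opt = 0,
 afwa_buoy_opt = 0,
/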

Thanks again for your help.

Best wishes,
Tim
 