Hi all,
While producing regional climate simulations with WRF4.2 we encountered unexpected behavior: the solution depends on the number of CPU cores (Intel Xeon Gold 6240) used. For daily mean field-averaged values the differences are not large, peaking on some days around 1 mm/day for precipitation, 0.5 °C for temperature, and 5 W/m² for SWDNB; on most days they are much smaller than that. Locally, however, at individual grid points we observe deviations of as much as tens or even hundreds of W/m² in global radiation at a given output time. There are also days with no difference at all.
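One thing we are aware of, and suspect may be related: floating-point addition is not associative, so any sum that is accumulated in a decomposition-dependent order (e.g. MPI reductions over differently shaped tiles) can come out slightly different for different core counts. A minimal, WRF-independent illustration in Python:

```python
# Floating-point addition is not associative: regrouping the same three
# terms changes the rounding and hence the result.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # one grouping
right = a + (b + c)  # the other grouping

print(left)               # 0.6000000000000001
print(right)              # 0.6
print(left == right)      # False
```

With chaotic dynamics, such last-bit differences can grow into the grid-point-level deviations shown above.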
Keeping all settings the same, the figure below shows SWDNB calculated using 72 cores (left) and 36 cores (right):
The difference field:
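For reference, the difference field is simply the point-wise difference of the two SWDNB fields. A minimal Python sketch of how it was summarized (synthetic arrays stand in for the fields here; in practice they would be read from the two wrfout files, and the grid shape is illustrative):

```python
import numpy as np

# Synthetic stand-ins for SWDNB from the 72-core and 36-core runs;
# the shape and values are illustrative, not from the actual simulations.
rng = np.random.default_rng(0)
swdnb_72 = rng.uniform(0.0, 900.0, size=(120, 150))          # W/m²
swdnb_36 = swdnb_72 + rng.normal(0.0, 20.0, size=(120, 150))  # perturbed copy

# Point-wise difference field and two summary statistics
diff = swdnb_72 - swdnb_36
print(f"field-mean abs diff: {np.abs(diff).mean():.2f} W/m²")
print(f"max local abs diff:  {np.abs(diff).max():.2f} W/m²")
```

The same two statistics (field mean vs. local maximum) are what make the contrast in our case so striking: small on average, large at individual points.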
Simulation details:
We produced a continuous 5-year simulation at 50 km resolution using ERA5 as ICBC, then further downscaled it with ndown.exe to a 10 km grid with daily reinitialization (from 18 UTC the previous day to 00 UTC at the end of the given day). An example namelist.input is attached to the post.
It may be important to note that we use the adaptive time step with the corrected adapt_timestep_em.F that was introduced in WRF4.3.
Is this behavior normal? Any insight?
Best regards,
Akos