
WRF simulation speed common sense

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.

lslrsgis

Member
Dear WRF community,

I am writing with a question about a long-term WRF simulation.

I am trying to run a simulation of roughly 30 years, and the problem I have run into is speed. The grid dimensions are 63x92 for d01 and 97x133 for d02, with time_step = 180. (The namelist.input for a one-year run is attached.)
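A quick way to double-check the settings quoted above, assuming the attached file is saved as namelist.input in the run directory (just a sanity check, not part of the setup itself):

    # print the domain dimensions and time step from the namelist
    # (the pattern also matches related entries such as time_step_fract_num)
    grep -E "max_dom|e_we|e_sn|time_step" namelist.input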

Currently, I am using WRF version 4.0.3, compiled as follows (a sketch of this build/run workflow is shown below):
(1) with the dmpar (34) and basic nesting (1) configuration,
(2) with the GNU compilers (gfortran/gcc),
(3) on an x86_64 Linux server (Red Hat), using 32 Intel cores,
(4) with MPICH for parallelism.
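For context, here is a sketch of the build-and-run workflow matching the configuration above; the directory names and the configure menu numbers (34 = GNU gfortran/gcc dmpar, 1 = basic nesting in WRF 4.0.3) are assumptions based on the post, not a prescription:

    # configure WRF 4.0.3 for distributed-memory parallelism with GNU compilers
    cd WRF
    ./configure            # select 34 (dmpar, gfortran/gcc) and 1 (basic nesting)
    ./compile em_real >& log.compile

    # run the real-data case under MPICH on 32 cores
    cd test/em_real
    mpirun -np 32 ./real.exe
    mpirun -np 32 ./wrf.exe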

On average, a one-year simulation takes ~30 hours, which means ~900 hours (37.5 days) for one complete ~30-year run. I am not sure whether this is normal. Should I (1) optimize my configuration, or (2) use a cluster with many CPU nodes?

How long would a simulation similar to mine take on a cluster at NCAR (with many cores)?

Thanks very much. Any response would be appreciated.

Siliang
 

Attachments

  • namelist.input.02.txt
    3.6 KB
Hi,
Unless you specifically turned off optimization when you compiled, you likely already compiled with optimization on, and unfortunately those times seem pretty reasonable. You could try writing your restart files less often, and perhaps the same for the history files (depending on how often you need to look at those), as the model slows down a bit each time it has to write out a file.

Another test would be to try actually decreasing the number of processors you are using. I'm not sure this would make much of a difference, but your domains are so small that you may get better timing with, say, 16 processors instead. You could run for a short period of time to compare the time it takes with 32 vs. 16 (see the sketch below).
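One way to make the 32- vs. 16-core comparison concrete is to run a short test with each core count and compare the per-step timings WRF writes to its rsl files; a rough sketch, assuming MPICH's mpirun and a standard run directory:

    # short test with 32 cores, then inspect the cost of recent model steps
    mpirun -np 32 ./wrf.exe
    grep "Timing for main" rsl.out.0000 | tail -20

    # repeat with 16 cores and compare the elapsed seconds per step
    mpirun -np 16 ./wrf.exe
    grep "Timing for main" rsl.out.0000 | tail -20

Output frequency, if you do decide to write files less often, is set in the &time_control section of namelist.input via history_interval and restart_interval (both in minutes).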

As a side note, you may want to consider making your domains a little larger. This won't improve the timing issue, but we never recommend having domains any smaller than 100x100, as there isn't enough space in the domain to allow for realistic changes to take place before all the "weather" has exited out the other side. You aren't that far from 100x100, though, so if you have looked at your results and you are satisfied with them, then you should carry on as you're doing. I would just recommend stopping and taking a look at your output from time to time, as you wouldn't want to run the full 30 years and then realize you should have increased the domain sizes.
 