
Empty wrfout files in inner domain


I am running a one-way nested simulation over Spain with a 5 km outer domain and a 1 km inner domain. The outer domain ran fine and its wrfout files look correct.
However, after running ndown and the subsequent steps, the wrfout files from the inner domain are completely empty (except for the first one).
I am using the adaptive time step and have tried lowering min_time_step, but nothing changes after rerunning wrf.exe (I did not rerun real.exe; do I need to run it again too?). I have attached the error and namelist files for both domains in case there is any inconsistency. I cannot attach the wrfbdy file because it is too large.
Could this be related to the size or position of d02 with respect to d01, for example if there is not enough space between the two boundaries?
Hope someone can help,
thank you.


  • wrf1km.error.0000 (1.7 MB)
  • wrf5km.error.0000 (303.6 KB)
  • namelist-d1km.input.txt.txt (5.6 KB)
  • namelist-d5km.input.txt.txt (6.2 KB)
I looked at your wrf1km.error.0000. At the end of the file, the messages read:

Timing for Writing wrfout_d01_2023-06-15_10:00:00 for domain 1: 3.63829 elapsed seconds
Timing for Writing restart for domain 1: 13.12384 elapsed seconds

This indicates that the ndown case finished successfully.

Your namelist.input also looks fine.

Note that when you run ndown, although it is a 'nested' case, it is run as a single-domain case, and the wrfout files are named wrfout_d01.

Can you look at your wrfout files and check whether they are actually output for the ndown case?
Yes, everything you mentioned is properly configured. I have tried changing the position of d02 relative to d01: with a more centered domain everything works fine, so I assume there was not enough margin between the domains. Is there a rule of thumb for the minimum distance between the domain boundaries in all directions?
Sorry if I wasn't clear enough. As it turned out, the model outputs empty wrfout_d02 files when there isn't enough space between d01 and d02; larger margins are needed to avoid inconsistencies. How can I know where to place d02 within d01 to make sure this does not happen? I have been adjusting the size and position of the domains by trial and error, but perhaps there is a calculation that can simplify this.
I would suggest setting a large enough parent domain; the child domain can then occupy the middle third of the parent domain in each direction.

For example, if your parent domain has 301 grid points in the east-west and north-south directions, the child domain could be located between grid points 100 and 200 of the parent in each direction.
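The middle-third rule above can be written directly into the &domains section. A sketch, assuming a 301-point parent and the 5:1 grid ratio of this case (note that e_we - 1 and e_sn - 1 of the nest must be divisible by parent_grid_ratio):

&domains
 max_dom           = 2,
 e_we              = 301, 501,
 e_sn              = 301, 501,
 parent_grid_ratio = 1,   5,
 i_parent_start    = 1,   100,
 j_parent_start    = 1,   100,
/

With these values the nest spans parent grid points 100-200 in each direction, leaving roughly 100 points of buffer on every side.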
I was able to run the model with d02 centered over the western part of Spain, but the same domain configuration centered over the eastern part of the country gives me problems. At some point in the simulation the model decreases the time step, and once it cannot lower it any further, the wrfout files start to come out empty. I have tried lowering the minimum time step and using a bigger d01 domain, but nothing seems to work.
Could it be related to a physics option? I assumed not, since the same configuration worked in another part of the country, but I am really running out of options...


  • namelist.input-03-catctr.txt (6.2 KB)
  • namelist.wps-catctr.txt (1.2 KB)
Can you find any error messages in your rsl files? With the empty wrfout_d02 files, did the model run to the end? Are the results in wrfout_d01 reasonable?
The wrfout_d01 files seem fine, and yes, even when the model starts to output empty wrfout_d02 files it runs to the end with no error. There are no error messages in the rsl.error files. However, I do find the following lines at the time step when the wrfout files start to come out empty:


That's why I assume it is related to some model instability, as it happens almost every time, though not always in the same wrfout file, and the same domain size runs without problems in another location.
parent_grid_ratio = 1, 5,
parent_time_step_ratio = 1, 6,
feedback = 1,
smooth_option = 0,
smooth_cg_topo = .true.

--- Best practice is to have both ratios the same.
--- An odd grid ratio is preferred, especially when feedback is on. This rule may not apply to the time-step ratio, but it is still worth checking.
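Applied to this case (5 km parent, 1 km nest), both ratios would then be the same odd number. A sketch of the two lines in question:

 parent_grid_ratio      = 1, 5,
 parent_time_step_ratio = 1, 5,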
Finally I got my run to work. It turned out to be a topography problem: at the higher resolution, sharp slopes appeared and made the model crash. I set a larger epssm parameter in the namelist for d02 and it worked.
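For reference, epssm is a per-domain setting in the &dynamics section; the WRF default is 0.1, and larger values add extra damping of vertically propagating sound waves over steep terrain. The 0.5 below is only illustrative, as the post does not state the value that was actually used:

&dynamics
 epssm = 0.1, 0.5,
/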
Another suggestion was:

- slope_rad = 1: slope and shading effects. This option modifies the surface solar radiation flux according to the terrain slope. topo_shading = 1 additionally allows shadowing of neighboring grid cells. Use only for high-resolution runs with grid spacing less than a few kilometers.
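Both switches are per-domain entries in the &physics section. A sketch enabling them only for the 1 km nest:

&physics
 slope_rad    = 0, 1,
 topo_shading = 0, 1,
/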

Glad the configuration works.