
Help & tips using ndown

Arty

Member
Hello to the community,

First, I hope I posted this thread in the right section, as ndown sits (to me) somewhere in between WPS and WRF. I also ask for your indulgence, as I'm new to WRF and its whole environment (i.e. Linux, supercomputing); but, like all of you I bet, passion got me here and I'm willing to learn. This post lays out my project and may contain a lot of different questions, which I'll number to keep track of them.
I'd also like to thank you in advance for your help and advice.

Context:
High-resolution modeling over a tropical island in the central South Pacific. I am downscaling a previous WRF simulation (its wrfout files), d01 at 21 km (33 vertical levels), with ndown; this will serve as input to a suite of nested domains, d02 at 7 km (ratio 3) and d03 at 2.33 km (ratio 3). Vertical refinement by a factor of 2 is not ruled out yet. The previous WRF simulation was itself a downscaling, so let's call the "original" wrfouts d02* (star): my d01 = d02*.
I edited my namelist to define 3 domains (a sketch of the corresponding &domains entries is below).

d01 is set exactly like d02*, i.e.: dx = dy = 21 km; 441 x 166 grid points; 33 vertical levels
d02 is set this way: dx = dy = 7 km; 151 x 151 grid points; 33 vertical levels (or possibly 66 with refinement); i_start = 310; j_start = 66
d03 is set this way: dx = dy = 2.33 km; 151 x 151 grid points; 33 vertical levels (or possibly 66 with refinement); i_start = 51; j_start = 51
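To make the setup above concrete, here is a minimal sketch of what the &domains section could look like with these numbers; the time_step value and the decimal dx/dy for d03 are my own assumptions and would need to be adapted:

Code:
&domains
 max_dom                 = 3,
 time_step               = 120,                      ! assumption: roughly 6*dx(km) of d01
 e_we                    = 441,   151,    151,
 e_sn                    = 166,   151,    151,
 e_vert                  = 33,    33,     33,        ! or 66 on the nests if vertical refinement is used
 dx                      = 21000, 7000,   2333.33,
 dy                      = 21000, 7000,   2333.33,
 grid_id                 = 1,     2,      3,
 parent_id               = 1,     1,      2,
 i_parent_start          = 1,     310,    51,
 j_parent_start          = 1,     66,     51,
 parent_grid_ratio       = 1,     3,      3,
 parent_time_step_ratio  = 1,     3,      3,
/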

Figure 1: Screenshot from WRF Domain Wizard. The centers of domains d02 and d03 are superimposed.

For the time period, I chose 4 days to run the first tests.

As I understand it from the User's Guide and tutorial, I intend to run ndown on the wrfouts from the original simulation d02* (i.e. my d01), with a refinement ratio of 3, to prepare the input files needed to run my nested domains d02 and d03 in WRF. For now, I have identified the d02* wrfout files I'll use, placed them in the appropriate directory and renamed them wrfout.d01; but I can't yet figure out some points before going further.
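For reference, here is my (possibly incomplete) reading of the sequence described in the ndown tutorial, sketched as namelist comments; the io_form_auxinput2 entry is the one concrete namelist change I am aware of, and the step numbering is mine:

Code:
! Rough outline of the standard ndown workflow, as I understand the Users' Guide:
!  1. run geogrid/ungrib/metgrid and real.exe for the new domain(s) to create wrfinput_d02
!  2. rename wrfinput_d02 to wrfndi_d02
!  3. add the entry below to &time_control, then run ndown.exe
!     (it reads the coarse wrfout_d01* files plus wrfndi_d02 and writes wrfinput_d02 + wrfbdy_d02)
!  4. rename wrfinput_d02 -> wrfinput_d01 and wrfbdy_d02 -> wrfbdy_d01, then run wrf.exe for the fine grid
&time_control
 io_form_auxinput2 = 2,
/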

Questions:

0) Are my explanations clear enough?

1) Does my config look appropriate, i.e. is it compatible with using ndown to force 2 nested domains in WRF?
2) Why aren't the wrfouts from d02* the only input files I need to run ndown?
I don't have (yet...?) the intermediate files (FILE) specified in Step 1.2.1 of the ndown tutorial; moreover, I don't see the point of creating met_em files from them since I already have the meteorological data in the d02* wrfouts.

3) What if I don't manage to get the intermediate files from the previous simulation?

That's all for the moment. This might become a long-running thread where I'll post further questions as I progress.
Thanks again for your help.
 
Hi Arty!
I also think of myself as a beginner in WRF, but I might know a few things that could help.

I am currently studying how to run WRF-LES, and I saw that ndown.exe must be used in that procedure. WRF-LES is used for very fine resolutions (dx << 1 km, on the order of 200 m or less), so I'm not sure why you would use ndown. The procedure to run nested domains at the resolutions you are using is simpler! This link: Compiling WRF (in the section called "Run WPS and WRF") explains how to simulate with configurations like yours. It is pretty much like running a simulation with one domain.
Regarding your configuration, I have a question: why dx/dy equal to 21 km on the parent domain? Why not 25 km (with a ratio of 5, giving a 5 km nest) so you have integer resolutions (but that's just me)? Also, it is recommended to place your nested domains closer to the center, as the information coming from the boundary conditions is more reliable away from the edges of the parent domain.

If I misunderstood what you are doing, I'm sorry; English is not my mother tongue.
If I didn't misunderstand and you need more help, just ask :).
ALSO: thanks for the tutorial link you posted in your thread! I needed more information on how to run WRF-LES.
 

Hi Natalia, thanks for your message.

I've got no choice but to use the 21 km domain (the "original" d02*, equivalent to my d01), as those data were already produced in previous work, on which my own work is based. Ndown is intended precisely for this kind of simulation (dynamical downscaling using previous wrfouts as inputs).
That also explains why my domains d02 and d03 are not centered on d01: they're centered on the zone I'm interested in, and I can't change the grid parameters of d01 (= d02*, which already exists). I also took care to leave a buffer between domains and to respect other computing constraints.
 
Hi,
I believe the only way to use wrfout* files as input to WRF is to use the UPP utility to preprocess the data. They have an option to convert the files to GRIB1 data to be used during the WPS process. Take a look at this post and scroll to the bottom, where the UPP support person has provided a link to information about that process.
 
@Arty Thanks for your answer! I did not know that :)
 
@kwerner
Hi, I didn't know about the UPP option, but I'm a bit surprised that ndown isn't the recommended route: is there a specific reason?
Also, do the input files for WPS need to be GRIB1, or is GRIB2 OK? I read that UPP outputs GRIB2 by default, and Chapter 3 of the WRF User's Guide seems to confirm that GRIB2 can be used.
 
@Arty
Grib2 is perfectly fine too. I thought I had read that the UPP output would be in Grib1, but I suppose I read incorrectly. Once you have Grib data, you'll run through WPS as you would for any other case. You can set your domain over the area you are actually interested in, and shouldn't need to use the outer large domain that covers the data you have. You will then use the met_em* files to run real and wrf. The only reason you would need to use ndown is if the sizes of your domains constrain the number of processors you are able to use (see this FAQ, if you haven't already).
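To illustrate that route, a minimal namelist.wps sketch for the ungrib/metgrid part might look like the following; the dates, interval_seconds and the 'FILE' prefix are placeholders, the &geogrid section (which carries the domain definitions) is omitted, and the Vtable appropriate for UPP output is something to verify rather than a given:

Code:
&share
 wrf_core         = 'ARW',
 max_dom          = 3,                    ! or 2 if the outer 21 km domain is dropped, as suggested above
 start_date       = '2023-04-01_00:00:00', '2023-04-01_00:00:00', '2023-04-01_00:00:00',   ! placeholder dates
 end_date         = '2023-04-05_00:00:00', '2023-04-05_00:00:00', '2023-04-05_00:00:00',
 interval_seconds = 21600,                ! must match the output interval of the GRIB files
/
&ungrib
 out_format = 'WPS',
 prefix     = 'FILE',
/
&metgrid
 fg_name         = 'FILE',
 io_form_metgrid = 2,
/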
 
@kwerner
Thank you. I had a look at UPP and I might use that solution in the end, but since I already started a configuration intended for ndown, I'd like to go through the full process now to learn and to debug errors.
Nevertheless, do you think using UPP would be less time- and compute-consuming than using ndown? In the end, the results are almost the same: the data are interpolated/formatted to fit WRF's input requirements.
Also, I don't understand your remark about the constraint on the number of processors. Could you please explain further?
 
As I previously stated, I think the only way to use wrfout* data as input to WRF is to go through the UPP process, so if those are the data files you must use, then I think you will have to do that, unfortunately.

Regarding the number of processors, the reasonable number to use for each domain is based on the size of each domain (e_we and e_sn). If one domain needs a lot of processors to run because it's so large, but another domain is unable to run with that many processors because it's much smaller, then you cannot run the domains together as a simultaneous nest. The FAQ link I shared above explains the concept in detail.
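As a rough worked illustration of that constraint, assuming the FAQ's rule of thumb is approximately (e_we/100) x (e_sn/100) processors at the low end and (e_we/25) x (e_sn/25) at the high end, and taking the domain sizes quoted earlier in this thread:

Code:
! assumed rule of thumb (see the FAQ linked above for the authoritative version):
!   fewest processors ~ (e_we/100) * (e_sn/100)
!   most processors   ~ (e_we/25)  * (e_sn/25)
! d01 (441 x 166):      fewest ~ 4 x 2 = 8,   most ~ 17 x 6 = 102
! d02, d03 (151 x 151): fewest ~ 2 x 2 = 4,   most ~  6 x 6 =  36
! The ranges overlap (roughly 8 to 36 processors satisfies all three domains),
! so this particular setup does not look processor-constrained.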
 

Sorry to bring up an old post, but I noticed a few things.

Code:
io_form_history (default = 2)   ! I/O format of history output file(s); single entry
  = 2   : netCDF
  = 102 : split netCDF files, one per processor; must restart with the same number of processors
  = 1   : binary format; note: no supported post-processing software available
  = 4   : PHDF5; note: no supported post-processing software available
  = 5   : GRIB1
  = 10  : GRIB2
  = 11  : parallel netCDF

Code:
io_form_history                      = 10,                                       ! format of history files
io_form_restart                      = 10,                                       ! format of restart files
io_form_input                        = 10,                                       ! format of input files
io_form_boundary                     = 10,                                       ! format of boundary files

Would changing all of these io_form entries to GRIB2 (= 10) mean that WRF can run on its own output?
 