
ndown corrupted outputs

Arty

Member
Hello,

This post is a follow-up to that one.

Despite there being no trace of errors in the rsl* files from ndown.exe, its outputs appear to be corrupted for some variables (PSFC and other pressure fields, QVAPOR, ...) but not for others (T2, TSK, SST, U10, V10, which unfortunately are the ones I focused on at first glance). To illustrate the problem, I attached ncview screenshots of the wrfndi_d02 file (input to ndown) and of the wrfinput_d02 file (output of ndown), respectively, for the PSFC variable:

[Screenshot: PSFC in wrfndi_d02 (input to ndown)]
[Screenshot: PSFC in wrfinput_d02 (output of ndown)]

As one can see, the pressure values in the ndown.exe output are far off and strangely structured.

The ndown input files, i.e. the previous run's wrfout* files, are also all fine:

[Screenshot: PSFC in a wrfout_d02 file from the previous coarse run]
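As a side note, the same comparison can be made without ncview. This is just a sketch; it assumes the NCO tool ncwa and netCDF's ncdump are installed, and the psfc_*.nc output names are placeholders of mine:
Code:
# Hypothetical quick check of the PSFC min/max in the ndown input vs. output
ncwa -O -y min -v PSFC wrfndi_d02   psfc_min_ndi.nc && ncdump psfc_min_ndi.nc
ncwa -O -y max -v PSFC wrfndi_d02   psfc_max_ndi.nc && ncdump psfc_max_ndi.nc
ncwa -O -y min -v PSFC wrfinput_d02 psfc_min_out.nc && ncdump psfc_min_out.nc
ncwa -O -y max -v PSFC wrfinput_d02 psfc_max_out.nc && ncdump psfc_max_out.nc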

I'm still working on switching from MODIS to USGS... But is there anything else in the namelist, or in the lists of variables extracted from the wrfinput and wrfout files of the previous coarse run (see attached files), that could be related to this problem?

Here's a (filtered) diff between the variable lists:
Code:
diff wrfinput_variables_list.txt wrfoutput_variables_list.txt | grep -v 'CEN\|DX\|DY\|WEST\|SOUTH\|LAT\|DZ\|I_P\|J_P\|STAND\|DT'
1,3c1,11
< :AERCU_FCT = 1.f ;
< :AERCU_OPT = 0 ;
< :AUTO_LEVELS_OPT = 2 ;
---
> :AER_ANGEXP_OPT = 1 ;
> :AER_ANGEXP_VAL = 1.3f ;
> :AER_AOD550_OPT = 1 ;
> :AER_AOD550_VAL = 0.12f ;
> :AER_ASY_OPT = 1 ;
> :AER_ASY_VAL = 0.f ;
> :AER_OPT = 0 ;
> :AER_SSA_OPT = 1 ;
> :AER_SSA_VAL = 0.f ;
> :AER_TYPE = 1 ;
10,11c18,22
---
> :BUCKET_J = -1.f ;
> :BUCKET_MM = -1.f ;
15,16c26,28
< :DIFF_6TH_SLOPEOPT = 0 ;
< :DIFF_6TH_THRESH = 0.1f ;
---
> :DFI_OPT = 0 ;
> :DIFF_6TH_FACTOR = 0.12f ;
> :DIFF_6TH_OPT = 0 ;
18,24c30,33
< :ETAC = 0.f ;
---
> :FEEDBACK = 1 ;
28c37,38
< :GMT = 6.f ;
---
> :GMT = 0.f ;
> :GRAV_SETTLING = 0 ;
33,34d42
< :GWD_OPT = 0 ;
< :HYBRID_OPT = 0 ;
36,39c44,51
< :IDEAL_CASE = 0 ;
< :ISICE = 15 ;
< :ISLAKE = 21 ;
---
> :ICLOUD = 1 ;
> :ICLOUD_CU = 0 ;
> :ISFFLX = 1 ;
> :ISFTCFLX = 1 ;
> :ISHALLOW = 1 ;
> :ISICE = 24 ;
> :ISLAKE = -1 ;
41,44c53,56
< :ISURBAN = 13 ;
< :ISWATER = 17 ;
< :JULDAY = 33 ;
---
> :ISURBAN = 1 ;
> :ISWATER = 16 ;
> :JULDAY = 1 ;
51,52c63,66
< :MMINLU = "MODIFIED_IGBP_MODIS_NOAH" ;
---
> :MFSHCONV = 0 ;
> :MMINLU = "USGS" ;
> :MOIST_ADV_OPT = 1 ;
54,55c68,70
< :NUM_LAND_CAT = 21 ;
< :pARENT_GRID_RATIO = 3 ;
---
> :NUM_LAND_CAT = 24 ;
> :OBS_NUDGE_OPT = 0 ;
> :pARENT_GRID_RATIO = 5 ;
58a74,75
60a78,79
> :SCALAR_ADV_OPT = 2 ;
> :SCALAR_PBLMIX = 0 ;
64d82
< :SF_SURFACE_MOSAIC = 0 ;
69,73c87,92
---
> :SHCU_PHYSICS = 2 ;
> :SMOOTH_OPTION = 2 ;
77,78c96,98
< :START_DATE = "2005-02-02_06:00:00" ;
---
> :START_DATE = "2005-01-01_00:00:00" ;
> :STOCH_FORCE_OPT = 0 ;
80,81c100,105
< :TITLE = " OUTPUT FROM REAL_EM V4.2.1 PREPROCESSOR" ;
---
> :SWINT_OPT = 0 ;
> :SWRAD_SCAT = 1.f ;
> :TITLE = " OUTPUT FROM WRF V3.6.1 MODEL" ;
> :TKE_ADV_OPT = 2 ;
> :TRACER_PBLMIX = 1 ;
83,88c107,110
< :USE_MAXW_LEVEL = 0 ;
< :USE_THETA_M = 0 ;
< :USE_TROP_LEVEL = 0 ;
---
> :W_DAMPING = 0 ;
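In case it's useful, here is how one could pull just the land-use-related global attributes flagged by this diff directly from the files (a sketch assuming ncdump is available; the attribute selection is mine):
Code:
# Hypothetical check of the land-use attributes that differ above
ncdump -h wrfout_d02_2005-02-02.nc | grep -iE 'MMINLU|NUM_LAND_CAT|ISWATER|ISICE|ISURBAN'
ncdump -h wrfinput_d02             | grep -iE 'MMINLU|NUM_LAND_CAT|ISWATER|ISICE|ISURBAN'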

FYI: I successfully ran the whole process (WPS/real/ndown/WRF) with another test configuration based only on CFSR reanalysis, with the exact same namelist parameters for the coarse and fine runs, as well as the same WRF version. So the problem seems to come up within ndown due to some incompatibility between the old coarse run's wrfout* files (WRF 3.6.1) and my current configuration (WRF 4.2.1).
 

Attachments

  • namelist.input.real.txt
    8.8 KB · Views: 6
  • wrfinput_variables_list.txt
    2.3 KB · Views: 3
  • wrfout_variables_list.txt
    2.6 KB · Views: 2
Hello,

I uploaded 3 of the wrfout files I'm using to run ndown to the server: wrfout_d02_2005-02-02.nc to *04.nc. I would really appreciate it if these tests could be run on a setup other than mine and, above all, by someone more experienced. In this post I also attached the original namelist for this domain; the namelist I use for my configuration is attached in the previous post. I use 38-level CFSR data to run WPS/real. If needed, I can also upload the geogrid and met_em files.

Thanks for your help.
 

Attachments

  • namelist.base.atm_d02.txt
    7.6 KB · Views: 2
Hi,
Can you also upload all the following?
namelist.wps (for both domains, if you have them)
met_em* files (for both domains, if you have them)

Thanks!
 
Hi,

I uploaded 2 tar files on the cloud: met_em(2).tar and namelists.wps.tar.

Thank you
 
Thank you for sending everything. I was able to test your case with your files and I don't seem to be having the trouble you are. I'm attaching all the namelists I used and will explain how I did everything. I used WRF v4.4.2.

1. Renamed your wrfout* files to the format expected by ndown, with the correct starting date, and from d02 to d01. Put these files in my WRF/test/em_real directory.
Code:
mv wrfout_d02_2005-02-02.nc wrfout_d01_2005-02-02_06:00:00
mv wrfout_d02_2005-02-03.nc wrfout_d01_2005-02-03_06:00:00
mv wrfout_d02_2005-02-04.nc wrfout_d01_2005-02-04_06:00:00

2. Linked your met_em* files to the WRF/test/em_real directory and ran real with 18 processors (using namelist.input.real, renamed to namelist.input for this step, of course).
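Roughly, the commands for this step would look like the following (the met_em path and the mpirun launcher are assumptions; adapt to your system):
Code:
# Hypothetical commands for step 2
ln -sf /path/to/met_em.d0* .          # link met_em* files into WRF/test/em_real
cp namelist.input.real namelist.input
mpirun -np 18 ./real.exe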

3. Renamed wrfinput_d02 to wrfndi_d02
Code:
mv wrfinput_d02 wrfndi_d02

4. Ran ndown (using wrfndi_d02 and the renamed wrfout_d01* files you shared) with 36 processors, using namelist.input.ndown.
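In command form, this step would be roughly (again, the MPI launcher is an assumption):
Code:
# Hypothetical commands for step 4
cp namelist.input.ndown namelist.input
mpirun -np 36 ./ndown.exe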

5. Renamed wrfinput/wrfbdy/wrflowinp_d02 files to _d01
Code:
mv wrfbdy_d02 wrfbdy_d01
mv wrfinput_d02 wrfinput_d01
mv wrflowinp_d02 wrflowinp_d01

6. Ran wrf.exe for the nested domain after modifying the namelist to move everything for d02 over to d01. Ran with 144 processors, using namelist.input.wrfd02.
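And this last step would look roughly like (MPI launcher again an assumption):
Code:
# Hypothetical commands for step 6
cp namelist.input.wrfd02 namelist.input
mpirun -np 144 ./wrf.exe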


I took a look at the wrfinput_d02 file (the output from ndown.exe, renamed to wrfinput_d01) and PSFC looked fine. See the attached screenshot.
 

Attachments

  • namelist.input.real.txt
    8.7 KB · Views: 7
  • namelist.input.ndown.txt
    8.7 KB · Views: 11
  • namelist.input.wrfd02.txt
    8.7 KB · Views: 6
  • psfc_wrfinputd02.png
    699.5 KB · Views: 8
Thank you, that helps. First of all, it confirms I don't need the exact same namelist to make it work properly; hence I may change from ZM cu_physics to something else to prevent the ZM_CONV IENTROPY error I'm getting with v3.6.1. With your results and other suggestions I got from a peer, I've got many things to try out.
Maybe I'll come back with good news :)
 
I finally succeeded in getting a configuration that works through all the steps. I tried many; only one worked. See the attached table:
[Attached table: configurations tested]
I haven't yet dug deeper into the corrupted ndown output with full CFSR forcing; I'm focusing on the configuration that works (at last). Nevertheless, I suppose it comes from one or more variables in the namelist, because I tweaked it a lot. In the table below, I listed the problems I noticed, notably the corrupted variables (only zero values, no strange pattern like before under 4.2.1):
[Attached table: problems noticed, including corrupted variables]
I'll try other physics options later with the working configuration. For now I have another major problem, but I'll open a dedicated thread for that.
 
That's at least some great news! Thanks for the update and the detailed information about your tests.
 