NaNs at Timestep 0 for 60-km – 15-km mesh (Tropical refinement) with Noah-MP Longwave error

gokulvish

New member
Hi everyone,

I am currently running MPAS-A v8.3.1 for a 2-year global run using ERA5 as the initial conditions. I downloaded ERA5 (from the Copernicus CDS) in GRIB format and used WPS to create the intermediate files. For SST and sea ice, I am using the same ERA5 dataset (in NetCDF format) at 6-hourly intervals and created the intermediates with the pywinter Python package.

This is my first attempt using this IC and lower boundary condition setup for a variable resolution mesh (15km tropical refinement: x4.1572866.grid.nc). This exact same methodology, with the same IC and LBC sources, worked perfectly well for a long-term run on a 120km quasi-uniform mesh.

These are the steps I followed:

  1. Created the static file from the downloaded static terrestrial data (after setting config_geog_path).
  2. Created the init file with the right prefix, and similarly created the lower boundary update file (SST).
  3. Ran the atmosphere_model.
The error: The model crashes almost immediately at 00:00:00 (within the first 10 seconds). In log.atmosphere.0000.out, the vertical and horizontal winds instantly blow up into NaNs:


global min, max w NaN NaN
global min, max u NaN NaN

Because the atmosphere instantly generates NaN temperatures and winds, these are passed down to the land surface model. Noah-MP (running with config_noahmp_iopt_dveg = 1; I also tried 4, the default) then halts with:


emitted longwave <0; skin T may be wrong due to inconsistent
input of VegFracGreen with LeafAreaIndex
STOP Error: Longwave radiation budget problem in NoahMP LSM
STOP

(Note: I checked init.nc with ncdump and there are no NaNs in the initial fields such as theta or w, so the initialization file itself is not corrupted. The NaNs are generated dynamically right at startup.)
Could somebody from the team help me with this? I even tried Noah as the LSM, and it still failed. Any help would be appreciated. Thanks very much.
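For anyone wanting a more thorough check than eyeballing ncdump, a quick sketch for scanning arrays for non-finite values (numpy only; in practice you would fill the dict from init.nc, e.g. with the netCDF4 package, and the variable names shown are just examples):

```python
import numpy as np

def first_nonfinite(fields):
    """Return (name, index) of the first NaN/Inf in a dict of arrays, or None."""
    for name, arr in fields.items():
        bad = ~np.isfinite(arr)
        if bad.any():
            # coordinates of the first offending element
            return name, np.unravel_index(np.argmax(bad), arr.shape)
    return None

# e.g. with netCDF4: fields = {v: ds[v][:] for v in ("theta", "w", "rho")}
demo = {"theta": np.array([[300.0, 301.0], [np.nan, 299.0]])}
print(first_nonfinite(demo))  # -> ('theta', (1, 0))
```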
 
The model crashed immediately after startup, which suggests that the input data are problematic.

I am aware of some issues when using ERA5 data from Copernicus. This is why we created a Python script specifically for processing ERA5 data.

Would you please download the ERA5 data from NCAR RDA (d633000 and d633006) and then run the script we created?

The code can be downloaded from GitHub: NCAR/era5_to_int, a simple Python script for converting ERA5 model-level netCDF files to the WPS intermediate format. Please follow the instructions to run this script and create the input data for MPAS.

Let me know if you have any issues.
 
@gokulvish Could you also copy-paste the contents of your namelist.atmosphere file into a new post (the uploading of attachments is currently not working on the forum)?
 
Hi Ming,
Thanks for getting back and helping me out with this. I will follow your steps and will get back to you if that resolves the issue.
But I am just a little curious whether it is actually the spatial resolution of ERA5 that is causing the model to fail, because when I tried running the quasi-uniform meshes at 120 km, 60 km, and 48 km (all coarser than the native ERA5 resolution), all downloaded from Copernicus and ungribbed with WPS, the model runs smoothly (same physics settings; only the mesh resolution changed). The issue occurs when I use a variable-resolution mesh, such as 92–25 km or, as in this case, the tropical refinement, with the same IC and LBC. Anyway, I will give your method a try and get back to you. @mgduda, I will paste the contents of my namelist.atmosphere into a new post. Thanks for the help.
 
Here is the namelist.atmosphere -
&nhyd_model
config_time_integration_order = 2
config_dt = 10.0 ! have tried values from 60 all the way down to 10; it does not seem to be a CFL issue
config_start_time = '2009-11-01_00:00:00'
config_run_duration = '2_00:00:00'
config_split_dynamics_transport = true
config_number_of_sub_steps = 10 ! have tweaked this value as well
config_dynamics_split_steps = 3
config_horiz_mixing = '2d_smagorinsky'
config_visc4_2dsmag = 0.05 ! tweaked this one as well
config_scalar_advection = true
config_monotonic = true
config_coef_3rd_order = 0.25
config_epssm = 0.9
config_smdiv = 0.3
/
&damping
config_zd = 22000.0
config_xnutr = 0.2
/
&limited_area
config_apply_lbcs = false
/
&io
config_pio_num_iotasks = 64
config_pio_stride = 16
/
&decomposition
config_block_decomp_file_prefix = 'x4.1572866.graph.info.part.'
/
&restart
config_do_restart = false
/
&printout
config_print_global_minmax_vel = true
config_print_detailed_minmax_vel = false
/
&IAU
config_IAU_option = 'off'
config_IAU_window_length_s = 21600.
/
&physics
config_sst_update = true ! intend to run a long climate run, hence true; currently testing for only a couple of days
config_sstdiurn_update = false
config_deepsoiltemp_update = false
config_radtlw_interval = '00:30:00'
config_radtsw_interval = '00:30:00'
config_bucket_update = 'none'
config_physics_suite = 'mesoscale_reference'
config_lsm_scheme = 'sf_noahmp'
/
&soundings
config_sounding_interval = 'none'
/
&physics_lsm_noahmp
config_noahmp_iopt_dveg = 1 ! this was 4 before; is it worth tweaking?
config_noahmp_iopt_crs = 1
config_noahmp_iopt_btr = 1
config_noahmp_iopt_runsrf = 3
config_noahmp_iopt_runsub = 3
config_noahmp_iopt_sfc = 1
config_noahmp_iopt_frz = 1
config_noahmp_iopt_inf = 1
config_noahmp_iopt_rad = 3
config_noahmp_iopt_alb = 1
config_noahmp_iopt_snf = 1
config_noahmp_iopt_tksno = 1
config_noahmp_iopt_tbot = 2
config_noahmp_iopt_stc = 1
config_noahmp_iopt_gla = 1
config_noahmp_iopt_rsf = 4
config_noahmp_iopt_soil = 1
config_noahmp_iopt_pedo = 1
config_noahmp_iopt_crop = 0
config_noahmp_iopt_irr = 0
config_noahmp_iopt_irrm = 0
config_noahmp_iopt_infdv = 1
config_noahmp_iopt_tdrn = 0
/
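As a sanity check on config_dt: a commonly quoted MPAS rule of thumb is roughly 5–6 seconds of timestep per kilometre of the finest cell spacing, so ~90 s should already be safe for a 15-km refinement region (a back-of-the-envelope sketch; the factor of 6 is a guideline, not a hard CFL limit):

```python
def suggested_dt(min_cell_km, sec_per_km=6.0):
    """Rule-of-thumb MPAS timestep: ~6 s per km of the finest mesh spacing."""
    return sec_per_km * min_cell_km

# finest cell spacings of the meshes discussed in this thread
for dx in (15.0, 60.0, 120.0):
    print(f"{dx:6.1f} km  ->  dt ~ {suggested_dt(dx):5.0f} s")
```

By this guideline, dt = 10 s on a 15-km mesh is already an order of magnitude below the stability limit, consistent with the poster's conclusion that this is not a CFL problem.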
 
Thank you for giving me more details about the ERA5 implementation.

To address your question about whether the ERA5 resolution caused the model crash: no, I don't think resolution is the issue here.

It is strange that ERA5 works for the coarse-resolution MPAS initialization but not for the variable-resolution case. Would you please upload a single ERA5 file from Copernicus and the corresponding intermediate file for me to take a look?

Because of a web issue, files cannot currently be uploaded to the forum. It is better to share your data with me via Google Drive. My email is chenming@ucar.edu.

Thank you.

 
Hi Ming,
Thanks for getting back. I have emailed you with the information and files as requested.

Again, thanks very much for your help with this.
 
Hi Gokul,

Thank you for emailing all these files for me to take a look at. In your email, you stated that "with a 48 km and 60 km quasi-uniform mesh, it doesn't work with ERA5 inputs from Copernicus" --- this further confirms that your input data are problematic.

I checked your ERA5 intermediate files, and they all look fine to me. In your data processing, it is correct to scale the sea-ice values to the 0–1 range.

However, I am concerned about the land-sea mask in the ERA5 data, which has fractional values instead of strictly 0 or 1. This may cause issues later when the model tries to determine whether a point is water or land, and it could explain why your case crashed with a problem in the Noah-MP LSM.

Basically, I believe your problem is related to the input data; the model itself should work as expected. Please try the Python script we provide, which works well with ERA5 data.
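Enforcing a strictly binary mask before writing the intermediates can be sketched in a few lines (numpy; the 0.5 threshold is an assumption, pick whatever convention your workflow uses):

```python
import numpy as np

def binarize_landmask(lsm, threshold=0.5):
    """Convert a fractional ERA5 land-sea mask to strict 0/1 (1 = land)."""
    return np.where(lsm >= threshold, 1.0, 0.0)

def clip_seaice(ci):
    """Keep the sea-ice fraction inside the 0-1 range expected by MPAS."""
    return np.clip(ci, 0.0, 1.0)

lsm = np.array([0.0, 0.2, 0.5, 0.97])
print(binarize_landmask(lsm))  # -> [0. 0. 1. 1.]
```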
 
Hi Ming,
Thanks for all your help and feedback on this. I downloaded the model-level and surface fields for ERA5 (LSM, SKIN, SST, CI, SP, Q, U, V, T) from the NCAR RDA website and created the intermediates using the Python script. The script also computed the geopotential height at all model levels. However, the model still failed to run and crashed on the first timestep.

From log.atmosphere:
--- subroutine MPAS_to_phys - pressure(1) < pressure(2):
i =1051
latCell=29.5900
lonCell=313.953
1 1051 1 46.9370 99679.8 NaN NaN NaN NaN NaN NaN
1 1051 2 59.3026 98958.7 NaN NaN NaN NaN NaN NaN

I have emailed you the new ERA5 intermediates and a sample SST intermediate, along with the namelist.init_atmosphere for case 7. This appears to be an init-file problem, but I'm not sure what it is. Something wrong with the lower-level fields during interpolation, or what? Not sure. Could you please help me with this?
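The MPAS_to_phys message above indicates a column where pressure fails to decrease monotonically with height. A quick way to find such columns in an init file is sketched below (numpy; the (ncells, nlevels) dimension ordering is an assumption, adjust to how your file stores levels):

```python
import numpy as np

def bad_pressure_columns(p):
    """p: (ncells, nlevels) pressure with level 1 at the bottom.
    Return indices of columns where pressure does not strictly decrease upward."""
    dp = np.diff(p, axis=1)          # p[k+1] - p[k]; should be negative everywhere
    return np.where((dp >= 0).any(axis=1))[0]

p = np.array([[100000.0, 98000.0, 95000.0],
              [ 99679.8, 98958.7, 99000.0]])   # second column inverted aloft
print(bad_pressure_columns(p))  # -> [1]
```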
 
Hi Gokul,

Before I explore further what is wrong in your case, can you clarify whether you ran calc_ecmwf_p.exe to process the ECMWF model-level data?

Is there any special reason you have to use model-level data? If you use pressure-level data as input, you can skip running calc_ecmwf_p.exe.
 
Hi Ming,
Thanks for getting back. Yes, I did do that for the model-level fields. These are the steps I followed:

1. In the current working directory, I have the required .ml files, the surface invariant files (LSM and Z), and the surface fields for Nov 1, 2009.
2. Before running era5_to_int.py, for the SST file I blend skin temperature over land with SST over ocean using the LSM, just to be on the safe side.
3. I then run era5_to_int.py. Since the LSM file is valid for 1979 and the invariant file containing Z is valid for 2016, I changed their time variables so they match the other surface and .ml fields for the same timestep (Nov 1, 2009 00Z). The output from this step looks like ERA5:2009-11-01_00.
4. For this ERA5:* file, I run calc_ecmwf_p.exe to create the pressure fields: PRES:2009-11-01_00.
5. Since MPAS needs both the 3-D atmosphere (ERA5:) and the pressure information (PRES:), the only option I had was to cat ERA5: PRES: > MERGED:XXX. This is the prefix I use in namelist.init_atmosphere to make the init.nc file.
6. I follow the same steps for SST, sea ice, and LSM to make the SST:* files for the lower boundary conditions.
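The blend in step 2 above can be sketched as a simple mask-weighted average (numpy; this assumes the mask and fields share the same grid and that lsm = 1 means land):

```python
import numpy as np

def blend_sst(skt, sst, lsm):
    """Use skin temperature over land and SST over ocean.
    lsm: land fraction (1 = land); this also replaces any SST fill
    values over land with a physically sensible temperature."""
    return lsm * skt + (1.0 - lsm) * sst

skt = np.array([290.0, 295.0])
sst = np.array([288.0, 288.0])
lsm = np.array([1.0, 0.0])     # first point land, second ocean
print(blend_sst(skt, sst, lsm))  # -> [290. 288.]
```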
For debugging, I have tried multiple things: lowering dt, setting config_len_disp = 15000, and tweaking config_number_of_sub_steps. I ran it for a few minutes and even a couple of days, but unfortunately it hits the same error every time. I have emailed you the detailed log files, the intermediates, and the namelist for your reference.
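For what it's worth, the cat merge in step 5 is legitimate, since WPS intermediate files are plain record streams that concatenate cleanly. A shell sketch (the MERGED prefix and the tiny stand-in files are illustrative only; real intermediates are binary):

```shell
# demo setup: two tiny stand-in "intermediate" files
printf 'era5-records' > ERA5:2009-11-01_00
printf 'pres-records' > PRES:2009-11-01_00

# merge the 3-D fields and the diagnosed pressure for every time in the run
for f in ERA5:*; do
  t="${f#ERA5:}"                        # e.g. 2009-11-01_00
  cat "ERA5:${t}" "PRES:${t}" > "MERGED:${t}"
done
```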


I am not sure what exactly is causing the issue.

Thanks very much again.
 
Hi, Gokul,
Let's start with the ERA5 pressure-level data; with it, I hope you'll be able to get MPAS working. Below is the list of files required for running era5_to_int.py. Can you download these data from NCAR RDA and then run era5_to_int.py to create the intermediate files? Note that the files listed are for 2019-11-01_00; you will need to change the dates to match your case.

1. /glade/campaign/collections/rda/data/d633006/e5.oper.invariant/e5.oper.invariant.128_129_z.regn320sc.2016010100_2016010100.nc
2. /glade/campaign/collections/rda/data/d633006/e5.oper.an.ml/201911/e5.oper.an.ml.128_134_sp.regn320sc.2019110100_2019110105.nc
3. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_151_msl.ll025sc.2019110100_2019113023.nc
4. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_141_sd.ll025sc.2019110100_2019113023.nc
5. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_033_rsn.ll025sc.2019110100_2019113023.nc
6. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_166_10v.ll025sc.2019110100_2019113023.nc
7. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_165_10u.ll025sc.2019110100_2019113023.nc
8. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_168_2d.ll025sc.2019110100_2019113023.nc
9. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_167_2t.ll025sc.2019110100_2019113023.nc
10. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_031_ci.ll025sc.2019110100_2019113023.nc
11. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_236_stl4.ll025sc.2019110100_2019113023.nc
12. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_183_stl3.ll025sc.2019110100_2019113023.nc
13. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_170_stl2.ll025sc.2019110100_2019113023.nc
14. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_139_stl1.ll025sc.2019110100_2019113023.nc
15. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_042_swvl4.ll025sc.2019110100_2019113023.nc
16. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_041_swvl3.ll025sc.2019110100_2019113023.nc
17. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_040_swvl2.ll025sc.2019110100_2019113023.nc
18. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_039_swvl1.ll025sc.2019110100_2019113023.nc
19. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_235_skt.ll025sc.2019110100_2019113023.nc
20. /glade/campaign/collections/rda/data/d633000/e5.oper.an.sfc/201911/e5.oper.an.sfc.128_034_sstk.ll025sc.2019110100_2019113023.nc
21. /glade/campaign/collections/rda/data/d633000/e5.oper.invariant/197901/e5.oper.invariant.128_172_lsm.ll025sc.1979010100_1979010100.nc
22. /glade/campaign/collections/rda/data/d633000/e5.oper.an.pl/201911/e5.oper.an.pl.128_131_u.ll025uv.2019110100_2019110123.nc
23. /glade/campaign/collections/rda/data/d633000/e5.oper.an.pl/201911/e5.oper.an.pl.128_130_t.ll025sc.2019110100_2019110123.nc
24. /glade/campaign/collections/rda/data/d633000/e5.oper.an.pl/201911/e5.oper.an.pl.128_133_q.ll025sc.2019110100_2019110123.nc
25. /glade/campaign/collections/rda/data/d633000/e5.oper.an.pl/201911/e5.oper.an.pl.128_129_z.ll025sc.2019110100_2019110123.nc
26. /glade/campaign/collections/rda/data/d633000/e5.oper.an.pl/201911/e5.oper.an.pl.128_132_v.ll025uv.2019110100_2019110123.nc
 
Thanks, Ming. I followed your suggestion and used the .pl files and invariant files (plus the one .ml file for sp) that you listed above, and created the intermediates with era5_to_int.py. However, the model still crashed with the same error. A little frustrating.


Could you please confirm whether people have used this custom mesh (60–15 km with tropical refinement) and succeeded in running the model? I am sorry, but I haven't seen many papers using this custom mesh. Any leads would be appreciated. @mgduda @Ming Chen

Thanks, Gokul
 