NaNs at Timestep 0 for 60-km – 15-km mesh (Tropical refinement) with Noah-MP Longwave error

gokulvish

Hi everyone,

I am currently running MPAS-A v8.3.1 for a 2-year global run using ERA5 as the initial condition. I downloaded ERA5 (from the Copernicus CDS) in GRIB format and used WPS to create the intermediate files. For SST and sea-ice, I am using the same ERA5 dataset (in NetCDF format) at 6-hourly intervals and created the intermediates with the pywinter Python package.

This is my first attempt using this IC and lower boundary condition setup for a variable resolution mesh (15km tropical refinement: x4.1572866.grid.nc). This exact same methodology, with the same IC and LBC sources, worked perfectly well for a long-term run on a 120km quasi-uniform mesh.

These are the steps I followed:

  1. Made the static file using the downloaded static terrestrial data, modified config_geog_path, and successfully created the static file.
  2. Created the init file with the right prefix, and similarly created the lower boundary update file (SST).
  3. Ran the atmosphere_model.
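For readers following along, steps 1 and 2 are driven by namelist.init_atmosphere. A rough sketch of the groups involved is below; the path, interval, and flag values are illustrative placeholders, not taken from my setup, and the preproc flags are toggled differently for the static, init, and SST passes:

```
&nhyd_model
    config_init_case = 7
/
&data_sources
    config_geog_data_path = '/path/to/WPS_GEOG/'
    config_met_prefix = 'FILE'
    config_sfc_prefix = 'SST'
    config_fg_interval = 21600
/
&preproc_stages
    config_static_interp = false    ! true only for the static-file pass
    config_vertical_grid = true
    config_met_interp = true
    config_input_sst = false        ! true when generating the SST update stream
    config_frac_seaice = true
/
```

For the SST/sea-ice update file, config_input_sst is set to true and the interpolation stages for the atmospheric fields are switched off.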
The error: The model crashes almost immediately at 00:00:00 (within the first 10 seconds of model time). In log.atmosphere.0000.out, the vertical and horizontal winds instantly blow up into NaNs:


global min, max w NaN NaN
global min, max u NaN NaN

Because the atmosphere instantly generates NaN air temperatures and winds, it passes these down to the land surface model, and Noah-MP (running with config_noahmp_iopt_dveg = 1; I tried 4, the default, as well) halts with:


emitted longwave <0; skin T may be wrong due to inconsistent
input of VegFracGreen with LeafAreaIndex
STOP Error: Longwave radiation budget problem in NoahMP LSM

(Note: I checked init.nc with ncdump and there are no NaNs in the initial fields such as theta or w, so the initialization file itself is not corrupted; the NaNs are generated dynamically right at startup.)
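As an aside, eyeballing ncdump output can miss isolated NaNs in large fields. A small stdlib-only helper scans every value regardless of how the field was read; in practice you would pass it arrays read from init.nc with a package such as netCDF4 or xarray (an assumption here, not something from this post), but it is shown on plain lists:

```python
import math

def count_nans(values):
    """Recursively count NaNs in a (possibly nested) sequence of floats."""
    if isinstance(values, float):
        return 1 if math.isnan(values) else 0
    if isinstance(values, (list, tuple)):
        return sum(count_nans(v) for v in values)
    return 0

# Toy stand-ins for fields like theta or w read from init.nc
theta = [[300.0, 301.5], [299.8, 300.2]]     # healthy field
w_bad = [[0.0, float('nan')], [0.01, 0.02]]  # one contaminated value

print(count_nans(theta))  # 0
print(count_nans(w_bad))  # 1
```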
Could somebody from the team help me with this? I also tried Noah as the LSM, and it still failed. Any help would be appreciated. Thanks very much.
 
The model crashed immediately after it started, which suggests that the input data is problematic.

I am aware of some issues when using ERA5 data from Copernicus. This is why we created a Python script specifically for processing ERA5 data.

Would you please download the ERA5 data from the NCAR RDA (d633000 and d633006) and then run the script we created?

The code can be downloaded from GitHub at NCAR/era5_to_int ("A simple Python script for converting ERA5 model-level netCDF files to the WPS intermediate format"). Please follow the instructions in the repository to run the script and create the input data for MPAS.

Let me know if you have any issues.
 
@gokulvish Could you also copy-paste the contents of your namelist.atmosphere file into a new post (the uploading of attachments is currently not working on the forum)?
 
Hi Ming,
Thanks for getting back and helping me out with this. I will follow your steps and will get back to you if that resolves the issue.
But I am just a little curious whether it is actually the spatial resolution of ERA5 that is causing the model to fail: when I ran the quasi-uniform meshes at 120 km, 60 km, and 48 km (all coarser than the native ERA5 resolution), all with data downloaded from Copernicus and ungribbed with WPS, the model ran smoothly (same physics settings; only the mesh resolution changed). The issue occurs when I use a variable-resolution mesh, such as the 92-25 km mesh or, as in this case, the tropical refinement, with the same IC and LBC. Anyway, I will give your method a try and get back to you. @mgduda, I will paste the contents of my namelist.atmosphere into a new post. Thanks for the help.
 
Here is the namelist.atmosphere -
&nhyd_model
config_time_integration_order = 2
config_dt = 10.0 ! have played with values from 60 all the way down to 10; it does not seem to be a CFL issue
config_start_time = '2009-11-01_00:00:00'
config_run_duration = '2_00:00:00'
config_split_dynamics_transport = true
config_number_of_sub_steps = 10 ! have tweaked this value as well
config_dynamics_split_steps = 3
config_horiz_mixing = '2d_smagorinsky'
config_visc4_2dsmag = 0.05 ! tweaked this one as well
config_scalar_advection = true
config_monotonic = true
config_coef_3rd_order = 0.25
config_epssm = 0.9
config_smdiv = 0.3
/
&damping
config_zd = 22000.0
config_xnutr = 0.2
/
&limited_area
config_apply_lbcs = false
/
&io
config_pio_num_iotasks = 64
config_pio_stride = 16
/
&decomposition
config_block_decomp_file_prefix = 'x4.1572866.graph.info.part.'
/
&restart
config_do_restart = false
/
&printout
config_print_global_minmax_vel = true
config_print_detailed_minmax_vel = false
/
&IAU
config_IAU_option = 'off'
config_IAU_window_length_s = 21600.
/
&physics
config_sst_update = true ! intend to run a long climate run, hence true, but now testing it for only a couple of days
config_sstdiurn_update = false
config_deepsoiltemp_update = false
config_radtlw_interval = '00:30:00'
config_radtsw_interval = '00:30:00'
config_bucket_update = 'none'
config_physics_suite = 'mesoscale_reference'
config_lsm_scheme = 'sf_noahmp'

/
&soundings
config_sounding_interval = 'none'
/
&physics_lsm_noahmp
config_noahmp_iopt_dveg = 1 ! this was 4 before; is it worth tweaking?
config_noahmp_iopt_crs = 1
config_noahmp_iopt_btr = 1
config_noahmp_iopt_runsrf = 3
config_noahmp_iopt_runsub = 3
config_noahmp_iopt_sfc = 1
config_noahmp_iopt_frz = 1
config_noahmp_iopt_inf = 1
config_noahmp_iopt_rad = 3
config_noahmp_iopt_alb = 1
config_noahmp_iopt_snf = 1
config_noahmp_iopt_tksno = 1
config_noahmp_iopt_tbot = 2
config_noahmp_iopt_stc = 1
config_noahmp_iopt_gla = 1
config_noahmp_iopt_rsf = 4
config_noahmp_iopt_soil = 1
config_noahmp_iopt_pedo = 1
config_noahmp_iopt_crop = 0
config_noahmp_iopt_irr = 0
config_noahmp_iopt_irrm = 0
config_noahmp_iopt_infdv = 1
config_noahmp_iopt_tdrn = 0
/
 
Thank you for giving me more details about the ERA5 implementation.

To address your question about whether the ERA5 resolution caused the model crash: no, I don't think resolution is an issue here.

It is strange that ERA5 works for the coarse-resolution MPAS initialization but not for the variable-resolution case. Would you please upload a single ERA5 file from Copernicus and the corresponding intermediate file for me to take a look?

Because of a web issue, files cannot be uploaded to the forum. It is better to share your data with me via Google Drive. My email is chenming@ucar.edu.

Thank you.



 
Hi Gokul,

Thank you for emailing all these files for me to take a look. In your email, you stated that "with a 48-km and 60-km quasi-uniform mesh, it doesn't work with ERA5 inputs from Copernicus", which further confirms that your input data is problematic.

I checked your ERA5 intermediate files, and they all look fine to me. In your data processing, it is correct to scale the sea-ice values to the 0-1 range.
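For anyone reproducing this preprocessing step, the rescaling amounts to a one-liner; this sketch assumes the source delivers sea-ice concentration as a 0-100 percentage (ERA5 NetCDF siconc is often already a fraction, so check your source first):

```python
def seaice_to_fraction(seaice_pct):
    """Rescale sea-ice concentration from percent to the 0-1 fraction
    WPS/MPAS expect. Assumes the input really is 0-100."""
    return [v / 100.0 for v in seaice_pct]

print(seaice_to_fraction([0.0, 55.0, 100.0]))  # [0.0, 0.55, 1.0]
```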

However, I am concerned about the land-sea mask in the ERA5 data, which has fractional values instead of strictly 0 or 1. This may cause issues later when the model tries to determine whether a point is water or land, and it would explain why your case crashed with a problem in the Noah-MP LSM.
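To illustrate the failure mode: downstream land/water logic expects a discrete mask, so a fractional mask has to be collapsed at some threshold. This is only a sketch of that idea, with an assumed 0.5 cutoff; the NCAR era5_to_int script handles the mask for you, so you should not need to do this by hand:

```python
def binarize_landmask(landsea, threshold=0.5):
    """Collapse a fractional land-sea mask to the discrete 0 (water) /
    1 (land) values expected downstream. threshold=0.5 is an assumption
    for illustration only."""
    return [1.0 if f >= threshold else 0.0 for f in landsea]

print(binarize_landmask([0.0, 0.12, 0.5, 0.93]))  # [0.0, 0.0, 1.0, 1.0]
```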

In short, I believe your case is caused by issues in the input data; the model itself should work as expected. Please try the Python script we provide. It works well with ERA5 data.
 