
MPAS model segmentation fault in 48 km global run, but OK in the 240 km run

medeirosj

New member
Dear all,
I just attended the MPAS training in St Andrews.

I have installed MPAS on JASMIN and have run it with CMIP6 data for a 240 km global grid experiment. The 240 km run was successful: all stages completed, and I got the history and diag files.

The 48 km run, with the same forcing data, has not been successful. Only the static.nc and ini.nc stages completed; the actual model run fails. I have tried several values of config_dt, and several radiation time steps as well.

I am attaching tar.gz files for both runs, with the logs and namelists for each individual run.

For reference, the forcing CMIP6 data is

cdo sinfo zg_6hrPlevPt_MPI-ESM1-2-HR_ssp585_r1i1p1f1_gn_210001020000-210001020600.nc
File format : NetCDF4 zip
-1 : Institut Source T Steptype Levels Num Points Num Dtype : Parameter ID
1 : unknown MPI-ESM1.2-HR v instant 28 1 73728 1 F32z : -1
Grid coordinates :
1 : gaussian : points=73728 (384x192) F96
lon : 0 to 359.0625 by 0.9375 degrees_east circular
lat : -89.28423 to 89.28423 degrees_north
available : cellbounds
Vertical coordinates :
1 : pressure : levels=28
plev : 100000 to 5000 Pa
Time coordinate :
time : 2 steps
RefTime = 1850-01-01 00:00:00 Units = days Calendar = proleptic_gregorian
YYYY-MM-DD hh:mm:ss YYYY-MM-DD hh:mm:ss YYYY-MM-DD hh:mm:ss YYYY-MM-DD hh:mm:ss
2100-01-02 00:00:00 2100-01-02 06:00:00
cdo sinfo: Processed 1 variable over 2 timesteps [0.05s 50MB]


Any advice is much appreciated.
Thanks,
Kind regards,
Joana

Attachments

  • 48_km.tar.gz (135.8 KB)
  • 240_km.tar.gz (11.2 KB)
Since you've had to lower your model top from 30 km to 19 km to accommodate the CMIP6 input data, it's possible that there are issues with the vertical grid that are leading to model failures. Although it might not produce an optimal distribution of vertical layer heights, I think it would first be worth trying to reduce the number of vertical layers from 55 to, say, 45. All that's needed is to set
config_nvertlevels = 45
in the &dimensions namelist group in the namelist.init_atmosphere file when generating the vertical grid (i.e., when config_vertical_grid = true in the &preproc_stages namelist group).
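For reference, a minimal sketch of the relevant portions of namelist.init_atmosphere (the other preprocessing flags shown are placeholders for whatever your existing vertical-grid-generation step already uses, not prescribed values):

```
&dimensions
    config_nvertlevels = 45
/

&preproc_stages
    config_vertical_grid = true
    config_met_interp = true
/
```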

Additionally, I think it may be helpful to reduce the starting height of the Rayleigh damping of vertical velocity with the config_zd namelist option in the &damping namelist group in the namelist.atmosphere file. By default, this is set to 22000 m for an assumed model top of 30 km, so as a first guess you could try setting
config_zd = 11000.0
If reducing the number of vertical layers to 45 and starting the w-damping at 11 km allows the 48-km simulation to run stably, there may be some fine tuning of the distribution of levels that could be considered afterward.
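As a sketch, the &damping group in namelist.atmosphere would then look something like the following (config_xnutr is shown at what I believe is its default value of 0.2, purely as an assumption; leave it at whatever your current namelist uses):

```
&damping
    config_zd = 11000.0
    config_xnutr = 0.2
/
```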