
Nested Domain Simulation

Aparna

New member
Hi all,


I am currently running the WRF model for the period April 2 to April 29, 2022, using three nested domains. The ./real.exe step completed successfully, and I have obtained the following input files: wrfbdy_d01, wrfinput_d01, wrfinput_d02, and wrfinput_d03. The rsl output from real.exe ends as follows:

d03 2022-04-02_00:00:00 module_io.F: in wrf_open_for_write_begin, FileName = wrfinput_d03
d03 2022-04-02_00:00:00 calling outsub in open_w_dataset
d03 2022-04-02_00:00:00 back from outsub in open_w_dataset
d03 2022-04-02_00:00:00 calling wrf_open_for_write_commit in open_w_dataset
d03 2022-04-02_00:00:00 Information: NOFILL being set for writing to wrfinput_d03
d03 2022-04-02_00:00:00 back from wrf_open_for_write_commit in open_w_dataset
d03 2022-04-02_00:00:00 Timing for output 0 s.
d03 2022-04-02_00:00:00 Timing for loop # 1 = 0 s.
d03 2022-04-02_06:00:00 backfrom med_sidata_input
d01 2022-04-29_00:00:00 real_em: SUCCESS COMPLETE REAL_EM INIT


However, when I run ./wrf.exe, the model generates the first output file for domain 1 but does not proceed to generate output for domain 2. The simulation seems to hang at that point without any error message or crash; it just gets stuck. The rsl output stops at the following lines:

d02 2022-04-02_00:00:00 calling inc/PERIOD_BDY_EM_CHEM_inline.inc
d02 2022-04-02_00:00:00 start_domain_em: Returning
d02 2022-04-02_00:00:00 open_hist_w : opening /nfs3/aparna/Mass_diagnostics_o3/wrf_outputs_1/wrf_out_02_2022-04-02_00:00:00 for writing.
d02 2022-04-02_00:00:00 calling wrf_open_for_write_begin in open_w_dataset
d02 2022-04-02_00:00:00 module_io.F: in wrf_open_for_write_begin, FileName = /nfs3/aparna/Mass_diagnostics_o3/wrf_outputs_1/wrf_out_02_2022-04-02_00:00:00
d02 2022-04-02_00:00:00 calling outsub in open_w_dataset


I have attached my namelist.input and rsl.error.* files for reference.

Additionally, when I run the model with only the first domain, the simulation runs smoothly, and outputs are generated without any issues.
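Concretely, that single-domain test just means setting max_dom = 1 in &domains, with everything else unchanged:

&domains
 max_dom = 1,   ! parent domain only, for testing
 ...
/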

Could you please help me identify what might be causing the issue with the nested domains?


Thank you for your support.


Best regards,
Aparna C
 

Attachments

  • namelist.input (5.5 KB)
  • rsl.error.0000 (54.8 KB)

Please set max_dom = 3 and turn off the cumulus scheme for d03. You also need to set radt = 27, 27, 27.
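In the namelist, those suggestions would look roughly like this (a sketch; the ellipses stand for the rest of each record, and radt = 27 follows the usual rule of thumb of radt ~ dx in km of the coarsest domain):

&domains
 max_dom = 3,
 ...
/

&physics
 cu_physics = 3, 3, 0,   ! cumulus scheme on d01/d02, off for the 3 km d03
 radt = 27, 27, 27,      ! ~ parent-domain dx in km
 ...
/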

Your run hung at that point, which indicates that something went wrong in this case. Please double-check all your rsl files and see whether you can find any error messages that are helpful for debugging.
 
I have made all the changes suggested above to my namelist.input file. However, I am still encountering the same problem during execution. Below, I have included the updated namelist.input entries for reference.



&time_control
run_days = 0,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2022, 2022, 2022,
start_month = 04, 04, 04,
start_day = 02, 02, 02,
start_hour = 00, 00, 00,
end_year = 2022, 2022, 2022,
end_month = 04, 04, 04,
end_day = 29, 29, 29,
end_hour = 00, 00, 00,
interval_seconds = 21600,
input_from_file = .true., .true., .true.,
history_interval = 60, 60, 60,
frames_per_outfile = 24, 24, 24,
restart = .false.,
restart_interval = 7200,
!iofields_filename = "variable.txt",
!ignore_iofields_warning = .false.,
io_form_history = 2,
io_form_restart = 2,
io_form_input = 2,
io_form_boundary = 2,
debug_level = 100,
history_outname = '/nfs3/aparna/Mass_diagnostics_o3/wrf_outputs_1/wrf_out_<domain>_<date>'
rst_inname = '/nfs3/aparna/Mass_diagnostics_o3/wrf_outputs_1/wrfrst__d<domain>_<date>'
rst_outname = '/nfs3/aparna/Mass_diagnostics_o3/wrf_outputs_1/wrfrst__d<domain>_<date>'
/

&domains
time_step = 120,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 3,
e_we = 140, 115, 52,
e_sn = 155, 121, 37,
e_vert = 35, 35, 35,
p_top_requested = 5000,
num_metgrid_levels = 34,
num_metgrid_soil_levels = 4,
dx = 27000,
dy = 27000,
grid_id = 1, 2, 3,
parent_id = 1, 1, 2,
i_parent_start = 1, 51, 34,
j_parent_start = 1, 99, 46,
parent_grid_ratio = 1, 3, 3,
parent_time_step_ratio = 1, 3, 3,
feedback = 1,
smooth_option = 0,
/

&physics
mp_physics = 10, 10, 10,
cu_physics = 3, 3, 0,
ra_lw_physics = 1, 1, 1,
ra_sw_physics = 2, 2, 2,
bl_pbl_physics = 1, 1, 1,
sf_sfclay_physics = 1, 1, 1,
sf_surface_physics = 2, 2, 2,
radt = 27, 27, 27,
bldt = 0, 0, 0,
cudt = 0, 0, 0,
icloud = 1,
num_land_cat = 21,
sf_urban_physics = 1, 1, 1,
!fractional_seaice = 1,
shcu_physics = 3,
progn = 1,
cu_rad_feedback = .true.,
cu_diag = 1, 1, 0,
cugd_avedx = 1,
mp_zero_out = 2,
mp_zero_out_thresh = 1.e-8,
sst_update = 0,
o3input = 0,
/


&dynamics
hybrid_opt = 2,
w_damping = 1,
diff_opt = 1, 1, 1,
km_opt = 4, 4, 4,
diff_6th_opt = 0, 0, 0,
diff_6th_factor = 0.12, 0.12, 0.12,
base_temp = 290.,
damp_opt = 3,
zdamp = 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2,
khdif = 0, 0, 0,
kvdif = 0, 0, 0,
non_hydrostatic = .true., .true., .true.,
moist_adv_opt = 2, 2, 2,
scalar_adv_opt = 2, 2, 2,
gwd_opt = 1, 1, 1,
rk_ord = 3,
chem_adv_opt = 2,
tke_adv_opt = 2,
time_step_sound = 4,
h_mom_adv_order = 5,
v_mom_adv_order = 3,
h_sca_adv_order = 5,
v_sca_adv_order = 3,
/

&bdy_control
spec_bdy_width = 5,
specified = .true.
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/
 
Try turning off urban physics, i.e., change sf_urban_physics from 1, 1, 1 to 0, 0, 0.
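In the &physics record, that would be:

&physics
 ...
 sf_urban_physics = 0, 0, 0,   ! urban canopy model off on all domains
 ...
/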
 
After turning off the cumulus scheme for Domain 02 as well, the simulation is now running successfully. Thank you all for your valuable suggestions and support!
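That is, the cumulus settings that worked are presumably as follows (a sketch; cu_diag is assumed to have been set to match, since it should be 0 wherever cu_physics is 0):

&physics
 cu_physics = 3, 0, 0,   ! cumulus parameterization on the 27 km parent only
 cu_diag = 1, 0, 0,      ! assumed to match cu_physics
 ...
/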
 