real.exe issue saying it is a global domain

rameshv

A problem is encountered while running real.exe: "Maybe this is a global domain, but the polar flag was not set in the bdy_control namelist."
Attached are the namelist.wps and namelist.input. The same namelists worked fine with versions prior to WRF V4.0.

The issue remains the same whether the projection is set to 'mercator' or 'polar'.
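For reference, the error message points at the settings WRF uses for a true global latitude-longitude domain. A minimal sketch of that configuration, for illustration only (my domain is regional polar stereographic, so none of this should be needed here):

! namelist.wps: global runs use the cylindrical lat-lon projection
&geogrid
 map_proj   = 'lat-lon',
/

! namelist.input: periodic east-west, polar boundary north-south
&bdy_control
 periodic_x = .true.,  .false., .false.,
 polar      = .true.,  .false., .false.,
 specified  = .false., .false., .false.,
/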

REAL_EM V4.1.4 PREPROCESSOR
*************************************
Parent domain
ids,ide,jds,jde 1 501 1 501
ims,ime,jms,jme -4 70 -4 70
ips,ipe,jps,jpe 1 63 1 63
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1, 208982692 bytes allocated
Time period # 1 to process = 2014-05-30_12:00:00.
Time period # 2 to process = 2014-05-30_13:00:00.
Time period # 3 to process = 2014-05-30_14:00:00.
Time period # 4 to process = 2014-05-30_15:00:00.
Total analysis times to input = 4.

-----------------------------------------------------------------------------

Domain 1: Current date being processed: 2014-05-30_12:00:00.0000, which is loop # 1 out of 4
configflags%julyr, %julday, %gmt: 2014 150 12.00000
d01 2014-05-30_12:00:00 Yes, this special data is acceptable to use: OUTPUT FROM METGRID V4.1
d01 2014-05-30_12:00:00 Input data is acceptable to use: met_em.d01.2014-05-30_12:00:00.nc
metgrid input_wrf.F first_date_input = 2014-05-30_12:00:00
metgrid input_wrf.F first_date_nml = 2014-05-30_12:00:00
d01 2014-05-30_12:00:00 Timing for input 4 s.
d01 2014-05-30_12:00:00 flag_soil_layers read from met_em file is 1
Max map factor in domain 1 = 0.00. Scale the dt in the model accordingly.
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 611
Maybe this is a global domain, but the polar flag was not set in the bdy_control namelist.
-------------------------------------------
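Note the line "Max map factor in domain 1 = 0.00" just before the failure. A map factor of zero suggests real.exe is reading zero or missing map-factor fields from the met_em files, which would explain why it guesses the domain might be global. The check that aborts presumably looks something like this (a schematic sketch, not the actual WRF source; wrf_error_fatal and config_flags%polar are real WRF names, but the surrounding logic and max_map_factor are my assumptions):

! Schematic of the kind of global-domain sanity check that aborts here.
! Illustrative only: max_map_factor is a hypothetical stand-in for
! whatever diagnostic real.exe computes from the met_em map factors.
IF ( max_map_factor <= 0.0 ) THEN           ! degenerate map factors
   IF ( .NOT. config_flags%polar ) THEN     ! polar flag from &bdy_control
      CALL wrf_error_fatal( 'Maybe this is a global domain, but the polar ' // &
                            'flag was not set in the bdy_control namelist.' )
   END IF
END IF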
-------------------------------- namelist.wps --------------------------------
&share
wrf_core = 'ARW',
max_dom = 3,
start_date = '2014-05-30_12:00:00','2014-05-30_12:00:00','2014-05-30_12:00:00',
end_date = '2014-05-31_00:00:00','2014-05-31_00:00:00','2014-05-31_00:00:00',
interval_seconds = 3600
active_grid = .true., .true., .true.,
io_form_geogrid = 2,
debug_level = 0
/

&geogrid
parent_id = 1, 1, 2,
parent_grid_ratio = 1, 3, 3,
i_parent_start = 1, 240, 150,
j_parent_start = 1, 210, 270,
e_we = 501, 454, 355,
e_sn = 501, 454, 355,
geog_data_res = '30s', '30s', '30s',
dx = 12000,
dy = 12000,
map_proj = 'polar',
ref_lat = 22.0,
ref_lon = 70.0,
truelat1 = 22.0,
truelat2 = 22.0,
stand_lon = 70.0,
geog_data_path = '/iitm2/cccr-res/ramesh/Models/geog'
opt_geogrid_tbl_path = './'
/

&ungrib
out_format = 'WPS',
prefix = 'ERA5-2D',
/

&metgrid
fg_name = 'ERA5-3D','ERA5-2D',
io_form_metgrid = 2,
opt_metgrid_tbl_path = './'
/
-------------------------------- namelist.input --------------------------------
&time_control
run_days = 0,
run_hours = 3,
run_minutes = 0,
run_seconds = 0,
start_year = 2014, 2014, 2014,
start_month = 05, 05, 05,
start_day = 30, 30, 30,
start_hour = 12, 12, 12,
end_year = 2014, 2014, 2014,
end_month = 05, 05, 05,
end_day = 30, 30, 30,
end_hour = 15, 15, 15,
interval_seconds = 3600
input_from_file = .true.,.true.,.true.,
fine_input_stream = 0, 0, 0,
history_interval = 60, 60, 60,
frames_per_outfile = 1, 1, 1,
restart = .false.,
restart_interval = 1440,
write_hist_at_0h_rst = .true.,
io_form_history = 2
io_form_restart = 2
io_form_input = 2
io_form_boundary = 2
io_form_auxinput2 = 2
io_form_auxinput10 = 2
debug_level = 0
auxinput4_inname = "wrflowinp_d<domain>"
auxinput4_interval = 60, 60, 60,
io_form_auxinput4 = 2
/

&domains
time_step = 10,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1,
e_we = 501, 454, 355,
e_sn = 501, 454, 355,
e_vert = 36, 36, 36,
num_metgrid_levels = 38
num_metgrid_soil_levels = 4
p_top_requested = 5000
lagrange_order = 2
dx = 12000, 4000, 1333.3333,
dy = 12000, 4000, 1333.3333,
grid_id = 1, 2, 3,
parent_id = 0, 1, 2,
i_parent_start = 1, 240, 190,
j_parent_start = 1, 225, 210,
parent_grid_ratio = 1, 3, 3,
parent_time_step_ratio = 1, 3, 3,
feedback = 0
smooth_option = 0
use_adaptive_time_step = .false.,
step_to_output_time = .true.,
target_cfl = 1.2, 1.2, 1.2,
target_hcfl = .84, .84, .84,
max_step_increase_pct = 5, 31, 31,
starting_time_step = 1, 1, 2,
max_time_step = 36, 12, 1,
min_time_step = 1, 1, 1,
min_time_step_den = 0, 9, 9,
adaptation_domain = 2,
/

&dfi_control
/

&physics
mp_physics = 14, 14, 14,
ra_lw_physics = 4, 4, 4,
ra_sw_physics = 4, 4, 4,
radt = 12, 12, 12,
sf_sfclay_physics = 2, 2, 2,
sf_surface_physics = 2, 2, 2,
bl_pbl_physics = 2, 2, 2,
bldt = 0, 0, 0,
cu_physics = 1, 1, 0,
cudt = 0, 0, 0,
kfeta_trigger = 2,
isfflx = 1,
ifsnow = 1,
icloud = 1,
surface_input_source = 1,
use_mp_re = 1,
num_soil_layers = 4,
num_land_cat = 21,
tmn_update = 0,
lagday = 140,
sst_update = 1,
mp_zero_out = 0,
maxiens = 1,
maxens = 3,
maxens2 = 3,
maxens3 = 16,
ensdim = 144,
slope_rad = 1,
topo_shading = 0,
co2tf = 1
levsiz = 59
paerlev = 29
cam_abs_dim1 = 4
cam_abs_dim2 = 72
cam_abs_freq_s = 3600
prec_acc_dt = 1440, 1440, 1440,
/

&tc
/

&fdda
grid_fdda = 0, 0, 0,
gfdda_inname = "wrffdda_d<domain>",
gfdda_end_h = 780, 780, 24,
gfdda_interval_m = 360, 360, 360,
fgdt = 0, 0, 0,
if_no_pbl_nudging_uv = 0, 0, 0,
if_no_pbl_nudging_t = 1, 1, 1,
if_no_pbl_nudging_q = 1, 1, 1,
if_zfac_uv = 0, 0, 0,
k_zfac_uv = 10, 10, 10,
if_zfac_t = 0, 0, 0,
k_zfac_t = 10, 10, 10,
if_zfac_q = 0, 0, 0,
k_zfac_q = 10, 10, 10,
guv = 0.005, 0.005, 0.0003,
gt = 0.005, 0.005, 0.0003,
gq = 0.005, 0.005, 0.0003,
if_ramping = 1,
dtramp_min = 60.0,
io_form_gfdda = 2,
/

&scm
/

&dynamics
rk_ord = 3,
w_damping = 1,
diff_opt = 1,
km_opt = 4,
damp_opt = 3,
base_temp = 290.
use_baseparam_fr_nml = .t.
diff_6th_opt = 0, 0, 0,
diff_6th_factor = 0.12, 0.12, 0.12,
zdamp = 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2,
khdif = 0, 0, 0,
kvdif = 0, 0, 0,
smdiv = 0.1, 0.1, 0.1,
emdiv = 0.01, 0.01, 0.01,
time_step_sound = 24, 48, 72,
h_mom_adv_order = 5, 5, 5,
v_mom_adv_order = 3, 3, 3,
h_sca_adv_order = 5, 5, 5,
v_sca_adv_order = 3, 3, 3,
non_hydrostatic = .true.,.true.,.true.,
moist_adv_opt = 2, 2, 2,
scalar_adv_opt = 2, 2, 2,
tke_adv_opt = 2, 2, 2,
/

&fire
/

&bdy_control
spec_bdy_width = 10,
spec_zone = 1,
relax_zone = 9,
specified = .true., .false.,.false.,
nested = .false.,.true.,.true.,
polar = .false.,.false.,.false.,
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/
 
Hi,
Can you attach a couple of your met_em.d01* files so that I can try to repeat this with your input? If the files are too large, take a look at the home page of this forum for instructions on sending large files. Thanks!
 
Sure, I will attach the met_em.* files.

I am still puzzled as to why the met_em.* file at the first time stamp is much larger than those at the subsequent time stamps (before V4, all met_em.d* files for a given domain used to be the same size).

Thanks for the response.
 
Hi,
Did you put the met_em* files in the Nextcloud account, or did you forget to attach them here?
 
Give me a little more time; I am comparing 4.1.3 and 4.1.4.

Some errors that popped up in 4.1.4 go away, and some new ones come up.
 
Hi, have you solved this issue? I am also getting the same problem with WRF 4.3 (see below).

Please help me resolve this issue.


FATAL CALLED FROM FILE: <stdin> LINE: 623
Maybe this is a global domain, but the polar flag was not set in the bdy_control namelist.
-------------------------------------------
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
 
@alugulaboyaj,
We have gotten messages from a few people who have had this same error, and unfortunately we aren't able to recreate it on our end, even when using the exact input files, namelist, WRF version, etc. as the users. Most recently, I've been working with a user on this thread. Can you take a look at my post to them on March 28th and tell me if you see the same behavior regarding the max map factor? Can you also share your files with me (namelist.input, your error log that shows the error, and a couple of time periods of met_em* files)? You may need to use the method explained on the home page of this forum to send your met* files, because they are likely too large to attach. Thanks!
 