
Error executing wrf.exe V4.0 (ERROR: Reduce the MPI rank count)

This post was from a previous version of the WRF&MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.

Guido_c

New member
Hello All,

I have a problem when I run wrf.exe (Version 4.0).

I compiled WRF with the dm+sm option (successfully) and WPS with the serial option (successfully, although I had to add -lgomp to WRF_LIB in configure.wps), and I also downloaded all the necessary data.

The following configuration ran successfully on a desktop computer, but it fails when I run it on a cluster.

I attach the rsl, configure and launch script files:

&time_control
start_year = 2013, 2013, 2013,
start_month = 09, 09, 09,
start_day = 01, 01, 01,
start_hour = 12, 12, 12,
start_minute = 00, 00, 00,
start_second = 00, 00, 00,
end_year = 2013, 2013, 2013,
end_month = 09, 09, 09,
end_day = 28, 28, 28,
end_hour = 18, 18, 18,
end_minute = 00, 00, 00,
end_second = 00, 00, 00,
interval_seconds = 21600,
input_from_file = .true.,.true.,.true.,
history_interval = 60, 60, 60,
frames_per_outfile = 1000, 1000, 1000,
restart = .false.,
restart_interval = 7200,
io_form_history = 2,
io_form_restart = 2,
io_form_input = 2,
io_form_boundary = 2,
auxinput1_inname = "met_em.d<domain>.<date>",
!*****************************************************************
!Anthropogenic emissions (EDGAR-HTAP/GEAA)
auxinput5_inname = 'wrfchemi_d<domain>_<date>',
auxinput5_interval_m = 360, 360, 360,
frames_per_auxinput5 = 1, 1, 1,
io_form_auxinput5 = 2,
!Configuration for wrfchemi files generated by GEAA
!auxinput5_inname = "wrfchemi_<hr>z_d<domain>",
!auxinput5_interval_s = 3600, 3600, 3600,
!frames_per_auxinput5 = 24, 24, 24,
!io_form_auxhist5 = 2,
!*****************************************************************
!Biogenic emissions (MEGAN)
auxinput6_inname = 'wrfbiochemi_d<domain>',
frames_per_auxinput6 = 1, 1, 1,
io_form_auxinput6 = 2,
!*****************************************************************
!Fire emissions (FINN)
auxinput7_inname = 'wrffirechemi_d<domain>_<date>',
auxinput7_interval_m = 60, 60, 60,
frames_per_auxinput7 = 1, 1, 1,
io_form_auxinput7 = 2,
!*****************************************************************
!********** this option is cancelled when we use mobzc ***********
auxinput12_inname = 'wrf_chem_input',
debug_level = 100,
force_use_old_data = .true., !allows combining version 3 with version 4 data
!*****************************************************************
!************ To get wrfout file in another directory ************
!history_outname = '/directory/wrfout_d<domain>_<date>'


/

&domains
time_step = 120,
max_dom = 3,
time_step_fract_num = 0,
time_step_fract_den = 1,
s_we = 1, 1, 1,
e_we = 90,112,100,
s_sn = 1, 1, 1,
e_sn = 90,112,100,
e_vert = 31, 31, 31,
num_metgrid_levels = 27,
num_metgrid_soil_levels = 4,
dx = 27000,9000,3000,
dy = 27000,9000,3000,
grid_id = 1, 2, 3,
parent_id = 1,1,2,
i_parent_start = 1,27,40,
j_parent_start = 1,27,40,
parent_grid_ratio = 1,3,3,
parent_time_step_ratio = 1, 3, 3,
p_top_requested = 5000,
feedback = 1,
smooth_option = 0,
!############################ Height level profiles ###########################
auto_levels_opt = 1, ! old=1 ; new option= 2
max_dz = 1000., ! maximum level thickness allowed (m)
dzbot = 50., ! thickness of lowest layer (m) for auto_levels_opt=2
dzstretch_s = 1.3, ! surface stretch factor for auto_levels_opt=2
dzstretch_u = 1.1, ! upper stretch factor for auto_levels_opt=2

sfcp_to_sfcp = .true.,
/

&physics
!physics_suite = 'CONUS'
mp_physics = 6, 6, 6, ! WSM6 Hong and Lim (2006)
ra_lw_physics = 1, 1, 1, !RRTM onda larga
ra_sw_physics = 1, 1, 1, !Dubhia onda corta
radt = 27, 27, 27,
num_land_cat = 24,
cu_diag = 1,
cu_rad_feedback = .true.,.true.,.true.,
sf_sfclay_physics = 1, 1, 1,
sf_surface_physics = 2, 2, 2,
bl_pbl_physics = 1, 1, 1,
bldt = 0, 0, 0,
cu_physics = 3, 3, 3,
cudt = 5, 5, 5,
isfflx = 1,
ifsnow = 1,
icloud = 1,
surface_input_source = 1,
num_soil_layers = 4,
sf_urban_physics = 0, 0, 0,
/

&fdda
/

&dynamics
hybrid_opt = 2,
w_damping = 0,
diff_opt = 1, 1, 1,
km_opt = 4, 4, 4,
diff_6th_opt = 0, 0, 0,
diff_6th_factor = 0.12, 0.12, 0.12,
base_temp = 290.,
damp_opt = 0,
zdamp = 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2,
khdif = 0, 0, 0,
kvdif = 0, 0, 0,
non_hydrostatic = .true., .true., .true.,
moist_adv_opt = 1, 1, 1,
scalar_adv_opt = 1, 1, 1,
gwd_opt = 1,
chem_adv_opt = 1,
/

&bdy_control
spec_bdy_width = 5,
spec_zone = 1,
relax_zone = 4,
specified = .true., .false.,.false.,.false.,
nested = .false., .true., .true.,.true.,
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/

&chem
kemit = 8,
chem_opt= 112, 112, 112,
bioemdt = 30, 30, 30, 30,
photdt = 30, 30, 30, 30,
chemdt = 2., 2., 2., 2.,
io_style_emissions = 2,
emiss_inpt_opt = 111, 111, 111, 1,
emiss_opt = 8, 8, 8, 0,
chem_in_opt = 1,1,1,
phot_opt = 4, 4, 4, 4,
gas_drydep_opt = 1, 1, 1, 1,
aer_drydep_opt = 1, 1, 1, 1,
bio_emiss_opt = 3, 3, 3, 1,
ne_area = 128,
dust_opt = 0,
dmsemis_opt = 0,
seas_opt = 0,
gas_bc_opt = 1, 1, 1, 1,
gas_ic_opt = 1, 1, 1, 1,
aer_bc_opt = 1, 1, 1, 1,
aer_ic_opt = 1, 1, 1, 1,
gaschem_onoff = 1, 1, 1, 1,
aerchem_onoff = 1, 1, 1, 1,
wetscav_onoff = 0, 0, 0, 0,
cldchem_onoff = 0, 0, 0, 0,
vertmix_onoff = 1, 1, 1, 1,
chem_conv_tr = 1, 1, 1, 1,
biomass_burn_opt = 2, 2, 2, 0,
plumerisefire_frq = 30, 30, 30, 30,
aer_ra_feedback= 1, 1, 1,
opt_pars_out = 1,
chemdiag = 1,
have_bcs_chem = .false., .false., .false.,.false.,
!########################################################################
!#######################Upper Boundary Condition ########################
!have_bcs_upper = .true.,
!fixed_upper_bc = 50.,
!fixed_ubc_inname = "ubvals_rcp4_5.2deg_2020-2029.nc",
!########################################################################
/

Any help will be useful.
Thank you very much.

Guido C.
 

Attachments

  • all_files.tar.gz
    7.8 KB
Hi Guido,

At the end of the rsl.error.* file, you can see the following error:
Code:
 Minimum decomposed computational patch size, either x-dir or y-dir, is 10 grid cells.
e_we =    90, nproc_x =    8, with cell width in x-direction =   11
e_sn =    90, nproc_y =   10, with cell width in y-direction =    9
--- ERROR: Reduce the MPI rank count, or redistribute the tasks.
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE:  <stdin>  LINE:    1744
NOTE:       1 namelist settings are wrong. Please check and reset these options

The problem is that you are asking for too many processors for the small number of grid cells you are using in each direction. You are requesting a total of 80 processors, which WRF decomposes into 8x10 (8 processors in the x-direction and 10 in the y-direction). Dividing the number of grid cells in each direction by the corresponding processor count gives about 11 cells per patch in the x-direction, but only 90/10 = 9 cells in the y-direction, which is below the minimum of 10 grid cells allowed in each direction (ideally you want more than that). Take a look at this FAQ, which gives a basic formula to roughly follow when determining a good number of processors to use, and also discusses the problem with using too many processors:
http://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=73&t=127
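To make the arithmetic concrete, here is a small Python sketch of the patch-size check described above. The function names and the exact decomposition logic are illustrative assumptions reconstructed from the error message (WRF's actual check lives in its own Fortran source); the 10-cell minimum and the rule-of-thumb rank range follow the error output and the FAQ linked above.

```python
# Sketch (an assumption, not WRF's actual source) of the decomposed
# patch-size check that produced the fatal error above.

def patch_sizes(e_we, e_sn, nproc_x, nproc_y):
    """Approximate cell width of each decomposed patch in x and y."""
    return e_we // nproc_x, e_sn // nproc_y

def decomposition_ok(e_we, e_sn, nproc_x, nproc_y, min_cells=10):
    """Each patch must be at least 10 grid cells in both directions."""
    px, py = patch_sizes(e_we, e_sn, nproc_x, nproc_y)
    return px >= min_cells and py >= min_cells

def suggested_rank_range(e_we, e_sn):
    """Commonly cited rule of thumb: use between roughly
    (e_we/100)*(e_sn/100) and (e_we/25)*(e_sn/25) total MPI ranks."""
    low = max(1, int((e_we / 100) * (e_sn / 100)))
    high = int((e_we / 25) * (e_sn / 25))
    return low, high

# Guido's failing case: 80 ranks decomposed as 8 x 10 on the 90 x 90 d01
print(patch_sizes(90, 90, 8, 10))       # (11, 9) -- only 9 cells in y
print(decomposition_ok(90, 90, 8, 10))  # False: 9 < 10
print(decomposition_ok(90, 90, 8, 8))   # True: 64 ranks gives 11 x 11
print(suggested_rank_range(90, 90))     # at most about 12 ranks for d01
```

By this rule of thumb, a 90x90 outer domain supports at most about a dozen MPI ranks, far fewer than the 80 requested.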

Additionally, you may want to consider increasing the size of your domains a bit. We don't usually recommend domain sizes smaller than about 100x100 grid cells in each direction; a smaller domain often just isn't large enough for features to be resolved before they propagate through and out the other side. In case this helps, this web page provides descriptions of the WPS namelist variables, along with some best-practice advice for each parameter:
http://www2.mmm.ucar.edu/wrf/users/namelist_best_prac_wps.html

and there is also one available for WRF, in case that is useful to you, as well:
http://www2.mmm.ucar.edu/wrf/users/namelist_best_prac_wrf.html

Kelly
 