
real.exe missing dates in wrfbdy_d01

This post was from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.

adaloz

New member
Hello,

I am trying to run WRF (3.8.1) over Scandinavia from 2008-07-01 to 2013-12-31.
All the steps up to mpirun ./real.exe work fine. I get met files for all the dates between the start date and the end date, and they look OK when I inspect them with ncview. However, when I run mpirun ./real.exe, wrfbdy_d01 only has Times from 2008-07-01 until 2009-01-10 and I don't understand why. I don't have the run_days parameter in my namelist, so I don't know why I don't get all the dates between the start and end dates. Does anyone have any idea?

I am using GFS data as boundary conditions and here is the namelist.input I am using:
&time_control
start_year = 2008,
start_month = 07,
start_day = 01,
start_hour = 12,
start_minute = 00,
start_second = 00,
end_year = 2013,
end_month = 12,
end_day = 31,
end_hour = 12,
end_minute = 00,
end_second = 00,
interval_seconds = 21600
input_from_file = .true.,
history_interval = 180,
frames_per_outfile = 1000,
restart = .false.,
restart_interval = 5000,
io_form_history = 2
io_form_restart = 2
io_form_input = 2
io_form_boundary = 2
debug_level = 100
/

&domains
time_step = 180,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1,
e_we = 106,
e_sn = 78,
e_vert = 30,
p_top_requested = 5000,
num_metgrid_levels = 27,
num_metgrid_soil_levels = 4,
dx = 20000,
dy = 20000,
grid_id = 1,
parent_id = 0,
i_parent_start = 1,
j_parent_start = 1,
parent_grid_ratio = 1,
parent_time_step_ratio = 1,
feedback = 1,
smooth_option = 0
/

&physics
mp_physics = 3,
ra_lw_physics = 1,
ra_sw_physics = 1,
radt = 4,
sf_sfclay_physics = 1,
sf_surface_physics = 5,
bl_pbl_physics = 1,
bldt = 0,
cu_physics = 1,
cudt = 5,
isfflx = 1,
ifsnow = 1,
icloud = 1,
surface_input_source = 3,
num_soil_layers = 4,
sf_urban_physics = 0,
/

&fdda
/

&dynamics
w_damping = 0,
diff_opt = 1,
km_opt = 4,
diff_6th_opt = 0,
diff_6th_factor = 0.12,
base_temp = 290.
damp_opt = 0,
zdamp = 5000.,
dampcoef = 0.2,
khdif = 0,
kvdif = 0,
non_hydrostatic = .true.,
moist_adv_opt = 1,
scalar_adv_opt = 1,
/

&bdy_control
spec_bdy_width = 5,
spec_zone = 1,
relax_zone = 4,
specified = .true.,
nested = .false.,
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/

If you need more information, please let me know.

Best,

Anne Sophie.
 
This is possibly due to one of two reasons:
(1) if the wrfbdy_d01 file is too large (greater than 2 GB?), then the model may have trouble continuing to write new records;
(2) the number of time records in the boundary file is somehow limited (I haven't yet figured out where it is hard-coded).

Please try to split the boundary files into segments (e.g., each boundary file covering 6 months of data), then use the restart capability of WRF to run the model over the long period.
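For what it's worth, a sketch of what one segment's &time_control could look like (the dates, intervals, and restart file name below are illustrative, not from the original post): run real.exe with each segment's own start/end dates to produce a smaller wrfbdy_d01, then launch wrf.exe for every segment after the first with restart = .true.

```
&time_control
 ! Segment 2 of the long run (illustrative dates):
 start_year = 2009, start_month = 01, start_day = 01, start_hour = 12,
 end_year   = 2009, end_month   = 07, end_day   = 01, end_hour   = 12,
 restart          = .true.,   ! continue from wrfrst_d01_2009-01-01_12:00:00
 restart_interval = 43200,    ! write a restart file every 30 days (minutes)
 ...
/
```

The restart file written at the end of one segment becomes the initial state of the next, so the long simulation stays continuous even though each wrfbdy_d01 stays well under 2 GB.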
 
Anne...

If Ming is correct that your files are >2 GB, then check that the environment variable
WRFIO_NCD_LARGE_FILE_SUPPORT is set to 1. If it wasn't, you need to rebuild the package.

In WRF 3.8.1, the default was 32-bit (classic) netCDF files. That means files that hit the 2 GB limit are trash.
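A minimal sketch of these checks (assuming the default file name wrfbdy_d01 in the run directory and that the netCDF ncdump utility is on your PATH):

```shell
# Inspect the boundary file, if present (file name is an assumption;
# adjust to your run directory):
f=wrfbdy_d01
if [ -f "$f" ]; then
  ls -lh "$f"                  # is it near the 2 GB classic-netCDF limit?
  ncdump -k "$f"               # "classic" means 32-bit offsets, capped at 2 GB
  ncdump -v Times "$f" | tail  # show the last time records actually written
fi

# Enable 64-bit-offset (large-file) netCDF output, then reconfigure
# and recompile WRF so real.exe/wrf.exe pick it up:
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
```

Note that the variable takes effect at build time in WRF 3.8.1, so files already written stay in whatever format they were created with; rerun real.exe after rebuilding.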
 