
Problems running WRF with sf_ocean_physics option 2

This post was from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled; if you have follow-up questions related to this post, please start a new thread from the forum home page.

jcarlosgm

New member
Hello,
I'm trying to run WRF with sf_ocean_physics = 2 (the PWP module), but I have a problem: the model freezes. wrf.exe keeps running but stops writing output after the first two or three history files (written every 6 hours). The rsl.error.* and rsl.out.* files contain no error message or hint about what the problem might be. I ran other cases with sst_update or sf_ocean_physics = 1 (the simple mixed-layer model based on Pollard), and they finished successfully; the problem occurs only with sf_ocean_physics = 2.

I tried different configurations, changing the physics parameterizations, modifying dt, and running with and without the adaptive time step, but the problem continues. I use ERA-Interim data to force the model, and the default values for the ocean levels, temperature, and salinity.
Any idea, guidance, or tip to resolve this issue? I'd really appreciate your help.

The namelist.input that I'm using is:

&time_control
run_days = 15,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 1994, 2000, 2000,
start_month = 12, 01, 01,
start_day = 20, 24, 24,
start_hour = 00, 12, 12,
start_minute = 00, 00, 00,
start_second = 00, 00, 00,
end_year = 1995, 2000, 2000,
end_month = 01, 01, 01,
end_day = 01, 25, 25,
end_hour = 00, 12, 12,
end_minute = 00, 00, 00,
end_second = 00, 00, 00,
interval_seconds = 21600
input_from_file = .true.,.true.,.true.,
history_interval = 360, 60, 60,
frames_per_outfile = 1, 1000, 1000,
restart = .false.,
restart_interval = 44640,
io_form_history = 2
io_form_restart = 2
io_form_input = 2
io_form_boundary = 2
debug_level = 9999
auxinput4_inname = "wrflowinp_d<domain>",
auxinput4_interval = 360,
io_form_auxinput4 = 2,
adjust_output_times = .true.,
output_diagnostics = 1,
auxhist3_outname = "wrfxtrm_d<domain>_<date>"
auxhist3_interval = 1440,
frames_per_auxhist3 = 30,
io_form_auxhist3 = 2,
write_hist_at_0h_rst = .true.,
/

&domains
time_step = 60,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1,
e_we = 440, 112, 94,
e_sn = 220, 97, 91,
e_vert = 45, 30, 30,
p_top_requested = 5000,
num_metgrid_levels = 38,
num_metgrid_soil_levels = 4,
dx = 25000, 10000, 3333.33,
dy = 25000, 10000, 3333.33,
grid_id = 1, 2, 3,
parent_id = 0, 1, 2,
i_parent_start = 1, 31, 30,
j_parent_start = 1, 17, 30,
parent_grid_ratio = 1, 3, 3,
parent_time_step_ratio = 1, 3, 3,
feedback = 0,
smooth_option = 0,
use_adaptive_time_step = .true.,
target_cfl = 1.6,
max_step_increase_pct = 5,
starting_time_step = 50,
max_time_step = 80,
min_time_step = 1,
sfcp_to_sfcp = .true.,
ocean_levels = 30,
ocean_z = 5., 15., 25., 35., 45., 55.,
65., 75., 85., 95., 105., 115.,
125., 135., 145., 155., 165., 175.,
185., 195., 210., 230., 250., 270.,
290., 310., 330., 350., 370., 390.,
ocean_t = 302.3493, 302.3493, 302.3493, 302.1055, 301.9763, 301.6818,
301.2220, 300.7531, 300.1200, 299.4778, 298.7443, 297.9194,
297.0883, 296.1443, 295.1941, 294.1979, 293.1558, 292.1136,
291.0714, 290.0293, 288.7377, 287.1967, 285.6557, 284.8503,
284.0450, 283.4316, 283.0102, 282.5888, 282.1674, 281.7461
ocean_s = 34.0127, 34.0127, 34.0127, 34.3217, 34.2624, 34.2632,
34.3240, 34.3824, 34.3980, 34.4113, 34.4220, 34.4303,
34.6173, 34.6409, 34.6535, 34.6550, 34.6565, 34.6527,
34.6490, 34.6446, 34.6396, 34.6347, 34.6297, 34.6247,
34.6232, 34.6189, 34.6097, 34.6002, 34.5920, 34.5860
/

&physics
sf_ocean_physics = 2,
omdt = 1,
oml_hml0 = 50,
oml_gamma = 0.14,
mp_physics = 3, 3, 3,
ra_lw_physics = 1, 1, 1,
ra_sw_physics = 1, 1, 1,
radt = 30, 30, 30,
sf_sfclay_physics = 1, 1, 1,
sf_surface_physics = 2, 2, 2,
bl_pbl_physics = 1, 1, 1,
bldt = 0, 0, 0,
cu_physics = 1, 1, 0,
cudt = 5, 5, 5,
isfflx = 1,
ifsnow = 0,
icloud = 1,
surface_input_source = 1,
num_soil_layers = 4,
num_land_cat = 21,
sf_urban_physics = 0, 0, 0,
maxiens = 1,
maxens = 3,
maxens2 = 3,
maxens3 = 16,
ensdim = 144,
/

&fdda
grid_fdda = 2, 0, 0,
gfdda_inname = "wrffdda_d<domain>",
gfdda_end_h = 1401600,
gfdda_interval_m = 360,
fgdt = 3,
fgdtzero = 1,
if_no_pbl_nudging_uv = 0,
if_no_pbl_nudging_t = 0,
if_no_pbl_nudging_ph = 0,
if_zfac_uv = 1,
k_zfac_uv = 10,
if_zfac_t = 1,
k_zfac_t = 10,
if_zfac_ph = 1,
k_zfac_ph = 10,
dk_zfac_uv = 5,
dk_zfac_t = 5,
dk_zfac_ph = 5,
guv = 0.0003,
gt = 0.0003,
gph = 0.0003,
xwavenum = 7,
ywavenum = 7,
if_ramping = 0,
dtramp_min = 60.0,
io_form_gfdda = 2,
/

&dynamics
w_damping = 0,
diff_opt = 1, 1, 1,
km_opt = 4, 4, 4,
diff_6th_opt = 0, 0, 0,
diff_6th_factor = 0.12, 0.12, 0.12,
base_temp = 290.
damp_opt = 0,
zdamp = 5000., 5000., 5000.,
dampcoef = 0.2, 0.2, 0.2
khdif = 0, 0, 0,
kvdif = 0, 0, 0,
non_hydrostatic = .true., .true., .true.,
moist_adv_opt = 1, 1, 1,
scalar_adv_opt = 1, 1, 1,
iso_temp = 0,
/

&bdy_control
spec_bdy_width = 5,
spec_zone = 1,
relax_zone = 4,
specified = .true., .false.,.false.,
nested = .false., .true., .true.,
/

&grib2
/

&namelist_quilt
nio_tasks_per_group = 0,
nio_groups = 1,
/
 
Hi,
Can you let me know which version of the model you are running, and whether you have modified the code at all (or is it pristine "out-of-the-box" code)? Can you also please attach the full output log and error log (e.g., rsl.out.0000 and rsl.error.0000). To attach files, when you are in the text box, click the tab below with 3 horizontal lines and follow the instructions to attach. Thanks!
 
Hi kwerner,
The version is 3.8.1. At the moment I'm using the model without any changes to the code, only modifications to namelist.input.
The model runs on an HPC cluster using Slurm with 256 processors, and it was compiled with the Intel 2019 compilers.

Code:
#!/bin/bash

#SBATCH --job-name=WFR-16x16
#SBATCH --nodes=16
#SBATCH --ntasks=256
#SBATCH --ntasks-per-node=16

ml intel/2019u4 mpi/intel wrf/3.8.1

srun ./real.exe
srun ./wrf.exe

I attached the first rsl.error and rsl.out files; the output in the other files is very similar. The errors in the last part of the file were generated when I killed the process; as I mentioned, wrf.exe continued to run but didn't process anything else.

Thanks in advance for your time.
 

Attachments

  • rsl.out.0000.txt
    13.1 MB · Views: 62
  • rsl.error.0000.txt
    13.1 MB · Views: 53
Hi,
Can you set debug_level = 0 and try to rerun this? We recently removed that variable from the default namelist because it doesn't typically provide any useful information, and it ends up adding a lot of junk to the rsl* files, making them difficult to read through. Setting it to a large value can also occasionally make the rsl* files so large that they use up all the disk space, preventing the model from running. That's likely not what is going on with your run, but it will help me look through the files more easily. Please attach the new rsl* files. Thanks!
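(As an aside, a quick way to check whether the rsl* files are eating disk space is a small helper like the one below. This is a hypothetical utility, not part of WRF; `total_rsl_mb` is a name made up for this sketch.)

```python
import glob
import os

def total_rsl_mb(pattern="rsl.*", root="."):
    """Sum the sizes of all files matching `pattern` under `root`, in MB."""
    paths = glob.glob(os.path.join(root, pattern))
    return sum(os.path.getsize(p) for p in paths) / 1e6

if __name__ == "__main__":
    # Run from the WRF run directory to see how much space the logs use.
    print(f"rsl files occupy {total_rsl_mb():.1f} MB")
```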
 
Hello,
Normally I run the model with debug_level = 1; I had set it to 9999 looking for information about the problem, without success.
I ran the model with debug_level = 0 and the result is the same: the model "freezes" at the same time step. I have attached all the rsl.error and rsl.out files in tar files.

Thank you for your support!
 

Attachments

  • rsl_error.tar
    1.3 MB · Views: 47
  • rsl_out.tar
    20.8 MB · Views: 50
Hi,
I notice in your rsl.error.* files, you have CFL errors:
Code:
rsl.error.0152:d01 1994-12-20_21:00:00          169  points exceeded cfl=2 in domain d01 at time 1994-12-20_21:00:00 hours
rsl.error.0152:d01 1994-12-20_21:00:00  MAX AT i,j,k:          229         136          25  vert_cfl,w,d(eta)=   98.49350      -391.7924      2.4615258E-02
rsl.error.0168:d01 1994-12-20_21:00:00          468  points exceeded cfl=2 in domain d01 at time 1994-12-20_21:00:00 hours
rsl.error.0168:d01 1994-12-20_21:00:00  MAX AT i,j,k:          230         139          25  vert_cfl,w,d(eta)=   363.8071      -1674.798      2.4615258E-02
Take a look at this FAQ for information on getting rid of those errors.
https://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=73&t=133
Your time_step is already pretty low, so you may want to try the other options. I'm not sure exactly why this is happening when you're using ocean physics option 2, as opposed to 1, but this is likely the reason your run is hanging.
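For reference, the usual CFL mitigations from that FAQ translate into namelist settings along these lines (illustrative values only; adjust for your case):

```
&domains
 time_step       = 45,      ! reduce further if CFL violations persist
 smooth_cg_topo  = .true.,  ! smooth outer-domain topography toward the lateral boundaries
/

&dynamics
 w_damping = 1,             ! damp excessive vertical velocities
 epssm     = 0.2,           ! increase off-centering of vertical sound waves (default 0.1)
/
```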
 
OK, copy that.

I'm going to try the other options, since the stacksize was already taken care of and, as you mentioned, the dt is already very low.
If the problem is solved, I'll write to let you know. Failing that, if the problem persists and I can find some other indication of where it might be, I'll still be in contact.

Thanks for all the support.
 
I'm still having the problem running WRF with sf_ocean_physics = 2. I have tried the options you suggested without success. I started tracking down the error by putting flags in different modules to see whether the model entered and left each one. All tests pointed to the PWP module: module_sf_3dpwp.

The problem is at the coast, at the interface between land and ocean. There, the model shows very strong cooling, and OM_ML (the mixed-layer depth) is very unrealistic, so the TSK is also very cold. In addition, the OM_TMP array (which is 3D) should be solved only at ocean grid points, but the results show values at land grid points as well. (There are no CFL warnings in the rsl.* files.)

For the last few tests I moved my domain to one over the Gulf of Mexico, where these issues are very clear. For a fuller explanation, I attach four figures and the namelist.input used for these tests. The figures show TSK, OM_ML, OM_TMP, and XLAND. I set the colorbar scale manually, different from the default, but you can see the maximum and minimum values in the displayed range; they are very unrealistic for these variables.

Do you have any idea that could help me? Really, this problem is freaking me out!
 

Attachments

  • namelist.input.txt
    7.2 KB · Views: 49
  • Screenshot from 2019-10-07 11-01-30.png
    Screenshot from 2019-10-07 11-01-30.png
    34.8 KB · Views: 1,494
  • Screenshot from 2019-10-07 11-03-04.png
    Screenshot from 2019-10-07 11-03-04.png
    26.3 KB · Views: 1,493
  • Screenshot from 2019-10-07 11-04-21.png
    Screenshot from 2019-10-07 11-04-21.png
    32.8 KB · Views: 1,494
  • Screenshot from 2019-10-07 11-05-09.png
    Screenshot from 2019-10-07 11-05-09.png
    23.7 KB · Views: 1,493
Hi,
I apologize for the delay. Our team has been out of the country for the past couple of weeks, teaching a WRF tutorial in Europe. Thank you for your patience.

Can you try setting 'surface_input_source = 3' (instead of 1) to see if that makes any difference? If not, can you let me know how far into the run the problems begin to show up? Thanks.
 
I'm very sorry for answering after so many months :|

The PWP module works very well in purely oceanic domains, without land, because it doesn't distinguish between the two surface types.
I needed to apply a land/sea mask, with a conditional so that the equations are solved only at ocean points.

This worked for my purposes.
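For anyone hitting the same issue, the masking idea can be sketched schematically. This is an illustrative Python/NumPy sketch, not the actual Fortran change made in module_sf_3dpwp; `pwp_step` is a toy stand-in for the real PWP column update, and xland follows the WRF convention (1 = land, 2 = water).

```python
import numpy as np

def pwp_step(column_temp):
    """Toy stand-in for the PWP column update: mix the top two levels.
    The real physics lives in WRF's module_sf_3dpwp."""
    col = column_temp.copy()
    col[:2] = col[:2].mean()
    return col

def update_ocean_temp(om_tmp, xland):
    """Advance the 3D ocean temperature only at water points.

    om_tmp : (nz, ny, nx) ocean temperature array
    xland  : (ny, nx) WRF land mask, 1.0 = land, 2.0 = water
    """
    nz, ny, nx = om_tmp.shape
    out = om_tmp.copy()
    for j in range(ny):
        for i in range(nx):
            if xland[j, i] > 1.5:
                # water point: run the ocean physics
                out[:, j, i] = pwp_step(om_tmp[:, j, i])
            # land point: skip the ocean equations entirely
    return out
```

With a guard like this, OM_TMP and OM_ML (and hence TSK) are only modified where the ocean model is actually valid, instead of producing spurious values over land.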
 