
High resolution simulation over mountainous region

asandeep

New member
Hello,

I am trying to simulate weather over the Asian region, including the high-altitude Himalayan mountains, using WRF v4.4. I am able to perform simulations at 3 km, but the run fails at higher resolution with errors such as CFL errors, Flerchinger warnings, WOULD GO OFF TOP, and "ERROR: ghg_input available only for these radiation schemes". I also set epssm to 1.8, which helped to some extent, but the simulation then hangs without any error. I have tried smaller domains as well, but the run fails whenever the mountainous region is present.
I would appreciate any help.

Thanks,
Sandeep
 
Sandeep,

The CFL error indicates the case is numerically unstable. The Flerchinger and WOULD GO OFF TOP messages suggest the physics went wrong.
Please reduce time_step to 3 x dx (dx in km), set epssm = 1.0, and turn on w_damping. Let's see whether the case can run.
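A minimal sketch of those settings in namelist.input (the time_step value assumes a single 3 km domain; use 3 for a 1 km domain):

&domains
 time_step = 9,          ! roughly 3 x dx, with dx in km
/
&dynamics
 epssm     = 1.0, 1.0,   ! one value per domain
 w_damping = 1,
/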
 
Hello Ming,

Thank you very much for the valuable information and help. Following your suggestion, I tried smaller time steps (4/3/2/1/0.5 s) along with larger values of epssm (from 1.0 to 2.4) and w_damping enabled, but the simulation still stops after a few hours without any error. One observation: with a given number of parallel processes and a given namelist.input, two runs stop at exactly the same point, while with a different number of processes (same namelist.input) they stop at a different point.

I used the `lambert` map_proj in WPS, setting the same value for ref_lat, truelat1, and truelat2, and another value for ref_lon and stand_lon. I am not sure if this is the correct way. Also, I would like to know which parameters need to be set for the `mercator` projection and whether that would help in my simulation.
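For illustration only (the coordinates below are hypothetical, not taken from the post), a Lambert setup of the kind described would look roughly like this in the &geogrid section of namelist.wps. Normally truelat1 and truelat2 are the two standard parallels of the projection; for map_proj = 'mercator', I believe only truelat1 is used and stand_lon is not needed.

&geogrid
 map_proj  = 'lambert',
 ref_lat   = 28.0,     ! hypothetical domain centre
 ref_lon   = 84.0,
 truelat1  = 28.0,
 truelat2  = 28.0,
 stand_lon = 84.0,
/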

I would appreciate any help.

Thanks,
Sandeep
 
Same problem here. I opened an issue for Nepal here:

Thanks!
Could you run WRF v4.5 with mp_physics = 38?
I can run WRF v4.5 with mp_physics = 8 or 28 successfully, but WRF crashes without any errors when mp_physics = 38.
Thanks!
 
We did some simulations (in forecast mode, using GFS/GDAS) over Delhi, and part of the outer domain extends into the higher altitudes.

The following configuration was successful:

domains = 9km and 3km

cu_physics=11 (MSKF only for 9km)
cu_dt=5
cu_rad_feedback=.true.,.false.
sfclay = 1 (for both)
pbl = 1 (ysu for both)
mp_physics = 8 (thompson for both)
radt = 5 (for both)
adaptive time step max = 45, 15

Note: precipitation was higher than with other combinations.
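As a rough sketch only (not the poster's actual namelist), those options map onto namelist.input roughly as follows; note that the cumulus time-step variable is spelled cudt in the namelist, and the adaptive-time-step maximum is max_time_step:

&physics
 mp_physics        = 8, 8,             ! Thompson on both domains
 cu_physics        = 11, 0,            ! MSKF on the 9 km domain only
 cudt              = 5, 0,
 cu_rad_feedback   = .true., .false.,
 sf_sfclay_physics = 1, 1,
 bl_pbl_physics    = 1, 1,             ! YSU on both domains
 radt              = 5, 5,
/
&domains
 use_adaptive_time_step = .true.,
 max_time_step          = 45, 15,
/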
 
A 1-km resolution WRF run over a high-topography area like Tibet frequently fails due to numerical instability. So far we don't have a solution to this issue. Sorry.
 
Hello,

I am also facing a similar issue. I am using 1 km resolution over a mountainous region with a steep topography gradient. I can complete the simulation with certain numbers of MPI processes, such as 640, 1200, or 1392, but the simulation hangs with other counts, such as 720, 768, 960, or 1440.

I assume this is because a certain sub-domain, if split between two processes, leads to an unstable simulation and a hang. If that could be the cause, how can I debug it, and how can I find out which process is working on which lat/lon?

Thanks,
Sandeep Agrawal
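
One way to investigate the decomposition question above (a hedged suggestion, not something confirmed in this thread): the process layout can be fixed with nproc_x and nproc_y in the &domains section of namelist.input, and I believe each MPI task reports its patch bounds (ips, ipe, jps, jpe) near the top of its rsl.out / rsl.error file; those i/j ranges can then be mapped to lat/lon using the XLAT and XLONG fields in wrfinput_d01. A hypothetical layout for 720 tasks:

&domains
 nproc_x = 24,   ! 24 x 30 = 720 MPI tasks (hypothetical split)
 nproc_y = 30,
/

By fixing the split and varying one of the two counts, it should be possible to narrow down which patch boundary triggers the hang.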
 
The epssm = 0.9, 0.9 is a great idea to try. I would also suggest not using the hybrid vertical coordinate in this region, if you are doing so (I believe I have read in the past that this coordinate can cause issues over very steep and high topography such as the Himalaya). Finally, try smaller time steps and/or possibly a 3-pass smoothing of the terrain field in geogrid; however, too much smoothing might eliminate some of the more interesting terrain features you want to resolve at 1 km grid spacing. Also try w_damping = 1 and possibly the new implicit-explicit vertical advection (IEVA) option, available starting, I think, in WRF v4.3. I personally have not tried this option, but perhaps it could help you.
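In namelist.input terms, those suggestions would look roughly like the sketch below; the IEVA switch name is my assumption and should be checked against the documentation for your WRF version. The 3-pass terrain smoothing is set in GEOGRID.TBL rather than in the namelist, e.g. smooth_option = smth-desmth_special and smooth_passes = 3 for the HGT_M entry (again, option names to be verified).

&dynamics
 hybrid_opt       = 0,          ! fall back to the terrain-following coordinate
 epssm            = 0.9, 0.9,
 w_damping        = 1,
 zadvect_implicit = .true.,     ! IEVA option (assumed variable name)
/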
 
@asandeep

I have also found that using dt = 3 x dx for the time step works for 1 km domains over the Himalayan mountains. Also, using hybrid_opt = 2 seemed to improve my handling of the mountains.
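For a 1 km domain that rule gives a 3 s step, i.e. roughly:

&domains
 time_step  = 3,    ! 3 x dx with dx = 1 km
/
&dynamics
 hybrid_opt = 2,
/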
 
Thanks everyone for your valuable suggestions, time, and help!
I tried many of the suggestions but still face the issue of the simulation hanging partway through without errors. The options I tried include smaller time steps, hybrid_opt = 2, etac, and epssm. I am able to run simulations on a few small sub-domains, but not on a bigger part of the Himalayan region. I have been using Noah-MP but will try Noah, as suggested above by @rdumais63.
Also, I have yet to try a 3-pass smoothing of the terrain field in geogrid, as suggested by @rdumais63.
Thanks again, everyone.
 
Can I see your namelist.input and namelist.wps file?
 