
WRF-UCM problem: ./wrf.exe only outputs the first time period, urban option does not work [GFS/FNL used]


lslrsgis

Member
Dear WRF community,

I am using WRF-UCM to run a 24-hour simulation over the New York region. GFS/FNL data are used to generate the initial and boundary conditions. The nested domains are set to 9 km, 3 km, and 1 km.
The urban option was turned on for the 3rd domain only: sf_urban_physics = 0, 0, 1,
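
For reference, a minimal sketch of how this setup might look in namelist.input; the nesting ratios and layout below are illustrative assumptions, not values taken from the attached files:

&domains
 max_dom             = 3,
 dx                  = 9000,  3000,  1000,
 dy                  = 9000,  3000,  1000,
 parent_grid_ratio   = 1,     3,     3,
/

&physics
 sf_urban_physics    = 0,     0,     1,     ! urban physics on d03 only
/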

WPS and ./real.exe complete without problems. ./wrf.exe also runs without reporting an error, but it only generates wrfout files for the 1st and 2nd domains at the initial time: wrfout_d01_2000-01-24_12:00:00 and wrfout_d02_2000-01-24_12:00:00.

Files for the remaining time periods and for domain 03, such as:

wrfout_d03_2000-01-24_12:00:00
wrfout_d02_2000-01-24_13:00:00
wrfout_d03_2000-01-24_13:00:00
wrfout_d02_2000-01-24_14:00:00
wrfout_d03_2000-01-24_14:00:00
wrfout_d01_2000-01-24_15:00:00

are not generated!

It seems that wrf.exe encountered a problem with domain 03 (the one using the urban option). Any help is appreciated! Thanks.

The output log file and namelists are attached.

LSL
 

Attachments

  • WRF_FNL_UCM_namelist_files.zip
    14 KB
Please turn on urban physics for all three domains, i.e., sf_urban_physics = 1, 1, 1, then try again.

It is recommended that all physics options except the cumulus scheme be the same for all domains.
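
As an illustration of that rule, the &physics block would typically look something like the sketch below; the scheme numbers are arbitrary examples, not a recommendation for this particular case:

&physics
 mp_physics          = 8,  8,  8,     ! same microphysics on every domain
 bl_pbl_physics      = 1,  1,  1,     ! same PBL scheme on every domain
 sf_urban_physics    = 1,  1,  1,     ! urban physics on for all domains
 cu_physics          = 1,  0,  0,     ! cumulus only on the coarsest domain
/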
 
Thanks for the reply. I have changed the urban option for all three domains and paid attention to cu_physics. They are now set as:
sf_urban_physics = 1, 1, 1,
cu_physics = 1, 0, 0,

However, after executing ./wrf.exe, there are still only two files generated, both at the initial time:
-rw-r--r-- 1 sliu user 105M Jul 31 20:04 wrfout_d01_2000-01-24_12:00:00
-rw-r--r-- 1 sliu user 93M Jul 31 20:04 wrfout_d02_2000-01-24_12:00:00

The namelist files and wrf log files after these changes are attached.
 

Attachments

  • WRF_FNL_UCM_ADOPTED.zip
    12.8 KB
Can you check whether the simulation ran to the end? Please look at all your rsl files and try to find the information that indicates when and where the case failed.
Your namelist options look fine. I am concerned about the finest domain with a grid interval of 1 km, which may be causing the problem.
 
Thanks for the quick response. I used the serial option, so wrf.exe was executed in non-MPI mode.

When checking wrf.log.20190801.txt (attached), I found that the program stopped after writing wrfout* for domain 02 without any complaint. It seems that wrf.exe does not go on to domain 03 for the initial time step.
 

Attachments

  • wrf.log.20190801.txt
    13.7 KB
BTW, would you take a look at this post about WPS-UCM? It is a similar topic, but with NARR as the boundary/initial condition: http://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=39&t=8149 Thanks very much.
 
I looked at your log file, which shows some information like:

WRF NUMBER OF TILES FROM OMP_GET_MAX_THREADS = 32
Tile Strategy is not specified. Assuming 1D-Y
WRF TILE 1 IS 1 IE 293 JS 1 JE 10
WRF TILE 2 IS 1 IE 293 JS 11 JE 20
WRF TILE 3 IS 1 IE 293 JS 21 JE 29
WRF TILE 4 IS 1 IE 293 JS 30 JE 38
WRF TILE 5 IS 1 IE 293 JS 39 JE 47
WRF TILE 6 IS 1 IE 293 JS 48 JE 56
WRF TILE 7 IS 1 IE 293 JS 57 JE 65

This makes me believe you are running in OpenMP mode instead of serial mode. Please let me know if I am wrong.

I couldn't find anything wrong in your settings. I ran a similar test case and it works fine. I suspect that you may not have sufficient memory to run this triply nested case. Can you try with more cores to see how it works?

Another option: can you rebuild WRF in MPI mode (dmpar), then run this case again?
 
Thanks, I have recompiled WRF with the serial and dmpar options separately.

For the serial option, ./real.exe runs well. However, after executing ./wrf.exe, it stops with the following error:

-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 771
ZDC + Z0C + 2m is larger than the 1st WRF level Stop in subroutine urban - change ZDC and Z0C
-------------------------------------------
mediation_integrate.G 1943 DATASET=HISTORY
mediation_integrate.G 1944 grid%id 1 grid%oid 1
mediation_integrate.G 1943 DATASET=HISTORY
mediation_integrate.G 1944 grid%id 2 grid%oid 2
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 771
ZDC + Z0C + 2m is larger than the 1st WRF level Stop in subroutine urban - change ZDC and Z0C
-------------------------------------------

What should I do? Thanks in advance.
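
For context, this message comes from the single-layer UCM (sf_urban_physics = 1): it requires that the canyon displacement height ZDC plus the canyon roughness length Z0C plus 2 m stay below the height of the first model level. Two directions commonly suggested for this error are to reduce the urban morphology parameters in URBPARM.TBL (from which ZDC and Z0C are typically obtained), as the message itself suggests, or to raise the lowest model level. A purely illustrative sketch of the second approach, with placeholder values that are not taken from the attached namelist:

&domains
 e_vert     = 45,   45,   45,
 ! only the first few eta levels are sketched here; the full list must
 ! contain e_vert values, decreasing monotonically from 1.0 to 0.0
 eta_levels = 1.000, 0.990, 0.978, 0.964, 0.948,
/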

P.S. WRF log file and namelist.input are attached.
 

Attachments

  • log.wrf.20190814.txt
    11.1 KB
  • namelist.output.txt
    83.4 KB