
Issues producing long-term (25-year) lateral boundary conditions in a single wrfbdy_d01 file in a WRF 4.0 real case

esmwg2020

Dear all,

For personal reasons, I have to put all 25 years of lateral boundary conditions into a single wrfbdy_d01 file in an ARW version 4.0 real case when I run real.exe, even though the wrfbdy_d01 file will be large and will contain more than 10000 time steps.

What I have done is:

1\ Changed the 'icount' limit from 10000 to 50000 at line 1201 of share/input_wrf.F90:
DO WHILE ( ( currentTime .GE. grid%next_bdy_time ) .AND. ( icount < 50000 ) ) !!!---changed 10000 to 50000

2\ Recompiled the model with './compile em_real >& log.file1', without running './clean' or './clean -a' first.

3\ Ran real.exe as usual.
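The patch step above can be sketched as a shell session. This is a hedged sketch: the file path and the DO WHILE line come from the post itself, and the edit is demonstrated on a temporary copy rather than a real WRF source tree.

```shell
# Demonstrate the edit on a temporary copy of the relevant line; in a real
# WRF tree the file to edit is share/input_wrf.F90 (line 1201 per the post).
mkdir -p /tmp/wrf_demo
cat > /tmp/wrf_demo/input_wrf.F90 <<'EOF'
DO WHILE ( ( currentTime .GE. grid%next_bdy_time ) .AND. ( icount < 10000 ) )
EOF

# Raise the hard-coded limit from 10000 to 50000
sed -i 's/icount < 10000/icount < 50000/' /tmp/wrf_demo/input_wrf.F90

# Confirm the edit took effect (should match exactly one line)
grep -c 'icount < 50000' /tmp/wrf_demo/input_wrf.F90
```

Two hedged notes on the rebuild: a full `./clean -a`, `./configure`, `./compile em_real >& log.file1` is safer than recompiling without cleaning, since stale object files can silently keep the old limit; and since input_wrf.F90 is on the file-reading side, the constant governing how many boundary times get written may live elsewhere in the source, so it is worth grepping the whole tree for other occurrences of the 10000 limit.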

a) The wrfbdy_d01 and wrflowinp_d01 files (lateral boundary conditions and the SST update) both stopped at the 10000th time step again (interval_seconds = 21600), which means the change I made in share/input_wrf.F90 had no effect;

b) What's more, the wrfbdy_d01 file is now much smaller than the one produced before changing 'icount' (about 66.0 GB this time, versus 458.0 GB before).

c) Although wrfbdy_d01 and wrflowinp_d01 stop updating at time step 10000, real.exe keeps running, and input data is still accepted for the subsequent time steps (please see the rsl.error.0000 in the attached files).
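For scale, these observations line up with simple arithmetic: 25 years of boundary data at interval_seconds = 21600 (6-hourly) implies far more than 10000 boundary times.

```shell
# Expected number of boundary times for 25 years of 6-hourly data
# (interval_seconds = 21600, as in the post above).
years=25
per_day=$(( 86400 / 21600 ))        # 4 boundary times per day
total=$(( years * 365 * per_day ))  # ignoring leap days
echo "$total"                       # prints 36500
```

So a hard limit of 10000 cuts off roughly three-quarters of the requested period, which is consistent with the truncated files described in a) and b).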
So I am wondering whether you or anyone else could tell me how to change the code and recompile ARW, so that I can put all 25 years of lateral boundary conditions into one wrfbdy_d01 file.
Attached are the namelist.input file and the shell script used to run real.exe.

Thanks so much!

Best,

Kai
 
Hi Kai,
This issue was actually addressed, and the code was modified in some of the more recent releases. Can you try your simulation with V4.4 to see if that resolves your issue? If you're interested in the code changes, take a look at the code commit here.
 
Hi kwerner,
Another issue I want to confirm: the wrfbdy_d01 file is now much smaller than the one produced before changing the maximum number of time steps (about five times smaller), yet this smaller wrfbdy_d01 still contains the full set of variables at all time steps.
So I want to know whether this smaller wrfbdy_d01 file is normal and correct, and whether it can be used for the simulation?
 
I apologize for overlooking that question in your initial post. I'm not certain why the file size is smaller this time. If you happen to be using an updated netCDF library with compression, the compression will make the files smaller. Otherwise, if the real program runs to completion and you're able to run wrf, then the wrfbdy file is okay. If you get an error when running, you'll know something went wrong. You can also check that all boundary times are available in the file by issuing
ncdump -v Times wrfbdy_d01
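To turn that check into a count, one can tally the quoted timestamps that ncdump prints. A hedged sketch, demonstrated on a small sample in ncdump's usual data-section layout (one quoted timestamp per line; the exact layout may vary with ncdump version):

```shell
# Sample of the data section that 'ncdump -v Times wrfbdy_d01' prints,
# written to a temporary file so the counting step can be shown.
cat > /tmp/times_sample.txt <<'EOF'
 Times =
  "1990-01-01_00:00:00",
  "1990-01-01_06:00:00",
  "1990-01-01_12:00:00",
  "1990-01-01_18:00:00" ;
EOF

# Count the boundary times by counting lines containing a quoted timestamp
grep -c '"' /tmp/times_sample.txt    # prints 4
```

On the real file, `ncdump -v Times wrfbdy_d01 | grep -c '"'` should report one entry per boundary time; for 25 years of 6-hourly data that should be roughly 36500, well above 10000.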
 
Thanks a lot, I really appreciate it.
 