Long-time WRF simulation

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.

D_Fora

New member
Hi, I want to do a 1-year WRF simulation with analysis nudging using ERA5 hourly reanalysis data.

Taking 2019 as an example, should I run WRF from December 2018 to December 2019, treating December 2018 as the spin-up period? If so, should nudging also be applied during December 2018?

Thanks,

HUANG
 
Hi Huang,
You shouldn't need more than about 24-48 hours for spin-up, so if you're interested in the data from 1 Jan 2019, you can just start it Dec 30 or 31 and it should be okay. As for nudging, you can modify the namelist options to declare when you want nudging to start and stop. In the &fdda section, you can add any of the following:
gfdda_begin_y (years from start of run)
gfdda_begin_d (days from start of run)
gfdda_begin_h (hours from start of run)
gfdda_begin_m (minutes from start of run)
gfdda_begin_s (seconds from start of run)

The options also apply for ending times (e.g., gfdda_end_h).
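For reference, a minimal &fdda sketch combining these timing controls might look like the following (the values are placeholders; set gfdda_end_h to however many hours of nudging you want, e.g. the full run length):

```
&fdda
 grid_fdda        = 1,                    ! analysis (grid) nudging on domain 1
 gfdda_inname     = "wrffdda_d<domain>",  ! nudging input file created by real.exe
 gfdda_interval_m = 360,                  ! minutes between analysis times
 gfdda_begin_h    = 0,                    ! start nudging at hour 0 of the run
 gfdda_end_h      = 8760,                 ! stop after 8760 h (one year)
/
```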
 
kwerner said:
(quoting the reply above)

Thank you kwerner.

Something still confuses me: can the FDDA method (analysis nudging) be used for the whole simulation?

As Jimy Dudhia mentioned in his slides (slides 2, 4, and 6: https://vdocuments.mx/wrf-four-dimensional-data-assimilation-fdda-jimy-dudhia.html), FDDA methods can be used for a) dynamical initialization and b) boundary conditions. When used for dynamical initialization, nudging starts in the pre-forecast period and stops before the forecast period, while nudging runs through the forecast when used for boundary conditions.

And in the last slide, Dudhia emphasizes that "FDDA ... should not be used on the domain of interest and in the time period of interest ...".

Does this summary statement conflict with what was mentioned before?
 
kwerner said:
(quoting the reply above)

Hi @kwerner, thank you for your helpful replies. I didn't want to post a new question, since my doubts are closely related to the one treated here.
I'll be running WRF for a 5-year period and would appreciate any insights. I'm using GFS FNL 0.25 as input, with two nested domains at 9 and 3 km resolution, about 100 to 110 grid points each (wider in e_we) and 40 vertical layers. I was strongly recommended to use spectral nudging on the coarser domain, which I have included in the namelist file with the standard values, except for the wavenumber, which I changed from 2 to 3. I plan to do runs two months at a time, with 5 days for spin-up and overlap.
All the examples I have read set nudging to stop at anything from 6 to 24 h (gfdda_end_h). Is there any benefit to extending the nudging beyond that, up to the whole simulation period? Also, do you have any other recommendations about aspects I should pay attention to, given that I'll be performing a long run?

Thank you in advance.
 
Huang (D_Fora),
Yes, the options I mentioned still apply. You can use those namelist parameters to specify the nudging start and end times, depending on which type of nudging you'd like to do. The parameters are set per domain, so you can set them for only the outer domain(s). When you set the grid_fdda variable, you can turn nudging off for the domain of interest (e.g., grid_fdda = 1, 0). A value of zero means no nudging on that domain.
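As a sketch, nudging only the outer of two domains might look like this (the interval and end-time values are illustrative):

```
&fdda
 grid_fdda        = 1, 0,          ! nudge d01; no nudging on d02
 gfdda_inname     = "wrffdda_d<domain>",
 gfdda_interval_m = 360, 360,      ! minutes between analysis times
 gfdda_end_h      = 8760, 0,       ! nudge d01 for the whole run
/
```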
 
@bkainy,
Nudging is beneficial throughout the whole simulation, especially for larger, continental-scale domains. It's less necessary for smaller domains that are more constrained by the boundaries. Can I ask why you intend to run the long simulation in 2-month increments, with 5-day spin-up, as opposed to just running it all together? Is this for timing (e.g., would you run all of these simultaneously in different directories, so that it finishes faster)?

As for long simulations, the only other thing that comes to mind is to make sure you have some additional SST input data. The GFS provides SST data with their files, but the resolution is temporally coarse. You will want something with higher temporal resolution. You may already have some, or a resource for it, but if you need additional resources, you can see slide 6 from this presentation.
 
Thank you, @kwerner

I'll employ the spectral nudging for the whole simulation, then, on the coarser grid. No particular reason for the two-months-at-a-time simulation - I did, however, read at least one paper that used this strategy, but with no clear justification. I may run the 5 years in a row, if the computing center that will be providing the resources is OK with handling this amount of data (that is a bit of a concern on their side).
I looked at the presentation you linked to, and it is really helpful. However, for the link (https://polar.ncep.noaa.gov/sst/) the datasets apparently are no longer available, and I couldn't reach http://nomads.ncdc.noaa.gov/data.php. I was only able to access the https://rda.ucar.edu/datasets/ds277.0/ dataset, but that one updates SST weekly - would it still be helpful?
Once I have SST data, the procedure is to run ungrib using Vtable.SST and to include the option "sst_update = 1" in &physics and the "auxinput4_*" options in &time_control, is that right?
 
Hi,
Yes, that would still be a helpful SST data type. Thank you for letting me know about the other links. I'll check into those and try to update them if I can find the new ones.

Yes, your strategy is correct with the vtable and namelist settings for sst.
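For reference, those settings together might look like the following sketch (the interval values are placeholders and should match the frequency of your SST files; the lower-boundary stream is auxinput4, with input files named wrflowinp_d<domain>):

```
&time_control
 auxinput4_inname   = "wrflowinp_d<domain>",
 auxinput4_interval = 360, 360,    ! minutes between SST updates
 io_form_auxinput4  = 2,           ! netCDF
/

&physics
 sst_update = 1,
/
```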

Right, wallclock time could be an issue with running the model for all 5 years at a time. I would suggest finding out the max wallclock time you are allowed. Then do a very quick simulation (e.g., 1 hour simulation time) to see how long this takes. You can extrapolate and see how long of a simulation you are able to run within the wallclock time limit. You may need to run 1 year at a time, or 1/2 a year - whatever works. You don't need to spin up anything after the initial run, though. You'll just make sure to have a restart file (wrfrst*) output at the time you want to start your new run, and it will start from that time and move forward until the next restart time.
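A restart run might be configured like this sketch (the interval is a placeholder; set the start_* times in &time_control to the timestamp of the wrfrst file you are continuing from):

```
&time_control
 restart          = .true.,   ! continue from an existing wrfrst_d0* file
 restart_interval = 43200,    ! minutes; write a restart file every 30 days
/
```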
 
Hi, thank you for suggesting using sst_update option in long simulation.

I think SST data can be found at this link (ftp://polar.ncep.noaa.gov/pub/history/sst). Files inside the folder named 'rtg_high_res' contain high-resolution data ranging from 2015 to 2020.

This dataset appears to be daily. Should I set 'interval_seconds = 3600 (or 21600)' in the &share section to interpolate the SST data to a 1-hourly (or 6-hourly) window, so that it works well with hourly (or 6-hourly) meteorological input data, and set 'auxinput4_interval = 60 (or 360)' in the &time_control section accordingly?
 
You'll want to set the time interval to the same interval you are using for your atmospheric data. So if your atmospheric data are available every 6 hours, use a 21600 interval for the SST data, as well. The metgrid program will interpolate the data to the 6-hr incremented values. You can set auxinput4_interval to the same increment.
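Put together, for 6-hourly atmospheric input the matching settings might look like this (illustrative values; note that interval_seconds is in seconds while auxinput4_interval is in minutes):

```
! WPS namelist.wps
&share
 interval_seconds = 21600           ! 6-hourly atmospheric and SST input
/

! WRF namelist.input
&time_control
 interval_seconds   = 21600,
 auxinput4_interval = 360, 360,     ! 360 min = 6 h, matching the input interval
/
```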
 