
Boundary conditions for real case LES || WRF-LES specialists, help please.

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled; if you have follow-up questions related to this post, please start a new thread from the forum home page.

Peter_Wilson

New member
Hi everyone, I want to run WRF-LES (large eddy simulation) for a real case with real lateral boundary conditions.
Which weather model output is best for driving a simulation with 100–150 m grid spacing?
I tried GFS (~28 km) and ECMWF (~9 km), but obviously they are far too coarse to resolve turbulent scales at this grid spacing.
The best I've found is HRRR with its 3 km grid. Is it worth trying, or is there something better for real-case LES?
 
Hi,
When you're running this with coarser input (e.g., GFS/ECMWF), are you using nesting so that you have coarser outer domain(s) around your 100-150 m domain of interest?
 
Yep, I did a one-way nested run with d01 at 3 km and d02 at 166 m (1:18 grid ratio), using HRRR boundary conditions for the coarsest domain. The result is complete nonsense: the model does not resolve any large eddies, and there is no eddy-resolving deep convection. The physical structure of the cloud, rain, snow and ice fields also remains the same as in a 3 km run; increasing the horizontal resolution only made them smoother, otherwise everything is the same.
This cannot be compared with idealized WRF-LES, where all large eddies (down to what the resolution allows) are resolved. Here, in the real case, it is complete nonsense, even though everything in the input files is correct.
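For reference, a minimal sketch of what the &domains section of namelist.input for a setup like the one described above might look like; all values (domain sizes, nest placement) are illustrative, not the actual namelist from this run:

&domains
 max_dom                 = 2,
 dx                      = 3000, 166.67,    ! d01 at 3 km (HRRR-driven), d02 at ~166 m
 dy                      = 3000, 166.67,
 parent_id               = 1, 1,
 parent_grid_ratio       = 1, 18,           ! the 1:18 jump in question
 parent_time_step_ratio  = 1, 18,
 i_parent_start          = 1, 60,           ! illustrative nest placement
 j_parent_start          = 1, 60,
 e_we                    = 200, 541,        ! illustrative sizes; (e_we - 1) must be divisible by the ratio
 e_sn                    = 200, 541,
 feedback                = 0,               ! one-way nesting
/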
 
The problem could very well be the 1:18 grid ratio. We recommend using a ratio of either 3:1 or 5:1 (odd ratios are typically best, but not mandatory), and no higher than 5:1. It may be necessary to add several nests in between to get down to the resolution you're interested in; a sketch of such a telescoping setup is given after the links below. For that to work, though, you may have to use ndown, since the number of processors you'll be able to use for the high-resolution domains (which will likely also be larger) would otherwise be limited by the number you can use for the coarser domains (which have fewer grid points). Here are a few informational pages that may be useful for you:

Best practices for namelist.wps settings and namelist.input settings

Presentation about WRF Nesting

Explanation about allowable/reasonable number of processors, based on the size of your domain(s).
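For illustration, here is a rough sketch of a &domains block that telescopes from 3 km down to roughly 111 m using only 3:1 steps, in line with the advice above. The domain sizes and nest placements are made up for the example and would have to be chosen for the actual case; with ndown, each parent-child pair is run separately rather than all four domains concurrently:

&domains
 max_dom                 = 4,
 dx                      = 3000, 1000, 333.33, 111.11,   ! 3:1 steps: 3 km -> 1 km -> ~333 m -> ~111 m
 dy                      = 3000, 1000, 333.33, 111.11,
 parent_id               = 1, 1, 2, 3,
 parent_grid_ratio       = 1, 3, 3, 3,
 parent_time_step_ratio  = 1, 3, 3, 3,
 i_parent_start          = 1, 60, 60, 60,                ! illustrative placement
 j_parent_start          = 1, 60, 60, 60,
 e_we                    = 200, 241, 301, 361,           ! illustrative sizes; (e_we - 1) divisible by 3
 e_sn                    = 200, 241, 301, 361,
 feedback                = 0,
/

The key point is the parent_grid_ratio column: each step is 3:1, so it takes several domains to get from a 3 km parent down to the 100-150 m range.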
 
I'm not sure the problem is the nesting itself. Nesting cannot improve the resolution of the initial and boundary conditions of the coarsest domain.
I think the situation can only be changed either by forcing large-eddy structures or by increasing the resolution of the boundary conditions themselves.
But I don't know how to force large-eddy structures, and I have never seen boundary conditions with a resolution of 166 meters; most likely they do not exist.

In general, I would like someone who is really engaged in large eddy simulation with real boundary conditions to help me.
 