WRF-LES (real data): microphysics doesn't work

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled; if you have follow-up questions related to this post, please start a new thread from the forum home page.

Peter_Wilson

New member
Hi, I'm running WRF in LES mode with real data, using NCEP GDAS Final Analysis data. The problem is that the fields QCLOUD, QRAIN, QGRAUP, QICE, and QSNOW, and their two-dimensional counterparts, are all zero and constant in the output file.

In my experience, these fields only become nonzero when dx is greater than about 8000-10000 m and, accordingly, the boundary-layer parameterization is enabled; but at that point it is no longer an LES run.

Obviously it shouldn't be this way. What can I do to fix the problem?
I am attaching my input file.
 

Attachments

  • namelist.input
    3.1 KB
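For context, the switches that put WRF into LES mode usually look like the fragment below. This is a hedged sketch, not the attached namelist; bl_pbl_physics, km_opt, and diff_opt are standard WRF namelist options, and the values shown are the conventional LES choices (comments are for explanation and should be stripped from a real namelist.input):

```fortran
&physics
 bl_pbl_physics = 0,   ! PBL scheme off: turbulence must be resolved explicitly
/

&dynamics
 km_opt   = 2,         ! 1.5-order (TKE) subgrid closure, the usual LES choice
 diff_opt = 2,         ! full diffusion evaluated in physical space
/
```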
The model top in your namelist is too low for a real-data case (p_top = 580 hPa). The domain is also too small.
In addition, an LES run in real-data mode requires special treatment of the lateral forcing, because the persistent lack of eddy structures in the lateral boundary conditions will degrade the simulation.
Overall, running real-data LES is quite complicated. Please consult the literature before you start.
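Raising the model top is set in the &domains section; a sketch follows (the values are illustrative only, not a recommendation for this case; note that p_top_requested is in Pa, and comments should be stripped from a real namelist.input):

```fortran
&domains
 p_top_requested = 5000,   ! 50 hPa, a common real-data model top (vs. 58000 Pa here)
 e_vert          = 61,     ! enough vertical levels to span the deeper column
/
```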
 
Sorry, but I have limited memory and computational resources, which prevents me from simulating with a huge number of grid cells. I also want the simulated domain to be isotropic, so that dx = dy = dz. The highest model top I can afford is 100 hPa while keeping dz = 100 m and a reasonable number of grid cells. I hope this is enough for a real-data LES case?
 

Attachments

  • namelist.input
    3.1 KB
For a real-data case, 100 m vertical spacing is far too coarse for the surface layer and the lower PBL. An unreasonable model configuration can cause problems at run time and degrade the simulation.

Please refer to the literature on real-data LES simulation. You might take a look at this paper:

https://ui.adsabs.harvard.edu/abs/2017EGUGA..1912439H/abstract

Hope it might give you some hints.
 
I did a one-way nested run with a 3 km grid for d01 and 166 m for d02 (an 18:1 grid ratio), using HRRR boundary conditions for the coarsest domain.
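For reference, the nesting described above would correspond to something like the following &domains fragment. This is a sketch based only on the numbers in the post; WRF's guidance usually recommends parent-to-nest ratios of 3 or 5, so a large jump like 18:1 is often done offline in one-way mode with the ndown program instead:

```fortran
&domains
 max_dom                = 2,
 parent_grid_ratio      = 1, 18,   ! 3000 m / 18 ≈ 167 m for d02
 parent_time_step_ratio = 1, 18,
 feedback               = 0,       ! one-way: no feedback from d02 to d01
/
```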

The result makes no sense: the model does not resolve any large eddies, and there is no eddy-resolving deep convection.
The physical structure of the cloud, rain, snow, and ice fields also remains the same as in the 3 km run. (It turned out that the clouds, ice, snow, etc. were simply in a different place; previously those features lay outside my domain, which is why the fields were constant zero in the output file.)
Increasing the horizontal resolution only made the fields smoother; otherwise everything is the same.

This cannot be compared with idealized WRF-LES, where everything works as it should: all large eddies (down to the resolution limit) are resolved, and I can clearly see resolved deep convection.
In the real-data case, by contrast, the output makes no sense, even though everything in the input file is correct.
So I see no reason to run at even higher resolution; the result will be the same. Coarse 3 km boundary conditions simply do not allow any eddies to be resolved.

I would like to know how to apply lateral forcing of large-eddy structures; this seems to be the only way out of this situation. If you have any materials on this, please share them.
 