
27 km land surface features in 3 km resolution domain


oscarvdvelde

I am running WRF practically out of the box on the Amazon cloud.
I specified geog_data_res = "default","2m","30s" for the geographical features, and it seems I have the files for 30s in the WPS_GEOG directory.
However, the model output in the 9 and 3 km domains over NW Colombia shows blocky 27 km-sized land use and sea mask features and heavily smoothed terrain height.
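For reference, the relevant part of my &geogrid section looks like this (the nesting ratios follow my 27/9/3 km setup; other entries omitted):

&geogrid
 parent_id         = 1, 1, 2,
 parent_grid_ratio = 1, 3, 3,
 dx                = 27000,
 dy                = 27000,
 geog_data_res     = "default", "2m", "30s",
 geog_data_path    = '../../WPS_GEOG/'
/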
[Attached images: terrain-smooth.png, landuse.png]
How could I improve this?
 
Which version of WPS did you run? You set geog_data_res = 'default', and 'default' may point to a different dataset in different versions of WPS.
If possible, please update WPS to the latest version, in which the 'default' data are the modis_30s land use and high-resolution topography, which will give you more detailed information.
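If you want to see exactly what 'default' resolves to in your installation, check geogrid/GEOGRID.TBL. In recent WPS versions the relevant entries look roughly like this (abbreviated; exact dataset paths vary by version):

===============================
name = HGT_M
 priority = 1
 dest_type = continuous
 rel_path = default:topo_gmted2010_30s/
===============================
name = LANDUSEF
 priority = 1
 dest_type = categorical
 rel_path = default:modis_landuse_20class_30s_with_lakes/
===============================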
 
Ming Chen said:
Which version of WPS did you run? You set geog_data_res = 'default', and 'default' may point to a different dataset in different versions of WPS.
If possible, please update WPS to the latest version, in which the 'default' data are the modis_30s land use and high-resolution topography, which will give you more detailed information.

Thanks for your comment. log.compile shows version 4.1. But note that I am using "2m" and "30s" on the 9 km and 3 km domains, respectively. Shouldn't that select the correct resolution already? Instead I see the large-domain grid (or is it GFS/ERA5 land surface information?) being applied to the inner domains.

Can I force all domains to use "modis_30s", or are there benefits to 5m and 2m?
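(I believe geogrid also accepts a '+'-separated list of resolutions per domain, e.g.

 geog_data_res = "modis_30s+30s", "modis_30s+30s", "modis_30s+30s",

so that each field falls back through the listed datasets in order, though I have not confirmed this.)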
 
I'm pretty sure it ran with "default","2m","30s", but with that blockiness.
But in fact, I cannot run it with, e.g., "10m","2m","30s".
It complains: ERROR: Could not open ../../WPS_GEOG/soiltype_top_10m/index
That is no surprise, because those files do not exist for 10m. I have all the files I can find on this page:
https://www2.mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html
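To double-check that everything geogrid might need is actually unpacked, each dataset directory under WPS_GEOG must contain an index file; a quick shell check (just a sketch):

for d in ../../WPS_GEOG/*/ ; do
  [ -f "${d}index" ] || echo "missing index: $d"
done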

I did get it to run with "30s","30s","30s", but the output still has just as low a resolution as before.
 
Please send me your namelist.wps; I would like to repeat your case. It may take a few days because our supercomputer is down these days. Thanks for your patience.
 
Here is the namelist of the "30s","30s","30s" run and the WPS logs.
Thanks for your time.

[Attached image: WPS_GEOG_contents.PNG (directory listing of WPS_GEOG)]
 

Attachments

  • geogrid.log (43.8 KB)
  • namelist.wps (827 bytes)
  • metgrid.log (995.7 KB)
  • namelist.input (4.1 KB)
Hi,
I ran your case and the output is correct; somehow I cannot reproduce your error. Please see the attached plots.
I am not sure what is going on in AWS. Can you try to run the case on a local machine?
 

Attachments

  • ncview.LU_INDEX.ps (803.6 KB)
I currently have no other machine available, except another AWS instance with the same package of multiple WRF V4 versions (up to 4.1 in my case).
Perhaps I can compile a newer version on it.

A run I had already done there had the same issue. I looked into the corresponding geo_em.d03 and met_em.d03 output. The geo_em file has the desired resolution of terrain height and land-sea mask, but the met_em file is smoothed/blocky. I guess only the latter is used for the actual model run?
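For what it's worth, I compared the two files visually with ncview from the netCDF utilities (the met_em timestamp below just illustrates the naming pattern):

ncview geo_em.d03.nc
ncview met_em.d03.2023-04-21_00:00:00.nc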

Is there anything we can do with this information?
 
Hi,
The real program reads only the met_em data to produce initial and boundary conditions for the WRF run. Therefore, the met_em files must be correct.
I am sorry that you have had to go through so much trouble. I am not very familiar with AWS computing and instances, but I guess something went wrong there, which resulted in the wrong terrain height, landmask, etc.
If the code runs successfully, I would say that you don't need to recompile. This is a data issue, and you may need to check whether the high-resolution data (e.g., the MODIS 30s land use and 30s terrain data) are available in your AWS instance.
 
As the geo_em file does contain the high-resolution data, I think the data in the WPS_GEOG folder are probably fine? I included the file-list screenshot a few posts earlier; it seems consistent with what is available on the surface-data download page.

Then somehow the geo_em information is not used by metgrid.exe, which produces the met_em files.
Do the input fields of GFS (or ERA5, which I am also trying) matter for the final resolution of the land details?
I could also try to eliminate the intermediate 27 km grid to check whether I then get 9 km as the surface resolution.
 
After running geogrid.exe, we have the basic static information, including landmask, lu_index, HGT_M, etc. metgrid.exe reads this information and keeps it in the met_em files. Note that metgrid.exe also reads data from the intermediate files, which include coarse-resolution static data for the landmask (named LANDSEA) and the terrain height (named SOILHGT) from the forcing data. However, this information shouldn't affect lu_index. Also, the HGT produced by geogrid shouldn't be changed after running metgrid.exe.
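A quick way to confirm what metgrid carried through is to scan the met_em header for these fields (the timestamp in the file name is illustrative):

ncdump -h met_em.d01.2023-04-21_00:00:00.nc | grep -iE 'LU_INDEX|LANDMASK|HGT_M|SOILHGT|LANDSEA'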
 
Sorry, I was mistaken about the met_em files; they actually also contain the high-resolution fields, plus some smoothed ones, like PSFC (should it not follow the MODIS terrain height?).
That leaves the model output files as the only ones not using these fields. For example, LU_INDEX is very finely represented in the met_em files, but blocky in wrfout. The 2 m variables and QVAPOR at the lowest 1-2 levels have blocky coastlines, and HGT is smoothed.
Maybe I should try one of the different versions of WRF included on the AWS...
 
It is reasonable that PSFC doesn't follow the MODIS terrain, because it comes from the forcing data and should follow the pattern of SOILHGT.
However, lu_index and low-level variables like T2 and QV should not be blocky.
 
Hmm, I notice this namelist.input setting of mine: input_from_file = .true.,
Shouldn't it be .true., .true., .true., given that I have 3 domains, or doesn't it matter for my issue?

Update: This solved the issue! I wish I had realized this a bit earlier :roll:
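For the record, the corrected &time_control entry, with one value per domain to match max_dom = 3 in &domains (other settings omitted):

&time_control
 input_from_file = .true., .true., .true.,
/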
 
Thanks for the update. I should have taken a look at this namelist.input earlier; somehow we were focused on WPS and thought it was a data issue. Anyway, it is good that the issue was solved.
 