(RESOLVED) Weird SST/SKINTEMP artifacts along coast

Greetings,

I am downscaling a very low-resolution GCM (MPI-ESM1-2-LR) using WRF. I am using skin temperature (SKINTEMP) from the GCM output to generate the intermediate binaries.

The resulting met_em files show blocky patterns in the SST field along the shoreline that precisely match the LANDSEA mask. Is there a standard way to smooth out these artifacts using metgrid.exe?

All my best, and thank you.
-Stefan
 
Stefan,
I suspect you didn't include the LANDSEA mask in your GCM data. Please obtain this field, then rerun ungrib and metgrid.
Let me know whether this gives you better results along the coast.
 
Thanks for the reply,

LANDSEA is included as a binary field, 1 for land and 0 for ocean. The problem is that there are swaths of coastal areas in the WRF domain where SKINTEMP from GCM land points is being mapped onto water.

I have changed my approach, and I am now using the SST variable instead of the SKINTEMP variable. However, SST gives me the same problem, except that SST doesn't extend all the way to the WRF coastline; there are missing values between the GCM SST boundary and the WRF coastline. Is there an interpolation method that bypasses this issue? To reiterate, I want these swaths of missing data to be replaced by the nearest "good" SST data from the GCM.
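If it helps clarify what I'm after: outside of WPS, this kind of nearest-neighbor fill could be prototyped on the raw GCM field in Python. A minimal sketch (the function name, brute-force search, and the -999. missing flag are my own choices for illustration, not anything from WPS):

```python
import numpy as np

def fill_nearest(sst, missing=-999.0):
    """Replace each missing SST cell with the value of the nearest valid
    cell (brute force over the grid; illustrative only, fine for small
    arrays but not production-sized domains)."""
    valid = np.argwhere(sst != missing)
    out = sst.copy()
    for j, i in np.argwhere(sst == missing):
        # Squared distance from this missing cell to every valid cell
        d2 = (valid[:, 0] - j) ** 2 + (valid[:, 1] - i) ** 2
        jj, ii = valid[np.argmin(d2)]
        out[j, i] = sst[jj, ii]
    return out
```

That is, every flagged point takes on the nearest "good" GCM SST, which is the behavior I'd like metgrid to reproduce.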

Thanks,
-Stefan
 
Update,

I have been playing around with the interpolation options in metgrid/METGRID.TBL for SST. Here is what I have...

name=SST
interp_option=wt_average_16pt
fill_missing=-999.
missing_value=-999.
flag_in_output=FLAG_SST
interp_land_mask = LANDSEA(1)

This option uses a weighted 16-point average to compute SST at each met_em grid cell, masking values over land. This gets me closer to my goal, as it appears the masked areas are being "cut into" and populated by GCM SSTs. However, there are still areas where the SST is on the order of 130 K... far too low. I thought that bad or missing values would have been skipped in the averaging process here.

I've also tried the "search(r)" interpolation method, in which the user defines a radius within which the WRF grid's SST is assigned the GCM SST of the nearest unmasked value, but I get large swaths of strange results in the met_em file. Here, "r" is the radius in GCM grid points away from metgrid's WRF cell of interest.
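To check my own expectations of what a radius-limited search should return, I mocked one up. This is a toy sketch, not the WPS source; the function name and the -999. flag are mine, and the mask uses my GCM's convention (1 = land, 0 = ocean):

```python
import numpy as np

def search_r(src, mask, j, i, r, missing=-999.0):
    """Return the nearest unmasked, non-missing source value within
    r grid points of (j, i); return the missing flag if none is found."""
    best, best_d2 = missing, (r + 1) ** 2
    nj, ni = src.shape
    for jj in range(max(0, j - r), min(nj, j + r + 1)):
        for ii in range(max(0, i - r), min(ni, i + r + 1)):
            d2 = (jj - j) ** 2 + (ii - i) ** 2
            if (d2 <= r * r and mask[jj, ii] == 0
                    and src[jj, ii] != missing and d2 < best_d2):
                best, best_d2 = src[jj, ii], d2
    return best
```

If metgrid's search behaves like this, then strange results would point to the mask or the flag value being wrong rather than the method itself.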

Thanks,
-Stefan
 
Update --

Problem not fixed, but I am getting closer. See the attached figure. I used the following interpolation options in the METGRID.TBL:

name=SST
interp_option=wt_average_4pt
fill_missing=-100000.
missing_value=-100000.0
flag_in_output=FLAG_SST
interp_mask = LANDSEA(1)
masked=land

These strange artifacts just need to be "smoothed" toward the SSTs around them, excluding the masked values of course. Can WRF/WPS do this?
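In case there's no built-in way, the smoothing I have in mind could also be done as a post-processing pass on the met_em SST field. A rough sketch (array names, the 1 = land / 0 = water mask convention, and the flag value are my own assumptions):

```python
import numpy as np

def smooth_masked(field, mask, missing):
    """One pass of a mask-aware 3x3 smoother: each valid water point
    becomes the mean of its valid water neighbors (including itself);
    land and missing points are excluded and left untouched."""
    out = field.copy()
    nj, ni = field.shape
    for j in range(nj):
        for i in range(ni):
            if mask[j, i] != 0 or field[j, i] == missing:
                continue
            vals = []
            for dj in (-1, 0, 1):
                for di in (-1, 0, 1):
                    jj, ii = j + dj, i + di
                    if (0 <= jj < nj and 0 <= ii < ni
                            and mask[jj, ii] == 0
                            and field[jj, ii] != missing):
                        vals.append(field[jj, ii])
            out[j, i] = sum(vals) / len(vals)
    return out
```

Applying this a few times would pull isolated outliers toward the surrounding SSTs without ever mixing in land or flagged values.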

Thanks,
-Stefan

[Attachment: Screen Shot 2020-04-28 at 12.19.44 PM.png]
 
Problem fixed!

The problem was not with WPS, but rather with my LANDSEA mask as the first commenter pointed out. My final METGRID.TBL setup looked like:

name=SST
interp_option=wt_average_16pt
fill_missing=285.
missing_value=100000.
flag_in_output=FLAG_SST
interp_mask = LANDSEA(1)
masked=land

Thanks for your help,
-Stefan
 