
Running Problem for GFS Update V15.2

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.


New member
Since the GFS update to version 15.2 (November 7, 12Z), I have been experiencing trouble when running wrf.exe. You can find the error from the rsl file below. I am currently using WRF-ARW V3.9 and WPS 4.1. Upgrading WRF to version 4.0 or 4.1 did not help. Any help will be greatly appreciated.

d01 2019-11-07_18:01:00  call phy_prep
d01 2019-11-07_18:01:00  DEBUG wrf_timetoa():  returning with str = [2019-11-07_18:01:00]
d01 2019-11-07_18:01:00  call radiation_driver
d01 2019-11-07_18:01:00 Top of Radiation Driver
d01 2019-11-07_18:01:00 calling inc/
d01 2019-11-07_18:01:00  call surface_driver
d01 2019-11-07_18:01:00 in SFCLAY
d01 2019-11-07_18:01:00 in SFCLAY
*** Process received signal ***
Signal: Segmentation fault (11)
Signal code: Address not mapped (1)
Failing at address: 0xfffffffe0746ad00

Another very kind user researched the issue and concluded the following:

The Noah Land Surface Model ("sf_surface_physics = 2," in the namelist.input file) does not work properly with the GFS update. If you look at the grib files, there used to be finite values of soil temperature and moisture at sea points; now they are NULL, as they should be. I am not entirely sure, but Noah LSM seems to get confused by the NULL values, and using a different land surface model may solve the problem. In my case, I switched to Noah-MP and successfully completed the run.
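The diagnosed failure mode can be sketched numerically: after the v15.2 change, sea points in the GFS soil fields carry a missing/fill value rather than a finite number, and if any cell that WRF treats as land inherits that fill value during interpolation, Noah has nothing sensible to integrate. A minimal check (the constant 9.999e20 is a common GRIB fill value, but treat the exact value, like everything in this sketch, as an assumption for your files):

```python
import numpy as np

def count_missing_soil(soil, landmask, fill_value=9.999e20):
    """Count land cells whose soil field is undefined.

    soil     : 2-D soil temperature or moisture field
    landmask : 2-D mask, 1 = land, 0 = water (WRF LANDMASK convention)

    Before GFS v15.2, sea points carried (meaningless) finite values;
    afterwards they are encoded as missing, so any land cell that picks
    up the fill value during interpolation will break Noah.
    """
    land = np.asarray(landmask) == 1
    field = np.asarray(soil, dtype=float)
    missing = np.isnan(field) | np.isclose(field, fill_value)
    return int(np.count_nonzero(land & missing))
```

A nonzero count on the interpolated soil fields would point to exactly the coastal-cell problem described above.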

I am posting it here in case it helps someone else. The sf_surface_physics value for Noah-MP is 4.
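For reference, the switch is a one-line change in the &physics section of namelist.input (a sketch; the values are per domain, and all other physics settings stay whatever your setup already uses):

```
&physics
 sf_surface_physics = 4, 4,   ! 4 = Noah-MP (was 2 = Noah)
 num_soil_layers    = 4,      ! Noah and Noah-MP both use 4 soil layers
/
```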
Yes, the problem is also present in WRF version 4.1, and it seems related to the GFS change to V15.2.

I ran some tests using different sf_surface_physics configurations, and the only scheme that works without problems is Noah-MP.

The Noah land surface model (option 2 in the namelist) crashes immediately after the start of wrf.exe, without any message in the rsl files.

The 5-layer thermal diffusion scheme (option 1) crashes after several hours of simulation, and it produces highly unrealistic 2-meter temperatures. In some places in the model domain, geographically very confined and adjacent to the sea, the skin temperature (and consequently the 2-meter temperature) drops to -60 / -80 °C: it seems the scheme cannot correctly handle the undefined values over the sea.
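A quick sanity check for this kind of corruption can be scripted against the 2-m temperature. The sketch below assumes the field is in Kelvin, as in the `T2` variable of wrfout files; the thresholds and the file path in the comment are illustrative:

```python
import numpy as np

def flag_unrealistic_t2(t2_kelvin, low_c=-60.0, high_c=60.0):
    """Boolean mask of cells whose 2-m temperature (Kelvin, as in the
    WRF `T2` output variable) falls outside a plausible range."""
    t2_c = np.asarray(t2_kelvin, dtype=float) - 273.15
    return (t2_c < low_c) | (t2_c > high_c)

# Usage with a wrfout file (hypothetical path; requires the netCDF4 package):
#   from netCDF4 import Dataset
#   t2 = Dataset("wrfout_d01_2019-11-07_18:00:00").variables["T2"][0]
#   bad = flag_unrealistic_t2(t2)
#   print(bad.sum(), "suspicious cells")
```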

Thanks for the valuable hints about the problem.


I also had problems with the GFS update v15.2: when running wrf.exe (from WRF-ARW versions 3.5 and 3.9), the model crashed just after the simulation began.

I managed to solve the problem while keeping my model version and my physics (still using Noah as the LSM!). These are the steps I followed:

1. Download the LANDN field from NOMADS.

2. Edit Vtable.GFS so that the 7th field (the description column) of the LANDN entry is not empty. If this field is left blank, ungrib.exe does not write the field to the intermediate FILE.
diff Vtable.GFS.old Vtable.GFS
<   81 |   1  |   0  |      | LANDSEA  | proprtn | Land/Sea flag (1=land, 0 or 2=sea)      |  2  |  0  |  0  |   1 |
<   81 |   1  |   0  |      | LANDN    | proprtn |                                         |  2  |  0  | 218 |   1 |
>   81 |   1  |   0  |      | LANDSEA  | proprtn | Land/Sea flag (1=land, 0 or 2=sea)      |  2  |  0  |  0  |   1 |
>   81 |   1  |   0  |      | LANDN    | proprtn | Land/Sea flag (1=land, 0 or 2=sea)      |  2  |  0  | 218 |   1 |
3. Edit METGRID.TBL so that when metgrid.exe finds the LANDN field, it uses its values to fill or overwrite the LANDSEA field.
> name=LANDN ; output_name=LANDSEA  # If we get LANDN, use entry from LANDSEA and
>                          #   write the field out as LANDSEA
> ========================================

The point is that you need LANDN, rather than LANDSEA, as the valid land-sea mask in order to interpolate the subsoil temperature and moisture onto your domain.
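The Vtable edit in step 2 can also be scripted, e.g. if you maintain several Vtables. A sketch under two assumptions: the file uses the stock Vtable.GFS column layout (the description is the 7th |-separated column), and the LANDSEA row precedes the LANDN row, as in the diff above:

```python
def patch_vtable_landn(text):
    """Copy the LANDSEA description into the LANDN entry's empty
    description column, so ungrib.exe writes LANDN to the intermediate
    file instead of silently dropping it."""
    lines = text.splitlines()
    desc = None
    for i, line in enumerate(lines):
        fields = [f.strip() for f in line.split("|")]
        if "LANDSEA" in fields:
            desc = line.split("|")[6]   # remember the description cell
        elif "LANDN" in fields and desc is not None:
            parts = line.split("|")
            parts[6] = desc             # fill the blank description
            lines[i] = "|".join(parts)
    return "\n".join(lines)

# Usage:
#   patched = patch_vtable_landn(open("Vtable.GFS").read())
```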
Hi Ming,

Thanks for your message. I was running WRFv4.0.3 and WPSv4.0.3 and still had problems when GFS was upgraded to V15.2. I had to change my sf_surface_physics scheme from option 2 to 4 (Noah-MP) before I could get WRF running again. Do I need a newer version of WPS than v4.0.3 to run with the other surface physics scheme?

Thanks to all the others too for the messages which helped to solve the problem!
Please send me your namelist.wps and namelist.input so I can take a look. I would like to repeat this case and find out what is going on.
By the way, do you work with GFS analysis products or forecast products?
Hi Ming,

Sorry for the delayed response. Please find attached the namelist.wps and namelist.input files we were using without problems before 7 November 2019. We work with GFS forecast products to drive operational WRF forecast runs. We are currently using WRFv4.0.3 and WPSv4.0.3.

Since 7 November we have been running WRF successfully, but using the Noah-MP sf_surface_physics option.

All the Best


  • namelist.input (6.1 KB)
  • namelist.wps (1.3 KB)
I have run a test case using your namelist.wps and namelist.input, except that I processed GFS data for 2019-12-04 (a date after the GFS update).
I tried both GFS forecast and GFS analysis products, and both cases work fine.

Note that I download GFS data from NCAR CISL website. Where did you download the GFS data?

We hit the same problem in our operational model. We updated WPS/WRF in June, so we are running > v4.0, though I need to check the exact version. Looking at the files we get from GFS, they moved from having soil temperatures assigned on every grid cell (presumably using SST where not on land?) to having null values. Updating the land mask (LANDN) as suggested above has got it running again, though I haven't checked the results yet.

I get GFS forcing from*/


  • namelist.input (4.5 KB)
  • namelist.wps (1.2 KB)
I suspect this is because the files you downloaded don't include all the necessary information. I remember someone posted an issue similar to yours; I will find the link and send it to you. You may also download GFS data from CISL and compare it with the data you downloaded from NCEP. Apparently the data archived at CISL works fine.
Please see the link here:
At the bottom of that page, you can find the solution to the problems with using the new GFS data. I hope it is helpful for you.