
Adding wind at 22 meters to the model

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.


New member
Hi All,

For my work I need 22-meter wind in the model, so I modified some of the model code. The modified code works fine in serial, but when I run the model with more than one processor, stripes with values ranging from -inf to inf appear in U22/V22. So the problem seems to be related either to running with multiple processors or to input/output. I have attached all of the modified files; the lines I edited or added are marked with "AF". I also modified Registry.CONVERT, but I can't attach it here. The namelist is attached as well.

Any help or suggestion would be appreciated.


  • io_grib1.F
    109.2 KB · Views: 58
  • io_grib2.F
    135.6 KB · Views: 58
  • module_bl_ysu.F
    67.8 KB · Views: 58
  • module_first_rk_step_part1.F
    99.9 KB · Views: 58
  • module_pbl_driver.F
    118.1 KB · Views: 54
  • module_sf_sfclay.F
    50.5 KB · Views: 59
  • module_surface_driver.F
    350.5 KB · Views: 60
  • module_sf_sfclayrev.F
    56.4 KB · Views: 61
  • Registry.EM_COMMON
    365 KB · Views: 62
  • namelist.input
    7.2 KB · Views: 63
To attach the Registry.CONVERT file, just add .txt to the end of the filename - that should be sufficient. Can you also let me know which version of the model you modified? Thanks!
I used WRF-3.9, and now I have the same problem with WRF-4.0.1.


  • Registry.CONVERT.txt
    48.7 KB · Views: 56
  • Registry.EM_COMMON.txt
    365 KB · Views: 56
  • Registry.EM_COMMON.var.txt
    35.6 KB · Views: 61
When you see stripes with distributed memory and the code works OK in serial, that is typically an indication that a required MPI communication is missing.

I am assuming that your u22 and v22 fields are similar to the u10 and v10 diagnostics - such fields should not need to be communicated to other processors for later processing.

The steps below are what we would do to track down this issue, and what we are asking you to try.

1. Is the problem indeed an MPI boundary interface issue?
To test this out, simplify the u22 and v22 assignments to be values that are known, but have horizontal gradients. For example u22(i,j) = i, and v22(i,j) = j.

If this "fixes" the problem, then the original u22/v22 computation is where the trouble lies, and you should identify what is going wrong there. If this does not fix the problem, track down where u22 and v22 are being re-assigned from reasonable values.
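The test pattern described above might look like the sketch below. This is illustrative only: the tile-bound names (its/ite, jts/jte) and domain bounds (ide/jde) follow the usual WRF physics-driver conventions, but check the actual dummy-argument names in your modified routine.

```fortran
! Illustrative test: fill u22/v22 with known values that carry a
! horizontal gradient. Because i and j here are global domain indices,
! the field should be smooth across processor boundaries; any stripes
! that appear then point at communication or I/O, not the physics.
DO j = jts, MIN(jte, jde-1)
   DO i = its, MIN(ite, ide-1)
      u22(i,j) = REAL(i)   ! smooth west-to-east gradient
      v22(i,j) = REAL(j)   ! smooth south-to-north gradient
   END DO
END DO
```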

2. Are correct horizontal indexes being used?
There are examples of indexing in many diagnostic, dynamics, and physics routines. If u22 and v22 are similar to u10 and v10, find locations that are similar to the computations of those diagnostics.
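As a sketch of the indexing convention to look for (array names and the sample assignment are hypothetical; the real u10/v10 code in the surface-layer schemes computes the values from similarity theory): diagnostic arrays are dimensioned with memory bounds but computed only over tile bounds, clipped at the domain edge for unstaggered fields.

```fortran
! Arrays are dimensioned with memory bounds (ims:ime, jms:jme) ...
REAL, DIMENSION(ims:ime, jms:jme) :: u22, v22

! ... but loops run only over tile bounds, clipped to the domain edge.
DO j = jts, MIN(jte, jde-1)
   DO i = its, MIN(ite, ide-1)
      ! Placeholder assignment from the lowest mass-level wind; replace
      ! with your actual 22-m interpolation, mirroring how the scheme
      ! computes u10/v10 at the same (i,j).
      u22(i,j) = u_phy(i,kts,j)
      v22(i,j) = v_phy(i,kts,j)
   END DO
END DO
```

Writing outside the tile bounds, or looping over memory bounds instead, is a common cause of garbage values at patch edges in multi-processor runs.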

For anyone interested, there is additional conversation regarding this here: