
wrf.exe keeps running but doesn't write any output (sf_sfclay_physics)

syyang

New member
I'm running single-domain and two-domain experiments with WRF version 4.4.2.
I am using OpenMPI 4.1.4 on aarch64, a 64-bit processor based on the ARMv8 architecture.

I confirmed that the two-domain experiment completed normally,
and then ran the single-domain experiment keeping the d01 settings as they were.

The only difference between the two experiments is max_dom in namelist.input: 2 versus 1.
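Concretely, the single edit between the two runs was this one entry in the &domains section (the fragment below is illustrative; all other namelist entries were left unchanged):

```fortran
&domains
 max_dom = 1,   ! 2 for the two-domain run, 1 for the single-domain run
 ...
/
```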

If I change max_dom to 1 and run it, wrf.exe keeps running, but it never creates any wrfout_d01* files.
The result is the same whether I run it locally (mpirun -np 2 ./wrf.exe) or on a compute node (via a wrf.slurm script).

I confirmed that the single-domain experiment ran fine with the same settings on another server built with the Intel compiler.

Attached are the namelist.input and the rsl.error.0000 log from the OpenMPI run.

Please help me.
 

Attachments

  • namelist.input
    3.8 KB · Views: 4
  • rsl.error.0000
    3.2 KB · Views: 3
found the answer.

I left all other physics options the same and changed "sf_sfclay_physics" from 1 to 91, and WRF worked normally.

Option 1 is "Revised MM5 Monin-Obukhov scheme" and option 91 is "Old MM5 scheme".
Could it be that option 1 does not work on the aarch64 core?
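The working change, written as a namelist fragment (only this variable was altered; the comment values come from the option names above):

```fortran
&physics
 sf_sfclay_physics = 91,   ! was 1 (Revised MM5 Monin-Obukhov); 91 = old MM5 scheme
 ...
/
```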
 
found the answer.

I left all other physics options the same and changed "sf_sfclay_physics" from 1 to 91, and WRF worked normally.

Option 1 is "Revised MM5 Monin-Obukhov scheme" and option 91 is "Old MM5 scheme".
Yes, this is correct.

Could it be that option 1 does not work on the aarch64 core?
No, option 1 should work on an aarch64 core. Have you tried compiling WRF in dmpar mode and rerunning this case?
 
I already compiled WRF in dmpar mode.

I successfully ran the nested-domain experiment with the dmpar build,
and only switched to a single domain under the same compile options.
 
This is weird; I see no reason why WRF would run fine but not write any output. Did you check whether you have enough space to save the output?
 
Yes, we confirmed that there was sufficient storage and that other files, such as the downloaded input data, were created successfully.

To check how far WRF got, I added debug log messages to the source code.
All messages containing "seon" in the "WRF/phys/module_sf_sfclayrev.F" file are log lines I added.

I think it stops while running the sfclay module.
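For anyone wanting to trace a run the same way, this kind of marker can be added with WRF's standard `wrf_debug` logging routine, which writes to the rsl.* files. A minimal sketch (the placement, arguments, and message text here are illustrative, not the actual edits from the attachment):

```fortran
! Illustrative markers inside WRF/phys/module_sf_sfclayrev.F:
SUBROUTINE sfclayrev( ... )                          ! existing routine, arguments omitted
   CALL wrf_debug( 0, 'seon: entering sfclayrev' )   ! level 0 always prints
   ...
   CALL wrf_debug( 0, 'seon: leaving sfclayrev' )    ! absence of this line in rsl.* shows where it hangs
END SUBROUTINE sfclayrev
```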
 

Attachments

  • module_sf_sfclayrev.F
    60.8 KB · Views: 0
  • single_domain_rsl.error.0000
    7.4 KB · Views: 0
I suspect something went wrong with an unrealistic value like NaN; on some machines wrf.exe won't stop immediately. It just hangs there until the allotted wall-clock time runs out.

To figure out whether this is the case, please recompile WRF in debug mode (i.e., ./configure -D) and rerun the case. The log file will tell you exactly when and where the model first crashed.
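A sketch of the suggested debug rebuild, assuming a standard WRF source tree and the em_real case (the compile target may differ for other cases):

```shell
./clean -a                        # remove objects from the previous build
./configure -D                    # same configure choices as before, plus debug flags
./compile em_real >& log.compile  # rebuild; check log.compile for errors
mpirun -np 2 ./wrf.exe            # rerun; rsl.error.* should now show where it fails first
```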
 