
Error while reading namelist dynamics

This post was from a previous version of the WRF&MPAS-A Support Forum. New replies have been disabled and if you have follow up questions related to this post, then please start a new thread from the forum home page.

Hi everyone,
My issue concerns running WRF 3.5.1 starting from a namelist.input that I previously used to run WRF 4.2.
I was not sure it would work, since many changes occurred between the two releases.
First I deleted the physics_suite = 'CONUS' entry, which is not implemented in 3.5.1.
After several aborted runs, following the indications in the error file, I added the requested empty namelist records such as &dfi_control, &tc, &scm, ... (never required in version 4.2), but WRF still does not run and returns the error:

------ ERROR while reading namelist dynamics ------
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 9651
ERRORS while reading one or more namelists from namelist.input.
-------------------------------------------
Abort(1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
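For reference, an "empty" namelist record in namelist.input is just the record name followed by a closing slash. A minimal sketch, using only the record names mentioned above (WRF 3.5.1 may require others as well):

```
&dfi_control
/

&tc
/

&scm
/
```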

I have also attached the namelist.input and namelist.output files.

Thank you in advance
 
Oh, thank you so much, now it runs... but only for a few minutes of simulation time.
I see in the Users Guide that the introduction of hybrid vertical coordinates is one of the most important improvements of the 4.* versions, and I had not realized that before.
This probably means that the input files generated by real.exe using the hybrid vertical coordinates are not fully suitable for WRF 3.5.1.
That would explain the crash of the simulation, with the error message in the attached file.
Am I correct?
I was actually trying to run the simulation with WRF 3.5.1 using the domain and boundary conditions generated with the 4.1 suite (WPS+WRF).
I understand the issue related to the boundary conditions; I am now wondering whether or not to restart the simulation chain from the beginning.
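One way to confirm this hypothesis is to look at the variable list of the wrfinput file (e.g. from `ncdump -h wrfinput_d01`) and check for the V4-era hybrid-coordinate fields. A sketch, assuming the hybrid-coordinate arrays carry names like C1H/C2H/.../C4F (my assumption about the V4 field names; verify against your own ncdump header):

```python
# Assumed names of the hybrid-coordinate fields written by WRF v4 real.exe;
# a v3.5.1 executable does not know these variables.
HYBRID_VARS = {"C1H", "C2H", "C3H", "C4H", "C1F", "C2F", "C3F", "C4F"}

def hybrid_coordinate_vars(var_names):
    """Return any hybrid-coordinate fields present in a wrfinput variable list."""
    return sorted(HYBRID_VARS.intersection(var_names))

# Stand-in variable list; in practice paste the names from `ncdump -h wrfinput_d01`.
wrfinput_vars = ["Times", "U", "V", "T", "C1H", "C2H", "ZNU", "ZNW"]
found = hybrid_coordinate_vars(wrfinput_vars)
if found:
    print("wrfinput contains v4 hybrid-coordinate fields:", found)
```

If any of these fields show up, the input file was built for the hybrid coordinate and is unlikely to be usable by 3.5.1 as-is.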
I also posted a related topic here :
https://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=30&t=9793
Thank you so much
F
 

Attachments

  • rsl.error.0005.txt (8.1 KB)
Hi all,
I think this is a cross-cutting issue for my WRF/WPS 3.5.1 installation.
I ran a test with WRF using the previously generated boundary conditions from real.exe (with hybrid vertical coordinates, from the WRF 4.2 version).
WRF crashes after a few minutes, for physical reasons I guess (see the attached rsl.error.0002.wrf), but wrf is also unable to open and write the output file with the initial conditions and the first time records (rsl.out.0000.wrf). I only get a core.3301 file of about 7 GB.

So I tried to avoid the crash by regenerating the boundary conditions with real.exe.
Same issue: real is unable to write the wrfinput file.

Going backward in the modelling chain, metgrid has the same issue as well.
It is documented here: https://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=30&t=9793#p19367

The point is that I recently compiled both WRF 4.2 and WRF 3.5.1.
The former works, the latter does not, although both codes compiled successfully.
I have attached the configure files and compile logs.
I suspect it could be a problem with the netCDF libraries (https://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=39&t=9416); see the .bashrc with the loaded modules.
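To narrow down the netCDF hypothesis, one could compare the netCDF-related link flags recorded in the two configure.wrf files. A minimal sketch (the inlined strings are stand-ins; in practice read configure.wrf42 / configure.wrf351 from the attachments):

```python
import re

def netcdf_flags(configure_text):
    """Collect -L/-l/-I flags mentioning netcdf from a configure.wrf dump."""
    return sorted(set(re.findall(r'-[LlI]\S*netcdf\S*', configure_text, re.IGNORECASE)))

# Stand-in contents; replace with open("configure.wrf42").read(), etc.
cfg_42  = "LIB_EXTERNAL = -L/opt/netcdf-c/lib -lnetcdff -lnetcdf"
cfg_351 = "LIB_EXTERNAL = -L/usr/local/netcdf/lib -lnetcdf"

print("WRF 4.2  :", netcdf_flags(cfg_42))
print("WRF 3.5.1:", netcdf_flags(cfg_351))
```

If the two builds point at different netCDF installations (or 3.5.1 was linked against a netCDF built with incompatible options), that would be consistent with the executables failing to write wrfinput/wrfout files.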
Thanks a lot
 

Attachments

  • rsl.out.0000.wrf (3.5 KB)
  • rsl.error.0002.wrf (10.4 KB)
  • rsl.out.0000.real.txt (6.5 KB)
  • log.compile.wrf42.txt (820 KB)
  • configure.wrf42.txt (23.4 KB)
  • log.compile.wrf351.txt (468.4 KB)
  • configure.wrf351.txt (22.6 KB)
Dear all,
This issue has been solved.
Please take a look at the related post:
https://forum.mmm.ucar.edu/phpBB3/viewtopic.php?f=30&t=9793#p19376
Bye
 