
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.

Dear Tanmoy,
Thank you so much for your reply and suggestion. Yes, my setup is two-way nesting, so the nested domains only need initial information. When I use a larger time step, I also usually run into CFL errors.

Best,
Khin
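
For context on the CFL errors mentioned above: the usual WRF guideline is to keep time_step (seconds, in &domains) at or below roughly 6 x dx of the outermost domain in km, and to reduce it further if CFL errors persist. A minimal sketch of that arithmetic, with an illustrative dx that is not taken from the attached namelists:

# Rule-of-thumb check for the WRF &domains time_step setting:
# time_step (s) <= ~6 x dx of the outermost domain (km).
# The dx below is an illustrative assumption, not read from any attached namelist.

def max_recommended_time_step(dx_meters: float, factor: float = 6.0) -> float:
    """Upper bound (seconds) for time_step given the outer-domain grid spacing."""
    return factor * dx_meters / 1000.0

if __name__ == "__main__":
    dx = 15000.0  # hypothetical 15-km outer domain
    print(f"dx = {dx / 1000:.0f} km -> time_step <= {max_recommended_time_step(dx):.0f} s")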
WRF namelist.wps Best Practices

WRF namelist.input Best Practices



This might help.
 
Hi everyone, I'm getting this error when I run wrf.exe. Here is the complete error message.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
The following are the rsl.out, rsl.error, and namelist files. The rsl.out and rsl.error files are too large, so I only attached the beginning and ending parts of each file.
Thanks in advance.
 

Attachments

  • namelist.wps (831 bytes)
  • namelist.input (3.9 KB)
  • rsl.error.0000.txt (85.5 KB)
  • rsl.out.0000.txt (85.4 KB)
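
Since the MPI_ABORT message itself is generic, the real cause is normally reported near the end of one of the rsl.error.*/rsl.out.* files. A minimal Python sketch (assuming the standard rsl file naming and that the run directory is the current directory) to pull out the lines that usually name the failure:

# Scan the rsl files for the lines that usually carry the actual failure
# reason behind a generic MPI_ABORT (e.g. "FATAL CALLED", "ERROR", "cfl").
# File pattern and keywords are assumptions for illustration; adjust as needed.
import glob
import re

PATTERN = re.compile(r"FATAL|ERROR|cfl", re.IGNORECASE)

for path in sorted(glob.glob("rsl.*")):
    with open(path, errors="replace") as handle:
        for lineno, line in enumerate(handle, start=1):
            if PATTERN.search(line):
                print(f"{path}:{lineno}: {line.rstrip()}")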
Can you open a new issue on the forum? The admins like to keep issues separated, since system setups differ.
 
Hi,
I have configured and compiled WPS and WRF-Hydro. When I run the command "mpirun -np 2 ./real >& real.log", I get a similar problem to yours in real.log:
starting wrf task 0 of 2
starting wrf task 1 of 2
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

in the rsl.error.0001:

Assume Noah LSM input
d01 2013-09-09_18:00:00 forcing artificial silty clay loam at 13 points, out of 820
-------------- FATAL CALLED ---------------
grid%tsk unreasonable
-------------------------------------------

I have checked the domain consistency between namelist.wps and namelist.input.

Does anyone know how to solve this problem? Thank you very much!
 

Attachments

  • namelist.input (4.6 KB)
  • rsl.error.0000 (1.9 KB)
  • rsl.out.0000 (97.4 KB)
  • namelist.wps (866 bytes)
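
The "grid%tsk unreasonable" abort from real.exe usually means the skin temperature coming from the met_em files is missing or out of range at some grid points, so inspecting that field is a good first step. A hedged sketch, assuming the netCDF4 Python package is available, that the met_em.d01 files sit in the current directory, and that the field is stored under the conventional SKINTEMP name (verify against your own files):

# Print the min/max skin temperature in each met_em file so unreasonable
# values (e.g. 0 K at some points) stand out. The file pattern and the
# SKINTEMP variable name are assumptions based on standard metgrid output;
# adjust them to match your own files.
import glob

from netCDF4 import Dataset  # requires the netCDF4 package

for path in sorted(glob.glob("met_em.d01.*.nc")):
    with Dataset(path) as nc:
        if "SKINTEMP" not in nc.variables:
            print(f"{path}: no SKINTEMP variable found")
            continue
        tsk = nc.variables["SKINTEMP"][:]
        print(f"{path}: SKINTEMP min={tsk.min():.1f} K  max={tsk.max():.1f} K")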