Running WRF with mpiexec

ikki0123

New member
Hello, my question is the following: if I run WRF with MPI across additional nodes (a cluster), should the execution time be shorter? I ran a test and did not notice any difference in execution time, so I don't know if I did something wrong. My test was as follows: first I used a single node with 8 cores, then I added another node for a total of 16 cores, but the execution time did not change.
I am using mpiexec.
[Attachment: Sem título.png — screenshot of the wrf.exe startup messages]
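For context, a multi-node launch with mpiexec usually needs a hostfile naming each node. A minimal sketch, assuming MPICH-style syntax (the hostnames, core counts, and file name are placeholders, not taken from the original post; Open MPI uses --hostfile and a "slots=" syntax instead):

# hosts.txt lists one cluster node per line with its core count:
#   node01:8
#   node02:8
mpiexec -f hosts.txt -np 16 ./wrf.exe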
 

@ikki0123
It doesn't look like you're actually using multiple processors. I'm not sure what is going on, but those messages should look more like this (e.g., if you are using 8 processors):
starting wrf task 0 of 8
starting wrf task 1 of 8
starting wrf task 2 of 8
starting wrf task 3 of 8
starting wrf task 4 of 8
starting wrf task 5 of 8
starting wrf task 6 of 8
starting wrf task 7 of 8

You may need to discuss this with a systems administrator at your institution to see if they can help you get it set up correctly.
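One way to check this yourself (a sketch, assuming a Linux cluster; not a confirmed diagnosis) is to verify that wrf.exe was built for distributed memory (the dmpar option in WRF's configure step) and that the mpiexec on your PATH belongs to the same MPI installation WRF was linked against:

# which launcher is actually being picked up?
which mpiexec

# was wrf.exe linked against an MPI library at all?
# (a serial build shows no libmpi here)
ldd ./wrf.exe | grep -i mpi

If wrf.exe is a serial build, each copy that mpiexec starts runs as an independent task 0 of 1, which would produce repeated identical startup messages like those in the screenshot.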
 
I also found it strange. Apparently I'm running the same process several times. Thanks for the note, I'll take another look. If there's any news, I'll come back and report here.
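A quick sanity check that is independent of WRF (hosts.txt and the hostnames are placeholders carried over from the sketch above): launch a trivial command across both nodes and confirm you get output from each one.

mpiexec -f hosts.txt -np 16 hostname

If all 16 lines print the same hostname, the job never reached the second node, which points at the launcher or hostfile setup rather than at WRF itself.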
 