davidovens
New member
I've been trying to get WRF 4.0.3 running on our cluster here at the
University of Washington. I've built with icc/ifort (17.0.0.098) and OpenMPI
(2.1.1) on our Intel Xeon cluster. I had no trouble running this
simple 2-domain case with an SMP-compiled version, so I know that the
inputs and namelist.input are fine.
However, every build with OpenMPI -- trying different versions
of OpenMPI (1.8.8 and 2.1.1) and ifort (15.0.0.090 and 17.0.0.098) --
leads to a segmentation fault at the first time step (see below).
I've of course tried setting stacksize to a healthy 6000M (which is
more than enough for this small 36/12-km domain run).
I am not using the hybrid vertical coordinate. Does anyone have any
ideas or has anyone had similar problems with OpenMPI?
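For what it's worth, here is roughly how I raise the limits before launching (bash syntax; the mpirun line is just a placeholder, and the exact core count and hostfile are omitted):

```shell
# Raise the soft stack limit; fall back to ~6000 MB (in KB) if
# "unlimited" is not permitted on this node.
ulimit -s unlimited 2>/dev/null || ulimit -s 6144000 2>/dev/null || true

# Per-thread stack size for OpenMP/SMP builds; matches the 6000M above.
export OMP_STACKSIZE=6000M

echo "stack limit: $(ulimit -s) KB, OMP_STACKSIZE: $OMP_STACKSIZE"

# mpirun -np 64 ./wrf.exe   # placeholder launch line
```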
Error seen in rsl.error.0000:
....
D01 3-D analysis nudging reads new data at time = 0.000 min.
D01 3-D analysis nudging bracketing times = 0.00 180.00 min.
mediation_integrate.G 1943 DATASET=HISTORY
mediation_integrate.G 1944 grid%id 2 grid%oid 3
Timing for Writing wrfout_d02_2019-01-04_12:00:00 for domain 2: 1.12362 elapsed seconds
Tile Strategy is not specified. Assuming 1D-Y
WRF TILE 1 IS 1 IE 46 JS 1 JE 24
WRF NUMBER OF TILES = 1
Timing for main: time 2019-01-04_12:01:12 on domain 2: 1.78872 elapsed seconds
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
wrf.exe.mpi211if2 00000000033FFB61 tbk_trace_stack_i Unknown Unknown
wrf.exe.mpi211if2 00000000033FDC9B tbk_string_stack_ Unknown Unknown
wrf.exe.mpi211if2 0000000003378974 Unknown Unknown Unknown
wrf.exe.mpi211if2 0000000003378786 tbk_stack_trace Unknown Unknown
wrf.exe.mpi211if2 00000000033023A7 for__issue_diagno Unknown Unknown
wrf.exe.mpi211if2 0000000003309FD0 for__signal_handl Unknown Unknown
libpthread-2.19.s 00007FADE38D6890 Unknown Unknown Unknown
wrf.exe.mpi211if2 000000000040D46E Unknown Unknown Unknown
libc-2.19.so 00007FADE353DB45 __libc_start_main Unknown Unknown
wrf.exe.mpi211if2 000000000040D369 Unknown Unknown Unknown
Thanks,
David Ovens