
WRF 4.6.0 firebrand module error (does not have type)

William.Hatheway

Active member
Not sure what recently changed with 4.6.0, but for some reason it fails to build with the Intel compilers (LLVM), whereas it worked about a month ago.

I have uploaded two zip files: one is the log from a month ago that worked, the other is today's log that didn't. Interesting to note is that all the library installation tests and the WRF compatibility tests passed. It still needs two compile passes to work, though.
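
For reference, the two-pass sequence I mean looks like this (standard WRF build commands; em_real is just my case):

Code:
# configure once, then compile twice; the second pass picks up what the first left behind
./configure                       # select the Intel (ifx/icx) dmpar option
./compile em_real >& compile1.log
./compile em_real >& compile2.log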


Any ideas?
 

Attachments

  • failed.zip
    74.3 KB · Views: 1
  • Success.zip
    70.3 KB · Views: 0
I think this is related to WRFv4.6.0 fails to build on Ubuntu 24: Modules not found

Some others have had this same issue independent of the oneAPI compilers or Ubuntu, but always with Intel. I've yet to be able to reproduce it, but if you have an environment in which this is currently failing, I think I have some ideas of how to further inspect what's going on.

As the title of this post is more generic and describes the first error people typically encounter when this issue occurs, it will be best to follow up on this thread (as opposed to the previously linked one).
 
@islas

What are the ideas? I have an environment where it will always fail.
 
Using a partially built WRF setup that failed, we should be able to go into the phys/ directory and manually compile to recreate and isolate the issue.
So:
Code:
# inside WRF top dir
cd phys/
# copy-paste verbatim the error line from the log file
time mpiifx -o module_firebrand_spotting_mpi.o -c -O3 -ip -fp-model precise -w -ftz -align all -fno-alias -FR -convert big_endian    -I../dyn_em  -I/home/workhorse/WRF_Intel/WRFV4.6.0/external/esmf_time_f90  -I/home/workhorse/WRF_Intel/WRFV4.6.0/main -I/home/workhorse/WRF_Intel/WRFV4.6.0/external/io_netcdf -I/home/workhorse/WRF_Intel/WRFV4.6.0/external/io_int -I/home/workhorse/WRF_Intel/WRFV4.6.0/frame -I/home/workhorse/WRF_Intel/WRFV4.6.0/share -I/home/workhorse/WRF_Intel/WRFV4.6.0/phys -I/home/workhorse/WRF_Intel/WRFV4.6.0/wrftladj -I/home/workhorse/WRF_Intel/WRFV4.6.0/chem -I/home/workhorse/WRF_Intel/WRFV4.6.0/inc -I/home/workhorse/WRF_Intel/Libs/NETCDF/include  -real-size `expr 8 \* 4` -i4  module_firebrand_spotting_mpi.f90

If that reproduces the issue, we can almost safely assume that the issue is not due to the internal environment setup within the makefiles. In this case, we can further reduce the problem by using only the raw compiler flags rather than the MPI wrapper. To do this, we can run:
Code:
# Still in phys/
mpiifx -show # copy the output; I'm not using `` or capturing to a variable so we can see everything
# now in place of mpiifx we can do 
time <paste output> <rest of the compile line>

This again should result in an error if the first manual compilation did. After removing duplicated flags, we should see (1) includes pointing to the MPI location, (2) the MPI library location, (3) linker options, and (4) libraries to link in such as -lmpifort -lmpi -ldl -lrt -lpthread.

We really only care about -lmpifort -lmpi and their respective includes. If we look in the library locations specified by -L<path>, we should find something like "libmpifort.so" and "libmpi.so" matching our linked libraries. If we look in the include paths (note that there is both <mpi path>/include and <mpi path>/include/mpi), we should find the headers and .mod files like so (my output):
Code:
/home/aislas/intel/oneapi/mpi/2021.11/include
mpi  mpicxx.h  mpif.h  mpi.h  mpiof.h  mpio.h

ls /home/aislas/intel/oneapi/mpi/2021.11/include/mpi
gfortran  mpi_base.mod       mpi_f08_callbacks.mod          mpi_f08_link_constants.mod  mpi_f08_types.mod  mpi_sizeofs.mod  pmpi_f08.mod
ilp64     mpi_constants.mod  mpi_f08_compile_constants.mod  mpi_f08.mod                 mpi.mod            pmpi_base.mod
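
To confirm the libraries as well, something like this should work (a sketch; the lib path is from my install, so substitute the -L path from your own -show output):

Code:
# the -L directory should contain the libraries named by -lmpifort -lmpi
ls /home/aislas/intel/oneapi/mpi/2021.11/lib | grep -E "libmpi(fort)?\.so"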

Assuming all that looks good, we can now effectively test whether the .mod file is somehow not being found. A simple test program using MPI:
Code:
program main
  use mpi
  integer :: ierr
  call MPI_Init(ierr)
  call mpi_finalize( ierr )
  write( *, * ) "MPI ran"
end program main

Using the output of mpiifx -show from before, we can compile this file (e.g. if it is named "mpi_test.F90", by appending mpi_test.F90 -o mpi_test to the rest of the command).
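
For instance, with hypothetical placeholder paths standing in for your actual -show output:

Code:
# <output of mpiifx -show> followed by the test file and output name
ifx -I"<mpi path>/include/mpi" -I"<mpi path>/include" -L"<mpi path>/lib" -lmpifort -lmpi -ldl -lrt -lpthread mpi_test.F90 -o mpi_test
./mpi_test   # should print "MPI ran"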

And if that fails, then there are problems with the compiler and the .mod file it is trying to use. If it manages to compile, then the WRF source file is suspect.
 


Not sure if I did this right, but here are the log files @islas
 

Attachments

  • include.log
    2.6 KB · Views: 1
  • step1.log
    1.5 KB · Views: 1
  • mpiifx-show.log
    394 bytes · Views: 1
  • show_and_line.log
    1.5 KB · Views: 1
The logs are a little hard to follow, especially without the commands that generated them. You can use the script command to generate a direct logfile of your console; when you exit the script session (for instance with Ctrl-D), all of your terminal output and input commands will be saved.
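
A minimal example of what I mean:

Code:
script session.log   # start capturing everything typed and printed
# ... run the compile/test commands here ...
# press Ctrl-D (or type exit) to stop; the transcript is saved in session.log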


I find this helpful when needing to generate longer interactive log sessions.
 
Thank you for teaching me something new about the script command, @islas! I have never heard of that one before.

Here's my session and some relevant log files:
 

Attachments

  • session.log
    21.2 KB · Views: 3
  • compile.wrf1.log
    946.8 KB · Views: 2
  • configure.wrf.txt
    21.8 KB · Views: 1
  • configure.log
    9.7 KB · Views: 1

Here's when I don't use the wrappers; at least this time it worked as expected for now. I'll have to reinstall the OS and double-check.
 

Attachments

  • works.zip
    48.2 KB · Views: 1
Second test: it fails without using the wrappers.

I'm not sure I follow; all of the logs use the wrapper except when ifx was called directly. Can you explain the difference between the two setups? Is one installed from conda?

In the first fail, it looks like the wrapper is not properly giving the location to MPI (two I_MPI_SUBSTITUTE_INSTALLDIR/include entries instead of one .../include and one .../include/mpi). In the second fail, it looks like it is pointing to some gfortran location. In that same failing environment, I would suspect older versions of WRF fail as well.

The test Fortran program I proposed was meant to be written to a file, and then compiled with the direct outputs of mpiifx -show to ensure that this is an environment problem separate from the WRF build system. The logs do seem to suggest that.
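
To pin down what differs between the two setups, it may also help to capture the environment in each and diff them (a sketch; the file names are just examples):

Code:
# in the working shell
env | sort > env_working.txt
# in the failing shell
env | sort > env_failing.txt
# then compare, focusing on MPI/Intel-related variables
diff env_working.txt env_failing.txt | grep -Ei "mpi|intel|oneapi"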
 
The following is written for a POSIX sh-style shell with GNU grep. This should get the necessary info for any Intel mpiifx environment you have.

Code:
banner="================================================="
# What are we running
which mpiifx
echo $banner
# Show raw command
mpiifx -show
echo $banner
# Show what includes are being used
mpiifx -show | grep -E "[-]I\"(\w+|/|\.)*\"" -o
echo $banner
# Show files in those include dirs
mpiifx -show | grep -E "[-]I\"(\w+|/|\.)*\"" -o | sed -e 's@-I"@@g' | tr -d \" | sort -u | xargs -i sh -c "echo {} && ls -p {}"
echo $banner

# Get raw command
echo "Getting raw compile line"
raw_ifx=`mpiifx -show`
echo $banner

# Create temp test file
echo "Creating test file"
cat << EOF > test.F90
program main
  use mpi
  integer :: ierr
  call MPI_Init(ierr)
  call mpi_finalize( ierr )
  write( *, * ) "MPI ran"
end program main
EOF
echo $banner

# This should compile
echo "$raw_ifx test.F90"
$raw_ifx test.F90
echo $banner
./a.out

Running the above with a verbatim copy-paste should show different results between your working and failing environments. For instance, my working environment outputs:
Code:
/home/aislas/intel/oneapi/2024.2/bin/mpiifx
=================================================
ifx -I"/home/aislas/intel/oneapi/2024.2/include/mpi" -I"/home/aislas/intel/oneapi/2024.2/include" -I"/home/aislas/intel/oneapi/2024.2/include/mpi" -L"/home/aislas/intel/oneapi/2024.2/lib" -L"/home/aislas/intel/oneapi/2024.2/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/home/aislas/intel/oneapi/2024.2/lib" -Xlinker -rpath -Xlinker "/home/aislas/intel/oneapi/2024.2/lib" -lmpifort -lmpi -ldl -lrt -lpthread
=================================================
-I"/home/aislas/intel/oneapi/2024.2/include/mpi"
-I"/home/aislas/intel/oneapi/2024.2/include"
-I"/home/aislas/intel/oneapi/2024.2/include/mpi"
=================================================
/home/aislas/intel/oneapi/2024.2/include
crypto_mb/                 dnnl_version.h        mkl_blas.h                                     mkl_cluster_sparse_solver.f90  mkl_direct_lapack.h               mkl_lapack_omp_offload_lp64.f90   mkl_service.f90           mkl_trans.h              mkl_vsl.f90              oneapi/
daal.h                     dpc_common.hpp        mkl_blas_omp_offload.f90                       mkl_cluster_sparse_solver.fi   mkl_direct_types.h                mkl_lapack_omp_variant.h          mkl_service.fi            mkl_trans_names.h        mkl_vsl.fi               pstl_offload/
daal_sycl.h                dpc_common_README.md  mkl_blas_omp_offload.h                         mkl_cluster_sparse_solver.h    mkl_dss.f90                       mkl_lapack_omp_variant_ilp64.f90  mkl_service.h             mkl_trig_transforms.f90  mkl_vsl_functions_64.h   stb/
dal/                       dpct/                 mkl_blas_omp_offload_ilp64.f90                 mkl_compact.h                  mkl_dss.fi                        mkl_lapack_omp_variant_lp64.f90   mkl_solvers_ee.f90        mkl_trig_transforms.h    mkl_vsl_functions.h      std/
dnnl_config.h              fftw/                 mkl_blas_omp_offload_ilp64_no_array_check.f90  mkl_df_defines.h               mkl_dss.h                         mkl_omp_offload.f90               mkl_solvers_ee.fi         mkl_types.h              mkl_vsl.h                sycl/
dnnl_debug.h               ia32                  mkl_blas_omp_offload_lp64.f90                  mkl_df.f90                     mkl.fi                            mkl_omp_offload.h                 mkl_solvers_ee.h          mkl_version.h            mkl_vsl_omp_offload.f90  syclcompat/
dnnl.h                     i_malloc.h            mkl_blas_omp_offload_lp64_no_array_check.f90   mkl_df_functions.h             mkl.h                             mkl_omp_variant.h                 mkl_sparse_handle.f90     mkl_vml_defines.h        mkl_vsl_omp_offload.h    tbb/
dnnl.hpp                   intel64               mkl_blas_omp_variant.h                         mkl_df.h                       mkl_jit_blas.f90                  mkl_pardiso.f90                   mkl_sparse_handle.fi      mkl_vml.f90              mkl_vsl_omp_variant.f90  xpti/
dnnl_ocl.h                 ipp/                  mkl_blas_omp_variant_ilp64.f90                 mkl_dfti.f90                   mkl_jit_blas_ilp64.f90            mkl_pardiso.fi                    mkl_sparse_handle.h       mkl_vml.fi               mkl_vsl_omp_variant.h
dnnl_ocl.hpp               ippcp/                mkl_blas_omp_variant_ilp64_no_array_check.f90  mkl_dfti.h                     mkl_jit_blas_lp64.f90             mkl_pardiso.h                     mkl_sparse_qr.f90         mkl_vml_functions.h      mkl_vsl_subroutine.fi
dnnl_sycl.h                ippcp.h               mkl_blas_omp_variant_lp64.f90                  mkl_dfti_omp_offload.f90       mkl_lapacke.h                     mkl_pblas.h                       mkl_sparse_qr.h           mkl_vml.h                mkl_vsl_types.h
dnnl_sycl.hpp              ipp.h                 mkl_blas_omp_variant_lp64_no_array_check.f90   mkl_dfti_omp_offload.h         mkl_lapack.f90                    mkl_poisson.f90                   mkl_spblas.f90            mkl_vml_omp_offload.f90  mpi/
dnnl_sycl_types.h          mkl/                  mkl_cblas_64.h                                 mkl_df_types.h                 mkl_lapack.fi                     mkl_poisson.h                     mkl_spblas.fi             mkl_vml_omp_offload.h    mpicxx.h
dnnl_threadpool.h          mkl_blacs.h           mkl_cblas.h                                    mkl_direct_blas.h              mkl_lapack.h                      mkl_rci.f90                       mkl_spblas.h              mkl_vml_omp_variant.f90  mpif.h
dnnl_threadpool.hpp        mkl_blas_64.h         mkl_cdft.f90                                   mkl_direct_blas_kernels.h      mkl_lapack_omp_offload.f90        mkl_rci.fi                        mkl_spblas_omp_offload.h  mkl_vml_omp_variant.h    mpi.h
dnnl_threadpool_iface.hpp  mkl_blas.f90          mkl_cdft.h                                     mkl_direct_call.fi             mkl_lapack_omp_offload.h          mkl_rci.h                         mkl_spblas_omp_variant.h  mkl_vml_types.h          mpiof.h
dnnl_types.h               mkl_blas.fi           mkl_cdft_types.h                               mkl_direct_call.h              mkl_lapack_omp_offload_ilp64.f90  mkl_scalapack.h                   mkl_trans.fi              mkl_vsl_defines.h        mpio.h
/home/aislas/intel/oneapi/2024.2/include/mpi
gfortran/  ilp64/  mpi_base.mod  mpi_constants.mod  mpi_f08_callbacks.mod  mpi_f08_compile_constants.mod  mpi_f08_link_constants.mod  mpi_f08.mod  mpi_f08_types.mod  mpi.mod  mpi_sizeofs.mod  pmpi_base.mod  pmpi_f08.mod
=================================================
Getting raw compile line
=================================================
Creating test file
=================================================
ifx -I"/home/aislas/intel/oneapi/2024.2/include/mpi" -I"/home/aislas/intel/oneapi/2024.2/include" -I"/home/aislas/intel/oneapi/2024.2/include/mpi" -L"/home/aislas/intel/oneapi/2024.2/lib" -L"/home/aislas/intel/oneapi/2024.2/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/home/aislas/intel/oneapi/2024.2/lib" -Xlinker -rpath -Xlinker "/home/aislas/intel/oneapi/2024.2/lib" -lmpifort -lmpi -ldl -lrt -lpthread test.F90
=================================================
 MPI ran
 
Not sure what's going on; now I can't replicate the results.
 

Attachments

  • test_inside_WRF_Intel_Environment_With_exports.log
    991.9 KB · Views: 1
  • test_inside_WRF_Intel_Environment_With_exports_with_intel_wrappers.log
    996 KB · Views: 1
  • test_outside_of_WRF_Intel_Envrionment.log
    2.9 KB · Views: 1
Interesting. This does appear to be an environment issue. The first new set of logs without failure shows:
Code:
/opt/intel/oneapi/mpi/2021.13/bin/mpiifx
=================================================
ifx -I"/opt/intel/oneapi/mpi/2021.13/include/mpi" -I"/opt/intel/oneapi/mpi/2021.13/include" -I"/opt/intel/oneapi/mpi/2021.13/include/mpi" -L"/opt/intel/oneapi/mpi/2021.13/lib" -L"/opt/intel/oneapi/mpi/2021.13/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.13/lib" -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.13/lib" -lmpifort -lmpi -ldl -lrt -lpthread
=================================================
-I"/opt/intel/oneapi/mpi/2021.13/include/mpi"
-I"/opt/intel/oneapi/mpi/2021.13/include"
-I"/opt/intel/oneapi/mpi/2021.13/include/mpi"
=================================================
/opt/intel/oneapi/mpi/2021.13/include
mpi/  mpicxx.h  mpif.h  mpi.h  mpiof.h  mpio.h
/opt/intel/oneapi/mpi/2021.13/include/mpi
gfortran/        mpi_f08_compile_constants.mod  mpi_sizeofs.mod
ilp64/           mpi_f08_link_constants.mod     pmpi_base.mod
mpi_base.mod         mpi_f08.mod          pmpi_f08.mod
mpi_constants.mod      mpi_f08_types.mod
mpi_f08_callbacks.mod  mpi.mod
=================================================
Getting raw compile line
=================================================
Creating test file
=================================================
ifx -I"/opt/intel/oneapi/mpi/2021.13/include/mpi" -I"/opt/intel/oneapi/mpi/2021.13/include" -I"/opt/intel/oneapi/mpi/2021.13/include/mpi" -L"/opt/intel/oneapi/mpi/2021.13/lib" -L"/opt/intel/oneapi/mpi/2021.13/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.13/lib" -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.13/lib" -lmpifort -lmpi -ldl -lrt -lpthread test.F90
=================================================
 MPI ran

But something happening in between changing the two environments is messing with the Intel-provided wrappers:
Code:
/opt/intel/oneapi/mpi/2021.13/bin/mpiifx
=================================================
ifx -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -L"I_MPI_SUBSTITUTE_INSTALLDIR/lib/release" -L"I_MPI_SUBSTITUTE_INSTALLDIR/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "I_MPI_SUBSTITUTE_INSTALLDIR/lib/release" -Xlinker -rpath -Xlinker "I_MPI_SUBSTITUTE_INSTALLDIR/lib" -lmpifort -lmpi -ldl -lrt -lpthread
=================================================
-I"I_MPI_SUBSTITUTE_INSTALLDIR/include"
-I"I_MPI_SUBSTITUTE_INSTALLDIR/include"
-I"I_MPI_SUBSTITUTE_INSTALLDIR/include"
=================================================
I_MPI_SUBSTITUTE_INSTALLDIR/include
ls: cannot access 'I_MPI_SUBSTITUTE_INSTALLDIR/include': No such file or directory
=================================================
Getting raw compile line
=================================================
Creating test file
=================================================
ifx -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -I"I_MPI_SUBSTITUTE_INSTALLDIR/include" -L"I_MPI_SUBSTITUTE_INSTALLDIR/lib/release" -L"I_MPI_SUBSTITUTE_INSTALLDIR/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "I_MPI_SUBSTITUTE_INSTALLDIR/lib/release" -Xlinker -rpath -Xlinker "I_MPI_SUBSTITUTE_INSTALLDIR/lib" -lmpifort -lmpi -ldl -lrt -lpthread test.F90
test.F90(2): error #7002: Error in opening the compiled module file.  Check INCLUDE paths.   [MPI]
  use mpi
------^
compilation aborted for test.F90 (code 1)
=================================================
./test.sh: line 37: ./a.out: No such file or directory

Same binary path of /opt/intel/oneapi/mpi/2021.13/bin/mpiifx but the includes changed.

Given that I_MPI_ROOT is empty in the failing log:
Code:
HOME=/home/workhorse
I_MPI_ROOT=
INFOPATH=/opt/intel/oneapi/debugger/2024.2/share/info
INTEL_PYTHONHOME=/opt/intel/oneapi/debugger/2024.2/opt/debugger

I suspect it has something to do with this:

Overall, I think we've sufficiently narrowed down that this is not an issue with the WRF builds.
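
If the wrapper really is expanding its placeholder from I_MPI_ROOT (which the empty variable suggests, though I'm assuming the substitution mechanism here), a minimal check in the failing shell would be:

Code:
# set I_MPI_ROOT by hand and re-check the wrapper output
export I_MPI_ROOT=/opt/intel/oneapi/mpi/2021.13
mpiifx -show | grep -E -o '[-]I"[^"]*"'
# if the -I paths now show the real install dir instead of
# I_MPI_SUBSTITUTE_INSTALLDIR, the empty variable was the culprit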
 
Hmm, interesting; not sure how to fix that.
 
So I have done some tests, and while the Intel LLVM compiler builds the libraries without a problem and passes all the tests, it fails to build WRF or WPS.

The solution I have found is to run this command after building the libraries; it fixes the empty variable export.

Bash:
source /opt/intel/oneapi/setvars.sh --force

What do you think, @islas?

It is important to note that it has to be done AFTER the libraries are all built; even if you force it before building the libraries, the exports have empty strings. See the attached logs below.
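
A quick way to verify it took effect (a sketch along the lines of @islas's script above):

Bash:
# re-initialize the oneAPI environment after the libraries are built
source /opt/intel/oneapi/setvars.sh --force
# I_MPI_ROOT should now be set, and the wrapper should show real paths
echo "$I_MPI_ROOT"
mpiifx -show | grep -E -o '[-]I"[^"]*"'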
 

Attachments

  • intel_forced.log
    8.2 KB · Views: 0
  • intel_forced_environment_with_conda.log
    13.4 KB · Views: 0
  • intel_only_environment.log
    10.3 KB · Views: 0
  • intel_plus_minoconda_environment.log
    9.9 KB · Views: 0
  • intel_source_with_miniconda.log
    6.1 MB · Views: 0