
arc_bisect error when running JW Baroclinic Wave Idealized Case

This post is from a previous version of the WRF&MPAS-A Support Forum. New replies have been disabled; if you have follow-up questions related to this post, please start a new thread from the forum home page.

arjunkunna

New member
Hi, I've followed the instructions here: https://www2.mmm.ucar.edu/projects/mpas/tutorial/Boulder2019/index.html

I'm able to build both init_atmosphere and atmosphere as follows:

$ make -j4 gfortran CORE=init_atmosphere PRECISION=single
$ make clean CORE=atmosphere
$ make -j4 gfortran CORE=atmosphere PRECISION=single


I'm now trying to run the jw_baroclinic_wave.tar.gz idealized case.

I have:
1) Unpacked the input files and set the environment variable $MPAS_TUTORIAL to the absolute path of the resulting mpas_tutorial directory.
2) cd ${HOME}
3) wget http://www2.mmm.ucar.edu/projects/mpas/test_cases/v7.0/jw_baroclinic_wave.tar.gz
4) tar xzvf jw_baroclinic_wave.tar.gz
5) Linked the init_atmosphere_model and atmosphere_model executables from the top-level MPAS directory.
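In shell terms, step 5 is just a pair of symlinks into the case directory; a sketch, where MPAS_DIR is an assumption and should point at wherever the MPAS-Model tree was built:

```shell
# Link the two executables into the test-case directory.
# MPAS_DIR is an assumed path -- adjust to your actual build tree.
MPAS_DIR=${HOME}/MPAS-Model
cd ${HOME}/jw_baroclinic_wave
ln -sf ${MPAS_DIR}/init_atmosphere_model .
ln -sf ${MPAS_DIR}/atmosphere_model .
```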

However, when I now run ./init_atmosphere_model inside the jw_baroclinic folder, I get this error (see attached screenshot as well)


----------------------------------------------------------------------
Beginning MPAS-init_atmosphere Error Log File for task 0 of 1
Opened at 2020/08/20 06:54:10
----------------------------------------------------------------------

CRITICAL ERROR: arc_bisect: A and B are diametrically opposite
Logging complete. Closing file at 2020/08/20 06:54:10


Do you know what might be happening? Thank you very much for your help!
 

Attachments

  • Screen Shot 2020-08-20 at 9.54.43 PM.png
    154 KB · Views: 2,831
That's strange! The JW baroclinic wave test normally runs pretty well out of the box.

Just to ensure I know your exact setup: are you using the mesh found within jw_baroclinic_wave.tar.gz (i.e. x1.40962.init.nc), and have you touched any of the namelists or streams?

I have also moved this topic from Compiling and Installation to Running.
 
I haven't touched any of the namelists or streams.

I'm using the x1.40962.grid.nc mesh which is located in the jw_baroclinic_wave folder. I don't see x1.40962.init.nc though, only x1.40962.grid.nc.

Regarding my setup:

I'm working on Stanford's sherlock cluster (https://www.sherlock.stanford.edu/)

I'm using:
libfabric/1.10.1
gcc/9.1.0
openMPI 4.0.4
PnetCDF 1.8.1
PIO 1.7.1

This is how I installed them:

openMPI-4.0.4:
**Install OpenMPI**
wget https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.4.tar.bz2
tar -xvf openmpi-4.0.4.tar.bz2
cd openmpi-4.0.4
./configure --prefix=$HOME/openmpi_install --with-pmi-libdir=/usr/lib64 --with-pmix=internal --with-libevent=internal --with-slurm --without-verbs

PnetCDF 1.8.1:
Step 4: **Install PNETCDF**
wget https://parallel-netcdf.github.io/Release/parallel-netcdf-1.8.1.tar.gz
tar -xvf parallel-netcdf-1.8.1.tar.gz
cd parallel-netcdf-1.8.1
./configure --prefix=$PNETCDF --disable-cxx
make
make install

PIO 1.7.1
wget https://github.com/NCAR/ParallelIO/archive/pio1_7_1.tar.gz
tar -xvf pio1_7_1.tar.gz
cd ParallelIO-pio1_7_1/pio
./configure --prefix=$PIO --disable-netcdf --disable-mpiio
make
make install

MPAS:
git clone https://github.com/MPAS-Dev/MPAS-Model.git
cd MPAS-Model
make gfortran CORE=init_atmosphere


Thank you so much once again for your help, I really appreciate it!

Arjun
 
Your setup and installation look good and most likely aren't the problem. And you're right, I meant to say x1.40962.grid.nc. That is the file that the init_atmosphere core will use for the setup of the JW baroclinic wave test.

Would you be able to upload the entire log that was produced when running the init_atmosphere core, i.e. the entire log.init_atmosphere.0000.txt?

Lastly, did you modify any portions of the init_atmosphere code in any way? This is definitely an interesting issue.
 
I haven't touched any of the code in init_atmosphere.

I've attached the log.init_atmosphere.0000.out and log.init_atmosphere.0000.err files - I don't see a log.init_atmosphere.0000.txt file, though? Thank you so much for your help once again!
 

Attachments

  • log.init_atmosphere.0000.err.rtf
    763 bytes · Views: 65
  • log.init_atmosphere.0000.out.rtf
    7.4 KB · Views: 66
My apologies, I meant log.init_atmosphere.0000.out, not .txt. Thanks for sending your log files, and thanks for your patience while working through this.

I think the issue might be with your installation libraries, which I believe I missed before. Taking a look at the arc_bisect routine: when the cx, cy, and cz computed in that routine are all zero, the arc_bisect error you are seeing is thrown. In the JW baroclinic wave test, these fields come from the mesh fields, so my guess is that there is an error with your I/O library installation.
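To illustrate what that routine does (a Python paraphrase for illustration only; the actual MPAS routine is Fortran): arc_bisect finds the midpoint of the great-circle arc between two points by averaging their position vectors and renormalizing. If the two inputs sum to the zero vector — truly antipodal points, or all-zero coordinates read from a corrupted mesh file — there is nothing to normalize, and this error is raised.

```python
import math

def arc_bisect(a, b):
    """Midpoint of the great-circle arc between unit vectors a and b.

    A paraphrase of the idea behind MPAS's arc_bisect, not the
    actual Fortran implementation.
    """
    c = [(ai + bi) / 2.0 for ai, bi in zip(a, b)]
    norm = math.sqrt(sum(ci * ci for ci in c))
    if norm == 0.0:
        # This is the condition behind the log message above
        raise ValueError("arc_bisect: A and B are diametrically opposite")
    return [ci / norm for ci in c]

# Normal case: midpoint of the arc from the x-axis to the y-axis
print(arc_bisect([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))

# Antipodal (or all-zero, as with a garbage mesh read) inputs
# sum to (0, 0, 0) and trigger the error seen in the log
try:
    arc_bisect([1.0, 0.0, 0.0], [-1.0, 0.0, 0.0])
except ValueError as e:
    print(e)
```

The point being: the error is almost never about the geometry itself, but about cell coordinates coming out of the mesh read as zeros.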

Looking back at the post with your I/O libraries, I noticed that netCDF was not present, which I had missed the first time. I also noticed you compiled PIO with the --disable-netcdf flag (and --disable-mpiio). In order to run MPAS, PIO has to be compiled with both netCDF and parallel-netCDF (as mentioned in section 3.2 of the MPAS-Atmosphere User's Guide). Was there a reason for compiling PIO without netCDF?

If there wasn't a reason, then you'll need to install the netCDF library with parallel-netCDF support, and then reinstall the PIO library with both netCDF and parallel-netCDF. For reference, the iolib_installation.sh script found here is how we typically install the I/O libraries for MPAS and might help you along with installing these libraries.
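Concretely, the PIO rebuild might look like the sketch below. This is only a sketch, not the authoritative procedure (iolib_installation.sh is): it assumes netCDF is already installed with parallel-netCDF support, and the exact environment variables PIO's configure expects are best taken from that script.

```shell
# Rebuild PIO with netCDF support: omit the --disable-netcdf and
# --disable-mpiio flags used earlier. $PIO is assumed to be the
# desired install prefix, as in the steps posted above.
cd ParallelIO-pio1_7_1/pio
make clean
./configure --prefix=$PIO
make
make install
```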

Thanks again for your patience on this!
 
I agree with mcurry that this may be an issue in reading mesh fields.

Compiling the PIO1 library with just parallel-netCDF should work; the default "io_type" for streams is "pnetcdf".
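For reference, the io_type can also be set explicitly per stream in the streams.init_atmosphere XML. A hypothetical stream definition (attribute values here are illustrative, not taken from this case):

```xml
<immutable_stream name="input"
                  type="input"
                  filename_template="x1.40962.grid.nc"
                  io_type="pnetcdf"
                  input_interval="initial_only" />
```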

It may be that there's a problem with version 1.8.1 of the parallel-netCDF library, and updating to a newer version may resolve the issue. Since it might require less effort than installing netCDF-4, could you try simply updating to parallel-netCDF 1.12.1? You'll then need to recompile PIO, but staying with PIO 1.7.1 for now might be reasonable.
 
Thanks to both of you for your help!

The reason why I installed it with --disable-netcdf flag was because I followed the instructions at the tutorial here : https://www2.mmm.ucar.edu/projects/mpas/tutorial/Boulder2019/slides/02.downloading_and_compiling.pdf

On slide 14, it has ./configure --prefix=$PIO --disable-netcdf --disable-mpiio.

I'm not sure if this is the right thing to do, though.

I tried using parallel-netCDF 1.12.1 as suggested, and for some reason that broke my call of make gfortran CORE=init_atmosphere.

I get this error (see attachment):
Screen Shot 2020-08-25 at 8.48.46 PM.png

It might be an error with my installation steps, I'm still trying to look further into this...

Thank you so much once again!


(As reference, these are my steps:)

ml libfabric/1.10.1 gcc/9.1.0

Step 1: **Install OpenMPI**
wget https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.4.tar.bz2
tar -xvf openmpi-4.0.4.tar.bz2
cd openmpi-4.0.4
./configure --prefix=$HOME/openmpi_install --with-pmi-libdir=/usr/lib64 --with-pmix=internal --with-libevent=internal --with-slurm --without-verbs
make 
make install

Step 2: **Install PNETCDF**
wget https://parallel-netcdf.github.io/Release/pnetcdf-1.12.1.tar.gz
tar -xvf pnetcdf-1.12.1.tar.gz
cd pnetcdf-1.12.1
./configure --prefix=$PNETCDF --disable-cxx
make
make install

Step 3: **Install PIO**
wget https://github.com/NCAR/ParallelIO/archive/pio1_7_1.tar.gz
tar -xvf pio1_7_1.tar.gz
cd ParallelIO-pio1_7_1/pio
./configure --prefix=$PIO --disable-netcdf --disable-mpiio
make
make install

Step 4: **Install MPAS**
git clone https://github.com/MPAS-Dev/MPAS-Model.git
cd MPAS-Model
make gfortran CORE=init_atmosphere

 
I think the compilation problem you're encountering looks like it may be the same as in this thread.

Somewhat off topic, it took me a few minutes to find that other thread, since the forum search function apparently doesn't search attachments to posts. If you do have a couple of minutes and wouldn't mind editing your previous post to include the error messages as plain text (perhaps quoted or in code tags), that might help future users who run into the same error to find this thread. I'll also follow up on the other thread with a post that quotes the error message in the attachment there. We're still working out the best practices in order to make this forum as useful as possible.
 
I have updated the title of this post to better represent the error that occurred. The hope is that this change helps users locate already-answered errors they may be having.
 
Hi,

Sorry for the late reply. I tried doing what was suggested in the link -

"change the definition of PIO_MAX_VAR_DIMS in the PIO 1.7.1 source code before recompiling and reinstalling PIO. In the pio/poi_types.F90 file, you can change the definition of PIO_MAX_VAR_DIMS to something like 6 around lines 305 and 324."


I changed PIO_MAX_VAR_DIMS to 6 as suggested; however, when I 'make' PIO I get the following error:

Error: Array specification at (1) has more than 7 dimensions
mpif-sizeof.h:325.47:
Included at mpif.h:63:
Included at pio_kinds.F90:30:

COMPLEX(REAL32), DIMENSION(1,1,1,1,1,1,1,1,1,1,1,1,1,*)::x
1
Error: Array specification at (1) has more than 7 dimensions
mpif-sizeof.h:332.47:
Included at mpif.h:63:
Included at pio_kinds.F90:30:

COMPLEX(REAL32), DIMENSION(1,1,1,1,1,1,1,1,1,1,1,1,1,1,*)::x
1
Error: Array specification at (1) has more than 7 dimensions
mpif-sizeof.h:395.47:
Included at mpif.h:63:
Included at pio_kinds.F90:30:

COMPLEX(REAL64), DIMENSION(1,1,1,1,1,1,1,*)::x
1
Error: Array specification at (1) has more than 7 dimensions
Fatal Error: Error count reached limit of 25.
make[1]: *** [pio_kinds.o] Error 1
make[1]: Leaving directory `/home/users/arjunk1/MPAS_install_files/ParallelIO-pio1_7_1/pio'
make: *** [all] Error 2

Would you have any suggestions?

I've also put the error text in the original post to help others searching for similar issues.

Thank you so much once again for your help!
 
Hi - Just wanted to follow up on this. Do you have any suggestions on how to get MPAS working? It would be extremely helpful for our research if we would be able to run MPAS.

Thank you very much once again, and sorry for the trouble!
 
Sorry about the slow reply on this. Have you tried installing a different recommended version of PIO (as specified in section 3.2.3 of the MPAS User's Guide)?

I've also attached a file iolib_installation.sh, which we use to install the MPAS libraries. You'll need to edit it for your use case and it will take some work on your end to get working completely, but you can use it as a reference for how we install our libraries.
 

Attachments

  • iolib_installation.sh.txt
    3.9 KB · Views: 59