Search results

  1. error compiling convert_mpas

    The first of the errors in your "error.txt" file suggests that the PnetCDF library isn't being included in the list of libraries being linked against. (Note that 'ncmpi_strerror' is a PnetCDF function, and the NetCDF library can be configured to include PnetCDF functionality.) Since the convert_mpas...
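
    A quick sanity check is to ask nc-config whether the NetCDF build already includes PnetCDF support; this is just a sketch, and the PnetCDF install path below is an assumption for illustration:

        # does the NetCDF library report PnetCDF support?
        nc-config --has-pnetcdf

        # if not, PnetCDF may need to appear explicitly on the link line
        export PNETCDF=/usr/local/pnetcdf      # assumed install path
        export LDFLAGS="-L${PNETCDF}/lib -lpnetcdf"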
  2. how to convert diagnostic output files to regular lat-lon?

    Please refer to the "Post-processing and visualizing MPAS-Atmosphere output" tutorial lecture. In particular, slide 6 explains how to interpolate the fields in diagnostics files.
  3. MPAS-A issues with parallelization: Segmentation fault

    Thanks for following up! If you do happen to try v8.0.1 with OpenMP enabled and find any issues, please don't hesitate to let us know and we can try to debug further.
  4. Running Model with Dense Meshes

    Thanks for the path. Interestingly, the sizes for our esmf.mod files are substantially different. An octal dump of the files suggests that they were created with different compilers, so I'm wondering whether there might be something about your shell environment that's causing a...
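
    For anyone curious, a character dump of the module file header is one quick way to spot such differences (the file path here is an assumption):

        od -c esmf.mod | head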
  5. Running Model with Dense Meshes

    If you still have your working directory on /glade, I can take a look to see if there are differences between the `esmf.mod` file in your build directory and mine. In a separate directory, though, it may be easiest to just switch to the Intel compilers. Here's the set of modules that have...
  6. PIO1.f90 error &

    Thanks so much for following up!
  7. MPAS-A issues with parallelization: Segmentation fault

    Just some clarification here: the default settings

        &io
            config_pio_num_iotasks = 0
            config_pio_stride = 1
        /

    instruct MPAS to use all MPI tasks as I/O tasks. For larger runs, this may not provide optimal performance, but it isn't necessarily incorrect to do this.
  8. MPAS-A issues with parallelization: Segmentation fault

    From your log.atmosphere.0000.out file, it looks like you're using hybrid MPI+OpenMP:

        Compile-time options:
            Build target: gfortran
            OpenMP support: yes
            OpenACC support: no
            Default real precision: double
            Compiler flags: optimize
            I/O layer: SMIOL

        Run-time settings:
            MPI task...
  9. Strange behavior with MPAS-Atmosphere 8.0.1

    Great to hear the runtimes are now in line with expectations, and thanks very much for following up!
  10. Running Model with Dense Meshes

    As an aside, you will likely get better model simulation rates if you use the Intel compilers on Cheyenne, although compilation time will generally be longer compared with the GNU compilers.
  11. Running Model with Dense Meshes

    I'm not sure what the issue might be, unfortunately. I've just tried compiling on Cheyenne with the following modules loaded, which I think roughly match what you may be using:

        Currently Loaded Modules:
          1) ncarenv/1.3   2) gnu/10.1.0   3) ncarcompilers/0.5.0   4) mpt/2.25   5)...
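
    To set up a similar environment, something along these lines should work on Cheyenne (the module versions are taken from the listing above and may differ over time):

        module purge
        module load ncarenv/1.3 gnu/10.1.0 ncarcompilers/0.5.0 mpt/2.25
        module list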
  12. Vertical profiles from MPAS 8.0.1

    If you would like the geometric heights for variables defined at vertical layer midpoints (theta, qv, uReconstruct{Zonal, Meridional}, etc.), you can just average the geometric heights of the layer interfaces provided by the 'zgrid' array.
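
    In index form, assuming 'zgrid' holds nVertLevels+1 interface heights in each column:

        z_mid(k) = 0.5 * (zgrid(k) + zgrid(k+1)),   k = 1, ..., nVertLevels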
  13. Strange behavior with MPAS-Atmosphere 8.0.1

    It might also be worth comparing the total time for individual code regions in the timer summaries at the end of the log files. Are all of the timer values proportionately larger in the 72-h simulation compared with the 48-h simulation? Or are some code regions taking disproportionately longer?
  14. Strange behavior with MPAS-Atmosphere 8.0.1

    It might be interesting to see if every time step is slower for the 72-h simulation, or whether the simulation is slower only during certain periods of the integration. If you run something like

        grep "Timing for integration" log.atmosphere.0000.out

    for the original 48-h run, and again for the...
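
    As a rough sketch, the per-step timings from the two runs could be captured and compared side by side (the run directory names here are just for illustration):

        grep "Timing for integration" run_48h/log.atmosphere.0000.out > timing_48h.txt
        grep "Timing for integration" run_72h/log.atmosphere.0000.out > timing_72h.txt
        paste timing_48h.txt timing_72h.txt | less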
  15. Unable to Post-Process Output Files

    You will need to ensure that the NetCDF library that your application (MATLAB, convert_mpas) is using has been compiled with support for the CDF-5 (large variable) file format. Depending on which version of the NetCDF library you're installing, this may be as simple as adding the option...
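
    For example, with a recent netcdf-c source build, CDF-5 support is controlled at configure time; a minimal sketch (the install prefix is an assumption for illustration):

        ./configure --prefix=/usr/local/netcdf --enable-cdf5
        make && make install

        # newer nc-config versions can confirm the feature
        nc-config --has-cdf5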
  16. PIO1.f90 error &

    It may be that you will need a newer version of the GNU compilers. I have verified that MPAS-A v8.0.1 will build successfully with the GNU 7.4.0 compilers, but it's been a while since I've tried anything older than that.
  17. PIO1.f90 error &

    Always glad to help! Which version of the 'gfortran' compiler are you using?
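
    For reference, the compiler version can be printed with:

        gfortran --version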
  18. PIO1.f90 error &

    It looks like you may have installed a PIO 1.x version, rather than the newer PIO 2.x library, based on the presence of just a libpio.a file in your $PIO/lib directory. Is that correct? Anyway, if you're using MPAS v8.0 or later, an alternative may be to use the SMIOL I/O library rather than...
  19. PIO1.f90 error &

    Have you set the PIO environment variable to the installation path of the PIO library? What do you see if you run the commands

        echo $PIO
        ls -l $PIO/lib
  20. Running Model with Dense Meshes

    I think there may be a few issues, a couple of which are specific to MPAS-A releases prior to v8.0. For MPAS v7.3 and earlier, the following are important:

        1) Ensure that the following vertical dimensions are all set to 1 when processing static fields to avoid allocating large 3-d atmospheric...