MPAS-A stops when generating init.nc file for regional simulation

yanhongyi

Member
I've successfully generated a regional Mediterranean.static.nc file by running the MPAS-Limited-Area tool (GitHub - MPAS-Dev/MPAS-Limited-Area: a Python tool to create a regional subset of a global MPAS mesh).

The original x5.xxxxx.static.nc (15km-3km) has no problem: it can be processed successfully to generate the x5.xxxxx.init.nc file, and I can then run the global simulation successfully. I've also checked the x5.xxxxx.static.nc file; it looks fine and can be plotted with an NCL script.

However, when I use the Mediterranean.static.nc file obtained from the limited-area tool (it is written out by the tool without any errors) to generate the init.nc file, MPAS-A stops without producing any .err files. The .out file looks normal, but the run still stops. When I checked the Slurm log file (which contains information reported by the supercomputer), I found this:

"forrtl: severe (408): fort: (7): Attempt to use pointer LATCELL when it is not associated with a target"

I have no idea what is going wrong. Can anybody help me with this problem?

Attached are my settings and some other files.
 

Attachments

  • log.init_atmosphere.0000.out.txt (3.9 KB)
  • namelist.init_atmosphere.txt (1.4 KB)
  • regional_terrain.png (521.3 KB)
  • slurm-72865.out.txt (836 bytes)
  • streams.init_atmosphere.txt (1 KB)
Recently I've also tried a 92km-25km mesh and a 15km-3km mesh (circular region), with the same result: I still cannot generate my init.nc file.
 
Hello yanhongyi,
It's been a long time since your original post, so I hope you have been able to solve the issue yourself. If so, please do share your solution.

If not, I have a suggestion for you.
Have you considered that the number of cells in your mesh (6,488,066) is quite large (> 2,000,000)? Make the following change; I hope it will work.

Notes on Creating Large Regions (nCells >= 2,000,000)

If the region you create is significantly large ( >= 2,000,000 grid cells) you will need to change the NetCDF file version of the regional grid file. To do this, you can change the format keyword argument of the LimitedArea initialization call within the create_region script to NETCDF3_64BIT_DATA:

regional_area = LimitedArea(args.grid,
                            args.points,
                            format='NETCDF3_64BIT_DATA',
                            **kwargs)

Also check out the following.
 
I think the comment above the call to blend_bdy_terrain(...) at line 198 of mpas_init_atm_cases.F helps in understanding the error:
!
! NB: When calling blend_bdy_terrain(...) with the 'dryrun' argument set, the nCells, latCell, lonCell,
! bdyMaskCell, and ter arguments are not used -- only the config_met_prefix and config_start_time
! arguments are used.
!
Essentially, we call the blend_bdy_terrain(...) routine initially just to check that we will be able to read the intermediate file with first-guess terrain, in which case we don't actually need the model grid cell locations; so, we never bother to set the latCell, lonCell, bdyMaskCell, and ter pointers before the call. But, this triggers a use of an unassociated pointer (even if just to pass it through a subroutine call).
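
To see why this aborts even though the pointer is never actually used, here is a minimal sketch (not MPAS code; the program and routine names are made up) of the same situation: an unassociated pointer passed to a non-pointer, assumed-shape dummy argument. When built with Intel's runtime checks enabled (e.g. -check pointers or -check all, which a DEBUG=true Intel build typically turns on), the call itself triggers a "forrtl: severe (408)" abort like the one in the Slurm log:

Code:
program unassociated_pointer_demo
   implicit none

   ! Stand-in for the latCell pointer that is never set before the 'dryrun' call.
   real, dimension(:), pointer :: latCell

   nullify(latCell)           ! deliberately left disassociated
   call dryrun_like(latCell)  ! with -check pointers this aborts here, naming LATCELL

contains

   ! Non-pointer, assumed-shape dummy: the actual argument must be associated,
   ! even though the routine never references it.
   subroutine dryrun_like(lat)
      real, dimension(:), intent(in) :: lat
      ! intentionally empty: 'lat' is never touched, just as in the dry-run call
   end subroutine dryrun_like

end program unassociated_pointer_demo

Declaring the dummy arguments themselves as pointers (the alternative fix sketched below) removes the requirement that the actual argument be associated at the call.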

One simple solution would be to re-compile the init_atmosphere_model program without debugging options enabled (run "make clean CORE=init_atmosphere", then compile again without setting DEBUG=true).

An alternative, quick fix (that I haven't fully tested) might be to declare the latCell, lonCell, bdyMaskCell, and ter arguments as pointers in the blend_bdy_terrain(...) routine; you could do this around lines 6789 - 6792 of mpas_init_atm_cases.F:
Code:
      real (kind=RKIND), dimension(:), pointer :: latCell  ! These four variables (latCell, lonCell, bdyMaskCell, and ter)
      real (kind=RKIND), dimension(:), pointer :: lonCell  ! may actually have more than nCells elements, for example,
      integer, dimension(:), pointer :: bdyMaskCell        ! if the arrays include a "garbage cell".
      real (kind=RKIND), dimension(:), pointer :: ter      !

In any case, this is probably something that we should address in a bugfix release.
 