Hello!
I am using obsgrid.exe to combine my WPS output (i.e., met_em*.nc files) with the ds046.1 and ds035.1 observation datasets. My research domain covers all of China at a 27 km resolution. To improve the accuracy of the meteorological fields, I tried the OBSGRID tool. I followed the user guide (OBSGRID/ObsNudgingGuide.pdf at master · wrf-model/OBSGRID) to compile and run it. However, when I ran obsgrid.exe, the program failed after a few seconds with a segmentation fault.
I have searched the forum and found similar errors, but could not find a suitable solution. I have also run the command "ulimit -s unlimited", but the error remains.
----------------------------------------------------------------------------
Time period #00227 is for date 2019-01-29_18:00:00
Time period #00228 is for date 2019-01-29_21:00:00
Time period #00229 is for date 2019-01-30_00:00:00
Time Loop Processing, date = 2019-01-01_12:00:00
#########################################################################
Setting radius of influence for Cressman scheme
The radius of influence of each scan is set to:
20, 14, 10, 7,
#########################################################################
Setting up LAMBERT CONFORMAL map...
Computed cone factor: 0.539
Computed pole (x,y) = 93.514 488.799
Setting up LAMBERT CONFORMAL map...
Computed cone factor: 0.539
Computed pole (x,y) = 94.014 489.299
Using ./OBS:2019-01-01_12 as obs input file
Number of observations successfully ingested: 2420.
Number of empty observations discarded: 456.
Number of observations discarded outside of domain: 45141.
Of the 2420 observations reported, 184 of them are duplicates, leaving 2236 merged locations
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
libpthread-2.27.s 0000153831016980 Unknown Unknown Unknown
obsgrid.exe 0000000000463B40 Unknown Unknown Unknown
obsgrid.exe 0000000000468024 Unknown Unknown Unknown
obsgrid.exe 00000000004123C6 Unknown Unknown Unknown
obsgrid.exe 000000000041E554 Unknown Unknown Unknown
obsgrid.exe 000000000040C3BD Unknown Unknown Unknown
libc-2.27.so 0000153830C34C87 __libc_start_main Unknown Unknown
obsgrid.exe 000000000040C2BA Unknown Unknown Unknown
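For reference, the stack-limit settings I tried before re-running look like the sketch below. The OMP_STACKSIZE value is only a guess on my part (it matters only if obsgrid.exe was built with OpenMP, since thread stacks are sized separately from the shell's stack limit):

```shell
# Raise the shell's stack limit before launching obsgrid.exe
ulimit -s unlimited
# Guessed value for OpenMP thread stacks; may need tuning for a large domain
export OMP_STACKSIZE=512M
echo "stack limit: $(ulimit -s), OMP_STACKSIZE: $OMP_STACKSIZE"
# ./obsgrid.exe   # then re-run in the same shell
```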
The attached file is my namelist.oa. The other input files (e.g., met_em*.nc and OBS*) can be found here: example input files. I suspect this problem may be caused by the substantial number of observations from the ds046.1 dataset (see the image below, produced by ./util/station.ncl from the OBSGRID package). If so, is there a tool that can be used to reduce the density of the observations?
Any suggestions would be appreciated!
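To illustrate what I mean by reducing the density: a naive grid-based thinning that keeps at most one report per 1-degree lat/lon cell could look like this. This is my own sketch on a synthetic lat,lon list, not an OBSGRID utility, and it ignores the actual OBS record format:

```shell
# Illustrative only -- not an OBSGRID utility.
# Keep the first report in each 1-degree lat/lon cell; later reports
# falling in an already-seen cell are dropped.
printf '39.90,116.40\n39.95,116.45\n40.10,116.40\n' |
awk -F, '{ cell = int($1) "," int($2) } !seen[cell]++'
```

Here the first two stations share cell (39,116), so only the first survives, leaving two reports out of three.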
Su Yi