Ungribbing ERA5 grib2 files


I am attempting to ungrib ERA5 files in grib2 format, but I keep getting the error message
Looking for data at time 2012-05-13_06
ERROR: Data not found: 2012-05-13_06:00:00.0000
When I run
, it will read the grib2 files, but it only returns the date/time 2012-05-01_00:00:00 for every line.
I can see that I have all necessary dates/times because when I run
cdo showtimestamp era5.grib2
, I get 2012-05-01T00:00:00 through 2012-05-31T21:00:00 at 3-hr increments for 248 total timestamps.
I found this post from the WRF Forum in 2019 that mentions existing issues with reading ERA5 data. Does that issue still persist? And do you know of a workaround that has worked for others using ERA5 data? I've tried converting grib2 to grib1 and the suggestion mentioned here, but I still can't get it to work.
Thank you
Hi, kyle_r
WPS can only process GRIB2 data files that contain a single time each. If your data file contains data for multiple times, WPS cannot work with it.
Is there any way you can split the single file into multiple files, each containing data for only one time? Sorry that we don't have a utility to do so, but I would appreciate it if you could post the information here if you figure out how. Many users have a similar problem, and we hope to get some helpful information from the community. Thanks in advance.
Hi Ming,

Thank you for the confirmation.
I was able to use a relatively simple workflow to accomplish what you suggested. My example is based on a month of 3-hourly data.
Using Climate Data Operators (CDO), I split the data into separate days using
cdo splitday era5_201205.grib2 era5_201205_
where era5_201205.grib2 is the input file name and era5_201205_ is the base file name for the output files. CDO appends the day number and file extension to the end (returning 31 files for May).
I then used
cdo splithour era5_201205_01.grb era5_3hr_ml_130_201205_01_
to split each daily file from step 1 into separate GRIB files by hour. The syntax is the same as above except for the operator used. This command has to be run on each of the 31 files from step 1; for 3-hourly data, it outputs 8 GRIB files per day.
You'll probably want to put it in a bash script to loop through the files, because they add up quickly.
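A minimal bash sketch of that loop, assuming the file names used above and that cdo is on your PATH. Each cdo call is prefixed with echo so the script only prints the commands it would run; drop the echoes to actually execute them once the names match your data.

```shell
#!/bin/bash
# Step 1: split the monthly file into one file per day.
# (echo left in so this prints the command instead of running it.)
echo cdo splitday era5_201205.grib2 era5_201205_

# Step 2: split each daily file into single-time files.
# seq -w zero-pads the day numbers (01..31) to match CDO's output names.
for day in $(seq -w 1 31); do
  echo cdo splithour "era5_201205_${day}.grb" "era5_3hr_ml_130_201205_${day}_"
done
```

For months shorter than 31 days, step 1 simply produces fewer files, so the extra echoed commands in the loop would refer to files that don't exist; adjust the upper bound of seq accordingly.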