
How to use ERA5 Data From Copernicus Database

William.Hatheway


**Guide to Using Copernicus ERA5 Data with WRF**


Dear WRF Users,

Many users encounter challenges when utilizing ERA5 data from the Copernicus Climate Data Store (CDS) with WRF. To assist, I've put together a step-by-step guide on how to successfully download, process, and run ERA5 data within WRF. Please note that, unlike the ERA5 data mirrored in NCAR's Research Data Archive (RDA), the ECMWF Copernicus database involves a few additional steps.

### Step 1: Acquiring ERA5 Data from CDS

ERA5 data from ECMWF is split into two distinct categories: Pressure Levels and Single-Level Surface data. It's crucial to download these datasets separately and store them in separate directories for proper processing. I recommend organizing them as follows:

- `ERA5/PRESSURE`
- `ERA5/SURFACE`

You can find the necessary datasets here:

- [ERA5 Pressure Level Data]
- [ERA5 Single-Level Surface Data]

### Step 2: Downloading Data for WRF

#### ERA5 Pressure Level Data

For the pressure level data, download the following variables and pressure levels:

**Variables:**
- `divergence`, `fraction_of_cloud_cover`, `geopotential`, `ozone_mass_mixing_ratio`, `potential_vorticity`, `relative_humidity`, `specific_cloud_ice_water_content`, `specific_cloud_liquid_water_content`, `specific_humidity`, `specific_rain_water_content`, `specific_snow_water_content`, `temperature`, `u_component_of_wind`, `v_component_of_wind`, `vertical_velocity`, `vorticity`

**Pressure Levels:**
- `10`, `20`, `30`, `50`, `70`, `100`, `125`, `150`, `175`, `200`, `225`, `250`, `300`, `350`, `400`, `450`, `500`, `550`, `600`, `650`, `700`, `750`, `775`, `800`, `825`, `850`, `875`, `900`, `925`, `950`, `975`, `1000`

**Times:** (adjust according to your case)
- `00:00`, `01:00`, `02:00`, ..., `23:00`

#### ERA5 Single-Level Surface Data

For the surface data, the following variables are required:

**Variables:**
- `10m_u_component_of_wind`, `10m_v_component_of_wind`, `2m_dewpoint_temperature`, `2m_temperature`, `land_sea_mask`, `mean_sea_level_pressure`, `sea_ice_cover`, `sea_surface_temperature`, `skin_temperature`, `snow_density`, `snow_depth`, `soil_temperature_level_1`, `soil_temperature_level_2`, `soil_temperature_level_3`, `soil_temperature_level_4`, `surface_pressure`, `volumetric_soil_water_layer_1`, `volumetric_soil_water_layer_2`, `volumetric_soil_water_layer_3`, `volumetric_soil_water_layer_4`

**Times:** (adjust according to your case)
- `00:00`, `01:00`, `02:00`, ..., `23:00`
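If you prefer to script the downloads, here is a minimal sketch using the official `cdsapi` Python client. The dataset names (`reanalysis-era5-pressure-levels`, `reanalysis-era5-single-levels`) match the CDS catalogue, but the request keys (e.g. `data_format` vs. the older `format`) vary between CDS API versions, and the variable/level lists below are trimmed for brevity — generate the exact request with the "Show API request" button on the CDS web form.

```python
# Illustrative sketch of the two CDS API requests (one per dataset).
# Dates, variables, and levels here are example values; adjust to your case.

TIMES = [f"{h:02d}:00" for h in range(24)]  # hourly, 00:00 .. 23:00

PRESSURE_VARS = ["geopotential", "relative_humidity", "specific_humidity",
                 "temperature", "u_component_of_wind", "v_component_of_wind"]
LEVELS = ["1000", "925", "850", "700", "500", "300", "200", "100", "50", "10"]

SURFACE_VARS = ["10m_u_component_of_wind", "10m_v_component_of_wind",
                "2m_dewpoint_temperature", "2m_temperature", "land_sea_mask",
                "mean_sea_level_pressure", "skin_temperature",
                "surface_pressure"]

def era5_request(variables, times, extra=None):
    """Build a CDS request dict for one day of hourly ERA5 data in GRIB."""
    req = {
        "product_type": "reanalysis",
        "variable": variables,
        "year": "2023",
        "month": "04",
        "day": "21",
        "time": times,
        "data_format": "grib",  # older CDS API versions use "format"
    }
    if extra:
        req.update(extra)
    return req

def download(dataset, request, target):
    """Submit one retrieval (needs `pip install cdsapi` and ~/.cdsapirc)."""
    import cdsapi
    cdsapi.Client().retrieve(dataset, request, target)

# Two separate requests, saved into the two directories recommended above:
# download("reanalysis-era5-pressure-levels",
#          era5_request(PRESSURE_VARS, TIMES, {"pressure_level": LEVELS}),
#          "ERA5/PRESSURE/era5_pl.grib")
# download("reanalysis-era5-single-levels",
#          era5_request(SURFACE_VARS, TIMES),
#          "ERA5/SURFACE/era5_sfc.grib")
```

Keeping the pressure-level and single-level retrievals as two separate requests also keeps the two datasets in the two separate directories that the ungrib step below expects.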

### Step 3: Ungribbing the Data

After downloading the required data, you'll need to ungrib both datasets. The best approach is to use two separate `namelist.wps` files—one for pressure data and one for surface data. This separation simplifies the process and keeps the workflow organized. I’ve provided example `namelist.wps` files that can assist you.

#### Ungribbing the Pressure Data

Use the following commands to ungrib the pressure data:


Bash:
ulimit -s unlimited
cd /path/to/wps/folder/

# Link the pressure-level data
./link_grib.csh /path/to/ERA5/PRESSURE/

# Link the Vtable for ECMWF data
ln -sf ungrib/Variable_Tables/Vtable.ECMWF ./Vtable

# Ungrib the data
./ungrib.exe

# Remove the GRIBFILE links after processing
rm -f GRIBFILE.*


The `&ungrib` section of `namelist.wps` should be configured like this for the pressure data (the `start_date` and `end_date` entries in the `&share` section control the interval that ungrib processes):


Code:
&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/


#### Ungribbing the Surface Data

Next, ungrib the surface data with the following commands:


Bash:
# Link the surface-level data
./link_grib.csh /path/to/ERA5/SURFACE/

# Link the Vtable for ECMWF data
ln -sf ungrib/Variable_Tables/Vtable.ECMWF ./Vtable

# Ungrib the data
./ungrib.exe


Use a different prefix for the surface data in the `namelist.wps`:

Code:
&ungrib
 out_format = 'WPS',
 prefix = 'SFILE',
/


### Step 4: Running Metgrid

Once the ungribbing process is complete, the next step is to run `metgrid` to process the data for WRF. You need to configure `metgrid` to read both the pressure (FILE) and surface (SFILE) data by specifying them in the `namelist.wps` file:


Code:
&metgrid
 fg_name = 'SFILE','FILE',
 io_form_metgrid = 2,
/


Run `metgrid` using the following command:


Bash:
mpirun -np 4 ./metgrid.exe    # adjust 4 to the number of processors available


This command will process both the surface and pressure data and generate the `met_em*` files needed for WRF.

### Step 5: Running WRF

After completing the `metgrid` step, the `met_em*` files will be ready for WRF. Copy or link these files into the WRF run directory, run `real.exe` to generate the initial and boundary condition files, and then proceed with your WRF simulation.

---

I hope this guide helps streamline the process of using ERA5 data from Copernicus with WRF. If you encounter any issues or have questions, feel free to reach out. Please note, I am not affiliated with NCAR, and my goal is simply to assist fellow users. For any corrections or additional insights, I encourage the forum admins (@kwerner, @Ming Chen, @weiwang) to weigh in.

Best regards
 

Attachments

  • pressure.namelist.wps (824 bytes)
  • surface.namelist.wps (833 bytes)
Thanks for your post. I'd really appreciate it if you could provide insights into downloading and utilizing SST data in combination with ERA5 forcings.
 
I am following the above guide on running WPS (v4.6) on ERA5 pressure-level and surface data acquired from cds.climate.copernicus.eu and getting stuck on running ungrib.exe on the surface data. I get the following error, which other forum posts indicate that the method above *should* resolve:

    I was expecting a Grib1 file, but this is a Grib2 file.
    It is possible this is because your GRIBFILE.XXX files
    are not all of the same type.
    WPS can handle both file types, but a separate ungrib
    job must be run for each Grib type.

I acquired the ERA5 data from Copernicus using the Python API to generate the two requests, with the GRIB file format specified. I put the two GRIB files in separate subdirectories: ERA5/PRS/data1.grib and ERA5/SFC/data2.grib

I then made two separate ungrib.exe runs, one per directory, changing the namelist.wps files before each run in accordance with the procedure above. I am using the Vtable.ECMWF file packaged with WPS.

I made sure that there were no other GRIBFILE.XXX in the working directory. On one occasion, I even tried running a variation of:
grib_set -r -w packingType=grid_ccsds -s packingType=grid_simple input.grib2 output.grib2

Nothing that I have tried is clearing this error or providing additional information on where the problem might lie. By all accounts, the method above works / has worked with copernicus data.

Does anyone have any insights? Have there been any changes to either Copernicus or WPS that could be relevant?
 
did you download the data as one big file or separate files?
 
I downloaded as one big file. I figured out the solution.

I didn't realize this, but when you select the "grib" option, the API does not standardize the results as a single file format. If you select parameters that are natively grib1, it will return the data in grib1, and if the parameters are natively in grib2 then it will return the data in grib2. Thus, selecting to receive the data as a single file will cause grib1 and grib2 records to be combined in the same file.

The solution is either to group the parameters that are natively grib1 into one request and those that are natively grib2 into another, or to request each parameter individually.
 
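A quick way to sanity-check whether a downloaded file mixes GRIB editions is to read the edition octet of each message indicator: both GRIB1 and GRIB2 messages start with the ASCII indicator "GRIB" and store the edition number in octet 8. This is an illustrative sketch, not an official tool (ecCodes' `grib_ls -p edition file.grib` gives an authoritative listing if you have it installed):

```python
def grib_editions(data: bytes) -> set:
    """Return the set of GRIB edition numbers found in a byte string.

    Scans for the ASCII indicator "GRIB" and reads octet 8 of each
    indicator section. Scanning raw bytes can false-positive if "GRIB"
    happens to occur inside a data section, so treat the result as a
    hint rather than a definitive answer.
    """
    editions = set()
    i = data.find(b"GRIB")
    while i != -1:
        if i + 8 <= len(data):
            editions.add(data[i + 7])
        i = data.find(b"GRIB", i + 8)
    return editions

# Demo with two fabricated 8-byte indicator headers (not real GRIB data):
fake = b"GRIB\x00\x00\x00\x01" + b"..." + b"GRIB\x00\x00\x00\x02"
print(sorted(grib_editions(fake)))  # [1, 2] -> editions mixed in one "file"

# Real usage: grib_editions(open("ERA5/SFC/data2.grib", "rb").read())
```

If this reports more than one edition in a single file, splitting the request as described above (or re-requesting parameters individually) should clear the ungrib error.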
glad you figured it out
 
Hello, @William.Hatheway, thank you for the post. I have a question: why download variables like 'divergence', 'fraction_of_cloud_cover', 'ozone_mass_mixing_ratio', etc. through the API if these variables are not in the Vtable? In this case, the program ungrib.exe won't process these variables. Wouldn't it be necessary to download only the variables listed in the Vtable?
 
Good morning @robson_passos

That's probably an artifact of when I was using it for WRF-CHEM purposes instead of just WRF-ARW. I should probably go through it again and double check what all needs to go away
 
This discussion will prove very helpful to me. Thanks for posting!

I have a question about the single-level 10 m wind data. If I follow this process, will WRF eventually be fed this information? And what if I wanted to provide WRF with the pressure-level winds plus both the 10 m and 100 m single-level data? Is that possible?

Does anyone have any thoughts on this?
 