
Joiner program to join wrfinput and wrfbdy files

This post is from a previous version of the WRF & MPAS-A Support Forum. New replies have been disabled. If you have follow-up questions related to this post, please start a new thread from the forum home page.

melpomene23

New member
Hi,

I am running quite a large set of domains, which produces very large met_em files (bigger than 4 GB). As real.exe cannot read met_em files that big, I was thinking of setting io_form_* = 102 for metgrid, real and wrf programs. The issue is that, since I'm going to use a very large number of processors for wrf.exe in order to run efficiently (500*36), I would have to start with exactly that many processors for metgrid and real, which I think is too many.
I was wondering if there's a way to start with, say, 4 processors for metgrid and real, join the wrfinput and wrfbdy patches into a single file for each domain, and then run wrf with a significantly larger number of processors. I know there's a Joiner program for wrfout, but what I'm looking for is to join wrfinput and wrfbdy before running wrf.exe.

Thanks for your input,
 
Please take a look at the code "JOINER.tar" posted on the website:
http://www2.mmm.ucar.edu/wrf/users/contributed/contributed.html
This is sample code that combines patch wrfout files into a single file. It may be a good example from which to develop your own utility to combine wrfinput and wrfbdy.

Ming Chen
 
Ming Chen said:
Please take a look at the code "JOINER.tar" posted on the website:
http://www2.mmm.ucar.edu/wrf/users/contributed/contributed.html
This is sample code that combines patch wrfout files into a single file. It may be a good example from which to develop your own utility to combine wrfinput and wrfbdy.

Ming Chen

Hi Ming,

Thank you for your post. I am aware of the Joiner program and mentioned it in my post. It will take me some time to figure out how to modify it so that it can handle wrfinput, which is why I made this post.
I'd appreciate it if anyone could help with adapting the code for wrfinput.

Thanks,
 
Due to limited manpower here at NCAR, I am very sorry that we cannot get involved in all the coding work raised by our users. I hope someone in the community has this utility and is willing to share it.

Ming Chen
 
Hi there!
I want to run some WRF benchmarks with a big domain of 4500x4500 grid points. My met_em files are generated properly and are around 20 GB using netCDF-4 compression. However, when I run real.exe, it gets stuck after reading in some variables, and the reason is so far unclear to me.

melpomene23 said:
I am running quite a large set of domains, which produces very large met_em files (bigger than 4 GB). As real.exe cannot read met_em files that big, I was thinking of setting io_form_* = 102 for metgrid, real and wrf programs.

You mention that real.exe cannot read met_em files that big: do you have any additional information on that, or a source for it? I'd appreciate any hints, as I am really stuck at the moment.
 
Let's try the following options:

(1) Rerun metgrid.exe with the option io_form_metgrid = 102 (in &metgrid).

(2) Run real.exe with the option io_form_auxinput1 = 102 (in &time_control).

Remember that you need to run metgrid and real using the same number of processors.
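In namelist form, the two settings would look something like this (a sketch showing only the added lines; keep the rest of your namelist.wps and namelist.input unchanged):

```
&metgrid
 io_form_metgrid = 102,
/

&time_control
 io_form_auxinput1 = 102,
/
```

With io_form = 102, each MPI task writes its own patch file, which is why the processor counts for metgrid and real must match.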

Please let me know whether this works.

Ming Chen
 
Hi Ming,
I tried this, but there was no improvement.

I did a lot of testing and found the following problems using WPS & WRF v4.0.1 with ECMWF analysis data as input (I often run WRF with ECMWF data, so the problem is not related to ungrib or anything similar; the top of the ECMWF data is 1 hPa):

Step 1: Set up a domain with 1000x1000 grid points
-> geogrid works
-> metgrid works
-> real works (with 80 levels and p_top_requested=5000)
-> wrf works and produced wrfout-files

Step 2: Set up a domain with 2000x2000 grid points (the only values changed in namelist.wps are "e_we" and "e_sn"; the ECMWF grib data fully covers the extended domain)
-> geogrid works
-> metgrid works
-> real FAILS with "p_top_requested < grid%p_top possible"

BUT: if I compare the met_em files generated in Step 2 to those generated in Step 1, I cannot see any difference in the netCDF variables PRESS and PRESSURE (or any others; all look reasonable, the only difference being the bigger spatial coverage of the 2000x2000 domain from Step 2).
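That error usually means that in at least one column the input data does not reach as high (i.e. to as low a pressure) as p_top_requested. One way to narrow it down is to check the topmost-level pressure in every column against p_top_requested. A minimal sketch of the check, using a synthetic numpy array in place of the pressure field you would read from your met_em files (the array layout and values here are assumptions for illustration, not taken from real data):

```python
import numpy as np

def max_column_top_pressure(pres):
    # pres: (levels, south_north, west_east) in Pa, index 0 = lowest level.
    # real.exe needs p_top_requested >= the top-level pressure in every
    # column; otherwise it can stop with
    # "p_top_requested < grid%p_top possible".
    return float(pres[-1, :, :].max())

# Synthetic example: two columns whose tops reach 100 Pa and 7000 Pa.
pres = np.array([[[100000.0, 100000.0]],
                 [[100.0, 7000.0]]])
p_top_requested = 5000.0
print(max_column_top_pressure(pres))                     # 7000.0
print(p_top_requested >= max_column_top_pressure(pres))  # False -> real.exe would complain
```

In practice you would load the pressure variable from each met_em file (e.g. with the netCDF4 module) and run the same comparison; if it fails only for the larger domain, that points to input data coverage rather than a namelist problem.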

I'd appreciate any hints or ideas on that issue!
 
Please upload your namelist.wps and namelist.input (for both the 1000x1000 and 2000x2000 grids).
Which ECMWF data did you ungrib? Please let me know where you downloaded these data.

Ming Chen
 
Ming...

A couple of thoughts.

I believe I read somewhere that the join program is based on the CAPS "joinwrf" program. If so, that program was designed to join WRF history files. It wasn't designed for "wrfinput_d01" or "wrfbdy_d01", nor, as far as I know, has it ever been tested on them. Since I'm the person who does the realtime runs, I can tell you that I've never tried it.

Any chance that the joiner program is writing 32-bit netCDF instead of 64-bit?

Are split "wrfinput_d01" and "wrfbdy_d01" files supported by the WRF software? I know many years ago, when we did have to run with split files, one of them, I think the LBC files, was *NOT* supported and didn't work. The other did.
 
Kevin,

You are right that the "joinwrf" program only works for wrfout files. It doesn't work for wrfinput or wrfbdy files.

The split option works for history and restart files, but not for wrfinput and wrfbdy.

Ming
 
Ming Chen said:
Please take a look at the code "JOINER.tar" posted on the website:
http://www2.mmm.ucar.edu/wrf/users/contributed/contributed.html
This is sample code that combines patch wrfout files into a single file. It may be a good example from which to develop your own utility to combine wrfinput and wrfbdy.

Ming Chen

Does this Joiner program really work? I downloaded it and tried to join files output with io_form_history = 102, but I always get a segmentation fault for 4-D variables. It seems that when the variable values are assigned for writing, no matter whether the variable is 3-D or 4-D, there are only three loops, covering only the first three dimensions. So it looks to me like this code doesn't work for 4-D variables.
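If the copy loops really only cover three dimensions, one way to make the join dimension-agnostic is to index the patch by its last two (horizontal) axes and copy all leading axes (time, levels, ...) as whole slabs. A rough illustration of the idea in Python/numpy (the function and data here are made up for illustration, not taken from JOINER.tar):

```python
import numpy as np

def join_patches(patches, full_shape):
    # Assemble decomposed patches into one full array.
    # patches: list of (j0, i0, data) where data's last two axes are
    # (south_north, west_east) and (j0, i0) is the patch origin in the
    # full grid. Works for 3-D and 4-D variables alike, because the
    # leading axes are copied with "..." instead of a fixed number of
    # explicit loops.
    full = np.zeros(full_shape, dtype=patches[0][2].dtype)
    for j0, i0, data in patches:
        nj, ni = data.shape[-2], data.shape[-1]
        full[..., j0:j0 + nj, i0:i0 + ni] = data
    return full

# Two patches of a 4-D variable (time, level, j, i) = (1, 2, 2, 4):
left  = (0, 0, np.ones((1, 2, 2, 2)))
right = (0, 2, np.full((1, 2, 2, 2), 2.0))
joined = join_patches([left, right], (1, 2, 2, 4))
print(joined.shape)  # (1, 2, 2, 4)
```

The same restructuring in the Fortran joiner would mean either adding the missing loop over the fourth dimension or copying array sections, rather than hard-coding three nested do loops.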
 