
Designated alternative output location

This post was from a previous version of the WRF&MPAS-A Support Forum. New replies have been disabled and if you have follow up questions related to this post, then please start a new thread from the forum home page.

bdjohns

New member
Hello,

I am attempting to direct wrfout* files to an alternative file location in WRF v4.3. I am limited by storage space in my working directory and would like to output to a scratch storage space. In previous versions, this was pretty straightforward by using
Code:
history_outname = /path/to/scratch/directory
within namelist.input. I noticed this option has been removed from /WRF-4.3/run/README.namelist. Is there a way to use this feature in v4.3+?

Best,
Brad Johnson
 
Brad,
I'm not sure this option was ever listed in the README.namelist file (though it should be); however, the option wasn't removed. You can still find it in the Registry/registry.io_boilerplate. Did you try it?
 
Thank you for your reply. Yes, I included it in a simulation I'm currently running, without success. There are no errors; the wrfout files simply output to the em_real directory.
 
Hi Brad,
I apologize for not catching this after your initial post, but assuming you used the parameter essentially as you stated (with your actual path name, of course), you need to change the format to:
Code:
history_outname = "/path/to/scratch/directory/wrfout_d<domain>_<date>"

- Add quotes around the path and file name.
- Append wrfout_d<domain>_<date> after the directory path. Do not make any modifications to this syntax; it should appear exactly as "wrfout_d<domain>_<date>".
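For context, here is a minimal sketch of how this option sits inside the &time_control section of namelist.input. The interval and frame-count values are only illustrative, and the path is the same placeholder used above:
Code:
&time_control
 history_interval   = 60,
 frames_per_outfile = 1,
 history_outname    = "/path/to/scratch/directory/wrfout_d<domain>_<date>"
/

WRF substitutes <domain> and <date> at write time, so each history file is named and placed in the scratch directory automatically.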

Let me know if that makes a difference. Thanks!
 
This worked for me. Thank you!

I should note that with the output directed to a different location, the model efficiency has dropped by approximately half. I will need to run a couple more simulations to see whether this is consistent or just an issue related to the HPC and storage on my end.
 