
Can't open some wrfouts after simulation

janac4

Member
Hello, I'm familiar with the WRF model and have run multiple simulations successfully. However, I set up a new domain and tried a 30-hour run. Everything apparently ran fine, but when I tried to read the output, some of the wrfout files could not be read by either Python or bash commands. For example, running ncdump on them produces "Segmentation fault (core dumped)".
After looking at the log file I found that when creating these files, it says:
open_hist_w : error opening wrfout_d01_2022-11-28_11:00:00 for writing. -1021
Does anyone know how to solve this? It isn't a disk-space problem, but I don't know what else it could be.
Thanks a lot.
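A quick way to find out which wrfout files are damaged before feeding them to ncdump is to look at the file header. This is a hypothetical check, not something from this thread: netCDF classic and 64-bit-offset files start with the bytes "CDF", and netCDF-4 (HDF5-based) files start with "\x89HDF", so anything else usually means a truncated or corrupt file.

```shell
# Hypothetical integrity check: flag wrfout files whose header is not a
# recognizable netCDF magic number. A file that makes ncdump segfault is
# often truncated or has a corrupt header.
for f in wrfout_d01_*; do
  [ -e "$f" ] || continue                 # skip if the glob matched nothing
  magic=$(head -c 3 "$f" 2>/dev/null)     # "CDF" for classic/64-bit offset
  if [ "$magic" = "CDF" ] || head -c 4 "$f" 2>/dev/null | grep -q "HDF"; then
    echo "$f: header looks OK"
  else
    echo "$f: SUSPECT (bad or missing netCDF header)"
  fi
done
```

Note this only catches a mangled header; a file can still be truncated mid-record, so `ncdump -h` on the survivors is a good second check.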
 

Attachments

  • rsl.error.txt
    3.3 MB · Views: 4
The model should automatically compile with large-file support, but this could still be a disk-space issue. I'm actually surprised it was able to run with only a single processor. If you have access to more processors, you may want to consider running with more. You can take a look at this FAQ for information on determining an appropriate number, based on the size of your domain.

Is this a real-data run, or an idealized simulation?
 
It is a real-data run using multiple processors. I have tried using:
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
This mostly solved my problem. Only the first wrfout was damaged after the simulation; all the others are fine. Since I discard the first files anyway because of spin-up, I can use all the data. I hope no other wrfouts are damaged in future runs.
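For reference, a minimal sketch of the workaround above: set the variable in the shell before compiling/running WRF so that netCDF history output uses the 64-bit-offset (large-file) format instead of the classic format, which is limited to roughly 2 GB per file.

```shell
# Enable netCDF large-file (64-bit-offset) output in WRF. Classic-format
# files are limited to ~2 GB; exceeding that limit during a write can
# leave a wrfout file unreadable.
export WRFIO_NCD_LARGE_FILE_SUPPORT=1

# Assuming the netCDF utilities are installed, `ncdump -k <wrfout>` reports
# the file kind (e.g. "64-bit offset"), a quick way to confirm the setting
# took effect on the output files.
```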
 
Hi, I am actually facing a similar issue. I made a 5-day run with frames_per_outfile = 1 and the first wrfout file came out damaged/corrupted, so I left that file out and carried on with the post-processing. But the other runs I tried also have their first wrfout file corrupted; only the initial files get damaged, and the others have no issue. How can I resolve this? While it is possible to skip the first file when there is one file per output time, when I want all the outputs in a single file (frames_per_outfile = 1000), which I also tried, that file fails to open as well. Can someone help me solve the issue with the initial wrfout file being corrupted?

The first file does not work in ARWpost or in wrf-python, but all the others do.
I tried the run with both FNL 1-degree files and GDAS FNL 0.25-degree files, and the same issue occurs with both.
I will attach the namelist for reference.
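For context, the &time_control entries being discussed look roughly like this. The values below are illustrative, not taken from the attached namelist:

```fortran
&time_control
 history_interval   = 60, 60,   ! minutes between history writes
 frames_per_outfile = 1, 1,     ! 1 = a new wrfout file per write time;
                                ! a large value (e.g. 1000) keeps all
                                ! frames in a single file per domain
 io_form_history    = 2,        ! 2 = netCDF output
/
```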
 

Attachments

  • namelist.input
    4 KB · Views: 1
Can you open a new issue on the forum? NCAR likes to keep the threads separate.
 