jonasberndt
New member
Dear WRF help team,
This is a question about setting up an assimilation experiment with the WRFDA system, in particular 3DVAR with background error (BE) statistics.
We want to run a predictability experiment on the evolution of a baroclinic instability with WRF and 3DVAR assimilation, and we will invest considerable HPC resources in it. We want to set up the experiment in a 3DVAR cycling mode. For the calculation of the background error covariance statistics, the NMC method seems appropriate to us for practical reasons.
Let's say we want to initialize our target forecast at 20140808 00:00 UTC. We thought to initialize WRF at 20140806 00:00 UTC from WPS (cold start from a GFS analysis) and then successively compute an analysis in 6-hour steps.
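To make the intended schedule concrete, here is a quick sketch (GNU date assumed; the dates are just our example window) that enumerates the 6-hourly analysis times between the cold start and the target initialization:

```shell
#!/bin/sh
# List the 6-hourly analysis cycle times from the cold start
# (2014-08-06 00 UTC, from WPS/GFS) up to the target initialization
# (2014-08-08 00 UTC).
start_epoch=$(date -u -d "2014-08-06 00:00" +%s)
end_epoch=$(date -u -d "2014-08-08 00:00" +%s)

cycles=""
t=$start_epoch
while [ "$t" -le "$end_epoch" ]; do
    cycles="$cycles $(date -u -d "@$t" +%Y%m%d%H)"
    t=$((t + 21600))   # advance 6 hours (21600 s)
done
echo "analysis cycles:$cycles"
```

This gives nine analysis times (the cold start plus eight cycled analyses), each of which would get a da_wrfvar.exe run followed by a 6-hour WRF forecast to provide the next background.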
Does this setup sound reasonable to you? Is the 6-hour analysis interval too short? Should we make the cold start a few days earlier, or somewhat later? What is your experience with this?
Is it worth updating the lower boundary conditions via da_update_bc.exe, or is it not worthwhile since the time span is short anyway?
One last question remains, just to make sure: for the calculation of the background error covariance statistics in a 6-hour cycling mode, we would use the forecast initialized 6 hours ago and the lagged forecast initialized 12 hours ago, right? Or should we rather use lagged forecasts initialized 18/24 hours ago?
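To illustrate the pairing we have in mind (assuming the common NMC convention of differencing a longer- and a shorter-range forecast valid at the same time, e.g. 24 h minus 12 h), a quick sketch of the initialization times involved for one example valid time:

```shell
#!/bin/sh
# For a given valid time, print the initialization times of the forecast
# pair whose difference would feed the NMC statistics: the 12-h forecast
# (initialized 12 h earlier) and the 24-h forecast (initialized 24 h earlier).
valid_epoch=$(date -u -d "2014-08-07 00:00" +%s)   # example valid time

init12=$(date -u -d "@$((valid_epoch - 12*3600))" +%Y%m%d%H)
init24=$(date -u -d "@$((valid_epoch - 24*3600))" +%Y%m%d%H)

echo "valid $(date -u -d "@$valid_epoch" +%Y%m%d%H):"
echo "  12-h forecast initialized $init12"
echo "  24-h forecast initialized $init24"
```

So for a valid time of 2014080700, the pair would be the forecasts initialized at 2014080612 and 2014080600; whether that short-range pairing or an 18/24-hour one is more appropriate is exactly what we are unsure about.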
Thank you very much for any help on this issue.
Jonas