Hello,
I am currently working with the wrfout* and wrfxtrm* files and have run into a significant issue concerning the computation of T2 and T2MEAN. I have reviewed the relevant source directories (particularly the physics modules), consulted the User's Guide, and searched GitHub, but I have not found a clear explanation of how T2 and T2MEAN are computed.
In particular, I am seeing an average difference of approximately -1.1 °C between T2 from the wrfout* files and T2MEAN from the wrfxtrm* files. The difference is fairly consistent across the entire domain, over both ocean and land. A bias of this size seems large, and I am struggling to understand its cause.
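For context, here is a simplified sketch of the kind of comparison I am making (file names, output frequencies, and the way the averaging windows are aligned are illustrative assumptions, not my exact workflow):

```python
# Rough sketch: domain-mean difference between instantaneous T2 (wrfout*)
# and time-averaged T2MEAN (wrfxtrm*). Paths and averaging choices are
# placeholders.
import xarray as xr

# Instantaneous 2-m temperature from the history files.
wrfout = xr.open_mfdataset("wrfout_d01_*", combine="nested", concat_dim="Time")

# Time-averaged 2-m temperature from the extreme-diagnostics files.
wrfxtrm = xr.open_mfdataset("wrfxtrm_d01_*", combine="nested", concat_dim="Time")

# Average each field over the full simulation period, then take the
# difference (T2 minus T2MEAN); a difference in kelvin equals a
# difference in degrees Celsius.
t2_avg = wrfout["T2"].mean(dim="Time")
t2mean_avg = wrfxtrm["T2MEAN"].mean(dim="Time")

diff = t2_avg - t2mean_avg
print("Domain-mean T2 - T2MEAN:", float(diff.mean().values), "K")
print("Spatial spread (std):   ", float(diff.std().values), "K")
```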
Could anyone provide insight into how T2 and T2MEAN are calculated in these files, or offer any explanations as to why this bias might be occurring in my simulations?
Thank you for your help.