
Questions About Using Precompiled WRF on AWS

spark1

New member
Hello WRF community,

I'm new to using WRF on AWS, and I have a few questions regarding the process. I hope some of you with more experience can provide some guidance.

Question 1: Transferring Precompiled WRF to My School Cluster

I'm interested in using precompiled WRF (WPS and WRF) on AWS, but I also need to run simulations on my school cluster. Is it possible to transfer the precompiled WRF (including all necessary files) from AWS to my school cluster? If so, what is the recommended method for doing this, and are there any specific configurations or permissions to be aware of when transferring the files?

Question 2: Obtaining Extra Geographic Data

I understand that I may need additional geographic data for certain WRF models like Noah MP. Can I simply use tools like wget to download the required data files and place them in the WPS_GEOG directory? Are there any specific guidelines or best practices for acquiring and formatting this additional geographic data?

Question 3: Managing Input Data

When it comes to input data, such as ERA5 data, can I create a directory on my AWS instance and download this data directly into that directory using wget or a similar method? What are the recommended steps for managing input data efficiently and ensuring it is accessible to my WRF simulations?

Question 4: Storage Limitations with AWS Free Tier

I'm currently using the AWS free tier, and I'm concerned about storage limitations. Do you have any suggestions for handling storage efficiently within the free-tier limits when working with large input files and geographic data? Are there AWS services or strategies I should consider to optimize storage usage?

I appreciate any insights and advice you can provide on these questions. Your expertise will be invaluable as I navigate my way through using WRF on AWS.

Thank you in advance for your help!
 
Question 1: Transferring Precompiled WRF to My School Cluster

I'm interested in using precompiled WRF (WPS and WRF) on AWS, but I also need to run simulations on my school cluster. Is it possible to transfer the precompiled WRF (including all necessary files) from AWS to my school cluster? If so, what is the recommended method for doing this, and are there any specific configurations or permissions to be aware of when transferring the files?
To do this, you can simply launch a new instance from the AMI that contains the pre-configured/installed WPS/WRF packages. Once you do that, the instance should be identical to the AMI, and since it is then in your own account, you can do whatever you need in there. Our AMIs are in the US East region, so if you aren't using that region, there may be extra steps to get this to your account; there should be AWS documentation on doing this. Let me know if I need to add any settings to allow you to use it, but I *think* I've already done this.
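If your account is in a different region, the copy step can be scripted with the AWS CLI. The sketch below only prints the command so you can review it before running, and the AMI ID is a placeholder, not the real WRF AMI:

```shell
#!/bin/sh
# Sketch: copy the precompiled WRF/WPS AMI from us-east-1 into another
# region. AMI_ID is a placeholder; substitute the actual AMI ID of the
# precompiled WRF image.
AMI_ID="ami-0123456789abcdef0"   # placeholder, not the real WRF AMI
DEST_REGION="us-west-2"          # set to your own region

# Printed rather than executed so the command can be reviewed first.
echo "aws ec2 copy-image --source-region us-east-1 \\"
echo "  --source-image-id $AMI_ID --region $DEST_REGION --name wrf-wps-copy"
```

Once the copy finishes, the AMI shows up in your destination region and you can launch from it as usual.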

Question 2: Obtaining Extra Geographic Data

I understand that I may need additional geographic data for certain WRF models like Noah MP. Can I simply use tools like wget to download the required data files and place them in the WPS_GEOG directory? Are there any specific guidelines or best practices for acquiring and formatting this additional geographic data?
Yes, you can simply add the files using 'wget' or similar commands. There aren't any special guidelines; just make sure all the static data directories reside in the same WPS_GEOG directory.
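As a rough sketch of that staging step: the URL and file name below are placeholders (take the real ones from the WPS geographical data download page), and the commands are printed rather than executed so you can review them first:

```shell
#!/bin/sh
# Sketch: stage an optional static dataset into WPS_GEOG. The file name
# and URL are placeholders; use the real ones from the WPS geographical
# data download page.
GEOG_URL="https://www2.mmm.ucar.edu/wrf/src/wps_files/EXAMPLE_dataset.tar.gz"  # placeholder
WPS_GEOG="$HOME/WPS_GEOG"   # adjust to your layout

# Printed rather than executed so the steps can be reviewed first.
echo "cd $WPS_GEOG"
echo "wget $GEOG_URL"
echo "tar -xzf EXAMPLE_dataset.tar.gz"   # each dataset untars into its own subdirectory
```

Each tarball unpacks into its own subdirectory under WPS_GEOG, which is the layout geogrid expects.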

Question 3: Managing Input Data

When it comes to input data, such as ERA5 data, can I create a directory on my AWS instance and download this data directly into that directory using wget or a similar method? What are the recommended steps for managing input data efficiently and ensuring it is accessible to my WRF simulations?
Yes, you can use 'wget' or similar commands to obtain your input data as well. Once you have it on your instance, the process should be no different from any other WPS process on another system.
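For ERA5 specifically, many people script the download with the Copernicus `cdsapi` package instead of wget. A minimal sketch, assuming `pip install cdsapi` and a free CDS account with an `~/.cdsapirc` key; the dataset name and variable list are typical examples, not specific to any one case:

```python
# Sketch: pull ERA5 input with the Copernicus cdsapi package.
# Dataset name and variables are illustrative; adjust for your simulation.

def era5_request(date):
    """Build a CDS request dict for ERA5 single-level fields on one date (YYYY-MM-DD)."""
    return {
        "product_type": "reanalysis",
        "variable": ["2m_temperature", "surface_pressure"],
        "year": date[:4],
        "month": date[5:7],
        "day": date[8:10],
        "time": [f"{h:02d}:00" for h in range(0, 24, 6)],  # 6-hourly
        "format": "grib",
    }

def download_era5(date, target):
    """Submit the request and write the GRIB file to the given path."""
    import cdsapi  # imported here so the sketch loads without cdsapi installed
    client = cdsapi.Client()
    client.retrieve("reanalysis-era5-single-levels", era5_request(date), target)

# e.g. download_era5("2023-04-21", "/data/era5/era5_sfc_20230421.grib")
```

Keeping input data in one dedicated directory per case makes it easy to point ungrib's link_grib.csh at it later.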

Question 4: Storage Limitations with AWS Free Tier

I'm currently using the AWS free tier, and I'm concerned about storage limitations. Do you have any suggestions for handling storage efficiently within the free-tier limits when working with large input files and geographic data? Are there AWS services or strategies I should consider to optimize storage usage?
Unfortunately I don't have a lot of experience with the AWS free tier. You may need to discuss this with an AWS support person.
 
Thank you for your response and guidance on that.

I have a couple more questions to ensure a smooth transition:

1. Transferring Entire Precompiled Environment to School Cluster

If I transfer the entire directory structure from my AWS instance, including WPS_GEOG, compiled_wps, compiled_wrf, and libs, to my school cluster, will it be ready to run without any extra steps? Since these directories contain the precompiled packages, I assume the environment should be self-contained. However, I want to confirm if there are any specific configurations or settings I should be aware of when moving the entire environment.

2. Transferring the Entire AWS Instance

Alternatively, is it possible to transfer the entire AWS instance I am currently using to my school cluster? This way, I can avoid the hassle of re-preparing the geographical data and input data. Are there any recommended steps or considerations for transferring an entire AWS instance, and are there any potential challenges or issues I should be aware of?
 
I'm currently in the process of setting up and running WRF on my school cluster after transferring the entire directory from an AWS instance. While trying to execute ./geogrid.exe, I encountered the following error:

./geogrid.exe: error while loading shared libraries: libgfortran.so.5: cannot open shared object file: No such file or directory

To provide more context, I transferred the compiled_wps, compiled_wrf, and libs directories from AWS. However, it seems that I may need to configure some environment variables correctly.

Environment Variable Configuration: I understand that certain paths need to be set for the WRF environment to function properly. Here's what I currently have in my environment setup:

setenv DIR /path_to_directory/Build_WRF/LIBRARIES
setenv CC gcc
setenv CXX g++
setenv FC gfortran
setenv FCFLAGS -m64
setenv F77 gfortran
setenv FFLAGS -m64
setenv JASPERLIB $DIR/grib2/lib
setenv JASPERINC $DIR/grib2/include
setenv LDFLAGS -L$DIR/grib2/lib
setenv CPPFLAGS -I$DIR/grib2/include

I want to confirm whether these environment variable settings are correct for my setup. I also have the LD_LIBRARY_PATH variable set, pointing at the exact "libs" directory I transferred from the instance, so I don't understand why this error occurs. Are there any additional variables or configurations I should consider to resolve the libgfortran.so.5 error and ensure that ./geogrid.exe runs successfully?
 
Hi,
Unfortunately you won't be able to simply transfer the AWS environment to your school cluster. Several paths, libraries, library versions, and other pieces of software will be different, which is why you are seeing the errors when trying to run geogrid.exe. The only way to do this would be to put the environment in a container (e.g., Docker).
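To see exactly which shared libraries fail to resolve, an `ldd` check like the sketch below can help. It defaults to /bin/ls only so it runs anywhere; point BIN at geogrid.exe on the cluster:

```shell
#!/bin/sh
# Sketch: list the shared libraries an executable needs and flag any the
# loader cannot find. Point BIN at geogrid.exe; /bin/ls is just a default
# so the sketch runs anywhere.
BIN="${1:-/bin/ls}"

if ldd "$BIN" | grep "not found"; then
  echo "missing libraries listed above"
  # A transferred libs directory can sometimes be picked up via:
  #   export LD_LIBRARY_PATH=/path/to/libs:$LD_LIBRARY_PATH
  # but binaries built against different glibc/gfortran versions usually
  # need a rebuild, or a container, instead.
else
  echo "all shared libraries resolved"
fi
```

In your case, libgfortran.so.5 being missing suggests the cluster's gfortran runtime differs from the one on the AWS image, which LD_LIBRARY_PATH alone often cannot paper over.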

2. Transferring the Entire AWS Instance

Alternatively, is it possible to transfer the entire AWS instance I am currently using to my school cluster? This way, I can avoid the hassle of re-preparing the geographical data and input data. Are there any recommended steps or considerations for transferring an entire AWS instance, and are there any potential challenges or issues I should be aware of?
I'm not sure if you can simply transfer the instance, but you could save the instance as an image, make it publicly available, and then use that image to create a new instance in your school's account. You should be able to find information on this process on the AWS site.
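In CLI terms, that save-and-share step looks roughly like the sketch below. Both IDs are placeholders, the commands are printed rather than executed so they can be reviewed first, and it shares the image with a single account, which is usually safer than making it fully public:

```shell
#!/bin/sh
# Sketch: snapshot a running instance as an AMI and grant another AWS
# account (e.g. your school's) permission to launch it. Both IDs are
# placeholders.
INSTANCE_ID="i-0123456789abcdef0"   # placeholder
SCHOOL_ACCOUNT="111122223333"       # placeholder AWS account ID

# Printed rather than executed so the commands can be reviewed first.
echo "aws ec2 create-image --instance-id $INSTANCE_ID --name wrf-env-snapshot"
echo "aws ec2 modify-image-attribute --image-id <new-ami-id> \\"
echo "  --launch-permission Add=[{UserId=$SCHOOL_ACCOUNT}]"
```

Note that this only helps if the target is another AWS account; it does not make the image runnable on a non-AWS cluster.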
 