ERROR IN REAL.EXE

andysant18

New member
Hi, my name is Andrea. I am a student and I am working with the WRF model. I have a few problems now: I have all the .exe files and all the ERA5 data I need to simulate Hurricane Otis, but I'm having trouble running real.exe. Here is the error:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ tail -n 20 log.real
starting wrf task 0 of 1
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ tail -n 20 rsl.error.0000
taskid: 0 hostname: tlaloc.atmosfera.unam.mx
module_io_quilt_old.F 2931 T
------ ERROR while reading namelist domains ------
Maybe here?: smooth_option = 0,
Maybe here?: use_fdda = .false.,
------ ERROR while reading namelist grib2 ------
Maybe here?: &grib2
Maybe here?: fg_name = 'met_em.d01.*.nc', ! Files generated by WPS
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 11700
ERRORS while reading one or more namelists from namelist.input.
-------------------------------------------
Abort(1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ tail -n 20 rsl.out.0000
taskid: 0 hostname: tlaloc.atmosfera.unam.mx
------ ERROR while reading namelist domains ------
Maybe here?: smooth_option = 0,
Maybe here?: use_fdda = .false.,
------ ERROR while reading namelist grib2 ------
Maybe here?: &grib2
Maybe here?: fg_name = 'met_em.d01.*.nc', ! Files generated by WPS
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 11700
ERRORS while reading one or more namelists from namelist.input.
-------------------------------------------
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $
And this is my namelist.input
&time_control
run_days = 2, ! Total simulation length in days
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2023, ! Start year for each domain (d01, d02, ...)
start_month = 10,
start_day = 23,
start_hour = 0,
start_minute = 0,
start_second = 0,
end_year = 2023,
end_month = 10,
end_day = 25,
end_hour = 0,
end_minute = 0,
end_second = 0,

interval_seconds = 10800, ! metgrid output every 3 hours
input_from_file = .true., ! Use the generated met_em files
history_interval = 60, ! WRF output every 60 minutes
frames_per_outfile = 1,
restart = .false.,
restart_interval = 1440,
io_form_history = 2, ! Classic NetCDF
io_form_restart = 2,
io_form_input = 2,
io_form_boundary = 2,
/

&domains
time_step = 180, ! 3 minutes for dx = 30 km
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1, ! Single domain only
e_we = 50, ! Same as geogrid
e_sn = 50,
e_vert = 41, ! Number of vertical levels (adjustable)
p_top_requested = 5000, ! Pa (50 hPa)
num_metgrid_levels = 27, ! For pressure and surface levels
dx = 30000,
dy = 30000,
grid_id = 1,
parent_id = 1,
i_parent_start = 1,
j_parent_start = 1,
parent_grid_ratio = 1,
feedback = 0,
smooth_option = 0,
use_fdda = .false.,
/

&physics
mp_physics = 3, ! WSM 3-class microphysics (Thompson would be mp_physics = 8)
ra_lw_physics = 1,
ra_sw_physics = 1,
bl_pbl_physics = 1,
sf_sfclay_physics = 2,
sf_surface_physics = 2,
cu_physics = 0, ! No cumulus scheme for dx = 30 km
cudt = 5,
/

&fdda
/

&dynamics
w_damping = 0,

diff_opt = 1,
km_opt = 4,
damp_opt = 0,
/

&bdy_control
spec_bdy_width = 5,
relax_zone = 5,
/

&grib2
fg_name = 'met_em.d01.*.nc', ! Files generated by WPS
/
I hope someone can help me; this is for my thesis.
 
Hi,
The error messages you're seeing are specific to the following two namelist entries:

use_fdda is not a namelist option. Perhaps you meant to use grid_fdda, which should be listed in the &fdda namelist record.

fg_name, which is listed in your &grib2 namelist record, is not a WRF namelist parameter. If you remove that entire line from the &grib2 section, so that it reads like the following, that error should disappear:

Code:
&grib2
/
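For reference, if grid nudging was actually intended, the grid_fdda option would go in the &fdda record instead. A minimal sketch (assuming nudging should stay off; grid_fdda = 0 just makes that explicit):

Code:
&fdda
 grid_fdda = 0,
/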
 
Hi, now I have a new error. The message is the following:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ tail -f rsl.error.0000
ims,ime,jms,jme -4 54 -4 54
ips,ipe,jps,jpe 1 49 1 49
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1 , 101511876 bytes allocated
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 409
error opening met_em.d01.2023-10-23_00:00:00.nc for input; bad date in namelist or file not in directory
-------------------------------------------
Abort(1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
and now my namelist.input is this:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ cat namelist.input
&time_control
run_days = 2,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2023,
start_month = 10,
start_day = 23,
start_hour = 0,
start_minute = 0,
start_second = 0,
end_year = 2023,
end_month = 10,
end_day = 25,
end_hour = 0,
end_minute = 0,
end_second = 0,

interval_seconds = 10800,
input_from_file = .true.,
history_interval = 60,
frames_per_outfile = 1,
restart = .false.,
restart_interval = 1440,
io_form_history = 2,
io_form_restart = 2,
io_form_input = 2,
io_form_boundary = 2,
/

&domains
time_step = 180,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1,
e_we = 49,
e_sn = 49,
e_vert = 41,
p_top_requested = 5000,
num_metgrid_levels = 27,
dx = 30000,
dy = 30000,
grid_id = 1,
parent_id = 1,
i_parent_start = 1,
j_parent_start = 1,
parent_grid_ratio = 1,
feedback = 0,
smooth_option = 0,
/

&physics
mp_physics = 3,
ra_lw_physics = 1,
ra_sw_physics = 1,
bl_pbl_physics = 1,
sf_sfclay_physics = 2,
sf_surface_physics = 2,
cu_physics = 0,
cudt = 5,
/

&dynamics
w_damping = 0,
diff_opt = 1,
km_opt = 4,
damp_opt = 0,
/

&bdy_control
spec_bdy_width = 5,
relax_zone = 5,
/

&grib2
/

(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $
The truth is that I don't know what is happening now. I await your answer!
 
Did you link the met_em* files from WPS to your WRF running directory? If not, you will need to make sure those met_em* files are either linked or copied to your wrf running directory.
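
For example, assuming the met_em files were produced under ~/Build_WRF/WPS-4.5 and the run directory is ~/Build_WRF/WRF-4.5/WRF/run (adjust the paths to your setup), the linking step would look roughly like this:

Code:
cd ~/Build_WRF/WRF-4.5/WRF/run
ln -sf ~/Build_WRF/WPS-4.5/met_em.d01.* .
# -L dereferences symlinks, so dangling links will show up as errors
ls -lL met_em.d01.*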
 
Yes... I'm not sure. Can you help me? Here are my met_em files:
(base) a.7116@tlaloc ~/Build_WRF/WPS-4.5 $ ls -lh met_em.d01.*
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_00:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_03:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_06:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_09:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_12:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 796K ene 27 17:27 met_em.d01.2023-10-23_15:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_18:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-23_21:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_00:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_03:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_06:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_09:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_12:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 796K ene 27 17:27 met_em.d01.2023-10-24_15:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 796K ene 27 17:27 met_em.d01.2023-10-24_18:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-24_21:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:27 met_em.d01.2023-10-25_00:00:00.nc
(base) a.7116@tlaloc ~/Build_WRF/WPS-4.5 $
And I was running the WRF model in Build_WRF/WRF-4.5/main. Here are my .exe executables; you can see them here:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/main $ ls
CMakeLists.txt log.real module_wrf_top.f90 ndown.exe* rsl.error.0000 wrf_ESMFMod.F
convert_em.F Makefile module_wrf_top.mod real_em.F rsl.out.0000 wrf.exe*
depend.common make_real_direct.log module_wrf_top.o real_em.f90 tc_em.F wrf.F
ideal_em.F make_real_only.log ndown_em.F real_em.o tc_em.f90 wrf.f90
ideal_nmm.F module_initialize_real.mod ndown_em.f90 real.exe* tc_em.o wrf.o
libwrflib.a module_wrf_top.F ndown_em.o real_nmm.F tc.exe* wrf_SST_ESMF.F
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/main $
All my geogrid, ungrib, and metgrid steps ran fine, but all my problems are with real.exe. I hope you can help me, and I await your answer.
 
I checked it now, and all the met_em files are here:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ ls
backup_metem/ create_p3_lookupTable_2.f90-v5.3 met_em.d01.2023-10-25_00:00:00.nc README.tslist
aerosol.formatted eclipse_besselian_elements.dat met_em.d02.2023-10-21_00:00:00.nc real_debug2.log
aerosol_lat.formatted ETAMPNEW_DATA met_em.d02.2023-10-21_06:00:00.nc real_debug.log
aerosol_lon.formatted ETAMPNEW_DATA_DBL met_em.d02.2023-10-21_12:00:00.nc real_debug_new.log
aerosol_plev.formatted ETAMPNEW_DATA.expanded_rain* met_em.d02.2023-10-21_18:00:00.nc real_detailed.log
BROADBAND_CLOUD_GODDARD.bin ETAMPNEW_DATA.expanded_rain_DBL met_em.d02.2023-10-22_00:00:00.nc real.exe@
bulkdens.asc_s_0_03_0_9 fix_metem.sh* met_em.d02.2023-10-22_06:00:00.nc real_execution.log
bulkradii.asc_s_0_03_0_9 GENPARM.TBL met_em.d02.2023-10-22_12:00:00.nc real_final.log
CAM_ABS_DATA geo_em.d01.nc met_em.d02.2023-10-22_18:00:00.nc real_no_mpi.log
CAM_AEROPT_DATA grib2map.tbl met_em.d02.2023-10-23_00:00:00.nc RRTM_DATA
CAMtr_volume_mixing_ratio@ gribmap.txt met_em.d02.2023-10-23_06:00:00.nc RRTM_DATA_DBL
CAMtr_volume_mixing_ratio.A1B HLC.TBL met_em.d02.2023-10-23_12:00:00.nc RRTMG_LW_DATA
CAMtr_volume_mixing_ratio.A2 input met_em.d02.2023-10-23_18:00:00.nc RRTMG_LW_DATA_DBL
CAMtr_volume_mixing_ratio.RCP4.5 ishmael-gamma-tab.bin met_em.d02.2023-10-24_00:00:00.nc RRTMG_SW_DATA
CAMtr_volume_mixing_ratio.RCP6 ishmael-qi-qc.bin met_em.d02.2023-10-24_06:00:00.nc RRTMG_SW_DATA_DBL
CAMtr_volume_mixing_ratio.RCP8.5 ishmael-qi-qr.bin met_em.d02.2023-10-24_12:00:00.nc rsl.error.0000
CAMtr_volume_mixing_ratio.SSP119 kernels.asc_s_0_03_0_9 met_em.d02.2023-10-24_18:00:00.nc rsl.out.0000
CAMtr_volume_mixing_ratio.SSP126 kernels_z.asc met_em.d02.2023-10-25_00:00:00.nc SOILPARM.TBL
CAMtr_volume_mixing_ratio.SSP245 LANDUSE.TBL met_em.d02.2023-10-25_06:00:00.nc SOILPARM.TBL_Kishne_2017
CAMtr_volume_mixing_ratio.SSP370 log.real met_em_dump.txt STOCHPERT.TBL
CAMtr_volume_mixing_ratio.SSP585 masses.asc MPTABLE.TBL@ tc.exe@
capacity.asc met_em.d01.2023-10-23_00:00:00.nc namelist termvels.asc
CCN_ACTIVATE.BIN met_em.d01.2023-10-23_03:00:00.nc namelist.input tmp.nc.pid3534950.ncap2.tmp
CLM_ALB_ICE_DFS_DATA met_em.d01.2023-10-23_06:00:00.nc namelist.input.backup tmp.nc.pid3534976.ncap2.tmp
CLM_ALB_ICE_DRC_DATA met_em.d01.2023-10-23_09:00:00.nc namelist.input.simple tr49t67
CLM_ASM_ICE_DFS_DATA met_em.d01.2023-10-23_12:00:00.nc namelist.input.test tr49t85
CLM_ASM_ICE_DRC_DATA met_em.d01.2023-10-23_15:00:00.nc namelist.output tr67t85
CLM_DRDSDT0_DATA met_em.d01.2023-10-23_18:00:00.nc ndown.exe@ URBPARM_LCZ.TBL
CLM_EXT_ICE_DFS_DATA met_em.d01.2023-10-23_21:00:00.nc ozone.formatted URBPARM.TBL
CLM_EXT_ICE_DRC_DATA met_em.d01.2023-10-24_00:00:00.nc ozone_lat.formatted URBPARM_UZE.TBL
CLM_KAPPA_DATA met_em.d01.2023-10-24_03:00:00.nc ozone_plev.formatted VEGPARM.TBL
CLM_TAU_DATA met_em.d01.2023-10-24_06:00:00.nc p3_lookupTable_1.dat-v5.4_2momI wind-turbine-1.tbl
co2_trans* met_em.d01.2023-10-24_09:00:00.nc p3_lookupTable_1.dat-v5.4_3momI wrf.exe@
coeff_p.asc met_em.d01.2023-10-24_12:00:00.nc p3_lookupTable_2.dat-v5.3 wrf_run_files.zip
coeff_q.asc met_em.d01.2023-10-24_15:00:00.nc README.namelist
constants.asc met_em.d01.2023-10-24_18:00:00.nc README.physics_files
create_p3_lookupTable_1.f90-v5.4 met_em.d01.2023-10-24_21:00:00.nc README.rasm_diag
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $
 
This is the new error:
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ # 1. Show the COMPLETE contents of your namelist.input
echo "=== NAMELIST.INPUT ==="
cat namelist.input

# 2. Show the COMPLETE error logs
echo -e "\n=== ERROR LOG (rsl.error.0000) ==="
if [ -f rsl.error.0000 ]; then
cat rsl.error.0000
else
echo "rsl.error.0000 does not exist"
fi

# 3. Show the output logs
echo -e "\n=== OUTPUT LOG (rsl.out.0000) ==="
if [ -f rsl.out.0000 ]; then
cat rsl.out.0000
fi

# 4. System and file information
echo -e "\n=== SYSTEM INFO ==="
echo "Current directory: $(pwd)"
echo "User: $(whoami)"
module list 2>/dev/null || echo "No modules loaded or module command not available"

# 5. met_em file information
echo -e "\n=== MET_EM FILES ==="
ls -lh met_em.d01.*.nc | head -5
# 6. Compilation information
echo -e "\n=== COMPILATION INFO ==="
echo "Last lines of configure.wrf:"
tail -20 ../../configure.wrf
=== NAMELIST.INPUT ===
&time_control
run_days = 2,
run_hours = 0,
run_minutes = 0,
run_seconds = 0,
start_year = 2023,
start_month = 10,
start_day = 23,
start_hour = 0,
start_minute = 0,
start_second = 0,
end_year = 2023,
end_month = 10,
end_day = 25,
end_hour = 0,
end_minute = 0,
end_second = 0,

interval_seconds = 10800,
input_from_file = .true.,
history_interval = 60,
frames_per_outfile = 1,
restart = .false.,
restart_interval = 1440,
io_form_history = 2,
io_form_restart = 2,
io_form_input = 2,
io_form_boundary = 2,
/

&domains
time_step = 180,
time_step_fract_num = 0,
time_step_fract_den = 1,
max_dom = 1,
e_we = 49,
e_sn = 49,
e_vert = 41,
p_top_requested = 5000,
num_metgrid_levels = 27,
dx = 30000,
dy = 30000,
grid_id = 1,
parent_id = 1,
i_parent_start = 1,
j_parent_start = 1,
parent_grid_ratio = 1,
feedback = 0,
smooth_option = 0,
/

&physics
mp_physics = 3,
ra_lw_physics = 1,
ra_sw_physics = 1,
bl_pbl_physics = 1,
sf_sfclay_physics = 2,
sf_surface_physics = 2,
cu_physics = 0,
cudt = 5,
/

&fdda
/

&dynamics
w_damping = 0,
diff_opt = 1,
km_opt = 4,
damp_opt = 0,
/

&bdy_control
spec_bdy_width = 5,
relax_zone = 5,
/



=== ERROR LOG (rsl.error.0000) ===
taskid: 0 hostname: tlaloc.atmosfera.unam.mx
module_io_quilt_old.F 2931 T
Ntasks in X 1 , ntasks in Y 1
Domain # 1: dx = 30000.000 m
REAL_EM V4.7.1 PREPROCESSOR
git commit f52c197ed39d12e087d02c50f412d90d418f6186
*************************************
Parent domain
ids,ide,jds,jde 1 49 1 49
ims,ime,jms,jme -4 54 -4 54
ips,ipe,jps,jpe 1 49 1 49
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1 , 101511876 bytes allocated
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 409
error opening met_em.d01.2023-10-23_00:00:00.nc for input; bad date in namelist or file not in directory
-------------------------------------------
Abort(1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0

=== OUTPUT LOG (rsl.out.0000) ===
taskid: 0 hostname: tlaloc.atmosfera.unam.mx
Ntasks in X 1 , ntasks in Y 1
Domain # 1: dx = 30000.000 m
REAL_EM V4.7.1 PREPROCESSOR
git commit f52c197ed39d12e087d02c50f412d90d418f6186
*************************************
Parent domain
ids,ide,jds,jde 1 49 1 49
ims,ime,jms,jme -4 54 -4 54
ips,ipe,jps,jpe 1 49 1 49
*************************************
DYNAMICS OPTION: Eulerian Mass Coordinate
alloc_space_field: domain 1 , 101511876 bytes allocated
Time period # 1 to process = 2023-10-23_00:00:00.
Time period # 2 to process = 2023-10-23_03:00:00.
Time period # 3 to process = 2023-10-23_06:00:00.
Time period # 4 to process = 2023-10-23_09:00:00.
Time period # 5 to process = 2023-10-23_12:00:00.
Time period # 6 to process = 2023-10-23_15:00:00.
Time period # 7 to process = 2023-10-23_18:00:00.
Time period # 8 to process = 2023-10-23_21:00:00.
Time period # 9 to process = 2023-10-24_00:00:00.
Time period # 10 to process = 2023-10-24_03:00:00.
Time period # 11 to process = 2023-10-24_06:00:00.
Time period # 12 to process = 2023-10-24_09:00:00.
Time period # 13 to process = 2023-10-24_12:00:00.
Time period # 14 to process = 2023-10-24_15:00:00.
Time period # 15 to process = 2023-10-24_18:00:00.
Time period # 16 to process = 2023-10-24_21:00:00.
Time period # 17 to process = 2023-10-25_00:00:00.
Total analysis times to input = 17.

-----------------------------------------------------------------------------

Domain 1: Current date being processed: 2023-10-23_00:00:00.0000, which is loop # 1 out of 17
configflags%julyr, %julday, %gmt: 2023 296 0.0000000E+00
-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 409
error opening met_em.d01.2023-10-23_00:00:00.nc for input; bad date in namelist or file not in directory
-------------------------------------------

=== SYSTEM INFO ===
Current directory: /home/cca/i.5003/alumnos/a.7116/Build_WRF/WRF-4.5/WRF/run
User: a.7116

=== MET_EM FILES ===
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:34 met_em.d01.2023-10-23_00:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:34 met_em.d01.2023-10-23_03:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:34 met_em.d01.2023-10-23_06:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:34 met_em.d01.2023-10-23_09:00:00.nc
-rw-rw-r--+ 1 a.7116 i.5003 797K ene 27 17:34 met_em.d01.2023-10-23_12:00:00.nc
Total: 17 files

=== COMPILATION INFO ===
Last lines of configure.wrf:
wrf_restartin.o \
wrf_restartout.o \
module_state_description.o \
module_comm_dm.o \
module_comm_dm_0.o \
module_comm_dm_1.o \
module_comm_dm_2.o \
module_comm_dm_3.o \
module_comm_nesting_dm.o \
module_configure.o :
$(RM) $@
$(CPP) -I$(WRF_SRC_ROOT_DIR)/inc $(CPPFLAGS) $(OMPCPP) $*.F > $*.bb
$(SED_FTN) $*.bb | $(CPP) $(TRADFLAG) > $*.f90
@ if echo $(ARCHFLAGS) | $(FGREP) 'DVAR4D'; then \
echo COMPILING $*.F for 4DVAR ; \
$(WRF_SRC_ROOT_DIR)/var/build/da_name_space.pl $*.f90 > $*.f90.tmp ; \
mv $*.f90.tmp $*.f90 ; \
fi
$(RM) $*.b $*.bb
$(FC) -c $(PROMOTION) $(FCSUFFIX) $(FCNOOPT) $(FCBASEOPTS) $(MODULE_DIRS) $(OMP) $*.f90
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $
 
Just a quick question.

Did you build the libraries inside or outside the conda environment you're in?

I noticed the terminal output you shared has (base). That is normally related to a conda environment, and if you built the libraries outside of conda, that might be a source of issues.

This is aside from the previously mentioned namelist issues.
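
One way to sanity-check this (a sketch, assuming real.exe is a dynamically linked executable) is to compare the libraries the binary loads at run time against what conda provides:

Code:
cd ~/Build_WRF/WRF-4.5/WRF/run
# libraries the executable will actually load
ldd real.exe | grep -i -E "netcdf|hdf5"
# paths reported by the conda NetCDF config tools
nc-config --libs
nf-config --flibs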
 
Hi, nice to meet you! I checked it just now and I saw that I only have these libraries in my environment. Please, if you can check it, I would be thankful, because now I have a lot of problems with the code. I talked with the cluster support, and they will check this with me this afternoon! Thank you and have a nice day!
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ conda list | egrep "netcdf|hdf5|jasper|libpng|zlib|szip"
hdf5 1.14.6 nompi_h6e4c0c1_103 conda-forge
libnetcdf 4.9.3 nompi_h81b047f_102 conda-forge
libzlib 1.3.1 hb25bd0a_0
netcdf-fortran 4.6.2 nompi_h90de81b_102 conda-forge
(base) a.7116@tlaloc ~/Build_WRF/WRF-4.5/WRF/run $ which nc-config
which nf-config
nc-config --prefix
nf-config --prefix
/home/cca/i.5003/alumnos/a.7116/miniconda3/bin/nc-config
/home/cca/i.5003/alumnos/a.7116/miniconda3/bin/nf-config
/home/cca/i.5003/alumnos/a.7116/miniconda3
/home/cca/i.5003/alumnos/a.7116/miniconda3
 
@andysant18
Are you able to use any tools like 'ncview' to open/view the met_em.d01.2023-10-23_00:00:00.nc file?

From your WRF/run directory, will you issue the following, and then attach the met.txt file? Thanks!

Code:
ls -ls met_em* >& met.txt
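
If ncview is not available, a header dump with ncdump (shipped with the NetCDF utilities) is another quick check; a truncated or corrupt file will make it fail immediately:

Code:
# print just the header of the first met_em file
ncdump -h met_em.d01.2023-10-23_00:00:00.nc | head -20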
 