This page documents the steps to run WRF and WRF-Chem on the Harvard cluster (Odyssey).
We invite Archana Dayalu, Packard Chan, Lee Miller, and Jiahua Guo to co-edit this page.
Glossary
...
- WRF: Weather Research and Forecasting Model
- ARW / EM: Advanced Research WRF. As opposed to NMM.
- NMM: Nonhydrostatic Mesoscale Model
- WPS: WRF Preprocessing System
- DA / VAR: Variational Data Assimilation
- KPP: Kinetic Pre-Processor (related to chemistry)
- chem: chemistry add-on (WRF-Chem)
- real: real-data case initialization (real.exe)
- ideal: idealized case initialization (ideal.exe)
- grib: Gridded Binary
- grid
Super quick start (Harvard cluster users only, real case, no compilation)
One-time setup
Register at http://www2.mmm.ucar.edu/wrf/users/download/get_source.html
ssh datamover01
# for faster copying
Choose a wrf-root directory: it is recommended to be on an lfs disk, but not scratchlfs. If available, LAB storage on holylfs or kuanglfs is a good choice. Below uses /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/.
rsync -a /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/410-WPS /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/ --exclude='.git*' # 15s on datamover01
rsync -a /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/412-WRF /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/ --exclude='.git*' # 2m22s on datamover01
Run WPS
Exit the datamover01 node. You can use a login node for this default case.
cd /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/410-WPS/
source source_wrf # 25s on holylogin03
vim namelist.wps # no edits
./geogrid.exe # 8s on holylogin03, v410, Jan00
ln -sf ungrib/Variable_Tables/Vtable.GFS-PRMSL Vtable # for ungrib
./link_grib.csh /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/data-tutorial-case/JAN00/fnl_2000012* # for ungrib
./ungrib.exe # 2s on holylogin03, v410, Jan00
./metgrid.exe # 1s on holylogin03, v410, Jan00
Run WRF
cd /n/holylfs/LABS/`id -gn`/$USER/new-wrf-root/412-WRF/
cp -ai run 00run # 5s on holylogin03
cd 00run/
# make sure you have sourced source_wrf
ln -s ../../410-WPS/met_em.d01.2000-01-2* ./
vim namelist.input # no edits
./real.exe # 3s on holylogin03
tail rsl.error.0000 # expect to see "SUCCESS COMPLETE REAL_EM INIT"
vim job_wrf.sbatch # no required edits
sbatch job_wrf.sbatch # 2m36s with 4 huce_intel cpus
tail rsl.error.0000* # expect to see "SUCCESS COMPLETE WRF"
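To confirm that the run produced output, an illustrative check (file names depend on the case dates):
ls -l wrfout_d01_*              # model output files written by wrf.exe
module load ncview/2.1.7-fasrc01   # if not already loaded by source_wrf
ncview wrfout_d01_* &           # quick visual sanity check (requires X11 forwarding)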
REAL cases
Choose versions of WRF & WPS:
Latest releases on official GitHub:
WRF v4.1.2 (precompiled without chem) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/412-WRF
WPS v4.1 (precompiled) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/wrf/410-WPS
WRF v3.6.1 (precompiled with chem) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRFV3
WPS v3.6.1 (precompiled) is downloaded in /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WPS
WRF v3.0beta with RAVE (pls recompile) is in ~kuang/Model/WRFV3
WPS v3.2.1 (pls recompile) is in ~dingma/Model2/WPSv3
Read the user's guide:
WRF (ARW) User's Guides: v3, v4
WRF-Chem: https://ruc.noaa.gov/wrf/wrf-chem/Users_guide.pdf #This is for a different WRF-Chem version (3.9), but it's still a relevant guide.
https://ruc.noaa.gov/wrf/wrf-chem/Emission_guide.pdf #This is a separate supplementary WRF-Chem guide to chemical input data processing.
https://ruc.noaa.gov/wrf/wrf-chem/wrf_tutorial_nepal/talks/Setup.pdf #Some helpful WRF-Chem slides from NOAA
PART I. Setting up/Configuration/Compilation.
Instructions in this part assume you want to compile and run your own version of WRF. However, note that a compiled usable version of WRF/WRF-Chem v3.6.1 including all external utilities and supplementary geography datasets that you can copy to your preferred run directory is already located at:
/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1
This folder (hereafter $CLIMATE_MODELS) contains the WRF-ARW model, the WRF Pre-processing system (WPS; used for real test cases), the chemistry module add-on, the complete WRF geography dataset (for use with WPS and WRF-Chem), and other utilities needed for WRF-Chem. Note: WPS, WRF-Chem not relevant for idealized cases.
Note 2: Copy the WRF_CHEM_3-6-1 folder to the location you are going to run from, with the exception of the geography data set, which is very large; instead, soft-link to the geography data set in the $CLIMATE_MODELS folder (see the sketch below).
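A minimal sketch of this copy-and-link step; the destination path and the geography directory name below are placeholders, so check the actual directory names inside $CLIMATE_MODELS first:
# copy everything except the (very large) geography data set
rsync -a --exclude='<geog_dirname>' /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/ /path/to/your/run/dir/WRF_CHEM_3-6-1/
# soft-link the shared geography data set instead of copying it
ln -s /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/<geog_dirname> /path/to/your/run/dir/WRF_CHEM_3-6-1/<geog_dirname>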
- If you want to use another version, you will first need to register as a user to download the WRF source code:
WRF: http://www2.mmm.ucar.edu/wrf/users/download/get_source.html
- Once you've registered, you will be able to navigate to the downloads section. Select the WRF components relevant to you. For example:
http://www2.mmm.ucar.edu/wrf/src/WPSV3.6.1.TAR.gz #WPS 3.6.1
http://www2.mmm.ucar.edu/wrf/src/WRFV3.6.1.TAR.gz #WRF 3.6.1
http://www2.mmm.ucar.edu/wrf/src/WRFV3-Chem-3.6.1.TAR.gz #Chem module add-on for WRF 3.6.1
- If you're going to be using WRF meteorology output to drive the STILT LPDM (http://stilt-model.org/index.php/Main/HomePage), you will need to make some changes to the Registry file before compiling. Some basic instructions for this are located in the $CLIMATE_MODELS folder as a README_WRF-STILT_modifications.txt file.
- Once you've downloaded the necessary tar.gz files, confirm your bashrc file looks something like:
# (1) Load required modules (here we use Intel and Intel MPI)
module load intel/17.0.4-fasrc01
module load impi/2017.2.174-fasrc01
module load netcdf/4.1.3-fasrc02
module load libpng/1.6.25-fasrc01
module load jasper/1.900.1-fasrc02
module load intel/17.0.4-fasrc01 impi/2017.2.174-fasrc01 ncview/2.1.7-fasrc01
module load ncl_ncarg/6.4.0-fasrc01
# (2) Define required environment variables
export NETCDF=${NETCDF_FORTRAN_HOME:-${NETCDF_HOME}}
export JASPERLIB=${JASPER_LIB}
export JASPERINC=${JASPER_INCLUDE}
export WRFIO_NCD_LARGE_FILE_SUPPORT=1
export HDF5=${HDF5_HOME}
unset MPI_LIB # unset for WPS, which is used for real WRF simulations
### ...... For WRF-Chem: ...... ###
export WRF_EM_CORE=1
export WRF_NMM_CORE=0
export WRF_CHEM=1
- Now, configure and compile. WRF must be compiled before WPS.
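Before running configure, it can help to sanity-check that the environment variables resolve to real paths (illustrative):
source ~/.bashrc
echo $NETCDF $JASPERLIB $JASPERINC
ls $NETCDF/include | grep -i netcdf   # expect netCDF (Fortran) headers here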
...
./compile &> compile.output
...
#(1)-(8) are adapted from p.6 of Plamen's ... for v3.9.1 (6/8/2018)
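As a minimal sketch of the usual sequence (the numbered steps above have the details; the configure option numbers vary by machine, so treat the comments as placeholders):
# in the WRF (WRFV3) source directory
./configure                            # pick an Intel + dmpar option when prompted
./compile em_real &> compile.output    # for real-data cases; check that main/wrf.exe and main/real.exe exist
# then in the WPS directory (WRF must already be compiled)
./configure                            # again pick an Intel option
./compile &> compile_wps.output        # check that geogrid.exe, ungrib.exe, metgrid.exe exist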
- Now that the main WRF and WPS programs are compiled, it's time to think about utilities specific to WRF-Chem. If you are planning on running WRF-Chem, you will additionally need to compile some utilities to make sure your chemical data are in the right format. Specifically, you will likely need to compile convert_emiss.exe, prep-chem-src, and mozbc. These are just examples; you may find that you need additional ones.
...
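For convert_emiss.exe specifically, the usual approach in WRF-Chem 3.x is to build it from the already-compiled WRFV3 directory; a minimal sketch:
cd WRFV3
./compile emi_conv &> compile_convert_emiss.output   # builds convert_emiss.exe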
#(10) Compile PREP-CHEM-SOURCES (available HERE), a mapping utility for converting raw anthropogenic chemical fields to binary intermediates that are then fed into convert_emiss.exe. Unzip the tar.gz file into your main WRF folder. There are some typos and missing details in the PDF guide above, so a modified version of the instructions (and Paul Edmon's help rebuilding HDF5 to fix an error message) enabled successful compilation of the utility. The modified instructions are located here:
...
#(12) The MOZBC utility for mapping chemical boundary conditions to your WRF-Chem domain has already been compiled and saved in the $CLIMATE_MODELS WRF-Chem folder, following the instructions in the README_mozbc file. You can use that or, if you want, download and compile MOZBC on your own. The initial files are based on MOZART-4/GEOS-5 output (6 hr, 1.9x2.5 deg, 56-level) datasets (https://www.acom.ucar.edu/wrf-chem/mozart.shtml). Read the README file for compilation instructions if you're doing this on your own; on the Harvard cluster (Odyssey) you might have to run export NETCDF_DIR=$NETCDF_HOME before compilation (the same applies to the MEGAN instructions in #13 below). Otherwise you can use the compiled version located at:
...
/n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/anthro_data
PART II. Running WPS and WRF: Overview/General Steps
...
Now that you have a compiled version of WRF and WPS, you are ready to set up your model runs. Since these are for real cases, you need access to initialization data sets. You will need to figure out the initialization data set best suited to your domain and purposes. For the examples for China provided here, we are using GRIB2 NCEP FNL Operational Model Global Tropospheric Analyses, continuing from July 1999 (http://dx.doi.org/10.5065/D6M043C6). For these examples, the files are already downloaded. Instructions to link to them are noted where necessary.
Regardless of whether you are running WRF or WRF-Chem, it is important that you do the following first and in this order (detailed instructions follow, including in the examples in Part III and IV):
(1) Run WPS (geogrid.exe, ungrib.exe, metgrid.exe) to create real-data-based initialization files of the form met_em.d0X.YYYY-MM-DD_hh:mm:ss.nc
(2) Run real.exe to generate input and boundary files of the form wrfinput_d0*, wrfbdy_d01 (and optionally wrffdda_d0*) to initialize the WRF model
If you are running WRF without chemistry, you can go ahead and run the main WRF model at this point. If you are running WRF-Chem, this is the point at which you run your chemistry data prep programs (i.e., prep-chem-src, anthro_emis, and/or convert_emiss), which require the wrfinput_d0* files to actually work. Once you have your correctly formatted chemical data (of the form wrfchemi_00z_d01 and wrfchemi_12z_d01) stored or linked in the WRFV3/test/em_real folder, you can run wrf.exe. The order of operations is summarized in the sketch below.
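Condensed into a shell-style outline (run from the WPS and WRF run directories respectively; paths and MPI launch details omitted):
./geogrid.exe && ./ungrib.exe && ./metgrid.exe   # WPS: produces met_em.d0X.*.nc
./real.exe                                       # produces wrfinput_d0*, wrfbdy_d01
# (WRF-Chem only) run anthro_emis / prep-chem-src / convert_emiss, MEGAN, MOZBC here,
# then re-run real.exe with chemistry turned on in namelist.input
./wrf.exe                                        # the actual model run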
...
Step 1: Running WPS (WRF Pre-processing System)
This is the pre-processing step that creates real-data-based initialization files. Read the README file in the parent WPS folder for a quick guide on what the programs do.
...
cp util/plotgrids_new.ncl . ; ncl plotgrids_new.ncl
3. Run geogrid.
cd geogrid; ln -s GEOGRID.TBL.ARW_CHEM GEOGRID.TBL; cd ..   # link the chem geogrid table, then return to the WPS top directory
mpirun -np 1 ./geogrid.exe
...
3. You should see met_em.d01(or d02 or d03....).YYYY-MM-DD_hh:mm:ss.nc files created. These are your WPS final output files that real.exe ingests.
Step 2. Running WRF
Navigate to your WRF/WRFV3/run folder
Step 2b. PROGRAM REAL: real data initialization program to set up the model domain and initialize WRF
...
wrfout_d0#_YYYY-MM-DD_hh:mm:ss
...
Detailed examples
WRF-Chem: A PM2.5 example
The purpose of this example is to take the general steps listed above and actually run a three nested domain WRF-Chem PM2.5 simulation over Beijing during the January 2013 severe haze event and compare with observations. We are going to run WRF-Chem for a total of 10 days from Jan 6 2013 00:00UTC to Jan 16 2013 00:00UTC. We establish 5 days for model spin-up such that the usable simulation time period is 5 days. Make sure you have a local copy of the /n/holylfs/INTERNAL_REPOS/CLIMATE_MODELS/WRF_CHEM_3-6-1/WRF directory and contents. You don't need to copy the geography data set.
...
- Get in the habit of using ncview for sanity-check visuals
- Use the dust map from the WPS high-res geographical data set (namelist.input dust_opt=3), although this is less important for this winter exercise.
- Use prep-chem-sources and convert_emiss to format a gridded anthropogenic PM2.5 emissions file to the WRF forecast domain
- Use gridded chemical data for biogenic emissions and chemical boundary conditions
- Run WRF-Chem with PM2.5-radiation feedbacks to simulate PM2.5 concentrations in Beijing in January 2013.
- Extract relevant components from wrfout netcdf files and examine/visualize using NCL.
...
Step 0.1.
At any point where you want to check the contents of a netcdf file as a sanity check, use ncview! This is an excellent habit to develop. Just type
...
and navigate through the variables and panes and make sure things look realistic.
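For example (the file name here is just illustrative):
ncview wrfout_d01_2013-01-06_00:00:00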
Step 0.2
For WRF-Chem, it's good practice to create a folder that you link your intermediate WRF files to, for use with the various external utilities. This will become clearer later, but for now make sure you have a directory named "UTILINP" in your $CLIMATE_MODELS location (i.e., in your local copy).
Step 1. Run WPS
- Navigate to the WRF/WPS folder.
- First customize the WPS namelist.wps file such that it looks like the following. Note that with the exception of the geog_data_path, all path variables should point to a writable dump directory of your choice. It is recommended you write to regal or a similar scratch space you have access to; the dump files take up space and aren't needed again after the run so they don't have to be stored in a permanent location. Consult the WRF user guide for detailed explanation of the namelist variables.
...
srun -n 1 --mem=10000 --pty --x11=first -p test -t 200 bash
cd geogrid; ln -s GEOGRID.TBL.ARW_CHEM GEOGRID.TBL; cd ..   # link the chem geogrid table, then return to the WPS top directory
mpirun -np 1 ./geogrid.exe
...
- Now get ready to run ungrib. Since we will be using NCEP fnl data (GRIB2 format) for this run, we do some sleuthing to confirm the correct Vtable to use. Following the discussion here we download the appropriate Vtable separately (this one isn't included with the standard set of Vtables in WRFv3.6.1 because they recently updated the dataset). Note: I have already downloaded it and saved it: ungrib/Variable_Tables/Vtable.GFS_new
...
mpirun -np 1 ./metgrid.exe
Step 2. Run real.exe to get necessary intermediate files for WRF-Chem utilities (chem_opt off).
- Navigate to WRFV3/test/em_real
- Copy namelist.input_pm25Beijing to namelist.input. Take a look at the namelist and confirm the dates and domains match your WPS namelist.
- Make sure to turn off chemistry (chem_opt = 0) for this step.
- Change the filepath in history_outname to reflect where you want your wrfchem output files to be saved.
- Link the metgrid files to the test/em_real directory and make sure the namelist num_metgrid_levels and num_metgrid_soil_levels match what's in the metgrid files.
...
- Create soft links of all the met_em.<domain>, wrfinput_<domain>, and wrfbdy_d01 files in your UTILINP folder. These will be used by the external utilities in the next steps (see the sketch below).
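A sketch of those two checks/links, where $WRF_ROOT is an illustrative stand-in for the top of your local copy, the met_em path should match wherever metgrid wrote its output, and the met_em file name is illustrative:
# confirm num_metgrid_levels / num_metgrid_soil_levels against the metgrid files
ncdump -h met_em.d01.2013-01-06_00:00:00.nc | grep -i metgrid
# link intermediate files into UTILINP for the external utilities
mkdir -p $WRF_ROOT/UTILINP
cd $WRF_ROOT/UTILINP
ln -sf $WRF_ROOT/WPS/met_em.d0* .
ln -sf $WRF_ROOT/WRFV3/test/em_real/wrfinput_d0* .
ln -sf $WRF_ROOT/WRFV3/test/em_real/wrfbdy_d01 .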
Step 3. Generate your bio emissions using MEGAN
- You will need access to the relevant MEGAN initial files, from https://www.acom.ucar.edu/wrf-chem/download.shtml
- Make sure you are still using an interactive shell ('srun -n 1 --mem=10000 --pty --x11=first -p test -t 200 bash' should be sufficient). Navigate to the MEGAN directory in the $CLIMATE_MODELS directory (or wherever you have saved your local copy of everything)
- Create a new text file called megan_bio_emiss.inp. This is your MEGAN namelist file. Note that, as the README instructs, the leaf area index (LAI) months must include the simulation month and the previous month, so for January (as in our example here) we have to span all twelve months. Following the instructions in the README, you should populate it to look like the following, with the paths obviously reflecting where your WRF + external utility directories are located.
&control
domains = 3,
start_lai_mnth = 1,
end_lai_mnth = 12,
wrf_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP',
megan_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/MEGAN'
/
- You should be ready to run the MEGAN bio emission utility
./megan_bio_emiss < megan_bio_emiss.inp > megan_bio_emiss.out
- If the program ran correctly, you should get no messages and you should see three files in your MEGAN folder. Check them out with ncview.
wrfbiochemi_d01
wrfbiochemi_d02
wrfbiochemi_d03
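For example:
ncview wrfbiochemi_d01   # repeat for d02 and d03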
Step 4. OPTION 1 (USE THIS FOR NOW): Prep your anthropogenic emissions using anthro_emis
- Now that you have your bio emissions, it's time to get your anthropogenic emissions in the right format. We're going to use the EDGAR-HTAP PM2.5 emissions on a 0.1x0.1 deg annual global grid.
- This option has been successfully tested with EDGAR-HTAP emissions and with IPCC emissions (for use with CAM-Chem), and it seems a little less complicated than prep-chem-sources. In these examples, we will go this route. If you are starting with other emissions, you probably need to go the prep-chem route. I haven't spent much time figuring out how to use it; the anthro_emis option seemed most straightforward for now because I already had some familiarity with utilities in the same family (MEGAN, MOZBC).
- Navigate to your ANTHRO folder
cd src
- Link the EDGAR_HTAP_emi_PM2.5_2010.0.1x0.1.nc file from the anthro_data/MOZCART folder to this folder
ln -sf yourpath_toWRF/anthro_data/MOZCART/EDGAR_HTAP_emi_PM2.5_2010.0.1x0.1.nc .
- Create a new text file called anthro_emis.inp. This will be your namelist file. Check out the README file for detailed instructions. Your namelist file should look like the following; it is constructed as a hybrid of the instructions in the README file and the input file located in the anthro_data/EDGAR-HTAP/INP-Examples folder. Make sure you understand the purpose of the entries.
&CONTROL
anthro_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/ANTHRO/src'
domains = 3
wrf_dir = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP'
src_file_prefix = 'EDGAR_HTAP_emi_'
src_file_suffix = '_2010.0.1x0.1.nc'
src_names = 'PM2.5(1)'
sub_categories = 'emis_tot'
serial_output = .false.
start_output_time = '2010-12-01_00:00:00'
emissions_zdim_stag = 1
emis_map = 'PM_25(a)->PM2.5',
/
- Note that src_file_prefix and src_file_suffix are concatenated with whatever is specified in src_names (here, PM2.5) to generate the full filename string (here, EDGAR_HTAP_emi_PM2.5_2010.0.1x0.1.nc).
- Now run anthro_emis.
./anthro_emis < anthro_emis.inp > anthro_emis.out
and you should see six new files in the ANTHRO/src directory, two for each of the three domains. Check out the contents with ncview.
wrfchemi_00z_<domain>
wrfchemi_12z_<domain>
Step 4. OPTION 2 (FYI for now): Prep your anthropogenic emissions using prep-chem-sources
- TBD. Watch this space.
- Use this utility if you're using something other than EDGAR-HTAP or things not listed in the anthro_emis README/instructions. In theory, anthro_emis could work for other emissions fields, but they have just not been tested.
- Now that you have your bio emissions, it's time to get your anthropogenic emissions in the right format. Navigate to your PREP-CHEM-SRC-1.5 folder.
cd bin
- Edit the prep_chem_sources.inp namelist file. Check out the README file for detailed instructions.
Step 5. Run real.exe again, with chem_opt turned on
- Navigate to your WRFV3/test/em_real/ directory
- Link your bio and anthro files to your WRFV3/test/em_real/ directory.
- Open up the namelist.input file and turn chem_opt back on (set it to 10).
- Make sure kemit = 1 (the number of vertical levels in the anthro emissions files; in this case it is 1, i.e., surface data).
- Call up an interactive shell if you don't have one running. Run real.exe again. This incorporates the chem variables into the initial and boundary condition files so that MOZBC can populate them.
mpirun -np 1 ./real.exe > run_real.log
- Check that the tail of the rsl.error.0000 file says "SUCCESS COMPLETE REAL_EM INIT"
Step 6. Prep the chemical data initial and boundary conditions using MOZBC
- This step modifies the wrf initial and boundary condition files that now have space for chemical data (since we re-ran real.exe with chem_opt on and anthro/bio fields)
- Go to https://www.acom.ucar.edu/wrf-chem/mozart.shtml and submit a data request. This can take some time to process depending on your domain size request. Submit your bounding box, and times. For this exercise you can use what was already downloaded. This is available in the mozbc directory in the $CLIMATE_MODELS folder as a mozart4geos5-ZZZZ.nc file.
- Navigate to your mozbc folder from the $CLIMATE_MODELS folder.
cd MOZBC
- Recall the MOZBC utility is used to modify the wrfinput/wrfbdy files to contain chemical boundary conditions.
- From the namelist.input file for this exercise in WRFV3/test/em_real, we see that chem_opt = 10 (refer to the WRF-Chem user manual for more details). This chemistry option uses the CBMZ chemical mechanism and MOSAIC aerosols with 8 sectional bins. We match this with what MOZBC provides as sample namelists and pick CBMZ-MOSAIC_8bins.inp as our mozbc namelist file. Copy this to a new file and edit the new file as follows:
cp CBMZ-MOSAIC_8bins.inp mozbc.inp
- Now edit this file.
do_bc = .true.
do_ic = .true.
domain = 3
#FYI, I've found mozbc can be unhappy when the set directory path is too long.
dir_wrf = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/UTILINP/' #obviously this should be your specific path.
dir_moz = '/n/holylfs/LABS/kuang_lab/adayalu/WRF/MOZBC/' #obviously this should be your specific path.
# fn_moz should look something like 'ha0004.nc'. You will have to rename (or link) your mozart4geos5-ZZZZ.nc file to whatever this is.
#In the species map (spc_map) section, delete the entry 'op2 -> C2H5OOH'. This isn't in the mozart4geos5 file, and leads to an error if it remains in there.
#(This knowledge is from trial and error.)
- Exit editor.
ln -s mozart4geos5-ZZZZ.nc ha0004.nc #copy as link to "rename"
- You are now ready to run mozbc. In your mozbc directory type:
./mozbc < mozbc.inp > mozbc.out
#tail mozbc.out should have a final line that reads:
bc_wrfchem completed successfully
Step 7. You're (FINALLY) ready to run WRF-Chem
- Navigate back to your WRFV3/test/em_real directory. Check that you have the preferred path set to your WRFOUT directory (recommend regal or some scratch space with lots of room) in your history_outname field. If you change the filename, note that the only part of the filename you should mess with is the part before '_d<domain>_<date>'!
- I like to keep my frames hourly (a file is written out every simulation hour), which is more helpful for debugging if the run fails. Up to you.
- Open up the script run_wrf.sh and make sure it looks ok (an illustrative sketch of such a script appears at the end of this step).
- Submit your wrf job
sbatch run_wrf.sh
squeue -u username #monitor your job status
- If your run is successful you should start to see files populating your outpath:
cd /your/history_outname/specified/preferably/some/regal/directory/WRFOUT/
and you should see for each domain ...
wrfchem_d<domain>_2013-01-DD_HH:00:00
- With 15 cores on one node (using huce_amd queue) it took about 40 hours for this run to complete:
sacct -j 49862718 --format=JobID,JobName,MaxRSS,Elapsed
JobID JobName MaxRSS Elapsed
------------ ---------- ---------- ----------
49862718 wrfchem_t+ 1-15:32:54
49862718.ba+ batch 11588K 1-15:32:54
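For reference, a minimal illustrative sketch of what a run_wrf.sh SLURM script might look like; the job name, partition, core count, wall time, memory, and module versions below are assumptions to adapt to your own allocation (the script shipped with the run directory may differ):
#!/bin/bash
#SBATCH -J wrfchem_test
#SBATCH -p huce_amd            # partition used for the run described above; adjust as needed
#SBATCH -N 1
#SBATCH -n 15                  # MPI tasks (this run used 15 cores on one node)
#SBATCH -t 3-00:00             # wall time; this run took ~40 hours
#SBATCH --mem-per-cpu=4000
#SBATCH -o wrf_%j.out
#SBATCH -e wrf_%j.err
# load the same modules used to compile WRF-Chem
module load intel/17.0.4-fasrc01 impi/2017.2.174-fasrc01 netcdf/4.1.3-fasrc02
mpirun -np $SLURM_NTASKS ./wrf.exe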
Step 8. Post-Processing and data visualization
- I primarily use NCL, and/or some combination of NCL and R. Pick whatever you're used to for processing netcdf files. If you're going to use NCL, I recommend looking at the very well documented examples on their website. For example, start with the one about how to open and read netcdf files and go from there:
https://www.ncl.ucar.edu/Document/Functions/Built-in/addfile.shtml
- If you use ncview, you can quickly get a sense of how your simulation worked. Take a look at the PM2.5_DRY variable (3D Vars). You will notice that while the run completed successfully from a technical standpoint, it's actually way off. The PM2.5 values are unrealistic – two orders of magnitude lower than observations in the d03 domain – and this is most likely due to some combination of the following:
- Accounting only for primary PM2.5. There is obviously all the secondary PM2.5 that needs the appropriate precursor species mapped as well. (25% to nearly 40% of PM2.5 in many cities in the region is secondary inorganic.)
- My failure to process files correctly. While the wrfchemi, wrfbiochemi, and mozbc utilities seem to have gone through, they may not have done so correctly owing to an inappropriate namelist parameter selection. For some reason, the surface emissions are not being read in correctly. We used EDGAR-HTAP from 2010 processed using the anthro_emis utility. In the past I ran a test of this using a more specialized inventory from 2010 pre-processed using NCL, which led to a far more realistic PM2.5 simulation (i.e., the surface emissions data were being read in).
- inappropriate choices in the WRF-Chem namelist.input file.
In any case, this exercise should at least give you familiarity with the process of running WRF-Chem and set you up to do second-order troubleshooting (like the more important question of why these values are unrealistic!).
Running WRF-Chem for real cases in Large Eddy Simulation (LES) Mode: A Beijing PM2.5 Case Study
...
IDEALIZED cases
WRF (
...
Miscellaneous links
WRF (ARW) User's Guides: v3, v4
Google Docs: https://docs.google.com/document/d/1Jls4FlWIOIhMlCzMPWm6_aBZqx_Axxe8RMcKjdILDFg/
Ding's notes: global_WRF_on_Odyssey.pdf
ARW Technical Note: http://www2.mmm.ucar.edu/wrf/users/docs/technote/
Optimizing performance: https://www2.cisl.ucar.edu/software/community-models/optimizing-wrf-performance
...