Now that you have compiled MITgcm and copied the executable to the run directory, you can start a MITgcm simulation. Below we look at the run scripts and the input files they use.
Run scripts
In the global_hg_llc90/run, pfos/run, and pcb/run directories, you will find several sample scripts that you can use to run MITgcm jobs.
Simulation type | Run script | data.exch2 file | data file |
---|---|---|---|
13 CPUs, debug run (10 hours) | run.mitgcm.13np.debug | data.exch2.13np | data.debug_run |
13 CPUs, 1-month run | run.mitgcm.13np.1month | data.exch2.13np | data.1month_run |
96 CPUs, debug run (10 hours) | run.mitgcm.96np.debug | data.exch2.96np | data.debug_run |
96 CPUs, 20-year run | run.mitgcm.96np.20yr | data.exch2.96np | data.20yr_run |
We will look at each of these scripts in more detail below.
The run.mitgcm scripts
The run.mitgcm* scripts are used to start a MITgcm simulation with 13 CPUs (for debugging) or 96 CPUs. The run.mitgcm.*.debug
scripts look like this:
```bash
#!/bin/bash

#SBATCH -n 13
#SBATCH -N 1
#SBATCH -t 60
#SBATCH -p general
#SBATCH --mem-per-cpu=3750
#SBATCH --mail-type=ALL
#EOC
#------------------------------------------------------------------------------
#          Harvard Biogeochemistry of Global Pollutants Group                 !
#------------------------------------------------------------------------------
#BOP
#
# !IROUTINE: run.mitgcm.13np.debug
#
# !DESCRIPTION: Script to run a debug MITgcm simulation with 13 CPUs.
#\\
#\\
# !CALLING SEQUENCE:
#  sbatch run.mitgcm.13np.debug   # To submit a batch job
#  ./run.mitgcm.13np.debug        # To run in an interactive session
#
# !REMARKS:
#  Consider requesting an entire node (-N 1 -n 64), which will prevent
#  outside jobs from slowing down your simulation.
#
#  Also note: Make your timestep edits in "data.debug_run", which will
#  automatically be copied to "data" by this script.
#
# !REVISION HISTORY:
#  17 Feb 2015 - R. Yantosca - Initial version
#EOP
#------------------------------------------------------------------------------
#BOC

# Make sure we apply the .bashrc_mitgcm settings
source ~/.bashrc_mitgcm

# Copy run-time parameter input files for the 13 CPU run
cp -f data.debug_run   data
cp -f data.exch2.13np  data.exch2

# Remove old output files
rm -f STDOUT.*
rm -f STDERR.*
rm -f PTRACER*

# Run MITgcm with 13 CPUs
time -p ( mpirun -np 13 ./mitgcmuv )

exit 0
#EOC
```
Each run script also copies the following files into place before starting the run:
data.exch2.13np or data.exch2.96np ---> data.exch2
data.debug_run or data.1month_run or data.20yr_run ---> data
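If you prefer to stage these files by hand (for example, when testing in an interactive session), the copy step amounts to the two commands below, shown here for the 13-CPU debug case. The run scripts do the same thing automatically, so this is only a manual sketch of that step:

```shell
# Manual equivalent of the run script's setup step (13-CPU debug case).
# The data file and the exchange/tile layout file must both match the
# CPU count that the executable was compiled for.
cp -f data.debug_run   data         # run-time parameters (timesteps, run length)
cp -f data.exch2.13np  data.exch2   # horizontal grid / tile layout for 13 CPUs
```

For the 96-CPU runs, substitute data.20yr_run and data.exch2.96np accordingly.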
The data.exch2.13np and data.exch2.96np files
The data.exch2.13np file contains the following namelist declaration, which sets up the horizontal grid for 13 CPUs.
```
 &W2_EXCH2_PARM01
 W2_printMsg = 0 ,
 W2_mapIO    = 1 ,
 preDefTopol = 0 ,
#==============================================================================
#-- 5 facets llc_120 topology (drop facet 6 and its connection):
#==============================================================================
 dimsFacets(1:10) = 90, 270, 90, 270, 90, 90, 270, 90, 270, 90 ,
 facetEdgeLink(1:4,1) = 3.4, 0. , 2.4, 5.1 ,
 facetEdgeLink(1:4,2) = 3.2, 0. , 4.2, 1.3 ,
 facetEdgeLink(1:4,3) = 5.4, 2.1, 4.4, 1.1 ,
 facetEdgeLink(1:4,4) = 5.2, 2.3, 0. , 3.3 ,
 facetEdgeLink(1:4,5) = 1.4, 4.1, 0. , 3.1 ,
 /
```
The data.exch2.96np file sets up the horizontal grid for 96 CPUs. It contains the same namelist variables as data.exch2.13np, plus an additional variable named blankList, which deactivates ("blanks out") certain tiles.
```
 &W2_EXCH2_PARM01
 W2_printMsg = 0 ,
 W2_mapIO    = 1 ,
 preDefTopol = 0 ,
#==============================================================================
#-- 5 facets llc_120 topology (drop facet 6 and its connection):
#==============================================================================
 dimsFacets(1:10) = 90, 270, 90, 270, 90, 90, 270, 90, 270, 90 ,
 facetEdgeLink(1:4,1) = 3.4, 0. , 2.4, 5.1 ,
 facetEdgeLink(1:4,2) = 3.2, 0. , 4.2, 1.3 ,
 facetEdgeLink(1:4,3) = 5.4, 2.1, 4.4, 1.1 ,
 facetEdgeLink(1:4,4) = 5.2, 2.3, 0. , 3.3 ,
 facetEdgeLink(1:4,5) = 1.4, 4.1, 0. , 3.1 ,
#==============================================================================
#-- 30x30 nprocs = 96 : Blank out certain tiles
#==============================================================================
 blankList(1:21) = 1,  2,  3,  5,  6, 28, 29, 30, 31, 32, 33, 49, 50,
                  52, 53, 72, 81, 90, 99, 108, 117
 /
```
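The count of 21 blanked tiles is not arbitrary. With 30x30 tiles, the facet sizes in dimsFacets imply 117 tiles in total, and blanking 21 of them leaves exactly 96 active tiles, one per MPI rank. (That the blanked tiles are ones containing no ocean points is our interpretation; the namelist itself only lists tile numbers.) A quick sketch of the arithmetic:

```shell
# Tile bookkeeping for the 96-CPU layout, derived from dimsFacets above.
# With 30x30 tiles, each 90x270 facet holds (90*270)/(30*30) = 27 tiles,
# and the 90x90 facet holds (90*90)/(30*30) = 9 tiles.
total=$(( 27 + 27 + 9 + 27 + 27 ))   # = 117 tiles in the 5-facet topology
blanked=21                           # entries in blankList(1:21)
echo "active tiles: $(( total - blanked ))"
```

The result, 96, must match the process count passed to mpirun.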
The run.mitgcm* scripts will copy data.exch2.13np or data.exch2.96np to a file named data.exch2, so that you won't forget to do this yourself.
Debug run
To submit a debugging run (on 13 CPUs), type the following commands:
```bash
#### To run a debug Hg simulation ###
cd MITgcm_code/                          # Switch to main code directory
setcpus 13 hg                            # Picks the proper SIZE.h and data.exch2 file for 13 CPUs
cd verification/global_hg_llc90/run      # Change to the Hg run directory
sbatch run.mitgcm.13np.debug             # Submit the run to SLURM

#### To run a debug PFOS simulation ###
cd MITgcm_code/                          # Switch to main code directory
setcpus 13 pfos                          # Picks the proper SIZE.h and data.exch2 file for 13 CPUs
cd verification/pfos/run                 # Change to the PFOS run directory
sbatch run.mitgcm.13np.debug             # Submit the run to SLURM

#### To run a debug PCB simulation ###
cd MITgcm_code/                          # Switch to main code directory
setcpus 13 pcb                           # Picks the proper SIZE.h and data.exch2 file for 13 CPUs
cd verification/pcb/run                  # Change to the PCB run directory
sbatch run.mitgcm.13np.debug             # Submit the run to SLURM
```
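Before submitting, it can help to confirm what resources a run script will request from SLURM. This minimal sketch is not part of the run scripts themselves; it only assumes that the batch directives sit on lines beginning with #SBATCH, as in the script listing above:

```shell
# Print the SLURM resource requests embedded in a run script, so you can
# verify the CPU count (-n), node count (-N), and wall time (-t) before
# submitting the script with sbatch.
grep '^#SBATCH' run.mitgcm.13np.debug
```

Once the job is submitted, you can check its state with squeue -u $USER and follow the model log with tail -f once the STDOUT.* files appear.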
1-month run
To submit a 1-month simulation (on 13 CPUs), follow the same steps as for the debug run above, but submit the run.mitgcm.13np.1month script instead. Before starting MITgcm, that script copies data.1month_run to data and data.exch2.13np to data.exch2.
20-year run
To submit the full 20-year simulation (on 96 CPUs), first select the 96-CPU configuration with setcpus (e.g. setcpus 96 hg, analogous to the 13-CPU case above), then submit the run.mitgcm.96np.20yr script from the appropriate run directory. Before starting MITgcm, that script copies data.20yr_run to data and data.exch2.96np to data.exch2.