
Commit a57a56f

Reorganize, cleanup, and document
Parent: dd41652

14 files changed

Lines changed: 897 additions & 347 deletions

README.md

Lines changed: 48 additions & 35 deletions
````diff
@@ -2,47 +2,60 @@
 
 This repository is adapted from https://github.com/quentinjamet/SPECTRE
 
-Main differences include
+Main differences include:
 * MITgcm as a submodule
-* Ocean boundary conditions are from Glorys v12 and the same across all ensemble members
-* All simulation suites are defined by yamls that are used to drive input deck generation and job submission scripts
+* Ocean boundary and initial conditions derived from Glorys v12
+* Atmospheric forcing derived from ERA5, processed through the MITgcm EXF package
+* All simulations are defined by a `config.yaml` that drives input deck generation and job submission
 
 
 ## Get Started
-Clone this repository and recursively grab submodules
 
-```
+Clone this repository and recursively grab submodules:
+
+```bash
 git clone --recurse-submodules https://github.com/ocean-spectre/spectre-ensembles
+cd spectre-ensembles
+```
+
+
+## Repository Layout
+
+```
+spectre-150-ensembles/
+├── MITgcm/                       # MITgcm source (submodule)
+├── env/                          # Environment setup scripts (per host)
+├── opt/                          # MITgcm build option files (per host)
+├── simulations/
+│   └── glorysv12-curvilinear/    # Primary simulation (see its README)
+│       ├── code/                 # MITgcm compile-time options
+│       ├── etc/config.yaml       # Simulation configuration
+│       ├── input/                # Static input files and generated forcing
+│       └── workflows/            # Slurm job scripts
+└── spectre_utils/                # Python utilities for pre-processing
 ```
 
 
-## Workflow
-
-Set up template directory (manual work)
-|
-|
-build mitgcm executable
-|
-|
-Define ensemble configurations with simulation sequences
-|
-|
-Create boundary and atm conditions (constant across all members)
-|
-|
-Set up member directories
-|
-|
-Create initial conditions (unique for each member)
-|
-|
-For each member, launch simulation sequences
-
-
-### What's a simulation sequence ?
-
-Defined by
-* sequence stage - a name for this "stage" of the simulation (e.g. spinup, production)
-* map of MITgcm parameters to their values
-* list of required conditions to start
-* list of conditions to verify successful completion
+## Simulations
+
+### glorysv12-curvilinear
+
+MITgcm re-run of Glorys v12 on the native NEMO curvilinear grid for the North
+Atlantic. See [`simulations/glorysv12-curvilinear/README.md`](simulations/glorysv12-curvilinear/README.md)
+for the full workflow and configuration details.
+
+
+## spectre_utils
+
+Python package containing all pre-processing scripts. All scripts accept a
+`config.yaml` path as their only argument. Key scripts:
+
+| Script | Purpose |
+|--------|---------|
+| `download_glorys12_raw.py` | Download Glorys v12 ocean data from CMEMS |
+| `download_era5.py` | Download ERA5 atmospheric data from CDS |
+| `mk_initial_conditions.py` | Generate MITgcm initial conditions |
+| `mk_ocean_boundary_conditions.py` | Generate open boundary conditions |
+| `mk_exf_conditions.py` | Process ERA5 fields into EXF binary forcing files |
+| `animate_exf_conditions.py` | Animate EXF forcing fields (MP4 per variable) |
+| `review_exf_conditions.py` | QC checks, statistics, and diagnostic figures |
````
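The one-argument convention in the README's new `spectre_utils` section makes every script share the same entry-point shape. A minimal sketch of that pattern, assuming PyYAML for parsing; the config keys below are illustrative, not the real schema:

```python
import argparse
import yaml  # PyYAML — assumed dependency; the real scripts may parse differently

# Every spectre_utils script takes exactly one positional argument: the config path.
parser = argparse.ArgumentParser(description="spectre_utils-style entry point")
parser.add_argument("config", help="path to the simulation config.yaml")

# Demo: write a tiny illustrative config and load it the way a script would.
with open("demo_config.yaml", "w") as f:
    f.write("atmosphere:\n  prefix: era5\nyears: [2002, 2003]\n")

args = parser.parse_args(["demo_config.yaml"])
with open(args.config) as f:
    config = yaml.safe_load(f)

assert config["years"] == [2002, 2003]
```

Keeping every script to a single `config.yaml` argument is what lets the Slurm wrappers stay one-line `srun ... script.py config.yaml` calls.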

env/derecho.sh

Lines changed: 0 additions & 12 deletions
This file was deleted.

env/galapagos-franklin-glorysv12-curvilinear.sh

Lines changed: 0 additions & 17 deletions
This file was deleted.

env/galapagos-franklin.sh

Lines changed: 0 additions & 17 deletions
This file was deleted.
Lines changed: 120 additions & 0 deletions
# Glorys v12 - MITgcm Curvilinear Grid

MITgcm re-run of the Glorys v12 simulation on the native NEMO curvilinear grid.
Running on the native grid means no interpolation is needed for GLORYS ocean fields,
and the Arakawa C-grid velocity fields are numerically divergence-free, which
avoids the gravity-wave eruptions that typically complicate spinup on remapped grids.
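The divergence-free property claimed above can be illustrated in a few lines of numpy: on an Arakawa C-grid, the cell divergence is built from face-normal velocities, so any field derived from a streamfunction is non-divergent to round-off. This is a sketch on a uniform unit-spacing grid, not code from `spectre_utils`:

```python
import numpy as np

# On a C-grid, u sits on faces normal to x and v on faces normal to y.
# Build a non-divergent flow from a streamfunction psi at cell corners
# (axis 0 = y, axis 1 = x, dx = dy = 1 for the sketch):
rng = np.random.default_rng(0)
psi = rng.standard_normal((65, 65))          # corner points of a 64x64 cell grid

u = -(psi[1:, :] - psi[:-1, :])              # u = -dpsi/dy, shape (64, 65)
v = psi[:, 1:] - psi[:, :-1]                 # v = +dpsi/dx, shape (65, 64)

# Discrete divergence per cell: net flux through the four faces.
div = (u[:, 1:] - u[:, :-1]) + (v[1:, :] - v[:-1, :])   # shape (64, 64)
assert np.abs(div).max() < 1e-12
```

Remapping u and v to a different grid breaks this exact cancellation, which is why interpolated initial conditions tend to radiate gravity waves during spinup.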
## Configuration

| Parameter | Value |
|-----------|-------|
| Ocean source | CMEMS Glorys v12 (`cmems_mod_glo_phy_my_0.083deg_P1D-m`) |
| Atmosphere source | ERA5 single levels (EXF package) |
| Longitude | -82.0 to -17.5 |
| Latitude | 26.0 to 50.5 |
| Vertical levels | 50 |
| Simulation period | 2002-07-01 – 2017-06-30 |
| Spinup | 2002-07-01 – 2003-07-01 × 5 repeats |
| MPI layout | 8 × 8 = 64 ranks |

Open boundary array sizes (5479 daily time steps, 2002–2017):

| Boundary | U/V/T/S shape | Eta shape |
|----------|---------------|-----------|
| South | (5479, 50, 768) | (5479, 768) |
| North | (5479, 50, 768) | (5479, 768) |
| West | (5479, 50, 424) | – |
| East | (5479, 50, 424) | – |
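The shapes in the table fix the expected byte count of each boundary file, which gives a quick integrity check after generation. A sketch assuming MITgcm's usual big-endian float32 flat binaries; the filenames are illustrative:

```python
import numpy as np

nt, nz, nx = 5479, 50, 768            # south/north U/V/T/S shape from the table
expected_bytes = nt * nz * nx * 4     # big-endian float32 = 4 bytes per value
# 5479 * 50 * 768 * 4 = 841,574,400 bytes, i.e. ~0.84 GB per 3-D boundary field.

# Round-trip demo with a few synthetic records (a real file would be something
# like "input/obc_south_T.bin" — name illustrative):
demo = np.zeros((3, nz, nx), dtype=">f4")
demo.tofile("obc_demo.bin")
field = np.fromfile("obc_demo.bin", dtype=">f4").reshape(3, nz, nx)
assert field.shape == (3, nz, nx)
```

Comparing `os.path.getsize` against `expected_bytes` before a run catches truncated downloads or interrupted generation jobs cheaply.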
## Workflow

All steps are run as Slurm jobs from the `workflows/` directory. Each script
sources `workflows/env.sh` for container image paths and data directories.

### 1. Build MITgcm

```bash
sbatch workflows/build.sh
```

Compiles MITgcm inside the base container image using `code/` for compile-time
options. The executable is written into the simulation directory.

### 2. Download ocean data (GLORYS v12)

```bash
sbatch workflows/download_glorysv12_raw.sh
```

Downloads daily GLORYS v12 fields (T, S, U, V, SSH) from CMEMS for the domain
and years defined in `etc/config.yaml`.

### 3. Download atmospheric data (ERA5)

```bash
sbatch workflows/download_era5.sh
```

Downloads ERA5 single-level fields from the Copernicus CDS for each variable
listed under `atmosphere.variables` in `etc/config.yaml`.
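The exact request `download_era5.py` builds is not shown here; as a sketch, a CDS API request for this domain might look like the following. Everything in the dictionary is illustrative (the real variable list and years come from `etc/config.yaml`), and the actual network call is left commented out:

```python
# Sketch of a Copernicus CDS request for the North Atlantic domain above.
request = {
    "product_type": "reanalysis",
    "variable": ["2m_dewpoint_temperature", "surface_pressure"],  # illustrative subset
    "year": "2002",
    "month": ["07", "08", "09"],
    "day": [f"{d:02d}" for d in range(1, 32)],
    "time": [f"{h:02d}:00" for h in range(24)],
    "area": [50.5, -82.0, 26.0, -17.5],  # N, W, S, E — matches the config table
    "format": "netcdf",
}
# Submitting it would be (requires ~/.cdsapirc credentials):
#   import cdsapi
#   cdsapi.Client().retrieve("reanalysis-era5-single-levels", request, "era5_2002.nc")
assert request["area"] == [50.5, -82.0, 26.0, -17.5]
```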
### 4. Generate initial conditions

```bash
sbatch workflows/make_ocean_initial_conditions.sh
```

Interpolates GLORYS v12 T, S, U, V, SSH fields onto the MITgcm grid and writes
binary initial condition files to `input/`.

### 5. Generate ocean boundary conditions

```bash
sbatch workflows/make_ocean_boundary_conditions.sh
```

Processes GLORYS v12 daily fields into open boundary condition binary files
(U, V, T, S, Eta on all four boundaries) in `input/`.
### 6. Generate EXF atmospheric forcing

```bash
sbatch workflows/make_exf_conditions.sh
```

Processes ERA5 fields into EXF binary forcing files in `input/`. Applies any
`scale_factor` values from the config (e.g. for radiation units), and computes
specific humidity from dewpoint temperature and surface pressure.
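The humidity computation is not spelled out here; one common formulation uses Bolton's saturation vapor pressure fit and the mixing-ratio relation. Treat this as a sketch of the idea, not necessarily the exact expression in `mk_exf_conditions.py`:

```python
import math

def specific_humidity_from_dewpoint(t_dew_k: float, p_surf_pa: float) -> float:
    """Specific humidity (kg/kg) from dewpoint (K) and surface pressure (Pa).

    Uses Bolton's (1980) saturation vapor pressure fit over water — a common
    choice, not necessarily the one used by spectre_utils.
    """
    t_c = t_dew_k - 273.15
    e = 611.2 * math.exp(17.67 * t_c / (t_c + 243.5))   # vapor pressure, Pa
    return 0.622 * e / (p_surf_pa - 0.378 * e)

# A dewpoint of 15 °C at standard pressure gives roughly 10-11 g/kg:
q = specific_humidity_from_dewpoint(288.15, 101325.0)
assert 0.009 < q < 0.012
```

This is why the config adds `surface_pressure` alongside `2m_dewpoint_temperature`: both inputs are needed to derive `aqh`.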
To review the produced forcing fields before running the model:

```bash
sbatch -w franklin -c8 --wrap \
  "srun --container-image=\$SPECTRE_UTILS_IMG \
       --container-mounts=\$(pwd)/../:/workspace,\$HOST_DATADIR:/data \
       python /opt/spectre_utils/review_exf_conditions.py /workspace/etc/config.yaml"
```

Output (report + figures) is written to `review/atmosphere/`.

### 7. Run the model

```bash
sbatch workflows/run.sh
```

Launches MITgcm under the base container image. Sets up the run directory on
first launch, then submits the MPI job. The run directory is controlled by
`RUN_DIR` in `run.sh` (default: `demo/`).
## Data sources

| Dataset | Access | Variables |
|---------|--------|-----------|
| CMEMS Glorys v12 daily | `copernicusmarine` Python package | thetao, so, uo, vo, zos |
| CMEMS Glorys v12 static | `copernicusmarine` Python package | mask, deptho, deptho_lev |
| ERA5 single levels | `cdsapi` Python package | winds, temperature, humidity, radiation, precip, pressure |

simulations/glorysv12-curvilinear/etc/config.yaml

Lines changed: 6 additions & 3 deletions
```diff
@@ -30,23 +30,26 @@ atmosphere:
     mitgcm_name: "atemp"
   - era_name: "total_precipitation"
     mitgcm_name: "precip"
-  - era_name: "mean_surface_downward_short_wave_radiation_flux"
+  - era_name: "surface_solar_radiation_downwards"
     mitgcm_name: "swdown"
-  - era_name: "mean_surface_net_long_wave_radiation_flux"
+    scale_factor: 2.7778E-04
+  - era_name: "surface_thermal_radiation_downwards"
     mitgcm_name: "lwdown"
+    scale_factor: 2.7778E-04
   - era_name: "mean_sea_level_pressure"
     mitgcm_name: "pressure"
   - era_name: "runoff"
     mitgcm_name: "runoff"
   - era_name: "evaporation"
     mitgcm_name: "evap"
+  - era_name: "surface_pressure"
+    mitgcm_name: "sp"
   computed_variables:
   - mitgcm_name: "aqh"
     era_name: "q"
     function: "specific_humdity_from_dewpoint"
   years: [2002,2003,2004,2005,2006,2007,2008,2009,2010,2011,2012,2013,2014,2015,2016,2017]
   prefix: "era5"
-  cheapaml: True
 
 domain:
   mpi:
```
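The `scale_factor` of 2.7778E-04 in the diff above is 1/3600: ERA5's `surface_solar_radiation_downwards` and `surface_thermal_radiation_downwards` are energy accumulated over one hour (J m⁻²), while MITgcm's EXF package expects a mean downward flux (W m⁻²). A quick check:

```python
# ERA5 ssrd/strd are hourly-accumulated energy in J m^-2; MITgcm's EXF package
# wants W m^-2, so the config divides by the 3600 s accumulation window.
scale_factor = 2.7778e-04
seconds_per_hour = 3600
assert abs(scale_factor - 1 / seconds_per_hour) < 1e-8
```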

simulations/glorysv12-curvilinear/input/data.cheapaml

Lines changed: 0 additions & 35 deletions
This file was deleted.
Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 #!/bin/bash
 
 # Path where downloaded data is stored
-export HOST_DATADIR=/mnt/beegfs/spectre-150-ensembles/glorysv12-curvilinear-15year
+export HOST_DATADIR=/mnt/beegfs/spectre-150-ensembles/simulations/glorysv12-curvilinear/downloads
 export SPECTRE_UTILS_IMG="docker://ghcr.io#ocean-spectre/spectre-ensembles/spectre-utils:sha-6f41db5"
 export MITGCM_BASE_IMG="docker://ghcr.io#fluidnumerics/mitgcm-containers/gcc-openmpi:latest"
```

simulations/glorysv12-curvilinear/workflows/make_cheapaml_conditions.sh renamed to simulations/glorysv12-curvilinear/workflows/make_exf_conditions.sh

Lines changed: 5 additions & 5 deletions
```diff
@@ -2,9 +2,9 @@
 #SBATCH -n1
 #SBATCH -c32
 #SBATCH --nodelist=franklin
-#SBATCH --job-name=spectre_cheapaml
-#SBATCH --output=./spectre_cheapaml.out
-#SBATCH --error=./spectre_cheapaml.out
+#SBATCH --job-name=spectre_exf
+#SBATCH --output=./spectre_exf.out
+#SBATCH --error=./spectre_exf.out
 
 if [ -n "${SLURM_JOB_ID:-}" ]; then
   SCRIPT_PATH=$(scontrol show job "$SLURM_JOB_ID" --json | jq -r '.jobs[0].command' )
@@ -18,8 +18,8 @@ fi
 source $SCRIPT_DIR/env.sh
 
 ###############################################################################################
-# Run the script to download make the cheapaml boundary conditions
+# Run the script to download make the exf boundary conditions
 ###############################################################################################
 srun --container-image=$SPECTRE_UTILS_IMG \
   --container-mounts=${SCRIPT_DIR}/../:/workspace,${HOST_DATADIR}:/data \
-  python /opt/spectre_utils/mk_cheapaml_conditions.py /workspace/etc/config.yaml
+  python /opt/spectre_utils/mk_exf_conditions.py /workspace/etc/config.yaml
```
