Analysis of E3SMv2 Model Output#

Overview#

This workflow example showcases how to use UXarray to analyze the unstructured grid output from the Energy Exascale Earth System Model (E3SM) directly without needing to perform any regridding operations.

Imports#

This notebook requires the following packages to be installed in the notebook environment; cmocean is included for its perceptually uniform colormaps.

mamba install -c conda-forge uxarray cmocean
import warnings

warnings.filterwarnings("ignore")

from dask.distributed import Client, LocalCluster
import uxarray as ux
import cartopy.crs as ccrs
import geoviews as gv
import geoviews.feature as gf
import holoviews as hv
import cmocean
import glob

import sys

import panel as pn
import dask.distributed

# Set-up for HoloViz plots
hv.extension("matplotlib")
features = (
    gf.coastline(scale="110m", projection=ccrs.PlateCarree())
    * gf.borders(scale="110m", projection=ccrs.PlateCarree())
    * gf.states(scale="110m", projection=ccrs.PlateCarree())
)

Set up local Dask cluster#

For details about setting up a Dask cluster, see the Load Input Data in Parallel with Dask and UXarray page in the User Guide section.

# cluster = LocalCluster()
# client = Client(cluster)
client = Client(n_workers=4, threads_per_worker=2)
client

Client: Client-2591eb31-909c-11ef-9aa8-0040a687eb14
Connection method: Cluster object, cluster type: distributed.LocalCluster
Dashboard: https://jupyterhub.hpc.ucar.edu/stable/user/rtam/proxy/8787/status

Data#

The data loaded in this notebook is E3SMv2 output from a run of 6 simulated years. The case is an atmosphere-only (AMIP) simulation with present-day control forcing (F2010) at 1-degree horizontal resolution (ne30pg2); boundary conditions such as sea surface temperatures and sea ice are set to the E3SMv2 defaults and repeat every simulated year.

%%time
# Load multiple files, chunked along the time dimension
# (note: chunking along the time dimension can fail with E3SM output)
data_files = "/glade/campaign/cisl/vast/uxarray/data/e3sm_keeling/ENSO_ctl_1std/unstructured/*.nc"
grid_file = (
    "/glade/campaign/cisl/vast/uxarray/data/e3sm_keeling/E3SM_grid/ne30pg2_grd.nc"
)
uxds_e3sm_multi = ux.open_mfdataset(grid_file, glob.glob(data_files), parallel=True)
CPU times: user 11.9 s, sys: 853 ms, total: 12.7 s
Wall time: 25.9 s
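Note that `glob.glob` returns paths in arbitrary, filesystem-dependent order, so when the files represent consecutive time slices it is safer to sort the list before passing it to `open_mfdataset`. A minimal illustration with stand-in filenames (the E3SM-style monthly names below are hypothetical):

```python
import glob
import os
import tempfile

# Create a few stand-in files with E3SM-style monthly names (hypothetical paths)
tmpdir = tempfile.mkdtemp()
names = ["case.h0.0001-03.nc", "case.h0.0001-01.nc", "case.h0.0001-02.nc"]
for name in names:
    open(os.path.join(tmpdir, name), "w").close()

# glob.glob makes no ordering guarantee; sorted() restores chronological order
# because the YYYY-MM timestamps sort lexicographically
data_files = sorted(glob.glob(os.path.join(tmpdir, "*.nc")))
print([os.path.basename(p) for p in data_files])
# → ['case.h0.0001-01.nc', 'case.h0.0001-02.nc', 'case.h0.0001-03.nc']
```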
uxds_e3sm_multi
<xarray.UxDataset> Size: 37GB
Dimensions:              (time: 72, n_face: 21600, lev: 72, ilev: 73,
                          cosp_prs: 7, nbnd: 2, cosp_tau: 7, cosp_ht: 40,
                          cosp_sr: 15, cosp_htmisr: 16, cosp_tau_modis: 7,
                          cosp_reffice: 6, cosp_reffliq: 6, cosp_sza: 5,
                          cosp_scol: 10)
Coordinates: (12/13)
  * lev                  (lev) float64 576B 0.1238 0.1828 0.2699 ... 993.8 998.5
  * ilev                 (ilev) float64 584B 0.1 0.1477 0.218 ... 997.0 1e+03
  * cosp_prs             (cosp_prs) float64 56B 9e+04 7.4e+04 ... 2.45e+04 9e+03
  * cosp_tau             (cosp_tau) float64 56B 0.15 0.8 2.45 ... 41.5 100.0
  * cosp_scol            (cosp_scol) int32 40B 1 2 3 4 5 6 7 8 9 10
  * cosp_ht              (cosp_ht) float64 320B 1.896e+04 1.848e+04 ... 240.0
    ...                   ...
  * cosp_sza             (cosp_sza) float64 40B 0.0 20.0 40.0 60.0 80.0
  * cosp_htmisr          (cosp_htmisr) float64 128B 0.0 250.0 ... 1.8e+04
  * cosp_tau_modis       (cosp_tau_modis) float64 56B 0.15 0.8 ... 41.5 100.0
  * cosp_reffice         (cosp_reffice) float64 48B 5e-06 1.5e-05 ... 7.5e-05
  * cosp_reffliq         (cosp_reffliq) float64 48B 4e-06 9e-06 ... 2.5e-05
  * time                 (time) object 576B 0001-02-01 00:00:00 ... 0007-01-0...
Dimensions without coordinates: n_face, nbnd
Data variables: (12/471)
    lat                  (time, n_face) float64 12MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    lon                  (time, n_face) float64 12MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    area                 (time, n_face) float64 12MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    hyam                 (time, lev) float64 41kB dask.array<chunksize=(1, 72), meta=np.ndarray>
    hybm                 (time, lev) float64 41kB dask.array<chunksize=(1, 72), meta=np.ndarray>
    P0                   (time) float64 576B 1e+05 1e+05 1e+05 ... 1e+05 1e+05
    ...                   ...
    soa_c1DDF            (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    soa_c1SFWET          (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    soa_c2DDF            (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    soa_c2SFWET          (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    soa_c3DDF            (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
    soa_c3SFWET          (time, n_face) float32 6MB dask.array<chunksize=(1, 21600), meta=np.ndarray>
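With the data defined on faces, a common first analysis step is a global area-weighted mean: weight each face value by its cell area and normalize by the total area. Below is a minimal NumPy sketch of that arithmetic using synthetic data whose shapes mimic the `(time, n_face)` variables above; on the real dataset you would use the `area` variable and your variable of interest (UXarray also provides its own statistics helpers, so this only illustrates the computation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_face = 72, 21600                 # dimensions matching the dataset above
var = rng.normal(size=(n_time, n_face))    # stand-in for a (time, n_face) variable
area = rng.uniform(0.5, 1.5, size=n_face)  # stand-in for per-face cell areas

# Area-weighted global mean at each time step:
# sum over faces of (value * area), divided by the total area
global_mean = (var * area).sum(axis=1) / area.sum()
print(global_mean.shape)  # one value per time step: (72,)

# Sanity check: a constant field must have a weighted mean equal to that constant
ones_mean = (np.ones((n_time, n_face)) * area).sum(axis=1) / area.sum()
```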