Worthen, Denise, Jun Wang, Raffaele Montuoro, Dom Heinzeller, Bin Li, G Theurich, Ufuk Turuncoglu, Dan Rosen, Dusan Jovic, Brian Curtis, Rahul Mahajan, Hang Lei, Alexander Richert, Arun Chawla, Jiande Wang, Jessica Meixner, Ali Abdolali, Matthew Masarik, Li Pan, Michael Barlage, Bin Liu, M Vertenstein, Tony Craig, Rusty Benson, and Thomas E Robinson, et al., June 2024: Coupling Infrastructure Capability in UFS Weather Model, College Park, MD: NOAA Technical Memorandum NWS NCEP, 519, DOI:10.25923/dvv2-3g03, 115pp. Abstract
The Unified Forecast System (UFS) is an end-to-end forecast system for the next generation of the National Centers for Environmental Prediction (NCEP) production suite. The UFS Weather Model (UWM) comprises the model component of the UFS. It consists of an atmosphere component, which currently includes the Finite-Volume Cubed-Sphere (FV3) dynamical core and the Common Community Physics Package (CCPP), together with ocean, sea ice, wave, land, aerosol, chemistry, and data model components, plus a central mediator component that couples them together. The coupling strategy among the components follows the common Earth System Prediction Suite (ESPS) architecture, a coupling infrastructure built on the Earth System Modeling Framework (ESMF) and the National Unified Operational Prediction Capability (NUOPC) Layer. The NUOPC interfaces, also called caps, allow each model component to act as an independent ESMF grid component. Coupling communication uses either generic NUOPC connectors or the NUOPC-compliant Community Mediator for Earth Prediction Systems (CMEPS). This technical note provides details on the coupling capability available in the latest UWM. The model component NUOPC caps and component coupling strategies are documented. The coupled configurations, computational performance, and features that improve computational performance are also illustrated.
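As a rough illustration of the cap-and-mediator pattern described above (not the actual ESMF/NUOPC Fortran API), the following Python sketch wraps each component behind a uniform cap interface while a mediator routes the exchanged fields. All class, method, and field names are invented for illustration.

```python
# Conceptual sketch (not the real ESMF/NUOPC Fortran API): each model
# component sits behind a "cap" exposing uniform phases, and a mediator
# routes (in reality also regrids and merges) the exchanged fields.

class Cap:
    """Minimal stand-in for a NUOPC cap: a uniform interface around a model."""
    def __init__(self, name, exports):
        self.name = name
        self.exports = exports          # field names this component provides
        self.imports = {}               # fields received from the mediator

    def advertise(self):
        return list(self.exports)       # phase 1: declare exchangeable fields

    def run(self, dt):
        # advance the wrapped model one coupling step; return exported fields
        return {f: f"{self.name}:{f}@dt={dt}" for f in self.exports}

class Mediator:
    """Stand-in for a CMEPS-like mediator: routes fields between caps."""
    def __init__(self, routes):
        self.routes = routes            # (src_field, dst_component) pairs

    def exchange(self, exports, caps):
        for field, dst in self.routes:
            for src_fields in exports.values():
                if field in src_fields:
                    caps[dst].imports[field] = src_fields[field]

# One coupling step: every cap runs, then the mediator redistributes fields.
caps = {"ATM": Cap("ATM", ["wind_stress"]), "OCN": Cap("OCN", ["sst"])}
med = Mediator(routes=[("wind_stress", "OCN"), ("sst", "ATM")])
exports = {name: cap.run(dt=3600) for name, cap in caps.items()}
med.exchange(exports, caps)
print(caps["ATM"].imports)   # {'sst': 'OCN:sst@dt=3600'}
```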
Two-way multiple same-level and telescoping grid nesting capabilities are implemented in the Geophysical Fluid Dynamics Laboratory (GFDL) Finite-Volume Cubed-Sphere Dynamical Core (FV3). Simulations are performed within GFDL's System for High-resolution modeling for Earth-to-Local Domains (SHiELD) using global and regional multiple-nest configurations. Results show that multiple same-level and multi-level telescoping nests capture various weather events in greater detail by resolving smaller-scale flow structures. Two-way updates do not introduce numerical errors into the parent grids in which the nests are located. The cases of Hurricane Laura's landfall and an atmospheric river in California were found to be more intense with increased levels of telescoping nesting. All nested grids run concurrently, and adding nests (with additional compute cores) to a configuration neither degrades computational performance nor increases the simulation run time, provided the cores are optimally distributed among the grids.
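The closing claim about optimal core distribution can be made concrete with a small sketch: since concurrently running grids finish a step only as fast as the slowest grid, cores are best apportioned in proportion to each grid's computational work. The workload figures below are hypothetical, not taken from the paper.

```python
# Illustrative load balancing for concurrent nests: apportion cores in
# proportion to per-grid cost so every grid takes roughly equal wall time.

def distribute_cores(workloads, total_cores):
    """Assign cores proportionally to per-grid cost (cells x time steps)."""
    total_work = sum(workloads.values())
    return {g: max(1, round(total_cores * w / total_work))
            for g, w in workloads.items()}

# Hypothetical relative costs: a parent grid plus two telescoping nests,
# each level refining 3x in space (9x cells) and taking 3x more time steps.
work = {"parent": 1.0, "nest1": 9.0 * 3.0, "nest2": 81.0 * 9.0}
alloc = distribute_cores(work, total_cores=1536)
print(alloc)                                   # cores per grid
print({g: work[g] / alloc[g] for g in work})   # ~equal per-core load
```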
A review of the results for GFDL models from the Computational Performance Model Intercomparison Project (CPMIP) raises the question of why the coupling costs appear so extreme, especially in light of tests demonstrating minimal cost for the FMS coupling framework. This technical memorandum seeks to explain the seemingly high cost associated with the FMS coupler in pre-industrial control (piControl) simulations using the GFDL ESM4 Earth system model and two configurations of the CM4 climate model. While this technical note presents results for three specific piControl simulations, the findings are assumed to be general and applicable to the full range of configurations and experiments.
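For context, CPMIP's coupler-cost diagnostic charges to "coupling" whatever share of the whole model's resource use (runtime times processor count) is not accounted for by the component models. A minimal sketch, with invented numbers:

```python
# Minimal sketch of a CPMIP-style coupler-cost diagnostic: the fraction of
# the whole model's resource use (runtime x processors) not attributable to
# the component models themselves. All measurements below are made up.

def coupler_cost(model_time, model_procs, components):
    """components: list of (runtime_seconds, processor_count) per component."""
    whole = model_time * model_procs
    parts = sum(t * p for t, p in components)
    return (whole - parts) / whole

# Hypothetical piControl-like run: whole model vs. ATM, OCN, ICE components.
cost = coupler_cost(model_time=5400.0, model_procs=1024,
                    components=[(5400.0, 576), (5400.0, 384), (5400.0, 32)])
print(f"coupler cost: {cost:.1%}")   # resource share charged to coupling
```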
We present the System for High-resolution prediction on Earth-to-Local Domains (SHiELD), an atmosphere model developed by the Geophysical Fluid Dynamics Laboratory (GFDL) that couples the nonhydrostatic FV3 Dynamical Core to a physics suite originally taken from the Global Forecast System. SHiELD is designed to demonstrate new capabilities within its components, explore new model applications, and answer scientific questions through these new functionalities. A variety of configurations are presented, including short-to-medium-range and subseasonal-to-seasonal prediction, global-to-regional convective-scale hurricane and contiguous U.S. precipitation forecasts, and global cloud-resolving modeling. Advances within SHiELD can be seamlessly transitioned into other Unified Forecast System or FV3-based models, including operational implementations of the Unified Forecast System. Continued development of SHiELD has shown improvement upon existing models. The flagship 13-km SHiELD demonstrates steadily improved large-scale prediction skill and precipitation prediction skill. SHiELD and the coarser-resolution S-SHiELD demonstrate a superior diurnal cycle compared to existing climate models; the latter also demonstrates 28 days of useful prediction skill for the Madden-Julian Oscillation. The global-to-regional nested configurations T-SHiELD (tropical Atlantic) and C-SHiELD (contiguous United States) show significant improvement in hurricane structure from a new tracer advection scheme and promise for medium-range prediction of convective storms.
In this two-part paper, a description is provided of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). This version, with roughly 100 km horizontal resolution and 33 levels in the vertical, contains an aerosol model that generates aerosol fields from emissions and a “light” chemistry mechanism designed to support the aerosol model but with prescribed ozone. In Part I, the quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode – with prescribed sea surface temperatures (SSTs) and sea ice distribution – is described and compared with previous GFDL models and with the CMIP5 archive of AMIP simulations. The model's Cess sensitivity (response in the top-of-atmosphere radiative flux to uniform warming of SSTs) and effective radiative forcing are also presented. In Part II, the model formulation is described more fully and key sensitivities to aspects of the model formulation are discussed, along with the approach to model tuning.
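As an aside on the Cess sensitivity quoted above: it is diagnosed from the change in net top-of-atmosphere flux per degree of uniform SST warming, classically using a +2 K perturbation experiment. A minimal sketch with invented flux values:

```python
# Illustrative Cess-style diagnostic: change in net top-of-atmosphere (TOA)
# radiative flux per degree of uniform SST warming, comparing a control run
# with a +2 K perturbation run. The flux values below are invented.
R_ctl, R_p2K = 0.8, -1.6      # net TOA flux (W/m^2), control vs. SST+2K run
dSST = 2.0                    # uniform SST perturbation (K)
cess_response = (R_p2K - R_ctl) / dSST
print(f"TOA flux response: {cess_response:.2f} W m^-2 K^-1")   # -1.20
```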
In Part II of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part I. Part II provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.
Balaji, V, E Maisonnave, Niki Zadeh, Bryan N Lawrence, Joachim Biercamp, Uwe Fladrich, G Aloisio, Rusty Benson, Arnaud Caubel, Jeffrey W Durachta, M-A Foujols, Grenville Lister, S Mocavero, Seth D Underwood, and Garrett Wright, January 2017: CPMIP: Measurements of Real Computational Performance of Earth System Models. Geoscientific Model Development, 10(1), DOI:10.5194/gmd-10-19-2017. Abstract
A climate model represents a multitude of processes on a variety of time and space scales, a canonical example of multi-physics multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O and/or memory bound. Such weak-scaling, I/O- and memory-bound multi-physics codes present particular challenges to computational performance.
Traditional metrics of computational efficiency such as performance counters and scaling curves do not tell us enough about real sustained performance from climate models on different machines. They also do not provide a satisfactory basis for comparative information across models.
We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure the performance actually attained by Earth system models on different machines, and to identify the most fruitful areas of research and development for performance engineering.
We present results for these measures for a diverse suite of models from several modeling centres, and propose to use these measures as a basis for a CPMIP, a computational performance MIP.
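Two of the headline CPMIP measures, simulated years per day (SYPD) and core-hours per simulated year (CHSY), can be computed from quantities any modeling group already logs. A minimal sketch with illustrative run numbers:

```python
# Two headline CPMIP measures, computed from routinely logged quantities:
# wall-clock time, core count, and simulated years completed.

def sypd(sim_years, wallclock_hours):
    """Simulated Years Per Day: throughput of the whole workflow."""
    return sim_years / (wallclock_hours / 24.0)

def chsy(cores, wallclock_hours, sim_years):
    """Core-Hours per Simulated Year: the resource cost of that throughput."""
    return cores * wallclock_hours / sim_years

# Illustrative example: 10 simulated years in 48 wall-clock hours on 4096 cores.
print(f"SYPD = {sypd(10, 48):.1f}")          # 5.0 simulated years per day
print(f"CHSY = {chsy(4096, 48, 10):.0f}")    # ~19661 core-hours per year
```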
Balaji, V, Rusty Benson, Bruce Wyman, and Isaac M Held, October 2016: Coarse-grained component concurrency in Earth system modeling: parallelizing atmospheric radiative transfer in the GFDL AM3 model using the Flexible Modeling System coupling framework. Geoscientific Model Development, 9(10), DOI:10.5194/gmd-9-3605-2016. Abstract
Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by.
We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath.
We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models.
We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
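The CCC pattern described here, radiation on its own processor set running alongside the rest of the atmosphere with time-lagged field exchange, can be sketched schematically in Python (threads standing in for processor sets, with the physics stubbed out):

```python
# Schematic of coarse-grained component concurrency (CCC): radiation runs on
# its own "processor set" (here, a worker thread) in parallel with the rest
# of the atmosphere, exchanging time-lagged fields once per coupling step.
from concurrent.futures import ThreadPoolExecutor

def radiation(state):                        # slow component, runs concurrently
    return {"heating_rate": 0.1 * state["temperature"]}

def dynamics_and_physics(state, heating):    # everything else, in sequence
    return {"temperature": state["temperature"] + heating["heating_rate"]}

state = {"temperature": 1.0}
heating = {"heating_rate": 0.0}              # no radiative tendency at step 0

with ThreadPoolExecutor(max_workers=1) as pool:
    for step in range(3):
        # launch radiation on this step's state while the rest of the model
        # advances using the (time-lagged) heating from the previous step
        rad_future = pool.submit(radiation, dict(state))
        state = dynamics_and_physics(state, heating)
        heating = rad_future.result()        # synchronize at the step boundary
        print(step, state, heating)
```

The point of the pattern is that the two submodels overlap in wall-clock time at the cost of a one-step lag in the radiative tendency, which is the trade-off the paper's algorithmic discussion addresses.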
We characterize impacts on heat in the ocean climate system from transient ocean mesoscale eddies. Our tool is a suite of centennial-scale 1990 radiatively forced numerical climate simulations from three GFDL coupled models comprising the CM2-O model suite. CM2-O models differ in their ocean resolution: CM2.6 uses a 0.1° ocean grid, CM2.5 uses an intermediate grid with 0.25° spacing, and CM2-1deg uses a nominally 1.0° grid.
Analysis of the ocean heat budget reveals that mesoscale eddies act to transport heat upward in a manner that partially compensates for (offsets) the downward heat transport from the time-mean currents. Stronger vertical eddy heat transport in CM2.6 relative to CM2.5 accounts for the significantly smaller temperature drift in CM2.6. The mesoscale eddy parameterization used in CM2-1deg also imparts an upward heat transport, yet it differs systematically from that found in CM2.6. This analysis points to the fundamental role that ocean mesoscale features play in transient ocean heat uptake. In general, the more accurate simulation found in CM2.6 provides an argument either for including a rich representation of the ocean mesoscale in model simulations of the mean and transient climate, or for employing parameterizations that faithfully reflect the role of eddies in both lateral and vertical heat transport.
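The mean/eddy split underlying this budget analysis is a Reynolds decomposition: the time mean of w*T separates into transport by the time-mean flow plus a transient-eddy residual. A minimal sketch on synthetic data, constructed so the eddy term partially offsets the mean term as in the text:

```python
# Reynolds decomposition of vertical heat transport on synthetic data:
# mean(w*T) = mean(w)*mean(T) + cov(w, T), i.e. mean-flow plus eddy parts.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
w = 0.01 + 0.2 * rng.standard_normal(n)        # vertical velocity (m/s)
T = 10.0 - 0.3 * (w - w.mean()) / w.std()      # T anti-correlated with w

total = np.mean(w * T)                          # time mean of full transport
mean_part = np.mean(w) * np.mean(T)             # transport by time-mean flow
eddy_part = total - mean_part                   # transient-eddy part, w'T'

print(f"mean-flow transport: {mean_part:+.4f}")
print(f"eddy transport:      {eddy_part:+.4f}  (partially offsets the mean)")
```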
Michalakes, John, Mark Govett, and Rusty Benson, et al., April 2015: AVEC Report: NGGPS Level-1 Benchmarks and Software Evaluation, NOAA National Weather Service, DOI:10.25923/zq35-b106, 22pp. Abstract
The Advanced Computing Evaluation Committee (AVEC) was formed in August 2014 to provide a Level-1 technical evaluation of the HPC suitability and readiness of five Next Generation Global Prediction System (NGGPS) candidate models to meet operational forecast requirements at the National Weather Service through 2025-30. This report describes the methodology, cases, model configurations, and results of performance and scalability benchmarks conducted during two sessions on Edison, a 130-thousand-processor-core supercomputer at the U.S. Department of Energy's National Energy Research Scientific Computing Center (NERSC), during March and April 2015. This testing is part of the NGGPS Test Plan.
We present results for simulated climate and climate change from a newly developed high-resolution global climate model (GFDL CM2.5). The GFDL CM2.5 model has an atmospheric resolution of approximately 50 km in the horizontal, with 32 vertical levels. The horizontal resolution in the ocean ranges from 28 km in the tropics to 8 km at high latitudes, with 50 vertical levels. This resolution allows the explicit simulation of some mesoscale eddies in the ocean, particularly at lower latitudes.
We present analyses based on the output of a 280-year control simulation; we also present results based on a 140-year simulation in which atmospheric CO2 increases at 1% per year until doubling after 70 years.
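The quoted doubling time follows directly from compounding at 1% per year:

$$(1.01)^{t} = 2 \quad\Rightarrow\quad t = \frac{\ln 2}{\ln 1.01} \approx 69.7\ \text{years} \approx 70\ \text{years}.$$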
Results are compared to the GFDL CM2.1 climate model, which has somewhat similar physics but coarser resolution. The simulated climate in CM2.5 shows marked improvement over many regions, especially the tropics, including a reduction in the double-ITCZ bias and an improved simulation of ENSO. Regional precipitation features are much improved. The Indian monsoon and Amazonian rainfall are also substantially more realistic in CM2.5.
The response of CM2.5 to a doubling of atmospheric CO2 has many features in common with CM2.1, with some notable differences. For example, rainfall changes over the Mediterranean appear to be tightly linked to topography in CM2.5, in contrast to CM2.1 where the response is more spatially homogeneous. In addition, in CM2.5 the near-surface ocean warms substantially in the high latitudes of the Southern Ocean, in contrast to simulations using CM2.1.