Kiss, A. E., Andrew McC Hogg, N Hannah, Fabio Boeira Dias, G Brassington, Matthew A Chamberlain, C Chapman, Peter Dobrohotoff, Catia M Domingues, E R Duran, Matthew H England, R Fiedler, Stephen M Griffies, A Heerdegen, P Heil, Ryan M Holmes, A Klocker, S J Marsland, Adele K Morrison, J Munroe, P Oke, M Nikurashin, G S Pilo, O Richet, A Savita, P Spence, K D Stewart, and Marshall L Ward et al., February 2020: ACCESS-OM2: A Global Ocean-Sea Ice Model at Three Resolutions. Geoscientific Model Development, 13(2), DOI:10.5194/gmd-13-401-2020. Abstract:
We introduce a new version of the ocean-sea ice implementation of the Australian Community Climate and Earth System Simulator, ACCESS-OM2. The model has been developed with the aim of being aligned as closely as possible with the fully coupled (atmosphere-land-ocean-sea ice) ACCESS-CM2. Importantly, the model is available at three different horizontal resolutions: a coarse resolution (nominally 1° horizontal grid spacing), an eddy-permitting resolution (nominally 0.25°) and an eddy-rich resolution (0.1° with 75 vertical levels), where the eddy-rich model is designed to be incorporated into the Bluelink operational ocean prediction and reanalysis system. The different resolutions have been developed simultaneously, both to allow testing at lower resolutions and to permit comparison across resolutions. In this manuscript, the model is introduced and the individual components are documented. The model performance is evaluated across the three different resolutions, highlighting the relative advantages and disadvantages of running ocean-sea ice models at higher resolution. We find that higher resolution is an advantage in resolving flow through small straits, the structure of western boundary currents and the abyssal overturning cell, but that there is scope for improvements in sub-grid scale parameterisations at the highest resolution.
Yang, Rui, Marshall L Ward, and Ben Evans, April 2020: Parallel I/O in Flexible Modelling System (FMS) and Modular Ocean Model 5 (MOM5). Geoscientific Model Development, 13(4), DOI:10.5194/gmd-13-1885-2020. Abstract:
We present an implementation of parallel I/O in the Modular Ocean Model (MOM), a numerical ocean model used for climate forecasting, and determine its optimal performance over a range of tuning parameters. Our implementation uses the parallel API of the netCDF library, and we investigate the potential bottlenecks associated with the model configuration, netCDF implementation, the underpinning MPI-IO library/implementations and Lustre filesystem. We investigate the performance of a global 0.25° resolution model using 240 and 960 CPUs. The best performance is observed when we limit the number of contiguous I/O domains on each compute node and assign one MPI rank to aggregate and to write the data from each node, while ensuring that all nodes participate in writing this data to our Lustre filesystem. These best-performance configurations are then applied to a higher 0.1° resolution global model using 720 and 1440 CPUs, where we observe even greater performance improvements. In all cases, the tuned parallel I/O implementation achieves much faster write speeds relative to serial single-file I/O, with write speeds up to 60 times faster at higher resolutions. Under the constraints outlined above, we observe that the performance scales as the number of compute nodes and I/O aggregators are increased, ensuring the continued scalability of I/O-intensive MOM5 model runs that will be used in our next-generation higher-resolution simulations.
Ward, Marshall L., June 2019: f90nml - A Python module for Fortran namelists. Journal of Open Source Software, 4(38), 1474, DOI:10.21105/joss.01474.