15th Estuarine and Coastal Modeling Conference (ECM15)

Monday, June 25, 2018
Tuesday, June 26, 2018
Wednesday, June 27, 2018
Seattle University, Seattle, WA

Abstracts

(Plenary) Improving Forecasting through 4D-Var Data Assimilation and Observing System Impacts (or how I learned to stop worrying and love observations)

Brian Powell, University of Hawaii, powellb@hawaii.edu

Operational oceanographic analyses and predictions rely upon near real-time observations to constrain the model to produce the most probable ocean state. Unfortunately, near real-time observations are sparse in space and time, are costly, and may contain errors or capture non-modeled phenomena. These issues are only exacerbated in coastal environments. Modeling systems must represent, rather than replicate, these observations, using a Bayesian approach that constrains model-observation differences through the primitive equations of the ocean. In doing so, we can also determine how each observation covaries with the ocean state and its impact upon the analysis or forecast. With a focus on the state of Hawaii, we present an overview and results from our real-time, coupled, multiple-nested atmospheric/ocean (WRF/ROMS) prediction system. For the ocean, we operate a daily 4D-Var assimilation and observational impact system over nested grids ranging from 4 km to 50 m horizontal resolution. We utilize all available data, including real-time HF radar radial currents, in situ autonomous gliders, Argo floats, and satellite temperature and height data, as well as a variety of research vessels.
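For reference, the cost function minimized in a strong-constraint 4D-Var system of this kind can be written in generic notation (not specific to the Hawaii implementation) as

J(\mathbf{x}_0) = \frac{1}{2}(\mathbf{x}_0 - \mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}_0 - \mathbf{x}_b) + \frac{1}{2}\sum_{i=0}^{N}\big[H_i(\mathcal{M}_{0\to i}(\mathbf{x}_0)) - \mathbf{y}_i\big]^{\mathsf{T}}\mathbf{R}_i^{-1}\big[H_i(\mathcal{M}_{0\to i}(\mathbf{x}_0)) - \mathbf{y}_i\big],

where \mathbf{x}_b is the background state, \mathbf{B} and \mathbf{R}_i are the background and observation error covariances, \mathcal{M}_{0\to i} is the model propagator, and H_i maps the state to the observations \mathbf{y}_i at time i. Observation impacts of the kind described above are then obtained by propagating the sensitivity of a chosen forecast metric back onto each assimilated observation with the adjoint of this system.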

(Plenary) Interactive, Scalable, Data-Proximate Analysis of Model Data in the Cloud

Richard Signell, USGS, rsignell@usgs.gov

Traditionally in the USGS, data is processed and analyzed on local researcher computers, then moved to centralized, remote computers for preservation and publishing (ScienceBase, Pubs Warehouse). This approach requires each researcher to have the necessary hardware and software for processing and analysis, and also to bring all external data required for the workflow over the internet to their local computer. To develop a more efficient and effective scientific workflow, we explored an alternative model: storing scientific data remotely, and performing data analysis and visualization interactively close to the data, using only a local web browser as an interface. Using the Pangeo project infrastructure, we were able to demonstrate huge efficiency gains using these data-proximate, scalable workflows on NSF's XSEDE Jetstream cloud, the Amazon cloud, and the USGS Yeti HPC cluster.
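A minimal sketch of what such a data-proximate workflow looks like with the Pangeo stack (xarray + Dask + Zarr); the dataset path, variable, and dimension names below are illustrative assumptions, not the actual USGS holdings:

    import xarray as xr
    from dask.distributed import Client

    # Start or connect to a Dask cluster colocated with the data (a local
    # cluster here; a multi-node cluster in a typical Pangeo deployment).
    client = Client()

    # Lazily open a cloud-hosted Zarr store; no data are downloaded yet.
    # The bucket path is a placeholder, and s3fs/fsspec is assumed installed.
    ds = xr.open_zarr("s3://example-bucket/coastal-model-output.zarr")

    # Reduce a large archive to a small result next to the data; only the
    # final time-mean surface field returns to the browser session.
    surface_mean = ds["temp"].isel(depth=0).mean(dim="time").compute()
    print(surface_mean)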

(Plenary) ECM Past, Present and Future

Malcolm Spaulding, Spaulding Environmental Associates, LLC, spaulding@uri.edu

The first ECM Meeting was held in Rhode Island in 1989. In the nearly 30 years that have passed, there have been tremendous advances in the capabilities of models, which are now integral to most estuarine and coastal studies. This talk will be a look back at where we came from, where we are now, and outstanding challenges for the future.

(1A) Hydro-morphodynamic Modeling of a Tropical River Delta: Magdalena River

Juan Rueda-Bayona, Universidad Militar Nueva Granada, ruedabayona@gmail.com, Hebert Gonzalo-Rivera, Universidad Militar Nueva Granada, hebert.rivera@unimilitar.edu.co

The Magdalena River, the largest fluvial system in Colombia, flows from south to north and discharges fresh water with a high sediment transport capacity to the Colombian Caribbean coast. Wind, tides, waves, currents, heat fluxes, and river flow variations generate complex transport processes that modulate water properties in the Magdalena River delta. Several researchers have contributed to the understanding of the Magdalena River delta; this research analyzes its transport patterns through multidimensional hydro-morphodynamic modeling at high space-time resolution. The year 2010 exhibited both the minimum and maximum water levels recorded in the Magdalena River prior to 2015, which motivated the present work to analyze temperature, salinity, wave, and sediment dynamics using calibrated and validated numerical results. Three periods (February, June, and October 2010) were analyzed, and saline intrusion, curvature of the river plume due to wind effects, and reduction of the bed level at the river mouth due to accretion processes were identified.

(1A) Investigating controlling factors of estuarine stratification over a tidal inlet

Linlin Cui, Louisiana State University, lcui2@lsu.edu, Haosheng Huang, Louisiana State University, hhuang7@lsu.edu, Chunyan Li, Louisiana State University, cli@lsu.edu

Tidal straining is considered to be a dominant factor affecting vertical stratification. It arises from the interaction between the along-channel salinity gradient and the vertical shear of horizontal velocity at different tidal stages, and it is known to increase stratification during ebb tides while reducing stratification during flood tides. However, this picture is based on idealized assumptions, such as a predominant direction of the tidal currents along which there is a salinity gradient caused by river discharge, and one-dimensionality in the horizontal. As such, it only considers salinity variation in the along-channel direction, while ignoring salinity variation across the section, lateral circulation, and bathymetric variation. Real estuaries are complicated and three-dimensional. At the same time, field observations are sparse both temporally and spatially, so a numerical model is a good tool for examining complicated estuarine dynamics. In this study, we use the high-resolution, 3D Finite-Volume Coastal Ocean Model (FVCOM), covering most of the Alabama-Mississippi-Texas continental shelf, to examine estuarine stratification over an 800 m wide tidal inlet, Barataria Pass of the Barataria Bay Estuary. Simulated water levels, salinity, and velocity are compared with observational data, and the results show that the model can reliably simulate three-dimensional estuarine flows in this system. During ebb tides, the western shoal is fresher than the eastern shoal, the water column is well mixed over most of the ebb period, and the lateral circulation shows a two-layer structure with surface flow moving eastward and bottom flow moving westward. During flood tides, the eastern shoal is fresher than the western shoal, the water column is stratified over most of the flood period, and surface flows converge over the deep channel to develop a two-cell lateral circulation. This stratification pattern is not consistent with traditional tidal straining. We hypothesize that lateral circulation and turbulent mixing play more important roles than tidal straining in setting estuarine stratification in this tidal inlet.
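For reference, the strain-induced contribution to stratification can be written in generic notation (not taken from the study itself) as

\left.\frac{\partial}{\partial t}\!\left(\frac{\partial \rho}{\partial z}\right)\right|_{\mathrm{strain}} = -\,\frac{\partial u}{\partial z}\,\frac{\partial \rho}{\partial x},

i.e., the vertical shear differentially advects the along-channel density gradient; with x positive seaward and z upward, ebb shear acting on the seaward density increase enhances stratification while flood shear erodes it, which is precisely the one-dimensional picture that the lateral processes observed at Barataria Pass complicate.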

(1A) Detailed hydrodynamic feasibility assessment for Leque Island restoration project

Adi Nugraha, Pacific Northwest National Laboratory, adi.nugraha@pnnl.gov, Tarang Khangaonkar, PNNL, Tarang.Khangaonkar@pnnl.gov

Numerous restoration projects are underway in Puget Sound, Washington, USA with the goal of re-establishing intertidal wetlands that were historically lost due to dike construction for flood protection and agricultural development. The projects strive to restore full tidal exchange at the selected project sites and work towards redeveloping tidal marshlands into suitable salmonid rearing habitat. One such effort is the dual restoration effort within the Stillaguamish Delta, which benefits from the cumulative effects of the Leque Island and zis a ba restoration projects. The preferred restoration design calls for removal of perimeter dikes at the two sites and creation of tidal channels to facilitate drainage of tidal flows. The objective of this study was to evaluate the hydrodynamic feasibility of the proposed project. A 3-D high-resolution unstructured-grid coastal ocean model based on FVCOM was developed to evaluate the hydrodynamic response of the estuary to restoration alternatives. A series of hydrodynamic simulations was then performed for baseline pre-restoration conditions and for the preferred project conditions. These included a typical condition from October 2005 and a high-flow condition representing bank-full conditions. A set of parameters was defined to quantify the hydrodynamic response of the nearshore restoration project, such as periodic inundation, suitable currents, and desired habitat/salinity levels. Sediment impacts were also examined, including the potential for excessive erosion or sedimentation requiring maintenance. To quantitatively assess the difference between the preferred restoration design and the current condition, the project simulations were compared with the baseline under different flow conditions. Also, to evaluate the potential for inundation and flooding risk to adjacent properties, the maximum water level near the project site was computed with consideration of extreme high tide, wind-induced storm surge, significant wave height, and future sea-level rise, based on numerical model results and coastal engineering calculations. Simulation results indicate that the preferred alternative scenario provides the desired estuarine response consistent with the planned design. A decrease in velocities and bed shear in the main river channels was noted for the restored condition, associated with the increased inundation of the tidal flat area and reduced exchange flows through the main channels. High bed shear near the restored tidal channel mouths indicates that the mouths may evolve in size until equilibrium is established. Feasibility assessment with a high-resolution 3-D hydrodynamic model, coupled with wave analysis using coastal engineering principles, was successful in helping restoration planning, design, selection, consensus building, and action in the Stillaguamish Delta.

(1B) Community Resilience Planning for Coastal Flooding under Climate Change: Development and Application of a Coupled Model System

Christine M. Szpilka, University of Oklahoma, cmszpilka@ou.edu, Kendra M. Dresback, University of Oklahoma; NIST Center, dresback@ou.edu, Xianwu Xue, University of Oklahoma; NIST Center, xuexianwu@ou.edu, Jia Xu, Dalian University of Technology, jia.xu-1@ou.edu, Naiyu Wang, University of Oklahoma; NIST Center, naiyu.wang@ou.edu, Randall L. Kolar, University of Oklahoma, kolar@ou.edu, Kevin M. Geoghegan, Northwest Hydraulic Consultants, keving@ou.edu

Resilience planning for coastal communities requires hazard-modeling technologies that can capture the impact of climate change on the frequency of occurrence and intensity of a hazard, as well as provide hazard demand parameters to support resilience assessment at community or regional scales. Over the last several years, a coupled modeling system has been developed that couples hurricane tracking, wind field, precipitation, hydrologic, wave, and hydrodynamic models to predict the total water level (tides + waves + surge + runoff) due to a hurricane event. Currently, the first stage of the system is set up to evaluate the short-term hazard impact associated with hurricanes, not the long-term hazard impact on coastal communities due to climate change. The greenhouse gas emission and radiative forcing scenarios identified by the Intergovernmental Panel on Climate Change (IPCC) are considered through a suite of global climate model ensembles, downscaled to provide sea surface temperature and local environmental conditions for a semi-physics-based hurricane model that predicts the track and intensity of the storm. This information is then used to generate the wind field forcing for the coupled modeling system and the rainfall track for the precipitation model. A parametric rainfall model (P-CLIPER) is used to produce hourly averaged rainfall along the track, which is then fed into the Coupled Routing and Excess STorage hydrology model (CREST), which routes the rainfall into the various channels of the watershed. Streamflows from the CREST model and wind fields from the hurricane model are then fed into the coupled ADCIRC+SWAN hydrodynamic-plus-wave model, from which we obtain spatial and temporal hazard demands, i.e., wind speed, surge height, wave height, and coastal and inland inundation, for a coastal community during a simulated hurricane scenario. A brief description of the modeling framework will be provided and then results from the coupled modeling system will be discussed in further detail.

(1B) Integrating Hydrologic and Storm Surge Simulation to Improve Evaluation of Flooding Risk in Amite River Basin

Shu Gao, Louisiana State University, sgao7@lsu.edu, Matthew V. Bilskie, LSU Center for Coastal Resiliency, mbilsk3@lsu.edu, Scott C. Hagen, LSU Center for Coastal Resiliency, shagen@lsu.edu

Riverine and coastal floods are among the most common environmental hazards and affect millions of people around the world. For example, in August 2016, a slow-moving upper-level low-pressure system with a high amount of atmospheric moisture brought heavy rains from August 11 to August 13. Some of the most serious flooding occurred along the Amite River, which runs between Baton Rouge and the nearby city of Denham Springs, has its headwaters in southwestern Mississippi, and drains into Lake Maurepas. In addition, portions of the lower Amite River watershed are susceptible to coastal storm surge. To develop an improved understanding of the driving mechanisms that can cause flooding within the watershed, an improved coupling of hydrologic and coastal storm surge simulation is required for the Amite River basin. First, a Soil and Water Assessment Tool (SWAT) model was developed to simulate rainfall runoff for the Amite River basin. The model was calibrated with daily discharges from 2003 to 2009 and validated from 2010 to 2016. A Support Vector Machine (SVM) rating curve with a radial basis function (RBF) kernel was developed to calculate river stage from observed discharges for the Amite River. The SVM method was compared with widely used logarithmic and higher-order polynomial fitting methods, and the results indicate that it is better suited for extrapolation. Results from the SWAT model were translated to river stages and used as an initial condition for the ADvanced CIRCulation (ADCIRC) coastal inundation model to simulate flood extent and depth. In addition, since simulation of flooding is dependent on an accurate digital elevation model, vertical features such as roadbeds, levees, and railroads are considered in the development of the ADCIRC unstructured mesh in order to improve the model performance across the watershed. To minimize the elevation error of mesh nodes and to represent the earth surface accurately, significant vertical features will be extracted from a high-resolution LiDAR-DEM. The linking of the SWAT and ADCIRC models will provide further insight into the conceptualization of flood risk across river deltas and other low-gradient coastal regions that are vulnerable to both riverine and coastal flooding.
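A minimal sketch of an RBF-kernel support vector regression rating curve (discharge to stage) in the spirit described above; the synthetic data and hyperparameters are illustrative assumptions, not the values used for the Amite River:

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    discharge = rng.uniform(50.0, 3000.0, size=200)                    # m^3/s (synthetic)
    stage = 2.0 + 1.5 * np.log(discharge) + rng.normal(0, 0.1, 200)    # m (synthetic)

    # Scale the input, then fit an RBF-kernel support vector regressor.
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
    model.fit(discharge.reshape(-1, 1), stage)

    # Predict stage for a discharge beyond the calibration range, mimicking
    # the extrapolation use case discussed in the abstract.
    print(model.predict(np.array([[4000.0]])))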

(1B) Total Water Level Prediction for Compound Inland and Coastal Flooding

Edward Myers, NOAA/NOS/OCS/Coast Survey Development Laboratory, Edward.Myers@noaa.gov, E.J. Van Den Ameele, NOAA/NOS/OCS/CSDL, edward.j.vandenameele@noaa.gov, Saeed Moghimi, UCAR; NOAA/NOS/OCS/CSDL, saeed.moghimi@noaa.gov, Sergey Vinogradov, NOAA/NOS/OCS/CSDL, sergey.vinogradov@noaa.gov, Lei Shi, NOAA/NOS/OCS/CSDL, l.shi@noaa.gov, Ali Abdolali, NOAA/NWS/NCEP/EMC, ali.abdolali@noaa.gov, Andre Van der Westhuysen, NOAA/NWS/NCEP/EMC, andre.vanderwesthuysen@noaa.gov, Zaizhong Ma, NOAA/NWS/NCEP, zaizhong.ma@noaa.gov, Arun Chawla, NOAA/NWS/NCEP/EMC, arun.chawla@noaa.gov, Audra Luscher, NOAA/NOS/CO-OPS, audra.luscher@noaa.gov, Patrick Burke, NOAA/NOS/CO-OPS, pat.burke@noaa.gov

One of the main goals of the NOAA Water Initiative is to revolutionize water modeling and forecasting to provide an accurate representation of total water level propagation up and downstream in coastal and estuarine environments, particularly during storm events. The US Consumer Option for an Alternative System to Allocate Losses (COASTAL) Act also requires that NOAA provide accurate hindcast modeling of flooding and wind caused by hurricanes soon after the event. However, compound flooding and its ultimate inundation are the result of complicated interactions among wind-generated waves, coastal storm surge, tides, and inland flooding. To address the above-mentioned requirements, a coupled modeling framework is being developed based on ESMF/NUOPC technology under a common modeling framework, the NOAA Environmental Modeling System (NEMS), to enable flexible model coupling in compound inland and coastal flooding studies. The framework is essentially a wrapper around atmospheric, wave, inland flooding, and storm surge models that enables them to communicate seamlessly and run efficiently in massively parallel environments. The coupling strategy provides dynamic interaction between the wave (WAVEWATCH III), inland hydrology (National Water Model), and storm surge (ADCIRC) models. Here we will present our latest results for selected hurricanes in the Gulf of Mexico and Atlantic Ocean coastal areas.

(2A) Development of a 3-D temperature model for Hells Canyon Reservoir: A model application study

Adi Nugraha, Pacific Northwest National Laboratory, adi.nugraha@pnnl.gov, Tarang Khangaonkar, PNNL, Tarang.Khangaonkar@pnnl.gov, Steve Brink, Idaho Power Company, Jesse Naymik, Idaho Power Company

The Hells Canyon Complex on the Snake River is a hydropower project consisting of three dams and associated reservoirs: Brownlee, Oxbow, and Hells Canyon. This presentation focuses on the Hells Canyon reservoir, where hydropower operations result in the formation of a strong thermocline during the summer months, producing high temperatures in the surface layers and in the water discharged downstream. Temperatures exceed the 17.8 °C aquatic life and 13 °C spawning criteria for waters downstream. However, monitoring data show that a large volume of cold water remains stored below the thermocline, while warmer surface-layer waters flow directly into the powerhouse turbines. The feasibility of designing a temperature-management structure is being evaluated using 3-D hydrodynamic models, with the goal of utilizing some of the stored cold water to modulate the temperature of the discharged water. Typically, short-duration simulations associated with hydropower projects are conducted using commercially available computational fluid dynamics codes. However, the nature of this project dictated the use of free-surface models that would allow us to simulate year-long hydrodynamics and heat balance, capturing the annual formation of the observed stratified conditions in the summer. For this purpose, we employed the 3-D unstructured-grid, finite-volume, parallel coastal ocean model SUNTANS. While this coastal ocean model has been tested extensively for stratified flows in estuaries, its ability to maintain stratified conditions in deep, narrow reservoirs with high flow has not been tested. The model was calibrated and validated using flow and temperature data from years 2013 and 2014, respectively. The results show good agreement between model results and observed data for both years; absolute mean error and root mean squared error values were less than 1 °C at most stations. Simulated velocities also generally matched the observed velocities well in a qualitative comparison. By comparing the two hydrographic years, we found that the thermal stratification is controlled not only by the net atmospheric heat flux but also by the relative temperature difference of the inflow, which induces density-driven flows. This study demonstrated that SUNTANS is capable of reproducing the thermal stratification processes in the reservoir to a high level of satisfaction.

(2A) Non-Traditional Haline Circulation in Newtown Creek, NY: Hydrodynamic Modeling and Data Analysis

Nicholas Kim, HDR, Inc, Nicholas.Kim@hdrinc.com, Stephen C. Ertman, HDR, Inc., Stephen.Ertman@hdrinc.com, Robin Landeck Miller, HDR, Inc., Robin.Miller@hdrinc.com

The development, calibration/validation, and application of a three-dimensional hydrodynamic model for Newtown Creek has demonstrated that the downstream reaches of Newtown Creek experience non-traditional haline circulation. Newtown Creek is an urban tidal creek bordering Queens and Kings Counties in New York City and is a tributary to the lower East River. Salinity at the mouth of Newtown Creek is highly variable due to the tidal exchange between Long Island Sound and Upper New York Bay that occurs in the East River. Often, when lower East River salinity is depressed during high Hudson River flow events, alternating relatively high-salinity water from Long Island Sound and low-salinity water from Upper New York Bay create a reverse salinity gradient along the axis of Newtown Creek during high tides, with notably lower salinity at the Creek mouth and higher salinity elsewhere in the Creek. This salinity phenomenon creates reverse estuarine circulation inside Newtown Creek (i.e., landward flows near the surface and seaward currents near the bottom) and often creates multi-layer reversing currents inside Newtown Creek during intra-tidal and sub-tidal circulation patterns. In contrast, traditional haline circulation exhibits relatively higher-density water, mostly induced by high-salinity water, at the downstream end and relatively low-density water at upstream locations. This longitudinal difference in density structure creates 'traditional' landward (upstream) movement of water near the bottom and seaward (downstream) movement near the surface. The three-dimensional hydrodynamic model developed for the Creek was calibrated with data from an array of six ADCP moorings covering more than two years concurrently, as well as from 18 in-situ temperature (15 months) and salinity (3 months) recorders. The hydrodynamic model includes freshwater inflows from groundwater and the watershed calculated by the City with calibrated models. For example, for a future planning projection based on 2008 rainfall, the calculated freshwater inflows to Newtown Creek are 1800 MGY from groundwater and 3743 MGY from the watershed. Hydrodynamic model simulations and the observed currents indicate that the relatively lower-salinity water mass at the mouth of the Creek during the above-mentioned events hinders the exchange of water between the East River and Newtown Creek and subsequently creates distinctive recirculation cells inside Newtown Creek. Water age computations for Newtown Creek suggest that it takes about 5-7 days to replace Newtown Creek water from the mouth of Newtown Creek to the Turning Basin near Maspeth Creek. A much longer time is needed to replace waters above the Turning Basin in English Kills (~10 days or longer). Acknowledgements: The work summarized herein was supported by the New York City Department of Environmental Protection under the leadership of Ron Weissbard and Dabeiba Marulanda and was performed under subcontract to Louis Berger and Associates, P.C. The work was performed to support the City's technical evaluation of CSO controls and the ongoing Remedial Investigation and Feasibility Study in Newtown Creek.

(2B) Monte Carlo synthesis of coastal flood hazards and probabilistic sea level rise projections for the Maine coast

Nathan Dill, Ransom Consulting, Inc., nathan.dill@ransomenv.com, Scott Hayward, Ransom Consulting, Inc., scott.hayward@ransomenv.com

Long-term planning for coastal development requires accurate flood hazard information to assess present and future coastal flooding risk. Coastal flood hazards related to storm surge and wave action depend, to some degree, on the mean sea level. As mean sea level changes in the future, coastal flood risks are expected to increase. Common approaches for evaluating future coastal flood hazards combine present hazard estimates with a set of discrete sea level rise scenarios that bound a range of possible futures. While scenario-based approaches allow decision makers to identify robust solutions that will work over a range of highly uncertain future conditions, they do not consider the variable likelihood of different scenarios. This may inadvertently cause decision makers to put too much weight on an unlikely scenario, or too little weight on a likely one. The scenario-based approach is thus limited in its utility for risk assessments, because any precise assessment of risk will be tied to a particular scenario with an unquantified degree of uncertainty. Furthermore, the combination of present hazard information with a discrete sea level scenario often assumes a linear combination of the flood elevation and the sea level rise projection. This simplifying assumption neglects possible non-linear interactions between changes in the mean sea level and the flood hazards. Less obviously, and perhaps more importantly, linear assumptions may neglect the fact that more extreme hazard projections should increase at a greater rate than less extreme projections, because of greater uncertainty in extreme value estimates and longer-term sea level rise projections. This presentation explores the use of a Monte Carlo synthesis to improve upon scenario-based guidance. High-fidelity numerical modeling of coastal flood processes is used to develop flood hazard curves and evaluate non-linear sea level rise interactions for areas along the Maine coastline. Local mean sea level change is treated as a non-stationary random process based on recently available, localized, probabilistic sea level rise projections. Flood hazard information, sea level rise projections, and epistemic uncertainty, including contributions from non-linear interactions, are then combined through Monte Carlo simulation to yield flood hazard curves for future decades out to 2148. This adds value to the scenario-based approach by appropriately weighting the full range of possible sea level rise scenarios within each future hazard projection. Decision makers can then confidently consider the changing hazard within a risk-informed decision-making framework without having to select or question the likelihood of discrete sea level rise scenarios.
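A minimal sketch of the Monte Carlo synthesis idea, combining a present-day storm tide distribution, a probabilistic SLR projection, and an interaction/uncertainty term into a future exceedance probability; all distributions and parameter values are illustrative assumptions, not those developed for the Maine coast:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Present-day annual-maximum storm tide (m above today's MSL); the tail is
    # approximated here with a Gumbel distribution (assumed parameters).
    storm_tide = rng.gumbel(loc=2.0, scale=0.35, size=n)

    # Probabilistic sea level rise by some future decade (m), assumed lognormal.
    slr = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)

    # Epistemic term standing in for non-linear surge/SLR interaction (assumed).
    interaction = rng.normal(loc=0.0, scale=0.05, size=n)

    future_level = storm_tide + slr + interaction

    threshold = 3.0  # flood elevation of interest (m)
    print(f"P(annual max > {threshold} m) = {np.mean(future_level > threshold):.3f}")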

(2B) Predicting Risk in a Changing Climate

Nickitas Georgas, Jupiter, nickitas.georgas@jupiterintel.com

Government agencies, asset owners, planners, developers and investors are increasingly recognizing the need to incorporate climate data into risk modeling for specific assets. Catastrophic risk modeling most often employs models that project the future based on past statistics with the assumption that the climate is not changing. This approach is flawed in a dynamic environment that is continually shaped by changes to built and natural landscapes. Similarly, climate panels at the international, national, state and metropolitan levels use inconsistent methodologies, validation approaches, and metrics that make it nearly impossible for the private sector to use them without extensive custom work. Today’s decision-makers need data that reflect ongoing change and provide accurate predictions. With the right information, they can make more informed decisions in areas such as building placement and design, insurance, zoning and building codes. The right decisions improve safety and reduce risks to critical infrastructure. In early 2017, Jupiter was born. Jupiter’s founders believe that by incorporating every relevant factor in an integrated, dynamic model, they can deliver a risk-focused solution with accurate, actionable information, and that this approach can be designed to efficiently scale in the cloud. The talk will introduce Jupiter to the ECM community.

(2B) Storm surge propagation in small tidal rivers during storms of mixed coastal and fluvial influence

Liv Herdman, USGS, lherdman@usgs.gov, Li Erikson, USGS, lerikson@usgs.gov, Patrick Barnard, USGS, pbarnard@usgs.gov

San Francisco Bay is a highly urbanized estuary, and the surrounding communities are susceptible to flooding along the bay shoreline and along the inland rivers and creeks that drain to the Bay. As part of developing a forecast model that integrates fluvial and oceanic drivers, a case study of the Napa River and its interactions with the Bay was performed. This case study introduces coupling of the state-of-the-art USGS Coastal Storm Modeling System (CoSMoS) with the National Weather Service Research Distributed Hydrologic Model (RDHM) for San Francisco Bay. For this application, we utilize Delft3D-FM, a hydrodynamic model based on a flexible mesh grid, to calculate water levels that account for tidal forcing, seasonal water level anomalies, and surge and in-Bay generated wind waves derived from the 10-km resolution wind and pressure weather model CaRD10. The flooding extent is determined by overlaying the resulting maximum water levels onto a recently updated 2-m digital elevation model of the study area, which resolves the extensive levee and tidal marsh systems in the region. We simulated a series of storms with varying storm surge, river discharge, and tidal forcing in both the realistic Napa River drainage and an idealized geometry. With these scenarios we show how the lateral extent, vertical level, and duration of flooding depend on these atmospheric and hydrologic parameters. Our model indicates that maximum water levels occur in a tidal river when high tides, storm surge, and large fluvial discharge events are coincident. Large tidal amplitudes diminish the storm surge propagation upstream. Phasing between peak fluvial discharges and high tide is important for predicting when and where the highest water levels will occur. The complicated non-linear interactions between tides, storm surge, and discharge make it difficult to predict the maximum storm water level without a coupled hydraulic model. Model results show that even small coastal storms can greatly increase the duration of elevated upstream water levels. The interactions between tides, river discharge, and storm surge are not simple, indicating the need for more integrated flood forecasting models in the future.

(3A) Water Quality Model Calibration via a Full-Factorial Analysis of Algal Growth Kinetic Parameters

James Bowen, Civil and Environmental Engineering Dept., UNC Charlotte, jdbowen@uncc.edu, Noyes B. Harrigan, WK Dickson & Co., Inc., nharrigan@wkdickson.com

The two-dimensional, laterally-averaged mechanistic eutrophication model CE-QUAL-W2 version 3.72 was used to predict chlorophyll-a concentrations across two different time periods in the Neuse River Estuary, North Carolina. Chlorophyll calibration was performed for the two time periods simultaneously through a full-factorial experiment that tested seven algal kinetic growth parameters over three levels for a single algal group. A cluster of up to six computers, each running between two and ten instances of the program, was used to complete and manage the data for 2187 runs for each time period. A set of six criteria was used to determine which runs performed acceptably, yielding a group of 27 cases that met all of the criteria. Calibration performance of this set of cases outperformed a previous model using three algal groups, which met only four of the six selection criteria. Calibration performed this way allowed for a more rational specification of model calibration performance and provided uncertainty estimates of model predictions, albeit at the cost of a considerable increase in computational requirements that necessitated the use of a computer cluster.
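A minimal sketch of how such a three-level, seven-parameter full-factorial design (3^7 = 2187 combinations per period) can be enumerated; the parameter names and level values below are hypothetical placeholders rather than actual CE-QUAL-W2 inputs:

    from itertools import product

    levels = {
        "max_growth_rate":   [1.0, 2.0, 3.0],
        "respiration_rate":  [0.02, 0.04, 0.08],
        "settling_velocity": [0.05, 0.10, 0.20],
        "light_saturation":  [50, 100, 150],
        "n_half_saturation": [0.01, 0.03, 0.05],
        "p_half_saturation": [0.001, 0.003, 0.005],
        "mortality_rate":    [0.01, 0.05, 0.10],
    }

    # Enumerate every combination of the seven three-level parameters.
    runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
    assert len(runs) == 3 ** 7  # 2187 model runs per time period

    # Each dict in `runs` would then be written into a model control file and
    # dispatched to one of the cluster instances described in the abstract.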

(3A) Development of Plug-In Water Quality Libraries for Two and Three Dimensional Estuarine and Coastal Models

Dan Rucinski, LimnoTech, USACE Environmental Lab, drucinski@limno.com, Zhonglong Zhang, LimnoTech, USACE Environmental Lab., zhonglong.zhang@erdc.dren.mil, Billy Johnson, USACE Environmental Lab., Billy.E.Johnson@erdc.dren.mil

A set of plug-in water quality libraries has been developed at the Environmental Laboratory of the U.S. Army Corps of Engineers Engineer Research and Development Center (ERDC-EL). The water quality libraries include a water temperature simulation module (TEMP), a general constituent simulation module (GC), nutrient simulation modules (NSMI and NSMII), a contaminant simulation module (CSM), and a mercury simulation module (HgSM). Each module simulates a certain range of water quality constituents in aquatic environments. NSMI and NSMII model nutrients and eutrophication processes in the water column using 16 and 24 state variables, respectively. CSM models any user-defined contaminants. Kinetic processes modeled in CSM include ionization, multi-phase partitioning, degradation, photolysis, hydrolysis, volatilization, generalized second-order reaction, and transformations in which one chemical species undergoes a reaction and is transformed into a daughter product. HgSM models mercury species (elemental mercury, inorganic mercury, and methylmercury) and their cycling in aquatic systems. The water quality cells consist of both an aquatic element, representing the water column, and a bed sediment element, representing a biologically active sediment layer beneath the water column. The two elements are distinct and interact with each other across a sediment-water interface. Kinetics for the water column and active sediment layer are fully coupled within each module. The new plug-in framework allows dynamically linked water quality libraries to be developed independently from the computation engines, enabling ERDC-EL and others to integrate water quality libraries into a variety of hydrodynamic models. The framework allows the user to select one or more libraries and their state variables, edit default parameters, enable/disable pathways among constituents, and control model outputs. The water quality libraries are being integrated into the two- and three-dimensional (2D/3D) Adaptive Hydraulics (AdH) models for performing water quality analysis. AdH is a state-of-the-art hydrodynamic modeling system developed by ERDC-CHL and has been used to model sediment transport in sections of the Mississippi River, tidal conditions in southern California, and tidal propagation in the Lower Columbia River estuary.
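Purely as an illustration of the plug-in idea (the class and method names below are invented for this sketch and do not reflect the actual ERDC-EL library interface), a kinetics module decoupled from the transport engine might be structured as follows:

    from abc import ABC, abstractmethod

    class KineticsModule(ABC):
        """Hypothetical interface the transport engine calls each time step."""

        @abstractmethod
        def state_variables(self):
            """Names of the constituents this module adds to the transport."""

        @abstractmethod
        def source_sink(self, cell_state, dt):
            """Return the change in each state variable over time step dt."""

    class ToyTemperatureModule(KineticsModule):
        """Toy stand-in for a TEMP-like library (illustrative only)."""

        def state_variables(self):
            return ["water_temperature"]

        def source_sink(self, cell_state, dt):
            # Relax toward an equilibrium temperature (toy kinetics only).
            k = 0.1  # 1/day, illustrative
            t_eq = cell_state["equilibrium_temperature"]
            t = cell_state["water_temperature"]
            return {"water_temperature": k * (t_eq - t) * dt}

    # The hydrodynamic engine would advect and diffuse each state variable and
    # call source_sink() for every water-quality cell and time step.
    module = ToyTemperatureModule()
    print(module.source_sink(
        {"water_temperature": 24.0, "equilibrium_temperature": 20.0}, dt=0.5))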

(3A) Simulating Phytoplankton and Assessing the impact of uniqueness of water quality model kinetic parameters on nutrient reduction in the tidal James River, Virginia, USA

Jian Shen, Virginia Institute of Marine Science, College of William & Mary, shen@vims.edu

A eutrophication model has been applied to the James River, a western tributary of the Chesapeake Bay, to assist the Virginia Department of Environmental Quality in addressing phytoplankton attainment in the estuary. The model has been well calibrated over more than 20 years and is capable of supporting TMDL development for nutrient reductions. For a water quality model addressing complex biochemical processes, the accuracy of model simulations depends highly on the formulation of the biochemical processes and the kinetic parameters used in the model. As there is no unique solution for kinetic parameters and high correlations exist among parameters, the uncertainties of model simulations are difficult to quantify. It is unknown whether the model response to a change in environmental conditions, such as a reduction of nutrients, will differ when conducting model scenario simulations with different sets of kinetic parameters. In this study, we investigated the impact of the uniqueness of model parameters on model performance and model uncertainty. Multiple model simulations were conducted. Correlations between model parameters, the uniqueness of model parameters, and different formulations for algal growth are investigated. It is found that model calibration based on statistics for chlorophyll-a is not sufficient to evaluate model accuracy or model skill. Different sets of model parameters can often yield similar statistical skill scores, while a large uncertainty can be associated with the model response to changes in environmental conditions. The use of multiple model calibration results to improve the model's predictive skill is also discussed.

(3B) Incorporating Rainfall into Storm Surge Prediction for Hurricane Irma using WRF and CaMEL

Kyra Bryant, Tennessee State University, kmb5482@yahoo.com, Abdullah Alghamdi, Tennessee State University, aalgham9@my.tnstate.edu, Abram Musinguzi, Tennessee State University, mailtoabram@gmail.com, Muhammad Akbar, Tennessee State University, makbar@tnstate.edu

As increasing sea surface temperatures pave the way for more powerful hurricanes, and population growth remains unwavering in low-elevation coastal zones, the time is certainly ripe for accurate hurricane storm surge prediction. Emergency management officials need a reliable model to properly minimize loss of life, which also benefits authorities in preventing and limiting risks when designing coastal protection structures. Since hurricanes cannot be tamed, an accurate model, capable of predicting an impending hurricane's storm surge, is absolutely necessary to ensure effective evacuation. A reliable model successfully portrays each parameter associated with hurricane storm surge. This study explores the effects of incorporating rainfall in the Computation and Modeling Engineering Laboratory (CaMEL) storm surge model. The lateral boundary inputs (e.g., river inputs) and a rain source term in the conservation equations are vital in storm surge modeling. As witnessed with Hurricane Harvey, a hurricane's impact extends far beyond the wind-based Saffir-Simpson scale: its most powerful havoc occurred at its weakest moment as a storm, when Tropical Storm Harvey's rainfall caused widespread flooding as it stalled over southeast Texas. This demonstrates that fresh water can play a significant role in flooding, yet this contribution is often partially or completely ignored in hurricane storm surge modeling programs and practices. An effective study of Hurricane Harvey requires an extensive inland mesh, which is currently unavailable. Therefore, Hurricane Irma is simulated retrospectively to understand the rainfall contribution to overall inundation. This study hindcasts Hurricane Irma by employing rainfall, wind speed, and pressure input data from the Weather Research and Forecasting (WRF) numerical weather prediction system in CaMEL. First, the hindcast is performed without a rainfall input in both CaMEL and ADvanced CIRCulation (ADCIRC) for validation. Second, Hurricane Irma is simulated in CaMEL applying a rainfall input from WRF. All three cases are compared to observational data collected from various NOAA stations along the Puerto Rico and Florida coasts. This study will pave the way for investigating rainfall in other hurricanes.
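For context, a rain source term of the kind described here typically enters the depth-averaged continuity equation as a volume source per unit area (generic notation; this is an assumed form, not necessarily the exact CaMEL formulation):

\frac{\partial \eta}{\partial t} + \frac{\partial (UH)}{\partial x} + \frac{\partial (VH)}{\partial y} = R - I,

where \eta is the free-surface elevation, H the total water depth, (U, V) the depth-averaged velocities, R the rainfall rate, and I an optional infiltration or loss rate.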

(3B) Development of Probabilistic Extra Tropical Storm Surge (P-ETSS) Past, Present and Future

Arthur Taylor, NOAA / NWS / OSTI / Meteorological Development Lab, arthur.taylor@noaa.gov, Huiqing Liu, NOAA / NWS / OSTI / MDL, Huiqing.Liu@noaa.gov

The National Weather Service (NWS) has developed a tropical storm surge warning and a tropical storm surge inundation graphic. Combined, they depict the threat of tropical storm surge overland, and the primary guidance for both comes from the probabilistic tropical cyclone storm surge (P-Surge) model (Taylor et al., 2008). To provide consistent service regardless of cause, NWS is interested in an extra-tropical storm surge warning and an inundation graphic. Much of the forecasting and collaboration infrastructure for tropical storm surge products can be leveraged for extra-tropical products; however, what is missing is comparable guidance. To resolve that, in 2015 the NWS's Meteorological Development Lab (MDL) began developing the Probabilistic Extra-Tropical Storm Surge model (P-ETSS). To do so, we needed to apply several modeling advances that had been made to our tropical storm surge modeling system to our extra-tropical one. Specifically, those included: 1) the introduction of a wetting/drying algorithm, 2) the incorporation of various tide methodologies, 3) the ability to inundate due to surge and tide, and 4) the application of real-time ensembles to account for forecast uncertainty. Along the way, item 1 was enhanced by developing the ability to nest narrow, high-resolution overland basins within broader but coarser basins, and item 2 was improved by adding a new tide methodology. However, the main difference between P-ETSS and P-Surge is in item 4, in that P-ETSS is based on the 21 members of the Global Ensemble Forecast System, while P-Surge is based on utilizing a parametric hurricane wind model to sample 5-year climatological error spaces around the National Hurricane Center's official forecast. The initial operating capability for P-ETSS was established in December 2017, when it was successfully implemented within the National Centers for Environmental Prediction's (NCEP) operational computer system. This paper explores the efforts of the last two years, provides validation results, and discusses future plans.
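For context, the gridded exceedance guidance from such an ensemble is, in its simplest generic form (not necessarily the exact P-ETSS post-processing), the fraction of the N members whose surge-plus-tide water level \zeta_k exceeds a threshold \zeta_0:

P(\zeta > \zeta_0) \approx \frac{1}{N}\sum_{k=1}^{N} \mathbf{1}\left[\zeta_k > \zeta_0\right],

with N = 21 for the GEFS-driven P-ETSS described here, whereas P-Surge instead draws its members from the sampled track, size, and intensity error space around the official forecast.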

(3B) Development of a Computationally Efficient Unstructured Mesh For Use in Real-Time Forecasting of Hurricane Storm Surge

Matthew Bilskie, Louisiana State University, mbilsk3@lsu.edu, Scott C Hagen, Louisiana State University, shagen@lsu.edu, Shu Gao, Louisiana State University, sgao7@lsu.edu

Unstructured finite element meshes used in physics-based numerical modeling of astronomic tides and hurricane storm surge have increased in domain size and mesh node density as computational power and remote sensing technology have improved. Such models have been progressively used for coastal flood hazard studies, the design of flood protection infrastructure, assessing coastal impacts under future climates and, in particular, real-time storm surge forecasts during an imminent hurricane event. In a real-time modeling environment, simulations and post-processing must occur rapidly in order to provide useful and usable information to emergency personnel and local officials prior to the actual threat. Therefore, a balance must be met between a model's mesh resolution (including time-step criteria) and the available high performance computing (HPC) resources. This project addresses the issue by employing two a posteriori methods to compute local target element sizes and efficiently arrange mesh nodes and elements: localized truncation error analysis (LTEA) and mesh simplification/decimation. Specifically, LTEA is used in the offshore (always wetted) regions of the model domain, and mesh simplification is employed across the coastal landscape, where local topographic elevations and gradients are high. LTEA and mesh simplification techniques were performed on a previously developed high-resolution, research-grade, unstructured finite element mesh that contains 5.5 million nodes and spans the western North Atlantic Ocean, Caribbean Sea, and the Gulf of Mexico, with focus on the Mississippi, Alabama, and Florida panhandle coastal floodplain. The resulting forecast-grade mesh contains 2.1 million nodes and runs in 11 minutes per day of surge simulation, as compared to 27 minutes per day of surge simulation, across 1,020 computational cores. The 60% reduction in simulation time allows a typical 5-day surge forecast to be run in less than 60 minutes, as compared to over 120 minutes with the original research-grade model. Furthermore, the resulting peak water levels and timing of peak surge from the forecast-grade model closely match those of the research-grade model. Lastly, the new model was successfully tested and implemented during Hurricanes Irma and Nate (2017) in a real-time forecasting framework.

(4A) High-resolution water quality model in the urban tidal freshwater Delaware River

Kinman Leung, Philadelphia Water Department, kinman.leung@phila.gov, Phil Duzinski, Philadelphia Water Department, phil.duzinski@phila.gov, Ramona McCullough, Sci-Tek Consultants, Inc., rmccullough@scitekanswers.com, Paula Kulis, CDM Smith Inc., kulisps@cdmsmith.com, Eileen Althouse, CDM Smith Inc., althouseEM@cdmsmith.com, William Bezts, CDM Smith Inc., beztsWM@cdmsmith.com, Rui Zou, Tetra Tech, Inc., rui.zou@tetratech.com

A numerical model of the tidal freshwater Delaware River was developed for the Philadelphia Green City, Clean Waters program. The model was applied to simulate in-stream concentrations of bacteria and dissolved oxygen (DO) in the Delaware River between Trenton and Delaware City. The USEPA Environmental Fluid Dynamics Code (EFDC) was used for modeling hydrodynamics and water quality. The model was validated from April to October of 2012 and 2013. Loadings of carbon, nitrogen, phosphorus, DO, algae, and fecal coliform bacteria from tributaries and from municipal and industrial discharges were all considered in model development. Meteorological data were used to achieve accurate representations of water temperature, wind, and solar radiation. An extensive database of water quality data, comprising 175,370 observations, was compiled from multiple agencies for comparison to model output. Continuous DO data at six sites along the mainstem Delaware River and Philadelphia tributaries were used for high-frequency comparison of simulated and observed DO concentrations. A sensitivity analysis was conducted to identify the key global and spatially variable rate constants. Spatially variable constants were parameterized with the aid of extensive measurements of nitrification rates, sediment oxygen demand, and benthic nutrient fluxes. Time series plots, CDF plots, box plots, along-channel plots, target diagrams, and error statistics were used to evaluate water quality model performance. This paper demonstrates the use of comprehensive data to understand biochemical processes and in turn enhance water quality modeling in an ecologically and socially important estuary.

(4A) Modeling biophysical controls on hypoxia for the Neuse River Estuary using a Bayesian framework

Alexey Katin, North Carolina State University, akatin@ncsu.edu, Dr. Dario Del Giudice, North Carolina State University, ddelgiu@ncsu.edu, Dr. Hans W. Paerl, UNC-CH Institute of Marine Sciences, hans_paerl@unc.edu, Dr. Daniel R. Obenour, North Carolina State University, drobenou@ncsu.edu

Hypoxia, or bottom-water dissolved oxygen (BWDO) depletion below 2 mg/l, frequently occurs in the Neuse River Estuary (NRE) due to a combination of water column stratification, eutrophication, and other environmental factors. Hypoxia causes fish kills, reduces biodiversity, and diminishes the aesthetic and recreational value of coastal waters. Given the complex interactions of meteorology, hydrology, and nutrient loads, it is important to reveal the main drivers affecting BWDO on various spatial and temporal scales. Here, we develop a model of intermediate mechanistic complexity to predict May to October BWDO in the NRE. The model incorporates important biophysical interactions controlling dissolved oxygen and is embedded in a Bayesian framework to allow for model calibration, uncertainty quantification, and probabilistic hypothesis testing. We leverage this approach to understand how river discharge, nutrient loads, sediment oxygen demand (SOD), and climate conditions affect hypoxia occurrence in the estuary. Model predictions explain on average 48% of BWDO variability in the NRE. The long (19-year) study period enables us to characterize drivers of BWDO depletion at different temporal scales, including nitrogen versus phosphorus controls on eutrophication. Our results indicate that about 30% of BWDO is consumed meeting water column oxygen demand (WCOD), while the rest is depleted during organic matter decomposition in the sediments. Interannual SOD variation is associated with November to April discharges and phosphorus concentrations. Elevated phosphorus concentrations enhance primary production, leading to increased SOD, while high flows flush nutrients out of the estuary, lowering summer SOD rates. In contrast, nitrogen-limited production controls WCOD throughout much of the summer. This hybrid mechanistic-probabilistic approach can be used to assess the impact of different factors influencing BWDO variation, and it can be applied for short- and long-term hypoxia forecasting, serving as a supporting tool for watershed managers.

(4A) Cumulative Effects Assessment for Restoration Projects in the Skagit River Delta

Jonathan Whiting, Pacific Northwest National Laboratory, jonathan.whiting@pnnl.gov, Tarang Khangaonkar, Pacific Northwest National Lab, tarang.khangaonkar@pnnl.gov, Taiping Wang, Pacific Northwest National Lab, taiping.wang@pnnl.gov

The Skagit River delta has historically sustained one of the largest Pacific salmon runs in the lower 48 states, but fish populations have declined due to the development of perimeter dikes for farming. Meanwhile, the region now supports significant farming operations that are increasingly threatened by flooding due to sea level rise. Many restoration projects have been proposed in close proximity within this intertidal region, raising the question of cumulative effects on the viability and longevity of projects. To facilitate a landscape-scale assessment in the Skagit River delta region, a three-dimensional hydrodynamic model was created based on the Finite Volume Community Ocean Model (FVCOM). The model consists of an unstructured grid of 131,471 elements ranging in size from 400 m to less than 10 m. The model was calibrated over a 7-month period from November 2015 through May 2015, which coincided with twelve stream gauge deployments and encompassed several 2-year floods and a fish outmigration period. A total of 23 restoration project opportunities were identified, mostly from existing plans, which included actions such as dike removal and setback, hydraulic alterations, and the construction of new backwater channels. Projects were grouped in simulations that isolated the effects of individual projects, and then a simulation was run with all selected projects. Two additional simulations were also conducted to evaluate project performance under future conditions with climate change. Model results showed that the projects impacted one another by altering salinity intrusion, affecting flow distributions at bifurcations, and changing water surface elevations at projects. Hydrodynamic modeling has historically been used to inform the hydrodynamic response at a site level for planning purposes, but a landscape-scale assessment can provide valuable insights into cumulative effects and project resiliency.

(4B) High-Resolution Modeling of Surge during Hurricane Matthew (2016)

Ajimon Thomas, North Carolina State University, athomas9@ncsu.edu, Dr. Casey Dietrich, North Carolina State University, jcdietri@ncsu.edu, Taylor Asher, University of North Carolina, taylorgasher@gmail.com, Dr. Brian Blanton, Renaissance Computing Institute, bblanton@renci.org, Dr. Jason Fleming, Seahorse Coastal Consulting, jason.fleming@seahorsecoastal.com, Dr. Rick Luettich, University of North Carolina, rick_luettich@unc.edu

This research addresses the question of how changes in the timing or speed of a hurricane can change the amount and extent of coastal flooding. Storm surge due to hurricanes can cause significant damage to property, loss of life, and long-term damage to coastal landscapes. Hurricane Matthew was a category 5 storm that made landfalls along the coasts of Haiti, Cuba, Grand Bahama Island, and South Carolina during October 2016, but which moved mostly parallel to the US east coast from Florida through North Carolina. This research employs the spectral wave model Simulating WAves Nearshore (SWAN) and the shallow water circulation model ADvanced CIRCulation (ADCIRC), a coupled model that has achieved prominence in coastal flood forecasting and analyses. The model runs on HSOFS, an unstructured triangular mesh with 1.8M nodes that enables high-resolution representation of the geometry, bathymetry, and topography all along the U.S. east and Gulf coasts. Results indicate that the model was able to accurately predict water levels and peak surges during Matthew in comparison to observations along the U.S. east coast. The impact of Matthew varied significantly along the U.S. east coast, and we hypothesize that this is due to the storm interacting nonlinearly with the tides. By shifting the time of occurrence of the storm by both half and full tidal periods, it was seen that there were differences in storm surge along the coast because different regions then coincided with different phases of the tidal cycle. These differences were as high as one meter at certain locations. Examining the influence of the storm's forward speed on the surge, it was seen that as the speed of the storm was reduced, flooding increased because the storm had more time to impact the coastal waters. The faster storm moved quickly across the shoreline, flooding only a narrower section of the coast.

(4B) Development and Application of a Parametric Rainfall Model, P-CLIPER

Kendra Dresback, University of Oklahoma, dresback@ou.edu, Kevin Geoghegan, Northwest Hydraulic Consultants, keving@ou.edu, Patrick Fitzpatrick, Mississippi State University, drpjfitz@yahoo.com, Randall Kolar, University of Oklahoma, kolar@ou.edu

During tropical storms, rainfall can lead to a significant amount of flooding, both in upland and coastal areas. A coupled modeling system was developed to provide total water level (tides + waves + surge + runoff) results. The system couples precipitation estimates via QPE/QPF (Quantitative Precipitation Estimation/Forecasting), a hydrologic model (HL-RDHM, the Research Distributed Hydrologic Model), a hydrodynamic model (ADCIRC, ADvanced CIRCulation), and a wave model (SWAN, Simulating WAves Nearshore). It was successfully utilized during Hurricane Irene in 2011. However, the precipitation estimates via QPE/QPF are only associated with the direct track of the hurricane. Thus, as an extension of this work, we have been investigating a parametric rainfall model in order to incorporate ensemble capability into the precipitation estimates. For example, the rainfall estimate produced by NOAA in the current system is only associated with the consensus track of the hurricane from the National Hurricane Center; by incorporating the new parametric rainfall model, the rainfall associated with different hurricane tracks could be included. In this presentation, we will summarize the development of a parametric rainfall model, the PDF Precipitation-Climatology and Persistence (P-CLIPER) model. Several historical hurricanes within the North Carolina area will be utilized in the evaluation of P-CLIPER. Skill is assessed by comparing the hydrological response of the model driven by P-CLIPER versus that driven by observed precipitation estimates. Finally, as part of this work, a rainfall source term has been incorporated into the ADCIRC model to account for rainfall over inundated coastal areas. Results with the new rainfall source term will be shown for Hurricane Isabel.

(5A) Impacts of temperature and salinity assimilation with the Ensemble Kalman Filter on simulated chlorophyll and hypoxia in Osaka Bay, Japan

Masayasu Irie, Osaka University, irie@civil.eng.osaka-u.ac.jp, Yuma Takahashi, Osaka University, takahashi_y@civil.eng.osaka-u.ac.jp, Masahiro Hayashi, Osaka University, Liuqian Yu, Dalhousie University, Liuqian.Yu@dal.ca, Katja Fennel, Dalhousie University, Katja.Fennel@dal.ca

Accurate simulation of estuarine water temperature and salinity in hydrodynamic models is challenging, especially when the vertical stratification is strong. The ability of hydrodynamic models to represent density also affects the skill of coupled water quality models. The innermost part of Osaka Bay, Japan, is highly stratified in summer. Algal blooms occur in the surface layer, and hypoxia forms near the bottom in early summer and lasts into autumn. An accurate representation of the density structure is thus important for water quality simulation. In this work, the Regional Ocean Modelling System (ROMS), a 3D hydrostatic ocean model coupled with a modified water quality model based on Fennel et al. (2006), is implemented for Osaka Bay. The Ensemble Kalman Filter (EnKF) is used to assimilate salinity and temperature data to improve the simulation of the density structure. Temperature and salinity observations were collected at 11 stations as part of the Osaka Bay Constant Monitoring System. The EnKF uses 20 ensemble members, generated by spin-up runs with different vertical eddy viscosities and diffusivities. A covariance localization scheme is used with an influence radius of 5 km. The simulation lasts for 11 days from August 1, 2012, with an assimilation window of one hour. The water quality model is coupled with the assimilative hydrodynamic model, so not only currents, temperature, and salinity but also chlorophyll and dissolved oxygen are updated by the assimilation. Here we evaluate the impact of the temperature and salinity assimilation on the representation of chlorophyll and hypoxia. After assimilation, simulated temperature and salinity profiles agree better with the observations. However, due to the changes in the density distribution, the currents change at the head of the Bay, where the flow is strongly influenced by density. Surface residual currents in both cases flow from the middle of the bay, rotate clockwise in the head, and flow out southward along the coast, but this current is weaker in the case with assimilation. The chlorophyll concentration, which is high at the head of the Bay, is even higher in the case with assimilation because the inflow of low-chlorophyll water is reduced. In the simulation without assimilation, the bottom residual current flows clockwise at the bay head, while with assimilation the current flows counter-clockwise. This difference in bottom circulation affects the horizontal extent of hypoxia, degrading the agreement with oxygen observations at some stations. This shows that revised parameters and boundary conditions in the water quality model are necessary once the more accurate density structure is imposed.
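For reference (standard perturbed-observation EnKF notation; the exact filter variant used in the study is not specified in the abstract), each ensemble member k is updated at analysis time as

\mathbf{x}_k^a = \mathbf{x}_k^f + \mathbf{K}\left(\mathbf{y} + \boldsymbol{\epsilon}_k - \mathbf{H}\mathbf{x}_k^f\right), \qquad
\mathbf{K} = (\boldsymbol{\rho} \circ \mathbf{P}^f)\mathbf{H}^{\mathsf{T}}\left[\mathbf{H}(\boldsymbol{\rho} \circ \mathbf{P}^f)\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right]^{-1},

where \mathbf{P}^f is the forecast error covariance estimated from the 20 members, \boldsymbol{\rho}\,\circ denotes element-wise (Schur product) covariance localization with the 5 km influence radius, \mathbf{H} maps the model state to the observed temperatures and salinities \mathbf{y}, \boldsymbol{\epsilon}_k are observation perturbations, and \mathbf{R} is the observation error covariance.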

(5A) A data assimilation scheme and its application coupled with ADCIRC tidal model

Lei Shi, NOAA, l.shi@noaa.gov, Rachel Tang, ERT, NOAA, liujuan.tang@noaa.gov, Edward Myers, NOAA, edward.myers@noaa.gov

Tides and tidal currents are the dominant components of water levels and currents in US coastal and estuarine waters, and accurately simulating the tides is generally the baseline modeling effort for coastal ocean modeling. Here we propose a conceptually simple data assimilation scheme, based on the linearized shallow water equations, to optimize the tidal boundary conditions of a coastal ocean model. For this particular application, the scheme assimilates observed CO-OPS tidal harmonic constituents to optimize the model boundary condition. We tested the assimilation scheme coupled with the ADCIRC tidal model applied to the US West Coast. The results indicate that the assimilation scheme significantly improves the modeled tides. The efficiency of the scheme also substantially reduces the time required to fine-tune the model's tidal boundary conditions.
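
One simple way to frame this kind of boundary-condition optimization is as a damped least-squares problem, assuming an approximately linear response of station harmonic constants to boundary perturbations; the sketch below is an illustration under that assumption, not the authors' scheme, and G, misfit, and damping are hypothetical stand-ins.

    import numpy as np

    def optimize_boundary(G, misfit, damping=0.1):
        """Damped least-squares correction to tidal open-boundary forcing.
        G: (n_obs, n_bnd) linearized sensitivity of station harmonic constants
           (e.g., complex M2 amplitudes) to boundary-node perturbations.
        misfit: (n_obs,) observed-minus-modeled harmonic constants at stations."""
        GtG = G.conj().T @ G
        reg = damping * np.eye(G.shape[1])
        return np.linalg.solve(GtG + reg, G.conj().T @ misfit)

In practice a sensitivity matrix of this kind could be built from the linearized shallow-water equations or from perturbation runs of the tidal model, with the resulting correction applied to the boundary forcing.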

(5A) Scalable data assimilation schemes within nested recursive parallel programming model for advection-diffusion codes

Fearghal O'Donncha, IBM Research - Ireland, fearghalodonncha@gmail.com, Albert Akhriev, IBM Research - Ireland, albert_akhriev@ie.ibm.com, Emanuele Ragnoli, IBM Research - Ireland, eragnoli@ie.ibm.com

Feasible and scalable systems for the accurate estimation of advection-diffusion processes are required in several applications. A common example is the real-time tracking and forecasting of a density in a fluid, such as an oil spill in a marine environment. Data assimilation (DA) is a mathematical technique that enables the incorporation of physical observations within complex models. A key challenge facing data assimilation is computational expense. Schemes involve the computation of a matrix inverse, an operation of order O(n^3), meaning that data assimilation quickly becomes infeasible on large computational domains. AllScale (www.allscale.eu/) is a FET H2020 (Future and Emerging Technologies in Horizon 2020) funded project that aims to provide computational paradigms for extreme-scale, exascale computing (10^18 FLOPS). A key component of these future systems is parallelism on the order of 10^5 to 10^6 cores. This degree of parallelism requires novel algorithmic structures to improve efficiency, together with decoupling the specification of parallelism from the associated management activities during program execution to improve productivity and the development environment. These requirements impose significant challenges for developers aiming to build applications that efficiently utilise all available resources. In particular, the development of such applications is accompanied by the complex and labour-intensive tasks of managing parallel control flows, data dependencies, and underlying hardware resources, each of which constitutes a challenging problem on its own. The foundation of AllScale is a parallel programming model based on nested recursive parallelism, opening up the potential for a variety of compiler and runtime system based techniques that add to the capabilities of the resulting applications. One of three pilot applications developed as part of the AllScale project is AMDADOS (Adaptive Meshing and Data Assimilation for Dispersion of Oil Spills), a code that aims to develop highly scalable data assimilation algorithms applied to the simulation of marine oil spills. The approach combines DA with domain decomposition (DD) to 1) reduce the computational expense of DA and 2) facilitate nested recursive parallelism together with DA by localising it to individual sub-domains. DD is a standard tool in many scientific domains to reduce the complexity or computational cost of a solution. Factors that have motivated DD approaches include: 1) the solution of the subproblems is qualitatively or quantitatively easier than the original; 2) the original problem does not fit into the available memory space; and 3) the subproblems can be solved with some concurrency (i.e., in parallel). In this study, the global domain is split into sub-domains and the equations are discretised on each sub-domain. Interface boundary conditions are enforced using a ghost-cell approach that overlaps neighbouring solutions. The data assimilation algorithm aligns with the data decomposition strategy by adopting a set of localised data filters unique to each subdomain. The solution is implemented using the novel AllScale API (based on C++ template-style programming), which manages distribution across cores, load balancing, and synchronization of the solution between sub-domains.
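
To illustrate the cost argument, the sketch below applies an independent (localized) Kalman-type analysis in each subdomain, so that many small innovation matrices are inverted instead of one global O(n^3) system; the data structure and variable names are illustrative and do not reflect the AMDADOS implementation.

    import numpy as np

    def localized_updates(subdomains, observations):
        """Independent analysis per subdomain with local filters.
        Each subdomain dict holds a state 'x', error covariance 'P',
        observation operator 'H', and obs-error covariance 'R' restricted to
        that subdomain; `observations` holds the matching local obs vectors."""
        for sub, y in zip(subdomains, observations):
            x, P, H, R = sub['x'], sub['P'], sub['H'], sub['R']
            S = H @ P @ H.T + R                      # small local innovation covariance
            K = P @ H.T @ np.linalg.inv(S)           # local Kalman gain
            sub['x'] = x + K @ (y - H @ x)           # local analysis
            sub['P'] = (np.eye(len(x)) - K @ H) @ P
        return subdomains

Consistency between neighbouring analyses would then be maintained through a ghost-cell exchange of the kind described above.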

(5B) Wave-current based storm surge modeling of a shallow lagoon-inlet-coastal system

Meng Xia, University of Maryland Eastern Shore, mengxia.umes@gmail.com, Miaohua Mao, Univ of Maryland Eastern Shore, mmao@umes.edu, Xinyi Kang, Univ of Maryland Eastern Shore, xkang@umes.edu

Inlet wave-current dynamics and interactions are vital to the physical exchanges in a lagoon-inlet-coastal ocean system. A coupled wave-current model was calibrated and validated against observational data, and applied to investigate the complex dynamics of the paired inlets in the Maryland Coastal Bays (MCBs) during hurricanes. With the inclusion of wave-current interactions, model performance was significantly improved during Hurricane Irene (2011). Major processes of wave-current interaction include the wave radiation stress-induced setup and currents, and the modulation of depth-induced breaking via water depth variations. In comparison to radiation stress, wave-induced bottom friction and sea surface roughness are of secondary importance in generating nearshore circulations. Further investigation revealed that tidal currents and ocean swells dominated inlet circulation and wave dynamics, respectively. Physical dynamics within the paired inlets are regulated by local winds, wave-current interactions, and the unique characteristics of inlet orientation and width. Wave dynamics in the lagoon and behind the inlets, however, are mainly controlled by local winds and modulated by the shallow bathymetry. With the hypothetical closure of a single inlet, wave-current dynamics and interactions behind the closed inlet are strongly altered, whereas they are scarcely influenced remotely; this behavior follows the pattern of a single-inlet system. The only exception is that circulations near the narrow Ocean City Inlet are changed moderately by shutting down the relatively wider Chincoteague Inlet, which resembles the behavior of a double-inlet system. To consider the remote-forcing effect on storm surge, this study included a larger domain extending to the continental shelf region along the U.S. East Coast. Using a one-way nesting method, a case study of Hurricane Sandy (2012) was carried out for storm surge simulations in the MCBs. Results indicated that improved accuracy was achieved by including the large model domain, and that remote forcing played a more important role in surge propagation than local forcing. It was found that the arrival time of the peak water level inside the bay lagged that at the inlet by several hours. This coupled wave-current model with the nesting method can be further applied to similar lagoon-inlet-coastal ocean systems such as Barnegat Bay in New Jersey.

(5B) Coastal Marsh Response to Sea Level Rise along the mid-Atlantic coast of the US

Karim Alizad, Center for Coastal Resiliency, Louisiana State University, karimalizad@lsu.edu, Scott C. Hagen, Louisiana State University, shagen@lsu.edu, Matthew V. Bilskie, Louisiana State University, mbilsk3@lsu.edu, James T. Morris, University of South Carolina, morris@inlet.geol.sc.edu

Coastal marsh systems are vulnerable to increased flooding, and consequently to loss of productivity, due to increasing rates of sea level rise (SLR). Their responses to projected sea level changes vary based on tide range, topography, shoreline and creek geometry, and nutrient and sediment sources. On the east coast of the United States, the ability of marsh systems to survive by migrating to higher ground is constrained by increased development. Therefore, it is critical to study how various estuaries and their salt marshes may respond to SLR. In this study, four estuaries in Massachusetts, New Jersey, Maryland, and Virginia were selected to study each marsh system's response to three NOAA-projected SLR scenarios for the year 2100: intermediate-low (0.5 m), intermediate (1 m), and intermediate-high (1.5 m). Hydrodynamic changes, as well as marsh productivity, were simulated using the Hydro-MEM model (Alizad et al., 2016). The integrated Hydro-MEM model couples the ADvanced CIRCulation model (Luettich and Westerink, 2006) with the Marsh Equilibrium Model (MEM) (Morris et al., 2002) and includes feedbacks between the main physical and biological processes in the marsh system. The model captures topographic changes using the accretion rate in the marsh system and updates bottom roughness using marsh productivity variations due to SLR. The Hydro-MEM results, in the form of marsh productivity and marsh migration, demonstrate different marsh system responses to SLR along the east coast of the US. The results indicate that marsh systems can survive under the intermediate-low SLR scenario, whereas under the intermediate and intermediate-high SLR scenarios some marsh systems lose their productivity and become mudflats or submerged. In addition, the marsh migration results demonstrate the role of developed areas in restricting the ability of these ecosystems to survive. The results imply broader impacts of SLR on the mid-Atlantic coast of the US.

(6A) Modeling the Transport and Fate of Sediments Released from Marine Construction Projects in the Coastal Waters of British Columbia, Canada

David Fissel, ASL Environmental Sciences Inc., dfissel@aslenv.com, Yuehua Lin, ASL Environmental Sciences Inc., alin@aslenv.com

Major marine construction projects that result in the release of sediments are subject to environmental assessment and other regulatory approval processes prior to the actual construction and implementation of the project. Examples of these projects include: dredging of seabed materials to allow access of large ships into berths at ports and temporary materials offloading facilities; and trenching and backfilling of the seabed to allow the installation of pipelines or major electrical cables within the seabed. An important tool for the evaluation and assessment of environmental effects is the development of suites of specialized numerical methods for this marine activity. An integrated set of numerical methods must address four distinct topics: the near-field release and mixing of suspended sediments into the water column (i.e., the initial dilution zone); the transport of the suspended sediments under the influence of complex ocean currents driven by tides, winds and river discharges in the far-field; the settling of the transported suspended sediments onto the seabed according to their particle size distribution and the physical characteristics of the receiving waters, which results in a two-dimensional distribution of deposited sediments; and the potential for resuspension of the deposited sediments due to unusually large near-bottom currents associated with energetic events such as seasonally large winds, very large tides and freshet river discharges into the ocean. A review of projects subjected to environmental assessment in the coastal waters of British Columbia from the year 2006 to the present is used to provide examples of the development of improved numerical methods. British Columbia coastal waters are complex given the combination of many important oceanographic processes present, including a highly stratified water column due to the freshwater discharges from major rivers, large wind forcing especially in the fall and winter seasons, and large tidal currents, in particular in northern British Columbia. To improve the capability and representativeness of the numerical simulations of sediment release, transport and fate, the numerical methods have evolved to include higher resolution model grids to better represent the near-field release and behavior of the sediment plume. These higher resolution model grids provide better representation of mitigative measures to limit sediment releases, such as sheet-piles and silt curtains. Improvements have also been realized in the depiction of particle size dependent vertical settling rates and in the computation of resuspension of initially deposited sediments, especially in relation to temporary subsea piles of sediments arising from trenching for marine pipelines. For environmental assessment, the model results are interpreted according to the identified sediment concentrations arising from project activities in comparison to natural background levels of sediments. The needs and challenges that remain for this numerical modeling application area are also identified.

(6A) New York/New Jersey Harbor Sedimentation Study

Tate McAlpin, U.S. Army Corps of Engineers, tate.o.mcalpin@usace.army.mil, Joseph V. Letter Jr., U.S. Army Corps of Engineers, Mary Bryant, U.S. Army Corps of Engineers, Gary L. Brown, U.S. Army Corps of Engineers, Gaurav Savant, Dynamic Solutions, LLC, Bryce W. Wisemiller, U.S. Army Corps of Engineers, Jamal A. Sulayman, U.S. Army Corps of Engineers, Corey J. Trahan, U.S. Army Corps of Engineers, Anthony G. Emiren, U.S. Army Corps of Engineers

The New York/New Jersey Harbor (NYNJH) is a vital economic resource for both the local economy and the entire U.S. economy due to the vast quantity of imports and exports handled by the numerous ports in this waterway. As with most ports, there is a significant, recurring expense associated with dredging the navigation channels to the authorized depths. In an effort to determine the impact of channel enlargements (the project) on dredging volumes, a numerical model study was performed. The advantage of a numerical model study is the ability to isolate individual system modifications and their associated impacts in terms of dredging volumes. Five years (1985, 1995, 1996, 2011, and 2012) were simulated for both the with- and without-project conditions to determine the impact of the channel deepening on dredging requirements for a wide range of meteorological conditions, including storm events. The numerical model results were analyzed to provide insight into which locations will experience increased or decreased deposition and to quantify the amount of change for a given channel reach. The model results indicate a relatively minor increase in the total dredge volumes for the NYNJH, with the increase being insignificant in comparison to the natural year-to-year variability in dredge volumes.

(6A) Sediment Transport Model Including Short-Lived Radioisotopes: Model Description and Idealized Test Cases

Justin J. Birchler, US Geological Survey, jbirchler@usgs.gov, Courtney K. Harris, Virginia Institute of Marine Science, ckharris@vims.edu, Tara A. Kniskern, Virginia Institute of Marine Science, knista@vims.edu, Christopher R. Sherwood, US Geological Survey, csherwood@usgs.gov

Geochronologies measured from sediment cores in coastal locations are often used to infer event bed characteristics such as depositional thicknesses and accumulation rates. Such studies commonly use naturally occurring, short-lived radioisotopes such as Beryllium-7 (7Be), which provides an indicator of riverine-derived terrestrial sediment deposition, and Thorium-234 (234Th), whose presence indicates recent sediment suspension in marine waters. These observations, however, are somewhat disconnected from sediment transport models, which have typically represented coastal flood and storm deposition via estimated grain size patterns and deposit thicknesses. To more directly connect the numerical model and field observations, we modified the Community Sediment Transport Modeling System (CSTMS) to account for reactive tracers, and used this capability to represent the behavior of these short-lived radioisotopes in the sediment bed. This paper describes the model and presents results from a set of idealized, one-dimensional (vertical) test cases. The model configuration represented fluvial deposition followed by periods of episodic storm resuspension. Sensitivity tests explored the influence of fluvial deposit thickness and of the intensities of wave resuspension and bioturbation on seabed radioisotope profiles. The intensity of biodiffusion affected the persistence of fluvial event beds as evidenced by 7Be. Both resuspension and biodiffusion increased the seabed inventory of 234Th.
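
For context, the reactive-tracer behavior between transport events amounts to first-order radioactive decay of the activity stored in each bed layer; the half-lives below (53.3 days for 7Be and 24.1 days for 234Th) are physical constants, while the array layout is only an illustration of the idea, not the CSTMS data structures.

    import numpy as np

    HALF_LIFE_DAYS = {'Be7': 53.3, 'Th234': 24.1}    # physical half-lives

    def decay_bed_activity(activity, dt_days, isotope):
        """Apply first-order decay to a radioisotope activity profile in the seabed.
        activity: activity per bed layer (any consistent units); dt_days: elapsed days."""
        lam = np.log(2.0) / HALF_LIFE_DAYS[isotope]  # decay constant (1/day)
        return np.asarray(activity, dtype=float) * np.exp(-lam * dt_days)

    # Example: 30 days of decay of a uniform 7Be profile over five bed layers
    print(decay_bed_activity(np.ones(5), 30.0, 'Be7'))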

(6B) STORMTOOLS Design Elevation (SDE) Maps: including the impact of sea level rise

Malcolm Spaulding, Spaulding Environmental Associates, LLC, spaulding@uri.edu, Annette Grilli, Ocean Engineering, University of RI, annette_grilli@uri.edu, Chris Damon, Environmental Data Center, Univ RI, cdamon@edc.uri.edu, Teresa Crean, Coastal Resources Center, Univ RI, tcrean@crc.uri.edu, Grover Fugate, RI Coastal Resources Management Cou, gfugate@crmc.ri.gov

Many coastal communities in the US use the base flood elevation (BFE) for the 100-yr return period from FEMA Flood Insurance Rate Maps (FIRM) to design structures and infrastructure. These maps are increasingly known to have serious problems in accurately specifying the risk coastal communities face, as most recently evidenced during hurricanes Harvey and Irma. The FIRM BFE maps also do not include the impact of sea level rise, which clearly needs to be considered in the design of coastal structures over the next several decades. In the present paper, BFE maps are generated for three coastal communities in Narragansett Bay, RI (Barrington, Bristol, and Warren) based on predictions of the coupled surge-wave models from the US Army Corps of Engineers North Atlantic Comprehensive Coastal Study (NACCS). Wave predictions are based on application of a steady-state, spectral wave model (STWAVE). All methods used are consistent with FEMA guidelines for the study area and with approved models. Maps are generated for 0, 0.6, 1.5, 2.1, and 3.1 m of sea level rise, reflecting NOAA estimates for the study area through 2100. To help communicate risk to the general public, risk maps are generated for each SLR case, with risk ranked from low to high based on likely damage to the structures typically found in the study area.

(6B) Real Time Decision Support with ADCIRC

Jason Fleming, Seahorse Coastal Consulting, jason.fleming@seahorsecoastal.com, Rick Luettich, University of North Carolina, rick_luettich@unc.edu, Clint Dawson, University of Texas at Austin, clint@ices.utexas.edu, Carola Kaiser, Louisiana State University, ckaiser@cct.lsu.edu

The ADCIRC finite element coastal ocean model is used in real-time decision support services for coastal and riverine hydrodynamics, tropical cyclone winds, and ocean wave modelling for public sector agencies including NOAA, FEMA, the Coast Guard, and the US Army Corps of Engineers, among others. The historic 2017 hurricane season presented unique logistical and numerical challenges for real-time coastal modelling during storms including Cindy, Harvey, Irma, Jose, Maria, and Nate. Our approach to these situations, the outcomes, and the implications for future hurricane seasons will be presented.

(6B) Coupling of Hazards to Evacuation/Sheltering Models: Development and Application of the Integrated Scenario-Based Evacuation Framework

Randall Kolar, University of Oklahoma, kolar@ou.edu, Kendra Dresback, University of Oklahoma, dresback@ou.edu, Rachel Davidson, University of Delaware, rdavidso@udel.edu, Brian Blanton, Renaissance Computing Institute, bblanton@renci.org, Brian Colle, Stony Brook University, brian.colle@gmail.com, Tricia Wachtendorf, University of Delaware, twachten@udel.edu, Linda Nozick, Cornell University, lkn3@cornell.edu, Humberto Vergara, University of Oklahoma, humber@ou.edu, Yang Hong, University of Oklahoma, yanghong@ou.edu, Kun Yang, University of Delaware, kunyang@udel.edu, Sarah DeYoung, University of Georgia, sarah.deyoung@uga.edu

Due to the inherent uncertainty in hurricane natural hazards, evacuation decisions are complex. For the case of evacuations in response to hurricanes, there are three important aspects that must be considered: 1) there is uncertainty in how the storms evolve; 2) there are many interactions within and across the natural, infrastructure, and human systems; and 3) the systems are dynamic. There has been significant research into hurricane forecasting and evacuations; however, none of it has connected these three important aspects (dynamics, uncertainty, and system interactions) in a formalized, integrated model. During this project, we have developed a new integrated framework that models the hazards associated with the hurricane using an ensemble of probabilistic hurricane scenarios, which in turn are used to develop time-dependent total water level (river + waves + surge + tide) and wind speed maps. That output is ingested by the infrastructure/decision-making model, which simulates the dynamic decision-making of emergency managers and residents and the dynamic movement of residents over the course of the event. In this presentation, an overview of the component parts of the framework will be presented, along with an example of the guidance it provides in the context of a case study based on Hurricane Isabel in 2003.

(7A) Towards high-fidelity simulation of coastal engineering flow problems

Hansong Tang, City College of New York, htang@ccny.cuny.edu

It has become important to simulate many coastal engineering flow problems in high fidelity, such as oil spills from the ocean bottom and the impingement of storm surge on coastal structures. In order to achieve such simulation, it is necessary to model both large-scale background ocean flows and small-scale local phenomena simultaneously, and to take their interaction into account. For this purpose, we have developed a hybrid modeling system that is able to directly simulate such flows. The modeling system is an integration of a fully three-dimensional fluid dynamics (3DFD) model and a geophysical fluid dynamics (GFD) model; it couples the two models in a two-way fashion and marches them in time simultaneously. The 3DFD-GFD system captures local flows via the 3DFD model and resolves background ocean flows via the GFD model. The system is able to simulate many multiscale and multiphysics coastal ocean flows in high fidelity and high resolution, especially complicated phenomena at small scales, such as thermal discharge at the ocean bottom and tsunami propagation from the deep ocean to the seashore and its subsequent impact on coastal structures (e.g., bridges, houses, ...), which are basically beyond the reach of other existing models. In this presentation, the methodology of the system will be outlined, together with example experiments and validation. Recent progress on this modeling system will be reported, and issues for further study will also be discussed.

(7A) Tidal variation in sediment distribution in an idealized estuarine model in response to cohesive dynamics

Danielle Tarpley, Virginia Institute of Marine Science, drtarpley@vims.edu, Courtney K. Harris, VIMS, ckharris@vims.edu, Carl T. Friedrichs, VIMS, Carl.Friedrichs@vims.edu

Particle settling velocity and sediment availability exert first-order control on the transport of sediment in coastal environments, including estuaries. These processes are difficult to parameterize when dealing with fine-grained material like muds, whose transport properties change in response to several factors including salinity, suspended sediment concentration (SSC), turbulent mixing, organic content, and mineral composition. Knowledge of the mechanisms governing the transport of cohesive sediment is rapidly expanding, especially in response to technical advances. As understanding progresses, numerical models describing the transport of cohesive sediment are being developed with varying degrees of complexity. Applying these parameterizations in simplified model domains is useful for teasing apart the first-order effects of cohesive processes on sediment distribution, which is difficult in large realistic model domains with complex flow patterns. Additionally, an idealized domain provides an ideal platform to examine the reliability and computational expense of a range of formulations because of its lower computational cost. With this motivation, we implement cohesive sediment transport formulations within an idealized two-dimensional model designed to represent the longitudinal dimension of a partially mixed estuary that neglects across-channel variation but exhibits salinity-driven estuarine circulation. The Community Sediment Transport Modeling System (CSTMS) is used to represent suspended transport, erosion, and deposition within the Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) framework. The cohesive processes represented include sediment bed consolidation and swelling; sediment-induced stratification; and flocculation. The primary features of this idealized model are similar to the York River, VA, and the sediment characteristics are based on observed values from the York River. Model test cases that alternately remove individual cohesive processes are used to explore how each process impacts the depositional and dispersal patterns of sediment along the estuary on a tidal timescale. In the idealized estuarine model, the estimates of SSC are reduced by as much as an order of magnitude by including any of the cohesive processes. Sediment-induced stratification lowers the SSC by trapping sediment near the bed, causing it to deposit faster during slack tide. Similarly, bed consolidation limits the amount of sediment available to be suspended, which reduces the SSC. With the implementation of flocculation, suspended sediment size distributions show a response to changes in SSC and turbulence. There are distinct differences in suspended particle size between material within the estuarine turbidity maximum (ETM) and that outside the ETM. In the ETM, tidal variation in SSC and turbulence produces changes in particle size, with larger particles seen when SSC and turbulence are elevated during peak flood and ebb tide. Outside the ETM, however, SSCs remain low and do not exhibit much tidal variation in particle size distribution. The implementation of sediment-induced stratification or bed consolidation carries significantly lower computational cost than the addition of flocculation dynamics. To explore tradeoffs between computational efficiency and model integrity, a model simulation with fewer sediment size classes is used for comparison. It is expected that the trends in suspended size distribution found inside and outside the ETM will remain the same but that SSCs will likely differ.
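
As a point of reference for the flocculation discussion, a common first approximation in cohesive transport modeling is a concentration-dependent floc settling velocity of power-law form; the sketch below uses placeholder coefficients and is not the size-class-based CSTMS flocculation model employed in this study.

    def floc_settling_velocity(ssc_kg_m3, a=5.0e-4, n=1.0, ws_min=1.0e-4, ws_max=5.0e-3):
        """Illustrative power-law settling velocity ws = a * C**n (m/s), clipped to a
        plausible range; coefficients are placeholders, not calibrated values."""
        ws = a * ssc_kg_m3 ** n
        return min(max(ws, ws_min), ws_max)

    print(floc_settling_velocity(0.2))   # settling velocity for an SSC of 0.2 kg/m3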

(7A) Development of a coupled modeling system to simulate ocean-wave-sediment-vegetation response to storms in a back barrier setting

Tarandeep Kalra, USGS, tkalra@usgs.gov, John C. Warner, USGS, jcwarner@usgs.gov, Neil K. Ganju, USGS, nganju@usgs.gov, Julia M. Moriarty, USGS, jmoriarty@usgs.gov

The COAWST modeling framework was developed to model geomorphic changes during storms that are driven by coupled ocean-wave-atmosphere dynamics. In response to recent major storms, the focus of model development has been on understanding the geomorphic response and resilience of back-barrier estuaries and wetlands. For the simulation of storm events, the coupled system allows feedback between modeled ocean water levels and currents, modeled wave heights, and spatially varying bottom stress. The modeling framework is extended to account for wave attenuation and seabed stabilization by submerged aquatic vegetation (SAV) and marshes to evaluate their potential benefits for coastal protection and estuarine resilience. The model is able to dynamically simulate SAV growth based on nutrient loading and light attenuation in the water column. In addition, the model simulates lateral erosion of salt marshes due to wave thrust and the resulting sediment transport to the bay/estuarine system. The newly developed routines enable a complete coupling of hydrodynamics, water column biogeochemistry, vegetation, and sediment dynamics.

(7B) The near real time West Coast Operational Forecast System: sensitivity to open boundary conditions

Jiangtao Xu, UCAR, jiangtao.xu@noaa.gov, Aijun Zhang, NOAA/NOS/CO-OPS, Alex Kurapov, NOAA/NOS/CSDL

NOAA's National Ocean Service (NOS) has collaborated with the National Environmental Satellite, Data, and Information Service (NESDIS)/Center for Satellite Applications and Research to develop and transition to operations the West Coast Operational Forecast System (WCOFS), which will be NOS's first operational forecast system to incorporate data assimilation capabilities. WCOFS is based on the Regional Ocean Modeling System (ROMS) and covers coastal and shelf waters of Washington, Oregon and California. WCOFS will provide forecast guidance of water level, currents, temperature and salinity to West Coast communities up to the 10-meter isobath. Presently, quasi-operational nowcasts/forecasts using a non-data-assimilative version of WCOFS are being tested; this is the first step towards implementing WCOFS with data assimilation and serves as a benchmark for computing resources and model performance skill. The WCOFS open boundary conditions are obtained from the NCEP Global Real Time Ocean Forecast System (RTOFS) for non-tidal water level, temperature, salinity and non-tidal depth-integrated currents. Tidal currents and water levels are obtained from the Oregon State University Tidal Inversion Software (OTIS) and added to the non-tidal variables every time step. To reduce boundary effects, the model temperature and salinity are nudged towards the averaged RTOFS temperature and salinity fields in a band of interior points along the open boundary. Three boundary condition configurations were tested to determine the effects of the specification of three-dimensional velocity, the size of the nudging zone and the strength of nudging. These are: i) use of passive radiation conditions for 3D velocity and nudging of T/S within a 100 km band along the boundaries at 1/7-day strength; ii) use of passive-active radiation with nudging conditions for 3D velocity at the boundary points and T/S nudging (same as in (i)); and iii) use of passive radiation conditions for 3D velocity (same as in (i)) and nudging of T/S within a 200 km zone at 1/3.5-day strength. The results demonstrated that in the first case the velocity and temperature started to drift away within 2-3 months, during which period a strong coastal jet developed along the California coast and advected cooler waters from north to south. Nudging the 3D velocity along the boundaries, or using a stronger nudging coefficient over a larger area for T and S, was able to keep the model from running away from observations. The boundary effect at 24N, in Mexico, was felt by the model as far along the coast as Oregon, approximately 3000 km to the north.
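
For illustration, the boundary-band relaxation in configuration (i) can be written as a simple nudging term with a linearly ramped inverse timescale; the discretization below is a sketch of that idea, not the ROMS implementation.

    import numpy as np

    def nudge_tracer(tracer, target, dist_to_boundary_km, dt_s, band_km=100.0, tau_days=7.0):
        """Relax model T or S toward the RTOFS field inside a band along the open
        boundary: dC/dt = (C_target - C) / tau, with the nudging strength ramping
        linearly from full at the boundary to zero at the edge of the band."""
        ramp = np.clip(1.0 - np.asarray(dist_to_boundary_km) / band_km, 0.0, 1.0)
        inv_tau = ramp / (tau_days * 86400.0)        # 1/s, spatially varying
        return tracer + dt_s * inv_tau * (target - tracer)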

(7B) Model Development of NOAA's Integrated Northern Gulf of Mexico Operational Forecast Systems

Zizang Yang, NOAA/NOS/OCS/CSDL, zizang.yang@noaa.gov, Philip Richardson, NOAA, Phil.Richardson@noaa.gov, Edward Myers, NOAA, edward.myers@noaa.gov, Lianyuan Zheng, NOAA, lianyuan.zheng@noaa.gov, Aijun Zhang, NOAA, aijun.zhang@noaa.gov

NOAA's National Ocean Service (NOS) is developing the integrated northern Gulf of Mexico (GOM) operational nowcast/forecast system (INGOFS). The purpose of the system is to produce real-time nowcast and short-range forecast guidance for water levels, three-dimensional currents, water temperature, and salinity over the continental shelf in the northern Gulf region, the adjacent coastal estuaries, and the lower Mississippi River. It will support marine navigation, emergency response, environmental management, and harmful algal bloom (HAB) forecasts. INGOFS will be implemented using the Finite Volume Coastal Ocean Model (FVCOM). The system domain includes the northern GOM continental shelf from north of Cabo Rojo, Mexico, in the southwest to Panama City, FL, in the northeast. It is an extension of NOAA's existing Northern Gulf of Mexico OFS (NGOFS) domain. The NGOFS already covers the major coastal estuaries and embayments such as Matagorda Bay, Galveston Bay, Sabine Lake, Calcasieu Lake/Lake Charles, and Mobile Bay. In addition, INGOFS encompasses the northern Mexican coastal waters, Texas coastal inlets, and the lower Mississippi River up to Baton Rouge. The model grid is composed of about 300,000 nodes and 600,000 elements, with a spatial resolution ranging from 45 m near the coast to around 10 km at the open ocean boundary. INGOFS is undergoing a one-year hindcast simulation for development. The forcing data include atmospheric forecast guidance from the NOAA/NWS North American Mesoscale (NAM) numerical weather prediction modeling system, real-time oceanographic observations from the NOS Center for Operational Oceanographic Products and Services (CO-OPS), river discharge observations from U.S. Geological Survey gauges, and open ocean boundary conditions derived from the NWS Global Real-Time Operational Forecast System (G-RTOFS) and the ADCIRC EC2015 tidal database. The hindcast performance for water level, currents, temperature, and salinity will be assessed through model-data comparison using the standard NOAA/NOS skill assessment software. Following completion of the model development, INGOFS will be further tested in a nowcast/forecast environment for about a one-year period. It is anticipated to be in operational production on NWS's NCEP Weather and Climate Operational Supercomputing System (WCOSS) in mid-2020.

(7B) New Observational and Modeling Capabilities to Improve NOAAs Real-Time Tsunami Forecast

Yong Wei, University of Washington & NOAA/PMEL, yong.wei@noaa.gov, Diego Arcas, NOAA/PMEL, diego.arcas@noaa.gov, Vasily Titov, NOAA/PMEL, vasily.titov@noaa.gov, Christopher Moore, NOAA/PMEL, christopher.moore@noaa.gov

The tragedies of the 2004 Sumatra and 2011 Tohoku tsunamis exposed the limits of our knowledge in preparing for devastating tsunamis, especially in the near field. During the devastating 11 March 2011 Japanese tsunami, data from two deep-ocean tsunameters were used to determine the tsunami source within 1.5 h of the earthquake origin time. Many far-field and near-field measurements of the 2011 Japanese tsunami were used to demonstrate the accuracy of NOAA's real-time tsunami flooding forecast system, termed the Short-term Inundation Forecast of Tsunami (SIFT). Following the 2011 Tohoku tsunami, NCTR has been exploring new measurement technologies and modeling capabilities to improve SIFT, which is now operational at both of NOAA's Tsunami Warning Centers (TWCs), for more rapid and accurate forecasts, especially in the near field. A tsunami source can be estimated via different methods using a variety of measurements from deep-ocean tsunameters, seismometers, GPS, and other advanced instruments in or near real time. Using the 2011 Tohoku tsunami as an example, we show that these methods are compatible and provide a newer and more insightful understanding of tsunami generation from earthquakes, as well as from nonseismic processes. In the coming years, combining GPS measurements, tsunami geodetic data and seismic data (particularly where GPS and tsunameter data are scarce) may lead to further improved early estimates of earthquake rupture and tsunami-generating processes, useful both for real-time forecasting and for early post-event disaster relief. Development of deep-ocean geodetic measurements is essential to further improve forecast speed and accuracy. Existing deep-ocean tsunami observational instruments along the Pacific Rim are capable of providing tsunami data within 30 min to 2 h of tsunami generation. Deployment of these tsunameters closer to the source area has enabled faster detection of tsunami generation and propagation within minutes of the earthquake origin time. However, this strategy requires separation of the tsunami signals from the overwhelming high-frequency seismic waves produced during a strong earthquake - a real technical challenge for the existing operational tsunami observational network. A new generation of nano-resolution pressure sensors can provide high temporal resolution of the earthquake and tsunami signals without losing precision. Two DART 4G systems equipped with these breakthrough sensors are now deployed offshore of Chile, ready for further discovery. This study shows that these new instruments, if deployed along the Cascadia Subduction Zone, could also greatly contribute to tsunami hazard resilience in the Pacific Northwest. In recent years NCTR has implemented Graphics Processing Unit (GPU) techniques to boost the computational speed of tsunami models by nearly 30 times. The GPU code allows for on-the-fly computation of basin-wide tsunami propagation and coastal flooding in minutes or even seconds. This innovation is currently being adopted into SIFT to provide more rapid and accurate estimates of the tsunami source and propagation, with the advantage of combining real-time seismic and tsunami observations that are no longer confined to a precomputed database.

(8A) Oil Spill Risk Analysis in the Gulf of Mexico

Zhen-Gang (Jeff) Ji, Bureau of Ocean Energy Management, jeff.ji@boem.gov, Walter Johnson, Bureau of Ocean Energy Management

The Bureau of Ocean Energy Management (BOEM), an agency of the U.S. Department of the Interior, maintains a leasing program for commercial oil and gas development on the Outer Continental Shelf in U.S. territorial waters. BOEM performs an oil-spill risk analysis (OSRA) using, in part, a statistical model of hypothetical oil-spill trajectories. The OSRA model is driven by analyzed sea surface winds and model-generated ocean surface currents. Instead of focusing on individual oil-spill events, the OSRA examines oil-spill risks over long periods of time, ranging from 5 years to decades. In the latest analysis, the OSRA model calculated 40 million oil-spill trajectories over extended areas of the U.S. continental shelf and tabulated the frequencies with which the simulated oil spills contact the geographic boundaries of designated natural resources within a specified number of days after the simulated spill events. The modeled ocean currents and wind fields used in the analysis span the 22 years from 1993 to 2014. The contact probabilities of oil spills in the GOM are analyzed in detail. The OSRA model is also applied to analyze the contact probabilities of the Ixtoc oil spill, which began on June 3, 1979, in the Bay of Campeche in the Gulf of Mexico and lasted for 10 months. The Ixtoc 1 oil well suffered a blowout, resulting in one of the largest oil spills in history, with about 3 million barrels of oil spilled. The OSRA model is applied to simulate particle trajectories released at the Ixtoc location using historical current and wind fields between 1993 and 2014. Detailed analysis is conducted to understand the environmental risks of the Ixtoc oil spill.

(8A) Three-dimensional Initial Mixing and Transport Modeling of Discharges through Dynamic Coupling with a Circulation Model: Salish Sea Plume Module Development

Lakshitha Premathilake, Pacific Northwest National Laboratory, malerajage.premathilake@pnnl.gov, Adi Nugraha, PNNL, adi.nugraha@pnnl.gov, Tarang Khangaonkar, PNNL, Tarang.Khangaonkar@pnnl.gov

Simulation of effluent plumes from wastewater outfalls is routinely conducted using semi-empirical mathematical models, such as CORMIX and Visual Plumes, that are collectively known as plume models. They efficiently provide predictions of general plume characteristics (plume diameter, centerline, and flux-averaged dilution) and of the plume trajectory under steady-state conditions. However, existing plume models are not designed to handle long-duration simulations of multiple discharges in a tidally influenced environment. In the Pacific Northwest Salish Sea community (Puget Sound and Georgia Strait region), there is significant interest in tracking the mixing and transport of plumes from multiple stormwater and wastewater discharges over long durations spanning several tidal cycles and flow reversals. Numerous shellfish sites, aquaculture pens, and other sensitive areas are exposed to contaminants and particles from nearly 100 wastewater outfalls in the Salish Sea. Typical large-scale estuarine circulation hydrodynamic models can accommodate multiple sources; however, they are limited in their ability to resolve and track individual plumes, as the grid scales are too large to resolve effluent plume dimensions and concentrations and to demarcate boundaries with sufficient accuracy. The work presented here describes the development of a plume dilution and transport model and its external coupling to the hydrodynamic circulation model of the Salish Sea. The Salish Sea Model - Plume Module (SSM-Plume) is a three-dimensional, unsteady plume model that can simulate both near-field and far-field fate and transport of submerged buoyant wastewater discharges. The near field of SSM-Plume is modeled using the Lagrangian Control Volume (LCV) technique, a popular method in unsteady integral plume modeling. Entrainment in the near field is computed with an established Froude-number-based method, while the kinematics of the jet/plume are determined by the conservation laws. Far-field dispersion in the model is based on the Lagrangian Parcel Method (LPM), which facilitates accounting for various particle-based processes such as biological and chemical reactions with dissolved components, and sewage particle deposition and fate under marine hydrodynamics. The model's Lagrangian framework allows visualization of sewage particle transport over time and is well suited for marine ecosystem exposure assessment. The model calibration was conducted using steady-state scenarios and comparisons against established plume models. Further scenarios are presented to visualize the long-term spreading of wastewater plumes in the Salish Sea region.
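
To indicate how a far-field Lagrangian parcel step can work, the sketch below advects parcels with the ambient current and adds a horizontal random-walk displacement with standard deviation sqrt(2*Kh*dt); the diffusivity and array shapes are illustrative and do not represent the calibrated SSM-Plume settings.

    import numpy as np

    def advect_parcels(xy, uv, dt_s, kh_m2_s=1.0, rng=None):
        """One far-field Lagrangian-parcel step: advection by the interpolated
        current plus a random walk representing horizontal turbulent diffusion.
        xy: (n, 2) parcel positions (m); uv: (n, 2) currents at the parcels (m/s)."""
        rng = rng or np.random.default_rng(0)
        step = np.sqrt(2.0 * kh_m2_s * dt_s)
        return xy + uv * dt_s + step * rng.standard_normal(xy.shape)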

(8A) Tidally-induced dispersion and tidally-correlated exchange flow transport mechanisms in the upper Delaware Estuary

Phil Duzinski, Philadelphia Water Department, phil.duzinski@phila.gov, Robert Chant, Rutgers University, chant@marine.rutgers.edu, Ramona McCullough, Sci-tek Consultants, rmccullough@scitekanswers.com

Results of a dye release in the freshwater portion of the Delaware Estuary provide a basis for estimates of the rate of dispersion in a tidally forced reach in the Philadelphia region of this heavily urbanized estuary. These estimates are intended to inform hydrodynamic and water quality model refinements for use in further numerical investigations of tidally induced dispersion, and also for environmental regulatory compliance assessments. Along-channel dispersion of the dye patch appeared to increase over time, with estimates exceeding that of vertical shear-induced dispersion, suggesting that other processes contribute to the along-channel spread of the dye. Possible candidates for the elevated dispersion are lateral shear dispersion and tidal trapping of the dye in channel irregularities such as the "corrugated" shoreline associated with urban Philadelphia. Successful numerical modeling in this region of the tidal river likely requires representing these lateral processes to adequately capture the dispersive nature of this system. While the model domain is largely within the tidal-fresh upper estuary, the lower domain contains the oligohaline stretch where low-flow events produce evidence of tidally correlated exchange flow. These events potentially result in up-estuary transport at lower depths. A 3D hydrodynamic model will be used to further investigate the potential roles that lateral processes and buoyancy play in longitudinal dispersion.

(8B) Ice Forecasting in the Next-Generation Great Lakes Operational Forecast System (GLOFS)

Eric Anderson, NOAA/GLERL, eric.j.anderson@noaa.gov, Ayumi Fujisaki-Manome, University of Michigan, ayumif@umich.edu, James Kessler, University of Michigan, jamkessl@umich.edu, Philip Chu, NOAA/OAR/GLERL, philip.chu@noaa.gov, John G.W. Kelley, NOAA/OCS/CSDL, john.kelley@noaa.gov, Yi Chen, NOAA/OCS/CSDL, yi.chen@noaa.gov, Greg Lang, NOAA/OAR/GLERL, gregory.lang@noaa.gov, Jia Wang, NOAA/OAR/GLERL, jia.wang@noaa.gov

The next-generation Great Lakes Operational Forecast System (GLOFS) is currently under development, using the Finite Volume Community Ocean Model (FVCOM), to improve hydrodynamic predictions in the Great Lakes and connecting channels as well as to support ecological forecasts of harmful algal blooms (HAB Tracker) and hypoxia. Along with the upgrade to the operational hydrodynamic models, we are working on adding the capacity for short-term ice forecasting for the Great Lakes using the Los Alamos CICE model. Although CICE has been applied to the Arctic Ocean (Gao et al., 2008), it has only recently been applied to ice formation in freshwater lakes. In the past few years, extensive validation and tuning of the FVCOM-Ice model has been carried out for the Great Lakes to find adequate values for dynamic and thermodynamic parameters (e.g., turning angle, ice categories). Using the CICE model coupled with FVCOM (i.e., FVCOM-Ice), short-term ice simulation is evaluated for the Lake Erie Operational Forecast System (LEOFS) and the Lake Michigan-Huron Operational Forecast System (LMHOFS). The goal of this project is to improve forecast guidance of water levels, water temperature, and currents during winter months and early spring, and to provide the first-ever ice forecast guidance of extent, concentration, thickness, and velocity in the Great Lakes. Model simulations have been carried out for several hindcast years, with comparisons made to in-situ ice measurements and ice charts from the National Ice Center (NIC). Preliminary results show reasonable ability to reproduce the ice field in most years, though particular sensitivity to the accuracy of atmospheric forcing conditions and ice model parameterization is noticeable in low-ice years. As part of this work we are also developing skill assessment criteria for short-term ice forecasting through engagement with the OFS user community.

(8B) A Community-based Modeling Approach in Support of Integrated Water Prediction

Carolyn Lindley, National Ocean Service/CO-OPS, carolyn.lindley@noaa.gov, Aijun Zhang, National Ocean Service/CO-OPS, aijun.zhang@noaa.gov, Patrick Burke, National Ocean Service/CO-OPS, pat.burke@noaa.gov, Edward Myers, National Ocean Service/OCS, edward.myers@noaa.gov, Becky Baltes, National Ocean Service/IOOS, becky.baltes@noaa.gov, Wayne Litaker, National Ocean Service/NCCOS, wayne.litaker@noaa.gov, Lonnie Gonsalves, National Ocean Service/NCCOS, lonnie.gonsalves@noaa.gov

NOAA's National Ocean Service (NOS) is implementing a community-based modeling approach to create the Federal infrastructure backbone to support its many mission requirements. Community modeling combines elements of open source software development with a governance process to ensure that the latest mature science is incorporated into the numerical models used in operations. An emerging priority within NOAA is the NOAA Water Initiative, whose objective is to create and deliver integrated, or total water, information to address stakeholder requirements around inland and coastal water issues. These issues can be both complex and variable, ranging from navigation services to coastal inundation and ecological forecasting. To address them, NOS will leverage a unified modeling framework and standards to facilitate the transition of new models and system upgrades. This includes streamlined processes for sharing data sets and model code. Coordinated stakeholder engagement to identify these variable requirements (e.g., navigation services, flooding and inundation, ecological forecasting and water quality, search and rescue, spill response) that can be supported within the same coastal ocean modeling framework will facilitate improved and efficient delivery of model products. To ensure operational models can evolve to support more mission requirements, opportunities for engagement being sought with the external modeling community include: 1) the transition of mature hydrodynamic models to fill geographic coverage gaps; 2) improving algorithms (mixing, wetting and drying, etc.) and model performance for critical parameters such as salinity, to better support numerous NOS applications; 3) the provision of additional observations needed for data assimilation and validation; and 4) the development of a sustained regional modeling framework that incorporates coupling with the NWS National Water Model. Perspectives on possible mechanisms to facilitate the transition of new models and upgrades to NOAA, and on processes to share data sets and model code, will be provided.

(8B) EFDC-MPI: An open source code for parallel simulation of hydro-environmental processes with data assimilation

Scott C. James, Baylor University, sc_james@baylor.edu, Frank Suits, IBM TJ Watson Research Center, suits@us.ibm.com, Jian Shen, Virginia Institute of Marine Science, shen@vims.edu, Zhen-Gang Ji, Bureau of Ocean Energy Management, jeff.ji@boem.gov, Emanuele Ragnoli, IBM Research - Ireland, eragnoli@ie.ibm.com

Environmental Fluid Dynamics Code (EFDC) is a general-purpose modelling package for simulating flow, transport, and biogeochemical processes in surface-water systems. Originally developed by Dr. John Hamrick at the Virginia Institute of Marine Science (with primary support from the State of Virginia), it has evolved to become one of the most widely used and comprehensive codes for simulating estuarine and coastal ocean processes. It solves the three-dimensional, vertically hydrostatic, free-surface, turbulence-averaged equations of motion for a variable-density fluid. Dynamically coupled transport equations for turbulent kinetic energy, turbulence length scale, salinity, water-quality constituents, and temperature are also solved. The EFDC code was originally released by the US EPA in 1997, with periodic developments in the intervening period. At IBM Research Ireland, the hydrodynamic code has been extended with several capabilities focused on real-time operational forecasting, primarily consisting of: 1) MPI parallelization using a domain decomposition approach with targeted load balancing, and 2) data-assimilation modules applied to High Frequency Radar (HFR) flow measurements. These build on capabilities developed by Sandia National Laboratories related to sediment dynamics and the representation of marine hydrokinetic devices, which were open-sourced in 2010. The open-source code is available on github: https://github.com/fearghalodonncha/EFDC-MPI.git. This talk provides an overview of the EFDC code with specific focus on the more recent developments from IBM Research Ireland related to MPI parallelization and data assimilation. We present details on porting an existing serial code to parallel using a strategy based on surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The impact of semi-implicit solvers involving global reductions is assessed, together with approaches that efficiently handle the mixed water/land regions commonly found in coastal simulations. We present details on a computationally efficient data-assimilation scheme based on a linear state estimator with stationary covariance operators, where the covariance matrices are not computed from the dynamical equations but are predetermined based on assumptions about their covariance structures. Several case-study applications are presented, including Chesapeake Bay, MD, and Cobscook Bay, ME, and performance metrics for these applications are provided. Finally, the talk assesses the components of real-time operational systems for forecasting and monitoring coastal and marine conditions that best exploit compute resources and the combined descriptive skill of sensors and models to produce a forecast that is sufficient for operational purposes.
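
As a rough sketch of what a linear state estimator with stationary covariance operators can look like, the example below precomputes a gain from an assumed Gaussian covariance structure and reuses it at every assimilation step; the covariance form, length scale, and error variances are assumptions for illustration, not the EFDC-MPI parameters.

    import numpy as np

    def build_stationary_gain(model_xy, obs_xy, sigma2=1.0, length_km=10.0, obs_var=0.25):
        """Precompute a stationary gain K = P H^T (H P H^T + R)^-1 from an assumed
        Gaussian covariance; model_xy: (n, 2) model point coords (km),
        obs_xy: (m, 2) HFR observation coords (km)."""
        d_xo = np.linalg.norm(model_xy[:, None, :] - obs_xy[None, :, :], axis=2)
        d_oo = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=2)
        PHt = sigma2 * np.exp(-(d_xo / length_km) ** 2)     # state-obs covariance
        HPHt = sigma2 * np.exp(-(d_oo / length_km) ** 2)    # obs-obs covariance
        return PHt @ np.linalg.inv(HPHt + obs_var * np.eye(len(obs_xy)))

    def assimilate(state, gain, obs, model_at_obs):
        """Analysis step x_a = x_f + K (y - H x_f), reusing the precomputed gain."""
        return state + gain @ (obs - model_at_obs)

Because the gain is fixed, the per-step cost reduces to a matrix-vector product, which is what makes schemes of this type attractive for real-time forecasting.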

(9A) Streamline Upwind Petrov-Galerkin (SUPG) Based Shallow Water Model for Large Scale Geophysical Flows in Cartesian and Spherical Coordinates

Gaurav Savant, Dynamic Solutions LLC, gaurav.savant@erdc.dren.mil, Corey J. Trahan, US Army Corps of Engineers, Corey.Trahan@erdc.dren.mil, Tate O. McAlpin, US Army Corps of Engineers, Tate.O.McAlpin@erdc.dren.mil

The development and implementation of a stabilized finite element model for the simulation of large-scale geophysical flows using the Shallow Water Equations (SWE) will be presented. The model is derived from the mass- and momentum-conservative forms of the SWE, with wetting and drying implemented using a front-tracking algorithm. Transient hydrodynamic phenomena are resolved using run-time h-mesh and time-step adaption, so that the initial grid resolution need only capture bathymetric features. Cartographic mapping is used to allow the use of Cartesian master elements when the meshing is performed in spherical coordinates. Model validation for the US East and Gulf coasts will be presented. Additionally, model results will be presented for the Pacific Ocean and the northern Indian Ocean.

(9A) High-order transport scheme for unstructured-grids and its application in a cross-scale simulation coupling the eddying ocean and coastal waters

Fei Ye, VIMS, feiye@vims.edu, Joseph Zhang, VIMS, yjzhang@vims.edu, Harry V. Wang, VIMS, wang@vims.edu, Zhengui Wang, University of Maine, zhengui.wang@maine.edu

We present some recent developments in the SCHISM modeling system (Semi-implicit Cross-scale Hydroscience Integrated System Model; schism.wiki). Despite the recent success of our unstructured-grid (UG) model in simulating multi-resolution baroclinic processes, its skill in the adjacent eddying ocean still needs improvement before its cross-scale potential can be fully realized. To fill this gap, we have been developing new techniques for SCHISM, with the goal of seamlessly simulating baroclinic processes across the tributary-bay-ocean continuum. One of the issues is that the transport schemes commonly used for strongly forced estuaries are too dissipative for the weakly forced eddying ocean. For example, a 2nd-order transport scheme (which achieves high skill for baroclinic processes in the Chesapeake Bay) tends to produce a diffuse Gulf Stream signature, even with a high-order momentum advection scheme tailored to the eddying regime. To fix this issue, a 3rd-order transport scheme based on the Weighted Essentially Non-Oscillatory (WENO) formulation is developed to accurately and efficiently resolve mesoscale features. An application to the Northwestern Atlantic Ocean shows that the new scheme leads to a stronger and more realistic eastward penetration of the Gulf Stream into the deep ocean. The meanders and eddies associated with the Gulf Stream are also captured by the new scheme. Furthermore, the model's cross-scale capability and its robustness under complex bathymetry are illustrated by local refinements near a submarine canyon. We show that model efficiency is largely unaffected by resolving these features. Therefore, with the newly introduced techniques, the model serves as a powerful tool for investigating interrelated processes on a wide range of spatial scales in the eddying and non-eddying regimes.
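
For readers unfamiliar with WENO, the sketch below gives the standard third-order reconstruction on a 1-D structured grid; it illustrates the weighting idea only, and the unstructured-grid generalization used in SCHISM differs in its stencil selection.

    def weno3_left(um1, u0, up1, eps=1e-6):
        """Third-order WENO reconstruction of the left state at the i+1/2 face
        from cell averages u[i-1], u[i], u[i+1]."""
        p0 = -0.5 * um1 + 1.5 * u0          # candidate from the upwind stencil
        p1 = 0.5 * u0 + 0.5 * up1           # candidate from the centered stencil
        b0 = (u0 - um1) ** 2                # smoothness indicators
        b1 = (up1 - u0) ** 2
        a0 = (1.0 / 3.0) / (eps + b0) ** 2  # nonlinear weights from linear weights 1/3, 2/3
        a1 = (2.0 / 3.0) / (eps + b1) ** 2
        return (a0 * p0 + a1 * p1) / (a0 + a1)

    # Near a discontinuity the weight shifts to the smoother stencil, suppressing oscillations.
    print(weno3_left(1.0, 1.0, 0.0))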

(9A) A Hybrid Lagrangian-Eulerian Particle Model for Ecosystem Simulation in Sandusky Bay

Pengfei Xue, Michigan Tech, pexue@mtu.edu, Xinyu Ye, Michigan Tech, xinyuy@mtu.edu, Chenfu Huang, Michigan Tech, chenfuh@mtu.edu, Xing Zhou, Michigan Tech, xingzhou@mtu.edu

Situated on Lake Erie's southwestern shore in Ohio, Sandusky Bay is bordered by Ottawa, Sandusky, and Erie counties. The area is a mainstay of the northern Ohio economy because of its tourism and fishing industries. Today, approximately 80% of the Sandusky River watershed is devoted to agricultural purposes, loading the river with high concentrations of phosphorus. With these high phosphorus loads, algae can utilize the nutrient and grow very quickly, producing algal blooms. To date, hydrodynamic modeling of Sandusky Bay has been relatively limited; we have developed a high-resolution FVCOM model for this region. In this work, we utilize a new technique that considers hydrodynamic effects and biological processes by integrating a property-carrying particle model (PCPM) and an Eulerian concentration-based biological model for ecosystem modeling. Results show that the integration of Lagrangian and Eulerian approaches allows for a very natural coupling of mass transport (represented by particle movements and random walk) and biological processes in the water column, which are described by a common vertical 1-D biological model. This method is considered to be far more efficient than traditional tracer-based Eulerian bio-physical models for 3-D simulation, particularly for large ensemble simulations.
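
As a sketch of the Lagrangian-Eulerian hand-off described above, the example below bins property-carrying particles into a vertical column so that a 1-D biological model could operate on layer means; it illustrates the coupling idea only and is not the PCPM implementation.

    import numpy as np

    def particles_to_column(z_particles, prop_particles, z_edges):
        """Average particle-carried properties into vertical Eulerian layers.
        z_particles: particle depths (m); prop_particles: carried property values;
        z_edges: layer interfaces (m), increasing downward."""
        idx = np.digitize(z_particles, z_edges) - 1     # layer index of each particle
        nlay = len(z_edges) - 1
        means = np.full(nlay, np.nan)
        for k in range(nlay):
            in_layer = prop_particles[idx == k]
            if in_layer.size:
                means[k] = in_layer.mean()
        return means

    # Example: three particles mapped onto two layers bounded by 0, 1, and 2 m
    print(particles_to_column(np.array([0.5, 1.5, 1.8]), np.array([2.0, 4.0, 6.0]), np.array([0.0, 1.0, 2.0])))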

(9B) Circulation patterns for the North Sea at the end of the 21st century

Diana Martins, Marum, Bremen University, dazevedo@marum.de, Andre Paul, Marum, Bremen University, apaul@marum.de, Mark Hadfield, NIWA, Mark.Hadfield@niwa.co.nz, Michael Schulz, Marum, Bremen University, mschulz@marum.de

Climate change projections for the North Sea have followed different strategies and used various models, both global and regional, coupled and uncoupled. Due to the significant annual-to-decadal natural climate variability within this enclosed shelf sea region, studying the effects and impacts of climate change is challenging. Still, there is general agreement that the entire North Sea is warming and that sea level is rising. There is nonetheless a good deal of uncertainty regarding what may happen to the frequency of extreme events, the general circulation, and the salinity distribution. Our work aims to add another piece to the puzzle by investigating patterns in the ocean circulation at the end of the 21st century and using these predictions to infer future transport pathways in the region. These will affect sediment transport patterns and the functioning of the local ecosystem, among other things. We use a regional coupled atmosphere-ocean model (COAWST) at 10 km resolution for the Representative Concentration Pathway 8.5 (RCP8.5) scenario given by the Intergovernmental Panel on Climate Change (IPCC) Assessment Report 5. The chosen scenario corresponds to an increase in radiative forcing of 8.5 W m-2 between the years 1750 and 2100. We analyse two time periods, 2008-2020 and 2088-2100. The boundary and initial conditions for the ocean and atmospheric components are taken from the Community Climate System Model 4 (CCSM4), a global coupled climate model. In our study the Baltic Sea is not resolved; to properly describe the exchange between the North Sea and the Baltic Sea, the open eastern boundary was forced by adding the transient CCSM4 anomaly to climatological data. The results reveal the extent to which the circulation in the North Sea can be expected to change by the end of the 21st century. Furthermore, we present an evaluation of the uncertainties behind the results, along with an assessment of the degree of confidence in the projections. Information about the uncertainty in climate change modeling is vital for bridging the gap between science and policy making, and for improving the tools used for climate change and risk assessments.

(9B) Effects of Climate Change and Port Development on the Fraser River Estuary

Albert Leung, Tetra Tech, albert.leung@tetratech.com, Jordan Matthieu, WSP, Jordan.Matthieu@wsp.com, Jim Stronach, Tetra Tech, Jim.Stronach@tetratech.com

Agriculture is an important industry in the Province of British Columbia, especially in the Lower Mainland, where fertile land in the Fraser River Delta, combined with the enormous water resources of the Fraser River Estuary, supports extensive commercial agriculture, notably berry farming. However, it is also this very same estuary, with its unique physical setting where freshwater from inland meets saltwater from the Strait of Georgia, that could pose challenges to maintaining the health of the farming industry. One of these challenges is the anticipated decrease in the availability of sufficient freshwater from the river for irrigation. The main driver for this change is climate change, which leads to sea level rise and to reductions in river flow at key times of the year. Dredging the navigational channel deeper to allow bigger and deeper-draft vessels in the river may also affect the availability of fresh water for irrigation. This work, undertaken by Tetra Tech, examined the effects of changes in environmental conditions (sea level rise and river flow changes as a result of climate change) and anthropogenic conditions (channel deepening) on the behaviour of the salt wedge and the salinity distribution in the Fraser River. Climate change scenarios were based on the Fraser River flow rate predicted by the MIROC Global Circulation Model, selected for its demonstrated ability to closely hindcast the historical flow in the river, especially for the irrigation period between August and October. The projected flow rates from the MIROC model were used for the long-term salinity impact assessment, and the months of August, September and October were simulated, as these are the months when irrigation water is required and when predicted future river flow rates could drop considerably below current flow rates, allowing higher-salinity water to reach the current intake location. The near-term impact was assessed by considering two different, but constant, low flow values (1,000 m3/s and 2,000 m3/s) over a one-month fall period. Both long-term and near-term sea level rise as well as channel dredging scenarios were evaluated. The long-term sea level rise in the study is 1 m for a time frame between 2050-2100 and 2 m for a time frame between 2100-2200, and a sea level rise of 0.3 m was applied for the near-term impact (a time frame of 10-25 years). The long-term dredge depths are such that the channel can accommodate vessels drafting 16.5 m and 20.0 m, while the near-term dredge depth is such that the channel can accommodate vessels drafting 13.5 m. Currently, the channel can accommodate vessels up to 11.5 m draft. In this study, the salinity in the river was simulated using H3D, a proprietary three-dimensional hydrodynamic numerical model developed specifically for the Fraser River, which computes the three components of velocity (u, v, w) in three dimensions (x, y, z) on a curvilinear grid, as well as scalar fields such as salinity and temperature.

(9B) Pathways of Greenland freshwater in coastal regions

Dmitry Dukhovskoy, FSU, ddukhovskoy@fsu.edu

Accelerating melt of the Greenland Ice Sheet has resulted in a cumulative surplus freshwater flux exceeding 6500 km3 since the early 1990s. The volume of the Greenland freshwater flux anomaly is more than half of the freshwater flux into the North Atlantic during the Great Salinity Anomaly of the 1970s, which is estimated at 10,000 km3. At the same time, there is no definite evidence of freshening in the subarctic seas that could be attributed to the increased Greenland freshwater flux during the last decades. Where does the surplus freshwater end up in the subarctic seas? To address this question, Greenland freshwater pathways in the North Atlantic are analyzed using numerical experiments with passive tracers tracking the propagation of freshwater from Greenland freshwater sources. The simulations are performed with a coupled Arctic Ocean HYCOM-CICE at 0.08- and 0.04-degree resolution. A passive tracer is released continuously during the simulation at the freshwater sources along the Greenland coast. The presentation discusses freshwater pathways and their seasonal and interannual variability in the Greenland coastal regions. Specifically, model results demonstrate a strong influence of wind regimes over the Greenland coast on freshwater transport and vertical mixing, which determines the propagation of the freshening signal to offshore regions. The impact of the Greenland freshwater flux anomaly on convection processes in the subarctic North Atlantic will also be discussed based on the model results.

(10A) A Mass-Conservative Finite-Element Numerical Code for Three Dimensional Flows

Gaurav Savant, Dynamic Solutions LLC, gaurav.savant@erdc.dren.mil

A variety of hydrodynamic and transport codes based on the solution of the shallow water equations exist. These codes are predominantly based on the finite difference (FD) or finite volume (FV) method, though finite element method (FEM) codes are gaining in popularity. This presentation will describe the development and application of an unstructured, mass-conservative, three-dimensional (3D) FEM code for fluid flow and constituent transport in multi-scale flows. The verification, validation, and applications of the 3D model will then be presented.

(10A) Novel Data Assimilation Technique for Operational Storm Surge Modeling

Taylor Asher, UNC Chapel Hill, taylorasherres@gmail.com, Rick Luettich, UNC Chapel Hill, rick_luettich@unc.edu, Jason Fleming, Seahorse Coastal Consulting, jason.fleming@seahorsecoastal.com, Brian Blanton, Renaissance Computing Institute, bblanton@renci.org

Storm surge simulations are performed using high-resolution models that emphasize efficient computation and representation of physical processes such as atmospheric forcing and tidal fluctuations. The neglect of more gradual and three-dimensional physical processes can lead to errors in resolving secondary processes affecting coastal water levels, such as rainfall runoff, ocean currents, and the steric effect. Fluctuations in the Gulf Stream, in particular, have been shown to alter U.S. Atlantic coastal water levels by tens of centimeters. Incorporating these effects with additional model physics can prove challenging and substantially increases the computational cost of model execution. This is particularly problematic for operational usage of these models, where many ensemble members with alternate storm tracks and intensities need to be run in a short time frame. We instead propose a data assimilation method to account for such water level fluctuations. The method is applicable using currently available data and, unlike traditional ensemble Kalman filter-based methods, increases the computational burden by less than 10%. This talk will demonstrate the approach and the effectiveness of the water level data assimilation method for the case of Hurricane Matthew (2016). Method robustness, computational cost, and flexibility will be demonstrated, while showcasing its ability to accurately compensate for limitations in storm surge models during active forecasting. The technical limitations of the technique and potential future directions will also be discussed.
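
The abstract does not detail the assimilation algorithm itself, so the sketch below only illustrates one generic, low-cost way to fold observed water levels into a surge forecast: compute observed-minus-modeled offsets at a few gauges and spread them over the mesh with inverse-distance weighting. The gauge locations and offsets are hypothetical, and this is not necessarily the authors' method.

import numpy as np

def idw_correction(node_xy, gauge_xy, offsets, power=2.0, eps=1.0e-6):
    """Inverse-distance-weighted water-level correction at every mesh node."""
    d = np.hypot(node_xy[:, 0, None] - gauge_xy[None, :, 0],
                 node_xy[:, 1, None] - gauge_xy[None, :, 1])
    w = 1.0 / (d + eps) ** power
    return (w * offsets).sum(axis=1) / w.sum(axis=1)

# offsets = observed minus modeled water level at each gauge (m); values are hypothetical
gauge_xy = np.array([[0.0, 0.0], [50.0e3, 10.0e3], [120.0e3, -5.0e3]])
offsets = np.array([0.18, 0.22, 0.10])
node_xy = np.random.default_rng(0).uniform(-50e3, 150e3, size=(1000, 2))

corrected = idw_correction(node_xy, gauge_xy, offsets)   # add to the modeled water levels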

(10B) VDatum Spatially Varying Uncertainty in the Northeastern Gulf of Mexico

Liujuan Tang, NOAA/CSDL, ERT, liujuan.tang@noaa.gov, Edward P. Myers, NOAA/CSDL, edward.myers@noaa.gov, Lei Shi, NOAA/CSDL, l.shi@noaa.gov, Kurt Hess, NOAA/CSDL, kurt.hess@noaa.gov, Alison Carisio, NOAA/CO-OPS & Lynker, alison.carisio@noaa.gov, Michael Michalski, NOAA/CO-OPS, michael.michalski@noaa.gov, Stephen White, NOAA/NGS, stephen.a.white@noaa.gov

A VDatum spatially varying uncertainty application was developed for the Northeastern Gulf of Mexico. Built on the ADCIRC model, the updated tidal model incorporates the latest bathymetry and shoreline data. The modeled tidal datums are used in combination with a spatially varying interpolation technique to provide a final set of tidal datum fields that match exactly at the 182 tide station locations where observations are available. The study points out that accurate, high-resolution water depths in coastal areas are essential for improving model accuracy; this is especially important for shallow and narrow rivers, and is also reflected in the improved tidal boundary conditions from the EC2015 tidal database. The tidal model grid also needs to resolve well the narrow, breakwater-protected bay entrances. For tide stations located in the intracoastal waterway, additional attention is needed for the multiple connecting channels. Both model error and lack of observations can contribute to large uncertainty; the latter can be reduced by the placement of new observations. The study also shows that a gridding technique based on the wavelength of long waves can improve model efficiency. The spatially varying uncertainty results provide more accurate representations of the uncertainty in the datum products. The results will also be used to help with decision-making on the placement of new tide gauges to further reduce the uncertainty in the VDatum products.

(10B) Modeling tidal datums and spatially varying uncertainties in Texas and western Louisiana

Wei Wu, NOAA/CSDL, wei.wu@noaa.gov, Edward P. Myers, NOAA/CSDL, edward.myers@noaa.gov

Tidal datums are key components of NOAA's Vertical Datum transformation project (VDatum), which enables effective transformation of vertical water elevations between tidal, orthometric, and ellipsoid-based three-dimensional reference systems. An application of VDatum was initially developed for the coastal waters of Texas and western Louisiana in 2013. The major thrusts of this work include: 1) using the Advanced Circulation (ADCIRC) hydrodynamic model to update tidal datums in this model domain; and 2) using a recently developed data assimilation scheme based on a variational principle, the Spatially Varying Uncertainty (SVU) scheme, to correct the modeled tidal datums and produce the associated uncertainties. Model mesh grids and bathymetry were first updated using the newest available data. ADCIRC-modeled water level time series at each model grid point were used to compute tidal datums. Modeled tidal datums were then validated by comparison with observations at the seventy-five available gauge stations. Large (greater than 10 cm) model biases were reduced by adjusting or extending model grids. The modeled tidal datums were corrected using the SVU data assimilation scheme, which is able to correct modeled tidal datums to within 1 cm at the available gauge stations and to provide the spatially varying uncertainty of the corrected tidal datums across the model domain. An important physical phenomenon, "non-tidal zones", was revealed during the modeling and data analysis. The modeled non-tidal zones (defined as MHW-MLW less than 9 cm) provide useful information for adjusting the existing best-guess non-tidal zones produced and maintained by NOAA's Center for Operational Oceanographic Products and Services (CO-OPS). This information is also essential for enhancing the quality of the VDatum marine grid population for the final VDatum products.
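
The following minimal Python sketch shows how tidal datums and the non-tidal-zone criterion described above can be derived from a water-level time series: average the local maxima and minima into MHW and MLW, and flag the location as non-tidal when their difference falls below 9 cm. The series here is synthetic, and operational datum computations follow NOAA's tabulation procedures rather than this simplification.

import numpy as np

def tidal_datums(eta):
    """Return (MHW, MLW) estimated from an hourly water-level series."""
    interior = eta[1:-1]
    highs = interior[(interior > eta[:-2]) & (interior > eta[2:])]   # local maxima = high waters
    lows = interior[(interior < eta[:-2]) & (interior < eta[2:])]    # local minima = low waters
    return highs.mean(), lows.mean()

t = np.arange(0.0, 30 * 24.0)                                        # 30 days, hourly
eta = 0.3 * np.cos(2 * np.pi * t / 12.42) + 0.1 * np.cos(2 * np.pi * t / 12.0)  # synthetic tide

mhw, mlw = tidal_datums(eta)
non_tidal = (mhw - mlw) < 0.09                                       # "non-tidal zone" criterion
print(f"MHW = {mhw:.3f} m, MLW = {mlw:.3f} m, non-tidal = {non_tidal}")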

(10B) Ocean Energy: A Review of Approaches Affecting Site Selection

Surupa Shaw, Texas A&M University, surupashaw@tamu.edu

Providing energy for a growing population will pose a future challenge as far as sustainable energy resources are concerned. The ocean is a renewable resource that is replenished on human timescales, and hence an excellent choice as a sustainable source of renewable energy. The ocean can be harnessed to produce thermal energy from solar heating, and mechanical energy from the tides and the waves. Mechanical energy production can be sporadic in comparison with the relatively constant thermal energy production available from the ocean. The ocean becomes an exceptional source of energy when thermal and mechanical energy are effectively harnessed from appropriate ocean sites while ensuring the safety of the marine ecosystem. Environmental guidelines provide the foundation for understanding the environmental impacts that should be avoided where possible for the continued good health of the marine environment. The deployment of renewable technologies in the ocean can lead to some unavoidable and unpleasant effects, but other, avoidable effects can be minimized through careful understanding of these guidelines. The choice of ocean site for harnessing energy that will meet the criteria for clean renewable energy is one of the key parameters for successful and continued production of energy from the ocean. The current work targets the ocean environment as a promising resource to assist in the effort of tackling the world's energy issues. The history of research on the requirements and consequences of enhancing existing renewable technologies in the ocean is traced in this paper. This study provides an overview of the key parameters responsible for the appropriate choice of an ocean energy harnessing site, which will serve as a reference for the rational deployment of future renewable technologies. Ocean energy from waves, tides, and thermal gradients is a powerful source of abundant, clean, renewable energy, and the essential information needed for converting ocean energy into a usable form is presented in this paper.

(11A) Modeling Storm Surge and Waves in the Salish Sea

Zhaoqing Yang, Pacific Northwest National Laboratory, zhaoqing.yang@pnnl.gov, Wei-Cheng Wu, PNNL, wei-cheng.wu@pnnl.gov, Taiping Wang, PNNL, taiping.wang@pnnl.gov, Luca Castrucci, PNNL, luca.castrucci@pnnl.gov, Ruby Leung, PNNL, ruby.leung@pnnl.gov

The Pacific Northwest coasts are subject to the threat of coastal inundation as a result of storm surge and extreme waves. Accurate assessment of coastal risk depends on detailed and accurate information on sea level, including sea level rise and the waves and storm surge induced by windstorms. This presentation provides an overview of a modeling study of storm surge and waves in the Salish Sea using unstructured-grid coastal hydrodynamic and wave models, FVCOM and UnSWAN. To assess the risk of storm surge, a series of historical storm surge events was identified based on non-tidal residual (NTR) water levels observed at the Seattle tide gauge. Model simulations corresponding to the selected storm surge events were conducted. The Salish Sea storm surge model was validated with both observed tidal and NTR data at NOAA tide gauges in the Salish Sea. The sensitivity of the results to wind forcing and the open boundary conditions was investigated. Model results indicated that storm surge within the Salish Sea is dominated by the open boundary condition at the entrance of the Strait of Juan de Fuca, with local wind forcing playing a secondary role. The Salish Sea wave model (UnSWAN) was driven by spectral open boundary conditions from nested regional WW3 models and wind forcing from a Weather Research and Forecasting (WRF) model hindcast covering the entire west coast at 6-km resolution. Comparisons of model results with observed wave data from 2011 to 2015 at available wave buoys indicated that the model successfully reproduced the wave climate in the Salish Sea. Wave characteristics and the distribution of the top 1% significant wave height in the Salish Sea were analyzed and mapped based on the model results.
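
As an illustration of the event-selection step, the short Python sketch below computes the non-tidal residual (NTR) as observed water level minus predicted tide and flags the largest residuals as candidate storm surge events. The series and the percentile threshold are illustrative assumptions, not the values used in the study.

import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 90 * 24.0)                           # 90 days, hourly
tide = 1.2 * np.cos(2 * np.pi * t / 12.42)              # predicted (astronomical) tide, m
observed = tide + 0.15 * rng.standard_normal(t.size)    # synthetic observations with weather noise
observed[1000:1012] += 0.8                              # an embedded surge event

ntr = observed - tide                                   # non-tidal residual
threshold = np.percentile(ntr, 99.5)                    # e.g., top 0.5% of residuals
event_hours = np.flatnonzero(ntr > threshold)
print(f"NTR threshold = {threshold:.2f} m, {event_hours.size} exceedance hours")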

(11A) Assessment of two wave models on a coastal breakwater

Amin Ilia, University of Connecticut, amin.ilia@uconn.edu

Breakwaters influence coastal wave climate and circulation by blocking and dissipating wave energy. Assessment and quantification of the dissipated wave energy are essential to the determination of coastal circulation and processes. Numerical models, used widely in recent years, are a practical means of studying coastal wave climatology. Mike21 SW and SWAN are two third-generation spectral wave models that are widely used in coastal research and engineering applications. Recent versions of both models have been improved to account for the influence of breakwater structures in the modeling domain. However, the effectiveness and precision of the models in simulating wave diffraction and dissipation around breakwater structures have not been examined. In this study, we compare the performance, accuracy, and efficiency of the models in simulating the wave climatology around a breakwater on the Connecticut coast of Long Island Sound. Both models are executed with the same configuration and grid, as both work with unstructured triangular grids. The parameterized boundary conditions are derived from an ADCP mounted on the offshore side of the breakwater and from Central Long Island Sound buoy data. The sensitivity of the models to selected parameters and physical processes is examined. The model results are compared with in-situ observations, and the model that is more consistent with the observations is identified.

(11B) Coastal Modeling in Support of Light Rail Operations on the I-90 Floating Bridge

Christopher Day, Mott MacDonald, Christopher.Day@mottmac.com, Brian Holloway, Sound Transit, Brian.Holloway@soundtransit.org, Francis Salcedo, Mott MacDonald, Frank.salcedo@mottmac.com, Abhishek Sharma, Mott MacDonald, Abhishek.Sharma@mottmac.com, Vladimir Shepsis, Mott MacDonald, Vladimir.Shepsis@mottmac.com

The Sound Transit light rail link from Seattle to Bellevue, WA has been designed and will be constructed over the I-90 floating bridge on Lake Washington. Lake Washington is a deep body of water (up to 183 m depth) connected with Puget Sound through a system of navigation locks, extending approximately 29 km from north to south. Under strong northerly or southerly winds, the waves in the lake can be quite significant. The bridge crossing's design established two threshold wave conditions (wave heights in combination with wind speeds) beyond which light rail service would be reduced or suspended to avoid exceeding the bridge pontoons' stress limits. Thus, a light rail operational procedure was established to reduce or suspend train service when the wave threshold conditions are exceeded. Sound Transit initiated an engineering study to develop a reliable warning system that determines real-time wave conditions at the I-90 Bridge and alerts light rail train operators when these conditions exceed the threshold criteria. The engineering study includes collection of data (wind, waves, and bathymetry at Lake Washington), two-dimensional numerical modeling of wave generation and transformation along the lake, and validation of the numerical modeling results against the measured wave data. Based on the modeling and analysis results, software was developed that evaluates wave conditions at the I-90 Bridge in real time, using output from the wind measuring stations at Lake Washington. This software was incorporated into the warning system software that alerts light rail train operators when wave conditions exceed the threshold criteria and partial or complete closure of train operations on the bridge is required. The warning software has been installed at Sound Transit headquarters and is currently undergoing testing and adaptation.
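
A minimal sketch of the kind of threshold logic such a warning system applies is shown below: given a modeled wave height and a measured wind speed, return an operating state for the light rail. The numeric thresholds are placeholders, not the actual design criteria established for the bridge.

from dataclasses import dataclass

@dataclass
class Thresholds:
    reduce_hs: float = 0.9      # significant wave height (m) to reduce service (placeholder)
    suspend_hs: float = 1.2     # significant wave height (m) to suspend service (placeholder)
    reduce_wind: float = 15.0   # paired wind speed (m/s) for reduced service (placeholder)
    suspend_wind: float = 20.0  # paired wind speed (m/s) for suspended service (placeholder)

def operating_state(hs, wind, th=Thresholds()):
    """Return 'normal', 'reduced', or 'suspended' for the current conditions."""
    if hs >= th.suspend_hs and wind >= th.suspend_wind:
        return "suspended"
    if hs >= th.reduce_hs and wind >= th.reduce_wind:
        return "reduced"
    return "normal"

print(operating_state(hs=1.0, wind=18.0))   # -> "reduced"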

(11B) Baroclinic Effect on Modeling Deep Flow in Brown Passage, BC, Canada

(Andy) Yuehua Lin, ASL Environmental Science, alin@aslenv.com, David B. Fissel, ASL Environmental Science, dfissel@aslenv.com, Todd Mudge, ASL Environmental Science, tmudge@aslenv.com, Keath Borg, ASL Environmental Science, kborg@aslenv.com

Brown Passage is a deep (up to 200 m) ocean channel connecting the western offshore waters of Hecate Strait on the Pacific continental shelf with the eastern inland waters of Chatham Sound in Northern British Columbia, Canada. Recent research into this oceanic environment is motivated by studies of the possible use of this area for disposal at sea of dredged sediments arising from the expansion of marine terminals in and around Prince Rupert Harbor in Chatham Sound. The ocean currents in Chatham Sound are highly variable due to a combination of forcing by the large tides within this area, winds, and large freshwater discharges from the Skeena and Nass Rivers. Chatham Sound is characterized by lower-salinity near-surface waters on its eastern side due to the Skeena River inflow to Southern Chatham Sound (Lin and Fissel, 2018). More saline waters on the western side of Chatham Sound result from the exchange through Brown Passage and other connecting channels with the higher salinity waters of Hecate Strait. A high-resolution 3D finite-difference numerical model (COCIRM) was developed to determine the tidal and wind-driven currents of this area (Jiang and Fissel, 2010; Lin and Fissel, 2013) and then used to simulate the transport and deposition of sediments released from disposal at sea of marine dredged material. The model results for ocean currents were found to be in reasonably good agreement with the two sets of ocean current observations at 15 and 98 m available for Brown Passage, obtained in 1991. Based on these results, Brown Passage was thought to be generally well mixed through the middle and lower parts of the water column. The first modern oceanographic measurement program carried out to directly determine the near-bottom currents raised questions about the adequacy of the model results for the near-bottom currents in Brown Passage. Using moored bottom instruments operated from October 2014 to April 2015, near-bottom ocean currents at 3.5 m height above the seabed were measured, along with Acoustic Doppler Current Profiler (ADCP) current profile data throughout the full water column. CTD profile measurements of salinity, temperature, and density were also obtained during the deployment and recovery of the bottom-mounted instruments. The measured near-bottom currents were found to be considerably higher than the previous model-derived near-bottom currents. As a result, the COCIRM model was modified and rerun, with the most important change being the use of the observed October 2014 water column density profile. This change led to considerable improvements in the ability of the model to generate episodes of relatively strong currents in the bottom layers, which are in reasonably good agreement with the near-bottom and deeper ocean currents from the 2014-2015 measurement program. The changes to the 3D numerical model had smaller effects on the simulated transport and fate of sediments from disposal at sea operations than on the near-bottom currents.

(11B) Characterization of the Zone of Influence due to a Floating Structure in a Fjordal Estuary - Hood Canal Bridge Impact Assessment

Tarang Khangaonkar, PNNL, tarang.khangaonkar@pnnl.gov, Adi Nugraha, PNNL, adi.nugraha@pnnl.gov, Taiping Wang, PNNL, Taiping.Wang@pnnl.gov

Floating structures such as barges and ships in the path of ambient currents affect nearfield hydrodynamics, create a zone of influence (ZOI), and have the potential to impact basinwide circulation and water quality. The ZOI, defined as the three-dimensional (3-D) space near the structure where ambient water properties are noticeably affected relative to the background, is of particular interest for assessing nearfield environmental impacts associated with water quality and fish passage. In this paper we present an assessment of the nearfield hydrodynamic impact of the Hood Canal Bridge, one of only 11 floating bridges in the world in use today, located within the Hood Canal sub-basin of the Salish Sea, Washington. Hood Canal, a 110-km-long, narrow, fjord-like waterbody, experiences hypoxic conditions in the fall, typical of fjordal estuaries, but there is concern that changes to circulation from the floating bridge may be exacerbating the problem. Similarly, recent acoustic tagging studies indicate that juvenile steelhead migration is significantly slower through the migration segment encompassing the Hood Canal Bridge, and rates of mortality are much greater in proximity to the bridge relative to other areas of Hood Canal and Puget Sound. A field data collection program was implemented to collect currents, salinity, and temperature data near the Hood Canal Bridge. The data provided a direct examination of the variation in current profiles at locations upstream, downstream, and directly below the bridge. The data also allowed nearfield validation of a 3-D hydrodynamic model of Hood Canal, as part of the FVCOM-based Salish Sea Model with the floating bridge section embedded. The floating bridge block was implemented through local refinement and imposition of a continuity block in the layers occupied by the bridge as a simplification. The model reproduced the observed nearfield data collected during this study as well as data collected through Salish Sea-wide monitoring programs already in place. A detailed quantification of the nearfield ZOI was then conducted. The results confirm the hypothesis that the Hood Canal Bridge, with a draft of 4.6 m covering 90% of the width of Hood Canal, obstructs the brackish outflow surface layer. This induces increased local mixing near the bridge, causes pooling of water (up-current) during ebb and flood, results in shadowing/sheltering of water (down-current) during ebb and flood, and creates the associated ZOI. The change in ambient currents, salinity, and temperature is highest at the bridge location and reduces to background levels with distance from the bridge. The ZOI extends 20 m below the surface and varies from 3-4 km for currents, to 2-11 km for salinity, to 1 km for temperature, before the deviations drop to 10% relative to background.

(12A) A Temperature Forecasting System as an Adaptive Management Tool for Optimization of Power Generation

Venkat Kolluru, ERM, venkat.kolluru@erm.com, Shwet Prakash, ERM, shwet.prakash@erm.com, Edward Buchak, ERM, edward.buchak@erm.com, Irene Shang, ERM, Irene.Shang@erm.com

Environmental impacts on surface waters are often assessed using a comprehensive modeling approach to estimate outcomes during construction, typical facility operations, possible accident scenarios, and decommissioning. Exact field conditions under which such events occur are not always available in advance. Adaptive management systems are thus required, where a simplified user-run tool can be applied during the occurrence of such an event to help the client respond and develop best management practices. Such adaptive management systems, Enviromatics, can be an essential part of daily operations. These systems are applied at the facility level using simplified predictive operational management tools that automate the interfacing with complex process models specifically developed for facility operations. A Temperature Forecasting System (TFS) has been developed as part of the Enviromatics program to manage a power plant's cooling water discharge temperature in the receiving water body for thermal regulatory compliance. The TFS uses an Excel-based interface, which runs a calibrated and verified 3-D hydrodynamic and water quality transport model to predict the cooling water discharge temperature in a receiving water body. The TFS is designed to manage heat loads during high power demand periods such that the receiving water body temperature does not exceed the allowable limit imposed by state and federal regulatory thermal water quality standards. The automation process involves downloading AccuWeather/Weather Underground meteorological data and USGS flow and gage data through their APIs, called within Excel, for a user-specified number of forecast days. The downloaded data, after going through a QA/QC process, provide the necessary input for the 3-D hydrodynamic and transport model. The model runs automatically in the background at a time interval set by the user. The user can modify the plant operating conditions during the waiting period for the next simulation using the TFS Excel interface, which is usually customized to each facility's needs. The model-predicted results are compared with the site-specific thermal compliance limits in Excel graphics, which are then submitted to plant operators and managers along with a customized report to support decision making among various scenarios (e.g., energy derating) for efficient plant operation and thermal compliance. To demonstrate the usefulness of this tool, two case studies were selected, describing in detail the process of model calibration and verification followed by simulation of various alternative scenarios to manage power demand and thermal compliance.
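
As an illustration of the automated data retrieval the TFS performs, the Python sketch below pulls recent streamflow from the public USGS NWIS Instantaneous Values web service (the TFS itself calls such APIs from Excel). The site number is an example placeholder, and the QA/QC steps and meteorological feeds are not shown.

import requests

def fetch_usgs_flow(site="01646500", days=7):
    """Return (timestamps, discharge values in cfs) for the last `days` days at a USGS gage."""
    resp = requests.get(
        "https://waterservices.usgs.gov/nwis/iv/",
        params={"format": "json", "sites": site,
                "parameterCd": "00060",            # 00060 = discharge, cubic feet per second
                "period": f"P{days}D"},
        timeout=30,
    )
    resp.raise_for_status()
    series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
    times = [v["dateTime"] for v in series]
    flows = [float(v["value"]) for v in series]
    return times, flows

times, flows = fetch_usgs_flow()
print(f"{len(flows)} values retrieved, latest flow = {flows[-1]:.0f} cfs")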

(12A) "gridded": Grid Agnostic Data Analysis and Visualization of Oceanographic Model Results with Python

Chris Barker, NOAA, chris.barker@noaa.gov

In order to conform to irregular coastlines, coastal circulation models are most often built on non-rectangular model grids. These include the curvilinear grids typically used by finite difference models (e.g. ROMS) and the triangular mesh grids used by finite volume (e.g. FVCOM) and finite element (e.g. ADCIRC) models. In addition to the complexity of the grid itself, different models can produce results on varying parts of the mesh: on nodes, on cells, and staggered (e.g. the Arakawa C-grid). While these varying grid types do an excellent job of allowing models to have effective computational schemes that conform to the boundaries of the domain, they pose complications for post-processing and analysis tools, particularly tools intended to work with a variety of models or for inter-comparison of multiple models that may use different grid schemes. The first step in resolving these complications is to establish data format standards. The CF metadata conventions for netCDF files have been very successful in enabling data interchange, but they do not currently support non-rectangular grid types. Over the years, the community has created conventions to help facilitate this interchange: the UGRID Conventions (http://ugrid-conventions.github.io/ugrid-conventions/) and the SGRID Conventions (http://sgrid.github.io/sgrid/). In order for these conventions to be useful, tools need to be available that understand them and provide functionality for developing analysis and visualization applications that support them. This presentation will introduce the "gridded" Python package. gridded provides a single API that allows users to analyze and visualize data from a variety of model grids. Essentially, a gridded.Dataset provides an abstraction for field variables irrespective of the underlying grid the data are computed on. gridded provides utilities for navigating and interpolating the grid, so that users can work with the data set as a field of variables rather than concern themselves with the intricacies of grid structure. This talk will give a quick overview of the two data conventions, the API provided by the tools, and examples of their use in data analysis, visualization, re-gridding, inter-comparison, and particle tracking.
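
To show what the UGRID convention encodes, and therefore what gridded abstracts away from the user, the following Python sketch locates the mesh-topology variable in a UGRID-compliant netCDF file and reads the node coordinates and face-node connectivity. The file name is hypothetical, and the sketch deliberately uses plain netCDF4 rather than reproducing the gridded API.

import numpy as np
import netCDF4

def read_ugrid_mesh(path):
    """Return (node_x, node_y, faces) from a UGRID-compliant netCDF file."""
    with netCDF4.Dataset(path) as nc:
        # the UGRID mesh is described by a dummy variable with cf_role = "mesh_topology"
        mesh = next(v for v in nc.variables.values()
                    if getattr(v, "cf_role", "") == "mesh_topology")
        x_name, y_name = mesh.node_coordinates.split()
        x = np.asarray(nc.variables[x_name][:])
        y = np.asarray(nc.variables[y_name][:])
        conn = nc.variables[mesh.face_node_connectivity]
        # honor the optional start_index attribute so faces are 0-based
        faces = np.asarray(conn[:]) - int(getattr(conn, "start_index", 0))
        return x, y, faces

# x, y, faces = read_ugrid_mesh("fvcom_output.nc")   # hypothetical file name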

(12B) A Study of Hydrodynamics and Mixing in the Saint John River Estuary Combining Field Data and Numerical Modelling

Enda Murphy, National Research Council of Canada, enda.murphy@nrc-cnrc.gc.ca, Ivana Vouk, National Research Council Canada, ivana.vouk@nrc-cnrc.gc.ca, Ian Church, University of New Brunswick, ian.church@unb.ca, Abolghasem (Vahid) Pilechi, National Research Council Canada, Abolghasem.Pilechi@nrc-cnrc.gc.ca, Andrew Cornett, National Research Council Canada, andrew.cornett@nrc-cnrc.gc.c

From sources in Quebec (Canada) and Maine (the United States), the Saint John River traverses a distance of more than 670 km before entering the Bay of Fundy at Saint John, New Brunswick. Mixing and exchange of fresh and salt water between the Saint John River, the Kennebecasis Fjord, and the sea is driven by strong tides in the Bay of Fundy and by fluvial flows, which are seasonal and highly variable. Estuarine mixing processes are controlled by a natural sill near the mouth of the Saint John River, known as the Reversing Falls. The restriction causes a turbulent rapid to form flowing upstream with the flood tide, which then reverses and flows downstream with the ebb tide. Although tides are damped by the Reversing Falls, saline water penetrates more than 30 km upstream depending on seasonal river flows. Previous numerical modelling studies of mixing and exchange in the Saint John River Estuary that include the Reversing Falls have shown promising results, but encountered difficulties in some areas due to steep gradients in bathymetry and salinity. An ongoing, multi-year campaign of bathymetric surveys and oceanographic measurements, conducted since 2001 by the University of New Brunswick's Ocean Mapping Group, has yielded extensive datasets to support investigation and modelling of this dynamic estuary. The data include high-resolution, multibeam bathymetric survey data covering a significant portion of the study area. A new, three-dimensional, baroclinic numerical model based on the TELEMAC-3D solver is being developed to simulate hydrodynamics and circulation in the lower estuary. The work demonstrates the value of integrating field data and numerical modelling tools to provide new insight into the physics of water exchange in complex estuarine systems.

(12B) Numerical Modeling of the Lake Washington System

Keaton Jones, U.S. Army Corps of Engineers, keaton.e.jones@usace.army.mil, Tate McAlpin, U.S. Army Corps of Engineers, Gaurav Savant, Dynamic Solutions, LLC., Billy Johnson, U.S. Army Corps of Engineers

Two-dimensional (2D) and three-dimensional (3D) Adaptive Hydraulics (AdH) models are being developed to simulate hydrodynamics and transport in the Lake Washington system, including Lake Union, Lake Washington, and the Lake Washington Ship Canal. Both the 2D and 3D models simulate temperature transport, but the 3D model also includes salinity intrusion into the canal through the Chittenden (Ballard) Locks and has the ability to replicate temperature and salinity stratification. Both models will be calibrated to 2014 conditions and compared with 2015 data for model validation. This paper will address the model capabilities, development, and results. The final validated 3D temperature and salinity transport model will later be coupled with the Nutrient Simulation Module (NSM) and used to simulate nutrient fate and transport. The resulting model can then be used as a tool for investigating water quality and habitat evolution in the Lake Washington system.

(Poster) Hurricane Irma simulation at South Florida using the parallel CEST Model

Yuepeng Li, Florida International University, yuepli@fiu.edu, David Kelly, dkelly@baird.com, Keqi Zhang, Florida International University, zhangk@fiu.edu

In this study a parallel extension of the Coastal and Estuarine Storm Tide (CEST) model is developed and applied to simulate the storm tide in South Florida induced by Hurricane Irma in 2017. An improvement is also made to the existing advection algorithm in CEST through the introduction of high-order, monotone semi-Lagrangian advection. Distributed-memory parallelization is implemented via the Message Passing Interface (MPI) library. The parallel CEST model can therefore be run efficiently on machines ranging from multicore laptops to massively parallel supercomputers. The principal advantage of being able to run the CEST model on multiple cores is that relatively low runtimes are possible for real-world storm surge simulations on grids with very high resolution, especially in the locality where the hurricane makes landfall. The parallel CEST model therefore allows for ensemble predictions with fine-scale bathymetric features.
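
The sketch below illustrates the semi-Lagrangian advection idea in one dimension on a periodic grid: trace each grid point back to its departure point and interpolate the tracer there, which remains stable even when the Courant number exceeds one. The CEST scheme is higher order and monotone; the linear interpolation here is only for brevity.

import numpy as np

def semi_lagrangian_step(q, u, x, dt, length):
    """Advance dq/dt + u dq/dx = 0 one step on a periodic 1-D grid."""
    x_depart = (x - u * dt) % length            # departure points traced back along the flow
    x_ext = np.append(x, length)                # extend the grid so interpolation wraps around
    q_ext = np.append(q, q[0])
    return np.interp(x_depart, x_ext, q_ext)    # interpolate the tracer at the departure points

length = 1.0
x = np.linspace(0.0, length, 200, endpoint=False)
q = np.exp(-200.0 * (x - 0.3) ** 2)             # a smooth tracer pulse
for _ in range(100):
    q = semi_lagrangian_step(q, u=1.0, x=x, dt=0.01, length=length)  # Courant number 2 is allowed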

(Poster) Behavior Analysis of Scum Deposited from a Combined Sewer System in Urban Rivers

Yusuke Nakatani, Osaka University, nakatani@civil.eng.osaka-u.ac.jp, Yutaro Naka, Osaka University, naka@civil.eng.osaka-u.ac.jp, Shuzo Nishida, Osaka University, nishida@civil.eng.osaka-u.ac.jp, Kazuya Taniguchi, Osaka University, taniguchi_k@civil.eng.osaka-u.ac.jp

Scum generation is a serious problem in urban rivers. It results from the presence of suspended organic substances, mainly discharged as combined sewer overflow after rain. Since scum affects the aesthetics of rivers and causes bad odors, new methods are required to remove it. However, the behavior of scum has not been sufficiently elucidated, which makes it very difficult to predict where the scum surfaces and flows. One way to understand the behavior of scum is to perform numerical simulations. In this study, the behavior of scum in the Neya riverine system in Osaka, Japan, was simulated using a three-dimensional hydrodynamic model, the Finite Volume Community Ocean Model (FVCOM). To reproduce the behavior of scum, a new feature was added to FVCOM. The resulting scum model considers the accumulation of suspended organic substances on the riverbed, the surfacing of scum by gas generation, and scum advection after surfacing. Moreover, the parameters that affect scum behavior were studied by sensitivity analysis. In the hydrodynamic simulations, FVCOM reproduced the flow structure in the rivers relatively well in comparison with other methods. A particle-tracking analysis demonstrated that tide and backwater significantly affected the behavior of floating scum in the rivers. Moreover, the scum model approximately reproduced where the scum surfaced and concentrated. The behavior of scum was shown to be largely affected by parameters such as the diameter and density of the scum.

(Poster) Global tide-circulation-storm surge simulations by ADCIRC: A case study for Hurricane Harvey in the Gulf of Mexico

Miaohua Mao, University of Maryland Eastern Shore, mmao@umes.edu, Meng Xia, UMES, mxia@umes.edu

Predictions of global tides, circulation, and storm surge are important to coastal communities. In order to accurately simulate storm surge under extreme weather conditions, a global, barotropic, depth-averaged, high-resolution (~4 km near the coast and 80 km in the open ocean), unstructured-grid (227,882 nodes and 445,227 elements) circulation model was configured. Model bathymetry was interpolated from the NGDC's ETOPO2 data. The amplitudes and phases of eight major tidal constituents estimated by the ADCIRC model were compared with those derived from TPXO7.2. Results indicated that the spatial distributions of the major tidal constituents (M2, S2, K1, and O1) from ADCIRC were consistent with TPXO7.2. During Hurricane Harvey (2017), the ADCIRC model using either the CFSv2 or NHC-derived data satisfactorily reproduced the cyclonic winds, central low pressure, and hurricane track. The hurricane crossed the Caribbean Sea and reached the Texas coast on August 26th. The National Weather Service reported a 3.66 m water level in the Aransas Wildlife Refuge along the Texas coast. The ADCIRC model using the NHC data predicted a high storm surge of over 6 m behind the Bolivar Peninsula. This overestimation was due to the model's failure to accurately resolve the nearby barrier islands, which resulted in an excessive amount of water mass flooding into the adjacent sub-bays. Our future work will calibrate and validate the ADCIRC model against observed water levels collected from the Center for Operational Oceanographic Products and Services (CO-OPS) and the International Hydrographic Organization (IHO). Moreover, the water level and storm surge produced by the ADCIRC model will be compared with tidal gauge data from NOAA Tides and Currents throughout the global ocean. Furthermore, additional efforts will be made to refine the model grid near the coast.

(Poster) Oil Spill Risk Analysis in the U.S. Arctic Outer Continental Shelf

Zhen Li, Bureau of Ocean Energy Management, zhen.li@boem.gov, Walter Johnson, Bureau of Ocean Energy Management

The oil spill risk analysis (OSRA) is conducted by BOEM scientists to estimate the likelihood and timing of contact between hypothetical spills from prospective oil and gas development and environmental resources such as shoreline, marine habitats, recreational areas, and other areas of biological, social, and economic importance. The OSRA model consists of three components: 1) the probability of large oil spill occurrence (defined as greater than or equal to 1,000 barrels), which is based on estimated volumes of oil produced and transported and on the large oil spill occurrence rates derived from historical data and a fault tree model; 2) the probability of contact to environmental resources from hypothetical oil spill locations (conditional probability); and 3) the probability of one or more large oil spills occurring and contacting environmental resources (combined probability). In this presentation, a few recently completed and ongoing OSRAs in the U.S. Arctic OCS will be discussed, including Chukchi Sea Lease 193, the Liberty Development Project in the Beaufort Sea, and other potential Beaufort Sea lease sales. Because there are no recorded large oil spills in the U.S. offshore Arctic, a fault tree analysis is used to derive large oil spill rates for each proposed action based on the exploration and development scenarios. A fault tree analysis is a method for estimating the spill rate resulting from the interactions of other events. Various Arctic effects, non-Arctic variability, and facility parameters in the fault tree analysis are considered to provide a realistic estimate of spill occurrence rates for the Arctic OCS and their uncertainties. The estimate of a conditional probability is based on the assumption (condition) that a large oil spill occurs at a hypothetical spill location and does not factor in the probability of a large spill occurring.
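
As an illustration of how the three components can be combined, the short sketch below computes the probability of one or more large spills occurring and contacting a resource, assuming spill occurrences follow a Poisson process with a rate proportional to the volume handled. All numbers are hypothetical, and the official OSRA reports should be consulted for the actual formulation and inputs.

import math

def combined_probability(spill_rate_per_bbo, volume_bbo, conditional_contact):
    """P(one or more large spills occur AND contact the resource), Poisson assumption."""
    expected_spills = spill_rate_per_bbo * volume_bbo            # mean number of large spills
    return 1.0 - math.exp(-expected_spills * conditional_contact)

p = combined_probability(spill_rate_per_bbo=0.5,    # spills per billion barrels (hypothetical)
                         volume_bbo=0.6,            # billion barrels handled (hypothetical)
                         conditional_contact=0.12)  # contact probability given a spill (hypothetical)
print(f"Combined probability of contact: {p:.1%}")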

(Poster) Application of Coastal Environmental Risk Index (CERI) to Providence and the Fox Point Hurricane Barrier

Emilie Zarba, University of Rhode Island, ezarb96@gmail.com, Matthew Schwarz, University of Rhode Island, matt_schwarz@my.uri.edu, Emily Day, University of Rhode Island, emily_day@my.uri.edu, Peter Girard, University of Rhode Island, peter_girard@my.uri.edu, Nathaniel Menefee, University of Rhode Island, nate.menefee95@gmail.com, Michael Aiudi, University of Rhode Island, michael_aiudi@my.uri.edu

Providence, Rhode Island is located along the northern portion of Narragansett Bay at the confluence of the Providence River, the Moshassuck River, and the Woonasquatucket River. This location makes it susceptible to coastal flooding from factors such as sea level rise, storm surge, and river runoff. Significant flooding from hurricanes in 1938 and 1954 prompted the construction of the Fox Point Hurricane Barrier in the 1960s, which has a series of gates that are closed whenever a storm surge event occurs. The height of the barrier was designed based on a 500-year storm surge, but without consideration of sea level rise, which NOAA has estimated to be 7 feet by the year 2100. This increase in sea level will increase the likelihood of overtopping and failure of the barrier during a storm event. Given these issues, the objective of the study was to estimate damage to structures in Providence from both inundation and waves due to a 100-year storm, with and without 7 feet of sea level rise. For this analysis it was assumed that the Fox Point Hurricane Barrier fails or is breached, to model a worst-case scenario. This was accomplished using the Coastal Environmental Risk Index (CERI; Spaulding et al. 2016) to relate the amount of flooding from inundation, or the wave crest height, relative to the first floor elevation of every structure in the study area, to a percent damage for the structure. To use CERI effectively, a GIS environment incorporating inundation layers from STORMTOOLS (Spaulding et al. 2016) was used to obtain estimated inundation depths, with the estimated wave crest heights calculated using the Steady State Spectral Wave Model (STWAVE). An emergency management database (E-911) was used to obtain detailed information about each structure. Damage to each structure from inundation and waves was estimated using damage functions developed by the US Army Corps of Engineers as part of the North Atlantic Coast Comprehensive Study following Superstorm Sandy in 2012. The results of the CERI calculations show that the dominant source of damage in the study area is inundation, with all of the structures suffering more from inundation damage than from wave crest damage. Without the Fox Point Hurricane Barrier in effect, a 100-year storm without sea level rise would damage 664 of the 1090 structures in Providence, or 60 percent of structures. With seven feet of sea level rise included, this number jumps to 1008 structures receiving damage. This large jump in damage demonstrates the devastating effect sea level rise can have on coastal cities. These results also illustrate the clear benefits of maintaining the hurricane barrier into the near future. Reference: Spaulding, M.L., Grilli, A., Damon, C., Crean, T., Fugate, G., Oakley, B.A. and Stempel, P., 2016. STORMTOOLS: coastal environmental risk index (CERI). Journal of Marine Science and Engineering, 4(3), 54.
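
The core damage calculation can be sketched in a few lines of Python: compare the flood (or wave crest) elevation with a structure's first-floor elevation and interpolate a percent damage from a depth-damage curve. The curve below is a made-up placeholder rather than an actual USACE/NACCS damage function.

import numpy as np

# hypothetical depth-damage curve: depth above the first floor (ft) -> percent damage
curve_depth = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
curve_damage = np.array([0.0, 15.0, 30.0, 55.0, 75.0, 90.0])

def percent_damage(flood_elev_ft, first_floor_elev_ft):
    """Interpolated percent damage for one structure (0 if not flooded above the first floor)."""
    depth = flood_elev_ft - first_floor_elev_ft
    return float(np.interp(depth, curve_depth, curve_damage)) if depth > 0.0 else 0.0

print(percent_damage(flood_elev_ft=12.5, first_floor_elev_ft=10.0))  # -> 36.25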

(Poster) Modeling wave-current interaction in Puget Sound

Taiping Wang, Pacific Northwest National Laboratory, taiping.wang@pnnl.gov, Zhaoqing Yang, Pacific Northwest National Lab, zhaoqing.yang@pnnl.gov, Wei-Cheng Wu, Pacific Northwest National Lab, wei-cheng.wu@pnnl.gov

Puget Sound is a large, complex fjord-type estuary characterized by energetic tidal currents. It also experiences episodic windstorms that are capable of producing strong wind waves and storm surges, which threaten nearshore habitats and infrastructure. However, the study of wind waves and wave-current interaction in Puget Sound has been very limited. In this paper, a high-resolution, unstructured-grid coastal ocean model with an internally coupled wave model (FVCOM-SWAVE) was used to simulate both currents and waves in Puget Sound. The hydrodynamic model is based on Pacific Northwest National Laboratory's (PNNL) Puget Sound circulation model. The wave model was forced with PNNL's multi-level nested WaveWatch III model output at the open boundary and a high-resolution regional WRF wind product. The tides were validated with NOAA tidal observations throughout Puget Sound, while the waves were validated against available measurements at buoys within the Salish Sea. Sensitivity simulations were conducted to understand wave-current interaction in Puget Sound, especially during extreme weather events. Detailed results on model validation and the sensitivity analysis will be presented.

(Poster) NOS Operational Storm Surge Modeling

Yuji Funakoshi, NOAA/NOS/OCS/CSDL/CMMB, yuji.funakoshi@noaa.gov, Sergey V. Vinogradov, NOAA/NOS/OCS/CSDL/CMMB, sergey.vinogradov@noaa.gov, Edward P. Myers III, NOAA/NOS/OCS/CSDL/CMMB, edward.myers@noaa.gov

The Coast Survey Development Laboratory (CSDL) of the National Ocean Service (NOS) has established an Extratropical Surge and Tide Operational Forecast System (ESTOFS) and a Hurricane Surge On-Demand Forecast System (HSOFS) for U.S. coastal waters. ESTOFS-Atlantic and HSOFS-Atlantic cover the Western North Atlantic Ocean, including the U.S. East Coast and Gulf of Mexico, and have been in operation since 2012 and 2018, respectively. ESTOFS-Pacific covers the Eastern North Pacific Ocean, including the U.S. West Coast, Gulf of Alaska, and Hawaiian Islands, and has been in operation since 2014. ESTOFS-Micronesia covers the Western Tropical Pacific, including Guam, the Federated States of Micronesia, Palau, the Mariana Islands, the Marshall Islands, and Wake Island, and has been in operation since 2018. The hydrodynamic model employed for ESTOFS and HSOFS is the ADvanced CIRCulation (ADCIRC) finite element model (Luettich et al. 1992; Luettich and Westerink 2004). The ADCIRC hydrodynamic model has been demonstrated to be effective at predicting tidal circulation and storm surge propagation in complex coastal systems. Its unstructured grid methodology allows for the propagation of storm surges from offshore, across the shelf, and inland. This grid can also represent irregular shorelines, including barrier islands, rivers, and waterways. The ESTOFS (except for ESTOFS-Pacific) and HSOFS models have approximately 200 m coastal grid resolution and overland coverage up to the 10 m elevation contour. ESTOFS uses Global Forecast System (GFS) atmospheric forcing (10 m wind speeds and sea level pressure from the GFS Semi-Lagrangian T1534 (~13 km) grid). HSOFS uses the Generalized Asymmetric Holland Model based on the National Hurricane Center's (NHC) official forecast. In addition, HSOFS has the capability to create ensemble members by perturbing the NHC forecast. Both ESTOFS and HSOFS output files are provided in two formats: structured GRIB2 files and unstructured NetCDF files on the native finite element grid. GRIB2 files are created for each hourly prediction during a forecast cycle, consisting of records of combined water level (surge with tide), harmonic tidal prediction (astronomical tides), and sub-tidal water levels (the isolated surge). NetCDF files contain an entire nowcast/forecast cycle, and consist of the three water levels over the native grid, or six-minute water level records at station locations.

(Poster) Co-evolution of Coastal Natural and Human-Engineered Systems: Making Decisions under Uncertainty

Donatella Pasqualini, Los Alamos National Laboratory, dmp@lanl.gov, Nathan Urban, Los Alamos National Laboratory, nurban@lanl.gov, Site Wang, Clemson University, USA, sitew@g.clemson.edu, Phillip Wolfram, Los Alamos National Laboratory, pwolfram@lanl.gov, David Moulton, Los Alamos National Laboratory, moulton@lanl.gov, Joel Rowland, Los Alamos National Laboratory, jrowland@lanl.gov, Chonggang Xu, Los Alamos National Laboratory, cxu@lanl.gov, Russel Bent, Los Alamos National Laboratory, rbent@lanl.gov, Todd Ringler, Los Alamos National Laboratory, ringler@lanl.gov, Harsha Nagarajan, Los Alamos National Laboratory, harsha@lanl.gov

Half the U.S. population and economy lie in coastal counties. Critical infrastructure, including electrical/water networks and naval/port facilities, is threatened by episodic (storms) and long-term (sea level rise) environmental disturbances, with catastrophic economic and national security consequences. Costly engineering efforts are required to adapt to these threats, but current adaptation planning approaches are inadequate. They analyze individual impacts in isolation from each other (e.g., flooding via storm surge without consideration of how shoreline evolution alters flood patterns), and to date decision support approaches have largely focused on quantifying threat severity, while adaptive resilience strategies for complex networked infrastructures are still developed on an ad-hoc basis. We will present a new approach to developing an adaptation science for complex natural-human-engineered systems that connects physics to decision making. Our approach couples land-ocean evolution, with quantified uncertainties, and the natural coastal evolution with a stochastic optimization algorithm for the redesign of complex infrastructure networks that is resilient with respect to these uncertainties. Built on core Los Alamos Earth system modeling, grid science, and critical infrastructure capabilities, our approach simulates a high-fidelity, process-based model of the evolution of the coupled wetland-ocean interface. It integrates the probabilistic predictions, sampled over stressor uncertainties, into an optimization algorithm capable of long-range, multi-stage planning under uncertainty for large-scale interdependent infrastructure networks.

(Poster) Effect of local wind forcing on the accuracy of extreme wave prediction

Gabriel Garcia Medina, Pacific Northwest National Laboratory, gabriel.garciamedina@pnnl.gov, Taiping Wang, Pacific Northwest National Lab, Taiping.Wang@pnnl.gov, Zhaoqing Yang, Pacific Northwest National Lab, zhaoqing.yang@pnnl.gov, Wei-Cheng Wu, Pacific Northwest National Lab, wei-cheng.wu@pnnl.gov

The quality of wind products is critical to the accuracy of the simulated wave climate and the success of wave resource assessment, especially in the nearshore regions where wave energy projects are most likely to occur. Our previous wave resource modeling study focused on the U.S. West Coast indicates that the model tends to under-predict significant wave height and energy density for large waves (e.g., the 90th percentile of significant wave height) when using NOAA NCEP's Climate Forecast System Reanalysis (CFSR) wind, although overall satisfactory model skill was achieved. This discrepancy appears to be consistent with that found when comparing the CFSR winds to the observed winds at the NDBC buoys in our model domains. Thus, it is important to investigate whether wave model results can be improved by using more accurate wind forcing products, such as higher-resolution regional model winds or observed wind data at buoys. This poster presents a study to evaluate the sensitivity of wave models to various wind forcing datasets and to identify a feasible approach to improve extreme wave prediction through improved wind forcing. Sensitivity tests were conducted at selected representative NDBC buoy sites on the U.S. West Coast and East Coast with various wind forcing products. The sensitivity modeling results were compared to those of the baseline condition forced by the CFSR wind. The results suggest that wave model results, especially those during large wave events, can be improved with a more accurate wind dataset such as the observed winds.

(Poster) Analysis of Wave Climates in the U.S. West Coast using a Multi-Scale, Nested-Grid Modeling Approach

Wei-Cheng Wu, Pacific Northwest National Laboratory, wei-cheng.wu@pnnl.gov, Zhaoqing Yang, Pacific Northwest National Laboratory, Zhaoqing.Yang@pnnl.gov, Taiping Wang, Pacific Northwest National Laboratory, Taiping.Wang@pnnl.gov, Gabriel Garcia Medina, Pacific Northwest National Laboratory, gabriel.garciamedina@pnnl.gov

This study presents a modeling analysis of wave climates along the U.S. west coast based on results from a high-resolution, 32-year wave hindcast from 1979 to 2010. The long-term wave hindcasts were generated using a multi-scale, nested-grid modeling approach with WaveWatch III (WW3) and the Unstructured Simulating Waves Nearshore model (UnSWAN). Model validation was conducted using measured data from more than 20 wave buoys maintained by the National Data Buoy Center (NDBC) and the Coastal Data Information Program (CDIP). Inter-annual and seasonal variations of wave characteristics were analyzed along the west coast. Dominant sea states, i.e., wind-sea and swells, were investigated and quantified statistically. Cumulative distributions of wave angles, the duration of each sea state, and monthly variations of wave power density were calculated for the Washington, Oregon, and California coasts, respectively. Analysis was also conducted to evaluate the transformation of wave energy from offshore to nearshore regions and its spatial variations. In particular, energy dissipation as a function of nearshore geometry, such as distance from the coastline, water depth, and bottom slope, was investigated. The outcomes of this study will provide useful information for wave energy development and coastal hazard management in the U.S. west coast region.