19 April 2016
This is part of a new series of posts that highlight the importance of Earth and space science data and its contributions to society. Posts in this series showcase data facilities and data scientists; explain how Earth and space science data is collected, managed and used; explore what this data tells us about the planet; and delve into the challenges and issues involved in managing and using data. This series is intended to demystify Earth and space science data, and share how this data shapes our understanding of the world.
By David Arctur
This is a story about how water data standards, computational hard work, high-performance computing, serendipity and synergy led to an operational capability for nationwide forecasting of streamflow and flooding at high-resolution, in near-real-time. This has been evolving for several years now, but has gone into hyper-drive in just the last couple years.
In May 2014, the opening of a new building on the campus of the University of Alabama in Tuscaloosa triggered the sequence of events that is improving flood forecasting throughout the U.S. and the world. The new building is the U.S. National Water Center (NWC), built by the National Weather Service (NWS) and federal agency partners the U.S. Geological Survey (USGS), the U.S. Army Corps of Engineers (USACE), and the Federal Emergency Management Agency (FEMA).
At the NWC opening ceremony, Professor David Maidment from the University of Texas at Austin called for a transformative change to be brought about by this new resource. NWS river forecasting operations are currently conducted by 13 regional centers around the U.S., each running its own weather and river forecasting models that are not integrated with those of the other regions. There has been no national-scale model of surface water flows. The creation of the NWC provided an opportunity to focus resources on understanding the nation's water situation at a comprehensive, continental scale. This could help advance the U.S. Open Water Data Initiative (OWDI).
Dr. Maidment had an idea of how this new center could leverage that opportunity: by coordinating research that had until then been promising but not yet robust enough for the task. He put out a call to the NWS and to academia to rise to the challenge of integrating continental-scale models of weather, land-surface dynamics, and hydrologic flow. The community met that challenge through the National Flood Interoperability Experiment (NFIE) and blew away hurdles that were thought to be showstoppers. They are now feeding near-term, high-resolution rainfall forecasts through a single land-surface model framework to produce national forecasts of runoff and streamflow routing. These forecasts estimate the flow of water (in cubic feet or cubic meters per second) through 2.7 million stream segments, averaging about 2 miles (3 km) in length, seamlessly across the entire continental U.S. The forecasts are updated every hour of every day.
This is valuable and timely information at a scale that is useful at the local level for emergency response planning and mitigation. While some states and sophisticated urban areas already run more detailed models to anticipate flooding, this capability will enable even the most rural areas to be alerted to extreme events much sooner than previously possible.
There is still work to be done, but the initial capability will become operational for the nation at the NWC during 2016. More than twenty federal, state and local agencies, academic research centers, and commercial software vendors are cooperating and coordinating to refine and expand on the initial capabilities. One of these efforts is led by Professor Jim Nelson at Brigham Young University (BYU) in Provo, Utah. The BYU team has developed a Python web application framework called Tethys, which provides model integration and visualization tools. They have adapted the U.S. NWS approach for use in global applications, using weather forecasts and related models from the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, UK. This builds on an international initiative known as the Global Flood Awareness System (GloFAS). They are conducting outreach and education to bring this capability to the U.S., Latin America, Europe, New Zealand and elsewhere. An important aspect of this forecasting workflow is that it is scalable and cloud-based, and does not require costly high-performance computing for smaller watersheds and regions, so it can be implemented by developing countries using their own resources and trained staff.
This story is organized in five parts; the second and third entries are already posted on the Open Geospatial Consortium website. Additional posts in this series provide a summary of what's been done so far, the underlying challenges and technologies, and what's next.
—David K. Arctur is a Research Scientist and Fellow at the University of Texas at Austin.