  • By Beth Bartel, Outreach Specialist, UNAVCO

    Okay, maybe that title is a bit harsh. When it comes to delivering a message about hazards and risk, there’s certainly benefit in delivering broad messages, to a […]

  • By Alexandra Branscombe

    Originally posted on AGU GeoSpace

    [caption id="attachment_7071" align="alignright" width="283"]“The Arctic in the Anthropocene: Emerging Research Questions” was released last week as an effort by the National Research Council to bring together Arctic scientists and stakeholders during a time of rapid change in the region.  Credit: National Research Council/National Academies “The Arctic in the Anthropocene: Emerging Research Questions” was released in April as an effort by the National Research Council to help chart the course of future research in the region as it goes through rapid change in the area. Credit: National Research Council[/caption]

    WASHINGTON, DC – What is hidden within and beneath Arctic ice? Why does winter matter? What is being irretrievably lost as the Arctic changes?

    These are just some of the emerging questions that scientists are being challenged to answer about the rapidly changing Arctic in a new report, “The Arctic in the Anthropocene: Emerging Research Questions,” released last month by the National Research Council’s Committee on Emerging Research Questions.

    The report focuses on questions sparked by recent discoveries about the Arctic and new tools available to investigate the region, said Henry Huntington, an Arctic scientist with the Pew Charitable Trusts in Eagle River, Alaska, who co-chaired the international committee of Arctic experts and scientists that wrote the report. Huntington spoke during an April 29 webinar announcing the report.

    The report’s authors hope to inspire a wave of scientific research that is better equipped to study the changing Arctic. Interest in the region is rising as it undergoes rapid transformations as a result of global warming, they said.

    For example, future research should tackle the “Hidden Arctic” – areas of the Arctic that have not been studied because they couldn’t be reached, but are now accessible because glaciers and other ice are melting. Researchers should also explore what is being irretrievably lost as the Arctic changes, including the threat that melting permafrost and ice pose to archeological sites and rare habitats, the report said.

    “Our focus was on emerging research questions, distinguished from existing questions that have been asked for a while,” Huntington said. “Existing questions deserve continued attention … The focus on our task is to ask emerging questions on newly recognized phenomena, build on recent results, or on new technology that allows us to do things we couldn’t before.”

    Scientists have a lot to sink their teeth into as the Arctic changes, Huntington said, including vanishing sea ice, retreating glaciers, melting permafrost, and the rippling global ecological effects that come with these scenarios.

    “The Arctic is increasingly connected to the rest of the world,” Huntington noted. “Whatever happens in the Arctic does not stay in the Arctic.”

    Research questions should also not be limited to one field of study, according to the report. New research needs to draw from various scientific disciplines, like anthropology and geoscience, and also include investment from public sources and private industry, the report’s authors said.

    [caption id="attachment_7068" align="alignleft" width="361"]This time series from NASA satellites show Arctic sea ice declining from year to year at a rate of 11.5 percent per decade. A new report published by the National Research Council calls for more international and interdisciplinary research strategies to tackle emerging questions in the Arctic. Credit: NASA/Goddard Scientific Visualization Studio This time series from NASA satellites show Arctic sea ice declining from year to year at a rate of 11.5 percent per decade. A new report published by the National Research Council calls for more international and interdisciplinary research strategies to tackle emerging questions in the Arctic. Credit: NASA/Goddard Scientific Visualization Studio[/caption]

    Huntington said the interdisciplinary and international Arctic research happening now is not well coordinated. Instead, there should be better systems in place to connect researchers, investors, and the public.

    The report also highlights the role humans play in the Arctic, explained Stephanie Pfirman, a professor of environmental science at Barnard College in New York City and a co-chair of the report committee. Using the term “Anthropocene” – a geological time period defined by the impact humans have had on the Earth – in the report’s title encompasses not only the influence humans have had on the Arctic, but also our ability to research this part of the planet, she said.

    “We wanted to highlight that as human capacity grows, our ability to do research also grows,” said Pfirman.


    – Alexandra Branscombe is a science writing intern in AGU’s Public Information department

  • By Kris Ludwig, Staff Scientist, US Geological Survey Natural Hazards Mission Area

    We all use some form of hypothetical situations to plan our daily lives: What if it rains? Bring an umbrella. What if you’re in an accident? Buy insurance. What if there’s traffic? Learn alternate routes. On some level, we understand and accept the risk of discrete events like a storm, an accident, or a travel delay that may adversely affect our plans. Correspondingly, we prepare.

    [caption id="attachment_4095" align="alignright" width="224"]Example Chain of Consequences Credit: Department of the Interior, 2013 Example Chain of ConsequencesCredit: Department of the Interior, 2013.[/caption]

    But what if the rain causes an accident, jamming traffic – and then the roads flood, stranding travelers and limiting access by emergency personnel? How does one anticipate – and prepare for – a plausible chain of consequences, where each event may lead to another, adding complexity and uncertainty at every step?

    Scenarios can be powerful tools for anticipating the unanticipated. Used by the military, public health, natural resource, utilities, and emergency management communities, scenarios can improve preparation for, response to, and recovery from catastrophic events.

    What is a Scenario?

    The term “scenario” means different things to different users. Scenarios are typically developed by teams of stakeholders at various scales, for multiple audiences, and using many methods. Scenario outcomes may be used for different purposes, including informing evacuation plans, improving supply mobilization, or shaping recovery plans. Examples of scenario types include:

    Chains of Consequences

    The Department of the Interior’s Strategic Sciences Group (SSG) develops chains of consequences stemming from an environmental crisis event. Like a tree diagram, these cascades effectively “map out” possible impacts across the ecology, economy, and people of an affected region. Each consequence is assigned a level of uncertainty, informed by a group of experts assembled by the SSG in response to an event. Analyses of these chains are used to develop interventions (institutional actions that may mitigate downstream effects) for decision makers.
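    To make the tree structure concrete, here is a minimal sketch of a consequence cascade in Python. It is purely illustrative: the class layout, the uncertainty labels, and the toy storm cascade are assumptions for this post, not the SSG’s actual tooling or terminology.

```python
# Minimal, illustrative consequence tree -- not the SSG's actual tooling.
# Node names and uncertainty labels below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Consequence:
    description: str
    uncertainty: str  # e.g. "low", "medium", "high", assigned by experts
    children: list["Consequence"] = field(default_factory=list)

def walk(node: Consequence, depth: int = 0) -> None:
    """Print the cascade, indenting each downstream consequence."""
    print("  " * depth + f"{node.description} [uncertainty: {node.uncertainty}]")
    for child in node.children:
        walk(child, depth + 1)

# A toy cascade mirroring the rain example above.
event = Consequence("Severe rainstorm", "low", [
    Consequence("Traffic accident jams roads", "medium", [
        Consequence("Emergency access limited", "high"),
    ]),
    Consequence("Roads flood", "medium", [
        Consequence("Travelers stranded", "high"),
    ]),
])
walk(event)
```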

    Table Top Exercises

    [caption id="attachment_4085" align="alignleft" width="260"]Table Top Exercise Credit:  Photo courtesy of NIH Table Top ExerciseCredit: Photo courtesy of NIH[/caption]

    Table Top Exercises (TTX) are discussion-based sessions where personnel share what their roles, responses, and concerns would be during a specific situation. Recently, the National Institute of Environmental Health Sciences, part of NIH, developed a TTX to practice the coordination of public health research in the aftermath of a hypothetical tsunami disrupting southern California. Participants included health officials, geoscientists, emergency managers, and community representatives. The outcomes are helping shape the new NIH Disaster Research Response Project, which aims to improve environmental health disaster research by developing data collection tools and a network of trained responders.


    Analyses of Hypothetical yet Plausible Events

    The US Geological Survey (USGS) Science Applications for Risk Reduction (SAFRR) project has developed several scenarios including ShakeOut, ARkStorm, and a California Tsunami Scenario. Each project aims to improve resilience to natural hazards by developing partnerships among scientists, decision makers, emergency managers, and community leaders to assess the impacts of a hypothetical but plausible event. SAFRR has forged collaborations with artists and designers to develop maps, animations, and videos to communicate scenario results to broad audiences.

    Scenario Strengths and Caveats

    Two often-heard comments in the emergency management community have special relevance to scenarios:
    The disaster that happens is the one you did not prepare for.
    An emergency is the worst time to exchange business cards.
    These statements highlight both an important caveat and a vital strength of scenarios: they are not intended to predict the future, nor are they meant to preclude necessary creativity when responding to an actual event. Rather, scenarios and exercises can build critical “muscle memory,” foundational knowledge, and valuable relationships that improve our ability to respond to and recover from catastrophic events, ultimately improving resilience.

    [caption id="attachment_4081" align="alignright" width="254"]ShakeMap for Earthquake Planning Scenario Credit: USGS ShakeMap for Earthquake Planning ScenarioCredit: USGS[/caption]

    Scenarios Require a Whole Community Approach

    We live in a system of systems, where our energy, telecommunications, transportation, and other lifelines are increasingly interdependent. Intertwined with these services are our economy, social infrastructure, and natural environment. When one or more dimensions of this coupled human-natural system are disrupted, others may be affected. These events quickly – and often painfully – reveal vulnerabilities that may or may not have been previously known.

    Scenarios are one way to identify these vulnerabilities before a disaster occurs. Because of the multi-disciplinary nature of hazards, effective scenarios require a “whole community” approach, where natural and social scientists, engineers, emergency managers, and policy makers are all invested in the development and implementation of scenario outcomes to improve resilience.


    Kris Ludwig is a Staff Scientist in the Natural Hazards Mission Area of the US Geological Survey (USGS), where she supports both the Department of the Interior Strategic Sciences Group and the USGS Science Applications for Risk Reduction project. She has a PhD in Oceanography from the University of Washington and a BS in Earth Systems from Stanford University. Dr. Ludwig will be a panelist at the 2014 AGU Science Policy Conference.

  • By Dan Vimont, co-chair, Wisconsin Initiative on Climate Change Impacts (WICC)

    I am a climate scientist who has spent my career understanding the physics of the climate system, and the impacts of climate […]

  • By James Schwab, AICP
    Manager, Hazards Planning Research Center, American Planning Association

    [caption id="attachment_4049" align="alignright" width="292"]Hazard Mitigation: Integrating Best Practices into Planning (PAS Report No. 560) Hazard Mitigation: Integrating Best Practices into Planning (PAS Report No. 560)[/caption]

    There is a simple way to find out just how serious a priority hazard mitigation may be in your community: can you find it in your comprehensive plan? If not, you already have a signal that the emergency managers and planners may not be talking to each other or sharing perspectives to ensure effective implementation, even if your community has adopted a local hazard mitigation plan under the Disaster Mitigation Act of 2000, a requirement for eligibility for federal hazard mitigation grants. A great deal of hazard mitigation requires attention to how hazards interact with land use. But the question does not end there. Many elements of the local comprehensive plan may have a bearing on hazard mitigation—transportation, housing, economic development, even historic preservation. It is important that the planning process identify these issues and find ways to address them.

    Several years ago, the American Planning Association undertook a project with the Federal Emergency Management Agency to identify and promote best practices in integrating hazard mitigation into all aspects of the local planning process, including visioning and plan making, but also implementation, development review, and capital improvements programming. The result, published in 2010, was Hazard Mitigation: Integrating Best Practices into Planning (PAS Report No. 560). The report includes six case studies from across the U.S., with two each involving large, intermediate, and small jurisdictions. That distribution was intended to demonstrate that the concept was not only for those jurisdictions with the largest resources. The report discusses federal statutes and frameworks for supporting hazard mitigation, but makes clear that effective integration is ultimately up to states and local governments.

    About ten states require some type of hazards-related element in local comprehensive plans, such as the safety element in California general plans. A few states, like California and Florida, have put significant effort into promoting the integration of hazard mitigation into the planning process. More recently, FEMA has been producing national and regional guidance to further promote the concept and share additional case studies of successful integration.

    The next frontier may be to effectively integrate climate change considerations into hazard mitigation planning, which may require some new types of expertise among planners and emergency managers. But integrated hazard mitigation planning is an excellent starting point for communities that want to get serious about their hazards.

    James Schwab, AICP, is the manager of APA’s Hazards Planning Research Center. He is the project manager for “Planning for Post-Disaster Recovery: Next Generation,” a new version of PAS Report No. 483/484 (1998) on disaster recovery. He represents APA in the NOAA Digital Coast Partnership, is a frequent speaker on hazards issues, and often represents APA in federal agency program development with regard to hazards. Mr. Schwab will be speaking at the AGU 2014 Science Policy Conference on June 17th. 

  • By Kate Gordon, Executive Director, Risky Business

    Despite massive scientific evidence that climate change will have significant effects on the American economy, the business and finance world is still largely […]

  • By Jeff Rubin, Emergency Manager, Tualatin Valley Fire & Rescue, Tigard, OR

    We’re justifiably concerned about terrorism, but natural hazards still generate far greater risk in terms of number of incidents, […]

  • By Mary Lou Zoback, Consulting Professor, Stanford University

    Friday, October 17, 2014 will mark the 25th anniversary of the M6.9 Loma Prieta/World Series earthquake that struck the San Francisco Bay Area at 5:04 PM. The shaking lasted 25 seconds. When it stopped, 62 people had lost their lives, largely the result of bridge and overpass collapses (43 deaths) as well as the collapse of San Francisco homes built on bay fill. Beyond this loss of life, the region suffered an estimated several billion dollars in losses from disrupted economic activity and damaged infrastructure.

    [caption id="attachment_3985" align="aligncenter" width="650"]baybridge1 Rebuilding the eastern span of the Bay Bridge was one of the many infrastructure upgrades in the San Francisco Bay Area after the magnitude 6.9 Loma Priet/World Series earthquake. A 50 food segment of the upper deck of the bridge collapsed during the shaking.[/caption]

    In the 25 years since Loma Prieta, San Francisco Bay Area infrastructure (IF) providers have dramatically increased their disaster resilience through substantial system upgrades, an investment of $21 billion—almost $1 billion/year—funded almost exclusively by local ratepayers and taxpayers. These upgrades were based on rigorous analyses of future earthquake likelihoods by the U.S. Geological Survey’s (USGS) earthquake hazards program. Though each of the system upgrades was motivated by damage sustained in 1989, IF providers took into account the USGS earthquake forecasts—indicating that future damaging earthquakes in the region are twice as likely as not to occur, and are likely to strike closer to the major metropolitan areas and at larger magnitude—when making their upgrades.

    Communicating the importance of post-disaster performance of infrastructure to stakeholders was key to getting the support that IF providers needed for their large-scale projects. The risk communication efforts in partnership with the USGS made it possible for IF providers to gain internal support as well as public support for the needed upgrades. All of the upgrades were funded locally by a citizenry that recognizes the value of continued performance post-earthquake for recovery and the value of investing ahead of time to reduce substantial losses after the earthquake.

    The public/private partnership between the USGS and Bay Area IF providers exemplifies the kinds of approaches outlined in Presidential Policy Directive 21 on Critical Infrastructure Security and Resilience, the purpose of which is “to reduce vulnerabilities, minimize consequences, identify and disrupt threats, and hasten response and recovery efforts related to critical infrastructure.”

    The effort in the Bay Area is largely an informal one in which USGS researchers collaborate with IF staff to learn of their concerns and data needs, and USGS researchers produce information and products to address those needs. PG&E has actually funded some targeted USGS research.

    The five IF providers each employed a five-step strategy to increase their resilience:
    1)    Assess vulnerabilities of their systems to expected earthquakes
    2)    Set performance goals for their systems after the earthquake
    3)    Communicate risks/benefits and secure funding
    4)    Develop creative and innovative solutions for these complex problems. These include building redundancy into the system at the start and taking advantage of, and in some cases funding, research to improve solutions.
    5)    Continue to reassess system performance as upgrades proceed, and develop real-time damage assessment capability using near real-time maps of earthquake shaking intensity provided by the USGS overlaid with their system fragility functions (a minimal sketch of this overlay follows below).
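    As a rough illustration of step 5, the sketch below runs shaking-intensity values, like those published in USGS ShakeMaps, through a lognormal fragility curve to estimate damage probabilities at individual facilities. The median capacity, dispersion, and site values are invented for illustration; real IF providers use their own engineering fragility models.

```python
# Hedged sketch: overlay shaking intensity with a fragility function.
# The fragility parameters and site PGA values are illustrative only.
import math

def p_damage(pga_g: float, median_g: float = 0.5, beta: float = 0.4) -> float:
    """Lognormal fragility: P(damage) given peak ground acceleration in g."""
    return 0.5 * (1.0 + math.erf(math.log(pga_g / median_g) / (beta * math.sqrt(2.0))))

# Hypothetical PGA values sampled from a ShakeMap grid at three facilities.
sites = {"pump station A": 0.22, "substation B": 0.48, "reservoir C": 0.71}
for name, pga in sites.items():
    print(f"{name}: PGA {pga:.2f} g -> P(damage) ~ {p_damage(pga):.0%}")
```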

    [caption id="attachment_3972" align="aligncenter" width="600"]Infrastructure providers in the San Francisco Bay Area have spent $21 billion dollars combined since the Loma Prieta/World Series earthquake to ensure resiliency to future events. Infrastructure providers in the San Francisco Bay Area have spent $21 billion dollars combined since the Loma Prieta/World Series earthquake to ensure resiliency to future events.[/caption]

    I feel this sustained, coordinated, and successful partnership in improving regional resilience is exemplary and should serve as inspiration for the rest of the nation. Perhaps most impressive is that now that most of the initial round of system upgrades is complete, the IF providers are working together to plan the next level of resilience: addressing key regional interdependencies and vulnerabilities. This entire story illustrates that improving resilience is an ongoing process, informed by “lessons learned” and enabled by cooperation among diverse players in the public and private sectors.

    Mary Lou Zoback is a seismologist and Consulting Professor in the Geophysics Department at Stanford University. She spent much of her career in the USGS Earthquake Hazard Program and served as Chief Scientist of the USGS Western Earthquake Hazards Team. From 2006-2011 she was Vice President for Earthquake Risk Applications with Risk Management Solutions, a private catastrophe modeling firm serving the insurance industry. In that role she utilized the company’s commercial risk models to explore the societal role of earthquake insurance, and to quantify the costs and benefits of disaster management and risk reduction activities. Dr. Zoback will be speaking at the AGU Science Policy Conference in June.

  • By Linda R. Rowan, Director of External Affairs, UNAVCO
    and J Ramon Arrowsmith, EarthScope National Office Director and Professor of Geology, Arizona State University


    [caption id="attachment_3949" align="alignright" width="350"]EarthScope infrastructure across the United States. Credit: Jeffrey Freymueller  EarthScope infrastructure across the United States. Credit: Jeffrey Freymueller[/caption]

    EarthScope is a grand Earth-observing project funded by the National Science Foundation that has produced many scientific discoveries, technological innovations, and broader societal benefits. Popular Science, for example, declared it the “most epic” science project of 2011. EarthScope comprises several large-scale, mostly distributed observatories. The project celebrated a decade of success, and considered its future, with a symposium, reception, and congressional briefings in Washington DC in May 2014.

    The USArray is a network of seismic instruments placed throughout the United States and Canada. This network includes an array of 400 seismometers moving across the country, additional seismometer arrays for focused regional studies, and a magnetotelluric array for measuring electric and magnetic fields emanating from Earth’s interior.

    A permanent network of instruments called the Plate Boundary Observatory (PBO) examines the Pacific plate’s interaction with the North American plate. The network includes 1,100 continuous GPS stations, 145 meteorological instruments, more than 75 borehole geophysical tools, 6 laser strainmeters, and other tools like tiltmeters. Most of PBO is concentrated in the Western U.S., along the San Andreas Fault, the Cascadia Subduction Zone, and the subduction zone boundary that forms the Aleutian Arc, where 130 active volcanoes line up and demarcate the southern edge of Alaska.

    The third component is a deeply drilled hole that penetrated about 2 miles below the surface along the San Andreas Fault. The San Andreas Fault Observatory at Depth (SAFOD) provided the first opportunity to directly observe the conditions along a fault under which earthquakes and aseismic slip occur.

    EarthScope has collected data for a decade and made numerous discoveries about the solid Earth and the atmosphere in North America. We know more about how the major plates move, how earthquakes work, how active volcanoes are, how severe weather and hurricanes track, how space weather forms, how glaciers change and how water is stored near the surface in soil, vegetation and snow pack.

    The National Science Foundation is supporting a continuation of EarthScope for 5 years, ending in 2018. Beyond science discoveries, data, and innovations, there are broader societal benefits. Some broader benefits include earthquake early warning, volcano hazard monitoring, weather forecasts, water resource management, land-use management, surveying, and engineering. EarthScope has a broad network of informal educators at national, state, and local parks and monuments, and its data and understanding are essential for earth science education.

    [caption id="attachment_3951" align="alignleft" width="350"]Federal government panel discussion about EarthScope at the May 15, 2014 EarthScope Symposium in Washington DC. The panelists include from left to right, William Leith, U.S. Geological Survey, Senior Advisor on Earthquake and Geological Hazards  John LaBrecque, National Aeronautics and Space Administration, Lead, Earth Surface and Interior Focus Area, NASA Science Mission Directorate Juliana Blackwell, National Oceanic and Atmospheric Administration, Director, National Geodetic Survey Iftikhar Jamil, National Oceanic and Atmospheric Administration, National Weather Service, Chief Information Officer Gregory J. Anderson, National Science Foundation, Geosciences Directorate, Division of Earth Sciences, Program Officer for EarthScope     Photo Credit: J Ramon Arrowsmith Federal government panel discussion about EarthScope at the May 15, 2014 EarthScope Symposium in Washington DC. The panelists include from left to right, William Leith, U.S. Geological Survey, Senior Advisor on Earthquake and Geological Hazards John LaBrecque, National Aeronautics and Space Administration, Lead, Earth Surface and Interior Focus Area, NASA Science Mission Directorate Juliana Blackwell, National Oceanic and Atmospheric Administration, Director, National Geodetic Survey Iftikhar Jamil, National Oceanic and Atmospheric Administration, National Weather Service, Chief Information Officer Gregory J. Anderson, National Science Foundation, Geosciences Directorate, Division of Earth Sciences, Program Officer for EarthScope Photo Credit: J Ramon Arrowsmith[/caption]

    What happens after 2018 will depend on what the Earth science research community decides, what the National Science Foundation’s priorities are, and whether agency partners or other stakeholders want to continue to use all or part of the project. The Federal agencies that use EarthScope the most include the U.S. Geological Survey, the National Oceanic and Atmospheric Administration, NASA, the Federal Emergency Management Agency, the Department of Energy, and the Nuclear Regulatory Commission.

    The data and data services are a significant component of EarthScope that should be maintained for the long term. UNAVCO maintains the geodetic data, the Incorporated Research Institutions for Seismology maintains the seismic data, and Texas A&M University maintains the SAFOD data, especially the physical core materials from the drilling. All of these institutions have open data access policies, and planning for data services beyond 2018 is important to maintain these valuable resources.

    We have grown to rely on the great practical value of EarthScope’s observational infrastructure. Support has been authorized to maintain 25% of the USArray seismometers in the Eastern U.S. as a permanent network. This decision was driven by the dearth of seismic monitoring in the East and accelerated by the 2011 Mw 5.8 Mineral, Virginia earthquake that shut down the North Anna nuclear generating station. It reminded policymakers, especially those in Washington DC who experienced the earthquake, that damaging events can occur in the East (e.g., the New Madrid events of 1811-1812 and the Charleston earthquake of 1886).

    The Plate Boundary Observatory is a large, permanent geodetic network. Beyond hundreds of successful research projects, the GPS stations and other tools are used by Federal, state and local agencies for hazards, resource management and land-use planning and by commercial users for surveying and engineering. At least 50% of the real-time GPS data users are from the commercial sector. Maintaining or upgrading this infrastructure beyond 2018 has research and broader benefits. Discussions need to begin regarding who will support PBO beyond 2018, how many partners might be involved, and how much of the infrastructure might be maintained or upgraded.

    EarthScope is epic, and its odyssey of science, innovation, and broader impacts has run for more than a decade and will continue until at least 2018. The potential for research is great, and the utility of operationalizing components for applied research and broader impacts is significant. The EarthScope organizers and stakeholders need to find the best path to reach these ambitious goals.

    Linda R. Rowan is Director of External Affairs at UNAVCO. UNAVCO is a non-profit university-governed consortium that facilitates geoscience research and education using geodesy. Linda focuses on open access to data, international cooperation through geodesy, and policy and media relations for UNAVCO. She has a BS in Computer Science/Math and Geology from the University of Illinois at Urbana-Champaign and a Ph.D. in Geology from the California Institute of Technology.  

    J Ramon Arrowsmith is the Director of the EarthScope National Office and the Chairman of the EarthScope Steering Committee. He is also Professor of Geology in the School of Earth and Space Exploration at Arizona State University. He has interests in earthquake geology, tectonic geomorphology, and large scale science organizations among numerous other topics. He is originally from New Mexico, and has a BA from Whittier College and a Ph.D. from Stanford University.

  • By John Bwarie, Founder, Stratiscope

    Having served as staff for over a decade for three L.A. City Councilmen, as well as L.A. Mayor James Hahn, I’ve been on the receiving end of countless requests for support, meetings, and action from concerned citizens and interest groups. In 2010, my world was turned upside down when I started working with USGS scientists to inform policymakers on how science can be used as they make policy decisions. Since I’ve been on both sides of the conversation, a few key strategies have emerged as critical to understanding how to talk to policymakers.

    It’s sometimes hard to believe, but policymakers are just like you and me: they have a history that has shaped who they are, and their decisions are often based on what they’ve experienced and who they spend time with. Whether a freshman congressman or a four-term mayor, elected officials make decisions as human beings who have had a unique set of experiences and relationships. Knowing this is half the battle in talking about what you think is important.

    Here are seven often overlooked tips that can help in preparing to talk about science or uncertainty to policymakers at the local level. (They could also apply to state and federal officials, but there are nuances at those levels that would require additional posts!)

    What follows is not about what to say or even how to say it; rather, these are often overlooked strategies for getting your point across to this distinct yet important audience.

    1. Know Your Audience

    Before contacting an elected official and/or their office, make sure to do your homework. You should know before approaching them:
    a. How long have they been in this office? Are they elected or appointed?
    b. What’s their next job/position/election? How long will it be before they get there?
    c. What’s their experience/opinion of the topic you want to talk about?
    d. What are the issues they think are important, and what is their position on these topics?

    Knowing who you are talking to, where they are coming from, and where they are going is key to making a meaningful connection. Elected officials are people, too.

    2. Talk to the Staff

    Don’t think that the policymaker makes all the decisions. Their key advisers are their staff. Building a personal connection with staff can create a lasting relationship that brings your issue to the top of their list. Make sure to connect to what matters to them first: What are they under pressure from? How can you solve their problems? Show them the “win” for them and how you can make it happen. Then, they will be in a position to support your work when asked.

    3. Provide More Than You Ask For

    You should always go in offering information, resources, and assistance to meet their needs and goals. Provide tangible value that they can use to advance their own position through your expertise. Once you prove valuable to them, they will likely be more receptive to your request.

    4. Think Local

    Connect your issue to their jurisdiction, city, and/or district. All politics is local – it’s true. So make sure to relate a global, national, or even regional issue to the specific area of the local official. And, if possible, relate it to other local issues they’re grappling with or that matter to them (see #1).

    5. Be Patient; Be Bold

    Sometimes, it can take months or years to see the results of a request. Be patient, consistent, and unrelenting in providing value to the elected official’s office so that your name (and your request) doesn’t get lost in their mountains of work. Patient doesn’t mean weak, though. Know when to be the one to speak and when to send in someone less expert but more effective (perhaps they have an existing relationship, have had previous success, or are more comfortable in this realm). Being bold means knowing that the path to success may not be the one you started on when you set out.

    6. Influence the Influencers

    Don’t just go to the elected officials. Reach out and educate those who have influence over your targeted policymaker, which could include staff, colleagues, friends, or relatives. Other key influencers that may be less personal include local community & business leaders, local organizations, and local activists. Building a coalition of support (grassroots or otherwise) is a strategic way to get your issues heard and hopefully addressed.

    7. Know What You Want

    Make sure, above all, that you know what you’re asking for. Make sure that the person you’re asking has the ability and wherewithal to do it, or modify your ask to something they can do (from signing a letter to funding a project, support can be shown in many ways). Have a plan for what you actually need today and what you could return to ask for in the future (if that option exists). Be clear, direct, and specific so the response can be the same.

    John Bwarie is the Founder of Stratiscope and will be speaking at the AGU Science Policy Conference in June about communicating risk and natural disaster preparedness.

  • Originally posted on the Opower blog

    On Tuesday, the White House released the most authoritative scientific report ever written about the current and future consequences of climate change in the United […]

  • By Lexi Shultz, Director of Public Affairs at the American Geophysical Union
    and Kat Compton, Public Affairs Intern

    As if the recent reports from the Intergovernmental Panel on Climate Change (IPCC) and the National Climate Assessment (NCA) weren’t enough of a reminder of the ways in which human actions are changing our planet, new research published in the current edition of Geophysical Research Letters (GRL) presents evidence that part of the West Antarctic Ice Sheet has passed a tipping point. The glaciers in the Amundsen Sea sector of West Antarctica are melting at an unprecedented rate; they’ll be gone within the next few centuries, and there’s nothing we can do to stop their disappearance. These glaciers contain enough ice to raise the global sea level by 4 feet (1.2 meters), and even if it takes two centuries, that’s still one foot of sea level rise every 50 years. These are the sorts of findings that take your breath away, or at least they did ours.

    [caption id="attachment_3909" align="alignright" width="300"]A photograph of Thwaites glacier in West Antarctica taken by NASA’s Operation IceBridge. A new study finds a rapidly melting section of the West Antarctic Ice Sheet appears to be in an irreversible state of decline, with nothing to stop the glaciers in this area from melting into the sea. Credit: NASA A photograph of Thwaites glacier in West Antarctica taken by NASA’s Operation IceBridge. A new study finds a rapidly melting section of the West Antarctic Ice Sheet appears to be in an irreversible state of decline, with nothing to stop the glaciers in this area from melting into the sea.Credit: NASA[/caption]

    The NCA report projects between 1 and 4 feet of sea level rise by 2100, but Eric Rignot of NASA’s Jet Propulsion Laboratory in Pasadena, California, and UC Irvine, lead author of the GRL study, says that in light of the study’s findings these projections will need to be revised upward. According to the findings, the West Antarctic glaciers alone will contribute to global sea level rise at the upper bound of the NCA projections, and that’s before factoring in melting from East Antarctica and Greenland. The GRL study follows on the heels of another study, published in Nature Climate Change on 4 May, which finds that East Antarctica is vulnerable to the same kind of tipping point. If such a tipping point were reached, East Antarctica could contribute as much as 10-13 feet (3-4 meters) to global sea level rise.

    The implications of 4 feet (or more) of sea level rise are a little unnerving. According to the NCA, nearly five million people in the U.S. live within 4 feet of the current sea level. FIVE MILLION. Beyond the obvious loss of coastal real estate and the disappearance of wetland habitat, sea level rise puts communities at risk for severe flooding during storm surge events like those associated with Hurricane Katrina and Superstorm Sandy. Tsunamis generated by large earthquakes will reach communities we have traditionally thought of as safe. How can we respond? Should we start building up our sea walls? Should we rezone coastal real estate and prevent future infrastructure development in vulnerable areas? Should we modify our building codes to reflect the inevitable changes in our environment? What will the economic effects of our choices be? It’s these and similar questions that we aim to tackle during this year’s Climate Change and Natural Hazard Preparedness sessions at the AGU Science Policy Conference in June.

    Whatever we decide, now is the time for our government – federal, state, and local – to get more serious about developing and implementing policies that will keep communities safe and resilient. Of course, policies that curb our emissions of greenhouse gases like carbon dioxide are important to help curtail the future effects of climate change. But, at least in the case of sea level rise, research like this demonstrates that we’re not going to be able to stop many of the effects of climate change. That makes preparation and adaptation all the more important. We have to start preparing for the “when.”

    AGU’s goal is to help facilitate the conversation between scientists and the decision-makers grappling with the policy implications of our changing climate. This dialogue is key to developing sound policy that will ensure community resilience. We know that many of you are already doing that on your own, and we want to hear from you. Tell us about how you’re contributing to the conversation! Share your experience with us using the Sharing Science tool on our website or leave a comment below.

  • By Harold E. Brooks, Senior Research Scientist, NOAA/National Severe Storms Laboratory

    With the release of the new National Climate Assessment, the scientific community has put forward our best understanding of the changes that have occurred and are expected to occur as the planet continues to warm. Notably, little is said about tornadoes in this document. There’s good reason for this absence. Despite a wide variety of speculation in the online community, there’s little in the formal literature that addresses the problem.

    [caption id="attachment_3860" align="alignright" width="300"]Number of F1 and stronger tornadoes in 12 consecutive months beginning with the time on the x-axis. Based on data from the National Weather Service’s Storm Prediction Center. Number of F1 and stronger tornadoes in 12 consecutive months beginning with the time on the x-axis. Based on data from the National Weather Service’s Storm Prediction Center.[/caption]

    The challenges in coming up with highly confident statements are many. Changes in how reports have been collected and damage assessed over the years and across regions make it dangerous to use the official tornado database (available from NOAA’s Storm Prediction Center) without accounting for those changes, and make it impossible to compare the U.S. record to the rest of the planet. Verbout et al. (2006) indicate that the counts of F1 and stronger tornadoes starting in 1954 are reasonably consistent. (Tornadoes prior to 2007 are rated on the Fujita (F) scale; more recent ones are on the Enhanced Fujita (EF) scale. For this purpose, the scales appear to be similar in practice.) The weakest (F0) tornadoes have increased from fewer than 100 per year 60 years ago to 700-800 currently. As shorthand, I’ll refer to F1 and stronger tornadoes as “damaging tornadoes.” If we take a running total over 12-month periods, we see that there’s little long-term trend in the number, and the average has remained a relatively stable 500 per year, but the records for both the most and the fewest damaging tornadoes have occurred since 2010. This appearance of increasing variability is also seen in the timing of the early part of the “tornado season” during the year. If we use the date of the 50th damaging tornado (roughly 10% of the annual average) as a measure of the beginning of the season, 7 of the 9 earliest starts have occurred since 1997, and all 5 of the latest starts have occurred during that time, representing 30% of the record.
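    For readers who want to reproduce these counts, here is a sketch of the two calculations using pandas. It assumes a hypothetical CSV of tornado reports with `date` and `f_scale` columns; the actual Storm Prediction Center file layout differs, so treat this as a template rather than working tooling.

```python
# Sketch: 12-month running totals and season-start dates for F1+ tornadoes.
# Column names ("date", "f_scale") and the file name are assumptions.
import pandas as pd

reports = pd.read_csv("tornado_reports.csv", parse_dates=["date"])
damaging = reports[reports["f_scale"] >= 1]  # F1/EF1 and stronger

# Running total over 12-month periods: monthly counts summed over a 12-month window.
monthly = damaging.set_index("date").resample("MS").size()
running_12mo = monthly.rolling(12).sum()

# Season start: the date of the 50th damaging tornado in each year.
d = damaging.sort_values("date").copy()
d["year"] = d["date"].dt.year
season_start = d.groupby("year")["date"].nth(49)  # 50th report, 0-indexed

print(running_12mo.tail())
print(season_start.tail())
```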

    [caption id="attachment_3861" align="alignleft" width="300"]Average number of days per year with at least 1 F1 or stronger tornado (black line, left vertical axis) and days per year with at least 25 F1 or stronger tornadoes (red line, right vertical axis). Averages computed over a decade. Average number of days per year with at least 1 F1 or stronger tornado (black line, left vertical axis) and days per year with at least 25 F1 or stronger tornadoes (red line, right vertical axis). Averages computed over a decade.[/caption]

    Underlying the increased variability is a large decrease in the number of days per year with at least one damaging tornado and a large increase in the number of days per year with many damaging tornadoes, as seen in the decadal averages. In short, we have fewer days with damaging tornadoes reported now than 40 years ago, but more “big” tornado days. The two changes have balanced each other, so that the decadal average number of tornadoes has remained relatively constant, but variability on time scales from days to years has increased. In effect, tornadoes have been concentrated into fewer days. At this point, no physical mechanism has been found that explains the changes, but it is very difficult to imagine a scenario in which reporting changes are responsible, since one aspect would have to come from more aggressive data collection and the other from more conservative data collection.
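    The tornado-day counts behind those decadal averages can be sketched the same way, continuing from the hypothetical `damaging` table in the previous snippet:

```python
# Sketch: days per year with at least 1 vs. at least 25 damaging tornadoes,
# smoothed to decadal averages. Continues the previous illustrative snippet.
per_day = damaging.groupby(damaging["date"].dt.floor("D")).size()

days_1plus = per_day.resample("YS").count()         # days with >= 1 F1+ tornado
days_25plus = (per_day >= 25).resample("YS").sum()  # "big" days with >= 25

print(days_1plus.rolling(10).mean().tail())   # decade-averaged tornado days
print(days_25plus.rolling(10).mean().tail())  # decade-averaged big days
```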

    The reporting differences have led scientists to look at the distribution of environmental conditions in which severe thunderstorms and tornadoes most commonly form as an approximation of the events. This approach has also been applied to climate models (e.g., Trapp et al. 2007). In general, as the planet warms, the energy available to fuel thunderstorms will increase, but other factors that support conditions favorable for tornadoes could decrease with climate change, leading to a question of which influence will dominate. Diffenbaugh et al. (2013) have suggested recently that the environments supportive of tornadoes will increase over the 21st century, although the large interannual variability seen in the observations may make it difficult for such trends to emerge from the noise for a long time.

    Fundamentally, even without a change in mean occurrence, the clustering of tornadoes could have significant impacts on society. In such a scenario, emergency response and insurance would find resources taxed significantly by an increase in the number of outbreaks. It is possible that the long-term casualty and damage averages wouldn’t change, but their distributions might have large changes. Preparing for such a future will require a great deal of thought and planning.

    Dr. Brooks was a contributing author for the Third and Fifth IPCC Assessment Reports, writing about severe thunderstorms, as well as writing the severe thunderstorm section of the US Climate Change Science Program Synthesis and Assessment Product report on Weather and Climate Extremes. Dr. Brooks will speak at the 2014 AGU Science Policy Conference on June 17th.

  • By Carolyn Berndt, Program Director for Sustainability, National League of Cities
    Originally posted on CitiesSpeak.org

    The continuing drought in the west and wildfires burning in the plains are real world examples—happening right now—of what scientists say is evidence of climate change. Remember the floods in Colorado last year and Hurricane Sandy the year before? Those too are indicative of the kinds of extreme weather events the U.S. will face in the coming years because of climate change. Hotter. Drier. Flooding and rising seas. Cities need to be prepared.

    [caption id="attachment_3839" align="alignright" width="300"]The continuing drought in the west and wildfires burning in the plains are real world examples—happening right now—of what scientists say is evidence of climate change. The continuing drought in the west and wildfires burning in the plains are real world examples—happening right now—of what scientists say is evidence of climate change.[/caption]

    The National Climate Assessment released by President Obama yesterday is the most comprehensive examination of climate change impacts on the U.S. Looking across seven sectors—human health, water, energy, transportation, agriculture, forests and ecosystems—and by region—Northeast, Southeast, Midwest, Great Plains, Southwest, Northwest, Alaska, Hawaii, and the country’s coastal areas, oceans, and marine resources—the report “concludes that the evidence of human-induced climate change continues to strengthen and that impacts are increasing across the country.”

    These short NCA videos tell the stories of climate change impacts by sector and region.

    As cities build infrastructure and revitalize neighborhoods, downtowns, and riverfronts, they can no longer rely on past forecasts to predict the future. For example, a water utility that looks only at historical rainfall when planning for future capacity needs will miss the boat. Or need a bigger boat. Looking instead at climatic trends will paint a different, more accurate picture for local governments. Here are some other key scientific takeaways from the National Climate Assessment:

    Temperatures increasing and extreme heat waves

    U.S. average temperature has increased by 1.3°F to 1.9°F since 1895, and most of this increase has occurred since 1970. Because human-induced warming is superimposed on a background of natural variations in climate, warming is not uniform over time. Short-term fluctuations in the long-term upward trend are natural and expected. Temperatures are projected to rise another 2°F to 4°F in most areas of the United States over the next few decades.

    Heat waves have generally become more frequent across the U.S. in recent decades, with western regions (including Alaska) setting records for numbers of these events in the 2000s. The recent heat waves and droughts in Texas (2011) and the Midwest (2012) set records for highest monthly average temperatures, exceeding in some cases records set in the 1930s, including the highest monthly contiguous U.S. temperature on record (July 2012, breaking the July 1936 record) and the hottest summers on record in several states (New Mexico, Texas, Oklahoma, and Louisiana in 2011 and Colorado and Wyoming in 2012).

    Changes in precipitation and extreme rainfall events

    Since 1900, average annual precipitation over the U.S. has increased by roughly 5 percent. There is a clear national trend toward a greater amount of precipitation being concentrated in very heavy events, particularly in the Northeast and Midwest, while the Southwest is becoming drier.

    Across most of the United States, the heaviest rainfall events have become heavier and more frequent. The amount of rain falling on the heaviest rain days has increased over the past few decades. Since 1991, the amount of rain falling in very heavy precipitation events has been significantly above average. This increase has been greatest in the Northeast, Midwest, and upper Great Plains – more than 30 percent above the 1901-1960 average. There has also been an increase in flooding events in the Midwest and Northeast where the largest increases in heavy rain amounts have occurred.

    Sea levels rising

    Water expands as it warms, causing global sea levels to rise; melting of land-based ice also raises sea level by adding water to the oceans. Over the past century, global average sea level has risen by about 8 inches. Since 1992, the rate of global sea level rise measured by satellites has been roughly twice the rate observed over the last century, providing evidence of acceleration. Sea level is projected to rise by another 1 to 4 feet in this century.

    [caption id="attachment_3825" align="alignleft" width="234"]The third National Climate Assessment was released Tuesday, May 6th by the U.S. Global Change Research Program (USGCRP) The third National Climate Assessment was released Tuesday, May 6th by the U.S. Global Change Research Program (USGCRP)[/caption]

    This is not a partisan issue. These trends have real, everyday consequences for local governments, who are on the front lines when it comes to mitigation and adaptation efforts.

    Consider one of the trends above: sea level rise. The stakes couldn’t be higher for local governments. Combined with coastal storms, rising seas increase the risk of erosion, storm surge damage, and flooding for coastal communities, especially along the Gulf Coast, the Atlantic seaboard, and in Alaska. Coastal infrastructure, including roads, rail lines, energy infrastructure, airports, port facilities, and military bases, is increasingly at risk, as are the nearly five million Americans and hundreds of billions of dollars of property located in areas less than four feet above the local high-tide level.

    Local governments can’t go it alone. The federal government must step up and enact policies and programs that support local efforts. The Administration has taken the lead with the President’s Climate Action Plan, which created the President’s Task Force on Climate Preparedness and Resilience, a group of state and local officials who advise the President on ways the federal government can assist local efforts to address and prepare for the impacts of climate change.

    NLC and ICLEI USA, under the auspices of the Resilient Communities for America campaign, where 175 mayors and other local elected leaders have pledged to create more resilient cities and towns, will release a policy report next week with nine recommendations on ways the federal government can remove barriers to local resilient investments, modernize federal grant and loan programs to better support local efforts, and develop the information and tools needed to prepare for climate change.

    The 1,300-page National Climate Assessment is presented online in an accessible, interactive, and graphic manner, which makes it easy for local leaders to dive in and understand how climate change will impact their communities. For communities that have not yet begun climate mitigation efforts, adaptation planning, or resilience-building programs, policies, and practices, now is the time. Climate change is happening now.

    Carolyn Berndt is the Program Director for Sustainability on the NLC Federal Advocacy team. She leads NLC’s advocacy, regulatory, and policy efforts on energy and environmental issues, including water infrastructure and financing, air and water quality, climate change, and energy efficiency. Carolyn will speak at the 2014 AGU Science Policy Conference on June 17th. Follow Carolyn on Twitter at @BerndtCarolyn

  • By Lexi Shultz, Director of Public Affairs at the American Geophysical Union

    Today, the White House released the U.S. Global Change Research Program’s (USGCRP) third National Climate Assessment (NCA). This report, coming in at 1,300 pages and written by more than 300 authors (many of whom are American Geophysical Union members), is an impressive accounting of the many current and future effects of climate change across the country. The NCA synthesizes and summarizes the ways in which all regions and aspects of the U.S.—from Alaska to Florida, from human health to energy and national security—are being affected by climate change and will continue to be affected.

    [caption id="attachment_3825" align="alignright" width="234"]The third National Climate Assessment was released Tuesday, May 6th by the U.S. Global Change Research Program (USGCRP) The third National Climate Assessment was released Tuesday, May 6th by the U.S. Global Change Research Program (USGCRP)[/caption]

    But what excites me most about the NCA is the accessibility of the information and the report’s focus on—in addition to the scientific findings—decision making in the face of uncertainty.  Affected communities need information now about how they can cope with the effects they are feeling already, as well as how they can prepare for what’s to come. The information the NCA provides can help with that. The report also makes clear, however, that some of the projected impacts will be very costly – and unpleasant – to deal with.  As a result, I’m glad to see an emphasis not only on adaptation and preparation, but also mitigation.

    The American Geophysical Union’s stated position on climate change, written in 2003 and revised and reaffirmed most recently last year, is titled “Human‐Induced Climate Change Requires Urgent Action.”  AGU, through its members, crafted this statement to help us grapple with what we believe to be one of society’s greatest challenges.

    The NCA report only underscores this point. AGU is committed to making sure that the NCA remains part of the conversation and that we don’t lose sight of it as an important resource as we move forward. For example, the report will play an important role in shaping our discussion of climate change preparedness at our Science Policy Conference in June.

    But we cannot do this alone. As the Director of Public Affairs at AGU, I think daily about how to connect scientists with policy makers and the public. Scientists, policy makers, and the public must not be siloed. My hope is that the NCA report, crafted for accessibility and usability, will provide us with an opportunity to continue the dialogue among these three communities and to develop solutions that are regionally appropriate and based on sound science.

    This is where you come in. Explore the USGCRP website and the interactive report online. Then, I’d like to personally encourage you to engage with your communities on the topic of climate change preparedness, adaptation, and mitigation. Write a letter to the editor of your local newspaper, contact your legislators, or set up a community meeting, and then tell us about your experience using the Sharing Science tool on our website. As one of the key findings of the NCA report states, “There is no ‘one-size fits all’ adaptation, but there are similarities in approaches across regions and sectors. Sharing best practices, learning by doing, and iterative and collaborative processes including stakeholder involvement, can help support progress.” If we are to make true progress toward climate change adaptation and mitigation, we need to start the conversation, from grassroots to grasstops, and across communities. Do you have ideas for ways AGU can be involved? Leave us a comment here or send us an email at sciencepolicy@agu.org.

  • Written by John Schelling, Washington State Emergency Management

    The invitation to contribute my perspective on tsunami risk reduction efforts to “The Bridge” arrived on my tablet as I sat in the Snohomish County Emergency Operations Center (EOC) in Everett, Washington. There I was—working as part of the response and recovery effort to a major landslide (the Oso landslide, which occurred at 10:37 a.m. on March 22)—and presented with the question, “Just how much of a threat are tsunamis, really?”

    [caption id="attachment_3808" align="alignright" width="300"]Combination berm-tower structure in profile view Combination structures offer the advantages of two types of structures: berm and tower. In the berm-tower combination, the footprint in reduced by creating a tower platform that is accessed by a series of ramps and/or sloping berms and can reduce visual impacts of hardened towers or large berms. Credit: Ron Kasprisin, University of Washington  Combination berm-tower structure in profile viewCombination structures offer the advantages of two types of structures: berm and tower. In the berm-tower combination, the footprint in reduced by creating a tower platform that is accessed by a series of ramps and/or sloping berms and can reduce visual impacts of hardened towers or large berms. Credit: Ron Kasprisin, University of Washington[/caption]

    The Oso landslide was a localized event that affected three small communities in the foothills of the Pacific Northwest. However, this disaster presents some of the very same challenges and opportunities to mitigate losses that we see with an earthquake and tsunami.

    With the recent earthquakes that rocked Chile and the Solomon Islands in the same timeframe as the Oso landslide, and with the 10-year anniversary of the Indian Ocean earthquake and tsunami on the horizon, it’s appropriate to reflect on the past, take stock of our progress, and consider what the future might hold.

    The Great Sumatra–Andaman earthquake and tsunami, which claimed more than 230,000 victims on that fateful December day in 2004, was a profound tragedy felt throughout the world. In the United States, Congress recognized that our shorelines face similar threats, that our tsunami warning system needed to be improved, and that our coastal communities needed to be better prepared for the day when an earthquake and tsunami strike U.S. shores.

    In 2006, Congress passed the Tsunami Warning and Education Act (TWEA), which expired in 2012. TWEA helped solidify work that had been underway since 1995 as part of the National Tsunami Hazard Mitigation Program (NTHMP). Through the NTHMP, remarkable progress was made in a relatively short time to improve the readiness of coastal residents and visitors. The message was getting out: when the ground begins to shake from a local earthquake – a precursor to a potential local tsunami – people must evacuate at a moment’s notice, and they increasingly knew where to go. Despite this substantial progress, more work remains.

    Improvements in scientific research on tsunami sources and in the technological side of the warning system have enabled states like Washington to eliminate the need for dangerous, unnecessary evacuations. During the 2011 Great East Japan Earthquake and Tsunami, we went ‘all in’ on the tsunami warning system and the science behind it. And it paid off, negating the need to evacuate the entire ocean coast of Washington State. It also paid off for Alaska, Oregon, California, and Hawaii, which were able to respond effectively to the Tsunami Warning issued for their coasts.

    The reality is that many of our coastal communities are poised to respond more quickly and are much safer from tsunami threats today than they were in 1995. However, some communities are better prepared for a distant tsunami than for one closer to home. Many coastal areas have no high ground to which people can evacuate from a local tsunami. In these cases, a strategy called vertical evacuation is perhaps the only way to improve life safety.

    Through the NTHMP, Washington State initiated an effort known as Project Safe Haven to empower local communities that lack natural high ground to develop plans for integrating multi-purpose reinforced towers, buildings, and berms into the natural and built environments. And we’re seeing results. The voters of Westport, Washington, recently approved a local bond to construct the nation’s first tsunami vertical evacuation refuge as part of a new elementary school. In the coastal community of Long Beach, Washington, the city council approved an application for construction of a vertical evacuation berm adjacent to the community’s school and downtown business district. That application is currently under review with the Federal Emergency Management Agency.

    Following disasters like the Oso landslide or the Great East Japan Earthquake, we talk a lot about ‘lessons learned’. But we must have the opportunities and resources to apply the solutions once we’ve learned the lesson. One of the key lessons from the Great East Japan Earthquake is the need for continuous updates of hazard assessments. Who knew that a subduction zone was capable of 50 meters of slip? Now that we know, we need to revisit previously held assumptions and perhaps revise prior hazard maps, evacuation areas, and evacuation routes. Like preparedness, this process is never done. There has to be a feedback loop to consider new data, revise hazard maps when necessary, ensure that the public is fully educated about these changes, and ensure that communities are appropriately resourced to conduct drills and actually practice their plans.

    Japan has invested in technologies like offshore GPS, and Canada has invested in ocean-bottom seismometers. The United States has not. The U.S. can’t afford not to deploy similar technologies to further improve our understanding of threatening subduction zones. Advancements in earthquake early warning technologies in California and along the West Coast of the United States need to be integrated into existing systems and processes so there is no confusion about what life-safety actions need to be taken with perhaps nothing more than a few seconds’ notice – especially for people living on the coast.

    From my perspective, we must always remember that there is a delicate balance between science, technology, and their actual application. Investment in all of the technological marvels one can consider won’t stop the earth from shaking. We must, however, continue to ensure residents and visitors in tsunami hazard zones understand nature’s warning systems and can react quickly and respond appropriately when the time comes.

    On March 27, 2014, Senator Mark Begich of Alaska, along with co-sponsors including Washington State Senator Maria Cantwell and Hawaii Senator Brian Schatz, introduced the Tsunami Warning and Education Reauthorization Act (TWERA). Hopefully, the U.S. House of Representatives will consider this or a similar measure in order to ensure that communities have the resources they need to keep calm, carry on, and prepare.

    John Schelling is the Earthquake, Tsunami, and Volcano Programs Manager for Washington State Emergency Management (WA EMD) and has extensive experience in emergency management, land use planning, and risk reduction policy analysis. He is currently serving as the Interim Mitigation & Recovery Section Manager for WA EMD.

  • By Alexandra Branscombe

    Originally posted on AGU GeoSpace

    WASHINGTON, DC – Phasing down powerful climate-damaging greenhouse gases used in refrigerators and air conditioners could prevent the equivalent of […]

