With each passing year, our planet experiences the effects of climate change in bigger and more catastrophic ways. Longer, drier summers are leading to drought, devastating wildfires and lost crops. Intense periods of rainfall and rising sea levels are flooding our cities. Warming oceans are adversely affecting marine species, destroying coral reefs and disrupting coastal communities that rely on fishing industries.
Our goal at Vulcan is to improve the world’s understanding of climate change and its effects, especially on rainfall patterns and extremes. One way we’re working to achieve this is by improving existing climate modeling methods. Using the latest programming languages along with supercomputing and machine learning technologies, our Vulcan Climate Modeling (VCM) team is partnering with NOAA, MeteoSchweiz and the University of Washington’s Department of Atmospheric Sciences to produce faster, more accurate forecasts.
Chris Bretherton (left) and Oli Fuhrer
Below, Chris Bretherton, Sr. Director of VCM, who leads the machine learning group, and his colleague Oli Fuhrer, Sr. Director on the VCM team, who leads the Domain Specific Language (DSL) group, tell us why climate models matter, how their innovations will be “game changers” in the field, and what makes them optimistic about the future of our planet.
What are climate models and why do they matter?
Humans are changing the climate. Scientists can measure alarming trends in carbon dioxide and other greenhouse gases, which act like an insulating blanket that warms the Earth’s surface. They can unequivocally attribute the CO2 trend to human activities, mostly the combustion of fossil fuels. The Swedish chemist Svante Arrhenius first told us over a century ago that the Earth will warm a few degrees Fahrenheit for each doubling of CO2.
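Arrhenius’s result is commonly written as a logarithmic relation between warming and CO2 concentration. A minimal sketch in standard notation (the symbols are our own choices, not from the article):

$$\Delta T \;=\; S \,\log_2\!\left(\frac{C}{C_0}\right)$$

where $C$ is the CO2 concentration, $C_0$ a reference (e.g. pre-industrial) concentration, and $S$ the warming per CO2 doubling. Setting $C = 2C_0$ gives $\Delta T = S$: each doubling adds the same increment of warming, no matter the starting concentration.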
To predict the precise amount and geographic distribution of land and ocean warming, how it will affect rainfall, snowfall, ice sheets, fisheries, crops and forests, we need computer models of climate. They work by simulating Earth-like weather, ocean currents, soils and vegetation, and ice and snow hour-by-hour on a grid of points covering the globe for periods of many years, as greenhouse gas concentrations or other climate change agents gradually change.
The first global climate models were developed almost 60 years ago. They have improved as we better understand how to translate the Earth system into mathematical equations solvable on a computer, and as we use finer computational grids to better represent features such as mountain ranges and islands. Current climate models use grids covering the globe with columns of points spaced 20-150 miles apart. Finer grids are more accurate but take much more computer resources for each simulated year.
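The cost penalty of finer grids can be sketched with a standard rule of thumb (our illustration, not a figure from any specific model): halving the horizontal spacing quadruples the number of grid columns, and numerical stability typically requires halving the time step as well, so each halving multiplies the cost of a simulated year by roughly eight.

```python
def relative_cost(coarse_spacing, fine_spacing):
    """Rough relative cost of refining horizontal grid spacing.

    Assumes cost scales with the number of grid columns (1/dx^2)
    times the number of time steps (1/dx, from the stability limit
    on the time step), i.e. roughly (coarse/fine)**3. This is a
    rule-of-thumb sketch, not the measured scaling of any model.
    """
    ratio = coarse_spacing / fine_spacing
    return ratio ** 3

# Refining a 100-mile grid to 25 miles: 4x finer -> ~64x the cost
print(relative_cost(100, 25))  # 64.0
```

This cubic scaling is why century-long simulations remain confined to coarse grids while mile-scale grids are affordable only for a year or two.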
How do they currently inform future climate change? (What do they currently tell us?)
There are upward of 20 global climate models independently developed by centers all around the world. All these climate models predict a pattern of warming consistent with what we are already seeing, stronger over land than ocean and strongest in the Arctic. They predict the Arctic will become wetter, but the Mediterranean will become drier. They predict that unless we stop greenhouse emissions very soon, the Arctic Ocean will be ice-free by mid-century and sea level will rise by up to 4 feet by the end of the century. They predict that skiing near Seattle will become an increasingly soggy affair, but that England may be the next great wine terroir. They predict longer droughts but also more intense rainfall and hurricanes.
Importantly, they can project how different trajectories of a transition away from fossil fuels would likely change the amount of climate change that we will see and the damages that might result. The scientific consensus is that the risks of catastrophic and irreversible impacts increase rapidly if the global-average warming exceeds 4°F (2°C), which will occur within 50 years unless we reduce human CO2 emissions drastically in the next few decades.
What have been the limitations of climate models past? (What data are we missing?)
Climate models must represent a mind-boggling array of processes, some of which we understand well (e.g. how air and water flow), some of which we don’t (e.g. the future of life in a warmer, more acidic ocean). This understanding is often generalized from limited data gathered in a few places and times. For instance, scientists study the Jakobshavn Glacier in west Greenland to learn about the climate change response of large glaciers in general. Difficult compromises must be made between adding complexity and keeping the climate model efficient and understandable. Even processes we do understand, such as air flow around mountains, can be challenging to accurately represent on the coarse grid of a climate model.
Three key atmospheric modeling uncertainties in predicting climate change over the next century involve rain and clouds. Rain and clouds challenge climate models because they vary a lot within an individual grid cell of a global climate model, and such variability is not naturally predicted by the model. The world’s climate modeling centers deal with this in diverse ways, leading to different climate change predictions.
The first uncertainty, which is our focus at Vulcan Inc., is how regional precipitation will change. This is a particular concern over semi-arid land regions, where agriculture and cities are vulnerable to drought. Some climate models suggest increased rainfall over the Sahel in western and north-central Africa during the 21st century, while others suggest a decrease. Two other key uncertainties are the response of clouds to greenhouse warming and to aerosol pollution; these ‘cloud feedbacks’ affect the overall sensitivity of global warming to human activities. In the future, our strategy could be extended to tackle those uncertainties, too.
How is your team working to change climate modeling?
Our goal at Vulcan is to take better advantage of global weather models that employ extremely fine grids with a horizontal spacing of 1-2 miles. These grids are fine enough to simulate individual thunderstorms that produce much of the world’s rainfall. We could use such models for studying climate with carefully designed simulations of as little as a year or two.
One roadblock is that current climate models struggle to make efficient use of an ever-changing landscape of new supercomputers. Our domain-specific language work is implementing a new way of designing climate models to solve this problem, allowing fine-grid weather and climate models to run up to ten times faster and for longer simulated periods. Our machine learning (ML) work uses output from such models to discover the variability of clouds and rain within the much coarser grid of a climate model affordable for century-long simulations. The machine learning software is designed to improve the climate model’s accuracy and reduce its rainfall prediction uncertainty by 50 percent. This will help societies plan for and adapt to the huge water resource, agricultural and ecosystem impacts of climate change.
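A core step in building such ML training data is coarse-graining: averaging a fine-grid field onto the coarse climate-model grid, so that the difference between the two reveals the sub-grid variability the ML scheme must learn. A minimal NumPy sketch (the block size and random “rainfall” field are illustrative assumptions, not VCM’s actual pipeline):

```python
import numpy as np

def coarse_grain(field, block):
    """Block-average a 2-D fine-grid field onto a coarser grid.

    Each coarse cell is the mean of a block x block patch of
    fine-grid cells; what the average throws away is the sub-grid
    variability an ML scheme could be trained to represent.
    """
    ny, nx = field.shape
    assert ny % block == 0 and nx % block == 0
    return field.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))

# Illustrative: an 8x8 "fine grid" of rainfall averaged onto 2x2
rng = np.random.default_rng(0)
fine = rng.random((8, 8))
coarse = coarse_grain(fine, 4)
print(coarse.shape)  # (2, 2)
```

In a real pipeline the fine field would come from a storm-resolving simulation, and the ML model would be trained to predict the fine-grid statistics from the coarse-grid state.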
Who are you working with in partnership and why?
The Vulcan Climate Modeling team has a strong collaboration with NOAA’s Geophysical Fluid Dynamics Laboratory (GFDL), a world-leading climate modeling center in Princeton, NJ. Our work is focused on improving their experimental fine-grid model, SHiELD, which shares components with the U.S. global weather forecasting model. This collaboration brings our team’s innovation together with GFDL’s climate modeling experience and computing resources to achieve quicker impact that can set an example for other climate modeling centers to follow.
To complete the impact chain, the international climate modeling community, through a process called CMIP, undertakes coordinated climate change simulations which are archived to help stakeholder groups plan for climate change. This simulation archive is also a foundation of international climate assessments (organized by the Intergovernmental Panel on Climate Change, or IPCC) and national climate assessments (which the U.S. releases quadrennially), which feed into annual United Nations Conference of the Parties (COP) meetings where governments negotiate international climate change agreements such as the Paris accord.
Vulcan's Climate Modeling Team at work in Seattle.
What makes the Vulcan Climate Modeling team unique?
Across the board, we work closely with each other and with our partner institutions.
Chris Bretherton has been a University of Washington Professor of Atmospheric Science and Applied Mathematics for nearly 35 years, studying how to better understand and simulate the role of clouds in the climate system. His team members have all earned their doctorates within the last five years. Five of the seven did either graduate or post-doctoral studies at the University of Washington in diverse areas of climate science. All seven have a passion for Python, machine learning and cloud computing.
- Noah Brenowitz studied cumulonimbus cloud systems using machine learning
- Andre Perkins studied the climate of the last thousand years
- Brian Henn studied ‘atmospheric rivers’ and California flooding
- Jeremy McGibbon used machine learning to predict low-lying marine clouds
- Oli Watt-Meyer modeled how tropical rain belts respond to global warming
- Astrophysicist Anna Kwa honed her Machine Learning expertise at Zillow.
- Spencer Clark is embedded at GFDL, where he earlier studied monsoons
Oli Fuhrer, who came to Vulcan from MeteoSchweiz, Switzerland’s national weather service, pioneered the use of domain-specific languages to speed up its weather models, allowing it to exploit a powerful new supercomputer that uses graphics processing units and other accelerator technologies. His team brings together experts from around the world in numerical modeling, compiler design, and scientific computing:
- Compiler expert Tobias Wicky came from Zurich, where he worked with Oli.
- Rhea George applied atmospheric modeling to renewable energy forecasting
- Johann Dahm worked on code development and optimization for IBM
- Mark Cheeseman worked in a supercomputing center in Australia
- Eddie Davis did his Ph.D. on compiler design at Boise State University
- Computational physicist Oliver Elbert is embedded at GFDL
In Seattle, our team sits and works together and has spirited discussions about local lunch spots, training regimens, and outdoor adventures. They talk daily to our Vulcan Climate Modeling team members and collaborators in Princeton and Zurich.
How will this be a game-changer for predicting future climate change?
With 1-3 kilometer grids, global models can better simulate high-impact extreme precipitation, including over mountainous terrain. Using such a model to train our ML scheme, we hope to similarly improve coarser-resolution models to inform planning for flood control, land use and other areas in a changing climate. Currently, a second stage of ‘downscaling’ software is used for that, which adds a lot of uncertainty and cost.
What will stakeholders (governments, etc.) be able to do with these climate models once they are available?
Ruinous hurricanes and wildfires are reminders of our pressing need for reliable prediction of weather extremes in our changing climate. How high a sea wall does New York need to protect against higher sea levels and bigger waves driven by more intense cyclones decades from now? Where should homeowners in the Australian bush be allowed to rebuild their scorched homes, and will they be able to get fire insurance? Most planners will consult climate model predictions, so even modest reductions in their uncertainty will inform smarter decisions.
What makes you optimistic about our planet’s future?
Our planet has gone through many climates, from a volcano-induced hothouse 50 million years ago to an ice-covered ‘snowball Earth’ 700 million years ago, and everything in between. Life on Earth will adapt to any climate, though perhaps in a form very different from today’s.
But it is also not too late to avoid unrecognizably changing the planet. A concerted global effort over the next 30 years can turn the corner on the global warming challenge. When really pushed to it, even humans have proved smart enough to use our incredible knowledge to rescue ourselves, our societies, and our natural world from our self-made crises.
What can the average person do to help?
Climate change is a pressing collective problem and requires strong collective action. A widely touted goal is to reduce our global CO2 emissions by 80 percent by 2050, to avoid catastrophic impacts by keeping global warming under 3°F.
Luckily, we know how to get started. The most important thing we can do is advocate for and support government policies that spur quick growth in renewable energy such as wind and solar power, as well as in enabling technologies (electric vehicles and mass transit, smart grids, power storage, green building design). We should work to raise consciousness among friends, neighbors, and kids, so that everyone gets behind solving the climate change problem. Lastly, we should also try to reduce our personal carbon footprints through energy-efficient and resource-efficient choices at home, in our commutes and in our travel.