Earth science


Everyone agrees: 2024 the hottest year since the thermometer was invented


An exceptionally hot outlier, 2024 extends the streak of hottest years to 11.

With very few and very small exceptions, 2024 was unusually hot across the globe. Credit: Copernicus

Over the last 24 hours or so, the major organizations that keep track of global temperatures have released figures for 2024, and all of them agree: 2024 was the warmest year yet recorded, joining 2023 as an unusual outlier in terms of how rapidly things heated up. At least two of the organizations, the European Union’s Copernicus and Berkeley Earth, place the year at about 1.6° C above pre-industrial temperatures, marking the first time that the Paris Agreement goal of limiting warming to 1.5° has been exceeded.

NASA and the National Oceanic and Atmospheric Administration both place the mark at slightly below 1.5° C over pre-industrial temperatures (as defined by the 1850–1900 average). However, that difference largely reflects the uncertainties in measuring temperatures during that period rather than disagreement over 2024.

It’s hot everywhere

2023 had set a temperature record largely due to a switch to El Niño conditions midway through the year, which made the second half of the year exceptionally hot. It takes some time for that heat to make its way from the ocean into the atmosphere, so the streak of warm months continued into 2024, even as the Pacific switched into its cooler La Niña mode.

While El Niños are regular events, this one had an outsized impact because it was accompanied by unusually warm temperatures outside the Pacific, including record high temperatures in the Atlantic and unusual warmth in the Indian Ocean. Land temperatures reflect this widespread warmth, with elevated temperatures on all continents. Berkeley Earth estimates that 104 countries registered 2024 as the warmest on record, meaning 3.3 billion people felt the hottest average temperatures they had ever experienced.

Different organizations use slightly different methods to calculate the global temperature and have different baselines. For example, Copernicus puts 2024 at 0.72° C above a baseline that will be familiar to many people since they were alive for it: 1991 to 2020. In contrast, NASA and NOAA use a baseline that covers the entirety of the last century, which is substantially cooler overall. Relative to that baseline, 2024 is 1.29° C warmer.

Lining up the baselines shows that these different services largely agree with each other. Most of the remaining differences are due to uncertainties in the measurements, with the rest accounted for by slightly different methods of handling things like areas with sparse data.
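Since every agency reports its anomaly against its own reference period, comparing them is just a matter of adding the offset between baselines. Below is a minimal sketch of that arithmetic in Python; the baseline offset is not an official figure, just the value implied by the article's numbers.

```python
# Minimal sketch: converting a temperature anomaly from one baseline to another.
# The offset below is an illustrative assumption, not an official agency value.

def convert_anomaly(anomaly, offset_between_baselines):
    """Shift an anomaly onto a different reference period.

    offset_between_baselines: how much warmer the newer baseline is
    than the older one (in degrees C).
    """
    return anomaly + offset_between_baselines

# Example: Copernicus reports 2024 as 0.72 C above its 1991-2020 baseline.
# If that baseline were 0.88 C warmer than 1850-1900 (the value implied by the
# article's numbers), the same year would sit near 1.60 C above pre-industrial.
copernicus_anomaly = 0.72        # C above 1991-2020 (from the article)
assumed_baseline_offset = 0.88   # C, hypothetical illustration
print(convert_anomaly(copernicus_anomaly, assumed_baseline_offset))  # roughly 1.60
```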

Describing the details of 2024, however, doesn’t really capture just how exceptional the warmth of the last two years has been. Starting around 1970, there’s been a roughly linear increase in temperature driven by greenhouse gas emissions, despite many individual years that were warmer or cooler than the trend. The last two years have been extreme outliers from this trend. The last time there was a single year comparable to 2024 was back in the 1940s. The last time there were two consecutive years like this was in 1878.

Relative to the five-year temperature average, 2024 is an exceptionally large excursion. Credit: Copernicus

“These were during the ‘Great Drought’ of 1875 to 1878, when it is estimated that around 50 million people died in India, China, and parts of Africa and South America,” the EU’s Copernicus service notes. Despite many climate-driven disasters, the world at least avoided a similar experience in 2023-24.

Berkeley Earth provides a slightly different way of looking at it, comparing each year since 1970 with the amount of warming we’d expect from the cumulative greenhouse gas emissions.

Relative to the expected warming from greenhouse gasses, 2024 represents a large departure. Credit: Berkeley Earth

This comparison shows that, given year-to-year variations in the climate system, warming has closely tracked expectations over five decades. 2023 and 2024 mark a dramatic departure from that track, although it comes at the end of a decade where most years were above the trend line. Berkeley Earth estimates that there’s just a 1 in 100 chance of that occurring due to the climate’s internal variability.

Is this a new trend?

The big question is whether 2024 is an exception, with temperatures likely to fall back to the trend that has dominated since the 1970s, or whether it marks a departure from the climate’s recent behavior. And that’s something we don’t have a great answer to.

If you take away the influence of recent greenhouse gas emissions and El Niño, you can focus on other potential factors. These include a slight increase expected due to the solar cycle approaching its maximum activity. But, beyond that, most of the other factors are uncertain. The Hunga Tonga eruption put lots of water vapor into the stratosphere, but the estimated effects range from slight warming to cooling equivalent to a strong La Niña. Reductions in pollution from shipping are expected to contribute to warming, but the amount is debated.

There is evidence that a decrease in cloud cover has allowed more sunlight to be absorbed by the Earth, contributing to the planet’s warming. But clouds are typically a response to other factors that influence the climate, such as the amount of water vapor in the atmosphere and the aerosols present to seed water droplets.

It’s possible that a factor that we missed is driving the changes in cloud cover or that 2024 just saw the chaotic nature of the atmosphere result in less cloud cover. Alternatively, we may have crossed a warming tipping point, where the warmth of the atmosphere makes cloud formation less likely. Knowing that will be critical going forward, but we simply don’t have a good answer right now.

Climate goals

There’s an equally unsatisfying answer to what this means for our chance of hitting climate goals. The stretch goal of the Paris Agreement is to limit warming to 1.5° C, because it leads to significantly less severe impacts than the primary, 2.0° target. That’s relative to pre-industrial temperatures, which are defined using the 1850–1900 period, the earliest time where temperature records allow a reconstruction of the global temperature.

Unfortunately, all the organizations that handle global temperatures have some differences in the analysis methods and data used. Given recent data, these differences result in very small divergences in the estimated global temperatures. But with the far larger uncertainties in the 1850–1900 data, they tend to diverge more dramatically. As a result, each organization has a different baseline, and different anomalies relative to that.

Berkeley Earth thus registers 2024 as being 1.62° C above preindustrial temperatures, and Copernicus 1.60° C. In contrast, NASA and NOAA place it just under 1.5° C (1.47° and 1.46°, respectively). NASA’s Gavin Schmidt said this is “almost entirely due to the [sea surface temperature] data set being used” in constructing the temperature record.

There is, however, consensus that this isn’t especially meaningful on its own. There’s a good chance that temperatures will drop below the 1.5° mark on all the data sets within the next few years. We’ll want to see temperatures consistently exceed that mark for over a decade before we consider that we’ve passed the milestone.

That said, given that carbon emissions have barely budged in recent years, there’s little doubt that we will eventually end up clearly passing that limit (Berkeley Earth is essentially treating it as exceeded already). But there’s widespread agreement that each increment between 1.5° and 2.0° will likely increase the consequences of climate change, and any continuing emissions will make it harder to bring things back under that target in the future through methods like carbon capture and storage.

So, while we may have committed ourselves to exceed one of our major climate targets, that shouldn’t be viewed as a reason to stop trying to limit greenhouse gas emissions.



Study: Warming has accelerated due to the Earth absorbing more sunlight

The concept of an atmospheric energy imbalance is pretty straightforward: We can measure both the amount of energy the Earth receives from the Sun and how much energy it radiates back into space. Any difference between the two results in a net energy imbalance that’s either absorbed by or extracted from the ocean/atmosphere system. And we’ve been tracking it via satellite for a while now as rising greenhouse gas levels have gradually increased the imbalance.

But greenhouse gases aren’t the only thing having an effect. For example, the imbalance has also increased in the Arctic due to the loss of snow cover and retreat of sea ice. The dark ground and ocean absorb more solar energy compared to the white material that had previously been exposed to the sunlight. Not all of this is felt directly, however, as a lot of the areas where it’s happening are frequently covered by clouds.

Nevertheless, the loss of snow and ice has caused the Earth’s reflectivity, termed its albedo, to decline since the 1970s, enhancing the warming a bit.
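To get a feel for the magnitudes involved, the globally averaged energy budget can be sketched with back-of-envelope numbers: absorbed sunlight scales with one minus the albedo, so even a percent-scale albedo change shifts the budget by a few watts per square meter. The snippet below is a rough illustration using round textbook values, not the study's calculation.

```python
# Rough, illustrative energy-budget arithmetic (not the study's method).
# Values are round numbers commonly used for back-of-envelope estimates.

SOLAR_CONSTANT = 1361.0          # W/m^2 at the top of the atmosphere
INCOMING = SOLAR_CONSTANT / 4.0  # averaged over the whole (spherical) Earth

def absorbed_sunlight(albedo):
    """Globally averaged absorbed solar power per square meter."""
    return INCOMING * (1.0 - albedo)

def imbalance(albedo, outgoing_longwave):
    """Net energy imbalance: absorbed sunlight minus emitted infrared."""
    return absorbed_sunlight(albedo) - outgoing_longwave

# A hypothetical drop in albedo from 0.30 to 0.29 adds a few W/m^2 of
# absorbed sunlight, all else being equal.
print(absorbed_sunlight(0.29) - absorbed_sunlight(0.30))   # ~ +3.4 W/m^2

# With a typical round-number value for outgoing infrared, the net
# imbalance for the lower albedo comes out positive (heat is accumulating).
print(imbalance(0.29, outgoing_longwave=239.0))            # ~ +2.6 W/m^2
```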

Vanishing clouds

The new paper finds that the energy imbalance set a new high in 2023, with a record amount of energy being absorbed by the ocean/atmosphere system. This wasn’t accompanied by a drop in infrared emissions from the Earth, suggesting it wasn’t due to greenhouse gases, which trap heat by absorbing this radiation. Instead, it seems to be due to decreased reflection of incoming sunlight by the Earth.

While there was a general trend in that direction, the planet set a new record low for albedo in 2023. Using two different data sets, the teams identify the areas most affected by this, and they’re not at the poles, indicating that the loss of snow and ice is unlikely to be the cause. Instead, the key contributor appears to be the loss of low-level clouds. “The cloud-related albedo reduction is apparently largely due to a pronounced decline of low-level clouds over the northern mid-latitude and tropical oceans, in particular the Atlantic,” the researchers say.


Google’s DeepMind tackles weather forecasting, with great performance

By some measures, AI systems are now competitive with traditional computing methods for generating weather forecasts. Because their training penalizes errors, however, the forecasts tend to get “blurry”—as you move further ahead in time, the models make fewer specific predictions since those are more likely to be wrong. As a result, you start to see things like storm tracks broadening and the storms themselves losing clearly defined edges.

But using AI is still extremely tempting because the alternative, a computational atmospheric circulation model, is enormously compute-intensive. It is also highly successful, with the ensemble model from the European Centre for Medium-Range Weather Forecasts considered the best in class.

In a paper being released today, Google’s DeepMind claims its new AI system manages to outperform the European model on forecasts out to at least a week and often beyond. DeepMind’s system, called GenCast, merges some computational approaches used by atmospheric scientists with a diffusion model, commonly used in generative AI. The result is a system that maintains high resolution while cutting the computational cost significantly.

Ensemble forecasting

Traditional computational methods have two main advantages over AI systems. The first is that they’re directly based on atmospheric physics, incorporating the rules we know govern the behavior of our actual weather, and they calculate some of the details in a way that’s directly informed by empirical data. They’re also run as ensembles, meaning that multiple instances of the model are run. Due to the chaotic nature of the weather, these different runs will gradually diverge, providing a measure of the uncertainty of the forecast.

At least one attempt has been made to merge some of the aspects of traditional weather models with AI systems. An internal Google project used a traditional atmospheric circulation model that divided the Earth’s surface into a grid of cells but used an AI to predict the behavior of each cell. This provided much better computational performance, but at the expense of relatively large grid cells, which resulted in relatively low resolution.

For its take on AI weather predictions, DeepMind decided to skip the physics and instead adopt the ability to run an ensemble.

GenCast is based on diffusion models, which have a key feature that’s useful here. In essence, these models are trained by giving them an original—an image, text, a weather pattern—along with a variation of it that has had noise injected. The system is supposed to produce a version of the noisy input that is closer to the original. Once trained, it can be fed pure noise and evolve that noise toward whatever it’s targeting.

In this case, the target is realistic weather data, and the system takes an input of pure noise and evolves it based on the atmosphere’s current state and its recent history. For longer-range forecasts, the “history” includes both the actual data and the predicted data from earlier forecasts. The system moves forward in 12-hour steps, so the forecast for day three will incorporate the starting conditions, the earlier history, and the two forecasts from days one and two.

This is useful for creating an ensemble forecast because you can feed it different patterns of noise as input, and each will produce a slightly different output of weather data. This serves the same purpose it does in a traditional weather model: providing a measure of the uncertainty for the forecast.
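A highly simplified sketch of how such an ensemble rollout might look is below. Every function, shape, and constant here is invented for illustration; the real GenCast denoiser, conditioning, and data formats are far more involved.

```python
import numpy as np

# Toy sketch of a GenCast-style ensemble rollout (names and shapes are
# hypothetical; the real model's denoiser and state format are far richer).

def denoise_step(noise, current_state, previous_state):
    """Stand-in for a trained diffusion denoiser: maps noise plus the recent
    atmospheric history to a plausible state 12 hours later."""
    # Placeholder dynamics: persistence plus a small noise-driven perturbation.
    return current_state + 0.1 * (current_state - previous_state) + 0.05 * noise

def rollout(initial_state, previous_state, steps, rng):
    """Autoregressive forecast: each 12-hour step conditions on the two
    most recent states, as described in the article."""
    prev, curr = previous_state, initial_state
    trajectory = []
    for _ in range(steps):
        noise = rng.standard_normal(curr.shape)
        nxt = denoise_step(noise, curr, prev)
        trajectory.append(nxt)
        prev, curr = curr, nxt
    return trajectory

# Ensemble: repeat the rollout with different random noise; the spread of the
# members gives a measure of forecast uncertainty.
rng = np.random.default_rng(0)
state_now = rng.standard_normal((90, 180))      # toy lat/lon grid
state_12h_ago = rng.standard_normal((90, 180))
ensemble = [rollout(state_now, state_12h_ago, steps=6, rng=rng) for _ in range(8)]
```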

For each grid square, GenCast works with six weather measures at the surface, along with six that track the state of the atmosphere, each of the latter sampled at 13 different altitudes (defined by air pressure). Each of these grid squares is 0.2 degrees on a side, a higher resolution than the European model uses for its forecasts. Despite that resolution, DeepMind estimates that a single instance (meaning not a full ensemble) can be run out to 15 days on one of Google’s tensor processing systems in just eight minutes.

It’s possible to make an ensemble forecast by running multiple versions of this in parallel and then integrating the results. Given the amount of hardware Google has at its disposal, the whole process from start to finish is likely to take less than 20 minutes. The source and training data will be placed on the GitHub page for DeepMind’s GraphCast project. Given the relatively low computational requirements, we can probably expect individual academic research teams to start experimenting with it.
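Taken at face value, the figures quoted above imply a sizable state array for every 12-hour step, which is part of why the eight-minute runtime is notable. A quick, purely illustrative back-of-envelope calculation:

```python
# Back-of-envelope size of a single GenCast-style state, using the figures
# quoted in the article (0.2-degree grid, 6 surface variables, 6 atmospheric
# variables at 13 pressure levels). Purely illustrative.
lat_points = round(180 / 0.2)   # 900
lon_points = round(360 / 0.2)   # 1800
surface_vars = 6
atmos_vars = 6 * 13             # 6 variables at 13 levels

values_per_step = lat_points * lon_points * (surface_vars + atmos_vars)
print(f"{values_per_step:,} values per 12-hour step")  # about 136 million
```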

Measures of success

DeepMind reports that GenCast dramatically outperforms the best traditional forecasting model. Using a standard benchmark in the field, DeepMind found that GenCast was more accurate than the European model on 97 percent of the tests it used, which checked different output values at different times in the future. In addition, the confidence values, based on the uncertainty obtained from the ensemble, were generally reasonable.

Past AI weather forecasters, having been trained on real-world data, are generally not great at handling extreme weather since it shows up so rarely in the training set. But GenCast did quite well, often outperforming the European model in things like abnormally high and low temperatures and air pressure (one percent frequency or less, including at the 0.01 percentile).

DeepMind also went beyond standard tests to determine whether GenCast might be useful. This research included projecting the tracks of tropical cyclones, an important job for forecasting models. For the first four days, GenCast was significantly more accurate than the European model, and it maintained its lead out to about a week.

One of DeepMind’s most interesting tests was checking the global forecast of wind power output based on information from the Global Powerplant Database. This involved using it to forecast wind speeds at 10 meters above the surface (which is actually lower than where most turbines reside but is the best approximation possible) and then using that number to figure out how much power would be generated. The system beat the traditional weather model by 20 percent for the first two days and stayed in front with a declining lead out to a week.
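Converting a forecast 10-meter wind speed into expected generation typically means running it through a turbine power curve. The sketch below uses a generic, idealized curve with made-up parameters; it is not the method or data the DeepMind team used.

```python
# Generic idealized turbine power curve (an assumption for illustration;
# not the curve or database used in the DeepMind study).
def turbine_power(wind_speed_ms, rated_power_mw=3.0,
                  cut_in=3.0, rated_speed=12.0, cut_out=25.0):
    """Approximate power output (MW) for a single turbine."""
    if wind_speed_ms < cut_in or wind_speed_ms >= cut_out:
        return 0.0
    if wind_speed_ms >= rated_speed:
        return rated_power_mw
    # Below rated speed, available power scales roughly with the cube of speed.
    return rated_power_mw * ((wind_speed_ms - cut_in) / (rated_speed - cut_in)) ** 3

# A forecast of 10 m wind speeds can then be mapped to expected generation.
for speed in (2.0, 6.0, 10.0, 14.0, 26.0):
    print(speed, "m/s ->", round(turbine_power(speed), 2), "MW")
```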

The researchers don’t spend much time examining why performance seems to decline gradually for about a week. Ideally, more details about GenCast’s limitations would help inform further improvements, so the researchers are likely thinking about it. In any case, today’s paper marks the second case where taking something akin to a hybrid approach—mixing aspects of traditional forecast systems with AI—has been reported to improve forecasts. And both those cases took very different approaches, raising the prospect that it will be possible to combine some of their features.

Nature, 2024. DOI: 10.1038/s41586-024-08252-9


A how-to for ethical geoengineering research

Holistic climate justice: The guidelines recognize that geoengineering won’t affect just those people currently residing on Earth but future generations as well. Some methods, like stratospheric aerosols, don’t eliminate the risks caused by warming, but shift them onto future generations, who will face sudden and potentially dramatic warming if the geoengineering is ever stopped. Others may cause regional differences in either benefits or warming, shifting consequences to different populations.

Special attention should be paid to those who have historically been on the wrong side of environmental problems. And harms to nature need to be considered as well.

Inclusive public participation: The research shouldn’t be approached as simply a scientific process; instead, any affected communities should be included in the process, and informed consent should be obtained from them. There should be ongoing public engagement with those communities, and the work should adapt to their cultural values.

Transparency: The public needs to know who’s funding any geoengineering research, and whoever’s providing the money shouldn’t influence decisions regarding the design of the research. Those decisions, and the considerations behind them, should also be made clear to the public.

Informed governance: Any experiments have to conform to laws ranging from local to international. Any research programs should be approved by an independent body before any work starts. All the parties involved—and this could include the funders, the institutions, and outside contractors—should be held accountable to governments, public institutions, and those who will potentially be impacted by the work.

If you think this will make pursuing this research considerably more complicated, you are absolutely correct. But again, even tests of these approaches could have serious environmental consequences. And many of these things represent best practices for any research with potential public consequences; the fact that they haven’t always been pursued is not an excuse to continue to avoid doing them.


Climate change boosted Milton’s landfall strength from Category 2 to 3

Using this simulated data set, called IRIS, the researchers selected for those storms that made landfall along a track similar to that of Milton. Using these, they show that the warming climate has boosted the frequency of storms of Milton’s intensity by 40 percent. Correspondingly, the maximum wind speeds of similar storms have been boosted by about 10 percent. In Milton’s case, that means that, in the absence of climate change, it was likely to have made landfall as a Category 2 storm, rather than the Category 3 it actually was.

Rainfall

The lack of full meteorological data caused a problem when it came to analyzing Milton’s rainfall. The researchers ended up having to analyze rainfall more generally. They took four data sets that do track rainfall across these regions and tracked the link between extreme rainfall and the warming climate to estimate how much more often extreme events occur in a world that is now 1.3° C warmer than it was in pre-industrial times.

They focus on instances of extreme one-day rainfall within the June to November period, looking specifically at 1-in-10-year and 1-in-100-year events. Both of these produced similar results, suggesting that heavy one-day rainfalls are about twice as likely in today’s climates, and the most extreme of these are between 20 and 30 percent more intense.
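Return periods like "1-in-10-year" come from frequency analysis of annual maxima. The sketch below shows the simplest empirical version of that calculation on made-up data; the actual attribution work fits extreme-value distributions and conditions them on global mean temperature.

```python
import numpy as np

# Minimal empirical return-period estimate from annual maximum one-day
# rainfall totals (illustrative; attribution studies fit extreme-value
# distributions and condition them on global mean temperature).
def return_periods(annual_maxima):
    """Return (sorted values, estimated return periods in years) using the
    Weibull plotting position: T = (n + 1) / rank."""
    values = np.sort(np.asarray(annual_maxima))[::-1]  # largest first
    n = len(values)
    ranks = np.arange(1, n + 1)
    return values, (n + 1) / ranks

# Hypothetical 100-year record of annual maxima (mm/day)
rng = np.random.default_rng(1)
maxima = rng.gumbel(loc=80, scale=25, size=100)
vals, periods = return_periods(maxima)
print("~1-in-10-year event:",
      round(np.interp(10, periods[::-1], vals[::-1]), 1), "mm")
```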

These results came from three of the four data sets used, which produced largely similar results. The fourth dataset they used suggested a far stronger effect of climate change, but since it wasn’t consistent with the rest, these results weren’t used.

As with the Helene analysis, it’s worth noting that this work represents a specific snapshot in time along a long-term warming trajectory. In other words, it’s looking at the impact of 1.3° C of warming at a time when our emissions are nearly at the point where they commit us to at least 1.5° C of warming. And that will tilt the scales further in favor of extreme weather events like this.


Rapid analysis finds climate change’s fingerprint on Hurricane Helene

The researchers identified two distinct events associated with Helene’s landfall. The first was its actual landfall along the Florida coast. The second was the intense rainfall on the North Carolina/Tennessee border. This rainfall came against a backdrop of previous heavy rain caused by a stalled cold front meeting moisture brought north by the fringes of the hurricane. These two regions were examined separately.

A changed climate

In these two regions, the influence of climate change is estimated to have caused a 10 percent increase in the intensity of the rainfall. That may not seem like much, but it adds up. Over both a two- and three-day window centered on the point of maximal rainfall, climate change is estimated to have increased rainfall along the Florida Coast by 40 percent. For the southern Appalachians, the boost in rainfall is estimated to have been 70 percent.

Storms with the wind intensity of Helene hitting land near where it did are about a once-in-130-year event in the IRIS dataset. Climate change has altered that so it’s now expected to return about once every 50 years. The high sea surface temperatures that helped fuel Helene are estimated to have been made as much as 500 times more likely by our changed climate.

Overall, the researchers estimate that rain events like Helene’s landfall should now be expected about once every seven years, although the uncertainty is large (running from three to 25 years). For the Appalachian region, where rainfall events this severe don’t appear in our records, they are likely to now be a once-in-every-70-years event thanks to climate warming (with an uncertainty of between 20 and 3,000 years).

“Together, these findings show that climate change is enhancing conditions conducive to the most powerful hurricanes like Helene, with more intense rainfall totals and wind speeds,” the researchers behind the work conclude.


How did volcanism trigger climate change before the eruptions started?

Loads of lava: Kasbohm with a few solidified lava flows of the Columbia River Basalts. Credit: Joshua Murray

As our climate warms beyond its historical range, scientists increasingly need to study climates deeper in the planet’s past to get information about our future. One object of study is a warming event known as the Miocene Climate Optimum (MCO) from about 17 to 15 million years ago. It coincided with floods of basalt lava that covered a large area of the Northwestern US, creating what are called the “Columbia River Basalts.” This timing suggests that volcanic CO2 was the cause of the warming.

Those eruptions were the most recent example of a “Large Igneous Province,” a phenomenon that has repeatedly triggered climate upheavals and mass extinctions throughout Earth’s past. The Miocene version was relatively benign; it saw CO2 levels and global temperatures rise, causing ecosystem changes and significant melting of Antarctic ice, but didn’t trigger a mass extinction.

A paper just published in Geology, led by Jennifer Kasbohm of the Carnegie Science’s Earth and Planets Laboratory, upends the idea that the eruptions triggered the warming while still blaming them for the peak climate warmth.

The study is the result of the world’s first successful application of high-precision radiometric dating on climate records obtained by drilling into ocean sediments, opening the door to improved measurements of past climate changes. As a bonus, it confirms the validity of mathematical models of the Solar System’s orbital motions over deep time.

A past climate with today’s CO2 levels

“Today, with 420 parts per million [of CO2], we are basically entering the Miocene Climate Optimum,” said Thomas Westerhold of the University of Bremen, who peer-reviewed Kasbohm’s study. While our CO2 levels match, global temperatures have not yet reached the MCO temperatures of up to 8° C above the preindustrial era. “We are moving the Earth System from what we call the Ice House world… in the complete opposite direction,” said Westerhold.

When Kasbohm began looking into the link between the basalts and the MCO’s warming in 2015, she found that the correlation had huge uncertainties. So she applied high-precision radiometric dating, using the radioactive decay of uranium trapped within zircon crystals to determine the age of the basalts. She found that her new ages no longer spanned the MCO warming. “All of these eruptions [are] crammed into just a small part of the Miocene Climate Optimum,” said Kasbohm.

But there were also huge uncertainties in the dates for the MCO, so it was possible that the mismatch was an artifact of those uncertainties. Kasbohm set out to apply the same high-precision dating to the marine sediments that record the MCO.

A new approach to an old problem

“What’s really exciting… is that this is the first time anyone’s applied this technique to sediments in these ocean drill cores,” said Kasbohm.

Normally, dates for ocean sediments drilled from the seabed are determined using a combination of fossil changes, magnetic field reversals, and aligning patterns of sediment layers with orbital wobbles calculated by astronomers. Each of those methods has uncertainties that are compounded by gaps in the sediment caused by the drilling process and by natural pauses in the deposition of material. Those make it tricky to match different records with the precision needed to determine cause and effect.

The uncertainties made the timing of the MCO unclear.

Tiny clocks: Zircon crystals from volcanic ash that fell into the Caribbean Sea during the Miocene. Credit: Jennifer Kasbohm

Radiometric dating would circumvent those uncertainties. But until about 15 years ago, its dates had such large errors that they were useless for addressing questions like the timing of the MCO. The technique also typically needs kilograms of material to find enough uranium-containing zircon crystals, whereas ocean drill cores yield just grams.

But scientists have significantly reduced those limitations: “Across the board, people have been working to track and quantify and minimize every aspect of uncertainty that goes into the measurements we make. And that’s what allows me to report these ages with such great precision,” Kasbohm said.
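Once the isotope ratios are measured, the age calculation itself is simple; nearly all the effort goes into the precision of those measurements. Below is a generic sketch of the radiometric age equation with a hypothetical ratio, not the multi-chain U-Pb workflow the lab actually uses.

```python
import math

# Generic radiometric age equation: t = (1/lambda) * ln(1 + D/P),
# where D is the accumulated daughter isotope and P the remaining parent.
# Illustrative only; real U-Pb dating uses two decay chains plus corrections.

def radiometric_age(daughter_to_parent_ratio, half_life_years):
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_to_parent_ratio) / decay_constant

# Example with the 238U -> 206Pb system (half-life ~4.468 billion years)
# and a hypothetical measured ratio for a roughly 16-million-year-old zircon.
half_life_u238 = 4.468e9
ratio = 0.00248          # hypothetical D/P ratio
print(radiometric_age(ratio, half_life_u238) / 1e6, "million years")
```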


String of record hot months came to an end in July

Hot, but not that hot —

July had the two hottest days recorded but fell 0.04° Celsius short of last year.

Absolute temperatures show how similar July 2023 and 2024 were.

The past several years have been absolute scorchers, with 2023 being the warmest year ever recorded. And things did not slow down in 2024. As a result, we entered a stretch where every month set a new record as the warmest iteration of that month that we’ve ever recorded. Last month, that pattern stretched out for a full 12 months, as June of 2024 once again became the warmest June ever recorded. But, despite some exceptional temperatures in July, it fell just short of last July’s monthly temperature record, bringing the streak to a close.

Europe’s Copernicus system was first to announce that July of 2024 was ever so slightly cooler than July of 2023, missing out on setting a new record by just 0.04° C. So far, none of the other major climate trackers, such as Berkeley Earth or NASA GISS, have come out with data for July. These each have slightly different approaches to tracking temperatures, and, with a margin that small, it’s possible we’ll see one of them register last month as warmer or statistically indistinguishable.

How exceptional are the temperatures of the last few years? The EU averaged every July from 1991 to 2020—a period well after climate change had warmed the planet significantly—and July of 2024 was still 0.68° C above that average.

While July didn’t set a monthly record, both the EU’s Copernicus climate service and NASA’s GISS found that it contained the warmest day ever recorded. In the EU’s case, they were the two hottest days recorded, as the temperatures on the 21st and 22nd were statistically indistinguishable, with only 0.01° C separating them. Late July and early August tend to be the warmest times of the year for surface air temperatures, so we’re likely past the point where any daily records will be set in 2024.

That’s all in terms of absolute temperatures. If you compare each day of the year only to instances of that day in the past, there have been far more anomalous days in the temperature record.

In terms of anomalies over years past, both 2023 (orange) and 2024 (red) have been exceptionally warm.

That image also shows how exceptional the past year’s temperatures have been and makes it clear that 2024 is only falling out of record territory because the second half of 2023 was so exceptionally warm. It’s unlikely that 2024 will be quite as extreme, as the El Niño event that helped drive warming appears to have faded after peaking in December of 2023. NOAA’s latest ENSO forecast, issued August 8, expects the Pacific to remain neutral for another month or two before starting to shift into cooler La Niña conditions before the year is out.

In terms of anomalies, July also represents the first time in a year that a month had been less than 1.5° C above preindustrial temperatures (with preindustrial defined as the average over 1850–1900). Capping our modern temperatures at 1.5° C above preindustrial levels is recognized as a target that, while difficult to achieve, would help avoid some of the worst impacts we’ll see at 2° C of warming, and a number of countries have committed to that goal.



Model mixes AI and physics to do global forecasts

Cloudy with a chance of accuracy —

Google/academic project is great with weather, has some limits for climate.

Some of the atmospheric circulation seen during NeuralGCM runs. Credit: Google

Right now, the world’s best weather forecast model is a General Circulation Model, or GCM, put together by the European Centre for Medium-Range Weather Forecasts. A GCM is in part based on code that calculates the physics of various atmospheric processes that we understand well. For a lot of the rest, GCMs rely on what’s termed “parameterization,” which attempts to use empirically determined relationships to approximate what’s going on with processes where we don’t fully understand the physics.

Lately, GCMs have faced some competition from machine-learning techniques, which train AI systems to recognize patterns in meteorological data and use those to predict the conditions that will result over the next few days. Their forecasts, however, tend to get a bit vague after more than a few days and can’t deal with the sort of long-term factors that need to be considered when GCMs are used to study climate change.

On Monday, a team from Google’s AI group and the European Centre for Medium-Range Weather Forecasts is announcing NeuralGCM, a system that mixes physics-based atmospheric circulation with AI parameterization of other meteorological influences. NeuralGCM is computationally efficient and performs very well in weather forecast benchmarks. Strikingly, it can also produce reasonable-looking output for runs that cover decades, potentially allowing it to address some climate-relevant questions. While it can’t handle a lot of what we use climate models for, there are some obvious routes for potential improvements.

Meet NeuralGCM

NeuralGCM is a two-part system. There’s what the researchers term a “dynamical core,” which handles the physics of large-scale atmospheric convection and takes into account basic physics like gravity and thermodynamics. Everything else is handled by the AI portion. “It’s everything that’s not in the equations of fluid dynamics,” said Google’s Stephan Hoyer. “So that means clouds, rainfall, solar radiation, drag across the surface of the Earth—also all the residual terms in the equations that happen below the grid scale of about roughly 100 kilometers or so.” It’s what you might call a monolithic AI. Rather than training individual modules that handle a single process, such as cloud formation, the AI portion is trained to deal with everything at once.

Critically, the whole system is trained concurrently rather than training the AI separately from the physics core. Initially, performance evaluations and updates to the neural network were performed at six-hour intervals since the system isn’t very stable until at least partially trained. Over time, those are stretched out to five days.
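Conceptually, each model step adds together the tendencies from the physics-based core and a learned correction, and the two are trained as one pipeline. The sketch below is only a schematic of that division of labor, with placeholder functions invented for illustration; it is not NeuralGCM's actual code or API.

```python
import numpy as np

# Schematic of a hybrid step: a physics-based dynamical core handles
# large-scale fluid dynamics, and a learned component supplies everything
# below the grid scale (clouds, rainfall, radiation, surface drag, ...).
# All functions here are placeholders invented for illustration.

def dynamical_core_tendency(state):
    """Stand-in for the resolved fluid-dynamics tendencies."""
    return -0.01 * state  # placeholder relaxation

def learned_tendency(state, network_params):
    """Stand-in for the neural network's sub-grid parameterization."""
    return network_params["gain"] * np.tanh(state)

def hybrid_step(state, network_params, dt):
    tendency = dynamical_core_tendency(state) + learned_tendency(state, network_params)
    return state + dt * tendency

# During training, the whole pipeline (core + network) is rolled out for
# progressively longer windows and the network weights are updated to match
# observed weather, so the two components learn to work together.
state = np.random.default_rng(2).standard_normal((64, 128))
params = {"gain": 0.02}
for _ in range(10):
    state = hybrid_step(state, params, dt=0.5)
```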

The result is a system that’s competitive with the best available for forecasts running out to 10 days, often exceeding the competition depending on the precise measure used (in addition to weather forecasting benchmarks, the researchers looked at features like tropical cyclones, atmospheric rivers, and the Intertropical Convergence Zone). On the longer forecasts, it tended to produce features that were less blurry than those made by pure AI forecasters, even though it was operating at a lower resolution than they were. This lower resolution means larger grid squares—the surface of the Earth is divided up into individual squares for computational purposes—than most other models, which cuts down significantly on its computing requirements.

Despite its success with weather, there were a couple of major caveats. One is that NeuralGCM tended to underestimate extreme events occurring in the tropics. The second is that it doesn’t actually model precipitation; instead, it calculates the balance between evaporation and precipitation.

But it also comes with some specific advantages over some other short-term forecast models, key among them being that it isn’t actually limited to running over the short term. The researchers let it run for up to two years, and it successfully reproduced a reasonable-looking seasonal cycle, including large-scale features of the atmospheric circulation. Other long-duration runs show that it can produce appropriate counts of tropical cyclones, which go on to follow trajectories that reflect patterns seen in the real world.


The Earth heated up when its day was 22 hours long

Because most things about Earth change so slowly, it’s difficult to imagine them being any different in the past. But Earth’s rotation has been slowing due to tidal interactions with the Moon, meaning that days were considerably shorter in the past. It’s easy to think that a 22-hour day wouldn’t be all that different, but that turns out not to be entirely true.

For example, some modeling has indicated that certain day lengths will be in resonance with other effects caused by the planet’s rotation, which can potentially offset the drag caused by the tides. Now, a new paper looks at how these resonances could affect the climate. The results suggest that a resonance would shift rainfall to the morning and evening while leaving midday skies largely cloud-free. The resulting Earth would be considerably warmer.

On the Lamb

We’re all pretty familiar with the fact that the daytime Sun warms up the air. And those of us who remember high school chemistry will recall that a gas that is warmed will expand. So, it shouldn’t be a surprise to hear that the Earth’s atmosphere expands due to warming on its day side and contracts back again as it cools (these lag the daytime peak in sunlight). These differences provide something a bit like a handle that the gravitational pulls of the Sun and Moon can grab onto, exerting additional forces on the atmosphere. This complicated network of forces churns our atmosphere, helping shape the planet’s weather.

Two researchers, Russell Deitrick and Colin Goldblatt at Canada’s University of Victoria, were curious as to what would happen to these forces as the day length got shorter. Specifically, they were interested in a period where the day’s length would be at resonance with phenomena called Lamb waves.

Lamb waves aren’t specific to the atmosphere. Rather, they’re a specific manner in which a disturbance can travel through a medium, from vibrations in a solid to sound through the air.

Although various forces can create Lamb waves in the atmosphere, they’ll travel with a set of characteristic frequencies. One of those is roughly 10.5 to 11 hours. As you go back in time to shorter days, you’ll reach a point where the Earth’s day was a bit shorter than 22 hours, or twice the period of the Lamb waves. At this point, any disturbances in the atmosphere related to day length would have the ability to interact with the Lamb waves that were set off the day prior. This resonance could potentially strengthen the impact of any atmospheric phenomena related to day length.
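The resonance condition is just period matching: a day a bit under 22 hours is twice the characteristic 10.5-to-11-hour Lamb-wave period, so forcing tied to the daily cycle lines up with the wave launched the day before. A quick illustrative check:

```python
# Quick period-matching check for the 2:1 resonance described above
# (illustrative arithmetic only).
lamb_wave_period_hours = (10.5, 11.0)   # characteristic range from the article

for period in lamb_wave_period_hours:
    resonant_day_length = 2 * period
    print(f"Lamb period {period} h -> resonant day length {resonant_day_length} h")
# Resonant day lengths of 21.0 and 22.0 hours bracket the
# "a bit shorter than 22 hours" figure in the text.
```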

Figuring out whether they do turned out to be a bit of a challenge. There are plenty of climate models to let researchers explore what’s going on in the modern atmosphere. But a lot of these have key features, like day length and solar output, hard coded into them. Others don’t let you do things like rearrange the Earth’s continents or change some atmospheric components.

The researchers did find a model that would allow them to change day length, solar intensity, and carbon dioxide concentrations to those present when Earth’s day length was 22 hours (which was likely in the Precambrian). But it wasn’t able to reset the ozone concentrations, and ozone is also a greenhouse gas. So, they ran one set of simulations without ozone, which is expected to underestimate the warming, and another where they elevated methane concentrations in order to mimic ozone’s greenhouse effect.


Is a colonial-era drop in CO₂ tied to regrowing forests?

More trees, less carbon —

Carbon dioxide dropped after colonial contact wiped out Native Americans.

A slice through an ice core showing bubbles of trapped air. Credit: British Antarctic Survey

Did the massive scale of death in the Americas following colonial contact in the 1500s affect atmospheric CO2 levels? That’s a question scientists have debated over the last 30 years, ever since they noticed a sharp drop in CO2 around the year 1610 in air preserved in Antarctic ice.

That drop in atmospheric CO2 levels is the only significant decline in recent millennia, and scientists suggested that it was caused by reforestation in the Americas, which resulted from their depopulation via pandemics unleashed by early European contact. It is so distinct that it was proposed as a candidate for the marker of the beginning of a new geological epoch—the “Anthropocene.”

But the record from that ice core, taken at Law Dome in East Antarctica, shows that CO2 starts declining a bit late to match European contact, and it plummets over just 90 years, which is too drastic for feasible rates of vegetation regrowth. A different ice core, drilled in the West Antarctic, showed a more gradual decline starting earlier, but lacked the fine detail of the Law Dome ice.

Which one was right? Beyond the historical interest, it matters because it is a real-world, continent-scale test of reforestation’s effectiveness at removing CO2 from the atmosphere.

In a recent study, Amy King of the British Antarctic Survey and colleagues set out to test if the Law Dome data is a true reflection of atmospheric CO2 decline, using a new ice core drilled on the “Skytrain Ice Rise” in West Antarctica.

Precious tiny bubbles

In 2018, scientists and engineers from the British Antarctic Survey and the University of Cambridge drilled the ice core, a cylinder of ice 651 meters long by 10 centimeters in diameter (2,136 feet by 4 inches), from the surface down to the bedrock. The ice contains bubbles of air that got trapped as snow fell, forming tiny capsules of past atmospheres.

The project’s main aim was to investigate ice from the time about 125,000 years ago when the climate was about as warm as it is today. But King and colleagues realized that the younger portion of ice could shed light on the 1610 CO2 decline.

“Given the resolution of what we could obtain with Skytrain Ice Rise, we predicted that, if the drop was real in the atmosphere as in Law Dome, we should see the drop in Skytrain, too,” said Thomas Bauska of the British Antarctic Survey, a co-author of the new study.

The ice core was cut into 80-centimeter (31-inch) lengths, put into insulated boxes, and shipped to the UK, all the while held at -20°C (-4°F) to prevent it from melting and releasing its precious cargo of air from millennia ago. “That’s one thing that keeps us up at night, especially as gas people,” said Bauska.

In the UK they took a series of samples across 31 depth intervals spanning the period from 1454 to 1688 CE: “We went in and sliced and diced our ice core as much as we could,” said Bauska. They sent the samples, still refrigerated, off to Oregon State University where the CO2 levels were measured.

The results didn’t show a sharp drop of CO2—instead, they showed a gentler CO2 decline of about 8 ppm over 157 years between 1516 and 1670 CE, matching the other West Antarctic ice core.

“We didn’t see the drop,” said Bauska, “so we had to say, OK, is our understanding of how smooth the records are accurate?”

A tent on the Antarctic ice where the core is cut into segments for shipping. Credit: British Antarctic Survey

To test if the Skytrain ice record is too blurry to show a sharp 1610 drop, they analyzed the levels of methane in the ice. Because methane is much less soluble in water than CO2, they were able to melt continuously along the ice core to liberate the methane and get a more detailed graph of its concentration than was possible for CO2. If the atmospheric signal was blurred in Skytrain, it should have smoothed the methane record. But it didn’t.

“We didn’t see that really smoothed out methane record,” said Bauska, “which then told us the CO2 record couldn’t have been that smoothed.”

In other words, the gentler Skytrain CO2 signal is real, not an artifact.

Does this mean the sharp drop at 1610 in the Law Dome data is an artifact? It looks that way, but Bauska was cautious, saying, “the jury will still be out until we actually get either re-measurements of the Law Dome, or another ice core drilled with a similarly high accumulation.”


East Coast has a giant offshore freshwater aquifer—how did it get there?

An oceangoing scientific drilling vessel may be needed to figure out how huge undersea aquifers formed.

One-quarter of the world’s population is currently water-stressed, using up almost their entire fresh water supply each year. The UN predicts that by 2030, this will climb to two-thirds of the population.

Freshwater is perhaps the world’s most essential resource, but climate change is enhancing its scarcity. An unexpected source may have the potential to provide some relief: offshore aquifers, giant undersea bodies of rock or sediment that hold and transport freshwater. But researchers don’t know how the water gets there, a question that needs to be resolved if we want to understand how to manage the water stored in them.

For decades, scientists have known about an aquifer off the US East Coast. It stretches from Martha’s Vineyard to New Jersey and holds almost as much water as two Lake Ontarios. Research presented at the American Geophysical Union conference in December attempted to explain where the water came from—a key step in finding out where other undersea aquifers lie hidden around the world.

As we discover and study more of them, offshore aquifers might become an unlikely resource for drinking water. Learning the water’s source can tell us if these freshwater reserves rebuild slowly over time or are a one-time-only emergency supply.

Reconstructing history

When ice sheets sat along the East Coast and the sea level was significantly lower than it is today, the coastline was around 100 kilometers further out to sea. Over time, freshwater filled small pockets in the open, sandy ground. Then, 10,000 years ago, the planet warmed, and sea levels rose, trapping the freshwater in the giant Continental Shelf Aquifer. But how that water came to be on the continental shelf in the first place is a mystery.

New Mexico Institute of Mining and Technology paleo-hydrogeologist Mark Person has been studying the aquifer since 1991. In the past three decades, he said, scientists’ understanding of the aquifer’s size, volume, and age has massively expanded. But they haven’t yet nailed down the water’s source, which could reveal where other submerged aquifers are hiding—if we learn the conditions that filled this one, we could look for other locations that had similar conditions.

“We can’t reenact Earth history,” Person said. Without the ability to conduct controlled experiments, scientists often resort to modeling to determine how geological structures formed millions of years ago. “It’s sort of like forensic workers looking at a crime scene,” he said.

Person developed three two-dimensional models of the offshore aquifer using seismic data and sediment and water samples from boreholes drilled onshore. Two models involved ice sheets melting; one did not.

Then, to corroborate the models, Person turned to isotopes—atoms with the same number of protons but different numbers of neutrons. Water mostly contains Oxygen-16, a lighter form of oxygen with two fewer neutrons than Oxygen-18.

Throughout the last million years, a cycle of planetary warming and cooling occurred every 100,000 years. During warming, the lighter 16O in the oceans evaporated into the atmosphere at a higher rate than the heavier 18O. During cooling, that lighter oxygen came down as snow, forming ice sheets with lower levels of 18O and leaving behind oceans with higher levels of 18O.

To determine if ice sheets played a role in forming the Continental Shelf Aquifer, Person explained, you have to look for water that is depleted in 18O—a sure sign that it came from ice sheets melting at their base. Person’s team used existing global isotope records from the shells of deep-ocean-dwelling animals near the aquifer. (The shells contain carbonate, an ion that includes oxygen pulled from the water).
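Isotope ratios like these are conventionally reported in delta notation: the sample's 18O/16O ratio expressed as a parts-per-thousand deviation from a reference standard. A minimal sketch of that arithmetic follows; the standard value is approximate and the sample ratio is hypothetical.

```python
# Delta notation for oxygen isotopes: the per-mil deviation of a sample's
# 18O/16O ratio from a reference standard. Illustrative values only.

VSMOW_RATIO = 0.0020052  # approximate 18O/16O of the ocean-water standard

def delta_18o(sample_ratio, standard_ratio=VSMOW_RATIO):
    """delta-18O in per mil (parts per thousand)."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A hypothetical meltwater-influenced sample, depleted in the heavy isotope,
# comes out with a negative delta value.
print(round(delta_18o(0.0019852), 2), "per mil")   # about -10 per mil
```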

Person then incorporated methods developed by a Columbia graduate student in 2019 that involve using electromagnetic imaging to finely map undersea aquifers. Since saltwater is more electrically conductive than freshwater, the boundaries between the two kinds of water are clear when electromagnetic pulses are sent through the seafloor: saltwater conducts the signal well, and freshwater doesn’t. The result looks sort of like a heat map, showing regions where fresh and saltwater are concentrated.

Person compared the electromagnetic and isotope data with his models to see which historical scenarios (ice or no ice) were statistically likely to form an aquifer that matched all the data. His results, which are in the review stage with the Geological Society of America Bulletin, suggest it’s very likely that ice sheets played a role in forming the aquifer.

“There’s a lot of uncertainty,” Person said, but “it’s the best thing we have going.”
