Nobel Prize in Physics honors research on climate, glass, and other complex systems

Syukuro Manabe, Klaus Hasselmann, and Giorgio Parisi are to share the 2021 Nobel Prize in Physics for their work on complex systems, the Royal Swedish Academy of Sciences announced on Tuesday.

Manabe (Princeton University) developed early climate models to investigate the interplay of incoming solar radiation, outgoing infrared radiation from Earth, convection in the atmosphere, and the latent heat of water vapor. His 1967 framework offered an initial estimate of the rise in global surface temperature that would result from increased atmospheric carbon dioxide.

About a decade later, Hasselmann (Max Planck Institute for Meteorology in Hamburg, Germany) created a stochastic climate model that incorporated the weather’s fluctuations as noise. He also demonstrated how the fingerprints of discrete influences on the climate, including the impact of humans, can be identified and extracted.

Parisi (Sapienza University of Rome) tackled the spin-glass problem: How do magnetic spins orient themselves when subjected to competing energetic and geometric constraints? His solution, which found patterns in the possible configurations, has influenced mathematics, biology, neuroscience, and machine learning.

Manabe and Hasselmann will share half the 10 million Swedish kronor (roughly $1.1 million) prize; it is the first time the Nobel physics committee has recognized achievements in atmospheric or climate science. Parisi will receive the other half.

Climate modeling and attribution

Manabe received his PhD from the University of Tokyo in 1958. Soon after, he arrived at the US Weather Bureau in Washington, DC, as one of the first hires of Joseph Smagorinsky, who had been tasked with running numerical simulations on computers to develop re-creations, or general circulation models, of Earth’s atmosphere. Coming up with the physics equations was the easy part for Smagorinsky, Manabe, and colleagues; the challenge was replicating, with sufficient accuracy but only limited computing power, the complexity of the atmosphere and its interactions with land, ice, and sea.

Manabe dedicated a lot of time to studying specific hydrological processes, such as how different types of soil absorb water, according to historian Spencer Weart’s account of climate science. But what set Manabe apart was his ability to simplify, says V. Balaji, a computational climate scientist at Princeton University and the Pierre-Simon Laplace Institute in France who has worked with Manabe. “He took a complicated system and reduced it to a few elements that were mathematically tractable,” Balaji says. For example, in 1964 Manabe and Robert Strickler devised a convective adjustment to account for the heat that water vapor carries up from the surface and releases when it condenses into clouds.

Three years later Manabe, by now working with NOAA’s Geophysical Fluid Dynamics Laboratory at Princeton, and Richard Wetherald published a numerical model of a 24-km-tall column of Earth’s atmosphere. Although far from being a global three-dimensional model, it described the atmosphere relatively well and represented a key step in incorporating basic physics and the effects of greenhouse gases, particularly water vapor and CO2. When the researchers doubled the concentration of CO2 in their simulated atmosphere, the temperature near the surface rose by about 2 °C. “This was the first time a greenhouse warming computation included enough of the essential factors, in particular the effects of water vapor, to seem plausible to the experts,” Weart writes.

In 1969 Manabe and colleague Kirk Bryan published the first coupled atmospheric and ocean model. And in 1975 Manabe and Wetherald built on their work from eight years earlier to create a primitive 3D general circulation model. This time they calculated an equilibrium climate sensitivity—the globally averaged surface temperature change resulting from a doubling of atmospheric CO2—of nearly 3 °C.
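
The essence of a climate-sensitivity estimate can be illustrated with a zero-dimensional energy-balance calculation far simpler than Manabe’s radiative-convective and general circulation models. The Python sketch below uses the standard logarithmic approximation for CO2 forcing and an assumed feedback parameter; both values are illustrative and are not taken from Manabe’s work.

```python
# Toy zero-dimensional energy-balance estimate of equilibrium climate sensitivity.
# Illustrative only; this is not Manabe's radiative-convective or general circulation model.
import math

def co2_forcing(concentration_ratio):
    """Radiative forcing in W/m^2 from changing CO2 by the given concentration ratio,
    using the common logarithmic approximation dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(concentration_ratio)

def equilibrium_warming(concentration_ratio, feedback_param=1.3):
    """Equilibrium surface warming in kelvin, assuming dT = dF / lambda with an
    assumed climate feedback parameter lambda in W/m^2 per kelvin."""
    return co2_forcing(concentration_ratio) / feedback_param

# Doubling CO2 gives a forcing near 3.7 W/m^2; with the assumed feedback parameter
# the equilibrium warming comes out close to 3 K, comparable to the sensitivities
# discussed in the text.
print(f"2xCO2 forcing: {co2_forcing(2):.2f} W/m^2")
print(f"2xCO2 equilibrium warming: {equilibrium_warming(2):.1f} K")
```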

By the 1970s Manabe and others had successfully created models despite the inability to simulate many of the complex, chaotic processes, such as the year-to-year variability in the El Niño–Southern Oscillation, that contribute to Earth’s natural climate fluctuations. Yet considering the substantial cumulative impact of all those processes, climate scientists needed ways to quantify the variability in their models.

While on a flight to a conference in the mid 1970s, Hasselmann devised a solution. Inspired by Brownian motion, in which the observed motion of macroscopic particles is the result of continuous random movement at the microscopic level, Hasselmann developed a stochastic climate model that treated changes to the climate as the integrated effects of continuous random weather disturbances.
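
The core of that idea can be caricatured in a few lines of code: a slow climate variable that damps weakly while being kicked by fast, random weather forcing, discretized as a first-order autoregressive (red-noise) process. The sketch below is a minimal illustration of the concept, with made-up parameter values, not Hasselmann’s actual formulation.

```python
# Minimal sketch of a Hasselmann-style stochastic climate model: a slow variable
# (say, an ocean temperature anomaly) integrates fast, random "weather" forcing.
# Discretized as dT/dt = -damping * T + noise. All parameter values are illustrative.
import random

def simulate(n_steps=10_000, dt=1.0, damping=0.01, noise_amp=1.0, seed=42):
    rng = random.Random(seed)
    temps = [0.0]
    for _ in range(n_steps):
        weather = rng.gauss(0.0, noise_amp)  # fast white-noise forcing with no memory
        t_next = temps[-1] + dt * (-damping * temps[-1] + weather)
        temps.append(t_next)
    return temps

series = simulate()
# The integrated response wanders slowly (red noise) even though the forcing itself
# has no memory: natural-looking climate variability from pure weather noise.
print("final anomaly:", round(series[-1], 2))
print("max |anomaly|:", round(max(abs(t) for t in series), 2))
```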

Hasselmann’s insights into deciphering the causes of observable effects at long time scales also led him to consider how to separate an anthropogenic climate signal from natural climate fluctuations. The models by Manabe and others had allowed researchers to explore the potential for global warming due to increased CO2, but it was unclear how much of that warming was the result of external radiative forcing (the signal) versus natural fluctuations (the noise). In a 1979 paper, Hasselmann provided a statistical blueprint for climate scientists to examine their model results and distinguish signals from noise. He also identified telltale data signatures that cannot be explained by natural fluctuations.
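
In cartoon form, the detection step asks whether a presumed forced-response pattern appears in the observations with an amplitude larger than internal variability alone could plausibly produce. The deliberately simplified sketch below illustrates that logic with synthetic data; it is not Hasselmann’s optimal-fingerprint formalism, and every number in it is invented.

```python
# Simplified sketch of detecting a signal in noise: regress "observations" onto a
# known forced-response pattern and compare the fitted amplitude with what internal
# variability alone could produce. Conceptual illustration only.
import random

def fit_amplitude(obs, pattern):
    """Least-squares amplitude of `pattern` in `obs` (simple projection)."""
    num = sum(o * p for o, p in zip(obs, pattern))
    den = sum(p * p for p in pattern)
    return num / den

rng = random.Random(0)
years = list(range(60))
pattern = [0.02 * y for y in years]            # assumed forced warming trend
noise = [rng.gauss(0.0, 0.15) for _ in years]  # assumed internal variability
obs = [p + n for p, n in zip(pattern, noise)]

amp = fit_amplitude(obs, pattern)
# Amplitudes fitted to noise-only series show how big a "signal" chance alone yields.
null_amps = [fit_amplitude([rng.gauss(0.0, 0.15) for _ in years], pattern)
             for _ in range(1000)]
threshold = sorted(abs(a) for a in null_amps)[int(0.95 * len(null_amps))]
print(f"fitted amplitude: {amp:.2f}, 95% noise threshold: {threshold:.2f}")
print("signal detected" if abs(amp) > threshold else "not distinguishable from noise")
```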

A 2019 review in Nature Climate Change calls Hasselmann’s work “the first serious effort to provide a sound statistical framework for identifying a human-caused warming signal.” In the decades since, climate scientists have been able to demonstrate with increasing confidence that rising global temperatures and other observed climatic effects are the direct result of human activities.

This year’s physics Nobel announcement comes several weeks before the United Nations’ 26th Climate Change Conference, to be held in Glasgow, Scotland, and two months after the partial release of the Sixth Assessment Report of the UN Intergovernmental Panel on Climate Change (IPCC). One of the report’s conclusions, which is based on the attribution studies so influenced by Hasselmann, is that it is “unequivocal that human influence has warmed the atmosphere, ocean, and land.”

In its report, the IPCC pegged the equilibrium climate sensitivity at 2.5–4 °C. Forty-two years earlier, the US National Research Council’s Ad Hoc Study Group on Carbon Dioxide and Climate had relied on Manabe’s modeling work to estimate a probable range of 1.5–4.5 °C. “Everything goes back to those first calculations that Manabe made,” Balaji says.

A new spin on a complex problem

Complexity can be found not just on the planetary scale but also in how particles arrange themselves on the microscale. Parisi’s 1979 mathematical description of how a collection of spins orients itself has wide-ranging applications.

Imagine a magnetic spin placed at each of the three corners of a triangle. If the spins have antiferromagnetic interactions, no single lowest-energy state exists. With one spin up and one spin down, the third spin can’t point antiparallel to both neighbors. The situation gets more complicated with additional spins and complex geometric arrangements. Because there’s no unique equilibrium, such so-called spin glasses settle into a multitude of metastable states that minimize the energy as best they can. Which of those states they enter, however, is difficult to predict.
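
The frustration on a single triangle is easy to verify by brute force: enumerating all eight configurations of three Ising spins with antiferromagnetic couplings shows that six of them tie for the lowest energy. The short sketch below does exactly that; the sign convention for the coupling is an assumption chosen for illustration.

```python
# Brute-force check of frustration on a triangle: three Ising spins with
# antiferromagnetic couplings (convention E = J * sum of s_i * s_j over the
# three bonds, J = +1) have no single lowest-energy configuration.
from itertools import product

J = 1  # antiferromagnetic: aligned neighbors cost energy

def energy(spins):
    s1, s2, s3 = spins
    return J * (s1 * s2 + s2 * s3 + s3 * s1)

configs = list(product([-1, +1], repeat=3))
energies = {c: energy(c) for c in configs}
e_min = min(energies.values())
ground_states = [c for c, e in energies.items() if e == e_min]

# Six of the eight configurations tie for the minimum energy: every state with one
# "odd spin out" is equally good, so the system is frustrated and highly degenerate.
print("minimum energy:", e_min)
print("number of ground states:", len(ground_states))
for c in ground_states:
    print(c)
```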

In 1975 Sam Edwards and Philip Anderson reimagined the spin-glass problem in terms of averaging over the configurations of many replicas of the system. Rather than multiplying the complexity, that approach simplified the math by turning the problem into a thermodynamic calculation. David Sherrington and Scott Kirkpatrick extended that model to infinite dimensions, which once again paradoxically simplified the math. But the extension also predicted negative entropy at low temperatures, a clear sign of trouble. Jairo de Almeida and David Thouless realized in 1978 that the issue lay with the assumption of replica symmetry, which treats all replicas as equally related to one another. Despite that insight, a general solution eluded physicists.

Then, one year later, Parisi solved the problem. He introduced a parameter that describes how similar the states of two replicas are—in other words, how many of the N spins are pointing in the same direction in the two replicas. The underlying idea is represented in the tree shown below. Each colored point represents a state of the system. Any pair of states with the same color overlap to the same degree.
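
That similarity measure is the overlap between replicas a and b, conventionally written q_ab = (1/N) Σ_i s_i^a s_i^b: the fraction of spins that agree minus the fraction that disagree. A minimal sketch, with made-up spin configurations:

```python
# The quantity at the heart of Parisi's construction is the overlap between two
# replicas a and b: q_ab = (1/N) * sum_i s_i^a * s_i^b, i.e. the fraction of spins
# that agree minus the fraction that disagree. Example configurations are invented.
def overlap(replica_a, replica_b):
    n = len(replica_a)
    return sum(sa * sb for sa, sb in zip(replica_a, replica_b)) / n

a = [+1, -1, +1, +1, -1, -1, +1, -1]
b = [+1, -1, -1, +1, -1, +1, +1, -1]
print(f"q_ab = {overlap(a, b):+.2f}")  # 6 of 8 spins agree -> q = (6 - 2) / 8 = 0.5
```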

To assess how similar any two states are, you count how many nodes you pass moving from one state to the other in the tree. For example, the red states are all one node apart, and any given red state is three nodes away from either yellow state and five away from any of the blue or green states. For any three states chosen at random, at least two of them overlap by the same amount, or number of nodes. That insight leads to a tidy equation for the distribution of overlaps. (See the column by Philip Anderson, Physics Today, July 1989, page 9.)
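
The node-counting rule and the “at least two pairs match” property can be checked mechanically. The sketch below encodes a small hierarchical tree of states (an invented stand-in for the figure, not a reproduction of it), computes node-counting distances between leaves, and verifies that in every triple of states the two largest distances coincide, the hallmark of an ultrametric structure.

```python
# Node-counting distances on a hypothetical hierarchical tree of states, plus a
# check of the ultrametric property: among any three states, the two largest
# pairwise distances are always equal. The tree itself is invented for illustration.
from itertools import combinations

# Each state is a leaf, identified by its sequence of branch choices from the root.
states = {
    "red1": (0, 0, 0, 0), "red2": (0, 0, 0, 1), "red3": (0, 0, 0, 2),
    "yellow1": (0, 0, 1, 0), "yellow2": (0, 0, 1, 1),
    "blue1": (0, 1, 0, 0), "blue2": (0, 1, 0, 1),
    "green1": (0, 1, 1, 0), "green2": (0, 1, 1, 1),
}

def distance(a, b):
    """Number of internal nodes passed on the path between leaves a and b."""
    pa, pb = states[a], states[b]
    shared = 0
    for x, y in zip(pa, pb):
        if x != y:
            break
        shared += 1
    steps_up = len(pa) - shared    # edges from a up to the lowest common ancestor
    steps_down = len(pb) - shared  # edges from that ancestor down to b
    return steps_up + steps_down - 1

print(distance("red1", "red2"))     # 1: red states are one node apart
print(distance("red1", "yellow1"))  # 3: red to yellow passes three nodes
print(distance("red1", "blue1"))    # 5: red to blue or green passes five nodes

# Ultrametric check: in every triple of states, the two largest distances coincide.
for x, y, z in combinations(states, 3):
    d = sorted([distance(x, y), distance(y, z), distance(x, z)])
    assert d[1] == d[2], (x, y, z)
print("every triple satisfies the ultrametric condition")
```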

“The solution has a richness that I don’t think anyone anticipated in advance,” says A. Peter Young of the University of California, Santa Cruz. “It has influenced much of the spin-glass literature for the last 40 years.”

The applications of Parisi’s approach have reached far beyond spin glasses. “It appears, for instance, when we want to understand the performance of modern artificial intelligence systems that learn by neural networks,” explains Lenka Zdeborová of the Swiss Federal Institute of Technology Lausanne.

She is just one member of what she characterized as the “large community of researchers that use [Parisi’s] science every day in a broad range of areas from physics, to mathematics, computer science, materials science, neuroscience, and biology.”
