THE PARIS AGREEMENT UNDER CLOSE SCRUTINY IN ADRIAN RAFTERY’S LECTURE: THE LATEST STATISTICAL ANALYSIS SHOWS PARTIAL SUCCESSES AND A HARD REALITY
On Tuesday, 25 November, the Institute of Computer Science of the Czech Academy of Sciences hosted a lecture in its ICS CAS Invited Lecture in Statistical and Environmental Modelling series by a leading global figure in statistical climate modelling, Professor Adrian E. Raftery of the University of Washington. The lecture, titled “Did the Paris Climate Agreement Work? Evidence from Bayesian Integrated Assessment,” filled the lecture hall with participants from universities, the Czech Academy of Sciences and partner institutions, and was also streamed online.

The aim of the lecture was to present the latest results from 2015–2024, making it possible for the first time to assess how the world has developed since the adoption of the Paris Agreement. Professor Raftery built on his influential 2017 work, which first introduced a fully statistical, probabilistic model of emission development based on the well-known IPAT identity: emissions = population × GDP per capita × carbon intensity. A recently published article by Jiang, Shi & Raftery (2025) confirms that this method remains surprisingly well calibrated and, thanks to new data, brings several key findings.
What actually happened after 2015? Carbon intensity declined faster than before, on average by 3.1% annually, a significant improvement over the pre-Paris period. However, GDP grew even faster, globally by about 3.9% per year. The result? Total CO₂ emissions still increased by 5.6% over the period. The Paris Agreement, which aims to keep the rise in average global temperature well below 2°C, therefore brought a tangible shift toward more efficient energy use, but rapid economic growth outweighed the positive climate effect.
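The arithmetic behind these figures follows directly from the IPAT identity introduced above, which collapses to emissions = GDP × carbon intensity. A minimal worked check in Python, assuming the 5.6% figure refers to cumulative growth compounded over roughly eight post-Paris years of data (the exact horizon is an assumption here):

```python
# IPAT identity: emissions = population × GDP per capita × carbon intensity,
# which simplifies to emissions = GDP × carbon intensity.
gdp_growth = 0.039         # global GDP growth, ~3.9% per year (figure from the talk)
intensity_decline = 0.031  # carbon-intensity decline, ~3.1% per year (from the talk)

# Net annual emissions growth under the identity:
net_annual = (1 + gdp_growth) * (1 - intensity_decline) - 1
print(f"net annual emissions growth: {net_annual:.2%}")  # ≈ 0.68%

# Compounded over eight years (an assumed horizon), this is close to the
# cumulative ~5.6% increase in total CO2 emissions cited in the lecture:
years = 8
cumulative = (1 + net_annual) ** years - 1
print(f"cumulative increase over {years} years: {cumulative:.1%}")
```

The point of the check: even a strong decarbonisation trend is overwhelmed when economic growth outpaces it by less than one percentage point per year.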
WHAT DO THE NEW STATISTICAL PROJECTIONS SAY?
Thanks to nine additional years of data, projections up to the year 2100 have been significantly refined:
- The expected warming has decreased only slightly — from 2.6°C to 2.4°C.
- The probability of keeping global warming below 2°C remains low — around 17%.
- If countries meet their ambitious emission targets, this probability rises to 34%.
- A positive development: the risk of a “catastrophic scenario” above 3°C has dropped sharply (from 26% to 9%).
- The model also indicates that the most likely year of the global emissions peak is 2024, but with considerable uncertainty extending into the 2040s.
- Changes in global population growth or decline have little effect on these results.
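The probabilistic flavour of these projections, and why the peak year carries such wide uncertainty, can be conveyed with a toy Monte Carlo simulation built on the IPAT identity. The distributions and the decarbonisation "acceleration" parameter below are illustrative assumptions, not the fitted Bayesian posteriors from the Raftery group's model:

```python
import random

random.seed(0)

# Toy Monte Carlo sketch: simulate many futures with uncertain GDP growth
# and carbon-intensity decline, and record when emissions stop rising.
# All distributions below are illustrative placeholders, not fitted values.

START_YEAR = 2024
N_SIMS = 20_000

def peak_year(rng=random):
    gdp_growth = rng.gauss(0.039, 0.005)         # assumed mean and spread
    intensity_decline = rng.gauss(0.031, 0.008)  # assumed mean and spread
    acceleration = rng.gauss(0.0010, 0.0006)     # assumed yearly speed-up of decarbonisation
    year = START_YEAR
    # Emissions stop rising once intensity declines faster than GDP grows.
    while intensity_decline < gdp_growth and year < 2100:
        intensity_decline += acceleration
        year += 1
    return year

peaks = [peak_year() for _ in range(N_SIMS)]
share_already_peaked = sum(p == START_YEAR for p in peaks) / N_SIMS
print(f"share of simulations with emissions already at peak: {share_already_peaked:.0%}")
```

In the actual model, population, GDP per capita and carbon intensity are each given fully Bayesian time-series models fitted to historical country data; this sketch only illustrates why a single most-likely peak year can coexist with uncertainty stretching into the 2040s.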
The lecture thus presented a world that is moving in the right direction, but too slowly to meet ambitious climate goals.
LIVELY DISCUSSION AND AUDIENCE RESPONSE
A very lively discussion followed the lecture. Participants were interested, among other things, in how the statistical models are formulated and how they respond to rapid technological changes and unexpected events. They also asked about the details of setting emission targets and whether the “below 2°C” goal can still realistically be achieved, as well as which countries currently have the greatest room to accelerate decarbonisation.
Professor Raftery emphasised that carbon intensity, the emissions produced per unit of economic output, and the pace at which economies reduce it will play the most important role in the future. He also reminded the audience that statistical estimates are not predictions of fate but a tool for informed decision-making.
The lecture confirmed that data from the past decade can now be used to realistically evaluate the Paris Agreement. It also brought an encouraging message: the world is gradually moving toward a lower risk of extreme warming. At the same time, however, it showed that without faster and more ambitious action, the chances of achieving the 1.5–2°C target remain small.
The event was supported by the DigiWELL project and the AV21 Strategy programme. It was an excellent opportunity to see top-level statistics in practice, linking data, modelling and current climate policy.


