
Innovation brings risk
Some of the greatest threats we face come from advances in biotechnology and artificial intelligence.

Global Catastrophic Risks Fund

Donate Now

Our objective

Stop the next global catastrophe in its tracks

We live in an era of new perils.

Humanity faces existential risks, including war between great powers, natural and engineered pandemics, thermonuclear war, threats from advanced artificial intelligence (AI), and frontier military technologies.

These global catastrophic risks have the potential to kill hundreds of millions, even billions, of people alive today.

We can come together – scientists, policymakers, engineers, military leaders, and motivated citizens – to mitigate these risks. It's happened before. During the Cold War, political leaders negotiated to reduce stockpiles of weapons of mass destruction. At the turn of the millennium, scientists tracked large asteroids and comets in Earth’s vicinity. Today, countries are working together on global preparedness for the next pandemic disease.

The Global Catastrophic Risks Fund (GCRF) tackles far-future threats and takes action now to help protect every human being alive today. We aim to:

  • Reduce the probability of large-scale catastrophic events;
  • Mitigate the potential negative impacts of these events if they occur;
  • Improve our ability to anticipate new and emerging risks.

Want to tackle climate change? We have an entire Fund dedicated to it.

The Global Catastrophic Risks Fund is a philanthropic co-funding vehicle that does not provide investment returns.

Photo by Cash Macanaya on Unsplash

Our strategy

We find opportunities to support highly impactful and neglected initiatives to reduce the probability of worldwide catastrophes and mitigate their consequences. This is a complex mission, with an ever-changing threat landscape. We give special consideration to threats that could curtail humanity’s future, leaning towards tractable solutions today. By seeking opportunities that are neglected by other grant-makers, we can ensure that the Fund is as high-leverage as possible.

Grant-making

Our decision-making is guided by three core values: impact, innovation, and flexibility.

  • To maximize impact, our grant portfolio includes both direct interventions, like funding the development of new crisis communications technology or personal protective equipment (PPE), and research and hits-based bets: initiatives whose success is less certain, but which could improve many more lives if they succeed.
  • We are committed to innovation, including developing new and better approaches to grantmaking, and providing seed funding for novel projects.
  • We maintain flexibility to respond rapidly to emerging crises and windows of opportunity. We work with networks of domain experts, trusted partners, and government decision-makers to identify new opportunities, and deploy funds in the most effective ways.

When evaluating potential grants, we consider several factors:

  • Counterfactual impact.
  • Collaboration with trusted partners.
  • Avoidance of harm and information hazards.
  • Funding gaps that would otherwise go unfilled.
  • Organizational strength.
  • Time-sensitive opportunities and policy windows.

We don't solicit or accept grant applications. Grant recipients are chosen through careful evaluation, based on our research and strategy.

Photo by CDC on Unsplash

Direct and co-funded grants

Date | Recipient | Grant | Amount
December 2023 | INHR | Support for U.S.-China diplomatic dialogues on artificial intelligence, including the AI-bio nexus. | $146,000
May 2023 | | Seed funding to launch the organization. A co-funded grant with $200k from GCRF and $350k from advised grants. | $550,000
January 2023 | Center for a New American Security | To support a one-year project on an "International Autonomous Incidents Agreement," as described in Founders Pledge's report on Autonomous Weapons and Military AI (pages 39-40). A co-funded grant with $100k from GCRF and $100k from advised grants. | $200,000

Advised grants

These grants have been identified, evaluated, and advised on by Fund Managers; resources were deployed by external philanthropists through their own giving infrastructure, separately from the Fund.

Date | Recipient | Grant | Amount
January 2024 | Carnegie Endowment for International Peace | To launch Project "Averting Armageddon" | $2,504,000
October 2023 | | Seed funding to launch the organization | $3,000,000
September 2023 | Pacific Forum | For the U.S.-China Strategic Nuclear Dialogues | $200,000

Prevent the most severe global catastrophes

Donate Now

Founders Pledge members

Contribute easily through your Donor Advised Fund (DAF) via the member app. Don’t have a DAF, or want to discuss your options? Reach out to giving@founderspledge.com.

Donate on the member app

Not a member?

Contribute through Every.org or Giving What We Can. You can also contribute from any Donor Advised Fund; for details, reach out to giving@founderspledge.com.

Donate on Every.org
Donate on Giving What We Can
Donate on Charityvest

Our impact

Impact reports

2023 Impact Report

Meet the Fund Manager


Christian Ruhl

Christian Ruhl is a Senior Researcher at Founders Pledge. Christian’s work focuses on understanding, forecasting, and mitigating global catastrophic risks, including risks from great power conflict and weapons of mass destruction.

Previously, Christian managed the program on “The Future of the Global Order: Power, Technology, and Governance” at Perry World House, the University of Pennsylvania’s global affairs think tank. After receiving his BA from Williams College, he studied on a Dr. Herchel Smith Fellowship at the University of Cambridge for two master’s degrees, one in History and Philosophy of Science and one in International Relations and Politics, with dissertations on early modern state-sponsored science and Cold War nuclear strategy.

Christian was a member of the 2021 Project on Nuclear Issues (PONI) Nuclear Scholars Initiative, serves on the External Advisory Board of the Berkeley Risk and Security Lab (BRSL), and is a Mentor for summer fellows at the Cambridge Existential Risks Initiative (CERI). His writing has appeared in The Atlantic, the Bulletin of the Atomic Scientists, Foreign Policy, and more.

Learn More

Prospectus