Showing grants tagged "Global Catastrophic Risks"

We're open to supporting safe bets, like direct cash transfers to the world's poorest people, as well as high-risk, high-reward projects, like minimizing risks from potentially world-changing science and technology. Read more about how we choose what to fund.

  1. Open Phil AI Fellowship — 2019 Class
     Award Date: 05/2019 | Amount: $2,000,000 | Focus Area: Potential Risks from Advanced AI
  2. Massachusetts Institute of Technology Media Lab — Kevin Esvelt’s Research
     Award Date: 03/2019 | Amount: $1,000,000 | Focus Area: Global Catastrophic Risks
  3. Machine Intelligence Research Institute — General Support (2019)
     Award Date: 02/2019 | Amount: $2,112,500 | Focus Area: Potential Risks from Advanced AI
  4. Georgetown University — Center for Security and Emerging Technology
     Award Date: 01/2019 | Amount: $55,000,000 | Focus Area: Potential Risks from Advanced AI
  5. Berkeley Existential Risk Initiative — CHAI ML Engineers
     Award Date: 01/2019 | Amount: $250,000 | Focus Area: Potential Risks from Advanced AI
  6. Center for International Security and Cooperation — Biosecurity Research (2019)
     Award Date: 01/2019 | Amount: $1,625,000 | Focus Area: Biosecurity and Pandemic Preparedness
  7. iGEM — Synthetic Biology Safety and Security (2018)
     Award Date: 11/2018 | Amount: $420,000 | Focus Area: Biosecurity and Pandemic Preparedness
  8. Nuclear Threat Initiative — Projects to Reduce Global Catastrophic Biological Risks
     Award Date: 11/2018 | Amount: $1,904,942 | Focus Area: Biosecurity and Pandemic Preparedness
  9. UC Berkeley — AI safety research (2018)
     Award Date: 11/2018 | Amount: $1,145,000 | Focus Area: Potential Risks from Advanced AI
  10. Center for a New American Security — Outreach on Technological Risk
      Award Date: 09/2018 | Amount: $400,352 | Focus Area: Global Catastrophic Risks