Showing grants tagged "Global Catastrophic Risks"

We're open to supporting safe bets, like direct cash transfers to the world's poorest people, as well as high-risk, high-reward projects, like minimizing risks from potentially world-changing science and technology. Read more about how we choose what to fund here.

  1. Stanford University — AI Safety Seminar
     Award Date: 02/2020
     Amount: $6,500
     Focus Area: Potential Risks from Advanced AI

  2. Bipartisan Commission on Biodefense — General Support
     Award Date: 02/2020
     Amount: $2,970,000
     Focus Area: Biosecurity and Pandemic Preparedness

  3. Nuclear Threat Initiative — Biosecurity Program Support (February 2020)
     Award Date: 02/2020
     Amount: $8,000,000
     Focus Area: Biosecurity and Pandemic Preparedness

  4. Wilson Center — AI Policy Seminar Series (February 2020)
     Award Date: 02/2020
     Amount: $368,440
     Focus Area: Potential Risks from Advanced AI

  5. Johns Hopkins Center for Health Security — Masters and PhD Program Support
     Award Date: 02/2020
     Amount: $1,860,000
     Focus Area: Biosecurity and Pandemic Preparedness

  6. Future of Humanity Institute — New DPhil Positions
     Award Date: 02/2020
     Amount: $939,263
     Focus Area: Global Catastrophic Risks

  7. Machine Intelligence Research Institute — General Support (2020)
     Award Date: 02/2020
     Amount: $6,243,750
     Focus Area: Potential Risks from Advanced AI

  8. GHSS — General Support (February 2020)
     Award Date: 02/2020
     Amount: $1,200,000
     Focus Area: Biosecurity and Pandemic Preparedness

  9. Berkeley Existential Risk Initiative — General Support
     Award Date: 01/2020
     Amount: $150,000
     Focus Area: Potential Risks from Advanced AI

  10. RAND Corporation — Research on the State of AI Assurance Methods
      Award Date: 01/2020
      Amount: $30,751
      Focus Area: Potential Risks from Advanced AI