EA - Announcing the Founders Pledge Global Catastrophic Risks Fund by christian.r

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Announcing the Founders Pledge Global Catastrophic Risks Fund, published by christian.r on October 26, 2022, on The Effective Altruism Forum.

At Founders Pledge, we just launched a new addition to our funds: the Global Catastrophic Risks Fund. This post gives a brief overview of the fund.

Key Points

- The fund will focus on global catastrophic risks, with a special emphasis on risk pathways through international stability and great power relations.
- The fund's shorter giving timelines are complementary to our investing-to-give Patient Philanthropy Fund; we are publishing a short write-up on this soon.
- The fund is designed to offer high-impact giving opportunities for both longtermists and non-longtermists who care about catastrophic risks (see the section on "Our Perspective" in the Prospectus).
- You can find more information, including differences and complementarity with other funds and longtermist funders, in our Fund Prospectus.

Overview

The GCR Fund will build on Founders Pledge's recent research into great power conflict and risks from frontier military and civilian technologies, with a special focus on international stability, a pathway that we believe shapes a number of the biggest risks facing humanity. The Fund will work on:

- War between great powers, like a U.S.-China clash over Taiwan or a U.S.-Russia war;
- Nuclear war, especially emerging threats to nuclear stability, like vulnerabilities of nuclear command, control, and communications;
- Risks from artificial intelligence (AI), including risks both from machine learning applications (like autonomous weapon systems) and from transformative AI;
- Catastrophic biological risks, such as naturally arising pandemics, engineered pathogens, laboratory accidents, and the misuse of new advances in synthetic biology; and
- Emerging threats from new technologies and in new domains.

Moreover, the Fund will support field-building activities around the study and mitigation of global catastrophic risks, as well as methodological interventions, including new ways of studying these risks, such as probabilistic forecasting and experimental wargaming. The focus on international security is a current specialty, and we expect the Fund's areas of expertise to expand as we build capacity.

Current and Future Generations

This Fund is designed both to tackle threats to humanity's long-term future and to take action now to protect every human being alive today. We believe both that some interventions on global catastrophic risks can be justified on a simple cost-benefit analysis alone, and that safeguarding the long-term future of humanity is among the most important things we can work on (and that in practice, the two often converge). Whether or not you share our commitment to longtermism or believe that reducing existential risks is particularly important, you may still be interested in the Fund for the simple reason that you want to help prevent the deaths and suffering of millions of people.

To illustrate this, the Fund may support the development of confidence-building measures on AI, like an International Autonomous Incidents Agreement, with the aim both of mitigating the destabilizing impact of near-term military AI applications and of providing a focal point for longtermist AI governance. Some grants will focus mainly on near-term risks; others mainly on longtermist concerns.

Like our other Funds, this will be a philanthropic co-funding vehicle designed to enable us to pursue a number of grantmaking opportunities, including:

- Active grantmaking, working with organizations to shape their plans for the future;
- Seeding new organizations and projects with high expected value;
- Committing to multi-year funding to give stability to promising projects and decrease their fundraising costs;
- Filling small funding gaps that fall between the cr...
