EA - Announcing the Open Philanthropy AI Worldviews Contest by Jason Schukraft
The Nonlinear Library: EA Forum - Podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Announcing the Open Philanthropy AI Worldviews Contest, published by Jason Schukraft on March 10, 2023 on The Effective Altruism Forum.

We are pleased to announce the 2023 Open Philanthropy AI Worldviews Contest.

The goal of the contest is to surface novel considerations that could influence our views on AI timelines and AI risk. We plan to distribute $225,000 in prize money across six winning entries. This is the same contest we preannounced late last year, which is itself the spiritual successor to the now-defunct Future Fund competition. Part of our hope is that our (much smaller) prizes might encourage people who already started work for the Future Fund competition to share it publicly.

The contest deadline is May 31, 2023. All work posted for the first time on or after September 23, 2022 is eligible. Use this form to submit your entry.

Prize Conditions and Amounts

Essays should address one of these two questions:

Question 1: What is the probability that AGI is developed by January 1, 2043?

Question 2: Conditional on AGI being developed by 2070, what is the probability that humanity will suffer an existential catastrophe due to loss of control over an AGI system?

Essays should be clearly targeted at one of the questions, not both.

Winning essays will be determined by the extent to which they substantively inform the thinking of a panel of Open Phil employees. There are several ways an essay could substantively inform the thinking of a panelist:

An essay could cause a panelist to change their central estimate of the probability of AGI by 2043 or the probability of existential catastrophe conditional on AGI by 2070.

An essay could cause a panelist to change the shape of their probability distribution for AGI by 2043 or existential catastrophe conditional on AGI by 2070, which could have strategic implications even if it doesn't alter the panelist's central estimate.

An essay could clarify a concept or identify a crux in a way that makes it clearer what further research would be valuable to conduct (even if the essay doesn't change anybody's probability distribution or central estimate).

We will keep the composition of the panel anonymous to avoid participants targeting their work too closely to the beliefs of any one person. The panel includes representatives from both our Global Health & Wellbeing team and our Longtermism team. Open Phil's published body of work on AI broadly represents the views of the panel.

Panelist credences on the probability of AGI by 2043 range from ~10% to ~45%.
Conditional on AGI being developed by 2070, panelist credences on the probability of existential catastrophe range from ~5% to ~50%.

We will award a total of six prizes across three tiers:

First prize (two awards): $50,000
Second prize (two awards): $37,500
Third prize (two awards): $25,000

Eligibility

Submissions must be original work, published for the first time on or after September 23, 2022 and before 11:59 pm EDT May 31, 2023.
All authors must be 18 years or older.
Submissions must be written in English.
No official word limit, but we expect to find it harder to engage with pieces longer than 5,000 words (not counting footnotes and references).
Open Phil employees and their immediate family members are ineligible.
The following groups are also ineligible:
People who are residing in, or nationals of, Puerto Rico, Quebec, or countries or jurisdictions that prohibit such contests by law
People who are specifically sanctioned by the United States or based in a US-sanctioned country (North Korea, Iran, Russia, Myanmar, Afghanistan, Syria, Venezuela, and Cuba at time of writing)
You can submit as many entries as you want, but you can only win one prize.
Co-authorship is fine.
See here for additional details and fine print.

Submission

Use this form to submit your entries. We strongl...
