Centre for the Study of Existential Risk

Source: Wikipedia, the free encyclopedia.

Centre for the Study of Existential Risk
Formation: 2012
Founders: Huw Price, Martin Rees, Jaan Tallinn
Purpose: The study and mitigation of existential risk
Headquarters: Cambridge, England
Parent organization: University of Cambridge
Website: cser.ac.uk

The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology.[1] Its co-founders are Huw Price (Bertrand Russell Professor of Philosophy at Cambridge), Martin Rees (former President of the Royal Society) and Jaan Tallinn (co-founder of Skype and early investor in Anthropic).[2]

Areas of focus

Managing extreme technological risks

CSER studies extreme risks associated with emerging and future technological advances and with the impacts of human activity. Managing these extreme technological risks is an urgent task, but one that poses particular difficulties and has been comparatively neglected in academia.[3]

  • CSER researchers developed a widely used tool to automatically scan the scientific literature for new research relevant to global risk.[4]
  • CSER has held two international Cambridge Conferences on Catastrophic Risk.[5][6] The Centre has also advised on the establishment of global risk projects at the Australian National University,[7] the University of California, Los Angeles[8] and the University of Warwick.[9]
  • CSER helped establish the first All-Party Parliamentary Group for Future Generations in the United Kingdom Parliament, bringing global risk and long-term thinking to UK political leaders.[10]
  • CSER has held over thirty workshops bringing together academia, policy and industry on topics including cybersecurity, nuclear security, climate change, and gene drives.[11]
  • CSER Public Lectures have been viewed over 100,000 times online.[12]

Global catastrophic biological risks

  • In 2017, CSER convened policy-makers and academics to identify challenges for the Biological Weapons Convention (BWC). A key issue identified was that the rapid rate of progress in relevant sciences and technologies has made it very difficult for governance bodies including the BWC to keep pace.[13]
  • CSER researchers ran a horizon-scanning exercise for 20 Emerging Issues in Biological Engineering drawing on 30 European and US experts.[14] They presented the paper at the 2017 Meeting of States Parties to the BWC,[15] and at the Science Advisory Board of the Organisation for the Prohibition of Chemical Weapons in 2018.[16]

Extreme risks and the global environment

Risks from advanced artificial intelligence

  • In 2015 CSER helped organise a conference on the future directions of AI in Puerto Rico, resulting in an Open Letter on Artificial Intelligence signed by research leaders worldwide, calling for research on ensuring that AI systems are safe and societally beneficial.[24]
  • In 2016, CSER launched its first spin-off: the Leverhulme Centre for the Future of Intelligence (CFI). Led by Professor Price, CFI focuses on the opportunities and challenges posed by AI.[25]
  • From 2017 onwards, CSER has organised a series of academic conferences bringing together decision theory and AI safety.[26]
  • In 2018, with partners from tech companies and security think-tanks, CSER published The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation, on the implications of AI for physical security and cybersecurity.[27] An associated paper received the Best Paper Award at the AAAI/ACM AI Ethics and Society conference.[28]

Media coverage

CSER has been covered in many newspapers, particularly in the United Kingdom.[29][30][31] It was profiled on the front cover of Wired,[32] and in the special Frankenstein issue of Science in 2018.[33]

Advisors

CSER's advisors include Cambridge academics as well as a range of external experts.

See also

References

  1. PMID 26336680.
  2. ^ Lewsey, Fred (25 November 2012). "Humanity's last invention and our uncertain future". Research News. Retrieved 24 December 2012.
  3. ^ "Managing Extreme Technological Risks".
  4. ^ "Existential Risk Research Network | X-Risk Research Network | www.x-risk.net".
  5. ^ "Cambridge Conference on Catastrophic Risk 2016".
  6. ^ "Cambridge Conference on Catastrophic Risk 2018".
  7. ^ "Latest news | Humans for Survival".
  8. ^ "The B. John Garrick Institute for the Risk Sciences".
  9. ^ "PAIS researchers secure prestigious Leverhulme funding".
  10. ^ "Appg-future-gens".
  11. ^ "Events".
  12. ^ "CSER Cambridge". YouTube. Retrieved 6 April 2019.
  13. ^ "Biological Weapons Convention: Where Next?".
  14. PMID 29132504.
  15. ^ "BWC Press Conference".
  16. ^ "Talk to Organisation for the Prohibition of Chemical Weapons".
  17. ^ University of California (24 September 2015). "A 'Parking Lot Pitch' to the Pope". Retrieved 6 April 2019 – via YouTube.
  18. S2CID 241969653.
  19. .
  20. .
  21. .
  22. ^ "Business School Rankings for the 21st Century".
  23. ^ Berwick, Isabel (27 January 2019). "As business schools rethink what they do, so must the FT". Financial Times.
  24. Wired. Retrieved 24 April 2015.
  25. ^ "Leverhulme Centre for the Future of Intelligence".
  26. ^ "Decision & AI".
  27. ^ maliciousaireport.com
  28. ^ "Best Paper Award – Aies Conference".
  29. ^ Connor, Steve (14 September 2013). "Can We Survive?". The New Zealand Herald.
  30. ^ "CSER media coverage". Centre for the Study of Existential Risk. Archived from the original on 30 June 2014. Retrieved 19 June 2014.
  31. ^ "Humanity's Last Invention and Our Uncertain Future". University of Cambridge Research News. 25 November 2012.
  32. ^ Benson, Richard (12 February 2017). "Meet Earth's Guardians, the real-world X-men and women saving us from existential threats". Wired UK.
  33. PMID 29326256.
  34. ^ "Team".

External links