Suffering risks
Suffering risks, or s-risks, are risks of suffering on an astronomical scale, far exceeding all the suffering that has occurred on Earth to date.[2][3] They are sometimes categorized as a subclass of existential risks.[4]
Possible sources of s-risks include embodied artificial intelligence[5] and superintelligence,[6] as well as space colonization, which could lead to "constant and catastrophic wars"[7] and an immense increase in wild animal suffering by introducing wild animals, who "generally lead short, miserable lives full of sometimes the most brutal suffering", to other planets, whether intentionally or inadvertently.[8]
Steven Umbrello, an AI ethics researcher, has warned that biological computing may make system design more prone to s-risks.[5]
References
- Daniel, Max (2017-06-20). "S-risks: Why they are the worst existential risks, and how to prevent them (EAG Boston 2017)". Center on Long-Term Risk. Retrieved 2023-09-14.
- Hilton, Benjamin (September 2022). "'S-risks'". 80,000 Hours. Retrieved 2023-09-14.
- Baumann, Tobias (2017). "S-risk FAQ". Center for Reducing Suffering. Retrieved 2023-09-14.
Further reading
- Baumann, Tobias (2022). Avoiding the Worst: How to Prevent a Moral Catastrophe. Independently published. ISBN 979-8359800037.
- Minardi, Di (2020-10-15). "The grim fate that could be 'worse than extinction'". BBC Future. Retrieved 2021-02-11.
- Baumann, Tobias (2017). "S-risks: An introduction". Center for Reducing Suffering. Retrieved 2021-02-10.
- Althaus, David; Gloor, Lukas (2016-09-14). "Reducing Risks of Astronomical Suffering: A Neglected Priority". Center on Long-Term Risk. Retrieved 2021-02-10.