joecarlsmith.com

longtermrisk.org
Suffering risks, or s-risks, are "risks of events that bring about suffering in cosmically significant amounts" (Althaus and Gloor 2016). This article will discuss why the reduction of s-risks could be a candidate for a top priority among altruistic causes aimed at influencing the long-term future. The number of sentient beings in the future might be astronomical, and certain cultural, evolutionary, and technological forces could cause many of these beings to have lives dominated by severe suffering. S-...
www.alignmentforum.org
> This post is part of my AI strategy nearcasting series: trying to answer key strategic questions about transformative AI, under the assumption that...
www.lesswrong.com
On a career move, and on AI-safety-focused people working at frontier AI companies.
www.pcgamer.com
The latest Mass Effect Andromeda breaking news, comment, reviews and features from the experts at PC Gamer