www.cold-takes.com
You are here

www.newyorker.com
Matthew Hutson on technologists' warnings about the dangers of the so-called singularity, and the question of whether anything can be done to prevent it.

blog.heim.xyz
New technologies under development, most notably artificial general intelligence (AGI), could pose an existential threat to humanity. We expect significant competitive pressure around the development of AGI, including a significant amount of interest from state actors.

joecarlsmith.com
A high-level picture of how we might get from here to safe superintelligence.

www.securitymagazine.com