fintechdystopia.com

idratherbewriting.com
Note: This content is entirely AI-generated, but with steering and shaping from me. Book club meeting recording. Suleyman's main argument. Pessimism aversion:...

www.lesswrong.com
Comment by Wei Dai - I passed up an invitation to invest in Anthropic in the initial round, which valued it at $1B (it's now planning a round at a $170B valuation), to avoid contributing to x-risk. (I didn't want to signal that starting another AI lab was a good idea from an x-safety perspective, or that I thought Anthropic's key people were likely to be careful enough about AI safety. Anthropic had invited a number of rationalist/EA people to invest, apparently to gain such implicit endorsements.) This idea/plan seems to legitimize giving founders and early investors of AGI companies extra influence on or ownership of the universe (or just extremely high financial returns, if they were to voluntarily sell some shares to the public as envisioned here), which is ...

www.wheresyoured.at
Fair warning: this is the longest thing I've written on this newsletter. I do apologize. Soundtrack: EL-P - $4 Vic. Listen to my podcast Better Offline. We have merch. Last week, Bloomberg profiled Microsoft CEO Satya Nadella, revealing that he's either a liar or a specific kind of idiot. The ...

www.commonsense.org
Here's what educators can do as artificial intelligence evolves.