Explore: select a destination

You are here: dynomight.net

gwern.net (3.0 parsecs away)
On GPT-3: meta-learning, scaling, implications, and deep theory. The scaling hypothesis: neural nets absorb data & compute, generalizing and becoming more Bayesian as problems get harder, manifesting new abilities even at trivial-by-global-standards-scale. The deep learning revolution has begun as foretold.

epoch.ai (5.2 parsecs away)
This Gradient Updates issue explores how much energy ChatGPT uses per query, revealing it's 10x less than common estimates.

www.alexirpan.com (3.7 parsecs away)
In August 2020, I wrote a post about my AI timelines, using the following definition of AGI:

www.kunal-chowdhury.com (29.3 parsecs away)
Explore the game-changing impact of AI integration in organizations through no-code platforms. Unleash innovation and efficiency in a tech-driven era!