You are here: lambdalabs.com

blog.moonglow.ai (3.0 parsecs away)
    Parameters and data. These are the two ingredients of training ML models. The total amount of computation ("compute") you need to do to train a model is proportional to the number of parameters multiplied by the amount of data (measured in "tokens"). Four years ago, it was well-known that if

hpc-ai.com (3.1 parsecs away)
    Colossal-AI offers an open-source solution to efficiently replicate ChatGPT-like model training at high speed and low cost.

lacker.io (2.8 parsecs away)
    I've been playing around with OpenAI's new GPT-3 language model. When I got beta access, the first thing I wondered was, how human is GPT-3? How close is it ...

aras-p.info (32.7 parsecs away)