lmsys.org
www.datadoghq.com
Explore Toto, Datadog's open source time series foundation model (TSFM), and BOOM, a new benchmark for observability metrics. Both are open source under the Apache 2.0 license and deliver state-of-the-art forecasting performance on real-world data.
simonwillison.net
A month ago I asked "Could you train a ChatGPT-beating model for $85,000 and run it in a browser?" $85,000 was a hypothetical training cost for LLaMA 7B plus Stanford ...
bair.berkeley.edu
[AI summary] The article introduces Koala, a dialogue model trained by fine-tuning Meta's LLaMA on dialogue data from the web, with a focus on interactions with large closed-source models like ChatGPT. The model's performance is compared to ChatGPT and Stanford's Alpaca, showing competitive results. The paper emphasizes the importance of high-quality training data for smaller models and highlights the potential for open-source models to match the performance of closed-source ones. However, it also acknowledges Koala's limitations and safety concerns, including its potential for misinformation and bias, and emphasizes its status as a research prototype for academic use.
www.eliostruyf.com
Explore my journey with AI in development, from prompt engineering to collaborative coding, and discover the future of AI-assisted programming.