Explore >> Select a destination


You are here

www.tensoic.com
| | www.marktechpost.com
10.4 parsecs away

Travel
| | Microsoft AI Releases Phi-3 Family of Models: A 3.8B Parameter Language Model Trained on 3.3T Tokens Locally on Your Phone
| | bdtechtalks.com
9.2 parsecs away

Travel
| | Large language models (LLM) require huge memory and computational resources. LLM compression techniques make models more compact and executable on memory-constrained devices.
| | www.philschmid.de
13.0 parsecs away

Travel
| | This blog post is an extended guide on instruction-tuning Llama 2 from Meta AI
| | neptune.ai
30.0 parsecs away

Travel
| You can apply the key ideas of this "Google Collab-friendly" approach to many other base models and tasks.