Related sites (you are here: blog.chand1012.dev):

- simonam.dev — "All the steps required to turn an RTX 2060 into an OpenAI drop-in replacement"
- mattmazur.com — "Below are the steps I used to get Mistral 8x7B's Mixture of Experts (MoE) model running locally on my MacBook (with its Apple M2 chip and 24 GB of memory). Here's a great overview of the model for anyone interested in learning more. Short version: The Mistral 'Mixtral' 8x7B 32k model, developed by Mistral AI, is..."
- www.danieldemmel.me — "Part two of the series Building applications using embeddings, vector search and Large Language Models"