til.simonwillison.net
blog.mozilla.ai

When it comes to using LLMs, it's not always a question of which model to use: it's also a matter of choosing who provides the LLM and where it is deployed. Today, we announce the release of any-llm, a Python library that provides a simple unified interface to access the most popular providers.
isthisit.nz

August 2024 Update: Now a solved problem. Use Structured Outputs. Large language models (LLMs) return unstructured output: when we prompt them, they respond with one large string. This is fine for applications such as ChatGPT, but in others, where we want the LLM to return structured data such as lists or key-value pairs, a parseable response is needed. In Building A ChatGPT-enhanced Python REPL I used a technique of prompting the LLM to return output in a text format I could parse.
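The prompt-then-parse technique the entry describes can be sketched as follows: instruct the model to emit one `key: value` pair per line, then parse that text back into a dictionary. The format and the sample response are assumptions for illustration, not the exact format from the linked post:

```python
def parse_kv_response(text: str) -> dict:
    """Parse an LLM response whose prompt requested one 'key: value' pair per line."""
    result = {}
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip any filler the model added despite instructions
        key, _, value = line.partition(":")
        result[key.strip()] = value.strip()
    return result


# A typical raw response when the prompt asks for this exact line format:
raw = """name: requests
language: Python
license: Apache-2.0"""

print(parse_kv_response(raw))
# → {'name': 'requests', 'language': 'Python', 'license': 'Apache-2.0'}
```

The fragility of this approach (models drift from the requested format) is exactly why the update points to Structured Outputs, where the provider enforces a schema on the response instead.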
blog.daniemon.com

How to use ChatGPT function calling to gain better control over the API's response, making it easier to use the generated content in your code.
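The core of function calling is that the model returns a function name plus JSON-encoded arguments, which your code dispatches to a local function. The payload shape below mirrors an OpenAI-style function-call response, and `get_weather` is a hypothetical function invented for this sketch:

```python
import json


# Hypothetical local function the model may "call".
def get_weather(city: str) -> str:
    return f"Sunny in {city}"


# Functions we are willing to let the model invoke, keyed by name.
AVAILABLE = {"get_weather": get_weather}

# Simulated model output: a function name plus arguments as a JSON string,
# mirroring the structure of an OpenAI-style function_call payload.
model_response = {
    "name": "get_weather",
    "arguments": '{"city": "Tokyo"}',
}

fn = AVAILABLE[model_response["name"]]
args = json.loads(model_response["arguments"])
print(fn(**args))
# → Sunny in Tokyo
```

Because the arguments arrive as structured JSON rather than free text, there is no brittle string parsing between the model's answer and your code.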
lil.law.harvard.edu

Today we're releasing WARC-GPT: an open-source, highly customizable Retrieval Augmented Generation tool the web archiving community can use to explore the in...