lancecarlson.com
mattmazur.com
One of the many new features announced at yesterday's OpenAI dev day is better support for generating valid JSON output. From the JSON mode docs: A common way to use Chat Completions is to instruct the model to always return JSON in some format that makes sense for your use case, by providing a system...
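A minimal sketch of how JSON mode is switched on, assuming the openai Python SDK; the model name and the requested keys are illustrative, not from the post:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# JSON mode guarantees syntactically valid JSON output, but the shape of the
# object still comes from the instructions in the system message.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # illustrative; any model that supports response_format
    response_format={"type": "json_object"},
    messages=[
        {"role": "system", "content": "Return a JSON object with keys 'title' and 'tags'."},
        {"role": "user", "content": "Summarize this post about OpenAI dev day."},
    ],
)

print(response.choices[0].message.content)  # a valid JSON string
```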
www.s-anand.net
I built a podcast generator app in one shot. I wrote a prompt, fed it to an LLM, and it generated the output without errors. I tested three LLMs, and all produced correct, working output. It still took me an hour to craft the prompt - even after I'd built a Python prototype and my colleague built [...]
www.markhneedham.com
In this post, we'll learn how to use LLMs on the command line with Simon Willison's llm library.
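For reference, a minimal sketch of the same library driven from Python rather than the shell, assuming the llm package is installed and an API key has been configured; the model name and prompt are illustrative:

```python
import llm

# Roughly equivalent to the CLI invocation:
#   llm -m gpt-3.5-turbo "Three slogans for a command-line LLM tool"
model = llm.get_model("gpt-3.5-turbo")  # illustrative model name
response = model.prompt("Three slogans for a command-line LLM tool")
print(response.text())
```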
uo.com