incoherency.co.uk
trivial.observer
Sorting out my cellar is coming along. After hanging my bikes from the beams (and shoving the robot behind the door), I have moved the timber slats that were acting as a partition wall across a few feet and assembled a workbench to run along it. Same space, more access. There was an enormous table down there, but it was difficult to move around and just ended up as a dumping ground; now I have one wall of worktop and shelving that I can actually access.
trivial.observer
Toward the end of the afternoon, out of the blue, I decided I'd had enough of our disorganised heap of a cellar. I ordered some cheap shelving and workbench space, which should be arriving at the weekend, and pondered how best to store the three bikes. Given the space is only 180 cm high, hoisting them up would achieve nothing. Hanging them parallel to the wall individually would take over the entire space, and hanging them all on the same hook would be a faff and would make it difficult for Mrs Basil to get to hers on her own.
www.blacktelephone.com
ezyang.github.io
When you're learning to use a new framework or library, simple uses of the software can be done just by copy-pasting code from tutorials and tweaking it as necessary. But at some point, it's a good idea to just slog through reading the docs from top to bottom, to get a full understanding of what is and is not possible in the software. One of the big wins of AI coding is that LLMs know so many things from their pretraining. For extremely popular frameworks that occur prominently in the pretraining set, an LLM is likely to have memorized most aspects of how to use the framework. But for things that are less common or beyond the knowledge cutoff, you will likely get a model that hallucinates. Ideally, an agentic model would know to do a web search and find the docs it needs. However, Sonnet does not currently support web search, so you have to manually feed it documentation pages as needed. Fortunately, Cursor makes this very convenient: simply dropping a URL inside a chat message will include its contents for the LLM.
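A minimal sketch of the same idea outside Cursor, for illustration only: fetch a documentation page yourself and paste its text into the prompt, so the model works from the real docs rather than from memory. The URL, the question, and the crude HTML stripping below are placeholder assumptions, not anything from the post or from Cursor's implementation.

```python
# Sketch: include a documentation page's text as context for an LLM prompt.
# Everything here (URL, question, tag stripping) is a placeholder assumption.
import re
import urllib.request

DOCS_URL = "https://example.com/framework/docs/quickstart.html"  # hypothetical docs page


def fetch_page_text(url: str) -> str:
    """Download a page and crudely strip HTML tags down to plain text."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    text = re.sub(r"<[^>]+>", " ", html)      # drop tags
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace


def build_prompt(question: str, docs: str, limit: int = 20_000) -> str:
    """Prepend the (truncated) documentation text to the question as context."""
    return (
        "Use only the documentation below to answer.\n\n"
        f"--- DOCUMENTATION ---\n{docs[:limit]}\n--- END DOCUMENTATION ---\n\n"
        f"Question: {question}"
    )


if __name__ == "__main__":
    docs = fetch_page_text(DOCS_URL)
    print(build_prompt("How do I configure the framework's logger?", docs))
```

Cursor does the fetching and inclusion for you when you drop a URL into the chat; the sketch just makes the underlying "docs as context" step explicit.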