Summary: Simon Willison shares how he uses Large Language Models (LLMs) to help him write code effectively, despite the real difficulty involved and the mixed opinions other developers hold about them. He emphasizes treating LLMs like a digital intern: give them clear, detailed instructions and they speed up development. Willison also highlights that LLMs can answer questions about a codebase, which makes them useful for learning and understanding unfamiliar projects.
Don’t fall into the trap of anthropomorphizing LLMs and assuming that failures which would discredit a human should discredit the machine in the same way. (View Highlight)
For OpenAI’s models this is usually October of 2023. Anthropic and Gemini and other providers may have more recent dates.
This is extremely important for code, because it influences what libraries they will be familiar with. If the library you are using had a major breaking change since October 2023, OpenAI models won’t know about it! (View Highlight)
Note: Beware of model training cutoff dates, especially when working with recently updated libraries.
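One way to work around the cutoff problem, sketched below, is to state the exact library version in the prompt and paste in current documentation or code, so the model doesn't fall back on what it memorised before its cutoff. The library name (`httpx`) and the prompt wording here are placeholders, not taken from the post:

```python
import importlib.metadata

library = "httpx"  # placeholder: whichever library changed after the model's cutoff
version = importlib.metadata.version(library)

prompt = (
    f"I am using {library} {version}. Its API may have changed since your "
    "training cutoff, so rely only on the documentation excerpt below.\n\n"
    "<paste the relevant, current docs or example code here>\n\n"
    "Now write a function that downloads a URL and returns the response text."
)
```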
One of my favorite code prompting techniques is to drop in several full examples relating to something I want to build, then prompt the LLM to use them as inspiration for a new project. (View Highlight)
Note: This is an interesting way to start.
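A rough sketch of what assembling an examples-as-inspiration prompt might look like. The file paths and the final instruction are illustrative, not from the post:

```python
from pathlib import Path

# Hypothetical prior projects, included purely as inspiration for the new one
example_files = ["examples/csv_cleaner.py", "examples/json_flattener.py"]

prompt_parts = []
for path in example_files:
    # Label each example so the model can tell where one file ends and the next begins
    prompt_parts.append(f"### {path}\n{Path(path).read_text()}")

prompt_parts.append(
    "Using the examples above as inspiration for style and structure, "
    "write a new script that deduplicates rows in a SQLite table."
)
prompt = "\n\n".join(prompt_parts)
```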
I find LLMs respond extremely well to function signatures like the one I use here. I get to act as the function designer, the LLM does the work of building the body to my specification. (View Highlight)
Note: If you can define a function signature, an LLM can often figure out the function body.
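For illustration, a signature-as-spec of the kind you might paste into a prompt. The function name and parameters (`fetch_json`, `timeout`) are hypothetical, not an example from the post:

```python
def fetch_json(url: str, timeout: float = 10.0) -> dict:
    """Fetch the JSON document at url and return it as a dict."""
    ...  # prompt the LLM: "write the body of this function"
```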
They’re also much less lazy than me—they’ll remember to catch likely exceptions, add accurate docstrings, and annotate code with the relevant types. (View Highlight)
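Continuing the hypothetical signature above, a model's completion might come back looking something like this, with types, a fuller docstring, and exception handling already in place:

```python
import json
import urllib.request
from urllib.error import URLError


def fetch_json(url: str, timeout: float = 10.0) -> dict:
    """Fetch the JSON document at url and return it as a dict.

    Raises RuntimeError if the request fails and ValueError if the
    response body is not valid JSON.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            body = response.read().decode("utf-8")
    except URLError as exc:
        raise RuntimeError(f"Failed to fetch {url}") from exc
    return json.loads(body)  # json.JSONDecodeError is a subclass of ValueError
```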
the one thing you absolutely cannot outsource to the machine is testing that the code actually works. (View Highlight)
Note: Always test the code!
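A minimal sketch of what that testing can look like, assuming the hypothetical `fetch_json` above lives in a module named `fetcher`. The point is to exercise the generated code yourself rather than trusting it on sight; here a stubbed response avoids hitting the network:

```python
# test_fetcher.py -- run with: pytest
import io
import json
from unittest.mock import patch

from fetcher import fetch_json  # hypothetical module holding the generated function


def test_fetch_json_parses_response():
    payload = {"ok": True}
    # io.BytesIO supports the with-statement and read(), so it can stand in
    # for the object urllib.request.urlopen returns
    fake = io.BytesIO(json.dumps(payload).encode("utf-8"))
    with patch("urllib.request.urlopen", return_value=fake):
        assert fetch_json("https://example.com/data.json") == payload
```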
a bad initial result isn’t a failure, it’s a starting point for pushing the model in the direction of the thing you actually want. (View Highlight)
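One way to picture that iteration loop, as a sketch using the OpenAI Python client; the model name, task, and follow-up wording are placeholders, not Willison's own prompts:

```python
from openai import OpenAI

client = OpenAI()
messages = [
    {"role": "user", "content": "Write a Python function that merges overlapping date ranges."},
]

# First attempt: often usable, rarely exactly what you want
first = client.chat.completions.create(model="gpt-4o", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# Treat the result as a starting point and push the model in the right direction
messages.append({
    "role": "user",
    "content": "Good start, but avoid sorting in place and add type hints throughout.",
})
second = client.chat.completions.create(model="gpt-4o", messages=messages)
print(second.choices[0].message.content)
```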