Prompting Tips
Prompts are the instructions given to AI models. Output quality depends on how well they are written for each model.
Top tips for getting the most out of AI models like Claude 3.5 Sonnet and GPT-4o
- Be specific and clear: Provide clear, explicit instructions with all the necessary details.
- Provide context: Include any necessary background information with your question. AI models have broad general knowledge, so any specifics about your situation should be supplied in the prompt.
- Describe persona: Instruct the AI to take on a character, role, or persona. This is useful shorthand to guide the model. Example: “You are a product manager”.
- Set format expectations: Define the desired output structure (e.g., bullet points, markdown, JSON). Be aware that asking for too much structure has been shown to reduce reasoning ability.
Tip: Add “Use well-structured markdown” to your prompt to add headings, bullets, and more.
- Use examples for clarity: Include a few examples in your instructions, such as pairs of representative inputs and outputs. This helps guide the model towards the type of answer you expect. Prompt engineers call this “few-shot prompting”; prompting with no examples at all is called “zero-shot prompting”. See the first sketch after this list.
- Prompt step by step: Specifically insert “Think step by step” (for OpenAI’s models) or “Think step by step in <thinking> tags” (for Anthropic’s models). See the second sketch after this list.
- Use a variety of AI models: Different models excel at different tasks, such as writing, analysis, creative work, math, and coding. Use all the top models together in Hunch.
- Flow of thought: Break complex tasks down into multiple sub-tasks and “chain” them together. Hunch excels at this! In Hunch, each sub-task is a block, and blocks are connected together on a canvas to achieve a more complex task.
- Iterate and refine: This applies to all the steps above. If the initial response isn’t what you expected, refine your prompt and try again.
- Understand limitations: The AI model itself often has no access to external data or the ability to browse the web for real-time information. Custom tools in Hunch can help fill these gaps!
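To make few-shot prompting concrete, here is a minimal sketch using the Anthropic Python SDK. The triage task, labels, and example pairs are invented for illustration, and the model id is just one published Claude 3.5 Sonnet snapshot.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# A persona, explicit instructions, a format expectation, and a few
# representative input/output pairs (few-shot examples) in one prompt.
prompt = """You are a support triage assistant.
Classify each piece of customer feedback as exactly one of:
bug, feature_request, or praise. Reply with the label only.

Feedback: "The export button crashes the app."
Label: bug

Feedback: "Love the new dark mode!"
Label: praise

Feedback: "Please add CSV import."
Label: feature_request

Feedback: "Search returns nothing when I include quotes."
Label:"""

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # illustrative model id
    max_tokens=10,
    messages=[{"role": "user", "content": prompt}],
)
print(response.content[0].text.strip())  # expected: bug
```

The example pairs simply show the model the exact shape of answer you want; the same pattern works with any chat model.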
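And a sketch of the step-by-step tip for Claude, again assuming the Anthropic Python SDK. The question is invented; the important part is the <thinking> instruction.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

question = (
    "A subscription costs $18 per month, with a 15% discount if paid "
    "annually. What is the yearly price?"
)

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # illustrative model id
    max_tokens=500,
    messages=[{
        "role": "user",
        "content": (
            "Think step by step in <thinking> tags, then give only the "
            "final number in <answer> tags.\n\n" + question
        ),
    }],
)
# The reply should contain the reasoning in <thinking> tags,
# followed by the final answer in <answer> tags.
print(response.content[0].text)
```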
General tip: Think of AI as a very capable intern.
Effective prompting usually combines several of these strategies to get the most relevant and high-quality responses.
Example task
Prompting suggestion from Anthropic: “For example, if you want Claude to help with explaining tax situations, you could first prompt it to create a list of the tax codes that are related to the specific question, then prompt Claude to identify the relevant sections in each document, and finally, to respond to a user question based on the information Claude’s gathered.”
Each of these steps would be an AI block in Hunch. You don’t even have to use the same model for each step; you might discover that GPT-4o is better for some, or you can accelerate the flow with Claude 3 Haiku.
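Outside Hunch, the same chain can be sketched in plain Python by feeding each step’s output into the next prompt. This is only an illustration, assuming the Anthropic and OpenAI Python SDKs with illustrative model ids; in Hunch, each call below would simply be a block on the canvas.

```python
import anthropic
from openai import OpenAI

claude = anthropic.Anthropic()   # reads ANTHROPIC_API_KEY
openai_client = OpenAI()         # reads OPENAI_API_KEY


def ask_claude(prompt: str, model: str = "claude-3-5-sonnet-20240620") -> str:
    msg = claude.messages.create(
        model=model,
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return msg.content[0].text


question = "Can I deduct a home office if I only work remotely part-time?"

# Step 1: a fast, cheap model lists the tax codes related to the question.
codes = ask_claude(
    f"Create a list of the tax codes related to this question:\n{question}",
    model="claude-3-haiku-20240307",
)

# Step 2: identify the relevant sections in each code.
sections = ask_claude(
    "For each of these tax codes, identify the sections relevant to the question.\n\n"
    f"Question: {question}\n\nTax codes:\n{codes}"
)

# Step 3: a different model answers the user from the gathered context.
final = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": (
            "Answer the question using only the context below.\n\n"
            f"Context:\n{sections}\n\nQuestion: {question}"
        ),
    }],
)
print(final.choices[0].message.content)
```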
Bonus advanced tip: Claude likes XML tags in prompts — it’s a great way to identify examples and other context for Claude.
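For example, the prompt string sent to Claude might separate the instructions, an example, and the document with XML tags. The tag names below are arbitrary; there is no fixed schema, the tags just make the structure unambiguous.

```python
document_text = "(paste or pipe in the source document here)"

# Arbitrary tag names; they simply mark off instructions, example, and input.
prompt = f"""<instructions>
Summarize the document below in three bullet points.
</instructions>

<example>
Input: a quarterly sales report -> Output: three bullets covering revenue, churn, and pipeline.
</example>

<document>
{document_text}
</document>"""
```

You would then send `prompt` exactly as in the earlier sketches.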