Top 10 AI Prompting Tips for Text Generation Models
Prompts are the instructions given to AI models. Output quality depends on how well they are written for each model.
Here are the top tips for getting the most out of AI models like Claude 3.5 Sonnet and GPT-4o:
- Be specific and clear: Provide clear, explicit instructions with all the necessary details.
- Provide context: Include the necessary background information for your question. AI models have broad general knowledge, but they don't know the specifics of your situation, so spell those out in the prompt.
- Describe persona: Instruct the AI to take on a character, role, or persona. This is useful shorthand to guide the model. Example: "You are a product manager".
- Set format expectations: Define the desired output structure (e.g., bullet points, markdown, JSON). Be aware that asking for very rigid structure has been shown to reduce reasoning quality in some cases. Tip: add "Use well-structured markdown" to your prompt to get headings, bullets, and more.
- Use examples for clarity: Include a few examples in your instructions, such as pairs of representative inputs and outputs, to guide the model towards the type of answer you expect (see the sketch after this list). Prompt engineers call this "few-shot prompting"; prompting with no examples is "zero-shot prompting", and prompting with a single example is "one-shot prompting".
- Prompt step by step: Asking the model to reason before it answers often improves accuracy on multi-step problems. Insert "Think step by step" (for OpenAI's models) or "Think step by step in <thinking> tags" (for Anthropic's models).
- Use a variety of AI models: Different models are better at different tasks, like writing, analysis, creative work, math, and coding. Use all the top models together in Hunch.
- Flow of thought: Break complex tasks down into multiple sub-tasks and "chain" them together. Hunch excels at this! In Hunch each sub-task is a block, and blocks are connected together on a canvas to achieve a more complex task.
- Iterate and refine: This applies to all the steps above. If the initial response isn't what you expected, refine your prompt and try again.
- Understand limitations: The AI model itself often doesn't have access to external data or the ability to browse the web for real-time info. Custom tools in Hunch can help fill these gaps!
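
If you're calling a model from code rather than from Hunch, here is a minimal sketch of how several of these tips (persona, format expectations, few-shot examples, and step-by-step prompting) can combine in a single request. It uses the Anthropic Python SDK; the model name, example pairs, and prompt wording are illustrative assumptions, not a prescribed recipe.

```python
# A minimal sketch combining several tips: persona, format expectations,
# few-shot examples, and a step-by-step instruction.
# Assumes the Anthropic Python SDK is installed and ANTHROPIC_API_KEY is set;
# the model name, examples, and wording are illustrative only.
import anthropic

client = anthropic.Anthropic()

# Persona and format expectations go in the system prompt.
system_prompt = (
    "You are a product manager. "
    "Answer in well-structured markdown with a short summary and bullet points."
)

# Few-shot examples: representative input/output pairs shown before the real question.
few_shot = """<examples>
<example>
Input: Users complain that onboarding emails arrive late.
Output: **Summary:** Email delays hurt activation.
- Likely cause: queued batch sends
- Next step: switch to event-triggered sends
</example>
</examples>"""

question = "Users say our export feature is hard to find. What should we do?"

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # any capable model works here
    max_tokens=500,
    system=system_prompt,
    messages=[{
        "role": "user",
        "content": f"{few_shot}\n\n{question}\n\nThink step by step in <thinking> tags, then give your answer.",
    }],
)

print(message.content[0].text)
```

The same structure works with any provider's SDK: persona and format rules in the system prompt, examples before the real question, and a step-by-step instruction at the end.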
General tip: Think of AI as a very capable intern.
Effective prompting will involve a combination of these strategies to get the most relevant and high-quality responses.
⭐ Put any prompt into our Prompt Improver tool to see how these strategies work.
However, while a well-written prompt can greatly improve an AI model's output, individual models still tend to struggle with complex, sophisticated tasks. Splitting such tasks into parallel and sequential sub-tasks is exactly what Hunch is built for, where each sub-task uses its own AI model. You don't even have to use the same model for each step: you might discover that GPT-4o is better for some, or you can accelerate the flow with Claude 3 Haiku.
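
For the curious, here is a rough sketch of what that kind of chaining looks like in plain Python, with a fast model handling the first sub-task and a stronger model handling the second. The SDK calls shown are standard, but the model choices, prompts, and data are illustrative assumptions.

```python
# A rough sketch of chaining sub-tasks: a fast model condenses the input,
# then a stronger model reasons over the result.
# Assumes both SDKs are installed and API keys are set in the environment;
# model names, prompts, and data are illustrative only.
import anthropic
from openai import OpenAI

claude = anthropic.Anthropic()
openai_client = OpenAI()

raw_feedback = "...paste a long list of customer feedback here..."

# Step 1: a fast, inexpensive model summarizes the raw input.
summary = claude.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=300,
    messages=[{"role": "user", "content": f"Summarize the key themes:\n\n{raw_feedback}"}],
).content[0].text

# Step 2: a stronger model proposes actions based on the summary.
analysis = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": f"Given these themes, propose three product improvements:\n\n{summary}"}],
).choices[0].message.content

print(analysis)
```

In Hunch, each of these steps would simply be its own block on the canvas, with no code required.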
Bonus advanced tip: Claude responds well to XML tags in prompts; they're a great way to mark examples and other context for Claude.
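
For instance, a prompt might wrap background, examples, and the task in separate tags so Claude can tell them apart. A small illustrative sketch (the tag names and content are made up for this example):

```python
# Illustrative only: XML tags help Claude tell context, examples, and the task apart.
prompt = """<context>
We are launching a budgeting app for college students next month.
</context>

<examples>
<example>Tagline: "Spend smart, stress less."</example>
</examples>

<task>
Write three more taglines in the same style.
</task>"""
```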
👉 Next, check out prompting tips from the Hunch community!
About Hunch
Hunch is a workspace for AI-first work, empowering users to create custom AI workflows without any programming. It's as simple as connecting blocks, each representing a specific task. The workspace lets you harness any AI model to be more productive and creative. With Hunch, any workflow you create can become an AI tool: a new skill that you or an AI agent can use to do your work for you next time. Then share your AI tools to level up your team or the community.