PROMPT TRAINING
According to Anthropic's guidance, context documents should be placed at the beginning of the prompt
Most people structure long-context prompts backwards. Instead of burying your documents in the middle or at the bottom of the prompt, Anthropic's latest guidance says to put all your longform data at the very top—before your instructions, examples, or questions. The result? Up to 30% better response quality, especially when working with multiple complex documents.
Here's the simple structure that works: first, place all your long documents at the top, wrapped in XML tags like <document> and <source>. Then add your actual query at the very end. Pro tip: ask Claude to quote the relevant parts of the documents before carrying out the main task. This helps it focus on what matters instead of getting lost in document noise. Counterintuitive as it sounds, putting your most important question last (not first) dramatically improves Claude's ability to process massive amounts of information.
- Place all long-form data at the very top of the prompt.
- Use XML tags such as <document> and <source> to organize input data.
- Position instructions and specific queries at the very end of the prompt.
- Ask the model to cite specific quotes before generating a final answer to reduce document noise.
- The most important question should be placed last for optimal processing.
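The checklist above can be sketched as a small helper that assembles a prompt in the recommended order: documents first (in XML tags), a quote-extraction instruction next, and the question last. This is an illustrative sketch, not Anthropic's official code; the function name, dict keys, and exact instruction wording are assumptions.

```python
def build_long_context_prompt(documents, query):
    """Assemble a prompt: long documents at the top, the query at the very end."""
    parts = ["<documents>"]
    for i, doc in enumerate(documents, start=1):
        # Wrap each document in <document>/<source> tags so the model can
        # tell the inputs apart and cite them by source.
        parts.append(
            f'<document index="{i}">\n'
            f"<source>{doc['source']}</source>\n"
            f"<document_contents>\n{doc['contents']}\n</document_contents>\n"
            f"</document>"
        )
    parts.append("</documents>")
    # Ask for relevant quotes first to ground the answer in the documents.
    parts.append(
        "First, find the quotes from the documents that are most relevant to "
        "answering the question, and list them inside <quotes> tags. "
        "Then answer the question."
    )
    # The actual question goes last.
    parts.append(query)
    return "\n\n".join(parts)


docs = [
    {"source": "report.pdf", "contents": "Quarterly revenue grew 12 percent..."},
    {"source": "memo.txt", "contents": "The board approved the new budget..."},
]
prompt = build_long_context_prompt(docs, "What drove revenue growth?")
print(prompt)
```

The resulting string starts with the document block and ends with the question, matching the ordering the guidance recommends.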
Why does it matter?
Following these structural prompt engineering tips can improve response quality by up to 30% for long-context models.