Testing and Refining Prompts
Testing your prompts ensures that they produce consistent and accurate results. In Datograde, you can simulate prompt executions, compare versions, and refine instructions iteratively.
Testing a Prompt
- Open a prompt from the Prompts Dashboard.
- Click Test Prompt.
- Enter a Sample Input in the input field.
- Click Run Test to see the generated output.
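Conceptually, a single test run substitutes your sample input into the prompt template and records the generated output. The sketch below illustrates this flow; the `{{input}}` placeholder syntax, the `run_test` helper, and the simulated model call are illustrative assumptions, not Datograde's actual API.

```python
# Hypothetical sketch of a single prompt test run: render the template
# with a sample input, invoke a model (or a stand-in), and keep a record.

def run_test(template: str, sample_input: str, model=None) -> dict:
    """Fill the template with a sample input and return a test record."""
    rendered = template.replace("{{input}}", sample_input)
    # Stand-in for the real model call during a dry run.
    output = model(rendered) if model else f"[simulated output for: {sample_input}]"
    return {"input": sample_input, "rendered_prompt": rendered, "output": output}

record = run_test(
    "Summarize the following ticket: {{input}}",
    "Customer reports login failures since the last update.",
)
print(record["rendered_prompt"])
```

Keeping the input, rendered prompt, and output together in one record makes later version-to-version comparison straightforward.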
Comparing Prompt Versions
- Open the Version History tab for a prompt.
- Select two versions to compare:
  - Input/output differences.
  - Execution metrics (cost, latency, errors).
- Identify which version performs best for your use case.
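Picking the better version can be thought of as ranking by a combined metric score. This is a hedged sketch of that idea; the metric names, example values, and weights are illustrative assumptions, not values Datograde reports.

```python
# Illustrative ranking of two prompt versions by execution metrics
# (cost in USD, latency in seconds, error rate). Lower score is better.

def score(metrics: dict) -> float:
    """Weighted sum of cost, latency, and error rate (weights are assumptions)."""
    return metrics["cost"] * 100 + metrics["latency"] + metrics["error_rate"] * 10

v1 = {"cost": 0.004, "latency": 1.2, "error_rate": 0.05}
v2 = {"cost": 0.003, "latency": 1.5, "error_rate": 0.02}

best = min(("v1", v1), ("v2", v2), key=lambda pair: score(pair[1]))
print(best[0])  # v2 has the lower (better) combined score
```

In practice you would tune the weights to match what matters for your use case, e.g. weighting latency heavily for interactive features.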
Refining Prompts
- Based on test results, refine your prompt by:
  - Adjusting the wording of instructions.
  - Adding or clarifying constraints (e.g., "Limit to 100 words").
  - Using examples to guide the AI.
- Save changes as a new version and re-test.
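When you add a constraint such as "Limit to 100 words", you can verify it mechanically before saving a new version. This is a minimal sketch of one such check; the `within_word_limit` helper is a hypothetical name, not part of Datograde.

```python
# Hedged sketch of a refinement check: confirm a generated output
# obeys an added constraint like "Limit to 100 words" before re-saving.

def within_word_limit(output: str, limit: int = 100) -> bool:
    """True if the output contains at most `limit` whitespace-separated words."""
    return len(output.split()) <= limit

draft = "The service restores access after a password reset."
print(within_word_limit(draft))          # a short output passes
print(within_word_limit("word " * 150))  # 150 words exceeds the limit
```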
Testing with Real-World Examples
- Link the prompt to curated Examples to simulate real-world scenarios.
- Use the Example Testing tab to evaluate how well the prompt performs across diverse inputs.
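Evaluating a prompt across diverse inputs amounts to scoring each example and aggregating a pass rate. The sketch below shows that idea; the pass criterion (an expected keyword appearing in the output) and all example data are illustrative assumptions, not the Example Testing tab's actual scoring method.

```python
# Hedged sketch: aggregate pass rate for a prompt over curated examples,
# using a keyword-presence check as a stand-in pass criterion.

def evaluate(outputs: dict, expectations: dict) -> float:
    """Fraction of examples whose output contains the expected keyword."""
    passed = sum(
        1 for name, expected in expectations.items()
        if expected.lower() in outputs.get(name, "").lower()
    )
    return passed / len(expectations)

outputs = {
    "refund_request": "We have issued a refund to your card.",
    "login_issue": "Please reset your password to regain access.",
    "shipping_delay": "Sorry, we cannot help with that.",
}
expectations = {
    "refund_request": "refund",
    "login_issue": "reset",
    "shipping_delay": "tracking",
}
print(evaluate(outputs, expectations))  # 2 of 3 examples pass -> ~0.67
```

A version that scores well on one sample input but poorly across the full example set is a sign the prompt needs further refinement.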