
Testing and Refining Prompts

Testing your prompts helps ensure they produce consistent, accurate results. In Datograde, you can simulate prompt executions, compare versions, and refine instructions iteratively.

Testing a Prompt

  1. Open a prompt from the Prompts Dashboard.
  2. Click Test Prompt.
  3. Enter a Sample Input in the input field. For example:
    {
      "transcript": "The candidate mentioned their experience in AI and machine learning projects..."
    }
  4. Click Run Test to see the generated output.
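For example, given the sample input above, a summarization-style prompt might return output like the following. The exact fields depend on your prompt's instructions; this output is purely illustrative:

    {
      "summary": "The candidate has hands-on experience with AI and machine learning projects."
    }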

Comparing Prompt Versions

  1. Open the Version History tab for a prompt.
  2. Select two versions to compare side by side. The comparison shows:
    • Input/output differences.
    • Execution metrics (cost, latency, errors).
  3. Identify which version performs best for your use case.
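A comparison might look like this (the numbers are hypothetical and only illustrate the kind of trade-offs to look for):

    Version 2: avg cost $0.004/run, avg latency 1.8 s, 0 failed runs
    Version 3: avg cost $0.007/run, avg latency 2.6 s, 2 failed runs

Here, Version 2 is cheaper, faster, and more reliable; whether that outweighs any quality differences in the outputs depends on what your use case values.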

Refining Prompts

  1. Based on test results, refine your prompt by:
    • Adjusting the wording of instructions.
    • Adding or clarifying constraints (e.g., "Limit to 100 words").
    • Using examples to guide the AI.
  2. Save changes as a new version and re-test.
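For example, a vague instruction can be tightened by adding a constraint and a guiding example (the wording below is illustrative, not a prescribed template):

    Before: Summarize the candidate's experience.

    After:  Summarize the candidate's experience in 100 words or fewer,
            focusing on technical skills. Example: "The candidate has
            three years of experience delivering machine learning
            projects, including..."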

Testing with Real-World Examples

  1. Link the prompt to curated Examples to simulate real-world scenarios.
  2. Use the Example Testing tab to evaluate how well the prompt performs across diverse inputs.
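For example, a set of linked Examples might deliberately vary length, tone, and edge cases so you can see where the prompt breaks down (these inputs are illustrative):

    [
      { "transcript": "The candidate mentioned their experience in AI and machine learning projects..." },
      { "transcript": "Um, I've mostly done, like, data stuff? Some Python, a bit of SQL." },
      { "transcript": "" }
    ]

An empty or noisy transcript is often where a prompt without explicit constraints starts producing inconsistent output.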
