Scoring
Prompt scoring evaluates a prompt against a set of criteria and assigns it a score and rating, with higher scores indicating better prompt performance.
Scores can be calculated and retrieved for individual prompt versions using the version-specific scoring endpoints. You can also score a prompt that is not stored in the system by using the non-version-specific endpoint.
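As a minimal sketch, here is what a call to the non-version-specific endpoint might look like. The URL is taken from this page; the Bearer authorization scheme and the `content` request field are assumptions, so check the Authentication and Calculate Prompt Score pages for the exact request shape.

```python
import os
import requests

# Sketch: score a prompt that is not stored as a version in the system.
# Assumptions (not confirmed by this page): Bearer auth and a "content"
# field in the request body.
API_KEY = os.environ["ZATOMIC_API_KEY"]

response = requests.post(
    "https://api.zatomic.ai/v1/prompts/scoring",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"content": "You are a helpful assistant that summarizes articles."},  # assumed field name
    timeout=30,
)
response.raise_for_status()
print(response.json())  # expected to include the prompt's score and rating
```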
Prompts are scored in the following ranges:
| Scoring Range | Prompt Rating |
|---|---|
| 0 - 49% | Poor |
| 50 - 74% | Fair |
| 75 - 89% | Good |
| 90 - 100% | Excellent |
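As a quick illustration of the ranges above, a small helper (not part of the API) that maps a score percentage to its rating band:

```python
def prompt_rating(score: float) -> str:
    """Map a score percentage (0-100) to the rating band from the table above."""
    if score < 50:
        return "Poor"
    if score < 75:
        return "Fair"
    if score < 90:
        return "Good"
    return "Excellent"

assert prompt_rating(62) == "Fair"
assert prompt_rating(91) == "Excellent"
```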
Endpoints
POST https://api.zatomic.ai/v1/prompts/scoring