Scoring Criteria
On this page you'll find the following:
- Overview
- Create Scoring Criteria from Scratch
- Add Criterion from Default Criteria
- Generate Scoring Criteria
- Generate Scoring Criteria with Different Models
- Generate Additional Scoring Criteria
- Copy Scoring Criteria
- Delete Scoring Criteria
- Delete Single Criterion
- Update Use Case for Scoring Criteria
- Default Scoring Criteria
Overview
Custom scoring criteria allow you to score prompts on factors specific to your use case. While the default scoring criteria cover 80-90% of prompts, combining them with your own custom criteria gives you a flexible, powerful mechanism for producing consistent AI outcomes for your business needs.
You can create new scoring criteria one of two ways:
- Manually from scratch, or
- By letting Zatomic generate it for you
If you let Zatomic generate scoring criteria for you, you can use any of the models Zatomic provides or you can use models from your own AI provider.
Though sometimes used interchangeably, the term criteria refers to the overall set, while the term criterion refers to a single item within that set; in other words, criteria is the plural of criterion.
Create Scoring Criteria from Scratch
If you know what kind of custom criteria you need for a given use case, then you may want to start by creating new scoring criteria from scratch. To do this, click the Scoring Criteria link in the left sidebar, then click the New criteria button at the top of the Scoring Criteria list page.


On the New Scoring Criteria page, give your criteria a name and an (optional) use case. If the criteria is for an existing prompt, you can select it from the dropdown list and its use case will appear in the textbox. Then click the purple From scratch button.
Tip: While not required, it is strongly recommended to define a use case for your scoring criteria. This greatly enhances the analysis when using Zatomic's prompt optimization tools, and is used when generating additional scoring criteria.

Your new criteria will be created, and now you can add each piece of criterion to the set. Click the purple Add criterion button; this will bring up the Add Scoring Criterion modal box that contains a small form.


The criterion form fields are defined as follows:

Field | Description
---|---
Slug | Must be unique within the criteria. Can only contain lowercase letters and underscores. Examples: persona_role, use_case_fit, instructions.
Label | A human-readable label for the slug. For example, if the slug is persona_role, the label might be "Persona or role".
Description | A short description of what the criterion is for. Displayed on the prompt scoring page.
Questions | One or two questions used to determine how well the criterion is met or satisfied.
Weight | A whole number between 1 and 999. Determines how much weight the criterion carries relative to the other criteria in the set.
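To make the field rules above concrete, here is a minimal sketch of a criterion definition and a validator for the slug and weight constraints. This is purely illustrative: the field names mirror the form, and the function and dictionary shown are hypothetical, not part of any official Zatomic API.

```python
import re

# Lowercase letters and underscores only, per the slug rule above.
SLUG_PATTERN = re.compile(r"^[a-z_]+$")

def validate_criterion(criterion: dict, existing_slugs: set) -> list:
    """Return a list of validation problems (empty if the criterion is valid)."""
    problems = []
    slug = criterion.get("slug", "")
    if not SLUG_PATTERN.match(slug):
        problems.append("slug may only contain lowercase letters and underscores")
    if slug in existing_slugs:
        problems.append("slug must be unique within the criteria")
    if not 1 <= criterion.get("weight", 0) <= 999:
        problems.append("weight must be a whole number between 1 and 999")
    return problems

# Example criterion, modeled on the "Persona or role" default criterion.
example = {
    "slug": "persona_role",
    "label": "Persona or role",
    "description": "Defines a relevant and aligned persona or role.",
    "questions": ["Does the prompt define a relevant persona or role?"],
    "weight": 10,
}
print(validate_criterion(example, {"use_case_fit"}))  # → []
```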
Fill out the form with the necessary information and click the green Add criterion button to save. Repeat as many times as needed to add all of the criteria for the set.
Tip: Select an option from the Start from a default criterion dropdown list to prepopulate the form with its respective values.
Add Criterion from Default Criteria
You can also jumpstart your custom criteria by copying criteria from the default set. To do this, click the down arrow to the right of the Add criterion button and select the Copy from system default option. This prepopulates your custom criteria with the full set of default criteria.

Generate Scoring Criteria
To save yourself a lot of time, you can let Zatomic generate scoring criteria for you. On the New Scoring Criteria screen, give the criteria a name and enter a use case. The use case is required when generating scoring criteria. If the criteria is for an existing prompt, you can select it from the dropdown list and its use case will appear in the textbox. Then click the purple Generate for me button.

This will display the Generating Criteria modal box. After a few seconds, a set of criteria specific to your use case will be created, which you can use as-is or customize as needed.

Generate Scoring Criteria with Different Models
To change which AI model is used to generate the criteria, click the Settings link at the top-right of the screen.

This will open a side panel where you can choose different models to generate scoring criteria. Make your model selection, click the green Apply button, then click the purple Generate for me button.

Generate Additional Scoring Criteria
Once you've created scoring criteria, you can generate additional criteria based on its use case. This is a quick and easy way to see multiple criteria that could be useful for your specific requirements.
Generating additional criteria takes your existing criteria into account and ensures that only criteria different from what you already have will be generated.
To generate additional criteria, go to the Generate tab of the scoring criteria, then click the purple Generate criteria button. Note that you can specify which AI model to use by opening the Settings panel.

After a few seconds, a new set of criteria will be displayed. You can keep the entire set by clicking the green Keep this criterion set button, which adds it to your existing criteria, or click the purple Generate criteria button again to generate a new set.

Copy Scoring Criteria
To copy a set of scoring criteria, go to the Scoring Criteria list page and click the 3 vertical dots at the far right of the criteria you want to copy, then select the Copy option.

This will open the Copy Scoring Criteria modal box. Enter a name for the new criteria, then click the green Copy criteria button.

Delete Scoring Criteria
To delete a set of criteria, go to the Scoring Criteria list page and click the 3 vertical dots at the far right of the criteria you want to delete, then select the Delete option.

This will open the Delete Scoring Criteria modal box for you to confirm deletion.

Delete Single Criterion
To delete a single criterion from a set of criteria, click the 3 vertical dots at the far right of the criterion you want to delete, then select the Delete option.

This will bring up the Delete Scoring Criterion modal box for you to confirm deletion.

Update Use Case for Scoring Criteria
To update the use case for a set of scoring criteria, go to its Info tab. Modify the use case field as needed then click the green Update criteria button.

Default Scoring Criteria
When scoring a prompt, every version starts out with the same set of default criteria, which can be applied to any prompt in any scenario. For each criterion in the set, its questions are used to determine how well the prompt satisfies it, and its weight is used to calculate a weighted score, which contributes to the overall prompt score.
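As an illustrative sketch only (the exact scoring formula is not documented here), a weighted overall score is commonly computed as the weight-normalized average of the per-criterion scores. The function and the 0-100 score scale below are assumptions; Zatomic's actual calculation may differ.

```python
def overall_score(criterion_scores: dict, weights: dict) -> float:
    """Weighted average: each criterion's score (assumed 0-100) scaled by its weight.

    Hypothetical sketch, not Zatomic's documented formula.
    """
    total_weight = sum(weights[slug] for slug in criterion_scores)
    weighted_sum = sum(score * weights[slug] for slug, score in criterion_scores.items())
    return weighted_sum / total_weight

# Example using three of the default criteria and their weights from the table below:
weights = {"use_case_fit": 15, "specificity": 10, "one_example": 5}
scores = {"use_case_fit": 90.0, "specificity": 80.0, "one_example": 60.0}
print(round(overall_score(scores, weights), 1))  # → 81.7
```

Because the average is normalized by total weight, a heavily weighted criterion such as use_case_fit (weight 15) pulls the overall score toward its value more than a lightly weighted one like one_example (weight 5).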
The default set of criteria is as follows:
Criterion | Description | Questions | Weight
---|---|---|---
Constraints and restrictions | Sets contextual, functional, and ethical boundaries. | Does the prompt define boundaries, such as ethical, contextual, or functional constraints? | 10 |
Diverse examples | Examples represent diverse scenarios, including edge cases. | Do the examples cover diverse scenarios and edge cases? | 5 |
Expected output | Specifies the format and style of expected outputs. | Does the prompt specify the format, structure, or characteristics of the output? | 5 |
Five examples | No more than five examples are provided. | Does the prompt include up to five examples to demonstrate its use in various scenarios? | 5 |
One example | At least one example is provided. | Does the prompt provide at least one example to clarify its purpose? | 5 |
Persona or role | Defines a relevant and aligned persona or role. | Does the prompt define a relevant persona or role, such as an expert or assistant? | 10 |
Relevant examples | Examples are relevant to the prompt and support the use case. | Are the examples aligned with the use case and the prompt's purpose? | 5 |
Rules and instructions | Provides clear and actionable rules or instructions. | Are the rules and instructions clear, actionable, and specific? | 10 |
Sample inputs and outputs | Examples include sample inputs and outputs for improved context. | Do the examples provide clear input-output pairs? | 5 |
Specificity | Avoids vagueness; is detailed and unambiguous. | Is the prompt detailed and unambiguous? | 10 |
Task and purpose | Clearly defines the task, purpose, and goals. | Does the prompt define a clear task or purpose? | 10 |
Tone, style, and language | Specifies tone, style, and language appropriate for the task. | Is the tone appropriate, and does the style align with the intended audience or context? | 5 |
Use case fit | Satisfies the intended use case and purpose. | Does the prompt align with its intended purpose? Does it clearly address the use case? | 15 |