Test Prompts
Zatomic provides two ways to test your prompts:
- Prompt chat: Lets you test the AI outputs and responses for each prompt version in an interactive chat.
- Response alignment: Gives you a score and analysis of how well an AI response aligns with the intent and expectations of the prompt, its use case, the project brief (if applicable), and the user query (see the sketch after this list).
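
For teams exercising response alignment programmatically rather than through the UI, here is a minimal sketch of what such a call might look like. It is illustrative only: the endpoint path, request fields, and response keys are assumptions, not Zatomic's documented API; see the API reference for the actual contract.

```python
import requests

# Hypothetical sketch: the route, field names, and response keys below are
# illustrative assumptions, not Zatomic's documented API contract.
API_BASE = "https://api.zatomic.ai/v1"  # assumed base URL
API_KEY = "YOUR_API_KEY"                # credential from your API client

payload = {
    "prompt_version_id": "ver_123",  # hypothetical version identifier
    "user_query": "Summarize last quarter's sales report in three bullets.",
    "ai_response": "Q3 revenue rose 12%...",  # the response to evaluate
}

resp = requests.post(
    f"{API_BASE}/response-alignment",  # hypothetical route
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()

result = resp.json()
print("Alignment score:", result.get("score"))
print("Analysis:", result.get("analysis"))
```

Under these assumptions, the returned score quantifies alignment and the analysis explains where the response drifts from the prompt's intent, use case, or project brief.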