Token Count and Cost
When editing a prompt, you'll see a sidebar on the right that shows the number of input tokens used by the prompt, along with the cost per use, the cost per 1,000 uses, and the cost per 1M uses (see the sketch after this list for how the three figures relate):
- Cost per use is how much it will cost you each time the prompt is used.
- Cost per 1,000 is how much it will cost you if you use the prompt one thousand times.
- Cost per 1M is how much it will cost you if you use the prompt one million times.
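To make the arithmetic behind these figures concrete, here is a minimal sketch in Python. The token count and per-token price below are hypothetical placeholders; in the tool, both come from the selected model and its current pricing.

```python
# Minimal sketch of how the three cost figures relate.
# The token count and price are hypothetical placeholders; the tool
# derives the real values from the selected model and its current
# per-token pricing.

input_tokens = 250            # input tokens in the prompt (hypothetical)
price_per_1m_tokens = 3.00    # USD per 1M input tokens (hypothetical)

cost_per_use = input_tokens * price_per_1m_tokens / 1_000_000
cost_per_1k_uses = cost_per_use * 1_000
cost_per_1m_uses = cost_per_use * 1_000_000

print(f"Cost per use:        ${cost_per_use:.6f}")
print(f"Cost per 1,000 uses: ${cost_per_1k_uses:.4f}")
print(f"Cost per 1M uses:    ${cost_per_1m_uses:.2f}")
```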
You can compare token counts and costs across models by changing the model in the dropdown list. Switch to the exact model you plan to use with the prompt so that you can budget for your input token costs.
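For instance, here is a sketch of comparing the per-use cost of the same prompt across two models. The model names and prices are hypothetical, and the token count is held fixed for simplicity, even though in practice different models may tokenize the same prompt differently (which is why the tool recalculates the count when you switch models).

```python
# Sketch of comparing per-use cost across models. Model names and
# prices are hypothetical placeholders, not the tool's pricing data.
# The token count is held fixed here, although different tokenizers
# may count the same prompt differently.

input_tokens = 250  # hypothetical

prices_per_1m_tokens = {   # USD per 1M input tokens (hypothetical)
    "model-a": 3.00,
    "model-b": 0.50,
}

for model, price in prices_per_1m_tokens.items():
    cost_per_use = input_tokens * price / 1_000_000
    print(f"{model}: ${cost_per_use:.6f} per use, "
          f"${cost_per_use * 1_000_000:.2f} per 1M uses")
```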
The token count and cost figures stay accurate because they are based on real-time queries to the model and current token prices.
