DeepSeek API Model Specifications
Model Comparison
| Model | Context Length | Max CoT Tokens | Max Output Tokens | Input Price (per 1M tokens) | Output Price (per 1M tokens) |
|---|---|---|---|---|---|
| deepseek-chat | 64K | - | 8K | $0.30 | $1.80 |
| deepseek-reasoner | 64K | 32K | 8K | $0.55 | $2.20 |
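
For quick client-side checks, the figures above can be mirrored in a small lookup structure. This is only an illustrative sketch: the model names and numbers are copied from the table, the field names are arbitrary, and prices may change, so treat the official documentation as authoritative.

```python
# Model specifications mirrored from the comparison table above.
# Prices are USD per 1M tokens; "64K"/"8K" are the listed limits, and the
# exact token counts may differ slightly from these round numbers.
MODEL_SPECS = {
    "deepseek-chat": {
        "context_length": 64_000,
        "max_cot_tokens": None,        # no separate chain-of-thought budget
        "max_output_tokens": 8_000,
        "input_price_per_1m": 0.30,
        "output_price_per_1m": 1.80,
    },
    "deepseek-reasoner": {
        "context_length": 64_000,
        "max_cot_tokens": 32_000,
        "max_output_tokens": 8_000,
        "input_price_per_1m": 0.55,
        "output_price_per_1m": 2.20,
    },
}
```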
Important Notes
Model Versions
- deepseek-chat corresponds to DeepSeek-V3
- deepseek-reasoner corresponds to DeepSeek-R1
Chain of Thought (CoT)
- CoT represents the reasoning process generated by deepseek-reasoner before producing the final answer
- These intermediate reasoning steps are included in the token count and pricing (see the sketch below)
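
As an illustration of how this typically looks from an OpenAI-compatible client, the sketch below reads the reasoning steps separately from the final answer. The reasoning_content field and the base URL are assumptions based on common usage, so confirm both against the official API guide.

```python
# Sketch: reading the chain of thought separately from the final answer.
# Assumes the OpenAI-compatible Python SDK and a `reasoning_content` field
# on the reasoner's message -- verify both against the official API guide.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": "What is 17 * 24?"}],
)

message = response.choices[0].message
cot = getattr(message, "reasoning_content", None)  # intermediate reasoning (billed as output)
answer = message.content                           # final answer
print(cot, answer, sep="\n---\n")
```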
Token Management
- Default maximum output length is 4K tokens if max_tokens is not specified
- Adjust the max_tokens parameter as needed for longer outputs (see the example below)
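
A minimal sketch of raising the output cap by passing max_tokens explicitly, assuming the OpenAI-compatible chat completions endpoint; adapt the client setup to your own environment.

```python
# Sketch: requesting a longer completion by setting max_tokens explicitly.
# Without this parameter the output is capped at roughly 4K tokens by default.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Summarize this document in detail."}],
    max_tokens=8000,  # raise toward the 8K cap when longer outputs are needed
)
print(response.choices[0].message.content)
```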
Pricing Structure
- For deepseek-reasoner, both CoT tokens and final answer tokens are counted and priced at the same rate
- Pricing is calculated based on the total token count, including all reasoning steps (a worked example follows this list)
- This pricing is temporary; for the time being it will only be adjusted downward.
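
To make the billing rule concrete, the sketch below estimates the cost of one deepseek-reasoner request using the rates from the table above, treating CoT tokens as ordinary output tokens as described in this section.

```python
def estimate_cost(input_tokens: int, cot_tokens: int, answer_tokens: int,
                  input_price_per_1m: float = 0.55,
                  output_price_per_1m: float = 2.20) -> float:
    """Rough cost estimate (USD) for one deepseek-reasoner request.

    CoT tokens and final-answer tokens are billed at the same output rate,
    so they are summed before applying the output price.
    """
    input_cost = input_tokens / 1_000_000 * input_price_per_1m
    output_cost = (cot_tokens + answer_tokens) / 1_000_000 * output_price_per_1m
    return input_cost + output_cost

# Example: 2,000 prompt tokens, 10,000 CoT tokens, 1,000 answer tokens.
print(f"${estimate_cost(2_000, 10_000, 1_000):.4f}")  # ~$0.0253
```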
Additional Resources
For comprehensive API documentation and user guides for advanced features, please visit the DeepSeek Official API Guide.