DeepSeek-R1 only gets an 8k context length through the API

The documentation says it has a 16k context window:
Supported models

But both the API and the Playground have an 8k context length.

How can I get the 16k context?

Thanks,
Joe

Hi @nragfr640,

Thank you for bringing this to our attention. We’ve received your message and will investigate the issue from our end. We’ll keep you informed with any updates as we make progress.

Please feel free to reach out if you have any further information or questions.

Best regards,
Shivani

Hi @nragfr640,
Through our API testing, we found that the model supports a sequence length of up to 16k. However, the Playground appears to support only up to 8k. Thank you for bringing this to our attention; we'll continue to look into the discrepancy.
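
If you'd like to verify this on your side, here is a minimal sketch of a prompt-length probe, assuming an OpenAI-compatible chat completions endpoint reached through the openai Python package. The base URL, API key variable, and model ID are placeholders you would replace with the values for your own deployment, and the token estimate is only approximate.

```python
# Minimal sketch: probe the usable context window of an OpenAI-compatible
# chat endpoint by sending progressively longer prompts and watching for
# context-length errors. Base URL, API key variable, and model ID below
# are placeholders -- substitute the values for your own deployment.
import os

from openai import OpenAI

client = OpenAI(
    base_url=os.environ["API_BASE_URL"],  # placeholder: your provider's endpoint
    api_key=os.environ["API_KEY"],        # placeholder: your API key variable
)

MODEL = "DeepSeek-R1"  # adjust to the model ID your provider exposes


def probe(prompt_tokens: int) -> bool:
    # One repeated filler word is roughly one token for most tokenizers,
    # so this is only a coarse estimate of the prompt length.
    filler = "hello " * prompt_tokens
    try:
        client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": filler}],
            max_tokens=16,
        )
        return True
    except Exception as exc:  # context-length violations surface as API errors
        print(f"~{prompt_tokens} prompt tokens rejected: {exc}")
        return False


for size in (4_000, 8_000, 12_000, 15_000):
    status = "accepted" if probe(size) else "rejected"
    print(f"~{size} prompt tokens: {status}")
```

If requests around 12k-15k prompt tokens are accepted through the API but not in the Playground, that would match the discrepancy described above.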

Best regards,
Shivani