Context Length is Short

The supported context length of each LLM model is too short. The 130K-token context length of https://chat.qwenlm.ai is really comfortable, and I hope the context length here can be increased as well.


Hi @NB2025

Thanks for your post. Feedback like this helps show support for the direction we plan to take in the future.

We are always working to increase the context lengths of all the models available in the cloud, so please keep an eye on our community announcements to find out when we have made updates.

Alex
