Context Length is Short

The supported context length of each LLM model is too short. The 130K-token context length of https://chat.qwenlm.ai is really comfortable, and I hope the context lengths here can be increased.

Hi @NB2025

Thanks for your post; it helps highlight support for the direction we will be taking in the future.

We are always working to increase the context lengths of all of our models available in the cloud. Please keep an eye on our community announcements to find out when we have updated the context lengths.

Alex

@NB2025

As an FYI, Meta-Llama-3.3-70B-Instruct now supports up to a 128k context length. Please note, as always, that if TTFT (time to first token) is your main concern, working with the shortest context length will always be fastest.
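For anyone who wants to try the longer context, here is a minimal sketch of calling Meta-Llama-3.3-70B-Instruct through an OpenAI-compatible chat endpoint. The base URL and API-key environment variable below are placeholders, not the provider's actual values; check your provider's documentation. It also illustrates the TTFT point: the model processes the entire prompt before emitting its first token, so a shorter prompt starts responding sooner.

```python
# Minimal sketch (not official docs): calling Meta-Llama-3.3-70B-Instruct
# via an OpenAI-compatible chat endpoint. base_url and the API_KEY env var
# are assumptions -- substitute your provider's real values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["API_KEY"],          # assumed env var name
)

# Keeping the prompt short keeps TTFT low: every input token must be
# processed before the first output token is generated.
response = client.chat.completions.create(
    model="Meta-Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Summarize this in one sentence: ..."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```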

-Coby