Qwen2.5 Coder is a great programming LLM, but the 8K context length is too small.
@NB2025 Thank you for joining the community. May I ask what context lengths you require for your projects?
-Coby
I would also opt for a 128K context length, since that is the model's original maximum.
A 128K context is generally beneficial for all models where possible.
From a load perspective, I understand the restrictions, but in most cases response time matters less than answer quality; I would accept one answer per minute in exchange for a longer context.
During evaluation, I think an extended context matters more than speed for finding feasible use cases. Once things move into production, everyone will likely switch to the paid option, which is billed per token.
The bigger, the better. Thank you.
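For what it's worth, here is a rough sketch of where the 128K figure comes from, assuming the publicly documented Qwen2.5 numbers: a native 32K window (`max_position_embeddings = 32768`) extended to 128K via YaRN rope scaling. The exact config keys should be checked against the model card before use.

```python
# Hedged sketch, based on the publicly documented Qwen2.5 numbers:
# native window of 32K tokens, extended to 128K via YaRN rope scaling.

native_context = 32_768    # documented native window (max_position_embeddings)
target_context = 131_072   # advertised 128K window

# YaRN extends the window by scaling the rotary position embeddings.
factor = target_context / native_context
print(factor)  # 4.0

# Shape of the rope_scaling entry suggested in the Qwen2.5 documentation
# (verify the exact keys against the current model card):
rope_scaling = {
    "type": "yarn",
    "factor": factor,
    "original_max_position_embeddings": native_context,
}
print(rope_scaling["type"], rope_scaling["factor"])
```

So serving the full 128K window is a deliberate configuration (and memory) choice on the host's side, not something the model itself lacks.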