Hi @hello1, the Meta-Llama-3.3-70B model currently supports a maximum context length of 4,096 tokens.
Thanks & regards