Could we know more deployment details for the Meta-Llama-3.3-70B-Instruct model?

Hi, we saw good performance on Llama 3.3 70B, especially on TPOT (time per output token).

Could you share more details, such as how many cards are used for the Llama 3.3 70B deployment?

It would be good to know how many cards are used, which data types, and how the endpoints are shared among users.
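For reference, by TPOT we mean the average gap between successive streamed tokens after the first one (time-to-first-token excluded). A minimal client-side sketch of how that number can be computed from token arrival timestamps (the function name and the sample timestamps are illustrative, not from any particular API):

```python
def tpot_seconds(token_timestamps):
    """Average time per output token: total span between the first
    and last streamed token, divided by the number of gaps.
    Excludes time-to-first-token, which is a separate metric."""
    if len(token_timestamps) < 2:
        raise ValueError("need at least two token timestamps")
    span = token_timestamps[-1] - token_timestamps[0]
    return span / (len(token_timestamps) - 1)

# Hypothetical arrival times (in seconds) for five streamed tokens
stamps = [0.00, 0.05, 0.11, 0.16, 0.20]
print(tpot_seconds(stamps))  # 0.05 s/token, i.e. ~20 tokens/s
```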

Thanks

Louie

Hello Louie,

We have received your inquiry regarding the Llama 3.3 70B deployment configuration. We will provide an update as soon as possible.

Best regards,
Virendra Jopale