Not able to use ChatOpenAI from langchain_openai

Hello Abjidge!
Thanks for stopping by to check us out; we're excited to have you.
Can you add streaming=True to your llm args and let us know if that works for you?

Here’s what I tried:

from langchain_openai import ChatOpenAI

# SAMBANOVA_API_KEY and SAMBANOVA_API_URL are defined elsewhere
# (e.g. loaded from environment variables).
llm = ChatOpenAI(
    model="Meta-Llama-3.1-8B-Instruct",
    api_key=SAMBANOVA_API_KEY,
    base_url=SAMBANOVA_API_URL,
    streaming=True,
)

messages = [
    ("system", "Give a short story for the human message."),
    ("human", "A trip to a small village."),
]

print(llm.invoke(messages).content)
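
If `streaming=True` works, you can also consume the reply token by token with `.stream()` instead of `.invoke()`. A minimal sketch, assuming the same `llm` and `messages` objects as above and a reachable SambaNova endpoint:

```python
# Iterate over the streamed response; each chunk is an AIMessageChunk
# carrying a fragment of the model's reply.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
print()
```

This surfaces output as it arrives, which also makes it easier to tell whether the request is failing up front or partway through the response.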