Not able to use ChatOpenAI from langchain_openai

The snippet below is not working:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    api_key=SAMBANOVA_API_KEY,
    model="Meta-Llama-3.1-8B-Instruct",
    base_url=SAMBANOVA_API_URL,
)

messages = [
    (
        "system",
        "Give a short story for human message.",
    ),
    ("human", "A trip to small village."),
]
ai_msg = llm.invoke(messages)

Error:
ValidationError: 1 validation error for ChatMessage
role
none is not an allowed value (type=type_error.none.not_allowed)

This limits using the LLM with the LangChain framework.

Hello Abjidge!
Thanks for stopping by to check us out; we're excited to have you.
Can you add streaming=True to your llm args and let us know if that works for you?

Here’s what I tried:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="Meta-Llama-3.1-8B-Instruct",
    api_key=SAMBANOVA_API_KEY,
    base_url=SAMBANOVA_API_URL,
    streaming=True,
)

messages = [
    (
        "system",
        "Give a short story for human message.",
    ),
    ("human", "A trip to small village."),
]

print(llm.invoke(messages).content)