Tool use via the OpenAI-compatible API does not support $ref for nested objects

I have been testing SambaNova as an LLM provider for one of my apps; however, the function calling API does not appear to be compatible with OpenAI's API spec. Specifically, it does not support $ref, which makes switching hard because it would require a major rewrite of my current pipelines.
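For context, this $defs/$ref shape is exactly what schema generators such as Pydantic emit for nested models, so avoiding it means hand-editing every generated schema. A minimal illustration (the WeatherQuery/UserDetails models below are just an example, not my actual code):

from pydantic import BaseModel


class UserDetails(BaseModel):
    name: list[str]
    age: int


class WeatherQuery(BaseModel):
    location: str
    unit: str | None = None
    user: UserDetails


# Nested models end up under $defs and are referenced with $ref
print(WeatherQuery.model_json_schema())
# {'$defs': {'UserDetails': {...}},
#  'properties': {..., 'user': {'$ref': '#/$defs/UserDetails'}}, ...}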

Here is a small example I made where I use $ref:

import os

from openai import OpenAI

# SambaNova Cloud's OpenAI-compatible endpoint
client = OpenAI(
    base_url="https://api.sambanova.ai/v1",
    api_key=os.environ["SAMBANOVA_API_KEY"],
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hey, I'm Alex, I am 33, what's the weather in San Francisco?"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location, customized to the user's details",
                "parameters": {
                    "type": "object",
                    "$defs": {
                        "UserDetails": {
                            "properties": {
                                "name": {
                                    "items": {"type": "string"},
                                    "type": "array",
                                },
                                "age": {"type": "integer"},
                            },
                            "required": ["score"],
                            "title": "Feedback",
                            "type": "object",
                        }
                    },
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                        "user": {"$ref": "#/$defs/UserDetails"},
                    },
                    "required": ["location"],
                },
            },
        }
    ],
    tool_choice="auto",
    temperature=0.1,
    top_p=0.1,
)

The response is the error "Unsupported JSON schema type: None", while the same schema works with OpenAI and with other OpenAI-compatible providers.

Replacing the $ref with the definition declared inline does work, but it requires rewriting the object definition:

response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {
            "role": "user",
            "content": "Hey, I'm Alex, I am 33, what's the weather in San Francisco?",
        },
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location, customized to the user's details",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                        "user": {
                            "properties": {
                                "name": {
                                    "items": {"type": "string"},
                                    "type": "array",
                                },
                                "age": {"type": "integer"},
                            },
                            "required": ["score"],
                            "title": "Feedback",
                            "type": "object",
                        },
                    },
                    "required": ["location"],
                },
            },
        }
    ],
    tool_choice="auto",
    temperature=0.1,
    top_p=0.1,
)

This returns the expected tool call (output truncated):

tool_calls=[ChatCompletionMessageToolCall(id='call_1a5280b5491b4125a6', function=Function(arguments='{"location":"San Francisco","unit":null,"user":{"age":33,"name":["Alex"]}}
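In the meantime, the stop-gap I am experimenting with is to inline the $defs programmatically right before the request goes out, so the pipelines that generate the schemas stay untouched. A minimal sketch, assuming only local, non-recursive #/$defs/... references (inline_refs is my own helper, not part of any SDK):

import copy


def inline_refs(schema: dict) -> dict:
    """Recursively replace {"$ref": "#/$defs/Name"} with a copy of the definition.

    Stop-gap only: assumes local, non-recursive refs that live under $defs.
    """
    defs = schema.get("$defs", {})

    def resolve(node):
        if isinstance(node, dict):
            ref = node.get("$ref")
            if isinstance(ref, str) and ref.startswith("#/$defs/"):
                # Substitute the referenced definition (which may itself contain refs)
                return resolve(copy.deepcopy(defs[ref.rsplit("/", 1)[-1]]))
            # Drop the $defs block and resolve everything else
            return {k: resolve(v) for k, v in node.items() if k != "$defs"}
        if isinstance(node, list):
            return [resolve(v) for v in node]
        return node

    return resolve(schema)


# Applied to each tool before the request is sent:
# tool["function"]["parameters"] = inline_refs(tool["function"]["parameters"])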

Still, I would really appreciate it if there were a way to use SambaNova easily, without requiring a major rewrite of my current pipelines.

Additionally, I have been testing DeepSeek-V3-0324, and although the model itself is capable of function calling and tool use, function calling does not seem to be supported for it through the API.

Thanks,


@clirimfurriku

First let me welcome you to our developer community.

Thank you for bringing this to our attention. I will file the applicable RFEs/defects to get engineering's attention on this.

-Coby


@coby.adams Thank you, please let me know if there are any updates


@clirimfurriku

Engineering is still working on it. They know what needs to be changed, but they are running a regression of many test scenarios, since any change to things like the chat template can have unanticipated results for other use cases.

-Coby

@clirimfurriku

I wanted to let you know that a fix is on the way. I cannot commit to a date, but it should be available soon.

-Coby
