OpenAI SDK compatibility when using the Vercel AI SDK with the OpenAI-compatible provider

I'm finding a few issues when integrating projects that use the Vercel AI SDK, all of which seem to be related to SambaNova's responses not being quite OpenAI compatible. For example, SambaNova's streaming responses are missing the index field on tool calls, which OpenAI's specification requires.

Could the team review the OpenAI spec and see how we can continue to move towards compatibility? That would massively help adoption, since so many projects are built on it. What we see as needed, for example:

  1. Add the required index field to each tool call in streaming responses

  2. Use exact function names; the names we provide appear to be transformed in the response. Example:

  • Sent: get_weather_data

  • Returned: weather_data

  • Pattern: appears to strip common prefixes like "get_"

  3. Follow OpenAI's function calling specification exactly, including all required fields

  4. Fix the streaming response format to match OpenAI's chunk structure (see the example chunks after this list). Example:

  • Response type: chat.completion.chunk (streaming)

  • Current structure: complete tool call delivered in a single chunk (non-streaming behavior)

  • Expected: progressive tool call construction across multiple chunks
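
For reference, this is roughly what a compliant stream of tool-call deltas looks like, written out as TypeScript objects (the id, function name, and argument values are made up for illustration; the shape follows OpenAI's chat.completion.chunk format):

```ts
// Illustrative only: OpenAI-style streaming tool-call deltas.
// The first delta carries index, id, type and the function name; later deltas
// carry only the index plus argument fragments that the client concatenates.
const exampleDeltas = [
  {
    choices: [{
      index: 0,
      delta: {
        tool_calls: [{
          index: 0, // required so the client can correlate fragments
          id: 'call_abc123',
          type: 'function',
          function: { name: 'get_weather_data', arguments: '' },
        }],
      },
      finish_reason: null,
    }],
  },
  {
    choices: [{
      index: 0,
      delta: {
        tool_calls: [{
          index: 0, // same index ties this fragment to the call above
          function: { arguments: '{"location":"Dublin"}' },
        }],
      },
      finish_reason: null,
    }],
  },
  {
    choices: [{ index: 0, delta: {}, finish_reason: 'tool_calls' }],
  },
];
```

The two things that matter here are that every tool-call delta carries an index, and that the arguments string arrives incrementally rather than as one complete chunk.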

If anybody has worked around this, that would be helpful to hear.
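
In case it helps anyone in the meantime, below is a minimal, untested workaround sketch, assuming you create the provider with createOpenAICompatible from @ai-sdk/openai-compatible and pass it a custom fetch. It rewrites each SSE data line to add a tool_calls index when it is missing. The baseURL, env var name, and the sambanova export are placeholders, and it doesn't address the function-name rewriting or the single-chunk tool calls:

```ts
// Workaround sketch (untested): patch missing tool_calls[].index fields in
// the SSE stream before the Vercel AI SDK parses it.
// baseURL / env var / export name below are placeholders, not official values.
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

const decoder = new TextDecoder();
const encoder = new TextEncoder();

// Rewrites a single SSE line, adding an index to each streamed tool call
// when the upstream response omitted it.
function patchSseLine(line: string): string {
  if (!line.startsWith('data:')) return line;
  const payload = line.slice(5).trim();
  if (payload === '' || payload === '[DONE]') return line;
  try {
    const json = JSON.parse(payload);
    for (const choice of json.choices ?? []) {
      const toolCalls = choice.delta?.tool_calls;
      if (Array.isArray(toolCalls)) {
        toolCalls.forEach((tc: { index?: number }, i: number) => {
          if (tc.index === undefined) tc.index = i; // OpenAI spec requires this
        });
      }
    }
    return `data: ${JSON.stringify(json)}`;
  } catch {
    return line; // leave anything we can't parse untouched
  }
}

// Custom fetch that pipes event-stream responses through the patcher.
const patchingFetch: typeof fetch = async (input, init) => {
  const response = await fetch(input, init);
  const contentType = response.headers.get('content-type') ?? '';
  if (!response.body || !contentType.includes('text/event-stream')) {
    return response;
  }
  let buffer = '';
  const transform = new TransformStream<Uint8Array, Uint8Array>({
    transform(chunk, controller) {
      buffer += decoder.decode(chunk, { stream: true });
      const lines = buffer.split('\n');
      buffer = lines.pop() ?? ''; // keep the trailing partial line for later
      if (lines.length > 0) {
        controller.enqueue(encoder.encode(lines.map(patchSseLine).join('\n') + '\n'));
      }
    },
    flush(controller) {
      if (buffer) controller.enqueue(encoder.encode(patchSseLine(buffer)));
    },
  });
  return new Response(response.body.pipeThrough(transform), {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers,
  });
};

export const sambanova = createOpenAICompatible({
  name: 'sambanova',
  baseURL: 'https://api.sambanova.ai/v1', // placeholder
  apiKey: process.env.SAMBANOVA_API_KEY,
  fetch: patchingFetch,
});
```

Obviously the real fix needs to happen server-side; this just keeps existing Vercel AI SDK tool-calling code working in the meantime.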

Hello @david.keane,

Thanks for bringing this to our attention. We're currently reviewing these issues and working to align with OpenAI's spec. We'll keep you updated as soon as we have more to share.

Appreciate your patience.

Regards,
Durgesh
