CORS Issue with SambaNova's API Endpoint

I keep getting a CORS error. Is there any way to solve this with a different endpoint or something?

5 Likes

Hi @alex

Thanks for using SambaNova Cloud. Could you please provide some further details on how you are receiving this error, including the endpoint URL you are currently calling?

Thanks & Regards

1 Like

It keeps returning "Failed to fetch" even with correct code. I've tried the old FastAPI endpoint and the new api.sambanova.ai one; both failed.

1 Like

Hey Alex,

Could you paste your code and Iā€™ll try to reproduce and correct anything if needed?

Thanks!

Seth

1 Like

@alex Welcome aboard, and I am sorry you are having issues getting your code working.

Can you let me know if this code worked against the older FastAPI endpoint? (The current FastAPI endpoint is just a redirect to our new SambaNova Cloud API.)

What programming language are you calling the API from? CORS errors are most often seen with JavaScript applications running in the browser.

Finally, as others have said, a code snippet that reproduces the issue would accelerate root-cause determination.

-Coby

For reference, I just tried a request with Open WebUI and it returned a response, so I think OP should paste the code so the issue can be pinpointed.

1 Like

const response = await fetch('https://api.sambanova.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer MYAPIKEY'
  },
  body: JSON.stringify({
    model: "Meta-Llama-3.1-405B-Instruct",
    messages: [
      {role: "user", content: prompt}
    ],
    max_tokens: 3000,
    temperature: 0.7,
    top_p: 1,
    top_k: null,
    stop: null,
    stream: false
  })
});

Here's the snippet of code that connects to the endpoint!

1 Like

The old FastAPI endpoint did work for me. All I did was switch out the endpoint, and it stopped working; the error shows that it's a CORS issue.

1 Like

Hey Alex, thanks for sending that.
First, let me be upfront: I'm not a JS developer by any stretch. That said, I used the 405B endpoint to come up with a solution that is tested and working. I'm not sure if this is sufficient for your needs, but here's what I've done:
I created a Node package and installed the required packages:

 npm init
 npm install express axios cors

Then I created a local proxy and index file to test:
proxy.js

const express = require('express');
const cors = require('cors');
const axios = require('axios');

const app = express();
app.use(cors());
app.use(express.json());

app.post('/v1/chat/completions', async (req, res) => {
  try {
    console.log('Received request:', req.body);
    const response = await axios.post('https://api.sambanova.ai/v1/chat/completions', req.body, {
      headers: {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer MYAPIKEY'
      }
    });
    console.log('API response:', response.data);
    res.json(response.data);
  } catch (error) {
    console.error('Error:', error);
    res.status(500).json({ message: 'Internal Server Error' });
  }
});

const port = 3000;
app.listen(port, () => {
  console.log(`Proxy server listening on port ${port}`);
});

index.html

<!DOCTYPE html>
<html>
<head>
  <title>Chat Completion</title>
</head>
<body>
  <h1>Chat Completion Response</h1>
  <div id="response"></div>

  <script>
    (async () => {
      try {
        const response = await fetch('http://localhost:3000/v1/chat/completions', {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json'
          },
          body: JSON.stringify({
            model: "Meta-Llama-3.1-405B-Instruct",
            messages: [
              {role: "user", content: "tell me a happy tale"}
            ],
            max_tokens: 3000,
            temperature: 0.7,
            top_p: 1,
            top_k: null,
            stop: null,
            stream: false
          })
        });
        const data = await response.json();
        console.log('Server response:', data);

        // Get the response div
        const responseDiv = document.getElementById('response');

        // Extract the message from the response data
        const message = data.choices[0].message.content;

        // Create a paragraph element to display the message
        const messageParagraph = document.createElement('p');
        messageParagraph.textContent = message;

        // Append the message paragraph to the response div
        responseDiv.appendChild(messageParagraph);
      } catch (error) {
        console.error('Error:', error);
      }
    })();
  </script>
</body>
</html>

This produces a webpage with the message content, see attached image.

Let me know if this is acceptable. More questions and comments are always welcome.
Thanks!
Seth

1 Like

Even I have been using proxies to get around this, but that only works well for testing and such.
E.g., I'm working on an Android app right now; it would be difficult to fit this into it (although I have not looked at your code yet; I'll have a look when I'm on my laptop, and it may be feasible).
Main point being: this is still a workaround, no? Shouldn't the endpoint be fixed on SambaNova's end?

edit: okay, my bad, I sounded a bit rude; I did not mean it to come off like that.

1 Like

Not rude at all. I'm asking internally to find out if we can address this from the backend. It might take some time, but I will get an answer for you. Appreciate the patience.

Thanks!
Seth

3 Likes

@alex and @gaurishyt I have logged a Jira ticket for this and will let you know engineering's findings.

2 Likes

Thank you Seth! It works for me, but I'm deploying to Vercel, and that doesn't allow a long-running server. I'm looking for a workaround, perhaps using serverless functions, but I haven't been able to get that working yet.
Anyway, it worked on my local machine, so thank you very much!
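In case it helps anyone else trying the same route: the shape I've been sketching is roughly the following (untested; it assumes Vercel's Python runtime convention of a `handler` class under `api/`, and a hypothetical `SAMBANOVA_API_KEY` environment variable):

```python
# api/proxy.py -- hypothetical Vercel serverless function (untested sketch).
# The page calls /api/proxy on its own origin, so the browser never makes a
# cross-origin request and CORS does not come into play.
import os
import urllib.request
from http.server import BaseHTTPRequestHandler

API_URL = "https://api.sambanova.ai/v1/chat/completions"


def build_upstream_request(body_bytes, api_key, url=API_URL):
    # Attach the API key server-side so it is never exposed to the browser.
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, data=body_bytes, headers=headers, method="POST")


class handler(BaseHTTPRequestHandler):  # Vercel's Python runtime looks for `handler`
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = build_upstream_request(body, os.environ["SAMBANOVA_API_KEY"])
        with urllib.request.urlopen(req) as upstream:
            data = upstream.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)
```

The frontend `fetch` would then target `/api/proxy` on the same origin instead of the SambaNova URL, so no CORS headers are needed at all.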

1 Like

Any update/ETA yet?

1 Like

+1 this. It's a major roadblock for me. Any update on this yet?

1 Like

I think there has to be a backend/proxy server to do this for you. Please correct me if I am wrong. Even the Node.js code shown by @seth.kneeland serves as a backend, not frontend JavaScript. I think it is reasonable to keep cross-origin requests blocked for safety reasons.

But this post may help to resolve it.

All,

To put a button on this thread: we are not going to allow CORS without a proxy at this time. Sample proxy server code is attached, and the README is below.

# CORS Proxy Server

This repository contains a simple CORS (Cross-Origin Resource Sharing) proxy server built using Python's built-in `http.server` module. It accepts `POST` requests, forwards them to a target API with an `Authorization` Bearer token, and returns the response to the client. This server can be used to bypass CORS restrictions for frontend clients.

## Features
- Allows `POST` requests from specified origins.
- Adds Bearer token authentication headers to outgoing requests.
- Supports response streaming for handling large responses.
- Handles `OPTIONS` requests to facilitate CORS preflight.

## Prerequisites
- Python 3.7 or higher
- `requests` library

Install the Python libraries:
```bash
pip install -r requirements.txt
```

## Setup

### Environment Variables

Set the following environment variables to configure the proxy:

- `BEARER_TOKEN`: The Bearer token used to authenticate requests to the target API.
- `TARGET_URL`: The target API URL to which incoming requests will be forwarded.
- `ALLOWED_ORIGINS`: Comma-separated list of allowed origins for CORS. By default, all origins (`*`) are allowed.

Example environment variable setup:

```bash
export BEARER_TOKEN="your_bearer_token_here"
export TARGET_URL="https://api.sambanova.ai/v1/chat/completions"
export ALLOWED_ORIGINS="http://localhost:3000, http://example.com"
```

### Script Execution

1. Run the server:

   ```bash
   python cors_proxy.py
   ```

   This will start the server on `http://localhost:5007`.

2. Check the server status. When the server starts, you should see the following message:

   ```
   CORS Proxy Server running on http://localhost:5007
   ```

## Usage

You can test the proxy server using `curl` or any HTTP client. The server will forward the request body to the configured target URL and return the response to the client.

Example `curl` command:

```bash
curl -H "Content-Type: application/json" \
     -d '{
        "stream": true,
        "model": "Meta-Llama-3.1-70B-Instruct",
        "max_tokens": 100,
        "messages": [
                {"role": "system", "content": "You are a helpful assistant"},
                {"role": "user", "content": "Hello"}
        ]
     }' \
     -X POST http://localhost:5007/v1/chat/completions
```

## Response

The server returns the response from the `TARGET_URL` API along with the following headers:

- `Access-Control-Allow-Origin: *` (or as configured)
- `Cache-Control: no-cache`
- `Connection: close`
- `Content-Type: text/event-stream` (or as returned by the target API)

## Handling CORS Preflight Requests

The server also handles `OPTIONS` requests for CORS preflight, responding with the following headers:

- `Access-Control-Allow-Origin`
- `Access-Control-Allow-Methods: POST, OPTIONS`
- `Access-Control-Allow-Headers: Content-Type, Authorization`
- `Access-Control-Max-Age: 3600`

## Logging

Server logs are available in the console, with error messages printed for any issues encountered during request handling.

## Troubleshooting

If you encounter issues, check the following:

1. Missing environment variables: ensure that `BEARER_TOKEN` and `TARGET_URL` are set correctly.
2. CORS errors: ensure that `ALLOWED_ORIGINS` includes your frontend application's origin.
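The attached `cors_proxy.py` is not reproduced inline, so here is a minimal sketch of what such a proxy could look like, based only on the behavior this README describes (unverified against the actual attachment):

```python
# Minimal CORS proxy sketch based on this README's description; the real
# cors_proxy.py in the attachment may differ.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests  # listed as a prerequisite above

TARGET_URL = os.environ.get("TARGET_URL", "https://api.sambanova.ai/v1/chat/completions")
BEARER_TOKEN = os.environ.get("BEARER_TOKEN", "")
ALLOWED_ORIGINS = [o.strip() for o in os.environ.get("ALLOWED_ORIGINS", "*").split(",")]


def cors_origin(request_origin, allowed):
    """Pick the Access-Control-Allow-Origin value, or None if the origin is not allowed."""
    if "*" in allowed:
        return "*"
    return request_origin if request_origin in allowed else None


class ProxyHandler(BaseHTTPRequestHandler):
    def _cors_headers(self):
        origin = cors_origin(self.headers.get("Origin", ""), ALLOWED_ORIGINS)
        if origin:
            self.send_header("Access-Control-Allow-Origin", origin)
        self.send_header("Access-Control-Allow-Methods", "POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type, Authorization")
        self.send_header("Access-Control-Max-Age", "3600")

    def do_OPTIONS(self):
        # CORS preflight: reply with the allow-* headers and no body.
        self.send_response(204)
        self._cors_headers()
        self.end_headers()

    def do_POST(self):
        # Forward the body to the target API with the Bearer token attached.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        upstream = requests.post(
            TARGET_URL,
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {BEARER_TOKEN}",
            },
            stream=True,  # supports streamed (SSE) responses
        )
        self.send_response(upstream.status_code)
        self._cors_headers()
        self.send_header("Content-Type", upstream.headers.get("Content-Type", "application/json"))
        self.send_header("Cache-Control", "no-cache")
        self.end_headers()
        for chunk in upstream.iter_content(chunk_size=8192):
            self.wfile.write(chunk)


if __name__ == "__main__":
    print("CORS Proxy Server running on http://localhost:5007")
    HTTPServer(("", 5007), ProxyHandler).serve_forever()
```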



[python-proxy.zip|attachment](upload://3iwNwM440gy9BerGZvyofFFF1JW.zip) (3.2 KB)
1 Like