Hello,
I'm trying to integrate your models into my local OpenWebUI instance.
I already have a Meta 3.1 external API working, but when I try the SambaNova API I get this error:
r.raise_for_status()
INFO [open_webui.env] Saving 'OLLAMA_BASE_URLS' to the database
INFO: 172.17.0.1:64361 - "GET /ollama/api/version HTTP/1.1" 200 OK
ERROR [open_webui.apps.openai.main] Connection error: 404, message='Attempt to decode JSON with unexpected mimetype: text/plain; charset=utf-8', url='https://api.sambanova.ai/v1/models'
Hope you can help me.
Thanks
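For anyone hitting the same error, here is a rough illustration of what is going on (a sketch, not OpenWebUI's actual code): the `/v1/models` URL answers 404 with a `text/plain` body, and the client refuses to JSON-decode a response whose declared mimetype is not JSON, producing the message in the log above.

```python
import json


def decode_json_response(status: int, content_type: str, body: str) -> dict:
    """Illustrative sketch of the client-side check that fails above:
    refuse to decode JSON when the server declares a non-JSON mimetype."""
    if "application/json" not in content_type:
        raise ValueError(
            f"Connection error: {status}, message='Attempt to decode JSON "
            f"with unexpected mimetype: {content_type}'"
        )
    return json.loads(body)


# A 404 + text/plain reply reproduces the error from the log:
# decode_json_response(404, "text/plain; charset=utf-8", "Not Found")
# A proper JSON reply decodes fine:
# decode_json_response(200, "application/json", '{"data": []}')
```

So the failure is not in your OpenWebUI configuration; the endpoint simply does not exist yet on the SambaNova side.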
@a.malaga
Welcome to the community. At this time we only offer the streaming inference API, but I will submit your request to have a model list API made available. In the meantime, this documentation topic contains the list of supported models:
Supported Models
Regards,
-Coby
Hello, thanks.
Yes, /models is "a must", since many tools use that GET call to retrieve the list of available models.
I'll try a workaround in the meantime.
Thanks
Thank you for your understanding.
@karan.srivastava FYI
If anyone else asks, you can "bypass" this with: pipelines/examples/pipelines/providers at main · open-webui/pipelines · GitHub
Select openai_pipeline.py and edit it for the API and the models listed here.
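Another stopgap, besides the pipeline above, is a tiny local shim that answers GET /v1/models in the OpenAI list format. This is only a sketch under assumptions: the model IDs below are placeholders (take the real ones from the Supported Models topic), the port is arbitrary, and a real setup would still need the chat endpoints proxied to (or pointed at) the actual SambaNova API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder model IDs -- replace with the real ones from the
# "Supported Models" documentation topic.
SUPPORTED_MODELS = ["Meta-Llama-3.1-8B-Instruct", "Meta-Llama-3.1-70B-Instruct"]


class ModelsShim(BaseHTTPRequestHandler):
    """Answers GET /v1/models with a static OpenAI-style model list."""

    def do_GET(self):
        if self.path == "/v1/models":
            payload = {
                "object": "list",
                "data": [
                    {"id": m, "object": "model", "owned_by": "sambanova"}
                    for m in SUPPORTED_MODELS
                ],
            }
            body = json.dumps(payload).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404, "Not Found")


# To run the shim locally, e.g. on port 8099:
# HTTPServer(("127.0.0.1", 8099), ModelsShim).serve_forever()
```

Since OpenWebUI uses a single base URL per connection for both the model list and chat, a models-only shim is mainly useful for tools that fetch /v1/models from a separately configurable URL; otherwise the pipeline approach above is the cleaner route.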