flask - Ollama + OpenWebUI on VPS : which endpoint for curl requests to embedding model? - Stack Overflow


I have a VPS with Ollama and OpenWebUI running. I want to use "granite-embedding:278m" as the embedding model for curl requests. I have the same setup running locally on my Mac, where the endpoint is http://localhost:11434/v1/embeddings. But I can't find any similar endpoint in the docs for the OpenWebUI API.
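For reference, this is roughly the request that works locally, sketched in Python with only the standard library. The payload shape ({"model": ..., "input": ...}) follows Ollama's OpenAI-compatible /v1/embeddings endpoint; the host is a parameter you would swap for the VPS address, and build_embedding_request is just a helper name for this sketch:

```python
# Minimal sketch of the embeddings call, stdlib only.
# Model name is from the question; the host is a placeholder parameter.
import json
import urllib.request


def build_embedding_request(host: str, model: str, text: str) -> urllib.request.Request:
    """Build a POST against Ollama's OpenAI-compatible embeddings endpoint."""
    payload = {"model": model, "input": text}
    return urllib.request.Request(
        f"http://{host}:11434/v1/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


req = build_embedding_request("localhost", "granite-embedding:278m", "hello world")
print(req.full_url)  # http://localhost:11434/v1/embeddings
# urllib.request.urlopen(req) would then return the JSON embedding response.
```

The equivalent curl would POST the same JSON body with a Content-Type: application/json header to the same URL.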

So I guess I need to use the Ollama API directly and put a Flask server in front of it. Are there any potential conflicts or incompatibilities I should be aware of?

Any better advice? Thank you in advance.
