Replies: 3 comments 11 replies
-
Hello! For the Ollama APIs, this project uses the OpenAI format for its API endpoints, so the API should be available under a different path. If you set the API URL accordingly, it should work.
-
For the record, I have it working with the proxied Ollama API; it works just fine. Set the URL to https://example.com/api/ollama/ and then set the API key in the settings afterwards.
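For anyone debugging a similar setup, it can help to rule out the app by sending a request to the proxied endpoint directly with the same base URL and key. A minimal sketch of such a request; the host, API key, and model name are placeholders, and the OpenAI-style /v1/chat/completions path is an assumption based on the maintainer's note that this project speaks the OpenAI format:

```python
import json
import urllib.request

BASE_URL = "https://example.com/api/ollama/"  # placeholder proxied base URL
API_KEY = "sk-placeholder"                    # placeholder OpenWebUI API key

def build_chat_request(base_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format chat request against the proxied endpoint."""
    url = base_url.rstrip("/") + "/v1/chat/completions"  # assumed route
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Sending the request (requires a reachable OpenWebUI instance):
# req = build_chat_request(BASE_URL, API_KEY, "some-model", "Hello")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If this direct request succeeds but the app still reports an error, the problem is likely in how the URL or key is entered in the app rather than in the server setup.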
-
I will close this as completed for now. If something still doesn't work, please let me know.
-
Hi people,
I'm trying to get GPTmobile (which seems to be a great app btw!) to work with my locally running AI which is built upon OpenWebUI.
OpenWebUI has a proxied API interface to access the Ollama API. When I execute a request against it directly, I get a correct answer.
In GPTassist, in the Ollama settings, I've set the API URL to http://my_openwebui_domain.local/ollama/api/, passed my API key, and set the API model to wizardlm-uncensored. However, when I start a chat and enter anything, the only answer I get is "Error: Unknown error". Did I miss something? I'm very grateful for any answer.