Can ollama be integrated?
Since there are many AI models and they update very quickly, this may be planned for a later release.
Thank you, looking forward to your plan.
I found that Ollama's API is compatible with OpenAI's, so you can use it by entering it in place of the OpenAI endpoint.
The URL should be:
http://localhost:11434/v1/chat/completions
However, you need to set an environment variable before starting Ollama:
OLLAMA_ORIGINS=*
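As a sketch of what this comment describes, the snippet below sends a chat request to Ollama's OpenAI-compatible endpoint using only the Python standard library. It assumes a local Ollama server is already running with `OLLAMA_ORIGINS=*` set, and the model name `llama3` is an illustrative placeholder, not something specified in this thread.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (from the comment above).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model, messages):
    """Build a JSON payload in the OpenAI chat-completions format."""
    return {
        "model": model,        # e.g. "llama3" -- a placeholder local model name
        "messages": messages,  # list of {"role": ..., "content": ...} dicts
    }

def chat(model, prompt):
    """Send one user message to a locally running Ollama server."""
    payload = build_chat_request(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The response follows the OpenAI schema: choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Because the endpoint mirrors OpenAI's schema, existing OpenAI client code can usually be pointed at this URL unchanged; only the base URL (and a dummy API key, if the client requires one) needs swapping.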
I will add presets for more OpenAI-compatible interfaces in the next version.