
local llm support #523

Open
itsPreto opened this issue Dec 5, 2023 · 2 comments

itsPreto commented Dec 5, 2023

Are there plans to support local backends instead of just OpenAI/Claude, etc.?

@microchipgnu

In theory, you could create your own integration and pass it as a component to the ChatProvider.

See https://docs.ai-jsx.com/guides/models#setting-the-model-explicitly

I haven't tried this myself yet, but I'm thinking about it.
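The suggestion above could be sketched roughly like this. Untested; it assumes `ChatProvider`, `ChatCompletion`, and `UserMessage` are importable from `ai-jsx/core/completion` as in the linked guide, and `MyLocalModel` is a hypothetical component you would write against your local backend:

```typescript
/** @jsxImportSource ai-jsx */
import { ChatProvider, ChatCompletion, UserMessage } from 'ai-jsx/core/completion';

// Hypothetical: a component you implement yourself that talks to a local backend.
import { MyLocalModel } from './my-local-model';

// ChatProvider routes every nested <ChatCompletion> through the given component,
// so nothing inside needs to know which backend is serving the requests.
function App() {
  return (
    <ChatProvider component={MyLocalModel}>
      <ChatCompletion>
        <UserMessage>Say hello from a local model.</UserMessage>
      </ChatCompletion>
    </ChatProvider>
  );
}
```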


microchipgnu commented Jan 16, 2024

Well, I took some time this evening to hack something together and ended up with an Ollama provider. Find it here: https://gist.github.com/microchipgnu/0f327e328c4e18e4549725b41ee37d84

The code could use a cleanup; it is based on the Replicate Llama2 provider.

I also included an index.tsx based on the ai-jsx-template, changed a bit to use the automation example I found in the docs.
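For anyone reading along: the core of a provider like this is just a call against Ollama's local HTTP API, which serves on port 11434 by default. A minimal sketch, not taken from the gist — the helper names here are mine:

```typescript
// Minimal client for Ollama's /api/chat endpoint on the default local port.

type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Build the JSON body for POST /api/chat; with stream:false the server
// answers with a single JSON object instead of a stream of chunks.
function toOllamaChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

async function ollamaChat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(toOllamaChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // Non-streaming responses carry the reply in data.message.content.
  return data.message.content;
}
```

A real provider component would wrap `ollamaChat` and map the conversation elements it receives into the `messages` array above.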
