Switching LLM to ChatGPT 4 Turbo Preview for Cody Pro users #210
I think you should be able to do this with the most recent update now. If you use
@tjdevries I cannot set the model; it still shows anthropic 2.0 as the model. Can you help me with that? I press M and select the model, but nothing changes after that.
Hi @tjdevries, can you help me take a look? I just paid for a month of sg usage to try it out, but found that I could not try the new Claude 3 model.
Sorry, I was away for a while, but I have updated sg.nvim and it should have the latest models. You should also be able to set the default model with:

```lua
require("sg").setup {
  accept_tos = true,
  chat = {
    default_model = "openai/gpt-4o",
  },
}
```

Let me know if that's not working!
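For anyone after the Claude 3 model mentioned earlier in the thread, the same `chat.default_model` option should work. This is a minimal sketch, not a confirmed identifier: the model string below is an assumption based on Sourcegraph's `provider/model` naming, so verify it against the in-chat model picker (`M`) before relying on it.

```lua
-- Minimal sketch: same setup call, pointing default_model at Claude 3.
-- NOTE: "anthropic/claude-3-opus-20240229" is an ASSUMED identifier;
-- check the model picker for the exact string your account exposes.
require("sg").setup {
  accept_tos = true,
  chat = {
    default_model = "anthropic/claude-3-opus-20240229",
  },
}
```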
I recently subscribed to Cody Pro, and I've been looking for how to switch to GPT-4 rather than the default LLM.
How do I switch the LLM to ChatGPT 4 Turbo Preview?
Any information would be appreciated, thank you!