3.0.20 breaks experimental_onFunctionCall in useChat #1351
Comments
I have just tested the OpenAI function-call example, including client-side execution (https://github.com/vercel/ai/tree/main/examples/next-openai/app/function-calling), and it works as expected. I'll check out your repository to see if I can find out what's different.
Workaround: you can use this to create the stream:

```ts
const stream = OpenAIStream(response, {
  async experimental_onFunctionCall() {
    return;
  },
});
```

I'll investigate how to automatically support function calls like this.
Thanks, the workaround works for us.
@johnson-liang thanks. No need for a PR; we're in the process of updating our docs, so your changes would get lost.
With the recently added …
Description

Steps to reproduce

Use `functionCallHandler` in client code to process the function call from the LLM.

The repository that can reproduce the issue: https://github.com/MrOrz/vercel-ai-function-bug/
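The flow the reproduction steps depend on (the handler is invoked, its result is appended to `messages`, and the chat is re-entered) can be sketched as a minimal, self-contained simulation. Everything below is a hypothetical illustration, assuming a fake `get_time` function; it is not the SDK's actual internals.

```typescript
// Simulated message and function-call shapes, loosely modeled on the
// ai@3.0.x chat types (illustrative only).
type Message = {
  role: 'user' | 'assistant' | 'function';
  content: string;
  name?: string;
};
type FunctionCall = { name: string; arguments: string };

// A handler in the spirit of `functionCallHandler` /
// `experimental_onFunctionCall`: execute the requested function and
// append its result so the conversation can continue.
async function functionCallHandler(
  messages: Message[],
  call: FunctionCall
): Promise<Message[]> {
  const args = JSON.parse(call.arguments) as { city?: string };
  const result =
    call.name === 'get_time'
      ? `time in ${args.city}: 12:00`
      : 'unknown function';
  return [...messages, { role: 'function', name: call.name, content: result }];
}

// One simulated chat turn: plain text becomes an assistant message,
// while a function call must route through the handler.
async function chatTurn(
  messages: Message[],
  modelOutput: string | FunctionCall
): Promise<Message[]> {
  if (typeof modelOutput === 'string') {
    return [...messages, { role: 'assistant', content: modelOutput }];
  }
  return functionCallHandler(messages, modelOutput);
}

chatTurn([{ role: 'user', content: 'What time is it in Taipei?' }], {
  name: 'get_time',
  arguments: '{"city":"Taipei"}',
}).then((messages) => {
  // With the 3.0.19 behavior, the handler runs and a `function`
  // message is appended after the user message.
  console.log(messages.map((m) => m.role).join(','));
});
```

The bug report is that in 3.0.20+ the handler step in this loop never fires, so no `function` message is ever appended.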
Expected result

This screenshot is made by pinning the ai SDK at version `3.0.19`:

- `functionCallHandler` is invoked correctly
- `messages` is updated and the chat API is invoked again, as expected

Actual result
This is the result of `3.0.20` and `3.0.22`:

Note that the console prints nothing; `functionCallHandler` is probably not called at all.

Code example
https://github.com/MrOrz/vercel-ai-function-bug/
Additional context

#1316 indicates that there is a breaking change in 3.0.20, but that issue is mostly about custom parsing of `StreamingTextResponse`. It does not mention that 3.0.20 also breaks `experimental_onFunctionCall`.