
3.0.20 breaks experimental_onFunctionCall in useChat #1351

Closed
MrOrz opened this issue Apr 15, 2024 · 5 comments
Labels: ai/ui, bug, wontfix


MrOrz commented Apr 15, 2024

Description

Steps to reproduce

  1. Follow the quickstart: https://sdk.vercel.ai/docs/getting-started
  2. Follow "Defining functions" and "Handling Function Calls on the Client" in the OpenAI functions doc (https://sdk.vercel.ai/docs/guides/providers/openai-functions#handling-function-calls-on-the-client), which implements functionCallHandler on the client to process function calls from the LLM; a rough client-side sketch is included below.

Repository that reproduces the issue: https://github.com/MrOrz/vercel-ai-function-bug/
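
For reference, this is roughly what the client side looks like when following that guide. It is a sketch only: the weather function and the message rendering are placeholder examples from the guide, not the actual code in the repro repository.

  'use client';

  import { useChat } from 'ai/react';
  import { FunctionCallHandler } from 'ai';
  import { nanoid } from 'nanoid';

  // Handle function calls streamed back from the model on the client.
  const functionCallHandler: FunctionCallHandler = async (
    chatMessages,
    functionCall,
  ) => {
    if (functionCall.name === 'get_current_weather') {
      // Append a `function` message with the result and return the extended
      // message list so the chat API is invoked again.
      return {
        messages: [
          ...chatMessages,
          {
            id: nanoid(),
            name: 'get_current_weather',
            role: 'function' as const,
            content: JSON.stringify({ temperature: 20, unit: 'celsius' }),
          },
        ],
      };
    }
  };

  export default function Chat() {
    const { messages, input, handleInputChange, handleSubmit } = useChat({
      api: '/api/chat',
      experimental_onFunctionCall: functionCallHandler,
    });

    return (
      <div>
        {messages.map(m => (
          <div key={m.id}>
            {m.role}: {m.content}
          </div>
        ))}
        <form onSubmit={handleSubmit}>
          <input value={input} onChange={handleInputChange} />
        </form>
      </div>
    );
  }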

Expected result

This screenshot was taken with the ai SDK pinned at version 3.0.19:
[screenshot]

  1. The console prints the function call arguments, indicating that functionCallHandler is invoked correctly.
  2. The function's return value is added to messages and the chat API is invoked again, as expected.

Actual result

This is the result with 3.0.20 and 3.0.22:
[screenshot]
Note that the console prints nothing; functionCallHandler is probably not called at all.

Code example

https://github.com/MrOrz/vercel-ai-function-bug/

Additional context

#1316 indicates that there is a breaking change in 3.0.20, but that issue is about custom parsing of StreamingTextResponse and does not mention that 3.0.20 also breaks experimental_onFunctionCall.

@lgrammel (Collaborator)

I have just tested the OpenAI function call example, including client-side execution (https://github.com/vercel/ai/tree/main/examples/next-openai/app/function-calling), and it works as expected. I'll check out your repository to see if I can find out what's different.

@lgrammel (Collaborator)

Workaround: you can use this to create the stream:

  const stream = OpenAIStream(response, {
    // No-op handler: the function call is still forwarded to the client,
    // where useChat's experimental_onFunctionCall picks it up.
    async experimental_onFunctionCall() {
      return;
    },
  });

I'll investigate how to automatically support function calls like this.
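
For context, a sketch of where the workaround goes in a route handler modeled on the OpenAI functions guide; the model and the function definition here are placeholders, not part of the fix itself.

  import OpenAI from 'openai';
  import { OpenAIStream, StreamingTextResponse } from 'ai';

  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  // Placeholder function definition for illustration; use your own.
  const functions = [
    {
      name: 'get_current_weather',
      description: 'Get the current weather for a location',
      parameters: {
        type: 'object',
        properties: { location: { type: 'string' } },
        required: ['location'],
      },
    },
  ];

  export async function POST(req: Request) {
    const { messages, function_call } = await req.json();

    const response = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      stream: true,
      messages,
      functions,
      function_call,
    });

    // The no-op callback is the workaround described above.
    const stream = OpenAIStream(response, {
      async experimental_onFunctionCall() {
        return;
      },
    });

    return new StreamingTextResponse(stream);
  }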

@johnson-liang

Thanks, the workaround works for us.
I can help send a PR to amend the documentation if the workaround will stick around for a while.

@lgrammel (Collaborator)

@johnson-liang thanks. No need for a PR, we're in the process of updating our docs, so your changes would get lost.

@lgrammel (Collaborator)

With the recently added useChat / streamText tool calling revamp (see https://sdk.vercel.ai/docs/ai-sdk-ui/chatbot-with-tool-calling), experimental_onFunctionCall is now deprecated.
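
For anyone migrating, a rough sketch of the tool-calling approach from the linked docs; the weather tool is a placeholder, and the exact response helper name may differ between SDK versions.

  import { openai } from '@ai-sdk/openai';
  import { streamText, tool } from 'ai';
  import { z } from 'zod';

  export async function POST(req: Request) {
    const { messages } = await req.json();

    const result = await streamText({
      model: openai('gpt-4-turbo'),
      messages,
      tools: {
        // Placeholder tool; leaving out a server-side `execute` lets the
        // client handle the tool call, per the linked docs.
        getWeather: tool({
          description: 'Get the weather for a location',
          parameters: z.object({ location: z.string() }),
        }),
      },
    });

    // In ai 3.1 this helper was toAIStreamResponse(); later versions rename
    // it, so check your installed version.
    return result.toAIStreamResponse();
  }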
