Error in useChat when parsing stream causing empty messages from AI #1347
Duplicate of #1316. See #1316 (comment) for upgrade instructions.
I don't understand why this was resolved as a duplicate. The other issue seems to be about parsing the StreamingTextResponse manually, while this one is about using `useChat`.

I just installed the latest version (3.0.22), removed my package-lock file, node_modules folder, and .next build folder. Even after that, I got the same error as above.
Also got this error. It turns out that the versions of the `ai` package on the API and UI sides did not match. It seems that there is a breaking change in 3.0.20, so the API and UI must both use a matching version (>= 3.0.20). We really did not expect such an incompatibility in a patch version, though; it took us hours to debug and reach here.
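A minimal sketch of why mismatched versions produce empty messages, assuming the newer wire format frames each text chunk as a `0:<JSON string>\n` part (the part code and both helper names here are illustrative, not the SDK's actual code):

```javascript
// Assumed >= 3.0.20 framing: each text chunk becomes a `0:<JSON>\n` part.
function encodeTextPart(chunk) {
  return `0:${JSON.stringify(chunk)}\n`;
}

// A matching client extracts the text payloads and joins them.
function decodeTextParts(body) {
  return body
    .split("\n")
    .filter((line) => line.startsWith("0:"))
    .map((line) => JSON.parse(line.slice(2)))
    .join("");
}

const framed = encodeTextPart("Hello, ") + encodeTextPart("world!");
console.log(decodeTextParts(framed)); // "Hello, world!"

// A client expecting framed parts but receiving plain text (or vice
// versa) recovers nothing from the body.
console.log(decodeTextParts("Hello, world!")); // "" (empty string)
```

Either side being on the old format leaves the other unable to recover the text, which matches the empty AI messages reported in this issue.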
Okay, so it seems like the default mode for streaming responses is to include stream data now. I was able to get streaming to work with:

```js
const data = new StreamData();

const stream = await pipeline.stream(
  question,
  {
    callbacks: [{
      handleChainEnd: (_, __, parentRunId) => {
        if (parentRunId === undefined) data.close();
      },
    }],
  },
);

return new StreamingTextResponse(
  stream.pipeThrough(createStreamDataTransformer()),
  undefined,
  data,
);
```

@lgrammel, so we still need to manually close the StreamData object?
@mauriceackel in your first example, langchain returns a text stream that gets forwarded. I'm working on making useChat/useCompletion compatible with it again, see #1350. In your current example, this might be sufficient:

i.e. you might not need `data`; the important part is the stream transformation.
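As a runnable illustration of that transformation, here is a toy sketch: `toyStreamDataTransformer` and `collect` are hypothetical helpers written for this demonstration, and the `0:<JSON>\n` framing is an assumption about the stream data protocol, not the SDK's exact implementation.

```javascript
// Toy stand-in for the SDK's createStreamDataTransformer: re-frame each
// raw text chunk as an (assumed) `0:<JSON>\n` stream data protocol part.
function toyStreamDataTransformer() {
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(`0:${JSON.stringify(chunk)}\n`);
    },
  });
}

// Drain a stream into a single string (helper for the demo).
async function collect(stream) {
  let out = "";
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += value;
  }
}

// Simulate a LangChain-style plain text stream.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, ");
    controller.enqueue("world!");
    controller.close();
  },
});

collect(source.pipeThrough(toyStreamDataTransformer())).then((framed) => {
  console.log(framed); // each chunk framed on its own `0:"..."` line
});
```

The route handler then returns the transformed stream directly, with no StreamData object to close.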
This worked for me.
With this solution, this happens to me:
This works for me, no need to stub.
@pedrocarnevale This can happen when you use an older version of the AI SDK on the client. Please make sure you use >= 3.0.20 on the client (ideally the client version should match the server version).
@lgrammel I have this in my package.json file:
It depends on your project. If you use e.g. Next.js, there would most likely be only one package.json file for both server and client code.
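For example, in a single Next.js project one dependency entry covers both sides. A sketch (the version number is illustrative; the point is that server and client code resolve the same `ai` release):

```json
{
  "dependencies": {
    "ai": "3.0.22"
  }
}
```

Pinning an exact version (or running `npm ls ai` to inspect what is actually installed) avoids the server/client mismatch described above.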
Description
I was following the docs for implementing `useChat`, but the responses from the stream would not show in the `messages` array returned from `useChat`. I would only see the user's prompts in there and not the AI responses.

When I inspected the response from the chat endpoint, everything seemed to work fine and it was returning the correctly streamed AI response.
By inserting a log statement in the `onError` callback in `useChat`, I was able to get the following message:

Finally, I tried reverting to an older version and there it worked just fine. I tried bumping the versions and found out that the error started appearing in version 3.0.20.
Reverting to 3.0.19 fixed the issue.
Code example
API route
Client component
Additional context
No response