Commit a6cb2c8

Authored by lgrammel, AntzyMo, and github-actions[bot] on Jul 15, 2024

fix (ai/ui): keep last message in useChat on error (#2262)

Co-authored-by: AntzyMo <mozbano@163.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

1 parent 92217a8 commit a6cb2c8

File tree: 18 files changed, +443 −99 lines

.changeset/sour-points-fold.md (+10)

```diff
@@ -0,0 +1,10 @@
+---
+'@ai-sdk/ui-utils': patch
+'@ai-sdk/svelte': patch
+'@ai-sdk/react': patch
+'@ai-sdk/solid': patch
+'ai': patch
+'@ai-sdk/vue': patch
+---
+
+feat (ai/ui): add keepLastMessageOnError option to useChat
```

content/docs/05-ai-sdk-ui/02-chatbot.mdx (+112 −32)

````diff
@@ -24,7 +24,7 @@ import { useChat } from 'ai/react';
 
 export default function Page() {
   const { messages, input, handleInputChange, handleSubmit } = useChat({
-    api: 'api/chat',
+    keepLastMessageOnError: true,
   });
 
   return (
@@ -35,13 +35,9 @@ export default function Page() {
           {message.content}
         </div>
       ))}
+
       <form onSubmit={handleSubmit}>
-        <input
-          name="prompt"
-          value={input}
-          onChange={handleInputChange}
-          id="input"
-        />
+        <input name="prompt" value={input} onChange={handleInputChange} />
         <button type="submit">Submit</button>
       </form>
     </>
@@ -50,60 +46,143 @@ export default function Page() {
 ```
 
 ```ts filename='app/api/chat/route.ts'
-import { type CoreMessage, streamText } from 'ai';
 import { openai } from '@ai-sdk/openai';
+import { convertToCoreMessages, streamText } from 'ai';
 
 // Allow streaming responses up to 30 seconds
 export const maxDuration = 30;
 
 export async function POST(req: Request) {
-  const { messages }: { messages: CoreMessage[] } = await req.json();
+  const { messages } = await req.json();
 
   const result = await streamText({
-    model: openai('gpt-4'),
+    model: openai('gpt-4-turbo'),
     system: 'You are a helpful assistant.',
-    messages,
+    messages: convertToCoreMessages(messages),
   });
 
   return result.toAIStreamResponse();
 }
 ```
 
-In the `Page` component, the `useChat` hook will request to your AI provider endpoint whenever the user submits a message. The messages are then streamed back in real-time and displayed in the chat UI.
+In the `Page` component, the `useChat` hook will request to your AI provider endpoint whenever the user submits a message.
+The messages are then streamed back in real-time and displayed in the chat UI.
+
+This enables a seamless chat experience where the user can see the AI response as soon as it is available,
+without having to wait for the entire response to be received.
 
-This enables a seamless chat experience where the user can see the AI response as soon as it is available, without having to wait for the entire response to be received.
+<Note>
+  `useChat` has a `keepLastMessageOnError` option that defaults to `false`. This
+  option can be enabled to keep the last message on error. We will make this the
+  default behavior in the next major release. Please enable it and update your
+  error handling/resubmit behavior.
+</Note>
 
 ## Customized UI
 
 `useChat` also provides ways to manage the chat message and input states via code, show loading and error states, and update messages without being triggered by user interactions.
 
-### Loading and error states
+### Loading State
 
-To show a loading spinner while the chatbot is processing the user's message, you can use the `isLoading` state returned by the `useChat` hook:
+The `isLoading` state returned by the `useChat` hook can be used for several
+purposes:
 
-```tsx
-const { isLoading, ... } = useChat()
+- To show a loading spinner while the chatbot is processing the user's message.
+- To show a "Stop" button to abort the current message.
+- To disable the submit button.
 
-return <>
-  {isLoading ? <Spinner /> : null}
-  ...
+```tsx filename='app/page.tsx' highlight="6,20-27,34"
+'use client';
+
+import { useChat } from 'ai/react';
+
+export default function Page() {
+  const { messages, input, handleInputChange, handleSubmit, isLoading } =
+    useChat({
+      keepLastMessageOnError: true,
+    });
+
+  return (
+    <>
+      {messages.map(message => (
+        <div key={message.id}>
+          {message.role === 'user' ? 'User: ' : 'AI: '}
+          {message.content}
+        </div>
+      ))}
+
+      {isLoading && (
+        <div>
+          <Spinner />
+          <button type="button" onClick={() => stop()}>
+            Stop
+          </button>
+        </div>
+      )}
+
+      <form onSubmit={handleSubmit}>
+        <input
+          name="prompt"
+          value={input}
+          onChange={handleInputChange}
+          disabled={isLoading}
+        />
+        <button type="submit">Submit</button>
+      </form>
+    </>
+  );
+}
 ```
 
-Similarly, the `error` state reflects the error object thrown during the fetch request. It can be used to display an error message, or show a toast notification:
+### Error State
 
-```tsx
-const { error, ... } = useChat()
+Similarly, the `error` state reflects the error object thrown during the fetch request.
+It can be used to display an error message, disable the submit button, or show a retry button.
 
-useEffect(() => {
-  if (error) {
-    toast.error(error.message)
-  }
-}, [error])
+<Note>
+  We recommend showing a generic error message to the user, such as "Something
+  went wrong." This is a good practice to avoid leaking information from the
+  server.
+</Note>
 
-// Or display the error message in the UI:
-return <>
-  {error ? <div>{error.message}</div> : null}
-  ...
+```tsx file="app/page.tsx" highlight="6,18-25,31"
+'use client';
+
+import { useChat } from 'ai/react';
+
+export default function Chat() {
+  const { messages, input, handleInputChange, handleSubmit, error, reload } =
+    useChat({
+      keepLastMessageOnError: true,
+    });
+
+  return (
+    <div>
+      {messages.map(m => (
+        <div key={m.id}>
+          {m.role}: {m.content}
+        </div>
+      ))}
+
+      {error && (
+        <>
+          <div>An error occurred.</div>
+          <button type="button" onClick={() => reload()}>
+            Retry
+          </button>
+        </>
+      )}
+
+      <form onSubmit={handleSubmit}>
+        <input
+          value={input}
+          onChange={handleInputChange}
+          disabled={error != null}
+        />
+      </form>
+    </div>
+  );
+}
 ```
 
 ### Modify messages
@@ -175,6 +254,7 @@ const { reload, isLoading, ... } = useChat()
 return <>
   <button onClick={reload} disabled={isLoading}>Regenerate</button>
   ...
+</>
 ```
 
 When the user clicks the "Regenerate" button, the AI provider will regenerate the last message and replace the current one correspondingly.
````

content/docs/05-ai-sdk-ui/21-error-handling.mdx (+77 −13)

````diff
@@ -5,27 +5,91 @@ description: Learn how to handle errors in the AI SDK UI
 
 # Error Handling
 
-Errors can be handled by passing an [`onError`](/docs/reference/ai-sdk-ui/use-chat#on-error) callback function as an option to the [`useChat`](/docs/reference/ai-sdk-ui/use-chat), [`useCompletion`](/docs/reference/ai-sdk-ui/use-completion) or [`useAssistant`](/docs/reference/ai-sdk-ui/use-assistant) hooks.
+### Error Handling Callback
 
-Each AI SDK UI hook also returns an [error](/docs/reference/ai-sdk-ui/use-chat#error) object that you can use to render the error in your UI.
+Errors can be processed by passing an [`onError`](/docs/reference/ai-sdk-ui/use-chat#on-error) callback function as an option to the [`useChat`](/docs/reference/ai-sdk-ui/use-chat), [`useCompletion`](/docs/reference/ai-sdk-ui/use-completion) or [`useAssistant`](/docs/reference/ai-sdk-ui/use-assistant) hooks.
+The callback function receives an error object as an argument.
 
-```tsx
+```tsx file="app/page.tsx" highlight="7-10"
 import { useChat } from 'ai/react';
 
-const { ... } = useChat({
-  onError: error => {
-    // handle error
-    console.error(error);
-  },
-});
+export default function Page() {
+  const {
+    /* ... */
+  } = useChat({
+    onError: error => {
+      // handle error
+      console.error(error);
+    },
+  });
+}
 ```
 
+### Error Helper Object
+
 Each AI SDK UI hook also returns an [error](/docs/reference/ai-sdk-ui/use-chat#error) object that you can use to render the error in your UI.
+You can use the error object to show an error message, disable the submit button, or show a retry button.
+
+<Note>
+  We recommend showing a generic error message to the user, such as "Something
+  went wrong." This is a good practice to avoid leaking information from the
+  server.
+</Note>
+
+```tsx file="app/page.tsx" highlight="7,19-26,32"
+'use client';
 
-```tsx
 import { useChat } from 'ai/react';
 
-const { error } = useChat();
-if (error) return <div>{error.message}</div>;
-});
+export default function Chat() {
+  const { messages, input, handleInputChange, handleSubmit, error, reload } =
+    useChat({
+      keepLastMessageOnError: true,
+    });
+
+  return (
+    <div>
+      {messages.map(m => (
+        <div key={m.id}>
+          {m.role}: {m.content}
+        </div>
+      ))}
+
+      {error && (
+        <>
+          <div>An error occurred.</div>
+          <button type="button" onClick={() => reload()}>
+            Retry
+          </button>
+        </>
+      )}
+
+      <form onSubmit={handleSubmit}>
+        <input
+          value={input}
+          onChange={handleInputChange}
+          disabled={error != null}
+        />
+      </form>
+    </div>
+  );
+}
+```
+
+### useChat: Keep Last Message on Error
+
+`useChat` has a `keepLastMessageOnError` option that defaults to `false`.
+This option can be enabled to keep the last message on error.
+We will make this the default behavior in the next major release.
+Please enable it and update your error handling/resubmit behavior.
+
+### Injecting Errors for Testing
+
+You might want to create errors for testing.
+You can easily do so by throwing an error in your route handler:
+
+```ts file="app/api/chat/route.ts"
+export async function POST(req: Request) {
+  throw new Error('This is a test error');
+}
 ```
````
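The docs above recommend showing a generic error message rather than forwarding `error.message` to the browser. As an illustration only (the helper below is hypothetical, not part of the AI SDK), the idea can be sketched as:

```typescript
// Hypothetical helper (not part of the AI SDK): log the real error on the
// server, but only ever send a generic message toward the client.
function toClientErrorMessage(error: unknown): string {
  console.error(error); // full details stay server-side
  return 'Something went wrong.'; // no internals leak to the browser
}

console.log(toClientErrorMessage(new Error('connection string invalid')));
// prints the full error server-side, returns only the generic message
```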

content/docs/07-reference/ai-sdk-ui/01-use-chat.mdx (+6)

```diff
@@ -43,6 +43,12 @@ Allows you to easily create a conversational user interface for your chatbot app
     type: "string = '/api/chat'",
     description: 'The chat completion API endpoint offered by the provider.',
   },
+  {
+    name: 'keepLastMessageOnError',
+    type: 'boolean',
+    description:
+      'Keeps the last message when an error happens. This will be the default behavior starting with the next major release. The flag was introduced for backwards compatibility and currently defaults to `false`. Please enable it and update your error handling/resubmit behavior.',
+  },
   {
     name: 'id',
     type: 'string',
```

examples/next-openai/app/api/chat/route.ts (+2 −2)

```diff
@@ -1,5 +1,5 @@
 import { openai } from '@ai-sdk/openai';
-import { streamText } from 'ai';
+import { convertToCoreMessages, streamText } from 'ai';
 
 // Allow streaming responses up to 30 seconds
 export const maxDuration = 30;
@@ -11,7 +11,7 @@ export async function POST(req: Request) {
   // Call the language model
   const result = await streamText({
     model: openai('gpt-4-turbo'),
-    messages,
+    messages: convertToCoreMessages(messages),
     async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
       // implement your own logic here, e.g. for storing messages
       // or recording token usage
```
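Conceptually, `convertToCoreMessages` narrows the UI-side messages that `useChat` sends (which carry client fields such as `id` and `createdAt`) down to the role/content shape the model call expects. A simplified stand-in of that mapping, assuming those field names — the `toCoreMessages` helper below is hypothetical, not the library implementation:

```typescript
// Hypothetical simplified stand-in for convertToCoreMessages from 'ai':
// UI messages carry extra client-side fields that the model call does not
// need, so only role and content are kept.
type UIMessage = {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  createdAt?: Date;
};
type CoreMessage = { role: 'user' | 'assistant'; content: string };

function toCoreMessages(messages: UIMessage[]): CoreMessage[] {
  return messages.map(({ role, content }) => ({ role, content }));
}

const ui: UIMessage[] = [
  { id: 'm1', role: 'user', content: 'Hello', createdAt: new Date() },
];
console.log(toCoreMessages(ui));
// core messages contain only role and content
```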

examples/next-openai/app/page.tsx (+40 −1)

```diff
@@ -3,7 +3,19 @@
 import { useChat } from 'ai/react';
 
 export default function Chat() {
-  const { messages, input, handleInputChange, handleSubmit } = useChat();
+  const {
+    error,
+    input,
+    isLoading,
+    handleInputChange,
+    handleSubmit,
+    messages,
+    reload,
+    stop,
+  } = useChat({
+    keepLastMessageOnError: true,
+  });
+
   return (
     <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
       {messages.map(m => (
@@ -13,12 +25,39 @@ export default function Chat() {
         </div>
       ))}
 
+      {isLoading && (
+        <div className="mt-4 text-gray-500">
+          <div>Loading...</div>
+          <button
+            type="button"
+            className="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+            onClick={stop}
+          >
+            Stop
+          </button>
+        </div>
+      )}
+
+      {error && (
+        <div className="mt-4">
+          <div className="text-red-500">An error occurred.</div>
+          <button
+            type="button"
+            className="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+            onClick={() => reload()}
+          >
+            Retry
+          </button>
+        </div>
+      )}
+
       <form onSubmit={handleSubmit}>
         <input
           className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
           value={input}
           placeholder="Say something..."
           onChange={handleInputChange}
+          disabled={isLoading || error != null}
         />
       </form>
     </div>
```
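The updated examples disable the input both while a response is streaming and after an error, so the user must press Stop or Retry before typing again. That condition is just a predicate, sketched here outside of any UI framework:

```typescript
// Mirrors disabled={isLoading || error != null} from the example:
// the input is locked while a response streams or after a failed request.
function inputDisabled(isLoading: boolean, error: Error | undefined): boolean {
  return isLoading || error != null;
}

console.log(inputDisabled(false, undefined)); // false: ready for input
console.log(inputDisabled(true, undefined)); // true: response is streaming
console.log(inputDisabled(false, new Error('boom'))); // true: must retry first
```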

examples/nuxt-openai/pages/index.vue (+29 −1)

```diff
@@ -1,7 +1,12 @@
 <script setup lang="ts">
 import { useChat } from '@ai-sdk/vue';
+import { computed } from 'vue';
 
-const { messages, input, handleSubmit } = useChat();
+const { error, input, isLoading, handleSubmit, messages, reload, stop } =
+  useChat({
+    keepLastMessageOnError: true,
+  });
+const disabled = computed(() => isLoading.value || error.value != null);
 </script>
 
 <template>
@@ -11,11 +16,34 @@ const { messages, input, handleSubmit } = useChat();
       {{ m.content }}
     </div>
 
+    <div v-if="isLoading" class="mt-4 text-gray-500">
+      <div>Loading...</div>
+      <button
+        type="button"
+        class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+        @click="stop"
+      >
+        Stop
+      </button>
+    </div>
+
+    <div v-if="error" class="mt-4">
+      <div class="text-red-500">An error occurred.</div>
+      <button
+        type="button"
+        class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+        @click="() => reload()"
+      >
+        Retry
+      </button>
+    </div>
+
     <form @submit="handleSubmit">
       <input
         class="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
        v-model="input"
         placeholder="Say something..."
+        :disabled="disabled"
       />
     </form>
   </div>
```

examples/nuxt-openai/server/api/chat.ts (+10 −9)

```diff
@@ -1,22 +1,23 @@
-import { streamText } from 'ai';
+import { convertToCoreMessages, streamText } from 'ai';
 import { createOpenAI } from '@ai-sdk/openai';
 
 export default defineLazyEventHandler(async () => {
-  const apiKey = useRuntimeConfig().openaiApiKey;
-  if (!apiKey) throw new Error('Missing OpenAI API key');
-
   const openai = createOpenAI({
-    apiKey: apiKey,
+    apiKey: useRuntimeConfig().openaiApiKey,
   });
 
   return defineEventHandler(async (event: any) => {
-    // Extract the `prompt` from the body of the request
+    // Extract the `messages` from the body of the request
     const { messages } = await readBody(event);
 
-    // Ask OpenAI for a streaming chat completion given the prompt
+    // Call the language model
     const result = await streamText({
-      model: openai('gpt-3.5-turbo'),
-      messages,
+      model: openai('gpt-4-turbo'),
+      messages: convertToCoreMessages(messages),
+      async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
+        // implement your own logic here, e.g. for storing messages
+        // or recording token usage
+      },
     });
 
     // Respond with the stream
```

```diff
@@ -1,14 +1,21 @@
 import { openai } from '@ai-sdk/openai';
-import { convertToCoreMessages, streamText } from 'ai';
 import { APIEvent } from '@solidjs/start/server';
+import { convertToCoreMessages, streamText } from 'ai';
 
 export const POST = async (event: APIEvent) => {
+  // Extract the `messages` from the body of the request
   const { messages } = await event.request.json();
 
+  // Call the language model
   const result = await streamText({
-    model: openai('gpt-3.5-turbo'),
+    model: openai('gpt-4-turbo'),
     messages: convertToCoreMessages(messages),
+    async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
+      // implement your own logic here, e.g. for storing messages
+      // or recording token usage
+    },
   });
 
+  // Respond with the stream
   return result.toAIStreamResponse();
 };
```

examples/solidstart-openai/src/routes/index.tsx (+39 −1)

```diff
@@ -2,7 +2,18 @@ import { For } from 'solid-js';
 import { useChat } from '@ai-sdk/solid';
 
 export default function Chat() {
-  const { messages, input, handleInputChange, handleSubmit } = useChat();
+  const {
+    error,
+    input,
+    isLoading,
+    handleInputChange,
+    handleSubmit,
+    messages,
+    reload,
+    stop,
+  } = useChat({
+    keepLastMessageOnError: true,
+  });
 
   return (
     <div class="flex flex-col w-full max-w-md py-24 mx-auto stretch">
@@ -15,12 +26,39 @@ export default function Chat() {
         )}
       </For>
 
+      {isLoading() && (
+        <div class="mt-4 text-gray-500">
+          <div>Loading...</div>
+          <button
+            type="button"
+            class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+            onClick={stop}
+          >
+            Stop
+          </button>
+        </div>
+      )}
+
+      {error() && (
+        <div class="mt-4">
+          <div class="text-red-500">An error occurred.</div>
+          <button
+            type="button"
+            class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+            onClick={() => reload()}
+          >
+            Retry
+          </button>
+        </div>
+      )}
+
       <form onSubmit={handleSubmit}>
         <input
           class="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
           value={input()}
           placeholder="Say something..."
           onInput={handleInputChange}
+          disabled={isLoading() || error() != null}
         />
       </form>
     </div>
```

```diff
@@ -1,37 +1,74 @@
 <script lang="ts">
-  import { useChat } from '@ai-sdk/svelte'
+  import { useChat } from '@ai-sdk/svelte';
 
-  const { input, handleSubmit, messages } = useChat()
+  const {
+    error,
+    input,
+    isLoading,
+    handleSubmit,
+    messages,
+    reload,
+    stop
+  } = useChat({
+    keepLastMessageOnError: true,
+  });
 </script>
 
 <svelte:head>
-  <title>Home</title>
-  <meta name="description" content="Svelte demo app" />
+  <title>Home</title>
+  <meta name="description" content="Svelte demo app" />
 </svelte:head>
 
 <section>
-  <h1>useChat</h1>
-  <ul>
-    {#each $messages as message}
-      <li>{message.role}: {message.content}</li>
-    {/each}
-  </ul>
-  <form on:submit={handleSubmit}>
-    <input bind:value={$input} />
-    <button type="submit">Send</button>
-  </form>
+  <h1>useChat</h1>
+  <ul>
+    {#each $messages as message}
+      <li>{message.role}: {message.content}</li>
+    {/each}
+  </ul>
+
+  {#if $isLoading}
+    <div class="mt-4 text-gray-500">
+      <div>Loading...</div>
+      <button
+        type="button"
+        class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+        on:click={stop}
+      >
+        Stop
+      </button>
+    </div>
+  {/if}
+
+  {#if $error}
+    <div class="mt-4">
+      <div class="text-red-500">An error occurred.</div>
+      <button
+        type="button"
+        class="px-4 py-2 mt-4 text-blue-500 border border-blue-500 rounded-md"
+        on:click={() => reload()}
+      >
+        Retry
+      </button>
+    </div>
+  {/if}
+
+  <form on:submit={handleSubmit}>
+    <input bind:value={$input} disabled={$isLoading || $error != null} />
+    <button type="submit">Send</button>
+  </form>
 </section>
 
 <style>
-  section {
-    display: flex;
-    flex-direction: column;
-    justify-content: center;
-    align-items: center;
-    flex: 0.6;
-  }
-
-  h1 {
-    width: 100%;
-  }
+  section {
+    display: flex;
+    flex-direction: column;
+    justify-content: center;
+    align-items: center;
+    flex: 0.6;
+  }
+
+  h1 {
+    width: 100%;
+  }
 </style>
```

examples/sveltekit-openai/src/routes/api/chat/+server.ts (+9 −5)

```diff
@@ -1,5 +1,5 @@
 import { createOpenAI } from '@ai-sdk/openai';
-import { StreamingTextResponse, streamText } from 'ai';
+import { convertToCoreMessages, streamText } from 'ai';
 import type { RequestHandler } from './$types';
 
 import { env } from '$env/dynamic/private';
@@ -17,12 +17,16 @@ export const POST = (async ({ request }) => {
   // Extract the `prompt` from the body of the request
   const { messages } = await request.json();
 
-  // Ask OpenAI for a streaming chat completion given the prompt
+  // Call the language model
   const result = await streamText({
-    model: openai('gpt-4-turbo-preview'),
-    messages,
+    model: openai('gpt-4-turbo'),
+    messages: convertToCoreMessages(messages),
+    async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
+      // implement your own logic here, e.g. for storing messages
+      // or recording token usage
+    },
   });
 
   // Respond with the stream
-  return new StreamingTextResponse(result.toAIStream());
+  return result.toAIStreamResponse();
 }) satisfies RequestHandler;
```

packages/core/svelte/use-chat.ts (+6 −1)

```diff
@@ -83,6 +83,7 @@ const getStreamedResponse = async (
   onResponse: ((response: Response) => void | Promise<void>) | undefined,
   sendExtraMessageFields: boolean | undefined,
   fetch: FetchFunction | undefined,
+  keepLastMessageOnError: boolean | undefined,
 ) => {
   // Do an optimistic update to the chat state to show the updated messages
   // immediately.
@@ -141,7 +142,9 @@ const getStreamedResponse = async (
     },
     abortController: () => abortControllerRef,
     restoreMessagesOnFailure() {
-      mutate(previousMessages);
+      if (!keepLastMessageOnError) {
+        mutate(previousMessages);
+      }
     },
     onResponse,
     onUpdate(merged, data) {
@@ -179,6 +182,7 @@ export function useChat({
   body,
   generateId = generateIdFunc,
   fetch,
+  keepLastMessageOnError = false,
 }: UseChatOptions = {}): UseChatHelpers {
   // Generate a unique id for the chat if not provided.
   const chatId = id || `chat-${uniqueId++}`;
@@ -246,6 +250,7 @@ export function useChat({
       onResponse,
       sendExtraMessageFields,
       fetch,
+      keepLastMessageOnError,
     ),
   experimental_onFunctionCall,
   experimental_onToolCall,
```

packages/react/src/use-chat.ts (+8 −3)

```diff
@@ -96,9 +96,9 @@ const getStreamedResponse = async (
       }) => JSONValue)
     | undefined,
   fetch: FetchFunction | undefined,
+  keepLastMessageOnError: boolean,
 ) => {
-  // Do an optimistic update to the chat state to show the updated messages
-  // immediately.
+  // Do an optimistic update to the chat state to show the updated messages immediately:
   const previousMessages = messagesRef.current;
   mutate(chatRequest.messages, false);
 
@@ -161,7 +161,9 @@ const getStreamedResponse = async (
     },
     abortController: () => abortControllerRef.current,
     restoreMessagesOnFailure() {
-      mutate(previousMessages, false);
+      if (!keepLastMessageOnError) {
+        mutate(previousMessages, false);
+      }
     },
     onResponse,
     onUpdate(merged, data) {
@@ -197,6 +199,7 @@ export function useChat({
   body,
   generateId = generateIdFunc,
   fetch,
+  keepLastMessageOnError = false,
 }: UseChatOptions & {
   key?: string;
 
@@ -341,6 +344,7 @@ By default, it's set to 0, which will disable the feature.
         sendExtraMessageFields,
         experimental_prepareRequestBody,
         fetch,
+        keepLastMessageOnError,
       ),
     experimental_onFunctionCall,
     experimental_onToolCall,
@@ -407,6 +411,7 @@ By default, it's set to 0, which will disable the feature.
       abortControllerRef,
       generateId,
       fetch,
+      keepLastMessageOnError,
     ],
   );
 
```

packages/solid/src/use-chat.ts (+5 −1)

```diff
@@ -101,6 +101,7 @@ const getStreamedResponse = async (
   onToolCall: UseChatOptions['onToolCall'] | undefined,
   sendExtraMessageFields: boolean | undefined,
   fetch: FetchFunction | undefined,
+  keepLastMessageOnError: boolean,
 ) => {
   // Do an optimistic update to the chat state to show the updated messages
   // immediately.
@@ -139,7 +140,9 @@ const getStreamedResponse = async (
     },
     abortController: () => abortController,
     restoreMessagesOnFailure() {
-      mutate(previousMessages);
+      if (!keepLastMessageOnError) {
+        mutate(previousMessages);
+      }
     },
     onResponse,
     onUpdate(merged, data) {
@@ -258,6 +261,7 @@ export function useChat(
       useChatOptions().onToolCall?.(),
       useChatOptions().sendExtraMessageFields?.(),
       useChatOptions().fetch?.(),
+      useChatOptions().keepLastMessageOnError?.() ?? false,
     ),
   experimental_onFunctionCall:
     useChatOptions().experimental_onFunctionCall?.(),
```

packages/svelte/src/use-chat.ts (+6 −1)

```diff
@@ -83,6 +83,7 @@ const getStreamedResponse = async (
   onResponse: ((response: Response) => void | Promise<void>) | undefined,
   sendExtraMessageFields: boolean | undefined,
   fetch: FetchFunction | undefined,
+  keepLastMessageOnError: boolean | undefined,
 ) => {
   // Do an optimistic update to the chat state to show the updated messages
   // immediately.
@@ -141,7 +142,9 @@ const getStreamedResponse = async (
     },
     abortController: () => abortControllerRef,
     restoreMessagesOnFailure() {
-      mutate(previousMessages);
+      if (!keepLastMessageOnError) {
+        mutate(previousMessages);
+      }
     },
     onResponse,
     onUpdate(merged, data) {
@@ -176,6 +179,7 @@ export function useChat({
   body,
   generateId = generateIdFunc,
   fetch,
+  keepLastMessageOnError = false,
 }: UseChatOptions = {}): UseChatHelpers {
   // Generate a unique id for the chat if not provided.
   const chatId = id || `chat-${uniqueId++}`;
@@ -243,6 +247,7 @@ export function useChat({
       onResponse,
       sendExtraMessageFields,
       fetch,
+      keepLastMessageOnError,
     ),
   experimental_onFunctionCall,
   experimental_onToolCall,
```

packages/ui-utils/src/types.ts (+8)

```diff
@@ -312,6 +312,14 @@ The options to be passed to the fetch call.
 
 export type UseChatOptions = {
   /**
+Keeps the last message when an error happens. This will be the default behavior
+starting with the next major release.
+The flag was introduced for backwards compatibility and currently defaults to `false`.
+Please enable it and update your error handling/resubmit behavior.
+   */
+  keepLastMessageOnError?: boolean;
+
+  /**
    * The API endpoint that accepts a `{ messages: Message[] }` object and returns
    * a stream of tokens of the AI chat response. Defaults to `/api/chat`.
    */
```

packages/vue/src/use-chat.ts (+5 −2)

```diff
@@ -84,6 +84,7 @@ export function useChat({
   body,
   generateId = generateIdFunc,
   fetch,
+  keepLastMessageOnError = false,
 }: UseChatOptions = {}): UseChatHelpers {
   // Generate a unique ID for the chat if not provided.
   const chatId = id || `chat-${uniqueId++}`;
@@ -129,7 +130,7 @@ export function useChat({
 
   // Do an optimistic update to the chat state to show the updated messages
   // immediately.
-  const previousMessages = messagesData.value;
+  const previousMessages = messagesSnapshot;
   mutate(messagesSnapshot);
 
   const requestOptions = {
@@ -198,7 +199,9 @@ export function useChat({
     },
     restoreMessagesOnFailure() {
       // Restore the previous messages if the request fails.
-      mutate(previousMessages);
+      if (!keepLastMessageOnError) {
+        mutate(previousMessages);
+      }
     },
     generateId,
     onToolCall: undefined, // not implemented yet
```
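Across every framework package, the commit makes the same core change: the optimistic message update is rolled back in `restoreMessagesOnFailure` only when `keepLastMessageOnError` is off. A framework-agnostic sketch of that behavior, where `createStore` is a plain stand-in for the hooks' SWR-style `mutate` state:

```typescript
type Message = { role: string; content: string };

// Minimal stand-in for the hooks' message store; `mutate` replaces the
// current messages wholesale, mirroring the SWR-style mutate they use.
function createStore(initial: Message[]) {
  let messages = initial;
  return {
    mutate: (next: Message[]) => {
      messages = next;
    },
    get: () => messages,
  };
}

// Mirrors restoreMessagesOnFailure from the diff: roll back the optimistic
// update only when keepLastMessageOnError is disabled.
function restoreMessagesOnFailure(
  store: ReturnType<typeof createStore>,
  previousMessages: Message[],
  keepLastMessageOnError: boolean,
) {
  if (!keepLastMessageOnError) {
    store.mutate(previousMessages);
  }
}

// Simulate a failed request with the flag enabled and disabled.
const previous: Message[] = [{ role: 'user', content: 'hi' }];
const optimistic: Message[] = [...previous, { role: 'user', content: 'again' }];

const keep = createStore(optimistic);
restoreMessagesOnFailure(keep, previous, true);
console.log(keep.get().length); // 2: the last message survives the error

const discard = createStore(optimistic);
restoreMessagesOnFailure(discard, previous, false);
console.log(discard.get().length); // 1: rolled back to previousMessages
```

With the flag enabled, the failed submission stays visible, which is what lets the UI offer a meaningful Retry via `reload()` instead of silently dropping the user's input.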
