Comparing changes

Repository: vercel/ai
base: ai@3.2.17
compare: ai@3.2.18
  • 5 commits
  • 14 files changed
  • 5 contributors

Commits on Jul 8, 2024

  1. feat (ai/react): add setThreadId helper to switch between threads for useAssistant (#2209)

     jeremyphilemon authored Jul 8, 2024 · 70d1800
  2. feat (docs): document aborting streams (#2211)

     lgrammel authored Jul 8, 2024 · c27fec3

Commits on Jul 9, 2024

  1. chore(docs): fix packages/core readme typos (#2187)

     Co-authored-by: Lars Grammel <lars.grammel@gmail.com>
     elguarir and lgrammel authored Jul 9, 2024 · b9f2136
  2. chore (examples): Add example for useAssistant to illustrate thread management (#2212)

     jeremyphilemon authored Jul 9, 2024 · 7a382a6
  3. Version Packages (#2210)

     Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
     github-actions[bot] authored Jul 9, 2024 · 974d040
77 changes: 77 additions & 0 deletions content/docs/06-advanced/02-stopping-streams.mdx
@@ -0,0 +1,77 @@
---
title: Stopping Streams
description: Learn how to cancel streams with the Vercel AI SDK
---

# Stopping Streams

You will often need to cancel an ongoing stream,
for example when a user realizes that the response is not what they want.

The different parts of the Vercel AI SDK support cancelling streams in different ways.

## AI SDK Core

The AI SDK functions have an `abortSignal` argument that you can use to cancel a stream.
Use it when you want to cancel the stream between your server and the LLM API,
for example by forwarding the `abortSignal` from the incoming request.

```tsx highlight="10,11"
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const result = await streamText({
    model: openai('gpt-4-turbo'),
    prompt,
    // forward the abort signal:
    abortSignal: req.signal,
  });

  return result.toTextStreamResponse();
}
```
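Cancellation here relies on the standard `AbortSignal` API: when the incoming request's signal fires (e.g. because the client disconnected), the request to the LLM API is torn down. A minimal, self-contained sketch of that propagation, with no SDK involved; the `events` log is purely illustrative:

```typescript
// Standalone sketch of AbortSignal propagation (Node 18+ / browsers).
// streamText consumes a signal the same way: once the signal aborts,
// the outgoing request to the LLM API is cancelled.
const controller = new AbortController();
const { signal } = controller;

const events: string[] = [];

// A consumer registers interest in cancellation, just as the SDK
// attaches the signal to its outgoing request:
signal.addEventListener('abort', () => {
  events.push('aborted');
});

events.push('started');
controller.abort(); // e.g. the HTTP client disconnected

console.log(signal.aborted); // true
console.log(events.join(','));
```

The `abort` event fires synchronously, so any work guarded by `signal.aborted` stops on the next check.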

## AI SDK UI

The hooks, e.g. `useChat` or `useCompletion`, provide a `stop` helper function that you can use to cancel a stream.
This cancels the stream between the client and the server.

```tsx file="app/page.tsx" highlight="9,18-20"
'use client';

import { useCompletion } from 'ai/react';

export default function Chat() {
  const {
    input,
    completion,
    stop,
    isLoading,
    handleSubmit,
    handleInputChange,
  } = useCompletion();

  return (
    <div>
      {isLoading && (
        <button type="button" onClick={() => stop()}>
          Stop
        </button>
      )}
      {completion}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```
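Conceptually, `stop` aborts the in-flight request on the client, so remaining tokens are never delivered. A self-contained sketch of that behavior — `streamCompletion` and the token list are hypothetical stand-ins, not the hook's internals:

```typescript
// Simulate a token stream that checks an AbortSignal between chunks,
// mirroring how stop() cancels the client-side request mid-stream.
async function streamCompletion(
  signal: AbortSignal,
  onToken: (token: string) => void,
): Promise<void> {
  const tokens = ['Hello', ',', ' world', '.'];
  for (const token of tokens) {
    if (signal.aborted) return; // stop() was called: drop remaining tokens
    await Promise.resolve(); // stand-in for network delay between chunks
    onToken(token);
  }
}

const controller = new AbortController();
let text = '';

const done = streamCompletion(controller.signal, token => {
  text += token;
  // the user clicks "Stop" after the first token arrives:
  if (text === 'Hello') controller.abort();
});

done.then(() => console.log(text)); // only "Hello" was received
```

Everything streamed before the abort stays in `text`; later chunks are simply discarded.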

## AI SDK RSC

<Note type="warning">
The AI SDK RSC does not currently support stopping streams.
</Note>
6 changes: 6 additions & 0 deletions content/docs/07-reference/ai-sdk-ui/20-use-assistant.mdx
@@ -107,6 +107,12 @@ This works in conjunction with [`AssistantResponse`](./assistant-response) in th
      type: 'string | undefined',
      description: 'The current thread ID.',
    },
    {
      name: 'setThreadId',
      type: '(threadId: string | undefined) => void',
      description:
        "Set the current thread ID. Specifying a thread ID will switch to that thread, if it exists. If set to 'undefined', a new thread will be created. For both cases, `threadId` will be updated with the new value and `messages` will be cleared.",
    },
    {
      name: 'input',
      type: 'string',
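The `setThreadId` contract described above can be modeled as plain state logic. This is a simplified illustration of the documented behavior, not the hook's actual implementation:

```typescript
// Simplified model of useAssistant thread state: switching threads
// (or passing undefined to start a new one) clears the message list.
type Message = { role: 'user' | 'assistant'; content: string };

interface AssistantState {
  threadId: string | undefined;
  messages: Message[];
}

function setThreadId(
  state: AssistantState,
  threadId: string | undefined,
): AssistantState {
  // Both switching to an existing thread and starting a new one
  // (threadId === undefined) reset the local message history.
  return { threadId, messages: [] };
}

let state: AssistantState = {
  threadId: 'thread_abc', // hypothetical thread ID
  messages: [{ role: 'user', content: 'hi' }],
};

state = setThreadId(state, undefined); // start a new thread
console.log(state.threadId, state.messages.length); // undefined 0
```

In the real hook, the next `submitMessage` call then sends `threadId: null` for a new thread, or the selected ID for an existing one.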
@@ -0,0 +1,82 @@
import { Message, useAssistant } from 'ai/react';
import { useEffect, useState } from 'react';

export default function Page() {
  const {
    status,
    messages,
    input,
    submitMessage,
    handleInputChange,
    threadId,
    setThreadId,
  } = useAssistant({ api: '/api/assistant' });

  const [threads, setThreads] = useState<string[]>([
    'thread_wFjFAc6llmI2DaVvaRs6en0z',
    'thread_o1KXo6qCtb12A5GaVCx1X5YL',
    'thread_jrANWD0rR4QWoIV5Lxq6YFrD',
  ]);

  useEffect(() => {
    if (threadId !== undefined) {
      if (!threads.includes(threadId)) {
        setThreads([...threads, threadId]);
      }
    }
  }, [threadId, threads]);

  return (
    <div className="flex flex-row" style={{ height: '100dvh' }}>
      <div className="w-56 flex-shrink-0 flex flex-col gap-1 bg-zinc-100 p-2">
        <div
          className={`py-1 px-2 text-zinc-900 hover:bg-zinc-300 cursor-pointer rounded-md ${
            threadId === undefined ? 'bg-zinc-300 p-1' : ''
          }`}
          onClick={() => {
            setThreadId(undefined);
          }}
        >
          new thread
        </div>

        {threads.map((thread, index) => (
          <div
            key={thread}
            className={`py-1 px-2 text-zinc-900 hover:bg-zinc-300 cursor-pointer rounded-md ${
              threadId === thread ? 'bg-zinc-300' : ''
            }`}
            onClick={() => {
              setThreadId(thread);
            }}
          >
            thread {index + 1}
          </div>
        ))}
      </div>

      <div className="flex flex-col gap-2 w-full relative">
        <div className="p-2">status: {status}</div>

        <div className="flex flex-col p-2 gap-2">
          {messages.map((message: Message) => (
            <div key={message.id} className="flex flex-row gap-2">
              <div className="w-24 text-zinc-500">{`${message.role}: `}</div>
              <div className="w-full">{message.content}</div>
            </div>
          ))}
        </div>

        <form onSubmit={submitMessage} className="absolute bottom-0 p-2 w-full">
          <input
            className="bg-zinc-100 w-full p-2 rounded-md"
            placeholder="Send message..."
            value={input}
            onChange={handleInputChange}
            disabled={status !== 'awaiting_message'}
          />
        </form>
      </div>
    </div>
  );
}
4 changes: 4 additions & 0 deletions examples/next-openai-pages/pages/index.tsx
@@ -56,6 +56,10 @@ const examples = [
    title: 'Stream OpenAI Assistant API response with tool calls',
    link: '/assistants/stream-assistant-response-with-tools',
  },
  {
    title: 'Stream OpenAI Assistant API response and switch between threads',
    link: '/assistants/stream-assistant-switch-threads',
  },
];

export default function Home() {
7 changes: 7 additions & 0 deletions packages/core/CHANGELOG.md
@@ -1,5 +1,12 @@
# ai

## 3.2.18

### Patch Changes

- Updated dependencies [70d18003]
- @ai-sdk/react@0.0.18

## 3.2.17

### Patch Changes
4 changes: 2 additions & 2 deletions packages/core/README.md
@@ -49,7 +49,7 @@ main();

The [AI SDK UI](https://sdk.vercel.ai/docs/ai-sdk-ui/overview) module provides a set of hooks that help you build chatbots and generative user interfaces. These hooks are framework agnostic, so they can be used in Next.js, React, Svelte, Vue, and SolidJS.

###### @/app/page.tsx (Next.js Pages Router)
###### @/app/page.tsx (Next.js App Router)

```tsx
"use client"
@@ -81,7 +81,7 @@ export default function Page() {
}
```

###### @/app/api/chat/route.ts (Next.js Pages Router)
###### @/app/api/chat/route.ts (Next.js App Router)

```ts
import { CoreMessage, streamText } from 'ai';
4 changes: 2 additions & 2 deletions packages/core/package.json
@@ -1,6 +1,6 @@
{
"name": "ai",
"version": "3.2.17",
"version": "3.2.18",
"license": "Apache-2.0",
"sideEffects": false,
"main": "./dist/index.js",
@@ -77,7 +77,7 @@
"dependencies": {
"@ai-sdk/provider": "0.0.11",
"@ai-sdk/provider-utils": "1.0.1",
"@ai-sdk/react": "0.0.17",
"@ai-sdk/react": "0.0.18",
"@ai-sdk/solid": "0.0.12",
"@ai-sdk/svelte": "0.0.13",
"@ai-sdk/ui-utils": "0.0.10",
6 changes: 6 additions & 0 deletions packages/core/tests/e2e/next-server/CHANGELOG.md
@@ -4,6 +4,12 @@

### Patch Changes

- ai@3.2.18

## null

### Patch Changes

- Updated dependencies [3db90c3d]
- Updated dependencies [abb22602]
- Updated dependencies [5c1f0bd3]
6 changes: 6 additions & 0 deletions packages/react/CHANGELOG.md
@@ -1,5 +1,11 @@
# @ai-sdk/react

## 0.0.18

### Patch Changes

- 70d18003: add setThreadId helper to switch between threads for useAssistant

## 0.0.17

### Patch Changes
2 changes: 1 addition & 1 deletion packages/react/package.json
@@ -1,6 +1,6 @@
{
"name": "@ai-sdk/react",
"version": "0.0.17",
"version": "0.0.18",
"license": "Apache-2.0",
"sideEffects": false,
"main": "./dist/index.js",
21 changes: 17 additions & 4 deletions packages/react/src/use-assistant.ts
@@ -28,6 +28,11 @@ export type UseAssistantHelpers = {
*/
threadId: string | undefined;

/**
* Set the current thread ID. Specifying a thread ID will switch to that thread, if it exists. If set to 'undefined', a new thread will be created. For both cases, `threadId` will be updated with the new value and `messages` will be cleared.
*/
setThreadId: (threadId: string | undefined) => void;

/**
* The current value of the input field.
*/
@@ -97,7 +102,9 @@ export function useAssistant({
}: UseAssistantOptions): UseAssistantHelpers {
const [messages, setMessages] = useState<Message[]>([]);
const [input, setInput] = useState('');
const [threadId, setThreadId] = useState<string | undefined>(undefined);
const [currentThreadId, setCurrentThreadId] = useState<string | undefined>(
undefined,
);
const [status, setStatus] = useState<AssistantStatus>('awaiting_message');
const [error, setError] = useState<undefined | Error>(undefined);

@@ -151,7 +158,7 @@ export function useAssistant({
body: JSON.stringify({
...body,
// always use user-provided threadId when available:
threadId: threadIdParam ?? threadId ?? null,
threadId: threadIdParam ?? currentThreadId ?? null,
message: message.content,

// optional request data:
@@ -216,7 +223,7 @@ export function useAssistant({
}

case 'assistant_control_data': {
setThreadId(value.threadId);
setCurrentThreadId(value.threadId);

// set id of last message:
setMessages(messages => {
@@ -267,11 +274,17 @@ export function useAssistant({
append({ role: 'user', content: input }, requestOptions);
};

const setThreadId = (threadId: string | undefined) => {
setCurrentThreadId(threadId);
setMessages([]);
};

return {
append,
messages,
setMessages,
threadId,
threadId: currentThreadId,
setThreadId,
input,
setInput,
handleInputChange,
192 changes: 192 additions & 0 deletions packages/react/src/use-assistant.ui.test.tsx
@@ -143,3 +143,195 @@ describe('stream data stream', () => {
});
});
});

describe('thread management', () => {
  const TestComponent = () => {
    const { status, messages, error, append, setThreadId, threadId } =
      useAssistant({
        api: '/api/assistant',
      });

    return (
      <div>
        <div data-testid="status">{status}</div>
        <div data-testid="thread-id">{threadId || 'undefined'}</div>
        {error && <div data-testid="error">{error.toString()}</div>}
        {messages.map((m, idx) => (
          <div data-testid={`message-${idx}`} key={idx}>
            {m.role === 'user' ? 'User: ' : 'AI: '}
            {m.content}
          </div>
        ))}

        <button
          data-testid="do-append"
          onClick={() => {
            append({ role: 'user', content: 'hi' });
          }}
        />
        <button
          data-testid="do-new-thread"
          onClick={() => {
            setThreadId(undefined);
          }}
        />
        <button
          data-testid="do-thread-3"
          onClick={() => {
            setThreadId('t3');
          }}
        />
      </div>
    );
  };

  beforeEach(() => {
    render(<TestComponent />);
  });

  afterEach(() => {
    vi.restoreAllMocks();
    cleanup();
  });

  it('create new thread', async () => {
    await screen.findByTestId('thread-id');
    expect(screen.getByTestId('thread-id')).toHaveTextContent('undefined');
  });

  it('should show streamed response', async () => {
    const { requestBody } = mockFetchDataStream({
      url: 'https://example.com/api/assistant',
      chunks: [
        formatStreamPart('assistant_control_data', {
          threadId: 't0',
          messageId: 'm0',
        }),
        formatStreamPart('assistant_message', {
          id: 'm0',
          role: 'assistant',
          content: [{ type: 'text', text: { value: '' } }],
        }),
        // text parts:
        '0:"Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
      ],
    });

    await userEvent.click(screen.getByTestId('do-append'));

    await screen.findByTestId('message-0');
    expect(screen.getByTestId('message-0')).toHaveTextContent('User: hi');

    expect(screen.getByTestId('thread-id')).toHaveTextContent('t0');

    await screen.findByTestId('message-1');
    expect(screen.getByTestId('message-1')).toHaveTextContent(
      'AI: Hello, world.',
    );

    // check that correct information was sent to the server:
    expect(await requestBody).toStrictEqual(
      JSON.stringify({
        threadId: null,
        message: 'hi',
      }),
    );
  });

  it('should switch to new thread on setting undefined threadId', async () => {
    await userEvent.click(screen.getByTestId('do-new-thread'));

    expect(screen.queryByTestId('message-0')).toBeNull();
    expect(screen.queryByTestId('message-1')).toBeNull();

    const { requestBody } = mockFetchDataStream({
      url: 'https://example.com/api/assistant',
      chunks: [
        formatStreamPart('assistant_control_data', {
          threadId: 't1',
          messageId: 'm0',
        }),
        formatStreamPart('assistant_message', {
          id: 'm0',
          role: 'assistant',
          content: [{ type: 'text', text: { value: '' } }],
        }),
        // text parts:
        '0:"Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
      ],
    });

    await userEvent.click(screen.getByTestId('do-append'));

    await screen.findByTestId('message-0');
    expect(screen.getByTestId('message-0')).toHaveTextContent('User: hi');

    expect(screen.getByTestId('thread-id')).toHaveTextContent('t1');

    await screen.findByTestId('message-1');
    expect(screen.getByTestId('message-1')).toHaveTextContent(
      'AI: Hello, world.',
    );

    // check that correct information was sent to the server:
    expect(await requestBody).toStrictEqual(
      JSON.stringify({
        threadId: null,
        message: 'hi',
      }),
    );
  });

  it('should switch to thread on setting previously created threadId', async () => {
    await userEvent.click(screen.getByTestId('do-thread-3'));

    expect(screen.queryByTestId('message-0')).toBeNull();
    expect(screen.queryByTestId('message-1')).toBeNull();

    const { requestBody } = mockFetchDataStream({
      url: 'https://example.com/api/assistant',
      chunks: [
        formatStreamPart('assistant_control_data', {
          threadId: 't3',
          messageId: 'm0',
        }),
        formatStreamPart('assistant_message', {
          id: 'm0',
          role: 'assistant',
          content: [{ type: 'text', text: { value: '' } }],
        }),
        // text parts:
        '0:"Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
      ],
    });

    await userEvent.click(screen.getByTestId('do-append'));

    await screen.findByTestId('message-0');
    expect(screen.getByTestId('message-0')).toHaveTextContent('User: hi');

    expect(screen.getByTestId('thread-id')).toHaveTextContent('t3');

    await screen.findByTestId('message-1');
    expect(screen.getByTestId('message-1')).toHaveTextContent(
      'AI: Hello, world.',
    );

    // check that correct information was sent to the server:
    expect(await requestBody).toStrictEqual(
      JSON.stringify({
        threadId: 't3',
        message: 'hi',
      }),
    );
  });
});
2 changes: 1 addition & 1 deletion pnpm-lock.yaml