
Comparing changes

base repository: vercel/ai
base: ai@3.2.26
head repository: vercel/ai
compare: ai@3.2.27
  • 3 commits
  • 6 files changed
  • 4 contributors

Commits on Jul 16, 2024

  1. docs: add decision tree (#2303)

    nicoalbanese authored Jul 16, 2024
    e117cf8
  2. fix (ai/core): generateText token usage is sum over all roundtrips (#…

    lgrammel authored Jul 16, 2024
    811f449
  3. Version Packages (#2305)

    Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
    github-actions[bot] and github-actions[bot] authored Jul 16, 2024
    3c98e56
70 changes: 70 additions & 0 deletions content/docs/02-getting-started/01-navigating-the-library.mdx
@@ -0,0 +1,70 @@
---
title: Navigating the Library
description: Learn how to navigate the Vercel AI SDK.
---

# Navigating the Library

The Vercel AI SDK is a powerful toolkit for building AI applications. This page will help you pick the right tools for your requirements.

Let’s start with a quick overview of the Vercel AI SDK, which consists of three parts:

- **[AI SDK Core](/docs/ai-sdk-core):** A unified API for generating text, structured objects, and tool calls with LLMs.
- **[AI SDK UI](/docs/ai-sdk-ui):** A set of framework-agnostic hooks for quickly building chat and generative user interfaces.
- **[AI SDK RSC](/docs/ai-sdk-rsc):** A library to stream generative user interfaces with React Server Components (RSC).

## Choosing the Right Tool for Your Environment

When deciding which part of the Vercel AI SDK to use, your first consideration should be the environment and existing stack you are working with. Different components of the SDK are tailored to specific frameworks and environments.

| Library | Purpose | Environment Compatibility |
| ----------- | -------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------- |
| AI SDK Core | Call any LLM with unified API (e.g. [streamText](/docs/reference/ai-sdk-core/stream-text) and [streamObject](/docs/reference/ai-sdk-core/stream-object)) | Any JS environment (e.g. Node.js, Deno, Browser) |
| AI SDK UI | Quickly build chat and generative UIs (e.g. [useChat](/docs/reference/ai-sdk-ui/use-chat)) | React & Next.js, Vue & Nuxt, Svelte & SvelteKit, Solid.js & SolidStart |
| AI SDK RSC | Stream generative UIs from Server to Client (e.g. [streamUI](/docs/reference/ai-sdk-rsc/stream-ui)) | Any framework that supports React Server Components (e.g. Next.js) |
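
To make the "unified API" row concrete, here is a minimal sketch of calling a model with `streamText`; the OpenAI provider, `gpt-4o` model, and prompt are illustrative assumptions, not part of this changeset:

```ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

async function main() {
  // AI SDK Core has no UI dependency; this example simply prints the stream in Node.js.
  const { textStream } = await streamText({
    model: openai('gpt-4o'), // assumed provider/model for illustration
    prompt: 'Summarize React Server Components in one paragraph.',
  });

  // Print the response as it streams in.
  for await (const delta of textStream) {
    process.stdout.write(delta);
  }
}

main().catch(console.error);
```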

## Interoperability of the AI SDK

These tools have been designed to work seamlessly with each other, and it's likely that you will be using them together. Let's look at how you could decide which libraries to use based on your application environment, existing stack, and requirements.

The following table outlines AI SDK compatibility based on environment:

| Environment | AI SDK Core | AI SDK UI | AI SDK RSC |
| --------------------- | ------------------- | ------------------- | ------------------- |
| None / Node.js / Deno | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| Vue / Nuxt | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> |
| Svelte / SvelteKit | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> |
| Solid.js / SolidStart | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> |
| Next.js Pages Router | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> |
| Next.js App Router | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |

## When to use AI SDK RSC

[React Server Components](https://nextjs.org/docs/app/building-your-application/rendering/server-components) (RSCs) provide a new approach to building React applications that allows components to render on the server, fetch data directly, and stream the results to the client, reducing bundle size and improving performance. They also introduce [Server Actions](https://nextjs.org/docs/app/building-your-application/data-fetching/server-actions-and-mutations), a new way to call server-side functions from anywhere in your application.

RSCs and Server Actions are still in early development. When considering whether to use AI SDK RSC, it's important to be aware of the current limitations of RSCs and Server Actions:

- **Cancellation/Logging/Debugging**: currently, it is not possible to abort a stream using Server Actions. This will be improved in future releases of React and Next.js.
- **Quadratic Data Transfer**: using [`createStreamableUI`](/docs/reference/ai-sdk-rsc/create-streamable-ui) can lead to quadratic data transfer. You can avoid this by using [`createStreamableValue`](/docs/reference/ai-sdk-rsc/create-streamable-value) instead and rendering the component client-side (see the sketch below).
- **Re-mounting Issue During Streaming**: when using `createStreamableUI`, components re-mount on `.done()`, causing undesirable behavior.
- **Human-in-the-Loop Roundtrips**: currently, there is no support for human-in-the-loop chatbot roundtrips.

If any of the above limitations are important to your application, we recommend using AI SDK UI.
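
As a hedged sketch of the `createStreamableValue` workaround mentioned above, the Server Action below streams only the text value and leaves rendering to the client; the action name, provider, and model are illustrative assumptions rather than part of this changeset:

```ts
'use server';

import { createStreamableValue } from 'ai/rsc';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function generate(prompt: string) {
  const stream = createStreamableValue('');

  (async () => {
    const { textStream } = await streamText({
      model: openai('gpt-4o'), // assumed provider/model for illustration
      prompt,
    });

    for await (const delta of textStream) {
      // Only the text delta travels over the wire; the client renders the
      // component, avoiding the quadratic data transfer of createStreamableUI.
      stream.update(delta);
    }

    stream.done();
  })();

  return { output: stream.value };
}
```

On the client, the streamed value can be consumed with `readStreamableValue` from `ai/rsc` and rendered into an ordinary component.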

### AI SDK UI Framework Compatibility

AI SDK UI supports the following frameworks: [React](https://react.dev/), [Svelte](https://svelte.dev/), [Vue.js](https://vuejs.org/), and [SolidJS](https://www.solidjs.com/). Here is a comparison of the supported functions across these frameworks:

| Function | React | Svelte | Vue.js | SolidJS |
| ---------------------------------------------------------- | ------------------- | ------------------- | ------------------- | ------------------- |
| [useChat](/docs/reference/ai-sdk-ui/use-chat) | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
| [useChat](/docs/reference/ai-sdk-ui/use-chat) tool calling | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> | <Check size={18} /> |
| [useChat](/docs/reference/ai-sdk-ui/use-chat) attachments | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| [useCompletion](/docs/reference/ai-sdk-ui/use-completion) | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> |
| [useObject](/docs/reference/ai-sdk-ui/use-object) | <Check size={18} /> | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| [useAssistant](/docs/reference/ai-sdk-ui/use-assistant) | <Check size={18} /> | <Check size={18} /> | <Check size={18} /> | <Cross size={18} /> |
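
To make the table concrete, here is a minimal React sketch of `useChat`; the component name and the default `/api/chat` route handler it posts to are assumptions for illustration and are not part of this changeset:

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat manages message state and streams responses from a /api/chat route handler by default.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map(message => (
        <p key={message.id}>
          {message.role}: {message.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Say something..." />
    </form>
  );
}
```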

<Note>
[Contributions](https://github.com/vercel/ai/blob/main/CONTRIBUTING.md) are
welcome to implement missing features for non-React frameworks.
</Note>
6 changes: 6 additions & 0 deletions packages/core/CHANGELOG.md
@@ -1,5 +1,11 @@
# ai

## 3.2.27

### Patch Changes

- 811f4493: fix (ai/core): generateText token usage is sum over all roundtrips

## 3.2.26

### Patch Changes
8 changes: 8 additions & 0 deletions packages/core/core/generate-text/generate-text.test.ts
@@ -494,6 +494,14 @@ describe('options.maxToolRoundtrips', () => {
assert.deepStrictEqual(result.toolResults, []);
});

it('should sum token usage', () => {
assert.deepStrictEqual(result.usage, {
completionTokens: 25,
promptTokens: 20,
totalTokens: 45,
});
});

it('should return information about all roundtrips', () => {
assert.deepStrictEqual(result.roundtrips, [
{
17 changes: 15 additions & 2 deletions packages/core/core/generate-text/generate-text.ts
@@ -165,6 +165,11 @@ By default, it's set to 0, which will disable the feature.
const responseMessages: Array<CoreAssistantMessage | CoreToolMessage> =
[];
const roundtrips: GenerateTextResult<TOOLS>['roundtrips'] = [];
const usage: CompletionTokenUsage = {
completionTokens: 0,
promptTokens: 0,
totalTokens: 0,
};

do {
// once we have a roundtrip, we need to switch to messages format:
@@ -219,13 +224,21 @@ By default, it's set to 0, which will disable the feature.
tracer,
});

// token usage:
const currentUsage = calculateCompletionTokenUsage(
currentModelResponse.usage,
);
usage.completionTokens += currentUsage.completionTokens;
usage.promptTokens += currentUsage.promptTokens;
usage.totalTokens += currentUsage.totalTokens;

// add roundtrip information:
roundtrips.push({
text: currentModelResponse.text ?? '',
toolCalls: currentToolCalls,
toolResults: currentToolResults,
finishReason: currentModelResponse.finishReason,
usage: calculateCompletionTokenUsage(currentModelResponse.usage),
usage: currentUsage,
warnings: currentModelResponse.warnings,
logprobs: currentModelResponse.logprobs,
});
@@ -267,7 +280,7 @@ By default, it's set to 0, which will disable the feature.
toolCalls: currentToolCalls,
toolResults: currentToolResults,
finishReason: currentModelResponse.finishReason,
usage: calculateCompletionTokenUsage(currentModelResponse.usage),
usage,
warnings: currentModelResponse.warnings,
rawResponse: currentModelResponse.rawResponse,
logprobs: currentModelResponse.logprobs,
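
For context on the fix above, a hedged sketch of what the change means for callers: with `maxToolRoundtrips > 0`, `result.usage` is now the sum over all roundtrips rather than only the last one. The provider, model, and weather tool below are illustrative assumptions, while `maxToolRoundtrips`, `result.usage`, `result.roundtrips`, and the 20/25/45 token numbers come from this diff and its test.

```ts
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

async function main() {
  const result = await generateText({
    model: openai('gpt-4o'), // assumed provider/model for illustration
    maxToolRoundtrips: 2, // allow follow-up generations after tool results
    tools: {
      weather: tool({
        description: 'Get the weather for a city',
        parameters: z.object({ city: z.string() }),
        execute: async ({ city }) => ({ city, temperatureCelsius: 21 }), // stubbed tool result
      }),
    },
    prompt: 'What is the weather in Berlin?',
  });

  // With this patch, usage is aggregated over every roundtrip,
  // e.g. { completionTokens: 25, promptTokens: 20, totalTokens: 45 }
  // as asserted in the updated test; per-roundtrip numbers remain
  // available on result.roundtrips[i].usage.
  console.log(result.usage);
}

main().catch(console.error);
```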
2 changes: 1 addition & 1 deletion packages/core/package.json
@@ -1,6 +1,6 @@
{
"name": "ai",
"version": "3.2.26",
"version": "3.2.27",
"license": "Apache-2.0",
"sideEffects": false,
"main": "./dist/index.js",
7 changes: 7 additions & 0 deletions packages/core/tests/e2e/next-server/CHANGELOG.md
@@ -4,6 +4,13 @@

### Patch Changes

- Updated dependencies [811f4493]
- ai@3.2.27

## null

### Patch Changes

- Updated dependencies [8f545ce9]
- ai@3.2.26