Add remix example #1274

Open · wants to merge 7 commits into base: main
25 changes: 25 additions & 0 deletions docs/pages/docs/guides.mdx
@@ -0,0 +1,25 @@
# Guides

The Vercel AI SDK is compatible with many popular AI and model providers. This section contains guides for using the SDK
with models and services from these providers:

- [OpenAI](./guides/providers/openai)
- [Azure OpenAI](./guides/providers/azure-openai)
- [AWS Bedrock](./guides/providers/aws-bedrock)
- [Anthropic](./guides/providers/anthropic)
- [Cohere](./guides/providers/cohere)
- [Fireworks.ai](./guides/providers/fireworks)
- [Google](./guides/providers/google)
- [Hugging Face](./guides/providers/huggingface)
- [Inkeep](./guides/providers/inkeep)
- [LangChain](./guides/providers/langchain)
- [Perplexity](./guides/providers/perplexity)
- [Replicate](./guides/providers/replicate)

We also have guides for using the SDK with these frameworks:

- [Next.js App Router](./guides/frameworks/nextjs-app)
- [Next.js Pages Router](./guides/frameworks/nextjs-pages)
- [SolidJS and SolidStart](./guides/frameworks/solidjs)
- [SvelteKit](./guides/frameworks/sveltekit)
- [Nuxt](./guides/frameworks/nuxt)
77 changes: 77 additions & 0 deletions docs/pages/docs/guides/frameworks/remix.mdx
@@ -0,0 +1,77 @@
---
title: Remix
---

# Remix

This is an example of using the Vercel AI SDK and OpenAI's chat completion API with [Remix](https://remix.run/).

## UI Route

In the UI route module, you can use the `useChat` hook from `ai/react` just like in Next.js.

```typescript
// app/routes/_index.tsx
import { useChat } from 'ai/react';

export default function Index() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
return (
<div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
{messages.length > 0
? messages.map(m => (
<div key={m.id} className="whitespace-pre-wrap">
{m.role === 'user' ? 'User: ' : 'AI: '}
{m.content}
</div>
))
: null}

<form onSubmit={handleSubmit}>
<input
className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
value={input}
placeholder="Say something..."
onChange={handleInputChange}
/>
</form>
</div>
);
}
```
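The `useChat` hook also accepts an options object, and its `api` option overrides the endpoint the hook posts to. The option itself is part of `ai/react`; the resolver below is only a hypothetical sketch of the fallback behavior, not the SDK's source:

```typescript
// Hypothetical helper illustrating the `api` option of useChat.
// The option exists in ai/react; this resolver is a sketch only.
type UseChatOptions = { api?: string };

function resolveEndpoint(options: UseChatOptions = {}): string {
  // Falls back to the default route handled by app/routes/api.chat.ts
  return options.api ?? '/api/chat';
}

// e.g. useChat({ api: '/api/chat-with-functions' }) would target the
// function-calling route shipped in this example.
```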

## API Endpoint

By default, the `useChat` hook sends data to the `/api/chat` endpoint, so we need to create an API route at `app/routes/api.chat.ts` to handle the chat request and connect to OpenAI.

```typescript
// app/routes/api.chat.ts

import type { ActionFunctionArgs } from '@vercel/remix';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import OpenAI from 'openai';

export const config = { runtime: 'edge' };

const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});

export async function action({ request }: ActionFunctionArgs) {
const { messages } = await request.json();
const response = await openai.chat.completions.create({
model: 'gpt-3.5-turbo',
stream: true,
messages,
});

// Convert the response into a friendly text-stream
const stream = OpenAIStream(response);
// Respond with the stream
return new StreamingTextResponse(stream);
}
```
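Conceptually, `StreamingTextResponse` is a standard `Response` wrapping a web `ReadableStream` of encoded text, which Remix can return directly from an action. The sketch below illustrates the idea using only Web APIs; it is an illustration, not the `ai` package's implementation:

```typescript
// Conceptual sketch only: a Response over a ReadableStream of encoded
// text, which is what StreamingTextResponse amounts to at its core.
function streamOf(chunks: string[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const chunk of chunks) controller.enqueue(encoder.encode(chunk));
      controller.close();
    },
  });
}

async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// An action could return this directly, like the route above does.
const response = new Response(streamOf(['Hello', ', ', 'world']), {
  headers: { 'Content-Type': 'text/plain; charset=utf-8' },
});
```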

## Examples

- [remix-openai](https://github.com/vercel/ai/tree/main/examples/remix-openai)
5 changes: 5 additions & 0 deletions examples/remix-openai/.gitignore
@@ -0,0 +1,5 @@
node_modules

/.cache
/build
.env
25 changes: 25 additions & 0 deletions examples/remix-openai/README.md
@@ -0,0 +1,25 @@
# Vercel AI SDK, Remix, and OpenAI Chat Example

This example shows how to use the [Vercel AI SDK](https://sdk.vercel.ai/docs) with [Remix](https://remix.run/) and [OpenAI](https://openai.com) to create a ChatGPT-like AI-powered streaming chat bot.

## Development

Create a `.env` file with the following content:

```
OPENAI_API_KEY=your_api_key
```

From your terminal, install dependencies and start the dev server:

```sh
npm install
npm run dev
```

This starts your app in development mode, rebuilding assets on file changes.

## Deploy your own

Deploy the example using [Vercel](https://vercel.com?utm_source=github&utm_medium=readme&utm_campaign=ai-sdk-example):

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai%2Ftree%2Fmain%2Fexamples%2Fremix-openai&env=OPENAI_API_KEY&envDescription=OpenAI%20API%20Key&envLink=https%3A%2F%2Fplatform.openai.com%2Faccount%2Fapi-keys&project-name=vercel-ai-chat-openai&repository-name=vercel-ai-chat-openai)
18 changes: 18 additions & 0 deletions examples/remix-openai/app/entry.server.tsx
@@ -0,0 +1,18 @@
import { handleRequest } from '@vercel/remix';
import { RemixServer } from '@remix-run/react';
import type { EntryContext } from '@vercel/remix';

export default function (
request: Request,
responseStatusCode: number,
responseHeaders: Headers,
remixContext: EntryContext,
) {
const remixServer = <RemixServer context={remixContext} url={request.url} />;
return handleRequest(
request,
responseStatusCode,
responseHeaders,
remixServer,
);
}
33 changes: 33 additions & 0 deletions examples/remix-openai/app/root.tsx
@@ -0,0 +1,33 @@
import stylesheet from '~/tailwind.css';
import type { LinksFunction } from '@remix-run/node';
import {
Links,
LiveReload,
Meta,
Outlet,
Scripts,
ScrollRestoration,
} from '@remix-run/react';

export const links: LinksFunction = () => [
{ rel: 'stylesheet', href: stylesheet },
];

export default function App() {
return (
<html lang="en">
<head>
<meta charSet="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<Meta />
<Links />
</head>
<body>
<Outlet />
<ScrollRestoration />
<Scripts />
<LiveReload />
</body>
</html>
);
}
34 changes: 34 additions & 0 deletions examples/remix-openai/app/routes/_index.tsx
@@ -0,0 +1,34 @@
import { useChat } from 'ai/react';
import type { MetaFunction } from '@vercel/remix';

export const meta: MetaFunction = () => {
return [
{ title: 'New Remix App' },
{ name: 'description', content: 'Welcome to Remix!' },
];
};

export default function Index() {
const { messages, input, handleInputChange, handleSubmit } = useChat();
return (
<div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
{messages.length > 0
? messages.map(m => (
<div key={m.id} className="whitespace-pre-wrap">
{m.role === 'user' ? 'User: ' : 'AI: '}
{m.content}
</div>
))
: null}

<form onSubmit={handleSubmit}>
<input
className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
value={input}
placeholder="Say something..."
onChange={handleInputChange}
/>
</form>
</div>
);
}
100 changes: 100 additions & 0 deletions examples/remix-openai/app/routes/api.chat-with-functions.ts
@@ -0,0 +1,100 @@
import type { ActionFunctionArgs } from '@vercel/remix';
import {
OpenAIStream,
StreamingTextResponse,
experimental_StreamData,
} from 'ai';
import OpenAI from 'openai';
import type { CompletionCreateParams } from 'openai/resources/chat/completions';
// Create an OpenAI API client (that's edge friendly!)
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY || '',
});

// IMPORTANT! Set the runtime to edge when deployed to Vercel
export const config = { runtime: 'edge' };

const functions: CompletionCreateParams.Function[] = [
{
name: 'get_current_weather',
description: 'Get the current weather.',
parameters: {
type: 'object',
properties: {
format: {
type: 'string',
enum: ['celsius', 'fahrenheit'],
description: 'The temperature unit to use.',
},
},
required: ['format'],
},
},
{
name: 'eval_code_in_browser',
description: 'Execute javascript code in the browser with eval().',
parameters: {
type: 'object',
properties: {
code: {
type: 'string',
description: `Javascript code that will be directly executed via eval(). Do not use backticks in your response.
DO NOT include any newlines in your response, and be sure to provide only valid JSON when providing the arguments object.
The output of the eval() will be returned directly by the function.`,
},
},
required: ['code'],
},
},
];

export async function action({ request }: ActionFunctionArgs) {
const { messages } = await request.json();

const response = await openai.chat.completions.create({
model: 'gpt-3.5-turbo-0613',
stream: true,
messages,
functions,
});

const data = new experimental_StreamData();
const stream = OpenAIStream(response, {
experimental_onFunctionCall: async (
{ name, arguments: args },
createFunctionCallMessages,
) => {
if (name === 'get_current_weather') {
// Call a weather API here
const weatherData = {
temperature: 20,
unit: args.format === 'celsius' ? 'C' : 'F',
};

data.append({
text: 'Some custom data',
});

const newMessages = createFunctionCallMessages(weatherData);
return openai.chat.completions.create({
messages: [...messages, ...newMessages],
stream: true,
model: 'gpt-3.5-turbo-0613',
});
}
},
onCompletion(completion) {
console.log('completion', completion);
},
onFinal(completion) {
data.close();
},
experimental_streamData: true,
});

data.append({
text: 'Hello, how are you?',
});

return new StreamingTextResponse(stream, {}, data);
}
24 changes: 24 additions & 0 deletions examples/remix-openai/app/routes/api.chat.ts
@@ -0,0 +1,24 @@
import type { ActionFunctionArgs } from '@vercel/remix';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import OpenAI from 'openai';

// IMPORTANT! Set the runtime to edge when deployed to Vercel
export const config = { runtime: 'edge' };

const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});

export async function action({ request }: ActionFunctionArgs) {
const { messages } = await request.json();
const response = await openai.chat.completions.create({
model: 'gpt-3.5-turbo',
stream: true,
messages,
});

// Convert the response into a friendly text-stream
const stream = OpenAIStream(response);
// Respond with the stream
return new StreamingTextResponse(stream);
}