Commit b717dad: Add Inkeep as a provider (#861)
Authored Feb 12, 2024 · 1 parent 22cda3a
24 files changed, +1238 -5 lines

.changeset/chatty-rivers-protect.md (+5)

```md
---
'ai': patch
---

Adding Inkeep as a stream provider
```
docs/pages/docs/api-reference/inkeep-stream.mdx (+186)

# InkeepStream

## `InkeepStream(res: Response, cb?: InkeepAIStreamCallbacksAndOptions): ReadableStream` [#InkeepStream]

The `InkeepStream` function is a utility that transforms the output from [Inkeep's](https://www.inkeep.com) API into a [ReadableStream](https://developer.mozilla.org/docs/Web/API/ReadableStream). It uses [AIStream](/docs/api-reference/ai-stream) under the hood, applying a parser specific to Inkeep's response data structure.

This works with the official Inkeep API and is supported in Node.js, the [Edge Runtime](https://edge-runtime.vercel.app), and browser environments.

## Parameters

### `res: Response`

The `Response` object returned by the request to the Inkeep API.

### `cb?: InkeepAIStreamCallbacksAndOptions`

This optional parameter is an object containing callback functions that handle the start of the response, each token, the completion, and other events of the AI response. If it is omitted, default behavior applies.

`InkeepAIStreamCallbacksAndOptions` extends the standard [AIStreamCallbacks](/docs/api-reference/ai-stream#AIStreamCallbacks) by (1) including additional `metadata` in `onFinal` and (2) adding an `onRecordsCited` callback.
<OptionTable
  options={[
    [
      'onRecordsCited',
      '(records_cited: any) => void',
      "An optional function that is called once for every request, after the main content of a message has completed. It includes information about the records (sources) cited in the AI chat response. It's the payload of the `records_cited` event from the Inkeep API.",
    ],
    [
      'onFinal',
      '(completion: string, metadata: InkeepOnFinalMetadata) => Promise<void>',
      "An optional function that is called once for every request. It's always the final callback invoked. It's passed the content of the chat response as a string and metadata as an object of type `InkeepOnFinalMetadata`.",
    ],
  ]}
/>
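
A minimal sketch of wiring these callbacks, assuming `response` is the `Response` returned by an Inkeep API call (the full example below shows them in context):

```ts
const stream = InkeepStream(response, {
  // called with the payload of the `records_cited` event from the Inkeep API
  onRecordsCited: records_cited => {
    console.log(records_cited.citations);
  },
  // invoked last, with the full completion and Inkeep metadata
  onFinal: async (completion, metadata) => {
    console.log(metadata?.chat_session_id);
  },
});
```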

Check the [@inkeep/ai-api](https://github.com/inkeep/ai-api-ts) SDK for the latest typings of the Inkeep APIs. For example, the `records_cited` data payload type can be obtained as:

```ts
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

// RecordsCited$.Inbound
```

#### InkeepOnFinalMetadata

Information included in the `metadata` argument of InkeepStream's `onFinal` callback.

<OptionTable
  options={[
    [
      'chat_session_id',
      'string',
      'The Inkeep chat_session_id. This should be included in follow-up chat requests that are part of the same chat session. Used for analytics and for threading chat conversations.',
    ],
    [
      'records_cited',
      'any',
      'Contains information about the citations used in the chat response body.',
    ],
  ]}
/>
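
For reference, this commit defines the type in `packages/core/streams/inkeep-stream.ts` as:

```ts
export type InkeepOnFinalMetadata = {
  chat_session_id: string;
  records_cited: any;
};
```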

## Example

The Inkeep API provides two routes:

1. `POST chat_sessions/chat_results` - To **create** a chat session
2. `POST chat_sessions/${chat_session_id}/chat_results` - To **continue** a chat session
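
Both routes respond with server-sent events. For reference, the test snapshot added in this commit (`packages/core/tests/snapshots/inkeep.ts`) models the stream as a series of `message_chunk` events followed by a `records_cited` event, abbreviated here:

```ts
// abbreviated from packages/core/tests/snapshots/inkeep.ts
const events = [
  {
    event: 'message_chunk',
    data: { chat_session_id: '12345', content_chunk: ' Hello', finish_reason: null },
  },
  {
    event: 'records_cited',
    data: {
      citations: [
        {
          number: 1,
          record: { url: 'https://inkeep.com', title: 'Inkeep', breadcrumbs: ['Home', 'About'] },
        },
      ],
    },
  },
];
```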

The example below shows how to create a `chat` API endpoint compatible with Vercel AI SDK client-side utilities like `useChat`:

```tsx filename="app/api/chat/route.ts" showLineNumbers
import {
  InkeepStream,
  InkeepOnFinalMetadata,
  StreamingTextResponse,
  experimental_StreamData,
} from 'ai';
import { InkeepAI } from '@inkeep/ai-api';
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

interface ChatRequestBody {
  messages: Array<{
    role: 'user' | 'assistant';
    content: string;
  }>;
  chat_session_id?: string;
}

const inkeepIntegrationId = process.env.INKEEP_INTEGRATION_ID;

export async function POST(req: Request) {
  const chatRequestBody: ChatRequestBody = await req.json();
  const chat_session_id = chatRequestBody.chat_session_id;

  const ikpClient = new InkeepAI({
    apiKey: process.env.INKEEP_API_KEY,
  });

  let response;

  if (!chat_session_id) {
    const createRes = await ikpClient.chatSession.create({
      integrationId: inkeepIntegrationId,
      chatSession: {
        messages: chatRequestBody.messages,
      },
      stream: true,
    });

    response = createRes.rawResponse;
  } else {
    const continueRes = await ikpClient.chatSession.continue(chat_session_id, {
      integrationId: inkeepIntegrationId,
      message: chatRequestBody.messages[chatRequestBody.messages.length - 1],
      stream: true,
    });

    response = continueRes.rawResponse;
  }

  // used to pass custom metadata to the client
  const data = new experimental_StreamData();

  if (!response?.body) {
    throw new Error('Response body is null');
  }

  const stream = InkeepStream(response, {
    onRecordsCited: async (records_cited: RecordsCited$.Inbound) => {
      // append the citations to the message annotations
      data.appendMessageAnnotation({
        records_cited,
      });
    },
    onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
      // return the chat_session_id to the client
      if (metadata) {
        data.append({ onFinalMetadata: metadata });
      }
      data.close();
    },
    experimental_streamData: true,
  });

  return new StreamingTextResponse(stream, {}, data);
}
```

This example uses [experimental_StreamData](/docs/api-reference/stream-data) and the callback methods of `InkeepStream` to attach metadata to the response.

### Client

From [`useChat`](/docs/api-reference/use-chat), this is available as:

```tsx filename="app/chat/page.tsx" showLineNumbers
import { InkeepOnFinalMetadata } from 'ai/streams';
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

// ... your chat component

const { messages, data } = useChat();

/* ==For chat_session_id== */

// get the onFinalMetadata item from the global chat data
const onFinalMetadataItem = data?.find(
  item =>
    typeof item === 'object' && item !== null && 'onFinalMetadata' in item,
) as { onFinalMetadata: InkeepOnFinalMetadata } | undefined;

// get the chat_session_id from the onFinalMetadata item
const chatSessionId = onFinalMetadataItem?.onFinalMetadata?.chat_session_id;

/* For messages[n].annotations, available independently for each message */

const recordsCitedAnnotation =
  messages &&
  messages.length > 0 &&
  (messages[0].annotations?.find(
    item =>
      typeof item === 'object' && item !== null && 'records_cited' in item,
  ) as { records_cited: RecordsCited$.Inbound } | undefined);

// get the citations from the records_cited annotation
const citations = recordsCitedAnnotation?.records_cited?.citations;
```

docs/pages/docs/guides.mdx (+1)

```diff
@@ -11,6 +11,7 @@ with models and services from these providers:
 - [Fireworks.ai](./guides/providers/fireworks)
 - [Google](./guides/providers/google)
 - [Hugging Face](./guides/providers/huggingface)
+- [Inkeep](./guides/providers/inkeep)
 - [LangChain](./guides/providers/langchain)
 - [Perplexity](./guides/providers/perplexity)
 - [Replicate](./guides/providers/replicate)
```
docs/pages/docs/guides/providers/inkeep.mdx (+273)

---
title: Inkeep
---

import { Steps, Callout } from 'nextra-theme-docs';

# Inkeep

The Vercel AI SDK provides a set of utilities that make it easy to use [Inkeep's](https://inkeep.com/) AI chat APIs to create chat experiences **powered by your own content**.

In this guide, we'll walk through how to create a Q&A support bot powered by Inkeep.

<Callout>
  You can also use Inkeep as a retrieval-augmented generation (RAG) component or
  neural search component of a complex LLM application, agent, or workflow.
</Callout>

## Guide: Inkeep Chatbot

<Steps>

### Create a Next.js app

Create a Next.js application and install `ai` (the Vercel AI SDK) as well as [`@inkeep/ai-api`](https://github.com/inkeep/ai-api-ts), the Inkeep API SDK.

```sh
pnpm dlx create-next-app my-rag-app
cd my-rag-app
```

```sh
pnpm add ai @inkeep/ai-api
```

### Add your Inkeep API Key to `.env`

Create a `.env` file in your project root and add your Inkeep API key and integration ID:

```env filename=".env"
INKEEP_API_KEY=xxxxxx
INKEEP_INTEGRATION_ID=xxxxxx
```

### Create a Route Handler

To provide analytics and to correlate multiple message exchanges into a single "chat session", the Inkeep API provides two endpoints:

1. `POST chat_sessions/chat_results` - To **create** a chat session
2. `POST chat_sessions/${chat_session_id}/chat_results` - To **continue** a chat session

In this example, we'll use the [@inkeep/ai-api](https://github.com/inkeep/ai-api-ts) package to call these endpoints, the Vercel AI SDK (`ai`) to create a streamed text response, and `useChat` to render the messages in the UI.

First, let's create a Next.js route handler at `app/api/chat/route.ts` that accepts a `POST` request with a `messages` array and an optional `chat_session_id`. We'll use `chat_session_id` to decide whether to create or continue a chat session.
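
For illustration, a request body that continues an existing session might look like this, matching the `ChatRequestBody` interface in the handler below (the message content and session id are hypothetical values):

```ts
// hypothetical follow-up request in an existing chat session
const body: ChatRequestBody = {
  messages: [
    { role: 'user', content: 'How do I get an API key?' },
    { role: 'assistant', content: 'You can create one in the Inkeep console.' },
    { role: 'user', content: 'And an integration ID?' },
  ],
  chat_session_id: '12345',
};
```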

```tsx filename="app/api/chat/route.ts" showLineNumbers
import {
  InkeepStream,
  InkeepOnFinalMetadata,
  StreamingTextResponse,
  experimental_StreamData,
} from 'ai';
import { InkeepAI } from '@inkeep/ai-api';
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

interface ChatRequestBody {
  messages: Array<{
    role: 'user' | 'assistant';
    content: string;
  }>;
  chat_session_id?: string;
}

const inkeepIntegrationId = process.env.INKEEP_INTEGRATION_ID;

export async function POST(req: Request) {
  const chatRequestBody: ChatRequestBody = await req.json();
  const chat_session_id = chatRequestBody.chat_session_id;

  const ikpClient = new InkeepAI({
    apiKey: process.env.INKEEP_API_KEY,
  });

  let response;

  if (!chat_session_id) {
    const createRes = await ikpClient.chatSession.create({
      integrationId: inkeepIntegrationId,
      chatSession: {
        messages: chatRequestBody.messages,
      },
      stream: true,
    });

    response = createRes.rawResponse;
  } else {
    const continueRes = await ikpClient.chatSession.continue(chat_session_id, {
      integrationId: inkeepIntegrationId,
      message: chatRequestBody.messages[chatRequestBody.messages.length - 1],
      stream: true,
    });

    response = continueRes.rawResponse;
  }

  // used to pass custom metadata to the client
  const data = new experimental_StreamData();

  if (!response?.body) {
    throw new Error('Response body is null');
  }

  const stream = InkeepStream(response, {
    onRecordsCited: async (records_cited: RecordsCited$.Inbound) => {
      // append the citations to the message annotations
      data.appendMessageAnnotation({
        records_cited,
      });
    },
    onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
      // return the chat_session_id to the client
      if (metadata) {
        data.append({ onFinalMetadata: metadata });
      }
      data.close();
    },
    experimental_streamData: true,
  });

  return new StreamingTextResponse(stream, {}, data);
}
```

This example leverages a few utilities provided by the Vercel AI SDK:

1. First, we pass the streaming `response` we receive from the Inkeep API to
   [`InkeepStream`](/docs/api-reference/inkeep-stream). This method
   decodes/extracts the message content from Inkeep's server-sent events response and then re-encodes it into a standard [ReadableStream](https://developer.mozilla.org/docs/Web/API/ReadableStream).

2. We then pass that stream directly to the Vercel AI SDK's [`StreamingTextResponse`](/docs/api-reference/streaming-text-response).
   This is another utility class that extends the normal Node/Edge Runtime `Response`
   class with the default headers you probably want (hint: `'Content-Type':
   'text/plain; charset=utf-8'` is already set for you). This will provide the streamed content to the client.

3. Lastly, we use [experimental_StreamData](/docs/api-reference/stream-data) and the callback methods of `InkeepStream` to attach metadata to the response, such as `onFinalMetadata.chat_session_id` and `records_cited.citations`, for use by the client.

<Callout>
  It's common to save a chat to a database. To do so, you can leverage the
  `onFinal` callback to add your own saving logic. For example, add `await
  saveCompletionToDatabase(complete, metadata);` prior to `data.close();`.
</Callout>
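
Concretely, the callout's suggestion amounts to the following, where `saveCompletionToDatabase` is your own (hypothetical) persistence helper:

```ts
onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
  // persist the finished chat before closing the data stream
  // (saveCompletionToDatabase is a hypothetical helper you would supply)
  await saveCompletionToDatabase(complete, metadata);
  if (metadata) {
    data.append({ onFinalMetadata: metadata });
  }
  data.close();
},
```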

### Wire up the UI

Next, let's create a client component with a form that we'll use to gather the prompt from the user and then stream back the chat response.

By default, the [`useChat`](/docs/api-reference/use-chat) hook will use the `POST` Route Handler we created above (it defaults to `/api/chat`).

We will use the `data` field to get the Inkeep `chat_session_id`, which we will include in the request body of any subsequent messages.

```tsx filename="app/page.tsx" showLineNumbers
'use client';

import { useChat } from 'ai/react';
import { useEffect, useState } from 'react';
import { Message } from 'ai';
import { type InkeepOnFinalMetadata } from 'ai/streams';
import { Citations } from './Citations';

export default function Chat() {
  /**
   * You can alternatively put the chat_session_id in search params, e.g. ?chat_session_id=123,
   * or path params like /chat/123, depending on your use case
   */
  const [chatSessionId, setChatSessionId] = useState<string | undefined>(
    undefined,
  );

  const { messages, input, handleInputChange, handleSubmit, data } = useChat({
    body: {
      chat_session_id: chatSessionId,
    },
  });

  // SET THE INKEEP CHAT SESSION ID FROM THE CHAT DATA
  useEffect(() => {
    // get the onFinalMetadata item from the global data
    const onFinalMetadataItem = data?.find(
      item =>
        typeof item === 'object' && item !== null && 'onFinalMetadata' in item,
    ) as { onFinalMetadata: InkeepOnFinalMetadata } | undefined;

    // get the chat_session_id from the onFinalMetadata item
    const chatSessionId = onFinalMetadataItem?.onFinalMetadata?.chat_session_id;

    setChatSessionId(chatSessionId);
  }, [data]);

  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map(m => {
        return (
          <div key={m.id} className="whitespace-pre-wrap">
            <br />
            <strong>{m.role === 'user' ? 'User: ' : 'AI: '}</strong>
            {m.content}
            <Citations annotations={m.annotations} />
          </div>
        );
      })}

      <form onSubmit={handleSubmit}>
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  );
}
```

#### Show Citations (optional)

The Inkeep API provides information about the sources (documentation, web pages, forums, etc.) used to answer a question in a `records_cited` message annotation.

We can use this to display a list of "Citations" at the end of the main chat message content.

```tsx filename="app/Citations.tsx" showLineNumbers
import { Message } from 'ai';
import { InkeepRecordsCitedData } from 'ai/streams';

interface CitationsProps {
  annotations: Message['annotations'];
}

export const Citations = ({ annotations }: CitationsProps) => {
  // get the records_cited annotation of the message
  const recordsCitedAnnotation = annotations?.find(
    item =>
      typeof item === 'object' && item !== null && 'records_cited' in item,
  ) as { records_cited: InkeepRecordsCitedData } | undefined;

  // get the citations from the records_cited annotation
  const citations = recordsCitedAnnotation?.records_cited?.citations;

  return (
    citations && (
      <>
        {annotations && annotations.length > 0 && (
          <div>
            <br />
            {'---SOURCES USED---'}
            <br />
            <div>
              {citations.map((citation, citationIndex) => (
                <p key={citationIndex}>
                  {citationIndex + 1}.{' '}
                  <a target="_blank" href={citation.record.url || ''}>
                    {citation.record.title}
                  </a>
                </p>
              ))}
            </div>
          </div>
        )}
      </>
    )
  );
};
```

</Steps>

docs/pages/docs/index.mdx (+1)

```diff
@@ -197,4 +197,5 @@ export function A({ href, children }) {
   <IntegrationCard href="/docs/guides/providers/hugging-face" title="Hugging Face" description="Hugging Face is a collaboration and model hosting platform for the machine learning community. The Vercel AI SDK provides a simple way to use Hugging Face models in your frontend web applications." />
   <IntegrationCard href="/docs/guides/providers/fireworks" title="Fireworks" description="Fireworks is a lightning-fast LLM inference platform. The Vercel AI SDK provides a simple way to use Fireworks.ai's models in your frontend web applications." />
   <IntegrationCard href="/docs/guides/providers/perplexity" title="Perplexity" description="Perplexity AI is an answer engine that combines AI with web search to provide ready-made answers to user questions in natural language. The Vercel AI SDK provides a simple way to use Perplexity's models in your frontend web applications." />
+  <IntegrationCard href="/docs/guides/providers/inkeep" title="Inkeep" description="Inkeep's AI chat service provides answers grounded in your company's documentation, blogs, support tickets, and other sources. The Vercel AI SDK provides a simple way to use Inkeep's RAG service to build a custom support and search copilot." />
 </div>}
```
examples/next-inkeep/.env.local.example (+2)

```env
INKEEP_API_KEY=xxxxxx
INKEEP_INTEGRATION_ID=xxxxxx
```

examples/next-inkeep/.gitignore (+35)

```gitignore
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.js

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# local env files
.env*.local

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts
```

examples/next-inkeep/README.md (+41)

# Vercel AI SDK, Next.js, and Inkeep Chat Example

This example shows how to use the [Vercel AI SDK](https://sdk.vercel.ai/docs) with [Next.js](https://nextjs.org/) and [Inkeep's Managed AI Chat Service](https://docs.inkeep.com/claude/reference/getting-started-with-the-api) to create an LLM-powered chatbot on your own content.

## Deploy your own

Deploy the example using [Vercel](https://vercel.com?utm_source=github&utm_medium=readme&utm_campaign=ai-sdk-example):

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fvercel%2Fai%2Ftree%2Fmain%2Fexamples%2Fnext-inkeep&env=INKEEP_API_KEY&envDescription=Inkeep_API_Key&envLink=https://console.inkeep.com/account/keys&project-name=vercel-ai-chat-inkeep&repository-name=vercel-ai-chat-inkeep)

## How to use

Execute [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app) with [npm](https://docs.npmjs.com/cli/init), [Yarn](https://yarnpkg.com/lang/en/docs/cli/create/), or [pnpm](https://pnpm.io) to bootstrap the example:

```bash
npx create-next-app --example https://github.com/vercel/ai/tree/main/examples/next-inkeep next-inkeep-app
```

```bash
yarn create next-app --example https://github.com/vercel/ai/tree/main/examples/next-inkeep next-inkeep-app
```

```bash
pnpm create next-app --example https://github.com/vercel/ai/tree/main/examples/next-inkeep next-inkeep-app
```

To run the example locally, you need to:

1. [Get onboarded](https://docs.inkeep.com/overview/getting-started) to Inkeep to receive an Inkeep API key and integration ID.
2. Set the required Inkeep environment variables as shown in [the example env file](./.env.local.example), but in a new file called `.env.local`.
3. Run `pnpm install` to install the required dependencies.
4. Run `pnpm dev` to launch the development server.

## Learn More

To learn more about Inkeep, Next.js, and the Vercel AI SDK, take a look at the following resources:

- [Vercel AI SDK docs](https://sdk.vercel.ai/docs)
- [Vercel AI Playground](https://play.vercel.ai)
- [Inkeep Documentation](https://docs.inkeep.com) - learn about Inkeep features and API.
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
examples/next-inkeep/app/api/chat/route.ts (+75)

```ts
import {
  InkeepStream,
  InkeepOnFinalMetadata,
  StreamingTextResponse,
  experimental_StreamData,
} from 'ai';
import { InkeepAI } from '@inkeep/ai-api';
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

interface ChatRequestBody {
  messages: Array<{
    role: 'user' | 'assistant';
    content: string;
  }>;
  chat_session_id?: string;
}

const inkeepIntegrationId = process.env.INKEEP_INTEGRATION_ID!;

export async function POST(req: Request) {
  const chatRequestBody: ChatRequestBody = await req.json();
  const chat_session_id = chatRequestBody.chat_session_id;

  const ikpClient = new InkeepAI({
    apiKey: process.env.INKEEP_API_KEY,
  });

  let response;

  if (!chat_session_id) {
    const createRes = await ikpClient.chatSession.create({
      integrationId: inkeepIntegrationId,
      chatSession: {
        messages: chatRequestBody.messages,
      },
      stream: true,
    });

    response = createRes.rawResponse;
  } else {
    const continueRes = await ikpClient.chatSession.continue(chat_session_id, {
      integrationId: inkeepIntegrationId,
      message: chatRequestBody.messages[chatRequestBody.messages.length - 1],
      stream: true,
    });

    response = continueRes.rawResponse;
  }

  // used to pass custom metadata to the client
  const data = new experimental_StreamData();

  if (!response?.body) {
    throw new Error('Response body is null');
  }

  const stream = InkeepStream(response, {
    onRecordsCited: async (records_cited: RecordsCited$.Inbound) => {
      // append the citations to the message annotations
      data.appendMessageAnnotation({
        records_cited,
      });
    },
    onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
      // return the chat_session_id to the client
      if (metadata) {
        data.append({ onFinalMetadata: metadata });
      }
      data.close();
    },
    experimental_streamData: true,
  });

  return new StreamingTextResponse(stream, {}, data);
}
```

examples/next-inkeep/app/favicon.ico (25.3 KB, binary file not shown)

examples/next-inkeep/app/globals.css (+3)

```css
@tailwind base;
@tailwind components;
@tailwind utilities;
```

examples/next-inkeep/app/layout.tsx (+21)

```tsx
import './globals.css';
import { Inter } from 'next/font/google';

const inter = Inter({ subsets: ['latin'] });

export const metadata = {
  title: 'Create Next App',
  description: 'Generated by create next app',
};

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}
```

examples/next-inkeep/app/page.tsx (+100)

```tsx
'use client';

import { useChat } from 'ai/react';
import { useEffect, useState } from 'react';
import { Message } from 'ai';
import { InkeepOnFinalMetadata } from 'ai/streams';
import type { RecordsCited$ } from '@inkeep/ai-api/models/components';

export default function Chat() {
  /**
   * you can also put the chat session id in search params e.g. ?chatSessionId=123
   * or path params like /chat/123 depending on your use case
   */
  const [chatSessionId, setChatSessionId] = useState<string | undefined>(
    undefined,
  );

  const { messages, input, handleInputChange, handleSubmit, data } = useChat({
    body: {
      chat_session_id: chatSessionId,
    },
  });

  // SET THE INKEEP CHAT SESSION ID FROM THE CHAT DATA
  useEffect(() => {
    // get the onFinalMetadata item from the global data
    const onFinalMetadataItem = data?.find(
      item =>
        typeof item === 'object' && item !== null && 'onFinalMetadata' in item,
    ) as { onFinalMetadata: InkeepOnFinalMetadata } | undefined;

    // get the chatSessionId from the onFinalMetadata item
    const chatSessionId = onFinalMetadataItem?.onFinalMetadata?.chat_session_id;

    setChatSessionId(chatSessionId);
  }, [data]);

  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map(m => {
        return (
          <div key={m.id} className="whitespace-pre-wrap">
            <br />
            <strong>{m.role === 'user' ? 'User: ' : 'AI: '}</strong>
            {m.content}
            <Citations annotations={m.annotations} />
          </div>
        );
      })}

      <form onSubmit={handleSubmit}>
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={handleInputChange}
        />
      </form>
    </div>
  );
}

interface CitationsProps {
  annotations: Message['annotations'];
}

const Citations = ({ annotations }: CitationsProps) => {
  // get the records_cited annotation of the message
  const recordsCitedAnnotation = annotations?.find(
    item =>
      typeof item === 'object' && item !== null && 'records_cited' in item,
  ) as { records_cited: RecordsCited$.Inbound } | undefined;

  // get the citations from the records_cited annotation
  const citations = recordsCitedAnnotation?.records_cited?.citations;

  return (
    citations && (
      <>
        {annotations && annotations.length > 0 && (
          <div>
            <br />
            {'---SOURCES USED---'}
            <br />
            <div>
              {citations.map((citation, citationIndex) => (
                <p key={citationIndex}>
                  {citationIndex + 1}.{' '}
                  <a target="_blank" href={citation.record.url || ''}>
                    {citation.record.title}
                  </a>
                </p>
              ))}
            </div>
          </div>
        )}
      </>
    )
  );
};
```

examples/next-inkeep/next.config.js (+4)

```js
/** @type {import('next').NextConfig} */
const nextConfig = {};

module.exports = nextConfig;
```

examples/next-inkeep/package.json (+31)

```json
{
  "name": "next-inkeep",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "@inkeep/ai-api": "^0.1.4",
    "ai": "2.2.34",
    "eventsource-parser": "1.1.1",
    "next": "14.0.3",
    "react": "18.2.0",
    "react-dom": "^18.2.0",
    "zod": "^3.22.4"
  },
  "devDependencies": {
    "@types/node": "^17.0.12",
    "@types/react": "18.2.8",
    "@types/react-dom": "18.2.4",
    "autoprefixer": "^10.4.14",
    "eslint": "^7.32.0",
    "eslint-config-next": "13.4.12",
    "postcss": "^8.4.23",
    "tailwindcss": "^3.3.2",
    "typescript": "5.1.3"
  }
}
```
examples/next-inkeep/postcss.config.js (+6)

```js
module.exports = {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
};
```
examples/next-inkeep/tailwind.config.js (+18)

```js
/** @type {import('tailwindcss').Config} */
module.exports = {
  content: [
    './pages/**/*.{js,ts,jsx,tsx,mdx}',
    './components/**/*.{js,ts,jsx,tsx,mdx}',
    './app/**/*.{js,ts,jsx,tsx,mdx}',
  ],
  theme: {
    extend: {
      backgroundImage: {
        'gradient-radial': 'radial-gradient(var(--tw-gradient-stops))',
        'gradient-conic':
          'conic-gradient(from 180deg at 50% 50%, var(--tw-gradient-stops))',
      },
    },
  },
  plugins: [],
};
```

examples/next-inkeep/tsconfig.json (+28)

```json
{
  "compilerOptions": {
    "target": "es5",
    "lib": ["dom", "dom.iterable", "esnext"],
    "allowJs": true,
    "skipLibCheck": true,
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "noEmit": true,
    "esModuleInterop": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "jsx": "preserve",
    "incremental": true,
    "plugins": [
      {
        "name": "next"
      }
    ],
    "paths": {
      "@/*": ["./*"]
    }
  },
  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
  "exclude": ["node_modules"]
}
```

packages/core/streams/index.ts (+11)

```diff
@@ -1,3 +1,13 @@
+export * from './ai-stream';
+export * from './aws-bedrock-stream';
+export * from './openai-stream';
+export * from './streaming-text-response';
+export * from './huggingface-stream';
+export * from './cohere-stream';
+export * from './anthropic-stream';
+export * from './inkeep-stream';
+export * from './langchain-stream';
+export * from './replicate-stream';
 export * from '../shared/types';
 export * from '../shared/utils';
 export * from './ai-stream';
@@ -7,6 +17,7 @@ export * from './aws-bedrock-stream';
 export * from './cohere-stream';
 export * from './google-generative-ai-stream';
 export * from './huggingface-stream';
+export * from './inkeep-stream';
 export * from './langchain-stream';
 export * from './openai-stream';
 export * from './replicate-stream';
```
packages/core/streams/inkeep-stream.test.ts (+113)

```ts
import {
  InkeepOnFinalMetadata,
  InkeepStream,
  StreamingTextResponse,
  experimental_StreamData,
} from '.';
import { InkeepEventStream } from '../tests/snapshots/inkeep';
import { readAllChunks } from '../tests/utils/mock-client';
import { DEFAULT_TEST_URL, createMockServer } from '../tests/utils/mock-server';

const server = createMockServer([
  {
    url: DEFAULT_TEST_URL,
    chunks: InkeepEventStream,
    formatChunk: ({ event, data }) =>
      `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`,
  },
]);

describe('InkeepStream', () => {
  beforeAll(() => {
    server.listen();
  });

  afterEach(() => {
    server.resetHandlers();
  });

  afterAll(() => {
    server.close();
  });

  it('should be able to parse SSE and receive the streamed response', async () => {
    const response = await fetch(DEFAULT_TEST_URL);

    const stream = InkeepStream(response);

    const responseStream = new StreamingTextResponse(stream);

    expect(await readAllChunks(responseStream)).toEqual([
      ' Hello',
      ',',
      ' world',
      '.',
    ]);
  });

  const recordsCitedSerialized =
    '"records_cited":{"citations":[{"number":1,"record":{"url":"https://inkeep.com","title":"Inkeep","breadcrumbs":["Home","About"]}}]}';

  describe('StreamData protocol', () => {
    it('should receive and send Inkeep onFinal metadata with chat_session_id', async () => {
      const data = new experimental_StreamData();

      const response = await fetch(DEFAULT_TEST_URL);

      const stream = InkeepStream(response, {
        onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
          // return the chat_session_id to the client
          if (metadata) {
            data.append({ onFinalMetadata: metadata });
          }
          data.close();
        },
        experimental_streamData: true,
      });

      const responseStream = new StreamingTextResponse(stream, {}, data);

      expect(await readAllChunks(responseStream)).toEqual([
        '0:" Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
        `2:[{"onFinalMetadata":{"chat_session_id":"12345",${recordsCitedSerialized}}}]\n`,
      ]);
    });

    it('should receive and send Inkeep records_cited data as message annotation', async () => {
      const data = new experimental_StreamData();

      const response = await fetch(DEFAULT_TEST_URL);

      const stream = InkeepStream(response, {
        onRecordsCited: async records_cited => {
          // append the citations to the message annotations
          data.appendMessageAnnotation({
            records_cited,
          });
        },
        onFinal: async (complete: string, metadata?: InkeepOnFinalMetadata) => {
          // return the chat_session_id to the client
          if (metadata) {
            data.append({ onFinalMetadata: metadata });
          }
          data.close();
        },
        experimental_streamData: true,
      });

      const responseStream = new StreamingTextResponse(stream, {}, data);

      expect(await readAllChunks(responseStream)).toEqual([
        '0:" Hello"\n',
        '0:","\n',
        '0:" world"\n',
        '0:"."\n',
        `2:[{"onFinalMetadata":{"chat_session_id":"12345",${recordsCitedSerialized}}}]\n`,
        `8:[{${recordsCitedSerialized}}]\n`,
      ]);
    });
  });
});
```
packages/core/streams/inkeep-stream.ts (+71)

```ts
import {
  AIStream,
  type AIStreamCallbacksAndOptions,
  AIStreamParser,
} from './ai-stream';
import { createStreamDataTransformer } from './stream-data';

export type InkeepOnFinalMetadata = {
  chat_session_id: string;
  records_cited: any;
};

export type InkeepChatResultCallbacks = {
  onFinal?: (
    completion: string,
    metadata?: InkeepOnFinalMetadata,
  ) => Promise<void> | void;
  onRecordsCited?: (
    records_cited: InkeepOnFinalMetadata['records_cited'],
  ) => void;
};

export type InkeepAIStreamCallbacksAndOptions = AIStreamCallbacksAndOptions &
  InkeepChatResultCallbacks;

export function InkeepStream(
  res: Response,
  callbacks?: InkeepAIStreamCallbacksAndOptions,
): ReadableStream {
  if (!res.body) {
    throw new Error('Response body is null');
  }

  let chat_session_id = '';
  let records_cited: any;

  const inkeepEventParser: AIStreamParser = (data: string, options) => {
    const { event } = options;

    if (event === 'records_cited') {
      records_cited = JSON.parse(data) as any;
      callbacks?.onRecordsCited?.(records_cited);
    }

    if (event === 'message_chunk') {
      const inkeepMessageChunk = JSON.parse(data);
      chat_session_id = inkeepMessageChunk.chat_session_id ?? chat_session_id;
      return inkeepMessageChunk.content_chunk;
    }
    return;
  };

  let { onRecordsCited, ...passThroughCallbacks } = callbacks || {};

  // extend onFinal callback with Inkeep specific metadata
  passThroughCallbacks = {
    ...passThroughCallbacks,
    onFinal: completion => {
      const inkeepOnFinalMetadata: InkeepOnFinalMetadata = {
        chat_session_id,
        records_cited,
      };
      callbacks?.onFinal?.(completion, inkeepOnFinalMetadata);
    },
  };

  return AIStream(res, inkeepEventParser, passThroughCallbacks).pipeThrough(
    createStreamDataTransformer(passThroughCallbacks?.experimental_streamData),
  );
}
```
packages/core/tests/snapshots/inkeep.ts (+57)

```ts
export const InkeepEventStream = [
  {
    event: 'message_chunk',
    data: {
      chat_session_id: '12345',
      content_chunk: ' Hello',
      finish_reason: null,
    },
  },
  {
    event: 'message_chunk',
    data: {
      chat_session_id: '12345',
      content_chunk: ',',
      finish_reason: null,
    },
  },
  {
    event: 'message_chunk',
    data: {
      chat_session_id: '12345',
      content_chunk: ' world',
      finish_reason: null,
    },
  },
  {
    event: 'message_chunk',
    data: {
      chat_session_id: '12345',
      content_chunk: '.',
      finish_reason: null,
    },
  },
  {
    event: 'message_chunk',
    data: {
      chat_session_id: '12345',
      content_chunk: '',
      finish_reason: 'stop',
    },
  },
  {
    event: 'records_cited',
    data: {
      citations: [
        {
          number: 1,
          record: {
            url: 'https://inkeep.com',
            title: 'Inkeep',
            breadcrumbs: ['Home', 'About'],
          },
        },
      ],
    },
  },
];
```

pnpm-lock.yaml (+153 -4, generated file not rendered)

turbo.json (+3 -1)

```diff
@@ -17,7 +17,9 @@
       "PERPLEXITY_API_KEY",
       "REPLICATE_API_KEY",
       "NODE_ENV",
-      "ASSISTANT_ID"
+      "ASSISTANT_ID",
+      "INKEEP_API_KEY",
+      "INKEEP_INTEGRATION_ID"
     ],
     "outputs": ["dist/**", ".next/**", "!.next/cache/**"]
   },
```

0 commit comments

Comments
 (0)
Please sign in to comment.