feat(api): add vector stores (#776)
stainless-bot committed Apr 17, 2024
1 parent 6f72e7a commit 8bb929b
Showing 32 changed files with 2,420 additions and 690 deletions.
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1 +1 @@
-configured_endpoints: 55
+configured_endpoints: 62
16 changes: 15 additions & 1 deletion README.md
@@ -102,7 +102,7 @@ Documentation for each method, request param, and response field are available i
### Polling Helpers

-When interacting with the API some actions such as starting a Run may take time to complete. The SDK includes
+When interacting with the API, some actions, such as starting a Run and adding files to vector stores, are asynchronous and take time to complete. The SDK includes
helper functions which will poll the status until it reaches a terminal state and then return the resulting object.
If an API method results in an action which could benefit from polling there will be a corresponding version of the
method ending in 'AndPoll'.
@@ -117,6 +117,20 @@ const run = await openai.beta.threads.runs.createAndPoll(thread.id, {

More information on the lifecycle of a Run can be found in the [Run Lifecycle Documentation](https://platform.openai.com/docs/assistants/how-it-works/run-lifecycle)

### Bulk Upload Helpers

When creating and interacting with vector stores, you can use the polling helpers to monitor the status of operations.
For convenience, we also provide a bulk upload helper that lets you upload several files at once.

```ts
const fileList = [
createReadStream('/home/data/example.pdf'),
...
];

const batch = await openai.vectorStores.fileBatches.uploadAndPoll(vectorStore.id, fileList);
```

### Streaming Helpers

The SDK also includes helpers to process streams and handle the incoming events.
156 changes: 90 additions & 66 deletions api.md

Large diffs are not rendered by default.

23 changes: 22 additions & 1 deletion helpers.md
@@ -1,4 +1,4 @@
-# Streaming Helpers
+# Helpers

OpenAI supports streaming responses when interacting with the [Chat](#chat-streaming) or [Assistant](#assistant-streaming-api) APIs.

@@ -449,3 +449,24 @@ See an example of a Next.JS integration here [`examples/stream-to-client-next.ts
#### Proxy Streaming to a Browser

See an example of using express to stream to a browser here [`examples/stream-to-client-express.ts`](examples/stream-to-client-express.ts).

# Polling Helpers

When interacting with the API some actions such as starting a Run and adding files to vector stores are asynchronous and take time to complete.
The SDK includes helper functions which will poll the status until it reaches a terminal state and then return the resulting object.
If an API method results in an action which could benefit from polling there will be a corresponding version of the
method ending in `AndPoll`.

All methods also allow you to set the polling frequency, how often the API is checked for an update, via a function argument (`pollIntervalMs`).

The polling methods are:

```ts
client.beta.threads.createAndRunPoll(...)
client.beta.threads.runs.createAndPoll(...)
client.beta.threads.runs.submitToolOutputsAndPoll(...)
client.beta.vectorStores.files.uploadAndPoll(...)
client.beta.vectorStores.files.createAndPoll(...)
client.beta.vectorStores.fileBatches.createAndPoll(...)
client.beta.vectorStores.fileBatches.uploadAndPoll(...)
```
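Conceptually, each `AndPoll` helper re-fetches the object until its status reaches a terminal state, pausing `pollIntervalMs` between checks. The sketch below illustrates that loop in a self-contained form; the `Status` union, the terminal-status set, and the `pollUntilTerminal` name are assumptions for illustration, not the SDK's actual internals.

```ts
type Status = 'queued' | 'in_progress' | 'completed' | 'failed' | 'cancelled' | 'expired';

// Assumed set of statuses treated as terminal, for illustration only.
const TERMINAL = new Set<Status>(['completed', 'failed', 'cancelled', 'expired']);

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Repeatedly re-fetches an object until its status is terminal,
// waiting pollIntervalMs between checks.
async function pollUntilTerminal<T extends { status: Status }>(
  fetchLatest: () => Promise<T>,
  pollIntervalMs = 1000,
): Promise<T> {
  while (true) {
    const obj = await fetchLatest();
    if (TERMINAL.has(obj.status)) return obj;
    await sleep(pollIntervalMs);
  }
}
```

A smaller `pollIntervalMs` returns sooner after the operation finishes, at the cost of more API requests.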
4 changes: 2 additions & 2 deletions src/lib/AssistantStream.ts
@@ -7,7 +7,7 @@ import {
ImageFile,
TextDelta,
Messages,
-} from 'openai/resources/beta/threads/messages/messages';
+} from 'openai/resources/beta/threads/messages';
import * as Core from 'openai/core';
import { RequestOptions } from 'openai/core';
import {
Expand All @@ -30,7 +30,7 @@ import {
MessageStreamEvent,
RunStepStreamEvent,
RunStreamEvent,
-} from 'openai/resources/beta/assistants/assistants';
+} from 'openai/resources/beta/assistants';
import { RunStep, RunStepDelta, ToolCall, ToolCallDelta } from 'openai/resources/beta/threads/runs/steps';
import { ThreadCreateAndRunParamsBase, Threads } from 'openai/resources/beta/threads/threads';
import MessageDelta = Messages.MessageDelta;
23 changes: 23 additions & 0 deletions src/lib/Util.ts
@@ -0,0 +1,23 @@
/**
* Like `Promise.allSettled()` but throws an error if any promises are rejected.
*/
export const allSettledWithThrow = async <R>(promises: Promise<R>[]): Promise<R[]> => {
const results = await Promise.allSettled(promises);
const rejected = results.filter((result): result is PromiseRejectedResult => result.status === 'rejected');
if (rejected.length) {
for (const result of rejected) {
console.error(result.reason);
}

throw new Error(`${rejected.length} promise(s) failed - see the above errors`);
}

// Note: TS was complaining about using `.filter().map()` here for some reason
const values: R[] = [];
for (const result of results) {
if (result.status === 'fulfilled') {
values.push(result.value);
}
}
return values;
};
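For reference, a condensed standalone version of the new helper and its observable behavior: all promises settle before anything is returned or thrown. The body is re-inlined here so the snippet runs on its own, and the `flatMap` collection is an equivalent rewrite for brevity, not the committed code.

```ts
const allSettledWithThrow = async <R>(promises: Promise<R>[]): Promise<R[]> => {
  const results = await Promise.allSettled(promises);
  const rejected = results.filter((r): r is PromiseRejectedResult => r.status === 'rejected');
  if (rejected.length) throw new Error(`${rejected.length} promise(s) failed`);
  return results.flatMap((r) => (r.status === 'fulfilled' ? [r.value] : []));
};

// All fulfilled: resolves with the values in input order.
allSettledWithThrow([Promise.resolve(1), Promise.resolve(2)]).then(console.log); // [ 1, 2 ]

// Any rejection: still waits for every promise to settle, then throws.
allSettledWithThrow([Promise.resolve(1), Promise.reject(new Error('boom'))]).catch((e: Error) =>
  console.log(e.message),
);
```

Unlike bare `Promise.all()`, this never rejects early, so no in-flight upload is abandoned mid-request.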
