
OpenAssistants


OpenAssistants is a collection of open-source libraries aimed at developing robust AI assistants rather than autonomous agents. By focusing on specific tasks and incorporating human oversight, OpenAssistants aims to minimize the error rates typical of agentic systems.

  • `openassistants`: the core library responsible for the main function-calling / message-generating runtime
  • `openassistants-fastapi`: a set of FastAPI routes for interacting with the core runtime loop through a REST API
  • `openassistants-react`: an example chat client that supports rich streaming outputs such as tables, plots, form inputs, and text

OpenAssistants is built on LangChain and designed to be an open alternative to OpenAI's Assistants API. We also think it's easier to use!
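
OpenAssistants' function calling ultimately rests on LangChain's tool-calling primitives. For orientation, here is a minimal plain-LangChain sketch; this is not OpenAssistants' own API, `get_weather` is a made-up example tool, and it assumes the `langchain-openai` package plus an `OPENAI_API_KEY` in the environment.

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_weather(city: str) -> str:
    """Return a short weather summary for a city."""
    # Toy stand-in; a real tool would call a weather service.
    return f"It is sunny in {city}."


# Bind the tool so the model can emit structured tool calls.
llm = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_weather])
response = llm.invoke("What's the weather in Paris?")
print(response.tool_calls)  # [{'name': 'get_weather', 'args': {'city': 'Paris'}, ...}]
```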

✨ Check out the live demo

Join us in creating AI assistants that are not only useful but dependable for production use today.

Features

  • Included Chat UI
  • Support function calling with any LLM (open source & proprietary)
  • Declarative library of functions
  • Built-in SQL functions (DuckDB support)
  • Extend with any custom Python function (a DuckDB-backed sketch follows this list)
  • Support for 50+ functions in a single chat assistant
  • Native OpenAI Functions integration
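
As an illustration of the custom Python function and DuckDB features above, here is a minimal sketch of the kind of data function an assistant could expose. The function name, file path, and return shape are made up for illustration; OpenAssistants' actual declarative function format is covered in its documentation.

```python
import duckdb


def top_customers(limit: int = 5) -> list[dict]:
    """Example data function an assistant could call.

    'customers.parquet' is a placeholder path; DuckDB can query
    Parquet/CSV files directly by file path.
    """
    rows = duckdb.sql(
        "SELECT name, total_spend FROM 'customers.parquet' "
        f"ORDER BY total_spend DESC LIMIT {int(limit)}"
    ).fetchall()
    return [{"name": name, "total_spend": spend} for name, spend in rows]
```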



*(Screenshot: the OpenAssistants chat UI)*

Quick Start

To run the project locally:

API Server

  • Navigate to `examples/fast-api-server` to start the backend example:

    • Run `poetry install`
    • Activate the virtual environment with `poetry shell`
    • Set the OPENAI_API_KEY environment variable with `export OPENAI_API_KEY=sk-my-key`
    • Start the server with `./run.sh` (a quick smoke test follows below)
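
Once the server is up, a quick smoke test can confirm it is reachable. This is a minimal sketch assuming `run.sh` starts uvicorn on its default port 8000 (check the script if your port differs); `/docs` is the interactive OpenAPI page FastAPI serves by default.

```python
import requests

# Port 8000 is an assumption about run.sh (uvicorn's default);
# /docs is FastAPI's auto-generated interactive documentation page.
resp = requests.get("http://localhost:8000/docs")
print(resp.status_code)  # 200 means the example server is running
```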

Frontend

  • From the repository root, run `yarn install`
  • Navigate to the sample Next.js app with `cd examples/next`
  • Launch the development server with `yarn dev`
  • Access the application at http://localhost:3000

Join our community and start contributing today!

Local development

To develop the UI library locally, you'll also want to run the compiler in watch mode for hot reloading. In a new terminal window:

  • Navigate to the component library: `cd packages/openassistants-react`
  • Run the compiler in watch mode: `yarn start`

Any changes to the library will now be picked up and reflected in the process running the Next.js app.
