
# OpenAI HTTP Proxy (Stream)

A simple HTTP proxy that lets you stream responses from OpenAI's API. It can be used to keep your API key off the client and to add extra functionality around your API calls.

Medium article: *Streaming Open AI API responses with Server-Side Events and Golang*

## Run the proxy

```shell
# Set your API key
$ export OA_API_KEY=sk-...

# Run the proxy
$ go run main.go
```

## Run the proxy with Docker

```shell
# Build the image
$ docker build -t openai-proxy .

# Run the image
$ docker run -p 8080:8080 -e OA_API_KEY=sk-... openai-proxy
```

## Usage

```shell
curl -s -N -X POST -d '{"messages": [{"role": "user", "content": "Hello world!"}]}' http://localhost:8080/message
```

```
event:message
data:{"timestamp":1702330536,"content":""}

event:message
data:{"timestamp":1702330536,"content":"Hello"}

event:message
data:{"timestamp":1702330536,"content":"!"}

event:message
data:{"timestamp":1702330536,"content":" How"}

event:message
data:{"timestamp":1702330536,"content":" can"}

event:message
data:{"timestamp":1702330536,"content":" I"}

event:message
data:{"timestamp":1702330536,"content":" assist"}

event:message
data:{"timestamp":1702330536,"content":" you"}

event:message
data:{"timestamp":1702330536,"content":" today"}

event:message
data:{"timestamp":1702330536,"content":"?"}

event:message
data:{"timestamp":1702330536,"content":""}
```

## Demo UI

This project includes a very simple demo UI for testing the proxy. It is built with React and lives in the `ui/chat-demo` directory.

```shell
$ cd ui/chat-demo && yarn install && yarn start
...
Compiled successfully!

You can now view chat-demo in the browser.

  Local:            http://localhost:3000
  On Your Network:  http://192.168.0.238:3000

Note that the development build is not optimized.
To create a production build, use yarn build.

webpack compiled successfully
```

## About

A Go HTTP proxy that streams responses from OpenAI's chat completions API.
