
Vercel Long Running task (edge function) #1151

Open
Skn0tt opened this issue Jun 26, 2023 · Discussed in #1150 · 8 comments

@Skn0tt
Member

Skn0tt commented Jun 26, 2023

Discussed in #1150

Originally posted by nilooy June 25, 2023
Is it possible to use a Quirrel queue with a Vercel Edge Function? I was looking specifically to run this as a background job via Quirrel:
https://github.com/inngest/vercel-ai-sdk/blob/main/examples/next-openai/app/api/chat/route.ts

I tried the following approach:

import { Queue as TestQueue } from "quirrel/next";
import { Configuration, OpenAIApi } from "openai-edge";
import { OpenAIStream, StreamingTextResponse } from "ai";

export const runtime = "edge";

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(config);

// @ts-ignore
export default TestQueue("api/test", async (params) => {
  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "explain the next js" }],
  });

  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
});

and ran it from another route:

await TestQueue.enqueue({ test: 123 });

This results in the following error while running:

👟Executing job
  queue: /api/test
     id: 7f0226c0-4824-4671-9efa-e926484e95ae
   body: {"test":123}
error - node_modules/quirrel/dist/esm/src/client/enhanced-json.js (13:0) @ Module.parse
error - Unexpected token o in JSON at position 1
null

This worked perfectly without the Edge runtime.
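The "Unexpected token o in JSON at position 1" error above is the classic symptom of `JSON.parse` being handed a value that is already an object: it gets coerced to the string `"[object Object]"`, whose `o` at index 1 is not valid JSON. A minimal sketch of that failure mode (the guess that Quirrel's `enhanced-json` parser receives an already-parsed body under the Edge runtime is an assumption, not confirmed by the stack trace alone):

```javascript
// Sketch: reproducing the "Unexpected token o" failure mode.
// Passing an already-parsed object to JSON.parse coerces it to the
// string "[object Object]" first; the "o" at index 1 is not valid JSON.
const alreadyParsed = { test: 123 };

let message = "";
try {
  JSON.parse(alreadyParsed); // coerced to "[object Object]"
} catch (err) {
  message = err.message;
}

console.log(message);
```

The exact wording of the message varies between JavaScript engines, but it always complains about an unexpected token near the start of the input.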

@Skn0tt
Member Author

Skn0tt commented Jun 26, 2023

In #1150 (comment), @nilooy mentions that it works when switching to quirrel/next-app.

@Skn0tt
Member Author

Skn0tt commented Jun 26, 2023

Hi @nilooy! According to the Next.js docs, switching to runtime = "edge" changes the API of API Routes completely. You won't be able to use runtime = "edge" in conjunction with quirrel/next, which you already found out. Now you're mentioning that with quirrel/next-app, "streaming doesn't work". Can you go into more detail on that? What exactly doesn't work? Is it related to Quirrel? A reproduction case for that would be lovely.

@nilooy

nilooy commented Jun 26, 2023

In #1150 (comment), @nilooy mentions that it works when switching to quirrel/next-app.

Thanks for converting it to an issue. The error mentioned above was gone after switching from quirrel/next back to quirrel/next-app, but I'm not sure how to make streaming work.

As far as I understand, Quirrel calls the API back at the scheduled time (or as a background task). That API is hosted on Vercel, which has a timeout limit of 60s, but I need a long-running task. In the code block below, I want to keep the process running until OpenAI finishes its response stream. The main goal is to bypass Vercel's 60s timeout limit, which can be extended by streaming the response.

export default TestQueue("api/test", async (params) => {
  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "explain the next js" }],
  });

  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
});

As mentioned in the Vercel streaming docs: https://vercel.com/docs/concepts/functions/edge-functions/streaming

Edge Functions must begin sending a response within 30 seconds to fall within the maximum initial response time. Once a reply is made, the function can continue to run. This means that you can use Edge Functions to stream data, and the response will be delivered as soon as the first chunk of data is available.
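The pattern the quoted docs describe (send the first byte quickly, keep the connection open while the slow work runs) can be sketched with the web `ReadableStream` API. This is illustrative only and independent of Quirrel; `longRunningStream` and the generator it consumes are hypothetical names:

```javascript
// Sketch of the streaming pattern from the quoted Vercel docs: enqueue the
// first chunk as soon as it is available, then keep the response open while
// the long-running work produces the rest.
function longRunningStream(produceChunks) {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      for await (const chunk of produceChunks()) {
        controller.enqueue(encoder.encode(chunk)); // delivered as it arrives
      }
      controller.close();
    },
  });
}

// An Edge route could then respond with something like:
//   return new Response(longRunningStream(work), {
//     headers: { "Content-Type": "text/plain; charset=utf-8" },
//   });
```

The key point for the timeout discussion: the initial response starts within the 30-second window, and the function keeps running while chunks are flushed.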

@Skn0tt
Member Author

Skn0tt commented Jun 26, 2023

I don't think that's currently possible with Quirrel. You're returning a StreamingTextResponse, but Quirrel isn't accessing that return value in any way. It's always returning "OK" as the response body:

body: "OK",

Before we think about solving this, please elaborate on your use case. What's the reason you're accessing OpenAI from a queue, where you can't send data to your frontend?

@nilooy

nilooy commented Jun 26, 2023

The main use case here: I'm making an OpenAI call with quite a large prompt and then parsing the data into JSON; in total the process can take up to 2 minutes. I can't let my users wait that long in the frontend, so I must take the path of fan-out jobs and notify the user when it's done, instead of keeping them on the page for 2 minutes.

At the moment there are a few discussions but no valid solution, except Defer or a custom one.

Some info:
https://www.reddit.com/r/nextjs/comments/uhhmga/best_way_to_deal_with_long_background_jobs_when/
vercel/next.js#34266
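The fan-out shape described above (enqueue the job, run it in the background, let the user check back when it's done) can be sketched with a plain in-memory status store. Everything here is illustrative, not Quirrel API: `enqueueJob`, `jobStatus`, and the `Map` stand in for a real queue worker and a durable store (database, KV) that a frontend would poll or be notified from:

```javascript
// Sketch of the fan-out pattern: the route enqueues work and returns
// immediately; the worker records its result where the frontend can find it.
// In production the Map would be a durable store and the work would run in a
// queue worker, not in-process.
const jobs = new Map();

function enqueueJob(id, work) {
  jobs.set(id, { status: "pending", result: null });
  Promise.resolve()
    .then(work)
    .then((result) => jobs.set(id, { status: "done", result }))
    .catch((err) => jobs.set(id, { status: "failed", result: String(err) }));
}

function jobStatus(id) {
  return jobs.get(id) ?? { status: "unknown", result: null };
}
```

The route that enqueues responds within the timeout; only the worker needs to run for the full 2 minutes.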

@Skn0tt
Member Author

Skn0tt commented Jun 26, 2023

Makes sense, thank you!

I don't think that Quirrel currently supports that, and I'd need to think a bit about the best way of implementing support for these long-running jobs. Have you looked into https://www.inngest.com/? Would something like that solve your needs?

@nilooy

nilooy commented Jun 26, 2023

OK, perfect. Inngest has a very different way of solving this issue, but with it I'd need to change my entire Next.js API workflow and would get into strong vendor lock-in. I have another approach I tested with a GCP function; I'll probably go with that, or build a Node.js server with BullMQ.
I'll check back later if you have any solutions in mind for Quirrel. By the way, Quirrel is really good. Love it.

@nilooy

nilooy commented Jun 26, 2023

@Skn0tt, can you please rename the issue title to "Vercel Long Running task (edge function)"? That might draw interest from other people with the same needs.

It doesn't allow me to change it.

@Skn0tt Skn0tt changed the title vercel edge function Vercel Long Running task (edge function) Jun 26, 2023