Throttling (only run N functions over a specific time) #372

Open · DaleLJefferson opened this issue Jan 24, 2023 · 6 comments
Labels: feature (New feature or request)

DaleLJefferson commented Jan 24, 2023

While my functions are serverless, I have some dependencies which are not; for example, I have an on-premise WhatsApp server which can only handle 25 requests per second.

I fire 400 events and they happily invoke serverless functions, which call the WhatsApp API as fast as they can, error, and then retry. All my messages go out eventually, but it's not pretty.

Concurrency limits will help, but rate limiting is a slightly different issue: the effective request rate depends on both concurrency and function invocation time, and invocation time is variable.

If my functions take 100 ms to execute and I have a concurrency of 100, that's on the order of 1,000 requests per second (not sure how much latency the system adds, but it's far more than the API can handle).

Workarounds would be to set concurrency so that concurrency × function time stays conservatively below the rate limit, to introduce an artificial delay (setTimeout(() => sendMessage(), 1000)), or to delay the functions so they're spread over a longer period.

Ideally the functions would throw an error signalling the queue to slow down, in the same way NonRetriableError tells the queue not to retry:

throw new RateLimitError("slow down");
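
For illustration, a rough sketch of what I mean from inside the function; RateLimitError doesn't exist yet, and the endpoint and payload shape are made up:

// Hypothetical sketch: RateLimitError is the proposed error, mirroring NonRetriableError.
class RateLimitError extends Error {}

async function sendMessage(payload: { to: string; body: string }) {
  const res = await fetch("https://whatsapp.internal.example/send", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });

  if (res.status === 429) {
    // Signal the queue to slow down instead of retrying immediately.
    throw new RateLimitError("slow down");
  }
  if (!res.ok) {
    throw new Error(`WhatsApp API error: ${res.status}`);
  }
}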
DaleLJefferson added the feature (New feature or request) label Jan 24, 2023
john-griffin commented:

Yeah, some form of global rate limiting is key for us too. We rely on this in Elixir's Oban when working with third-party APIs.

WesleyYue commented:

I had some discussion on rate limiting with the team here, if anyone's interested: https://discord.com/channels/842170679536517141/845000011040555018/1109915630917922958

tonyhb (Contributor) commented Aug 15, 2023

Done!

tonyhb closed this as completed Aug 15, 2023
WesleyYue commented:

@tonyhb that's exciting! Where can we find more information/documentation about the new feature?

tonyhb changed the title from "Rate Limiting" to "Throttling (only run N functions over a specific time)" Aug 31, 2023
tonyhb reopened this Aug 31, 2023
tonyhb (Contributor) commented Aug 31, 2023

Re-opening this and aiming to clarify a few things:

  • "Rate limiting" is implemented and is lossy: it prevents new functions from running when over the rate limit.
  • "Throttling" needs to be implemented. This issue represents throttling. Throttling is "concurrency over time": run at most 5 functions per minute, enqueueing all other functions in a backlog until there's capacity.

This issue has been reopened and renamed to specifically discuss throttling. We're aiming to build throttling directly into function config.
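
To make the distinction concrete, here's a speculative sketch of what throttling in function config could look like; the rateLimit/throttle keys and their shapes below are illustrative, not a finalized API, and callWhatsApp stands in for your own API call:

import { Inngest } from "inngest";

const inngest = new Inngest({ id: "my-app" });

// Speculative config shape: key names and values are illustrative only.
export const sendWhatsApp = inngest.createFunction(
  {
    id: "send-whatsapp-message",
    // "Rate limiting" (lossy): runs over the limit are skipped entirely.
    // rateLimit: { limit: 25, period: "1s" },
    // "Throttling" (not lossy): excess runs wait in a backlog until there's capacity.
    throttle: { limit: 25, period: "1s" },
  },
  { event: "app/whatsapp.message.requested" },
  async ({ event }) => {
    // At most 25 of these run per second; the rest stay enqueued.
    await callWhatsApp(event.data);
  }
);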

Re: a custom error for retrying at specific times, you can implement your own backoff by returning a retry-at time in the error response, as parsed here: https://github.com/inngest/inngest/blob/f223b24454970bee45515ad7f85d7c8b1eaa67c8/pkg/execution/state/driver_response.go#L175-L177/. This will be exposed in v3 of the SDK, allowing you to "back off" and control retries however you'd like.
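
As a sketch of that idea, assuming the v3 SDK exposes a retry-after style error (the RetryAfterError name and signature here are illustrative, as is the endpoint):

import { RetryAfterError } from "inngest";

// Sketch only: ask the queue to retry this step at a specific time
// instead of using the default backoff.
async function sendWithBackoff(payload: { to: string; body: string }) {
  const res = await fetch("https://whatsapp.internal.example/send", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });

  if (res.status === 429) {
    // Use the window the server reports (seconds) to schedule the retry.
    const seconds = Number(res.headers.get("Retry-After") ?? "10");
    throw new RetryAfterError("WhatsApp rate limit hit", seconds * 1000);
  }
}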

karam-khanna commented:

+1 on this. Would be really helpful for handling external APIs with rate limits.

IGassmann pushed a commit that referenced this issue Feb 14, 2024
* Update cron use case base with new design, ctas

* Delete draft page for now