Add limitConcurrency helper #8

Open
rektide opened this issue Jun 7, 2023 · 0 comments

rektide commented Jun 7, 2023

#4 has the idea of a bufferAhead helper that can create concurrency. It includes some discussion of ways bufferAhead might want to limit concurrency, and in that PR I've advocated for keeping bufferAhead simpler / complecting fewer concerns into it. But the need for a way to limit concurrency (the number of unresolved promises at any given time) seems real.

let pages = asyncIteratorOfUrls
  .map(u => fetch(u))
  .limitConcurrency(2);

// An eager consumer we want to guard against;
// a more realistic example might be some kind of thread pool.
for (let i = 0; i < 10; ++i) {
  pages.next().then(() => console.log("got page"));
}
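
For concreteness, here's a rough sketch of one way such a wrapper could behave, assuming the semantics are "cap the number of in-flight next() pulls on the underlying iterator". This is not a proposed implementation; the standalone limitConcurrencyOf function and its slot bookkeeping are just illustrative.

// Hypothetical sketch, not the proposal: a semaphore around next() so that
// at most `limit` pulls on the underlying iterator are pending at once,
// even when a consumer calls next() eagerly without awaiting.
function limitConcurrencyOf(iterator, limit) {
  let inFlight = 0;
  const waiters = [];
  const acquire = () => {
    if (inFlight < limit) {
      inFlight++;
      return Promise.resolve();
    }
    return new Promise(resolve => waiters.push(resolve));
  };
  const release = () => {
    const next = waiters.shift();
    if (next) next(); // hand the freed slot directly to a queued pull
    else inFlight--;
  };
  return {
    async next(...args) {
      await acquire();
      try {
        return await iterator.next(...args);
      } finally {
        release();
      }
    },
    [Symbol.asyncIterator]() {
      return this;
    },
  };
}

With something shaped like that, the eager loop above would keep at most 2 pulls outstanding (and so, assuming the upstream map forwards concurrent pulls, at most 2 fetches), queueing the remaining next() calls until a slot frees up.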

Across a number of jobs I've built, or been around folks building, Staged Event Driven Architecture (SEDA)-like structures: a bunch of processors with queues in between them. Along with bufferAhead, the limitConcurrency(n) helper proposed here should let those architectures be put together quite quickly:

const [staticReq, dynamicReq] = packet
  .map(parse).limitConcurrency(3).bufferAhead(3)
  .map(urlDispatch).limitConcurrency(3).bufferAhead(3)
  // imagining a fork/join helper to split/merge iterators
  .fork(staticOrDynamic);
const generated = dynamicReq.map(generateDynamic).limitConcurrency(3).bufferAhead(3);
const [hit, miss] = staticReq
  .map(checkCache).limitConcurrency(3).bufferAhead(3)
  .fork(isCacheHit);
const filled = miss.map(loadFile).limitConcurrency(3).bufferAhead(3);
const sent = join(generated, hit, filled).map(send).limitConcurrency(5).bufferAhead(5);
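
To drive a pipeline like that, here's a hedged sketch of a consumer in the "thread pool" spirit mentioned above: a handful of workers each keeping one pull in flight on the final stage, with the limitConcurrency/bufferAhead calls upstream keeping each stage's in-flight work bounded. The drive helper and its worker count are made up purely for illustration.

// Hypothetical: run `workers` concurrent pulls against the final iterator
// until it is exhausted.
async function drive(iterator, workers) {
  await Promise.all(
    Array.from({ length: workers }, async () => {
      while (true) {
        const { done } = await iterator.next();
        if (done) return;
      }
    })
  );
}

drive(sent, 5).then(() => console.log("pipeline drained"));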

It's possible we could try to combine all these behaviors into one big helper, and the example above is admittedly a bit simplistic. But de-complecting the concerns feels like it creates more possibilities than trying to tackle everything together.
