
Memory leak seemingly caused by typescript __awaiter helper #3730

Closed
matthemsteger opened this issue Jan 28, 2020 · 8 comments

@matthemsteger

We have been investigating a memory leak in our GraphQL server implementation and believe we have a root cause. We have ruled out much of the network stack and other external factors, and the leak persists.

There is a large amount of retained memory held by Generator objects. We believe this is caused by microsoft/TypeScript#36056, which is also referenced in https://stackoverflow.com/questions/58875158/nodejs-memory-growth-memory-leak-in-system, and it matches our findings. We do not use any generators in our own code, and we compile with Babel targeting current environments, so our own code uses native async/await.

The compiled Apollo source, however, is littered with __awaiter.
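For context, this is roughly the pattern tsc produces for async functions when the compilation target is ES2015/ES6 (paraphrased for brevity; the function names below are made up, not taken from the Apollo source):

```js
// Simplified version of the __awaiter helper that tsc injects when "target"
// is ES2015/ES6. The real emit is equivalent but more defensive.
var __awaiter = function (thisArg, _arguments, P, generator) {
  return new (P || (P = Promise))(function (resolve, reject) {
    const gen = generator.apply(thisArg, _arguments || []);
    function fulfilled(value) { try { step(gen.next(value)); } catch (e) { reject(e); } }
    function rejected(value) { try { step(gen["throw"](value)); } catch (e) { reject(e); } }
    function step(result) {
      result.done ? resolve(result.value) : Promise.resolve(result.value).then(fulfilled, rejected);
    }
    step(gen.next());
  });
};

// Stand-in for any real async work (hypothetical name, not from Apollo Server).
function doWork() {
  return Promise.resolve(42);
}

// A source function like `async function executeOperation() { return await doWork(); }`
// therefore compiles to a generator driven by the helper above. These generators
// are what we see retaining memory in the heap snapshots.
function executeOperation() {
  return __awaiter(this, void 0, void 0, function* () {
    return yield doWork();
  });
}
```

With a target of ES2017 or later, tsc leaves async/await in place and no generator is created.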

Could Apollo address this by releasing a version that does not use __awaiter and instead relies on native async/await?

We do not have many options to address this issue otherwise.

I can add additional information here from the heap dumps if necessary. I believe a test case can be created; unfortunately, I do not have one readily available, because I have been working against our own internal (closed-source) Apollo GraphQL server implementation.
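A rough sketch of what a standalone test case might look like (hypothetical and unverified against this exact leak; compile with tsc targeting ES2015 so the __awaiter helper is emitted, and run on Node 12.13/12.14):

```ts
// repro-sketch.ts -- compile with `tsc --target es2015 repro-sketch.ts`,
// then run `node --expose-gc repro-sketch.js` and watch whether heapUsed
// keeps growing across iterations.
async function doSomeAsyncWork(): Promise<number> {
  return await Promise.resolve(Math.random());
}

async function main(): Promise<void> {
  for (let i = 0; i < 50; i++) {
    for (let j = 0; j < 100000; j++) {
      await doSomeAsyncWork();
    }
    // Force a GC between iterations (requires --expose-gc) so growth in
    // heapUsed reflects retained objects rather than lazy collection.
    (global as any).gc?.();
    console.log(`iteration ${i}: heapUsed=${process.memoryUsage().heapUsed}`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```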

@abernix abernix added this to the Release 3.x milestone Jan 28, 2020
@4n3ver commented Jan 28, 2020

Are you using Node 12? It might be related to nodejs/node#30753.

@abernix (Member) commented Jan 28, 2020

Interesting! I agree with your diagnosis that this could be due to microsoft/TypeScript#36056.

Would you be able to confirm whether an older version of TypeScript's helpers is free of this behavior? If so, I might support downgrading our TypeScript compiler to a still reasonably recent version. Similarly, perhaps there's a recent version of the apollo-server-* packages that was compiled with an older tsc and doesn't emit this buggy implementation. You're in a prime position to experiment, it would seem!

Unfortunately, releasing a version that transpiles to a newer ECMAScript target isn't something we're likely to be able to do without bumping the major version of Apollo Server (and taking on the associated work that goes with that). While that major bump is something we're planning and are excited and eager to do, it's not something we can offer immediately. I can't currently offer precision on the timeline, but it's of high urgency to us (particularly to me!), and I hope to have (hopefully mostly stable!) pre-releases ready this quarter.

If this is urgent and you need the most recent version of apollo-server, you can clone this repository, bump the compilation target, and use npm pack'd tarballs from that build, though I realize that's far from automated. There are a number of CI scripts and tooling that could make that easier, though it would take more elaboration to explain in concrete steps. Happy to help with this.
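(For anyone attempting that workaround: the change would be along these lines in the repo's tsconfig; I'm not certain of the exact current value there, so treat this as illustrative.)

```jsonc
{
  "compilerOptions": {
    // Raising the emit target to ES2017 or later makes tsc keep native
    // async/await instead of injecting the __awaiter/generator helper.
    "target": "es2017"
  }
}
```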

I'm also hopeful that TypeScript will fix this if it is in fact a problem on their end. It looks like they've earmarked it for an upcoming milestone, though I can't be certain whether or when that will pan out. They might be less eager to resolve it since compiling down to ES6 is becoming less common, though it certainly seems like something they'd still want to fix.

I have some other ideas on next steps here, but I'll leave it at this for now. Thanks for reporting this!

@abernix abernix self-assigned this Jan 28, 2020
@abernix (Member) commented Jan 28, 2020

Oh, also, what version of Node.js are you using?

@jackmarketon commented Jan 28, 2020

@abernix I work with @matthemsteger; that's correct, we use Node 12.

@abernix (Member) commented Jan 28, 2020

@jackmarketon @matthemsteger If @4n3ver's assessment turns out to be correct, it looks like the V8 commit that has been backported to v12 and is purported to fix it will land in the upcoming Node.js v12.15: nodejs/node#31368.

Might you be able to test a build of node from the Node.js v12 trunk and see if it resolves the issue?

@matthemsteger (Author)

Thanks @abernix and @4n3ver. I think we can downgrade to Node 10 in production for the time being and then upgrade once 12.15 comes out.

I'll update this ticket once I confirm the problem goes away on Node 10.

@abernix abernix added the 🍳 breaking-change Needs to wait for a major release. label Jan 28, 2020
@matthemsteger (Author)

I want to close the loop. We have downgraded to Node 10 and we are seeing stable memory usage in production.

@abernix abernix removed this from the Release 3.x milestone Feb 13, 2020
@abernix abernix removed the 🍳 breaking-change Needs to wait for a major release. label Feb 13, 2020
@abernix (Member) commented Feb 13, 2020

V8 commit v8/v8@d3a1a5b6c491, which contains the fix for this memory leak, landed (after being delayed by the v12.15.0 security release) in Tuesday's v12.16.0 as nodejs/node@c004cf51c6, via nodejs/node#31691.

That's all to say: I hope you can upgrade to Node 12 again and be free of this! We'll still bump the ECMAScript compilation target in the next major version.

I'm going to close this, but please feel free to ping me if it's not in fact fixed. Thanks so much for reporting this originally! And a salute to @4n3ver for identifying the culprit. ❤️

@abernix abernix closed this as completed Feb 13, 2020
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Apr 20, 2023