Core dump crash #29353

Closed
jakeleventhal opened this issue Aug 28, 2019 · 14 comments
Labels
http2 Issues or PRs related to the http2 subsystem.

Comments

@jakeleventhal

  • Version: v12.9.0
  • Platform: Linux b8b01391ee38 4.9.125-linuxkit #1 SMP Fri Sep 7 08:20:28 UTC 2018 x86_64 GNU/Linux
  • Subsystem: Running on the node:latest docker image

Please see the issue here googleapis/nodejs-firestore#739
This describes details of how I was able to reproduce this problem using the @google-cloud/firestore module.
Regardless of the module, node should not be crashing like this.

@Trott added the http2 label on Aug 28, 2019
@Trott
Member

Trott commented Aug 28, 2019

@nodejs/http2

@addaleax
Member

@jakeleventhal Thanks for the report! Any chance you could share reproduction code? It may seem like “nothing complicated” to you or other people familiar with the firestore module, but it would be very helpful to have something a bit more concrete.

@jakeleventhal
Author

Here is the function that is being run (I changed some variable names and removed some intermediate steps that just manipulate variables, nothing to do with promises or networking). From the looks of it, there is nothing suspicious going on here.

import { Firestore } from '@google-cloud/firestore';

// Initialized Firestore client (assumed; the original excerpt omits this setup).
const firestore = new Firestore();

export const saveReportData = async (categories, reportData) => {
  // Create the map of updates
  const allUpdates = {};
  categories.forEach((category) => {
    allUpdates[category] = { items: {} };
  });

  await Promise.all(
    reportData
      .filter(({ channel }) => categories.includes(channel))
      .map(async ({ id, email }) => {
        // Get the existing item
        const existingItem = await firestore.doc(`Items/${id}`).get();

        // If the item is already saved in the database and hasn't been processed yet
        if (existingItem.exists && !existingItem.data().Processed) {
          const itemUpdatePayload = {
            Processed: true,
            User: email,
            Status: 'Complete'
          };

          // Update the item in the database
          await firestore.doc(`Items/${id}`).update(itemUpdatePayload);
        }
      })
  );
};

@addaleax
Member

Unfortunately, I can’t reproduce an error with that code (I ran it with the dummy input saveReportData(['a', 'b', 'c'], [{ channel: 'a', id: 'foo', email: 'bar' }])).

We could just remove that check, but this does point to a bigger bug in the HTTP/2 implementation of some kind, and so it would be really good to figure out what that is.
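
For reference, a standalone harness along those lines might look like the sketch below. It is hypothetical and not from the thread: it stubs a minimal firestore object and condenses the function from the comment above so the dummy input can be run without real network traffic, which also means it is unlikely to trigger the actual HTTP/2 crash.

// Hypothetical harness, not from the issue: stub just enough of the Firestore API
// that a condensed saveReportData can run locally with dummy input.
const firestore = {
  doc: (path) => ({
    get: async () => ({ exists: true, data: () => ({ Processed: false }) }),
    update: async (payload) => {
      console.log(`update ${path}`, payload);
    }
  })
};

// Condensed from the snippet above; the unused allUpdates map is omitted.
const saveReportData = async (categories, reportData) => {
  await Promise.all(
    reportData
      .filter(({ channel }) => categories.includes(channel))
      .map(async ({ id, email }) => {
        const existingItem = await firestore.doc(`Items/${id}`).get();
        if (existingItem.exists && !existingItem.data().Processed) {
          await firestore.doc(`Items/${id}`).update({
            Processed: true,
            User: email,
            Status: 'Complete'
          });
        }
      })
  );
};

// Dummy input from the comment above.
saveReportData(['a', 'b', 'c'], [{ channel: 'a', id: 'foo', email: 'bar' }])
  .then(() => console.log('done'))
  .catch(console.error);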

@jakeleventhal
Author

jakeleventhal commented Aug 28, 2019

Like I said in the other issue, I can’t get it to reproduce reliably. In fact, it happens in a minority of cases, and it appears to always happen after a deadline exceeded error.

On a side note, how are these deadline exceeded errors avoided, anyway?

@addaleax
Member

> On a side note, how are these deadline exceeded errors avoided, anyway?

I’m afraid that’s a question for the GCP folks.

@cjihrig
Contributor

cjihrig commented Aug 28, 2019

> On a side note, how are these deadline exceeded errors avoided, anyway?

> I’m afraid that’s a question for the GCP folks.

To clarify that a bit more: the deadline is a gRPC-specific concept; you can see that the stack trace contains @grpc/grpc-js.
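
For context on the deadline concept, here is a rough sketch (not Firestore's internal code): in @grpc/grpc-js a per-call deadline is passed in the call options, and if it expires before the server responds the call fails with DEADLINE_EXCEEDED. The client and its getItem method below are hypothetical placeholders; the Firestore library constructs the real gRPC calls internally.

import * as grpc from '@grpc/grpc-js';

// Sketch only: `client` is assumed to be an already-constructed grpc-js service
// client, and `getItem` a hypothetical unary method on it.
function getItemWithDeadline(client, id, callback) {
  // Absolute point in time after which the call fails with DEADLINE_EXCEEDED.
  const deadline = new Date(Date.now() + 60 * 1000);

  client.getItem({ id }, { deadline }, (err, response) => {
    if (err && err.code === grpc.status.DEADLINE_EXCEEDED) {
      // The server did not answer in time; retry, back off, or surface the timeout.
      return callback(err);
    }
    callback(err, response);
  });
}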

@jakeleventhal
Author

Okay, well, if it isn’t going to break anything, can the check be removed, at least until the source of the problem is found?

addaleax added a commit to addaleax/node that referenced this issue Sep 1, 2019
Don’t start reading more input data if we’re still busy writing output.
This was overlooked in 8a4a193.

Fixes: nodejs#29353
Fixes: nodejs#29393
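
The referenced change lives in Node's C++ HTTP/2 internals, but the idea in the commit message is ordinary backpressure. As a user-level analogy only (this is not the internal fix, and the functions below are hypothetical), the same "don't read more input until the pending write is flushed" pattern looks like this with any writable stream:

import { createWriteStream } from 'fs';

// User-level analogy only, not the internal http2 change: stop pulling the next
// chunk of input while the destination still has a pending write.
function writeAll(inputIterator, destination) {
  let { value, done } = inputIterator.next();
  while (!done) {
    if (!destination.write(value)) {
      // Backpressure: wait for 'drain' before pulling more input.
      destination.once('drain', () => writeAll(inputIterator, destination));
      return;
    }
    ({ value, done } = inputIterator.next());
  }
  destination.end();
}

function* chunks() {
  yield 'a\n';
  yield 'b\n';
  yield 'c\n';
}

writeAll(chunks(), createWriteStream('/tmp/example-output.txt'));
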
@addaleax
Member

addaleax commented Sep 1, 2019

#29399 should fix this.

@jakeleventhal
Author

> #29399 should fix this.

Hopefully. How long does it take for Node to release updates, though...?
I am using the official node:latest Docker image. Should I expect months of waiting before this gets resolved and verified?

@bnoordhuis
Member

@jakeleventhal For latest, probably 1-2 weeks.

@danbriggs5

Could this also be affecting older versions? Seeing something similar on v10.16. If so, how long would it take to get a fix released to 10.16?

@jakeleventhal
Author

@addaleax Is there any validation plan for this? I am unable to reproduce the error reliably.

@bnoordhuis
Member

@danbriggs5 #29202 (comment) :-)

addaleax added a commit to addaleax/node that referenced this issue Sep 19, 2019
Don’t start reading more input data if we’re still busy writing output.
This was overlooked in 8a4a193.

Fixes: nodejs#29353
Fixes: nodejs#29393

PR-URL: nodejs#29399
Reviewed-By: David Carlier <devnexen@gmail.com>
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
targos pushed a commit that referenced this issue Sep 20, 2019
Don’t start reading more input data if we’re still busy writing output.
This was overlooked in 8a4a193.

Fixes: #29353
Fixes: #29393

PR-URL: #29399
Reviewed-By: David Carlier <devnexen@gmail.com>
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
BethGriggs pushed a commit that referenced this issue Sep 25, 2019
Don’t start reading more input data if we’re still busy writing output.
This was overlooked in 8a4a193.

Fixes: #29353
Fixes: #29393

PR-URL: #29399
Backport-PR-URL: #29618
Reviewed-By: David Carlier <devnexen@gmail.com>
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
BethGriggs pushed a commit that referenced this issue Oct 1, 2019
Don’t start reading more input data if we’re still busy writing output.
This was overlooked in 8a4a193.

Fixes: #29353
Fixes: #29393

PR-URL: #29399
Backport-PR-URL: #29619
Reviewed-By: David Carlier <devnexen@gmail.com>
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>