
Sentry: Error uploading sourcemap file - Error: connect EADDRNOTAVAIL 35.188.42.15:443 - Local #63

Open
igorescobar opened this issue Jun 22, 2022 · 27 comments

igorescobar commented Jun 22, 2022

Hi,

A few days ago we started getting this error in our pipelines:

Error ---------------------------------------------------
 
  Error: Sentry: Error uploading sourcemap file - Error: connect EADDRNOTAVAIL 35.188.42.15:443 - Local (172.17.0.2:0)
      at SentryPlugin.<anonymous> (/builds/services/node_modules/serverless-sentry/dist/index.js:603:31)
      at step (/builds/services/node_modules/serverless-sentry/dist/index.js:44:23)
      at Object.throw (/builds/services/node_modules/serverless-sentry/dist/index.js:25:53)
      at rejected (/builds/services/node_modules/serverless-sentry/dist/index.js:17:65)
      at processTicksAndRejections (internal/process/task_queues.js:95:5)
 
     For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
 
  Get Support --------------------------------------------
     Docs:          docs.serverless.com
     Bugs:          github.com/serverless/serverless/issues
     Issues:        forum.serverless.com
 
  Your Environment Information ---------------------------
     Operating System:          linux
     Node Version:              14.19.3
     Framework Version:         2.72.3 (local)
     Plugin Version:            5.5.4
     SDK Version:               4.3.2
     Components Version:        3.18.2

This is what our dependencies look like at the moment:

    "@sentry/integrations": "^7.2.0",
    "@sentry/node": "^7.2.0",
    "serverless-sentry-lib": "^2.5.0",
    "serverless-sentry": "^2.5.0",

We even tried to disable this functionality by setting the SENTRY_SOURCEMAPS environment variable or adding the custom config:

sentry:
    sourceMaps: false

But it runs regardless and ignores the flag. Do you have any idea what is happening and how to work around this?


igorescobar commented Jun 22, 2022

Maybe we are being too aggressive here:
https://github.com/arabold/serverless-sentry-plugin/blob/master/src/index.ts#L476 and we are running out of TCP ports/sockets or something because of the large number of files being sent?
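Roughly, the pattern I mean looks like this (a simplified sketch, not the plugin's actual code):

// Simplified sketch of an unbounded parallel upload (illustrative only).
// Every artifact is uploaded at once, so each request grabs its own socket.
// With thousands of files, the roughly 28K ephemeral ports available on the
// build container can run out, which surfaces as EADDRNOTAVAIL.
async function uploadAll(files: string[], uploadFile: (file: string) => Promise<void>): Promise<void> {
  await Promise.all(files.map((file) => uploadFile(file)));
}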


jonmast commented Jun 22, 2022

Thanks for reporting @igorescobar, I'm sorry this is causing you problems. I did see that error myself but it went away on its own so I assumed it was just some sort of transient network issue.

The fact that it still uploads with sourceMaps: false is a bug; we should add a check to prevent that from happening.

As a temporary workaround you could remove your authToken config; that will disable releases entirely, which skips the sourcemap upload.
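In serverless.yml terms, something like this (a sketch; the dsn is just a placeholder):

custom:
  sentry:
    dsn: https://examplePublicKey@o000000.ingest.sentry.io/0000000   # placeholder DSN
    # authToken omitted – without it no release is created,
    # so no sourcemaps get uploaded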

My understanding is that there are around 30K ports available, so I wouldn't have thought we'd be hitting that. Your setup is a bit different from mine though: I'm using serverless-webpack to bundle my files, so only a few get uploaded. If your setup is pulling in all of node_modules there will be many more, and it's plausible you're hitting the limit. We should probably ignore that folder, since most libraries don't ship sourcemaps anyway.


igorescobar commented Jun 22, 2022

Yeah, I also noticed that after upgrading there is no way to disable the source map upload (besides doing what you described). The only way is rolling it back.


arabold commented Jun 22, 2022

I'm literally on a plane right now but should have a quick fix in a bit.


arabold commented Jun 22, 2022

No risk, no fun. Just published a fully untested 2.5.1 that (hopefully) fixes two problems:

  • Fix #63: Upload source maps serially to avoid running out of sockets.
  • Correctly disable uploading source maps if the config setting is false or unset.

🤞
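In essence, the first fix swaps the parallel map for an awaited loop; a rough sketch of that shape (helper names are illustrative, not the plugin source):

// Serial upload sketch: one request in flight at a time. Slow on large
// projects, but it can no longer exhaust sockets/ephemeral ports.
async function uploadAllSerially(files: string[], uploadFile: (file: string) => Promise<void>): Promise<void> {
  for (const file of files) {
    await uploadFile(file);
  }
}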


igorescobar commented Jun 22, 2022

> Fix #63: Upload source maps serially to avoid running out of sockets.

Maybe we should do this in batches instead of calling all of them in parallel or in series... creating batches of 50-100 requests might fix the issue. In our case, uploading all files serially would take ages, I suppose.
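I.e. something along these lines (just a sketch, names are illustrative):

// Batching sketch: send a fixed-size chunk with Promise.all, wait for it to
// finish, then move on to the next chunk. The batch size is only a suggestion.
async function uploadInBatches(
  files: string[],
  uploadFile: (file: string) => Promise<void>,
  batchSize = 50,
): Promise<void> {
  for (let i = 0; i < files.length; i += batchSize) {
    await Promise.all(files.slice(i, i + batchSize).map(uploadFile));
  }
}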


arabold commented Jun 22, 2022

The best would be to upload like 3 to 5 simultaneously for best performance. 50-100 parallel connections are a lot. But I currently lack the time and ability to test this properly, so I went for the quickest fix. It's slower but should work regardless of how large the project is and how many artifacts it has to upload.

@igorescobar

> The best would be to upload like 3 to 5 simultaneously for best performance. 50-100 parallel connections are a lot. But I currently lack the time and ability to test this properly, so I went for the quickest fix. It's slower but should work regardless of how large the project is and how many artifacts it has to upload.

I would also create a property to configure it if necessary. It will be hard to come up with a config that will be nice for everyone so be conservative but also allow people to be more aggressive if they have the computing power to do so 👍
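For example (uploadConcurrency here is a made-up option name, purely to illustrate the idea; no such setting exists today):

custom:
  sentry:
    sourceMaps: true
    uploadConcurrency: 20   # hypothetical option name, just to illustrate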


arabold commented Jun 22, 2022

I have another update 2.5.2. Can you give it a try? It will upload 5 in parallel which should not impact your performance anymore. 10, 20, 50 really shouldn't make a noticeable difference, but feel free to adjust it and try it out if you notice something. I'm open to any changes. Again, this is fully untested 🙄
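Conceptually it keeps a fixed number of uploads in flight and starts the next one as soon as a slot frees up; a rough sketch of that shape (not the plugin's actual code):

// Bounded-concurrency sketch: at most `limit` uploads in flight at any time.
// Each worker pulls the next file off the shared queue as soon as it finishes.
async function uploadWithLimit(
  files: string[],
  uploadFile: (file: string) => Promise<void>,
  limit = 5,
): Promise<void> {
  const queue = [...files];
  const workers = Array.from({ length: limit }, async () => {
    let file: string | undefined;
    while ((file = queue.shift()) !== undefined) {
      await uploadFile(file);
    }
  });
  await Promise.all(workers);
}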


igorescobar commented Jun 22, 2022

Tests checklist v2.5.2

  • Source code is now honoring the sourceMaps default of false, and it deployed the application with no issues. (Job finished in 7 minutes 37 seconds.)
  • sourceMaps: true: the job couldn't finish. It was taking too much time; I stopped the job at 45 min.


igorescobar commented Jun 22, 2022

Already 20 minutes in and counting 😄. No errors, but as expected... it will take ages to finish 😄


arabold commented Jun 22, 2022

😞 How long did it take before my changes to serialize the upload?


igorescobar commented Jun 22, 2022

Without sourcemaps it took about 7 minutes. With sourcemaps we are now 32 minutes in.


igorescobar commented Jun 22, 2022

If you want, I can try to cook up a PR adding a new option to configure the number of files uploaded at the same time. I think we can easily increase it to 10 or 20.

Using a keep-alive agent with the superagent library might also speed up requests considerably.
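For what it's worth, Node's built-in https.Agent already supports keep-alive, so it might not even need an extra library; a sketch (whether superagent accepts a custom agent per request depends on the version in use):

import * as https from "https";

// Node's built-in https.Agent supports keep-alive, so no extra dependency is
// strictly required. Sockets get reused instead of opening a fresh connection
// (and consuming a fresh ephemeral port) per uploaded file.
const keepAliveAgent = new https.Agent({
  keepAlive: true,
  maxSockets: 50, // caps concurrent sockets, similar to the AWS SDK default
});

// How the agent is attached depends on the HTTP client: recent superagent
// versions let you pass it per request via .agent(keepAliveAgent) (worth
// double-checking for the version in use), and node-fetch accepts it as the
// `agent` option.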


arabold commented Jun 22, 2022

Well, that sucks. I don't think we need a configuration - the plugin should use a sensible value that works for most (if not all) scenarios. I'm okay with increasing it to whatever gives comparable performance to before. Technically, even 50 shouldn't be a problem. It seems that many, many small files still generate too much overhead. In a perfect world we would upload all files in a single request, but I haven't reviewed the Sentry APIs in ages, so I don't know if that would even be possible. Using keep-alive would definitely make sense as well!

If you have some availability, I'd be happy to merge any PR that improves the situation. At least we have a working version now, even if it's unacceptably slow.


igorescobar commented Jun 22, 2022

Could you increase it to, like, 25 for now? I'm trying to avoid adding another dependency to the project, as superagent alone needs an extra library to handle keep-alive within Node, I suppose.

@igorescobar

Moving to fetch would also be a preferable option 👍

@igorescobar

(I stopped the job at 45min)


arabold commented Jun 22, 2022

Please give 2.5.3 a shot whenever you have the chance. I increased it to 50 parallel uploads which matches the default the AWS JavaScript SDK uses as well: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/node-configuring-maxsockets.html

So, this should be a sensible default. Otherwise we can still add a configuration setting or think about more complex improvements for the next update.

@igorescobar

Thanks! 🚀
I'm testing it out and I will let you know.

@igorescobar

[screenshot attachment]

😄


igorescobar commented Jun 22, 2022

With that being said... we definitely need to add some retries and grace periods to this logic. If you consider that you can have multiple builds running at the same time, you don't want to fail the entire build because of this.
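For example, a small retry wrapper with a growing delay around each upload; a rough sketch (names are illustrative):

// Retry sketch: re-attempt a failed upload a few times with a growing delay,
// so a transient network error doesn't fail the whole build.
async function uploadWithRetry(
  file: string,
  uploadFile: (file: string) => Promise<void>,
  attempts = 3,
  baseDelayMs = 1000,
): Promise<void> {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      await uploadFile(file);
      return;
    } catch (err) {
      if (attempt === attempts) throw err;
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * attempt));
    }
  }
}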

@igorescobar

Also... worth mentioning that even though most requests failed because the concurrency was too high, the job finished in 16 minutes 59 seconds.


igorescobar commented Jun 22, 2022

Something to explore:
https://superchargejs.com/docs/3.x/promise-pool#error-handling

It can give you access to a list of failed items which you can re-process later at a lower concurrency.

@igorescobar

import { PromisePool } from '@supercharge/promise-pool'

const errors = []

const { results } = await PromisePool
  .for(users)
  .handleError(async (error, user) => {
    // you must collect errors yourself
    if (error instanceof ValidationError) {
      return errors.push(error)
    }

    // handle specific errors differently, e.g. retry on throttling
    if (error instanceof ThrottleError) {
      return await retryUser(user)
    }
  })
  .process(async user => {
    // process each item here; any throw ends up in handleError above
  })

await handleCollected(errors) // this may throw

return { results }

arabold added the bug label Jun 24, 2022
@igorescobar

Just an update from my side: we looked into the other ways Sentry recommends for uploading source maps and ended up using https://github.com/getsentry/sentry-webpack-plugin, which also uploads the source maps to their service, and it worked nicely for us. 👍
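For anyone landing here later, the gist of that setup is adding the plugin to the webpack config, roughly like this (a sketch; option names as of the 1.x plugin and values are placeholders, so check the plugin README for your version):

// webpack.config.js – sketch; org/project/paths are placeholders for your own setup
const SentryWebpackPlugin = require("@sentry/webpack-plugin");

module.exports = {
  devtool: "source-map", // sourcemaps must be emitted for the plugin to upload them
  plugins: [
    new SentryWebpackPlugin({
      org: "your-org",
      project: "your-project",
      authToken: process.env.SENTRY_AUTH_TOKEN,
      include: ".webpack",   // directory with the built files and their .map files
      release: process.env.SENTRY_RELEASE,
    }),
  ],
};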
