chore: update core dependencies to 0.25 #635

Merged
merged 7 commits into main on Aug 27, 2021

Conversation

@dyladan (Member) commented Aug 23, 2021

What it says on the tin

@dyladan requested a review from a team as a code owner on August 23, 2021 at 14:45
@codecov (bot) commented Aug 23, 2021

Codecov Report

Merging #635 (8c0d32f) into main (c25bc38) will decrease coverage by 1.81%.
The diff coverage is 100.00%.

❗ Current head 8c0d32f differs from pull request most recent head a7ba541. Consider uploading reports for the commit a7ba541 to get more accurate results.

@@            Coverage Diff             @@
##             main     #635      +/-   ##
==========================================
- Coverage   96.68%   94.87%   -1.82%     
==========================================
  Files          13      208     +195     
  Lines         634    12128   +11494     
  Branches      124     1155    +1031     
==========================================
+ Hits          613    11506   +10893     
- Misses         21      622     +601     
| Impacted Files | Coverage Δ |
|---|---|
| ...njection/src/instrumentation/WebInstrumentation.ts | 97.22% <ø> (ø) |
| ...ages/opentelemetry-host-metrics/src/BaseMetrics.ts | 57.57% <ø> (ø) |
| ...ges/opentelemetry-host-metrics/test/metric.test.ts | 96.89% <ø> (ø) |
| ...metry-instrumentation-graphql/test/graphql.test.ts | 100.00% <ø> (ø) |
| ...entelemetry-instrumentation-graphql/test/helper.ts | 100.00% <ø> (ø) |
| ...e/opentelemetry-instrumentation-pg/test/pg.test.ts | 95.01% <ø> (ø) |
| ...pentelemetry-instrumentation-pg/test/utils.test.ts | 98.36% <ø> (ø) |
| ...tension-autoinjection/test/instrumentation.test.ts | 100.00% <100.00%> (ø) |
| ...ntelemetry-instrumentation-pg/test/pg-pool.test.ts | 91.42% <100.00%> (ø) |
| ...opentelemetry-resource-detector-aws/src/version.ts | 0.00% <0.00%> (ø) |

... and 194 more

@dyladan (Member, Author) commented Aug 23, 2021

Anyone know why this is failing? I'm not really sure.

@Flarna (Member) commented Aug 24, 2021

Best guess is too-high load because of parallel testing. I see no other reason for, e.g., the "before all" hook in "WinstonInstrumentation" not finishing within 2000 ms.
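A minimal Mocha sketch of the failure mode being described, and one way such a hook timeout could be raised; the suite name echoes the comment above, while the 10 s value and the setup body are illustrative assumptions rather than the project's actual configuration:

```ts
// Sketch only: `describe`/`before`/`it` are globals provided by the mocha
// runner (typed via @types/mocha), so a test file needs no imports for them.
describe('WinstonInstrumentation', () => {
  before(function () {
    // Mocha's default timeout is 2000 ms. Under heavy parallel load on a
    // 2-CPU CI runner, setup can simply take longer than that, which is
    // reported as the "before all" hook timing out.
    this.timeout(10000); // illustrative value, not the project's setting

    // ...expensive setup would go here (e.g. registering the instrumentation)
  });

  it('runs only after setup has finished', () => {
    // ...
  });
});
```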

@dyladan (Member, Author) commented Aug 24, 2021

> Best guess is too-high load because of parallel testing. I see no other reason for, e.g., the "before all" hook in "WinstonInstrumentation" not finishing within 2000 ms.

I agree. I can't reproduce it locally either.

@dyladan (Member, Author) commented Aug 24, 2021

Trying to limit concurrency to 4 (there are 2 CPUs on the runners) to see if that helps. We probably haven't seen this yet because we haven't had an update that touches so many packages.

@dyladan (Member, Author) commented Aug 24, 2021

Seems like the lambda failure is a legitimate one.

@dyladan (Member, Author) commented Aug 24, 2021

@willarmiros the AWS Lambda tests are consistently failing in CI, but they pass locally. Any idea what might be up?

@willarmiros (Contributor) commented:

Taking a look... it seems the problem is that the spans aren't getting exported. Were there any changes to the memory exporter or force flush?

@willarmiros (Contributor) commented:

I can see that no relevant changes were made, and that wouldn't really make sense anyway since the tests are passing locally. I can see that there are many other tests that use the in-memory exporter and then verify the content of the exported spans. One difference is that we're using the BatchSpanProcessor while all the others I've seen use the SimpleSpanProcessor, so maybe there's something to that.
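A minimal sketch of the processor difference being described, assuming 0.25-era tracing SDK exports (the package may be `@opentelemetry/tracing` rather than `@opentelemetry/sdk-trace-base` depending on the release); it is not the lambda test itself:

```ts
// Sketch only: import names assumed; in 0.25-era releases these classes may be
// exported from '@opentelemetry/tracing' rather than '@opentelemetry/sdk-trace-base'.
import {
  BasicTracerProvider,
  BatchSpanProcessor,
  InMemorySpanExporter,
  SimpleSpanProcessor,
} from '@opentelemetry/sdk-trace-base';

const exporter = new InMemorySpanExporter();
const provider = new BasicTracerProvider();

// SimpleSpanProcessor hands each span to the exporter synchronously on
// span.end(), so getFinishedSpans() is populated immediately.
// BatchSpanProcessor buffers spans until its scheduled delay (5000 ms by
// default) elapses or forceFlush() resolves, so a test that asserts right
// after span.end() can observe an empty exporter.
const useBatch = false; // flip to true to see the batch behavior
provider.addSpanProcessor(
  useBatch ? new BatchSpanProcessor(exporter) : new SimpleSpanProcessor(exporter)
);

async function main(): Promise<void> {
  provider.getTracer('example').startSpan('handler').end();

  // With the batch processor, the test must flush before asserting:
  await provider.forceFlush();

  console.log(exporter.getFinishedSpans().length); // expected: 1
}

main().catch(console.error);
```

The relevant point for the failing tests: with `BatchSpanProcessor`, assertions on `getFinishedSpans()` only hold once `forceFlush()` has completed.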

@dyladan (Member, Author) commented Aug 24, 2021

> Taking a look... it seems the problem is that the spans aren't getting exported. Were there any changes to the memory exporter or force flush?

Not that I'm aware of.

@dyladan (Member, Author) commented Aug 24, 2021

A quick perusal of the 0.25 release notes shows only this change that might have made a difference:

open-telemetry/opentelemetry-js#2396

@willarmiros (Contributor) commented:

Yeah, I saw that. I don't think that's the culprit, because it should have equally impacted the tests that use SimpleSpanProcessor... it could be something weird with the GitHub runner environment; I'm just really not sure why we'd only see it with these tests on this particular PR.

@willarmiros (Contributor) commented:

@anuraaga do you have any ideas?

@anuraaga (Contributor) commented:

@willarmiros Since we don't have a package-lock.json, is it possible the dependencies aren't at the same version locally as they are in CI? That's what comes to mind as a possible reason for not being able to reproduce.

We have this code, which relies on some weird behavior of the patching infrastructure, but I wonder if an update changed it:

https://github.com/open-telemetry/opentelemetry-js-contrib/blob/main/plugins/node/opentelemetry-instrumentation-aws-lambda/src/instrumentation.ts#L109
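For readers unfamiliar with that patching infrastructure, a generic, heavily simplified illustration of how an instrumentation (using the 0.25-era `@opentelemetry/instrumentation` API shape) declares a module file to patch and wraps one of its exports; `some-module`, `handler.js`, and `handler` are made-up placeholders, and this is not the aws-lambda instrumentation's actual code:

```ts
// Generic illustration only -- NOT the aws-lambda instrumentation's real code.
import {
  InstrumentationBase,
  InstrumentationNodeModuleDefinition,
  InstrumentationNodeModuleFile,
} from '@opentelemetry/instrumentation';

export class ExampleInstrumentation extends InstrumentationBase {
  constructor() {
    super('example-instrumentation', '0.0.1');
  }

  protected init() {
    // The instrumentation declares which module (and which file inside it) it
    // wants to patch; the onRequire hook runs when that file is loaded.
    return new InstrumentationNodeModuleDefinition<any>(
      'some-module', // placeholder module name
      ['*'],
      undefined,
      undefined,
      [
        new InstrumentationNodeModuleFile<any>(
          'some-module/handler.js', // placeholder file, resolved at require time
          ['*'],
          (moduleExports) => {
            // _wrap comes from the base class (shimmer under the hood) and
            // replaces the named export with a wrapper function.
            this._wrap(moduleExports, 'handler', (original: Function) => {
              return function patchedHandler(this: unknown, ...args: unknown[]) {
                // ...span creation and context propagation would go here
                return original.apply(this, args);
              };
            });
            return moduleExports;
          },
          (moduleExports) => {
            if (moduleExports) {
              this._unwrap(moduleExports, 'handler');
            }
            return moduleExports;
          }
        ),
      ]
    );
  }
}
```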

@dyladan (Member, Author) commented Aug 25, 2021

@anuraaga @willarmiros I wonder if it has to do with hoisting? Dependencies are hoisted in CI to make the CI run faster.
