
flaky test-net-throttle on daily master #33135

Closed
sam-github opened this issue Apr 28, 2020 · 6 comments · Fixed by #33329
Labels
flaky-test Issues and PRs related to the tests with unstable failures on the CI.

Comments

@sam-github
Contributor

sam-github commented Apr 28, 2020

Failed last three nightlies:

Error Message
fail (1)
Stacktrace
build big string
server started on port 38697
assert.js:103
  throw new AssertionError(obj);
  ^

AssertionError [ERR_ASSERTION]: Expected values to be strictly equal:

true !== false

    at Server.<anonymous> (/home/iojs/build/workspace/node-test-commit-custom-suites-freestyle/test/pummel/test-net-throttle.js:38:10)
    at Server.emit (events.js:315:20)
    at TCP.onconnection (net.js:1537:8) {
  generatedMessage: true,
  code: 'ERR_ASSERTION',
  actual: true,
  expected: false,
  operator: 'strictEqual'
}
sam-github added the flaky-test label on Apr 28, 2020
@sam-github
Contributor Author

sam-github commented Apr 28, 2020

We were almost green last night... albeit by diligent skipping of tests.

@Trott
Member

Trott commented May 9, 2020

First failed daily run was for f8d5474 and last successful daily run was for 24a4e61. That's a span of only 6 commits. If this reproduces locally, a bisect should find the problematic commit pretty quickly. And if it doesn't reproduce locally, using CI to find the problematic commit should be feasible.

@Trott
Member

Trott commented May 9, 2020

Starting the bisect on CI, since that will probably finish before my local compilation does.

https://ci.nodejs.org/job/node-test-commit-custom-suites-freestyle/14177/ will be f8d5474 and should fail.

https://ci.nodejs.org/job/node-test-commit-custom-suites-freestyle/14178/ will be 24a4e61 and should pass.

https://ci.nodejs.org/job/node-test-commit-custom-suites-freestyle/14179/ will be 658cae0, which is between the two.

@Trott
Member

Trott commented May 9, 2020

The test passes locally for me, and all three CI jobs failed, so that suggests it may have been a change on the CI host and not a change in the code...

@Trott
Member

Trott commented May 9, 2020

It looks to me like the test assumes how much data it will take before the kernel buffer fills and data gets queued in memory, and that assumption may no longer hold for the CI host. Is that your assessment too, @sam-github? Or should all systems queue data with this test?

@nodejs/build Any chance anyone beefed up test-rackspace-ubuntu1604-x64-1 around two weeks ago?
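
For illustration, a hypothetical reduction of that assumption (not the actual test source; the chunk size and socket handling here are made up) looks like this:

const assert = require('assert');
const net = require('net');

// Chunk size is illustrative only; the real test builds a much larger string.
const chunk = 'a'.repeat(256 * 1024);

const server = net.createServer((connection) => {
  // write() returns false when the data could not all be flushed to the
  // kernel and had to be queued in the socket's memory.
  const queued = connection.write(chunk) === false;
  // Asserting that queuing has already happened after a fixed amount of
  // data breaks on a host whose kernel socket buffers are large enough to
  // absorb the whole write: write() returns true and the assertion fails.
  assert.strictEqual(queued, true);
  connection.end();
  server.close();
});

server.listen(0, () => {
  net.connect(server.address().port).resume();
});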

Trott mentioned this issue on May 9, 2020
@Trott
Member

Trott commented May 9, 2020

#33329 seems to fix it.

Trott closed this as completed in 5ded044 on May 11, 2020
codebytere pushed a commit that referenced this issue May 15, 2020
Repeat writes until data is queued in memory, rather than assuming that
it will happen by a certain point.

Fixes: #33135

PR-URL: #33329
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Juan José Arboleda <soyjuanarbol@gmail.com>
codebytere pushed a commit that referenced this issue Jun 7, 2020
Repeat writes until data is queued in memory, rather than assuming that
it will happen by a certain point.

Fixes: #33135

PR-URL: #33329
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Juan José Arboleda <soyjuanarbol@gmail.com>
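
For reference, a minimal sketch of the approach described in the commit message (repeat writes until data is queued in memory, rather than assuming it happens after a fixed amount) might look like the following; this is not the actual patch, and the chunk size is illustrative:

const net = require('net');

const chunk = 'a'.repeat(64 * 1024);

const server = net.createServer((connection) => {
  // Keep writing until write() returns false, meaning the kernel buffer is
  // full and further data is being queued in the process's memory. This
  // removes the assumption that queuing starts after a fixed number of
  // writes, which varies with the host's socket buffer sizes.
  while (connection.write(chunk) !== false) {
    // keep writing
  }
  connection.end();
  server.close();
});

server.listen(0, () => {
  net.connect(server.address().port).resume();
});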