
test: harden tmpdir/rmdirSync error handling #30517

Closed · wants to merge 2 commits

Conversation

lundibundi
Member

@lundibundi lundibundi commented Nov 17, 2019

This makes rmdirSync call itself after handling ENOTEMPTY, EEXIST, and EPERM
errors, instead of calling fs.rmdirSync directly, to handle the possibility
of such errors repeating on the next try (e.g. the FS finishes writing a new
file just as we are done with our loop that deletes all files).

Refs: #29852

This should fix the flaky failures of test/parallel/test-child-process-fork-exec-path.
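
For anyone skimming, a minimal sketch of the idea (hypothetical and simplified; the real helper in test/common/tmpdir.js goes through rimrafSync and distinguishes more error codes):

```js
'use strict';
// Simplified sketch only; the actual helper in test/common/tmpdir.js
// delegates to rimrafSync and handles more cases.
const fs = require('fs');
const path = require('path');

function rmdirSync(p) {
  try {
    fs.rmdirSync(p);
  } catch (e) {
    if (e.code !== 'ENOTEMPTY' && e.code !== 'EEXIST' && e.code !== 'EPERM')
      throw e;
    // Remove whatever is currently inside the directory.
    for (const f of fs.readdirSync(p)) {
      const entry = path.join(p, f);
      if (fs.lstatSync(entry).isDirectory())
        rmdirSync(entry);
      else
        fs.unlinkSync(entry);
    }
    // Retry through rmdirSync itself rather than fs.rmdirSync, so a file
    // created while we were cleaning up triggers another cleanup pass
    // instead of an ENOTEMPTY escaping the helper.
    rmdirSync(p);
  }
}
```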

Checklist
  • make -j4 test (UNIX), or vcbuild test (Windows) passes
  • commit message follows commit guidelines

/cc @nodejs/testing

@nodejs-github-bot nodejs-github-bot added the test Issues and PRs related to the tests. label Nov 17, 2019
@lundibundi lundibundi added flaky-test Issues and PRs related to the tests with unstable failures on the CI. fs Issues and PRs related to the fs subsystem / file system. labels Nov 17, 2019
@nodejs-github-bot
Collaborator

nodejs-github-bot commented Nov 17, 2019

CI: https://ci.nodejs.org/job/node-test-pull-request/26643/ (other flakes failed)
Stress test-child-process-fork-exec-path: https://ci.nodejs.org/job/node-stress-single-test/16/

```diff
@@ -83,7 +83,7 @@ function rmdirSync(p, originalEr) {
          rimrafSync(path.join(p, f));
        }
      });
-     fs.rmdirSync(p);
+     rmdirSync(p, null);
```
Member

Does there need to be some sort of retry limit to avoid infinite loops?

Member Author

I thought about it, but it should only happen if someone is continuously creating files (or running a create-check-retry loop) in the directory we are trying to remove.
Though, I can add a retry counter as a third argument just to be sure. WDYT?
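
Something along these lines, for the sake of discussion (hypothetical sketch; the parameter name and the limit are placeholders, and rimrafSync is the existing helper from the same file, shown in the diff above):

```js
// Hypothetical sketch of a bounded retry, not the change that was pushed;
// `retries` and the limit of 100 are placeholders. `rimrafSync` is the
// existing helper in test/common/tmpdir.js (see the diff above).
function rmdirSync(p, originalEr, retries = 0) {
  try {
    fs.rmdirSync(p);
  } catch (e) {
    const retriable =
      e.code === 'ENOTEMPTY' || e.code === 'EEXIST' || e.code === 'EPERM';
    if (!retriable || retries >= 100)
      throw e; // give up once the limit is hit instead of looping forever
    fs.readdirSync(p).forEach((f) => rimrafSync(path.join(p, f)));
    rmdirSync(p, originalEr, retries + 1);
  }
}
```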

Member

Well there's got to be a reason that rimraf by default caps the number of retries, so I think some sort of limit would be good.

Long term it would be good if #30074 (which removes the custom rimraf implementation) could be fixed up.

Member Author

Updated.

@lundibundi
Member Author

lundibundi commented Nov 17, 2019

New stress test https://ci.nodejs.org/job/node-stress-single-test/17/

Failed: it looks like 100 retries is not enough under -j16, unfortunately. Before, when the number of retries was unlimited, it failed due to an unhandled EBUSY error.

Though, I still think this should decrease the flakiness of this test (and possibly of other tests that rely on tmpdir on Windows).

@Fishrock123
Member

FWIW, there is a PR that attempts to remove this test-only rimraf entirely: #30074

@lundibundi
Member Author

@Fishrock123 yeah, but that one looks like it has some issues that might not be fixed soon. And I see this test fail the Windows CI quite often, so I guess we might at least reinforce it in the meantime.


@Trott
Member

Trott commented Dec 7, 2019

> @Fishrock123 yeah, but that one looks like it has some issues that might not be fixed soon. And I see this test fail the Windows CI quite often, so I guess we might at least reinforce it in the meantime.

An alternative was opened by @cjihrig that uses some other work they landed recently, as well as #30785 (which will hopefully land soon), and might obsolete both #30074 and this PR.
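
For context (my reading, not stated explicitly in the thread): the recently landed work presumably refers to recursive removal support in fs itself, along the lines of:

```js
// Assumed context, not quoted from the thread: fs.rmdirSync has accepted a
// `recursive` option since Node.js 12.10.0, which is what would eventually
// let the test-only rimraf in test/common/tmpdir.js go away.
const fs = require('fs');
fs.rmdirSync('/tmp/some-test-tmpdir', { recursive: true }); // placeholder path
```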

@lundibundi
Member Author

Closing for now; I'd very much like to see a proper solution from @cjihrig's PRs land.

@lundibundi lundibundi closed this Dec 8, 2019