
feat(optimizer): holdUntilCrawlEnd option #15244

Merged: 6 commits into main, Jan 23, 2024

Conversation

patak-dev (Member)

Description

Since Vite 3.0, our cold start strategy has changed as described in #8869.

One issue with this strategy is that, for the default non-HTTPS server, the browser can only make six concurrent requests. Each optimized dep found takes one of these slots, so requests can stall until the server finishes crawling static imports. For large projects, this means the browser may sit idle when it could be downloading and parsing modules. We chose this strategy to avoid full-page reloads when the scanner misses a dependency, but we pay that cost even in projects where all deps are discovered by the scanner or manually included.

This PR adds a new option, optimizeDeps.holdUntilCrawlEnd. If true (the default), it keeps the current strategy. If false, it releases the optimized deps computed after the scanner to the browser as soon as they are available (see the config sketch below). This avoids the browser being idle while the server crawls static imports, with the trade-off that a full-page reload may occur if the scanner misses dependencies.
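For reference, opting out of the hold looks like this in a user's Vite config (the option is the one added by this PR; the rest of the config is illustrative):

```ts
// vite.config.ts
import { defineConfig } from 'vite'

export default defineConfig({
  optimizeDeps: {
    // Release optimized deps to the browser as soon as they are ready,
    // instead of holding them until static-import crawling ends.
    // Trade-off: a dep missed by the scanner may trigger a full-page reload.
    holdUntilCrawlEnd: false,
  },
})
```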

There are some differences from the similar strategy we had in Vite 2:

  • We have an optimization that avoids the need for a reload if the new dependencies don't generate new common chunks (since v3)
  • The scanner and deps optimizer run in parallel with the processing of user modules (since v3)
  • We have new warnings that guide the user to add the missed deps to optimizeDeps.include and avoid full-page reloads

We discussed these strategies with @bluwy and @ArnaudBarre. This PR could be a first step toward changing the default cold start strategy to holdUntilCrawlEnd: false in a future minor, depending on the feedback we get from users.


What is the purpose of this pull request?

  • New Feature


patak-dev added the feat: deps optimizer (Esbuild Dependencies Optimization) and performance (Performance related enhancement) labels on Dec 4, 2023
@@ -160,7 +182,7 @@ async function createDepsOptimizer(
   // On warm start or after the first optimization is run, we use a simpler
   // debounce strategy each time a new dep is discovered.
   let crawlEndFinder: CrawlEndFinder | undefined
-  if (isBuild || !cachedMetadata) {
+  if ((!noDiscovery && isBuild) || !cachedMetadata) {
patak-dev (Member, Author):

When noDiscovery: true, we don't need the crawlEndFinder. This simplifies the logic in onCrawlEnd.
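For context, optimizeDeps.noDiscovery is the existing option that disables automatic dependency discovery; users pair it with an explicit include list. A minimal sketch (the package names are placeholders, not from this PR):

```ts
// vite.config.ts — illustrative only
import { defineConfig } from 'vite'

export default defineConfig({
  optimizeDeps: {
    noDiscovery: true, // skip scanning and crawling for deps entirely
    include: ['react', 'react-dom'], // hypothetical explicit dep list
  },
})
```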

Comment on lines +253 to +258
const manuallyIncluded = Object.keys(manuallyIncludedDepsInfo)
// Record deps discovered while the scanner was still running that were
// neither found by the scanner nor manually included, so we can suggest
// adding them to optimizeDeps.include later.
discoveredDepsWhileScanning.push(
  ...Object.keys(metadata.discovered).filter(
    (dep) => !deps[dep] && !manuallyIncluded.includes(dep),
  ),
)
patak-dev (Member, Author):

If we issue the warning to add dependencies to optimizeDeps.include, I think we should also ask the user to add the deps discovered while the scanner was running. The scanner may finish before some of these are found during the crawling of static imports in future cold starts.

bluwy (Member):

IIUC you mean we also log the dependencies found by the scanner? (Not the static imports crawler.) I'm not sure it would be ideal to request adding them to optimizeDeps.include manually. I think it would be enough to request the ones from the crawler only.

In general, I think missed dependencies could mean a bug in our code (e.g. new URL('./somewhere', import.meta.url), which I never got around to fixing 😅) or an optimizeDeps.entries misconfiguration. So we can be a bit conservative about requesting additions to optimizeDeps.include and try to fix the root cause instead.

patak-dev (Member, Author):

No, if something is found by the scanner, it is fine. I mean logging the dependencies that were not found by the scanner or manually included. Some of them may have been found before the scanner finished, and right now we also use them in the first optimize run. discoveredDepsWhileScanning should maybe be named missedDepsWhileScanning.

bluwy (Member):

Ah ok, thanks for the explanation. I don't have a preference for the names; go with whichever you think fits better here.

Comment on lines +501 to +510
if (warnAboutMissedDependencies) {
  logDiscoveredDepsWhileScanning()
  config.logger.info(
    colors.magenta(
      `❗ add these dependencies to optimizeDeps.include to avoid a full page reload during cold start`,
    ),
    { timestamp: true },
  )
  // Only warn once
  warnAboutMissedDependencies = false
}
patak-dev (Member, Author):

We could ask the user to add only the dependencies that triggered the full-page reload (the ones that have custom chunks). But adding all the dependencies is better: it avoids a new optimize run and lets the browser download these dependencies as soon as the scan+optimize step is done. So it's better to ask for all of them to be included, as in the config sketch below.
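What acting on the warning looks like in a user's config (a sketch; the package names are placeholders for whatever the log lists):

```ts
// vite.config.ts — illustrative; 'missed-dep-a' and 'missed-dep-b' stand in
// for the dependencies reported by the cold start warning.
import { defineConfig } from 'vite'

export default defineConfig({
  optimizeDeps: {
    include: ['missed-dep-a', 'missed-dep-b'],
  },
})
```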

Comment on lines 685 to 699
if (isBuild) {
  currentlyProcessing = false
  const crawlDeps = Object.keys(metadata.discovered)
  if (crawlDeps.length === 0) {
    debug?.(
      colors.green(
        `✨ no dependencies found while processing user modules`,
      ),
    )
    firstRunCalled = true
  } else {
    // No debounce is needed at build time; optimize right away
    runOptimizer()
  }
  return
}
patak-dev (Member, Author):

Moved the build-time logic here to simplify the dev logic. The comment could also be improved, and during build we don't need the debounce. We want to remove this anyway, but the change helped with this PR.

bluwy (Member) left a comment:

I tried to understand the optimizer flow, but I don't quite grasp it 😅 Other than that, I tested locally and it seems to work well.

patak-dev (Member, Author):

> I tried to understand the optimizer flow, but I don't quite grasp it 😅 Other than that, I tested locally and it seems to work well.

In a future minor (so we get some time to test it in a beta), we should start from scratch and move all this code to a state machine. It ended up being quite hard to follow after all the accumulated changes.
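A rough sketch of what such a state machine might look like (purely illustrative; the states, events, and names below are assumptions, not part of this PR or any planned refactor):

```ts
// Hypothetical optimizer phases, modeled as a discriminated union.
type OptimizerState =
  | { phase: 'scanning' } // esbuild scanner running
  | { phase: 'crawling' } // server crawling static imports
  | { phase: 'optimized' } // first optimize run committed
  | { phase: 'reoptimizing' } // a missed dep forced a new run

type OptimizerEvent = 'scanDone' | 'crawlEnd' | 'missedDep' | 'optimizeDone'

// Every transition lives in one pure function, so the flow is auditable
// at a glance instead of being spread across callbacks and flags.
function next(state: OptimizerState, event: OptimizerEvent): OptimizerState {
  switch (state.phase) {
    case 'scanning':
      return event === 'scanDone' ? { phase: 'crawling' } : state
    case 'crawling':
      return event === 'crawlEnd' ? { phase: 'optimized' } : state
    case 'optimized':
      return event === 'missedDep' ? { phase: 'reoptimizing' } : state
    case 'reoptimizing':
      return event === 'optimizeDone' ? { phase: 'optimized' } : state
  }
}
```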

patak-dev added this to the 5.1 milestone on Dec 6, 2023
ArnaudBarre (Member):

(I have the review of this one on my todo list; I think I will get to it this weekend.)

patak-dev added the p3-significant (High priority enhancement) label on Dec 13, 2023
patak-dev merged commit b7c6629 into main on Jan 23, 2024 (15 checks passed)
patak-dev deleted the perf/avoid-holding-optimized-deps-option branch on Jan 23, 2024 at 15:53
Labels: feat: deps optimizer (Esbuild Dependencies Optimization), p3-significant (High priority enhancement), performance (Performance related enhancement)

3 participants