Use LMDB cache instead of regular FS cache to speed up cache hits #932

Open
wants to merge 1 commit into main from sk/use-lmdb-cache
Conversation

@pastelsky commented Mar 6, 2022

Please check if the PR fulfills these requirements

  • Tests for the changes have been added (for bug fixes / features)
  • Docs have been added / updated (for bug fixes / features)

What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)
Improved performance of babel-loader when files are cached: LMDB is much faster to access than the regular file-based cache babel-loader currently uses.

In the large codebase I work with, warm boot times on webpack 4 improved from 79 seconds to 74 seconds.

What is the current behavior? (You can also link to an open issue here)
babel-loader caches transformed contents by writing one file per cache entry to disk.
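
For context, here is a minimal sketch of a file-per-entry cache in that style (the directory layout and helper names are illustrative, not babel-loader's exact code): each lookup hashes the source and options, then stats, reads, and gunzips a separate file.

```js
const fs = require("fs");
const os = require("os");
const path = require("path");
const zlib = require("zlib");
const crypto = require("crypto");

// Hypothetical cache directory; babel-loader's real default differs.
const cacheDir = path.join(os.tmpdir(), "babel-loader-cache");

function cacheKey(source, optionsJson) {
  return crypto.createHash("md5").update(source).update(optionsJson).digest("hex");
}

function readEntry(key) {
  const file = path.join(cacheDir, `${key}.json.gz`);
  if (!fs.existsSync(file)) return null; // every miss still costs a stat
  return JSON.parse(zlib.gunzipSync(fs.readFileSync(file)).toString());
}

function writeEntry(key, result) {
  fs.mkdirSync(cacheDir, { recursive: true });
  const file = path.join(cacheDir, `${key}.json.gz`);
  fs.writeFileSync(file, zlib.gzipSync(JSON.stringify(result)));
}
```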

What is the new behavior?
Replaces the use of individual cache files with LMDB, which improves babel-loader build times on cache hits in large projects.
LMDB is already used for fast memory-mapped cache access by projects such as Parcel, Elasticsearch's Kibana, and Gatsby, and as the storage layer for HarperDB.

This has the side effect of making babel-loader's implementation a bit simpler as well.
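
As a rough sketch of the replacement (assuming the `lmdb` npm package, i.e. lmdb-js; the store path and function names below are illustrative), the same read/write pair becomes a get/put against a single memory-mapped store, so cache hits skip the per-file open, read, and gunzip work:

```js
const { open } = require("lmdb");

// Assumed location for the store; one database replaces many small files.
const db = open({
  path: "node_modules/.cache/babel-loader-lmdb",
  compression: true,
});

function readEntry(key) {
  // Synchronous, memory-mapped read; undefined means a cache miss.
  return db.get(key) ?? null;
}

async function writeEntry(key, result) {
  // lmdb-js batches puts into write transactions internally.
  await db.put(key, result);
}
```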

Does this PR introduce a breaking change?

@pastelsky pastelsky force-pushed the sk/use-lmdb-cache branch 16 times, most recently from 4d3d769 to 1ef92e1 Compare March 7, 2022 05:07
@pastelsky (Author)

Prerequisite for this PR: #933
