
pnpm audit goes into infinite loop #6203

Closed
kamsar opened this issue Mar 10, 2023 · 7 comments · Fixed by #6245

Comments

@kamsar
Member

kamsar commented Mar 10, 2023

pnpm version: >= 7.25.1

Code to reproduce the issue:

Reproduced on a large private repo; I was unable to create a public reproduction. I have a hypothesis about what's going on, though. Feel free to close if nothing comes to mind as a cause :)

With 7.25.0 and earlier, pnpm audit worked fine on my private repo.

With 7.25.1-7.29.1 the CLI spins for a few seconds and then throws:

pnpm: Maximum call stack size exceeded
    at /usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120713:21
    at Array.map (<anonymous>)
    at calculateCellWidthIndex (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120712:20)
    at /usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120785:71
    at Array.forEach (<anonymous>)
    at calculateMaximumColumnWidthIndex (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120784:12)
    at makeColumns (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120820:85)
    at makeConfig (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120840:24)
    at table (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:120994:46)
    at Object.handler [as audit] (/usr/local/lib/node_modules/pnpm/dist/pnpm.cjs:121275:37)

#3073, which was added in 7.25.1 and alters the behavior of audit, seems like a probable cause. I wonder if it's somehow creating an infinitely nested object that the table is trying to crawl?

Expected behavior:

audit runs without issue

Additional information:

  • node -v prints: 18.12.1
  • Windows, macOS, or Linux?: Mac Ventura 13.2.1 M1
@await-ovo
Member

I'm sorry to hear this; it may have been brought on by my changes, but I haven't reproduced the problem yet.

Can you help confirm that pnpm -r list is working? Also, is it possible to provide a minimal repo that reproduces this issue? Thanks a lot.

@kamsar
Member Author

kamsar commented Mar 10, 2023

pnpm -r list is working, yes. It's odd: #5917 seems like a probable cause based on the changelog, but the stack trace doesn't point to anything changed there - unless it somehow created an object to render that has a reference loop in it.

I'll try to spend a bit of time in the compiled code to see if I can track it down or at least see what it's recursing over.

@kamsar
Member Author

kamsar commented Mar 10, 2023

It seems like it's dying because of some absolutely gigantic output coming into the table, specifically on the Paths column, which I think is related to the PR above.

I modified pnpm.cjs like so, adding a console log to capture the cell data as its width is computed:

var require_calculateCellWidthIndex = __commonJS({
  "../node_modules/.pnpm/@zkochan+table@1.0.0/node_modules/@zkochan/table/dist/calculateCellWidthIndex.js"(exports2) {
    "use strict";
    Object.defineProperty(exports2, "__esModule", {
      value: true
    });
    exports2.default = void 0;
    var _stringWidth = _interopRequireDefault(require_string_width());
    function _interopRequireDefault(obj) {
      return obj && obj.__esModule ? obj : { default: obj };
    }
    var calculateCellWidthIndex = (cells) => {
      return cells.map((value) => {
        console.log("\n\n\n\n\ncellmap", JSON.stringify(value))
        return Math.max(...value.split("\n").map((line) => {
          return (0, _stringWidth.default)(line);
        }));
      });
    };
    var _default = calculateCellWidthIndex;
    exports2.default = _default;
  }
});

The recursive stack trace fails here:

return Math.max(...value.split("\n").map((line) => {

The cause seems to be that the table is getting a whopping 54 MB of path data for that cell. It does not appear to be infinitely deep, since JSON.stringify works in my console log, but the computation still runs out of call stack (strange, because it doesn't look recursive unless spread is using internal recursion). The 54 MB figure comes from writing the CLI output to a file and analysing it :)
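(For reference: a minimal, standalone sketch of that mechanism, not pnpm code. Spreading a very large array into Math.max passes every element as a separate call argument, and V8 rejects oversized argument lists with the same "Maximum call stack size exceeded" error, so no real recursion is needed to hit it. The array size and contents below are arbitrary.)

// Standalone Node.js sketch of the failure mode; the array size is arbitrary.
const lines = new Array(1_000_000).fill('. > some-pkg@1.0.0 > dep@2.0.0');

try {
  // Mirrors the failing call: every line becomes one argument to Math.max.
  Math.max(...lines.map((line) => line.length));
} catch (err) {
  console.error(err.message); // "Maximum call stack size exceeded"
}

// A reduce sidesteps the giant argument list entirely.
const max = lines.reduce((acc, line) => Math.max(acc, line.length), 0);
console.log(max); // 30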

This does not occur if I remove the call to extendWithDependencyPaths introduced in #5917. I'll do a bit of spelunking in there next, but it does appear to be the cause.

@kamsar
Copy link
Member Author

kamsar commented Mar 10, 2023

Did some more debugging and I don't think there's anything fundamentally wrong with the code... I just have a large and interconnected monorepo with 110 packages, and it takes a long time to find all paths for a deeply interconnected package.

It might be nice for large monorepos to have a flag to turn this feature off, and/or to limit the number of paths resolved for a given dependency. In my case audit finishes in a second or two without path resolution... and crashes with a stack overflow after about 90 seconds with it on.

Here's the relevant code from the compiled pnpm.cjs file, with some // NOTE comments from my testing:

async function extendWithDependencyPaths(auditReport, opts) {
      const { advisories } = auditReport;
      if (!Object.keys(advisories).length)
        return auditReport;
      const projectDirs = Object.keys(opts.lockfile.importers).map((importerId) => path_1.default.join(opts.lockfileDir, importerId));
      const searchOpts = {
        lockfileDir: opts.lockfileDir,
        depth: Infinity,
        include: opts.include
      };
      const _searchPackagePaths = searchPackagePaths.bind(null, searchOpts, projectDirs);
      for (const { findings, module_name: moduleName } of Object.values(advisories)) {
        for (const finding of findings) {
          // NOTE: in my case there are 20 total findings
          finding.paths = await _searchPackagePaths(`${moduleName}@${finding.version}`);
        }
      }
      return auditReport;
    }
    async function searchPackagePaths(searchOpts, projectDirs, pkg) {
      const pkgs = await (0, list_1.searchForPackages)([pkg], projectDirs, searchOpts);
      // NOTE: in my case there are 110 elements in pkgs every time, which seems weird since it's got [pkg] on there as a filter? The searchForPackages() being called for every issue seems to take up most of the execution time.
      const paths = [];

      // NOTE: the walk here seems to then create a gigantic list (the 54MB) of every package with an indirect reference; in my case, this can be very large. Adding some sort of max resolved paths here might help?
      for (const pkg2 of pkgs) {
        _walker([
          ...pkg2.optionalDependencies ?? [],
          ...pkg2.dependencies ?? [],
          ...pkg2.devDependencies ?? [],
          ...pkg2.unsavedDependencies ?? []
        ], path_1.default.relative(searchOpts.lockfileDir, pkg2.path) || ".");
      }
      return paths;
      function _walker(packages, depPath) {
        for (const pkg2 of packages) {
          const nextDepPath = `${depPath} > ${pkg2.name}@${pkg2.version}`;
          if (pkg2.dependencies?.length) {
            _walker(pkg2.dependencies, nextDepPath);
          } else {
            paths.push(nextDepPath);
          }
        }
      }
    }
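(For reference: a minimal sketch of the cap suggested in the NOTE above, assuming a hypothetical MAX_PATHS constant and a collectPackagePaths helper; this is not pnpm's actual implementation, and it only bounds the collected output, not the time spent in searchForPackages.)

// Hypothetical sketch only: stop walking once a fixed number of paths has
// been collected for a finding. MAX_PATHS and the return shape are made up.
const path = require('path');

const MAX_PATHS = 50;

function collectPackagePaths(pkgs, lockfileDir) {
  const paths = [];
  let truncated = false;

  function walker(packages, depPath) {
    for (const pkg of packages) {
      if (paths.length >= MAX_PATHS) {
        truncated = true;
        return; // bail out instead of enumerating every indirect path
      }
      const nextDepPath = `${depPath} > ${pkg.name}@${pkg.version}`;
      if (pkg.dependencies?.length) {
        walker(pkg.dependencies, nextDepPath);
      } else {
        paths.push(nextDepPath);
      }
    }
  }

  for (const pkg of pkgs) {
    walker([
      ...(pkg.optionalDependencies ?? []),
      ...(pkg.dependencies ?? []),
      ...(pkg.devDependencies ?? []),
      ...(pkg.unsavedDependencies ?? []),
    ], path.relative(lockfileDir, pkg.path) || '.');
  }

  return { paths, truncated };
}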

@await-ovo
Member

await-ovo commented Mar 12, 2023

When the number of paths is too large (e.g., more than 50), would it be possible to display it as follows:

. > karma@2.0.5 > chokidar@2.1.8 > fsevents@1.2.9 > node-pre-gyp@0.12.0 > mkdirp@0.5.1 > minimist@0.0.8
. > karma@2.0.5 > chokidar@2.1.8 > fsevents@1.2.9 > node-pre-gyp@0.12.0 > tar@4.4.15 > mkdirp@0.5.1 > minimist@0.0.8
......
Excessive paths found, you can use `pnpm why minimist@0.0.8` for more information.


What do you think of this, cc @zkochan ~
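(For reference: a rough sketch of that kind of truncation, with a hypothetical formatPaths helper and an arbitrary default limit; the actual fix landed in #6245 and may look different.)

// Hypothetical helper: show at most `limit` paths and point at `pnpm why`
// for the rest. The threshold and wording here are illustrative only.
function formatPaths(paths, pkgSpec, limit = 50) {
  if (paths.length <= limit) return paths.join('\n');
  return [
    ...paths.slice(0, limit),
    '......',
    `Excessive paths found, you can use \`pnpm why ${pkgSpec}\` for more information.`,
  ].join('\n');
}

// Example with a tiny limit to force truncation:
const paths = [
  '. > karma@2.0.5 > chokidar@2.1.8 > fsevents@1.2.9 > node-pre-gyp@0.12.0 > mkdirp@0.5.1 > minimist@0.0.8',
  '. > karma@2.0.5 > chokidar@2.1.8 > fsevents@1.2.9 > node-pre-gyp@0.12.0 > tar@4.4.15 > mkdirp@0.5.1 > minimist@0.0.8',
];
console.log(formatPaths(paths, 'minimist@0.0.8', 1));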

@zkochan
Member

zkochan commented Mar 17, 2023

I would say more than 3. It is really not useful to see so many paths. I guess with npm it isn't an issue because everything is hoisted, so there is mostly only one path.
