
[aws-lambda-nodejs] Recent changes in the module broke our setup (monorepo, private repos, typescript, lerna) #9032

Closed
gergan opened this issue Jul 13, 2020 · 23 comments · Fixed by #9067
Labels
@aws-cdk/aws-lambda-nodejs, bug, in-progress, needs-triage, p1

Comments

gergan commented Jul 13, 2020

Reproduction Steps

It could be a misconfiguration on our end, but recent changes, most probably the ones in #8837, broke our setup and I have not been able to find a working configuration so far.

We have a monorepo with lerna, yarn workspaces, and only TypeScript, with the following structure:

bin -> some scripts
infra-cdk/
  bin/
  lib/
    lambdas/
      function.ts
      function2.ts
    stack....ts
    stack2..ts
packages/
  package1/
    src/
      sth.ts
      other.ts    
  package2/
node_modules/
lerna.json
package.json
yarn.lock

So we have only one node_modules and one yarn.lock. Before the changes, the bundling code used to find the node_modules in the root of the project; our code has always set projectRoot to the root of the project, and that way everything worked.

We didn't have a package.json in the lambdas directory, which somehow does not work anymore, so I've created one.
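For illustration, a minimal manifest for that workspace could look roughly like the following (the package name and versions are placeholders, not our actual file; the dependencies are simply the ones our lambdas use):

{
  "name": "lambdas",
  "version": "1.0.0",
  "private": true,
  "dependencies": {
    "@infra/aws-utils": "*",
    "aws-sdk": "*",
    "aws-xray-sdk-core": "*",
    "axios": "*",
    "node-fetch": "*"
  }
}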
Now I have a code state with changes that partially works if I run:

$ yarn install 
$ yarn build:fresh (which cleans and builds the packages)
$ yarn workspace infra-cdk cdk diff DevelopmentStack

This results in some of the lambdas being built, but the second one (which uses our private package) fails with the message shown in the Error Log section. Parcel tries to find our package, but it is not there. I should mention that there are some other problems that correlate with this issue: after this error, node_modules is missing some modules (for example lerna) and I need to run yarn install again in order to work with the repo. So my assumption is that the code now erroneously uses the lambda package.json but installs in the root directory, which leads to this problem.

Error Log

Bundling...
✨ Built in 25.48s

../../../../asset-output/index.js    2.4 KB    837ms
yarn install v1.22.4
[1/4] Resolving packages...
info If you think this is a bug, please open a bug report with the information provided in "/asset-output/yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.


stderr: warning package.json: No license field
warning Skipping preferred cache folder "/.cache/yarn" because it is not writable.
warning Selected the next writable cache folder in the list, will be "/tmp/.yarn-cache-501".
warning No license field
error An unexpected error occurred: "https://registry.yarnpkg.com/@infra%2faws-utils: Not found".

Environment

  • CLI Version: 1.51.0
  • Framework Version: 1.51.0
  • Node.js Version: v12.16.3
  • OS: macOS Catalina 10.15.5
  • Language (Version): TypeScript (3.9.6)

Other

If this is simply a configuration issue, I'll be glad to test ideas or advice on how to handle it; if you need more information, please tell me.


This is a 🐛 Bug Report

gergan added the bug and needs-triage labels Jul 13, 2020
jogold (contributor) commented Jul 13, 2020

What's in the packages field of your lerna.json? Is infra-cdk considered a package/workspace? And where are the NodejsFunction constructs instantiated?

gergan (author) commented Jul 13, 2020

So that's our lerna.json:

  "npmClient": "yarn",
  "packages": ["packages/*", "infra-cdk", "infra-cdk/lib/lambdas", "bin"],
  "version": "1.0.0",
  "useWorkspaces": true
}

I have added the lambdas entry now while trying to find a working configuration; it wasn't there before.

In package.json the workspaces field is now the same:

  "workspaces": [
    "packages/*",
    "infra-cdk/lib/lambdas",
    "infra-cdk",
    "bin"
  ],

We have multiple NodejsFunction constructs. In lib we have a dozen stacks, and they are mostly directly under lib. So we have infra-cdk/lib/metrics-stack.ts, which references a lambda in infra-cdk/lib/lambdas/cloudwatch-ecs-eni-utilization.ts:

import * as path from "path"
import * as cdk from "@aws-cdk/core"
import * as lambda from "@aws-cdk/aws-lambda"
import * as lambdaNodejs from "@aws-cdk/aws-lambda-nodejs"
import * as logs from "@aws-cdk/aws-logs"

export const lambdasPath = "lambdas/"

export const getProjectRoot = () => {
  // parcel now runs in docker, so we need to resolve the project root ourselves
  return path.resolve(__dirname, "../..")
}

// ... inside the stack's constructor:
    const metricLambda = new lambdaNodejs.NodejsFunction(this, "CloudWatchMetricLambda", {
      description: `Custom CloudWatch metric: ENI utilization per ECS Cluster.`,
      entry: path.join(__dirname, lambdasPath, "cloudwatch-ecs-eni-utilization.ts"),
      logRetention: logs.RetentionDays.ONE_WEEK,
      timeout: cdk.Duration.seconds(5),
      tracing: lambda.Tracing.ACTIVE,
      projectRoot: getProjectRoot(),
      externalModules: [],
      nodeModules: ["aws-sdk", "aws-xray-sdk-core", "axios", "node-fetch"],
    })

We need to set projectRoot because CodeBuild removes the .git directory.

gergan (author) commented Jul 13, 2020

I have also moved parts of one of the stacks into their own directory, as the file had grown too big: infra-cdk/development-stack/constructs...ts, and they reference the same lambdas directory in the same manner.

jogold (contributor) commented Jul 13, 2020

And where is this @infra/aws-utils from the yarn error log? And why does it get installed?

gergan (author) commented Jul 13, 2020

it is in packages/aws-utils

gergan (author) commented Jul 13, 2020

It gets installed because it is TypeScript and has to be compiled. I could probably work around this with references in tsconfig; in fact, infra-cdk itself still has this reference, because in earlier versions its package.json was the source of truth for the lambdas.
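A rough sketch of the kind of tsconfig reference I mean, e.g. in infra-cdk/tsconfig.json (illustrative only; the path assumes the repo layout above, and the referenced package would also need "composite": true in its own tsconfig):

{
  "references": [
    { "path": "../packages/aws-utils" }
  ]
}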

jogold (contributor) commented Jul 13, 2020

It gets installed because it is TypeScript and has to be compiled

But Parcel can bundle it even if it's TypeScript. Have you tried that (removing it from the nodeModules list)?

gergan (author) commented Jul 13, 2020

So I have tested this change and it goes through, though it takes longer than before, and it changes my yarn.lock and removes many of the packages from it, most notably lerna. Is there something I could do to stop it from changing the yarn.lock, and perhaps also to make the build faster? The latter would just be a bonus.

jogold (contributor) commented Jul 13, 2020

So that's our lerna.json:

  "npmClient": "yarn",
  "packages": ["packages/*", "infra-cdk", "infra-cdk/lib/lambdas", "bin"],
  "version": "1.0.0",
  "useWorkspaces": true
}

Having nested packages/workspaces is usually an anti-pattern when working with lerna/yarn workspaces. You should normally have a single package.json under infra-cdk with the deps for your infra and the deps for your lambdas in there.
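Something along these lines, as an illustrative sketch (not your actual file; versions are placeholders based on the environment info above):

{
  "name": "infra-cdk",
  "version": "1.0.0",
  "private": true,
  "dependencies": {
    "@aws-cdk/core": "1.51.0",
    "@aws-cdk/aws-lambda": "1.51.0",
    "@aws-cdk/aws-lambda-nodejs": "1.51.0",
    "@aws-cdk/aws-logs": "1.51.0",
    "@infra/aws-utils": "*",
    "aws-xray-sdk-core": "*",
    "axios": "*",
    "node-fetch": "*"
  }
}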

gergan (author) commented Jul 13, 2020

This was in fact the way it was before I updated the CDK and ran into the problem with the lambdas not building. I will now try to revert some of the changes.

gergan (author) commented Jul 13, 2020

So I said that it works, but I had another problem which concealed the fact that it does not actually work. I had a CodeBuild step-function implementation which used renderTask and not the new _renderTask interface, and that's why, after finishing with the lambdas, it failed complaining that there is no _renderTask method to call.
So the current status, at least for me, is that it runs through all the lambdas but dies at the end, because it changes yarn.lock and the currently installed packages in node_modules and, ironically, cannot find itself:

yarn workspace infra-cdk cdk diff DevelopmentStack                                          <aws:sellwerk>
yarn workspace v1.22.4
yarn run v1.22.4
$ cdk diff DevelopmentStack
Cannot find module '../../../package.json'
Require stack:
- /xxxx/infra/node_modules/aws-cdk/lib/api/cxapp/cloud-executable.js
- /xxxx/infra/node_modules/aws-cdk/bin/cdk.js
- /xxxx/infra/node_modules/aws-cdk/bin/cdk
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
error Command failed.
Exit code: 1
Command: /xxxx/.nvm/versions/node/v12.16.3/bin/node
Arguments: /xxxx/.nvm/versions/node/v12.16.3/lib/node_modules/yarn/lib/cli.js cdk diff DevelopmentStack
Directory: /xxxx/infra/infra-cdk
Output:

info Visit https://yarnpkg.com/en/docs/cli/workspace for documentation about this command.

The files and the directory itself no longer exist after the run (and I don't have a global aws-cdk installed):

$ ls /xxxx/infra/node_modules/aws-cdk/
ls: /xxxx/infra/node_modules/aws-cdk/: No such file or directory

The second thing is that the NodejsFunction build times are now really prohibitive: with a dozen lambdas it needs more than half an hour to complete.

jogold (contributor) commented Jul 13, 2020

The second thing is that the NodejsFunction build times are now really prohibitive: with a dozen lambdas it needs more than half an hour to complete.

Looks like something is somehow reinstalling your whole repo at each NodejsFunction build... this is what takes time, not the Parcel bundling itself. Does it occur only in CodeBuild or also locally? Can you try removing all nodeModules props to test?

gergan (author) commented Jul 13, 2020

I'm running it locally, and yes, I removed all the nodeModules on the last test.
It could be that the whole change to Parcel 2 is the culprit and not, as I thought, your last commit; I'm investigating it now.

gergan (author) commented Jul 13, 2020

So now I have tested multiple configurations and settings.

The real problem is a globally installed (or Docker-installed) Parcel 2 combined with yarn workspaces/lerna. A comment I saw in the Parcel issues gave me a clue for a workaround: parcel-bundler/parcel#1588 (comment).

Adding parcel and @babel/core to the root package's devDependencies solved the problem with the continuous overwriting of yarn.lock and node_modules, and allowed me to run cdk diff to completion (I still have to test whether everything works after deployment, but it looks ok). This workaround seems really strange and more of a hack than a permanent solution.
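Concretely, the workaround amounts to something like this in the root package.json (versions are left as wildcards here; in practice they should match whatever Parcel version the bundling image uses):

{
  "devDependencies": {
    "@babel/core": "*",
    "parcel": "*"
  }
}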

The speed was somewhat improved with the change, but it's still prohibitively slow: almost a minute per lambda.

The last problem, the private package in nodeModules, will not work with the current implementation, as it runs outside the yarn workspaces/lerna setup and does not know anything about the private package. It would have been useful, at least for me, because sometimes I need to look at a lambda's code in the AWS console and a bundled lambda is really big; but if a solution cannot be implemented, a line in the documentation explaining the problem with private packages would be fine.

So I'm not sure how to proceed with this bug. For me it is partially an upstream bug, although there was functionality in previous versions which no longer works with the current version, and I don't see the update as a step forward: besides this bug, the speed of the new Parcel bundling is painfully slow.

gergan (author) commented Jul 14, 2020

So I have now run a test against the Parcel 1 implementation (I used commit 5491922), and here are the times from both runs:

yarn workspace infra-cdk cdk diff DevelopmentStack  18.68s user 2.35s system 4% cpu 8:41.53 total
yarn workspace infra-cdk cdk diff DevelopmentStack  19.11s user 3.34s system 3% cpu 11:52.17 total

The first one is with Parcel 1 and the second is with Parcel 2. The times with Parcel 2 are worse, but not by as much as I expected. Probably the problem with the times is in Docker and some sort of caching.

gergan (author) commented Jul 14, 2020

So I now have the time for Parcel 1 without Docker (I used commit 5b34ccb):

yarn workspace infra-cdk cdk diff DevelopmentStack  370.93s user 50.94s system 378% cpu 1:51.47 total

So I don't know; in this case I'll probably just package this, move to Parcel 2 and call it a day.
I don't see the advantage of Docker in this case (at least for us); it is probably useful for other shops which use the CDK from other languages, which in fact is also a misnomer here, as we are talking about Node.js lambdas and not Python ones.

jogold (contributor) commented Jul 14, 2020

@gergan I made some tests and I can confirm that there are issues in some monorepo setups (overwriting of node_modules and yarn.lock). Looking into this to find the correct fix.

Adding parcel and @babel/core to the root package's devDependencies solved the problem with the continuous overwriting of yarn.lock and node_modules

This is the correct temporary workaround, though adding @babel/core should be sufficient.

jogold added a commit to jogold/aws-cdk that referenced this issue Jul 14, 2020
When Parcel cannot find `@babel/core` in the search paths for node
modules it tries to install it. This overwrites the lock file and
`node_modules`.

Add the `--no-autoinstall` flag to prevent Parcel from installing any
missing dependency and switch to a local install in the Docker image at
the root (`/`) of the filesystem. This way all dependencies in
`/node_modules` will be in the search paths of `/asset-input`.

This was not detected in this repo or in the init template because
`@babel/core` is a dependency of `jest`.

Closes aws#9032
jogold (contributor) commented Jul 14, 2020

@gergan can you give it a try with this branch https://github.com/jogold/aws-cdk/tree/parcel-no-auto-install / commit jogold@fcd9979?

It should work without any workaround in your root package.json (the first run could take some time because the Docker image must be rebuilt). Also this branch includes #9000 so bundling should be a little faster.

gergan (author) commented Jul 14, 2020

So I've tested it and unfortunately it didn't help. I think I already tested the no-autoinstall option, to no avail, yesterday with a global Parcel instance.

jogold (contributor) commented Jul 14, 2020

So I've tested it and unfortunately it didn't help.

What did you test exactly? Any error messages?

gergan (author) commented Jul 14, 2020

Sorry, for all the tests I have used an in-tree package where I just copied the files. I did the same for your change, but I now see that I forgot to exchange the imports, as I've been switching between branches the whole day... I'll retest ASAP and report here.

jogold added a commit to jogold/aws-cdk that referenced this issue Jul 14, 2020
jogold (contributor) commented Jul 14, 2020

Sorry, for all the tests I have used an in-tree package where I just copied the files. I did the same for your change, but I now see that I forgot to exchange the imports, as I've been switching between branches the whole day... I'll retest ASAP and report here.

No problem, this should be the fix master...jogold:parcel-no-auto-install

gergan (author) commented Jul 14, 2020

Ok, so it worked, and it is a little bit faster on the first run:

yarn workspace infra-cdk cdk diff DevelopmentStack  19.11s user 3.19s system 3% cpu 9:40.68 total

The second run is around 5:30 minutes.

SomayaB added the in-progress label Jul 15, 2020
eladb added the p1 label Jul 16, 2020
mergify bot closed this as completed in #9067 Jul 16, 2020
mergify bot pushed a commit that referenced this issue Jul 16, 2020
curtiseppel pushed a commit to curtiseppel/aws-cdk that referenced this issue Aug 11, 2020