Unauthorized cache warning #6765

Open

anthonyshew opened this issue Dec 11, 2023 · Discussed in #6740 · 39 comments

Comments

@anthonyshew
Contributor

This is a bug. We would appreciate reproductions if you have one to provide. 🙏

Discussed in #6740

Originally posted by dcantu96 December 7, 2023
Hello, I just updated from 1.10.16 to 1.11.0 and started to see the warning below when I ran a turbo command. Some other helpful notes: our company hosts the remote cache servers, and the TURBO_TOKEN is set in the root .env file. It had been working without warnings until now.

 WARNING  artifact verification failed: Error making HTTP request: HTTP status client error (401 Unauthorized) for url (https://turborepo.company.app/v8/artifacts/123ID?teamId=MY_TEAM_ID)
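For context, a minimal sketch of the kind of setup described in this report, assuming the root .env is loaded into the environment before turbo runs (the hostname and values are illustrative placeholders, not taken from the report):

```sh
# .env at the repository root (illustrative placeholders)
TURBO_API=https://turborepo.company.app   # self-hosted remote cache endpoint
TURBO_TEAM=MY_TEAM_ID                     # appears as the teamId query parameter above
TURBO_TOKEN=xxxxxxxx                      # bearer token sent with cache requests
```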
@Zertsov
Contributor

Zertsov commented Dec 11, 2023

If we could have the command you ran and a diff between what happens in 1.10 vs 1.11, that'd be appreciated!

@spacedawwwg

spacedawwwg commented Dec 14, 2023

> diff between what happens

Not 100% sure how to supply this... but I'll give you what I can.

turbo.json

{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "lint": {},
    "lint:fix": {},
    "lint:report": {
      "outputs": ["reports/eslint.json"]
    },
    "test:unit": {},
    "test:unit:coverage": {
      "outputs": ["coverage/**"]
    },
    "typecheck": {
      "outputs": ["dist/**", "bin/**"]
    }
  }
}

Using 1.11, running turbo test:unit:coverage I get:

 WARNING  artifact verification failed: Error making HTTP request: HTTP status client error (412 Precondition Failed) for url (https://mycustomsubdomain.azurewebsites.net/v8/artifacts/373260d19847f925?slug=team_uol)

Using 1.10, the above is not an issue.

@chris-olszewski
Contributor

@spacedawwwg

Can you confirm that on 1.10 the remote cache gets written to? e.g. turbo lint followed by turbo lint --remote-only, which should be a cache hit if the remote cache is being written to successfully by the first command.
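Spelled out as commands, the suggested check looks roughly like this (a sketch; `lint` stands in for whatever task is being tested):

```sh
# First run: executes the task and, on 1.10, should upload the artifact
# to the remote cache.
npx turbo lint

# Second run, skipping the local filesystem cache: a cache hit here
# means the first command's remote write succeeded.
npx turbo lint --remote-only
```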

@spacedawwwg

spacedawwwg commented Dec 18, 2023

@chris-olszewski I can confirm that, running the above commands on 1.10, the remote cache was written:
[screenshot]

@dgattey

dgattey commented Dec 23, 2023

Getting a very similar error after upgrading, too. I'm running turbo lint or turbo lint:types when there's no cache hit, and this prints repeatedly:

WARNING  artifact verification failed: unknown status forbidden: Not authorized

And if I run with --remote-only, I get full turbo and success, so it is actually writing to the remote cache successfully.

@attila

attila commented Dec 28, 2023

Having bumped into this issue, I can also confirm it using ducktors/turborepo-remote-cache.

  • the remote cache written by turbo 1.10.16 is not recognised by 1.11.2 for cache hits
  • 1.11.2 cannot write to the remote cache at all, always showing "artifact verification failed" warnings, with no requests made to the remote cache service whatsoever

Moreover, it seems that this warning may not be a remote caching issue, but a generic "artifact verification" problem, as I could confirm the same bug with local filesystem caches:

  • Local cache entries generated with turbo 1.10.16 are not recognised by 1.11.2 (see the sketch after this comment)
  • 1.11.2 recognises its own local cached artefacts

The task I tested all of the above with does not have any extra outputs, only stdout.

Hope that helps.
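A rough sketch of the cross-version local-cache check described above (versions from this thread; `typecheck` is an example task name):

```sh
# Populate the local cache with 1.10.16.
npm install --save-dev turbo@1.10.16
npx turbo run typecheck   # cache miss; writes a local cache entry

# Read the same entry back with 1.11.2.
npm install --save-dev turbo@1.11.2
npx turbo run typecheck   # reportedly another miss instead of a cache hit
```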

@smakhtin

smakhtin commented Jan 3, 2024

Also getting this error in our repo: https://github.com/sovereign-nature/deep

@Zertsov
Contributor

Zertsov commented Jan 4, 2024

@attila
I made a PR that I think addresses the issue, but after reproducing it once with my own setup, I haven't been able to do so since. If you could, would you be able to try that version of Turbo to see if it continues to happen? The PR is here.

Also, can you confirm if this is still happening in v1.11.3?

@psychobolt

psychobolt commented Jan 5, 2024

The issue is really random; however, I am still able to reproduce it locally. See screenshot:

[screenshot]

On my private repo in Bitbucket, an interesting case where the slug is suffixed to the warning:
WARNING artifact verification failed: Error making HTTP request: error sending request for url (https://vercel.com/api/v8/artifacts/7782cf756ea79997?slug=$TURBO_TEAM): operation timed out

On my public repo on GitHub, I get a different error in the runner log:
https://github.com/psychobolt/vite-storybook-boilerplate/actions/runs/7415684842/job/20179281074#step:5:23

Could this be related? WARNING artifact verification failed: unknown status forbidden: The request is missing an authentication token

@admmasters

Getting artifact verification failed: Error making HTTP request: error sending request for url (https://*****/v8/artifacts/******?teamId=***): error trying to connect: invalid peer certificate contents: invalid peer certificate: UnknownIssuer. Downgrading to 1.10.x works as expected.

@Zertsov
Contributor

Zertsov commented Jan 10, 2024

Hey all, I'm still investigating this. If anyone has any artifacts that they're willing to share to help debug that'd be appreciated, since I'm having problems reproducing this on my end.

@admmasters If you use the latest version of turbo, can you run whatever command you ran with --go-fallback at the end? Wondering if this is something to do with the Rust HTTP client vs the Go one.
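For anyone else following along, the requested check is just the original command with the flag appended (a sketch; `build` is an example task):

```sh
# Fall back to the Go implementation, exercising the Go HTTP client
# instead of the Rust one.
npx turbo run build --remote-only --go-fallback
```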

@admmasters

Yes, it works perfectly with --go-fallback, so it looks like an incompatibility with the Rust client.

@gsoltis
Contributor

gsoltis commented Jan 11, 2024

@admmasters can you provide any details about the certificate your remote cache is using? Is it self-signed, or does it roll up to a custom root certificate?

@admmasters

admmasters commented Jan 11, 2024

@gsoltis It's self-signed, yeah.

@gsoltis
Contributor

gsoltis commented Jan 12, 2024

@admmasters Got it. OK, I believe you have a separate issue from what's reported here. This is likely a difference in the default certificate verification behavior of the HTTP clients (cc @Zertsov @NicholasLYang).

@NicholasLYang
Contributor

Hi @admmasters, could you open a new issue? And if possible, include the following details: Are you using a proxy? Can you validate that you are getting cache hits from the remote cache with the --go-fallback flag (using --remote-only to skip the filesystem cache)?

@chris-olszewski
Contributor

Hi all, we're continuing to look into this. If people who ran into issues could try again with turbo@1.11.4-canary.2 and see whether that version has issues as well, that would be helpful. If the issue persists, please include the full error message along with how you're providing the various pieces of authentication:

  • How is your token provided? TURBO_TOKEN / --token / stored in the config file?
  • How is your team provided? TURBO_TEAM / --team / team ID stored in .turbo/config.json?
  • How is the API URL provided?
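For reference, a sketch of those three configuration surfaces with placeholder values (endpoint, team, token, and task name are all illustrative):

```sh
# 1. Environment variables
export TURBO_API=https://cache.example.com
export TURBO_TEAM=my-team
export TURBO_TOKEN=my-token

# 2. Command-line flags
npx turbo run build --api=https://cache.example.com --team=my-team --token=my-token

# 3. Config file: .turbo/config.json in the repository (stores e.g. the team ID;
#    see the Turborepo docs for the exact schema)
```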

@attila

attila commented Jan 24, 2024

I did a quick test in our monorepo:

  1. I installed turbo@1.10.16 – the last known stable and working version for us
  2. I ran a single task in a single workspace (turbo run typecheck --filter='workspace-name' --remote-only)
    1. First run was a cache miss – as expected
    2. Second run was full turbo – as expected
  3. I installed turbo@1.11.4-canary.2
  4. I ran the same task as above, which resulted in a cache miss but also gave this error:
    WARNING  failed to contact remote cache: Error making HTTP request: error sending request for url (https://mylambda.lambda-url.eu-west-1.on.aws/v8/artifacts/7f6df9e5f2ee4fd4?slug=team_my): error trying to connect: tcp connect error: Bad file descriptor (os error 9)
  5. Subsequent attempts of step 4 always result in a cache miss and the same warning

Note that instead of the previously reported "artifact verification failed" warnings with 1.11.2, I now get "tcp connect error: Bad file descriptor".

Token, team and the remote cache URL are all provided via env vars:

TURBO_API=https://mylambda.lambda-url.eu-west-1.on.aws
TURBO_TEAM=team_my
TURBO_TOKEN=my_token

I ran this on an Intel Mac running macOS 14.2.1 (23C71).

@gsoltis
Contributor

gsoltis commented Jan 24, 2024

@attila Interesting. My suspicion is that the credentials are fine, but something significant to your setup is different in the TLS stack between the Rust and Go implementations.

If you're comfortable with it, can you email me (greg.soltis@vercel.com) the domain name (please no credentials or private keys) that is hosting your remote cache? I'd like to try to confirm if we can make a TLS connection to it.

Also, any details you're aware of about the TLS setup would be helpful. For instance: is the root cert for the domain signed by one of the default Certificate Authorities? Or do you have a custom root CA on your machine?

@attila

attila commented Jan 25, 2024

@gsoltis I just sent an email with the full base URL of the service. Regarding the TLS setup, I'm unaware of anything exotic there; the service runs on a Lambda function URL, with the default hostname provided by AWS.

@gsoltis
Contributor

gsoltis commented Jan 25, 2024

@attila Thanks, we'll see what we can find. AFAICT there is nothing weird with the endpoint setup (just poking at it w/ openssl s_client -connect), but maybe there's something about the Rust TLS setup that's getting tripped up.
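For anyone who wants to run the same probe against their own endpoint, a minimal sketch (hostname is a placeholder):

```sh
# Print the TLS handshake details and certificate chain for the cache endpoint.
openssl s_client -connect mylambda.lambda-url.eu-west-1.on.aws:443 \
  -servername mylambda.lambda-url.eu-west-1.on.aws </dev/null
```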

@gsoltis
Contributor

gsoltis commented Jan 26, 2024

@attila I've tried running turbo@1.11.4-canary.2 with some fake cache artifacts and obviously-wrong credentials against the domain that you supplied, both from an M1 Mac and an Intel Mac. In both cases, I'm getting
WARNING failed to contact remote cache: Error making HTTP request: HTTP status client error (401 Unauthorized) for url (https://LAMBDA_ID.lambda-url.eu-west-1.on.aws/v8/artifacts/5ba3c902e45327e3?slug=my-team)
which is the correct response given the lack of credentials. However, it shows that I can connect and exchange data.

Let's see if we can isolate the problem. Can you try with a new monorepo:

npx -y create-turbo@canary my-turborepo npm
cd my-turborepo
TURBO_TEAM=my-team TURBO_TOKEN=my-token TURBO_API=https://my-api-endpoint npm exec turbo -- build -vv --remote-only --output-logs=none

Note that the logs will include your domain name. You should get an error writing to the cache, but hopefully the more verbose logging will give us a clue as to where to look.

@attila

attila commented Jan 26, 2024

@gsoltis

Here's the sanitised log output from the command above:
2024-01-26T07:14:12.934+0000 [DEBUG] turborepo_lib::shim: Global turbo version: 1.11.4-canary.2
2024-01-26T07:14:12.935+0000 [DEBUG] turborepo_lib::shim: Repository Root: /Users/attila/my-turborepo
2024-01-26T07:14:12.936+0000 [DEBUG] turborepo_lib::shim: Local turbo path: /Users/attila/my-turborepo/node_modules/turbo-darwin-64/bin/turbo
2024-01-26T07:14:12.936+0000 [DEBUG] turborepo_lib::shim: Local turbo version: 1.11.4-canary.2
2024-01-26T07:14:12.942+0000 [DEBUG] log: starting new connection: https://turbo.build/
2024-01-26T07:14:12.942+0000 [DEBUG] hyper::client::connect::dns: resolving host="turbo.build"
2024-01-26T07:14:12.945+0000 [DEBUG] hyper::client::connect::http: connecting to 76.76.21.241:443
2024-01-26T07:14:14.140+0000 [DEBUG] turborepo_lib::shim: Currently running turbo is local turbo.
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_lib::commands::run: using the experimental rust codepath
2024-01-26T07:14:14.141+0000 [DEBUG] turborepo_lib::commands::run: configured run struct: Run { base: CommandBase { repo_root: AbsoluteSystemPathBuf("/Users/attila/my-turborepo"), ui: UI { should_strip_ansi: false }, config: OnceCell(<uninit>), args: Args { version: false, skip_infer: false, no_update_notifier: false, api: None, color: false, cpu_profile: None, cwd: Some("/Users/attila/my-turborepo"), heap: None, login: None, no_color: false, preflight: false, remote_cache_timeout: None, team: None, token: None, trace: None, verbosity: Verbosity { verbosity: None, v: 2 }, check_for_update: false, test_run: false, run_args: None, command: Some(Run(RunArgs { cache_dir: None, cache_workers: 10, concurrency: None, continue_execution: false, dry_run: None, go_fallback: false, single_package: false, force: None, framework_inference: true, global_deps: [], graph: None, env_mode: Infer, filter: [], scope: [], ignore: [], since: None, include_dependencies: false, no_deps: false, no_cache: false, daemon: false, no_daemon: false, output_logs: Some(None), log_order: Auto, only: false, parallel: false, pkg_inference_root: None, profile: None, anon_profile: None, remote_only: true, remote_cache_read_only: false, summarize: None, log_prefix: Auto, tasks: ["build"], pass_through_args: [], experimental_space_id: None })) }, version: "1.11.4-canary.2" }, processes: ProcessManager(Mutex { data: ProcessManagerInner { is_closing: false, children: [] }, poisoned: false, .. }) }
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_lib::daemon::connector: looking for pid in lockfile: AbsoluteSystemPathBuf("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534/turbod.pid")
2024-01-26T07:14:14.142+0000 [DEBUG] turborepo_lib::daemon::connector: no pid found, starting daemon
2024-01-26T07:14:14.145+0000 [DEBUG] turborepo_lib::daemon::connector: got daemon with pid: 2966
2024-01-26T07:14:14.146+0000 [DEBUG] turborepo_lib::daemon::connector: creating AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534")
2024-01-26T07:14:14.146+0000 [DEBUG] turborepo_lib::daemon::connector: watching AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534")
2024-01-26T07:14:14.157+0000 [DEBUG] turborepo_lib::daemon::connector: creating AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534")
2024-01-26T07:14:14.157+0000 [DEBUG] turborepo_lib::daemon::connector: watching AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534")
2024-01-26T07:14:14.157+0000 [DEBUG] turborepo_lib::daemon::connector: connecting to socket: /var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/5d0d76a86387f534/turbod.sock
2024-01-26T07:14:14.157+0000 [DEBUG] h2::client: binding client connection
2024-01-26T07:14:14.157+0000 [DEBUG] h2::client: client connection bound
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.157+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.157+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.158+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.158+0000 [DEBUG] h2::proto::settings: received settings ACK; applying Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384 }
2024-01-26T07:14:14.176+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.176+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.177+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.177+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.177+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_lib::daemon::connector: connected in 34741µs
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_lib::run: running in daemon mode
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_repository::discovery: no cached data, running primary strategy
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_repository::discovery: discovering packages using fallback strategy
2024-01-26T07:14:14.177+0000 [DEBUG] turborepo_repository::discovery: attempting primary strategy
2024-01-26T07:14:14.199+0000 [DEBUG] turborepo_repository::discovery: discovering packages using optional strategy
2024-01-26T07:14:14.199+0000 [DEBUG] turborepo_lib::run::package_discovery: discovering packages using daemon
2024-01-26T07:14:14.199+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:14.199+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.199+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.199+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.200+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.200+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.200+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.200+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T07:14:14.214+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T07:14:14.214+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.214+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.214+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
• Packages in scope: @repo/eslint-config, @repo/typescript-config, @repo/ui, docs, web
• Running build in 5 packages
• Remote caching enabled
2024-01-26T07:14:14.223+0000 [DEBUG] turborepo_lib::run::global_hash: global hash env vars []
2024-01-26T07:14:14.391+0000 [DEBUG] turborepo_lib::run::global_hash: external deps hash: 2ab9b9b7ca4f0498
2024-01-26T07:14:14.391+0000 [DEBUG] turborepo_lib::run: global hash: eea8d017656bd0ad
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.392+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.393+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.393+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.393+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.414+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.414+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.414+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.414+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.414+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.439+0000 [DEBUG] turborepo_lib::run: running visitor
2024-01-26T07:14:14.439+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.439+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.439+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @repo/eslint-config:build
 vars: []
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @repo/eslint-config#build hash is 4d1d2eaec67bc97f
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @repo/typescript-config:build
 vars: []
2024-01-26T07:14:14.440+0000 [DEBUG] log: Engine visitor dropped callback sender without sending result
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @repo/typescript-config#build hash is d4fe0f0d0b570cfb
2024-01-26T07:14:14.440+0000 [DEBUG] log: Engine visitor dropped callback sender without sending result
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @repo/ui:build
 vars: []
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @repo/ui#build hash is d51673cb09827d95
2024-01-26T07:14:14.440+0000 [DEBUG] log: Engine visitor dropped callback sender without sending result
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_hash: auto detected framework for docs
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_lib::task_hash: framework: nextjs, env_prefix: ["NEXT_PUBLIC_*"]
2024-01-26T07:14:14.440+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for docs:build
 vars: []
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_graph::visitor: task docs#build hash is 503bb6cab51a9039
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.441+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_hash: auto detected framework for web
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_hash: framework: nextjs, env_prefix: ["NEXT_PUBLIC_*"]
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for web:build
 vars: []
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_lib::task_graph::visitor: task web#build hash is 8aafdfc02ef725ee
2024-01-26T07:14:14.441+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T07:14:14.441+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.441+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:14.453+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T07:14:14.453+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T07:14:14.453+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T07:14:14.453+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T07:14:14.454+0000 [DEBUG] hyper::client::connect::http: connecting to 52.215.129.118:443
2024-01-26T07:14:14.455+0000 [DEBUG] hyper::client::connect::http: connecting to 52.215.129.118:443
2024-01-26T07:14:15.373+0000 [DEBUG] hyper::client::connect::http: connected to 52.215.129.118:443
2024-01-26T07:14:15.373+0000 [DEBUG] log: No cached session for DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:15.373+0000 [DEBUG] log: Not resuming any session
2024-01-26T07:14:15.373+0000 [DEBUG] hyper::client::connect::http: connected to 52.215.129.118:443
2024-01-26T07:14:15.373+0000 [DEBUG] log: No cached session for DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:15.373+0000 [DEBUG] log: Not resuming any session
2024-01-26T07:14:15.405+0000 [DEBUG] log: ALPN protocol is None
2024-01-26T07:14:15.405+0000 [DEBUG] log: Using ciphersuite TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
2024-01-26T07:14:15.405+0000 [DEBUG] log: Server supports tickets
2024-01-26T07:14:15.408+0000 [DEBUG] log: ECDHE curve is ECParameters { curve_type: NamedCurve, named_group: secp256r1 }
2024-01-26T07:14:15.408+0000 [DEBUG] log: Server DNS name is DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:15.411+0000 [DEBUG] log: ALPN protocol is None
2024-01-26T07:14:15.411+0000 [DEBUG] log: Using ciphersuite TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
2024-01-26T07:14:15.411+0000 [DEBUG] log: Server supports tickets
2024-01-26T07:14:15.414+0000 [DEBUG] log: ECDHE curve is ECParameters { curve_type: NamedCurve, named_group: secp256r1 }
2024-01-26T07:14:15.414+0000 [DEBUG] log: Server DNS name is DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:15.435+0000 [DEBUG] hyper::proto::h1::io: flushed 277 bytes
2024-01-26T07:14:15.441+0000 [DEBUG] hyper::proto::h1::io: flushed 277 bytes
2024-01-26T07:14:15.650+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:15.650+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (32 bytes)
2024-01-26T07:14:15.650+0000 [DEBUG] hyper::proto::h1::conn: incoming body completed
2024-01-26T07:14:15.650+0000 [DEBUG] hyper::client::pool: pooling idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:15.650+0000 [DEBUG] turborepo_cache::http: logging fetch: AnalyticsEvent { session_id: None, source: Remote, event: Miss, hash: "8aafdfc02ef725ee", duration: 0 }
2024-01-26T07:14:15.653+0000 [DEBUG] turborepo_lib::process::child: waiting for task
2024-01-26T07:14:15.855+0000 [DEBUG] hyper::client::pool: reuse idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:15.855+0000 [DEBUG] hyper::proto::h1::io: flushed 384 bytes
2024-01-26T07:14:15.913+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:15.913+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (2 bytes)
2024-01-26T07:14:15.913+0000 [DEBUG] hyper::proto::h1::conn: incoming body completed
2024-01-26T07:14:15.913+0000 [DEBUG] hyper::client::pool: pooling idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:18.933+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:18.933+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (32 bytes)
2024-01-26T07:14:18.934+0000 [DEBUG] turborepo_cache::http: logging fetch: AnalyticsEvent { session_id: None, source: Remote, event: Miss, hash: "503bb6cab51a9039", duration: 0 }
2024-01-26T07:14:18.934+0000 [DEBUG] log: Sending warning alert CloseNotify
2024-01-26T07:14:18.936+0000 [DEBUG] turborepo_lib::process::child: waiting for task
2024-01-26T07:14:19.138+0000 [DEBUG] hyper::client::pool: reuse idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:19.138+0000 [DEBUG] hyper::proto::h1::io: flushed 384 bytes
2024-01-26T07:14:19.192+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:19.192+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (2 bytes)
2024-01-26T07:14:19.192+0000 [DEBUG] log: Sending warning alert CloseNotify
2024-01-26T07:14:35.841+0000 [DEBUG] turborepo_lib::process::child: child process exited normally
2024-01-26T07:14:35.841+0000 [DEBUG] turborepo_lib::process::child: child process stopped
2024-01-26T07:14:35.841+0000 [DEBUG] turborepo_lib::run::cache: caching outputs: outputs: TaskOutputs { inclusions: ["apps/web/.next/**", "apps/web/.turbo/turbo-build.log"], exclusions: ["apps/web/.next/cache/**"] }
2024-01-26T07:14:35.845+0000 [DEBUG] turborepo_lib::process::child: child process exited normally
2024-01-26T07:14:35.845+0000 [DEBUG] turborepo_lib::process::child: child process stopped
2024-01-26T07:14:35.845+0000 [DEBUG] turborepo_lib::run::cache: caching outputs: outputs: TaskOutputs { inclusions: ["apps/docs/.next/**", "apps/docs/.turbo/turbo-build.log"], exclusions: ["apps/docs/.next/cache/**"] }
2024-01-26T07:14:35.845+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:35.845+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.845+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.845+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.848+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T07:14:35.848+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.848+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.848+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:35.857+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T07:14:35.857+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T07:14:35.857+0000 [DEBUG] h2::codec::framed_read: received

 Tasks:    2 successful, 2 total
Cached:    0 cached, 2 total
  Time:    21.717s

2024-01-26T07:14:35.858+0000 [DEBUG] hyper::client::connect::http: connecting to 34.249.228.148:443
...Finishing writing to cache...
2024-01-26T07:14:35.860+0000 [DEBUG] turborepo_lib::process: waiting for 2 processes to exit
2024-01-26T07:14:35.860+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T07:14:35.860+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T07:14:35.861+0000 [DEBUG] hyper::client::connect::http: connecting to 34.249.228.148:443
2024-01-26T07:14:35.886+0000 [DEBUG] hyper::client::connect::http: connected to 34.249.228.148:443
2024-01-26T07:14:35.886+0000 [DEBUG] log: Resuming session
2024-01-26T07:14:35.889+0000 [DEBUG] hyper::client::connect::http: connected to 34.249.228.148:443
2024-01-26T07:14:35.889+0000 [DEBUG] log: Resuming session
2024-01-26T07:14:35.920+0000 [DEBUG] log: ALPN protocol is None
2024-01-26T07:14:35.920+0000 [DEBUG] log: Using ciphersuite TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
2024-01-26T07:14:35.920+0000 [DEBUG] log: Server supports tickets
2024-01-26T07:14:35.923+0000 [DEBUG] log: ECDHE curve is ECParameters { curve_type: NamedCurve, named_group: secp256r1 }
2024-01-26T07:14:35.923+0000 [DEBUG] log: Server DNS name is DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:35.924+0000 [DEBUG] log: ALPN protocol is None
2024-01-26T07:14:35.924+0000 [DEBUG] log: Using ciphersuite TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
2024-01-26T07:14:35.924+0000 [DEBUG] log: Server supports tickets
2024-01-26T07:14:35.928+0000 [DEBUG] log: ECDHE curve is ECParameters { curve_type: NamedCurve, named_group: secp256r1 }
2024-01-26T07:14:35.928+0000 [DEBUG] log: Server DNS name is DnsName("my-api-endpoint.lambda-url.eu-west-1.on.aws")
2024-01-26T07:14:35.957+0000 [DEBUG] hyper::proto::h1::io: flushed 196608 bytes
2024-01-26T07:14:35.960+0000 [DEBUG] hyper::proto::h1::io: flushed 196608 bytes
2024-01-26T07:14:35.980+0000 [DEBUG] hyper::proto::h1::io: flushed 4692 bytes
2024-01-26T07:14:35.980+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:35.987+0000 [DEBUG] hyper::proto::h1::io: flushed 3244 bytes
2024-01-26T07:14:35.987+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:35.990+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:35.990+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.003+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.003+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.005+0000 [DEBUG] hyper::proto::h1::io: flushed 5763 bytes
2024-01-26T07:14:36.005+0000 [DEBUG] hyper::proto::h1::io: flushed 5763 bytes
2024-01-26T07:14:36.006+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.006+0000 [DEBUG] hyper::proto::h1::io: flushed 5763 bytes
2024-01-26T07:14:36.013+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.013+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.015+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.015+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.016+0000 [DEBUG] hyper::proto::h1::io: flushed 5763 bytes
2024-01-26T07:14:36.016+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.017+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.017+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.028+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.028+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.029+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.029+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.030+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.030+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.030+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.030+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.031+0000 [DEBUG] hyper::proto::h1::io: flushed 13955 bytes
2024-01-26T07:14:36.031+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.032+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.032+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.033+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.033+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.036+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.036+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.038+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.038+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.040+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.040+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.043+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.043+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.045+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.045+0000 [DEBUG] hyper::proto::h1::io: flushed 20699 bytes
2024-01-26T07:14:36.047+0000 [DEBUG] hyper::proto::h1::io: flushed 12478 bytes
2024-01-26T07:14:36.047+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.048+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.048+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.050+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.050+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.052+0000 [DEBUG] hyper::proto::h1::io: flushed 22147 bytes
2024-01-26T07:14:36.052+0000 [DEBUG] hyper::proto::h1::io: flushed 2838 bytes
2024-01-26T07:14:36.052+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.052+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.055+0000 [DEBUG] hyper::proto::h1::io: flushed 22147 bytes
2024-01-26T07:14:36.055+0000 [DEBUG] hyper::proto::h1::io: flushed 12478 bytes
2024-01-26T07:14:36.056+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.056+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.059+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.059+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.060+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.060+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.062+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.062+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.065+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.065+0000 [DEBUG] hyper::proto::h1::io: flushed 13117 bytes
2024-01-26T07:14:36.088+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.088+0000 [DEBUG] hyper::proto::h1::io: flushed 12507 bytes
2024-01-26T07:14:36.089+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.089+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.092+0000 [DEBUG] hyper::proto::h1::io: flushed 11059 bytes
2024-01-26T07:14:36.092+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.094+0000 [DEBUG] hyper::proto::h1::io: flushed 5763 bytes
2024-01-26T07:14:36.094+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.097+0000 [DEBUG] hyper::proto::h1::io: flushed 4315 bytes
2024-01-26T07:14:36.097+0000 [DEBUG] hyper::proto::h1::io: flushed 2867 bytes
2024-01-26T07:14:36.249+0000 [DEBUG] hyper::proto::h1::io: flushed 65536 bytes
2024-01-26T07:14:36.249+0000 [DEBUG] hyper::proto::h1::io: flushed 3178 bytes
2024-01-26T07:14:36.711+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:36.711+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (46 bytes)
2024-01-26T07:14:36.711+0000 [DEBUG] hyper::proto::h1::conn: incoming body completed
2024-01-26T07:14:36.711+0000 [DEBUG] hyper::client::pool: pooling idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:36.837+0000 [DEBUG] hyper::proto::h1::io: parsed 9 headers
2024-01-26T07:14:36.837+0000 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (46 bytes)
2024-01-26T07:14:36.837+0000 [DEBUG] hyper::proto::h1::conn: incoming body completed
2024-01-26T07:14:36.837+0000 [DEBUG] hyper::client::pool: pooling idle connection for ("https", my-api-endpoint.lambda-url.eu-west-1.on.aws)
2024-01-26T07:14:36.837+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:36.837+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T07:14:36.837+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T07:14:36.837+0000 [DEBUG] h2::proto::connection: Connection::poll; connection error
2024-01-26T07:14:36.837+0000 [DEBUG] turborepo_lib::cli: Skipping telemetry close - not initialized

I observed that I am no longer getting errors, and there is a remote cache hit on subsequent attempts (I cleaned up **/.turbo and node_modules/.cache/turbo before subsequent runs).

I am still trying to understand the differences between our own monorepo and the scaffolded one; so far, no luck.

In the meantime, I took the liberty of running a task with verbose logging on our monorepo that shows the issue.

The command I ran was a simple TypeScript "linting" command that only has the "^topo" task as a dependency.
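The pipeline shape being described is roughly the following (a sketch of the common "topo" pass-through pattern; the actual turbo.json from this monorepo was not shared):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "topo": {
      "dependsOn": ["^topo"]
    },
    "typecheck": {
      "dependsOn": ["^topo"]
    }
  }
}
```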

TURBO_TEAM=my-team TURBO_TOKEN=my-token TURBO_API=https://my-api-endpoint npm exec turbo -- typecheck -vv --filter='@client-project/util-logger' --remote-only
Here's the sanitised log output from the monorepo where it does not work:
npm exec turbo -- typecheck -vv --filter='@client-project/util-logger' --remote-only
2024-01-26T08:54:33.487+0000 [DEBUG] turborepo_lib::shim: Global turbo version: 1.11.4-canary.2
2024-01-26T08:54:33.491+0000 [DEBUG] turborepo_lib::shim: Repository Root: /Users/attila/Projects/clients/client/my-monorepo
2024-01-26T08:54:33.491+0000 [DEBUG] turborepo_lib::shim: Local turbo path: /Users/attila/Projects/clients/client/my-monorepo/node_modules/turbo-darwin-64/bin/turbo
2024-01-26T08:54:33.491+0000 [DEBUG] turborepo_lib::shim: Local turbo version: 1.11.4-canary.2
2024-01-26T08:54:33.491+0000 [DEBUG] turborepo_lib::shim: Currently running turbo is local turbo.
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_lib::commands::run: using the experimental rust codepath
2024-01-26T08:54:33.492+0000 [DEBUG] turborepo_lib::commands::run: configured run struct: Run { base: CommandBase { repo_root: AbsoluteSystemPathBuf("/Users/attila/Projects/clients/client/my-monorepo"), ui: UI { should_strip_ansi: false }, config: OnceCell(<uninit>), args: Args { version: false, skip_infer: false, no_update_notifier: false, api: None, color: false, cpu_profile: None, cwd: Some("/Users/attila/Projects/clients/client/my-monorepo"), heap: None, login: None, no_color: false, preflight: false, remote_cache_timeout: None, team: None, token: None, trace: None, verbosity: Verbosity { verbosity: None, v: 2 }, check_for_update: false, test_run: false, run_args: None, command: Some(Run(RunArgs { cache_dir: None, cache_workers: 10, concurrency: None, continue_execution: false, dry_run: None, go_fallback: false, single_package: false, force: None, framework_inference: true, global_deps: [], graph: None, env_mode: Infer, filter: ["@client-project/util-logger"], scope: [], ignore: [], since: None, include_dependencies: false, no_deps: false, no_cache: false, daemon: false, no_daemon: false, output_logs: None, log_order: Auto, only: false, parallel: false, pkg_inference_root: None, profile: None, anon_profile: None, remote_only: true, remote_cache_read_only: false, summarize: None, log_prefix: Auto, tasks: ["typecheck"], pass_through_args: [], experimental_space_id: None })) }, version: "1.11.4-canary.2" }, processes: ProcessManager(Mutex { data: ProcessManagerInner { is_closing: false, children: [] }, poisoned: false, .. }) }
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: looking for pid in lockfile: AbsoluteSystemPathBuf("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0/turbod.pid")
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: found pid: 8343
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: got daemon with pid: 8343
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: creating AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0")
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: watching AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0")
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: creating AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0")
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: watching AbsoluteSystemPath("/var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0")
2024-01-26T08:54:33.498+0000 [DEBUG] turborepo_lib::daemon::connector: connecting to socket: /var/folders/6z/gvpcf9fs1wsfd__twkgzyflr0000gn/T/turbod/152c79eae0c118f0/turbod.sock
2024-01-26T08:54:33.498+0000 [DEBUG] h2::client: binding client connection
2024-01-26T08:54:33.498+0000 [DEBUG] h2::client: client connection bound
2024-01-26T08:54:33.498+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.498+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.498+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] h2::proto::settings: received settings ACK; applying Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384 }
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.499+0000 [DEBUG] turborepo_lib::daemon::connector: connected in 1622µs
2024-01-26T08:54:33.499+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:33.499+0000 [DEBUG] turborepo_lib::run: running in daemon mode
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_repository::discovery: no cached data, running primary strategy
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_repository::discovery: discovering packages using fallback strategy
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_repository::discovery: attempting primary strategy
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_repository::discovery: discovering packages using optional strategy
2024-01-26T08:54:33.500+0000 [DEBUG] turborepo_lib::run::package_discovery: discovering packages using daemon
2024-01-26T08:54:33.500+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.500+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:33.511+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T08:54:34.551+0000 [DEBUG] turborepo_repository::discovery: discovering packages using caching strategy
2024-01-26T08:54:34.551+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.551+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.551+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
• Packages in scope: @client-project/util-logger
• Running typecheck in 1 packages
• Remote caching enabled
2024-01-26T08:54:34.560+0000 [DEBUG] turborepo_lib::run::global_hash: global hash env vars []
2024-01-26T08:54:34.569+0000 [DEBUG] turborepo_lib::run::global_hash: external deps hash: f006bd3ad1ed047d
2024-01-26T08:54:34.569+0000 [DEBUG] turborepo_lib::run: global hash: 27354b572da75158
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.570+0000 [DEBUG] turborepo_telemetry::config: Telemetry config path: /Users/attila/Library/Application Support/turborepo/telemetry.json
2024-01-26T08:54:34.695+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.695+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.696+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:34.717+0000 [DEBUG] turborepo_lib::run: running visitor
2024-01-26T08:54:34.718+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @client-project/misc-eslint-config:topo
 vars: []
2024-01-26T08:54:34.718+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @client-project/misc-eslint-config#topo hash is 261fe7960581265d
2024-01-26T08:54:34.718+0000 [DEBUG] log: Engine visitor dropped callback sender without sending result
2024-01-26T08:54:34.731+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @client-project/misc-tsconfig:topo
 vars: []
2024-01-26T08:54:34.731+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @client-project/misc-tsconfig#topo hash is 69feac92806ed0d3
2024-01-26T08:54:34.731+0000 [DEBUG] log: Engine visitor dropped callback sender without sending result
2024-01-26T08:54:34.732+0000 [DEBUG] turborepo_lib::task_hash: task hash env vars for @client-project/util-logger:typecheck
 vars: []
2024-01-26T08:54:34.732+0000 [DEBUG] turborepo_lib::task_graph::visitor: task @client-project/util-logger#typecheck hash is 4b2eeb65a58601ea
2024-01-26T08:54:34.732+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T08:54:34.732+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:34.732+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:34.732+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:34.744+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:34.744+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:34.744+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:34.744+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T08:54:34.744+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T08:54:34.747+0000 [DEBUG] hyper::client::connect::http: connecting to 54.194.193.253:443
2024-01-26T08:54:34.748+0000 [DEBUG] hyper::client::connect::http: connecting to 54.73.94.89:443
2024-01-26T08:54:34.749+0000 [DEBUG] hyper::client::connect::http: connecting to 54.155.126.150:443
@client-project/util-logger:typecheck: cache miss, executing 4b2eeb65a58601ea
2024-01-26T08:54:34.753+0000 [DEBUG] turborepo_lib::process::child: waiting for task
@client-project/util-logger:typecheck:
@client-project/util-logger:typecheck: > @client-project/util-logger@0.0.0 typecheck
@client-project/util-logger:typecheck: > tsc
@client-project/util-logger:typecheck:
2024-01-26T08:54:36.240+0000 [DEBUG] turborepo_lib::process::child: child process exited normally
2024-01-26T08:54:36.240+0000 [DEBUG] turborepo_lib::process::child: child process stopped
2024-01-26T08:54:36.240+0000 [DEBUG] turborepo_lib::run::cache: caching outputs: outputs: TaskOutputs { inclusions: ["packages/utils/logger/.turbo/turbo-typecheck.log"], exclusions: [] }
2024-01-26T08:54:36.241+0000 [DEBUG] tower::buffer::worker: "processing request"
2024-01-26T08:54:36.241+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:36.241+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:36.241+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:36.241+0000 [DEBUG] log: starting new connection: https://my-api-endpoint.lambda-url.eu-west-1.on.aws/
2024-01-26T08:54:36.241+0000 [DEBUG] hyper::client::connect::dns: resolving host="my-api-endpoint.lambda-url.eu-west-1.on.aws"
2024-01-26T08:54:36.242+0000 [DEBUG] hyper::client::connect::http: connecting to 54.194.193.253:443
2024-01-26T08:54:36.243+0000 [DEBUG] hyper::client::connect::http: connecting to 54.73.94.89:443
2024-01-26T08:54:36.244+0000 [DEBUG] hyper::client::connect::http: connecting to 54.155.126.150:443
 WARNING  failed to contact remote cache: Error making HTTP request: error sending request for url (https://my-api-endpoint.lambda-url.eu-west-1.on.aws/v8/artifacts/4b2eeb65a58601ea?slug=my-team): error trying to connect: tcp connect error: Bad file descriptor (os error 9)
2024-01-26T08:54:36.245+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:36.245+0000 [DEBUG] h2::codec::framed_read: received
2024-01-26T08:54:36.245+0000 [DEBUG] h2::codec::framed_read: received

 Tasks:    1 successful, 1 total
Cached:    0 cached, 1 total
  Time:    2.754s

2024-01-26T08:54:36.267+0000 [DEBUG] turborepo_lib::process: waiting for 1 processes to exit
2024-01-26T08:54:36.268+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:36.268+0000 [DEBUG] turborepo_telemetry: telemetry sender not initialized
2024-01-26T08:54:36.268+0000 [DEBUG] turborepo_lib::cli: Skipping telemetry close - not initialized
2024-01-26T08:54:36.268+0000 [DEBUG] h2::codec::framed_write: send
2024-01-26T08:54:36.268+0000 [DEBUG] h2::proto::connection: Connection::poll; connection error

I'll continue the investigation as to why this Bad file descriptor error occurs, but any pointers from the above logs are much appreciated, thank you.

@spacedawwwg

Still present in 1.12 (and there's no --go-fallback option anymore)

WARNING failed to contact remote cache: Error making HTTP request: HTTP status client error (412 Precondition Failed) for url (https://subdomain.azurewebsites.net/v8/artifacts/0c30e2a7c696e865?slug=team_uol)

@gsoltis
Contributor

gsoltis commented Feb 7, 2024

Continuing to look into this. However, we have still been unable to reproduce it, and it looks like the 412 Precondition Failed is something that ducktors/turborepo-remote-cache returns, so we're digging into that as well.

I do believe that the artifact verification error has been addressed. If anyone is still running into this, please open a new issue.

@spacedawwwg

@ajwhitehead88 did we use ducktors / turborepo-remote-cache for our remote cache in the end?

@ajwhitehead88

> @ajwhitehead88 did we use ducktors/turborepo-remote-cache for our remote cache in the end?

Yes, we use the Docker container for ours

@gsoltis
Contributor

gsoltis commented Feb 8, 2024

I've set up ducktors/turborepo-remote-cache locally and am still unable to reproduce the problem. However, from looking at the code, it appears the Precondition Failed is returned if something here throws. That means it is either a problem reading the artifact or a problem piping it to whatever storage backend is in use.

Can you check logs from your cache, or your storage backend, or possibly instrument your cache to see what the underlying error is?
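
For anyone running the cache server from source rather than the published container, one way to act on this is to register a Fastify error handler that logs the underlying exception before it is flattened into a bare 412. The following is a minimal sketch, assuming a Fastify-based server like turborepo-remote-cache; the setup and handler placement here are illustrative, not the project's actual source:

```ts
// debug-412.ts — minimal sketch (illustrative, not turborepo-remote-cache source).
import fastify from "fastify";

const app = fastify({ logger: { level: "debug" } });

// Log the full error chain (e.g. an S3/storage failure) plus request context
// before the client sees only "412 Precondition Failed".
app.setErrorHandler((err, req, reply) => {
  req.log.error({ err, url: req.url, query: req.query }, "artifact request failed");
  reply.send(err); // fall back to Fastify's default error serialization
});

// …the real server's artifact routes would be registered here…

app.listen({ port: 3000 }, (err) => {
  if (err) throw err;
});
```

If you deploy the published container instead, raising the server's log level and checking the storage backend's own logs, as suggested above, is the equivalent step.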

@dhoulker

We are running into the same issue on version 1.12.3, but using Vercel for our remote cache.

...Finishing writing to cache... 
WARNING  failed to contact remote cache: unknown status forbidden: You don't have permission to create the cache artifact.

We've configured Vercel with turbo login and turbo link steps, and can see the expected teamId in .turbo/config.json

What steps can we follow to debug this?

@gsoltis
Contributor

gsoltis commented Feb 12, 2024

@dhoulker I think you have a different scenario; I've opened a new issue for you at #7359

chris-olszewski added a commit that referenced this issue Feb 12, 2024
### Description

Enable the feature for using native certs and not just the ones shipped
with `turbo`. See [this
readme](https://github.com/rustls/rustls-native-certs?tab=readme-ov-file#should-i-use-this-or-webpki-roots)
for a comparison between these features. If you compare the Go
implementations ([linux](https://go.dev/src/crypto/x509/root_linux.go),
[macos](https://go.dev/src/crypto/x509/root_darwin.go),
[windows](https://go.dev/src/crypto/x509/root_windows.go)), this gets us
closer to that behavior.

Both `webpki-roots` and `rustls-native-certs` can be used at the same
time and [both
sources](https://docs.rs/reqwest/latest/src/reqwest/async_impl/client.rs.html#465)
will be added to the [client when
built](https://docs.rs/reqwest/latest/src/reqwest/async_impl/client.rs.html#482)

I believe this should address
#7317 and some reports in
#6765

### Testing Instructions

Verified that the new build still works with Vercel Remote Cache. Given that
this feature is additive, I don't expect us to lose any functionality.


Closes TURBO-2333
@attila

attila commented Feb 14, 2024

Following up on #6765 (comment)

I wanted to test the "tcp connect error: Bad file descriptor (os error 9)" failure further, but I have a very limited understanding of it.
Using 1.12.4-canary.1, I tried cloning our repository into a new directory; after installing dependencies, it didn't show the "bad file descriptor" error, and it was able to write to the remote cache!

Suspecting there are ignored files in my original workspace that may affect this, I cleaned up the workspace using git clean -fdX, reinstalled dependencies, to no avail.

Not quite understanding if the daemon has anything to do with this, I tried turbo daemon clean and the --no-daemon switch, to no avail.

Any other ideas on what to look into before I try deleting the entire original workspace and re-cloning the repository? Is there a corrupted var or tmp folder that turbo relies on?

@gsoltis
Contributor

gsoltis commented Feb 14, 2024

@attila There is a config file at the XDG_CONFIG_DIR (on macOS, it's under ${HOME}/Library/Application Support/turborepo/); the daemon stuff is mostly in a /tmp/ directory and shouldn't affect the upload to the cache. Other than that, there's the local cache itself at node_modules/.cache/turbo, and task logs in each package under .turbo.
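
To make those locations concrete, here is a quick Node/TypeScript sketch that prints where to look; the platform branching and fallback paths are assumptions for illustration, not an official turbo API:

```ts
// where-turbo-keeps-state.ts — purely illustrative sketch of default locations.
import os from "node:os";
import path from "node:path";

const home = os.homedir();

const locations = {
  // global config: Application Support on macOS, XDG config dir elsewhere
  globalConfig:
    process.platform === "darwin"
      ? path.join(home, "Library", "Application Support", "turborepo")
      : path.join(process.env.XDG_CONFIG_HOME ?? path.join(home, ".config"), "turborepo"),
  // per-repo local cache
  localCache: path.join(process.cwd(), "node_modules", ".cache", "turbo"),
  // task logs live under .turbo inside each package directory
  taskLogs: path.join("<package-dir>", ".turbo"),
};

console.log(locations);
```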

Glad to hear the clean checkout is working though. I wouldn't spend too much time debugging beyond that. This issue has collected a few different problems, and I don't think anyone else has reported this specific variant.

@toakleaf

> I've set up ducktors/turborepo-remote-cache locally and am still unable to reproduce the problem. However, from looking at the code, it appears the Precondition Failed is returned if something here throws. That means it is either a problem reading the artifact or a problem piping it to whatever storage backend is in use.
>
> Can you check logs from your cache, or your storage backend, or possibly instrument your cache to see what the underlying error is?

I've encountered this issue only when using ducktors / turborepo-remote-cache with S3. We actually unhooked S3 just to get around the issue temporarily.

@spacedawwwg

Are there any alternatives to ducktors/turborepo-remote-cache that don't have this issue?

@MisterJimson

I'm seeing this as well, 1.10.16 works but newer versions do not. Also using https://github.com/ducktors/turborepo-remote-cache

@fungilation

Reporting that I'm seeing the same error on the current latest turbo v1.13.3. With blitz build:

...
 Tasks:    3 successful, 3 total
Cached:    0 cached, 3 total
  Time:    2m9.609s

>>> ...Finishing writing to cache...
 WARNING  failed to contact remote cache: skipping HTTP Request, too many failures have occurred.
Last error: error sending request for url (https://vercel.com/api/v8/artifacts/f8aa721d5242a963?teamId=<teamId>): operation timed out
  • <teamId> is correctly what's in my .turbo/config.json
  • blitz: 2.0.8
  • next: 14.3.0-canary.40

@rory-orennia

rory-orennia commented May 15, 2024

Seeing the same behaviour with the ducktors implementation of the remote cache. I did find a workaround though, so it seems like a bug in turbo.
config.json

{
  "teamid": "teamName"
  "token": "tokenHere",
  "apiurl": "https://somelambda.lambda-url.us-east-1.on.aws"
}

Doesn't work:

pnpm turbo lint
>   ...Finishing writing to cache...
WARNING  failed to contact remote cache: Error making HTTP request: HTTP status client error (400 Bad Request) for url (https://somelambda.lambda-url.us-east-1.on.aws/v8/artifacts/0a23a7bf63234b7f)

server error:

{
    "severity": "WARNING",
    "level": 40,
    "time": 1715731148245,
    "pid": 8,
    "hostname": "169.254.75.181",
    "reqId": "kthNhXIVSQGbtCCrFlxT8w-1",
    "data": null,
    "isBoom": true,
    "isServer": false,
    "output": {
        "statusCode": 400,
        "payload": {
            "statusCode": 400,
            "error": "Bad Request",
            "message": "querystring should have required property 'teamId'"
        },
        "headers": {}
    },
    "stack": "Error: querystring should have required property 'teamId'\n    at Object.handler (/var/task/index.js:909873:40)\n    at preHandlerCallback (/var/task/index.js:4200:42)\n    at preValidationCallback (/var/task/index.js:4188:9)\n    at handler2 (/var/task/index.js:4158:11)\n    at handleRequest (/var/task/index.js:4121:9)\n    at runPreParsing (/var/task/index.js:41437:9)\n    at next (/var/task/index.js:3853:11)\n    at handleResolve (/var/task/index.js:3868:9)",
    "type": "Error",
    "message": "querystring should have required property 'teamId'"
}

Does work:

pnpm turbo lint --team="teamName"

So it seems like there's something funky around teamid vs. team, since adding --team makes it send the teamId correctly. Adding "team": "teamName" to the config.json doesn't work either.
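
For context, the server error above is a JSON-schema validation failure: the artifact route requires a teamId query parameter, and Fastify rejects the request with a 400 before the handler runs. A minimal sketch of that pattern (illustrative, not the actual turborepo-remote-cache source) looks like this:

```ts
// artifact-route.ts — illustrative sketch of a schema-validated artifact route.
import fastify from "fastify";

const app = fastify();

app.put("/v8/artifacts/:hash", {
  schema: {
    querystring: {
      type: "object",
      properties: {
        teamId: { type: "string" },
      },
      // A request without ?teamId=… fails validation with 400 Bad Request and
      // the message "querystring should have required property 'teamId'".
      required: ["teamId"],
    },
  },
  handler: async (request, reply) => {
    // The real server would stream the artifact body to its storage backend here.
    return reply.code(202).send();
  },
});

app.listen({ port: 3000 }, (err) => {
  if (err) throw err;
});
```

That would explain why forcing the parameter with --team makes the request succeed against a server with this kind of schema.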

@fungilation

fungilation commented May 15, 2024

I tried turbo build --team="<team>". It doesn't work for me; I still hit this error:

>>> ...Finishing writing to cache...
 WARNING  failed to contact remote cache: skipping HTTP Request, too many failures have occurred.
Last error: error sending request for url (https://vercel.com/api/v8/artifacts/...?teamId=<team>&slug=<team>): operation timed out

On Next.js 14.3.0-canary.63

@rory-orennia

rory-orennia commented May 15, 2024

I wonder if the third-party cache connection code is a bit different from the Vercel one. On turbo v1.13.3 I ended up just moving everything to environment variables and removing the config.json. This makes it work for me without having to adjust every turbo CLI call.

~/.zshrc

export TURBO_API=https://somelambda.lambda-url.us-east-1.on.aws
export TURBO_TEAM=name
export TURBO_TOKEN=token
...
