
Feature Request: Add retry mechanism #778

Open · 1 of 2 tasks
YuviGold opened this issue Oct 6, 2022 · 4 comments
Comments

YuviGold (Contributor) commented Oct 6, 2022

Expected Behaviour

arkade is being used in CI pipelines and from time to time might fail on network hiccups.
It would be great if it could internally retry its HTTP requests on common failures.

Current Behaviour

In the following examples you can see network failures that caused arkade to exit with an error without retrying:

[2022-10-05T12:40:07.291Z] Downloading: terragrunt
[2022-10-05T12:40:07.291Z] 2022/10/05 12:40:07 Looking up version for terragrunt
[2022-10-05T12:40:07.291Z] 2022/10/05 12:40:07 Found: v0.39.0
[2022-10-05T12:40:07.292Z] Downloading: https://github.com/gruntwork-io/terragrunt/releases/download/v0.39.0/terragrunt_linux_amd64
[2022-10-05T12:40:13.910Z] Error: incorrect status for downloading tool: 503
[2022-10-06T00:02:02.842Z] Downloading: terragrunt
[2022-10-06T00:02:02.842Z] 2022/10/06 00:02:00 Looking up version for terragrunt
[2022-10-06T00:02:11.003Z] Error: Head "https://github.com/gruntwork-io/terragrunt/releases/latest": context deadline exceeded (Client.Timeout exceeded while awaiting headers)

Are you a GitHub Sponsor yet (Yes/No?)

  • Yes
  • No

Possible Solution

Using https://github.com/hashicorp/go-retryablehttp would not require any refactoring; it just wraps the familiar HTTP client interface with automatic retries and exponential backoff.
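A minimal sketch of what that could look like, assuming the download code keeps using a plain *http.Client obtained via StandardClient(); the terragrunt URL is just the example from the logs above, and the retry limits are placeholders rather than proposed defaults:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"time"

	"github.com/hashicorp/go-retryablehttp"
)

func main() {
	// NewClient() comes with sane defaults (4 retries, exponential backoff);
	// the values below only show how they could be tuned.
	client := retryablehttp.NewClient()
	client.RetryMax = 3
	client.RetryWaitMin = 1 * time.Second
	client.RetryWaitMax = 30 * time.Second

	// StandardClient() returns a plain *http.Client, so existing call sites
	// that already take an http.Client would not need to change.
	httpClient := client.StandardClient()

	url := "https://github.com/gruntwork-io/terragrunt/releases/download/v0.39.0/terragrunt_linux_amd64"
	resp, err := httpClient.Get(url)
	if err != nil {
		log.Fatalf("download failed after retries: %v", err)
	}
	defer resp.Body.Close()

	n, _ := io.Copy(io.Discard, resp.Body)
	fmt.Printf("fetched %d bytes, status %d\n", n, resp.StatusCode)
}
```

Its default retry policy treats connection errors and most 5xx responses as retryable, so both failures in the logs above (the 503 and the client timeout) would get retried with backoff.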

Context

Your Environment

  • What arkade version is this?
arkade version

            _             _      
  __ _ _ __| | ____ _  __| | ___ 
 / _` | '__| |/ / _` |/ _` |/ _ \
| (_| | |  |   < (_| | (_| |  __/
 \__,_|_|  |_|\_\__,_|\__,_|\___|

Open Source Marketplace For Developer Tools

Version: 0.8.44
Git Commit: b543e60a68285ce8a147d9b0a6493f573747b8d1

 🐳 arkade needs your support: https://github.com/sponsors/alexellis

alexellis (Owner) commented

Hi @YuviGold

Thanks for your interest in arkade.

This is the first time I'm hearing of issues with GitHub's releases pages, which use S3 underneath.

How often do you get a 503 from GitHub / S3, and do you know why this affects you specifically?

I think that we'd have to hear pain from a bunch more users to consider changing the code.

What company is this for? Perhaps if they wanted to become a GitHub Sponsor, it might make this worth considering.

Alex

Jasstkn (Contributor) commented Oct 24, 2022

FYI: There are some examples in the e2e-tests

alexellis (Owner) commented

@Jasstkn thanks

Can you have the tests print out headers on error, to see if GitHub is trying to tell us something?
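Not speaking for the actual e2e suite, but a rough sketch of a test helper that dumps the response headers on a non-2xx status could look like this (the package and checkDownload name are hypothetical, not arkade's test code):

```go
package e2e

import (
	"net/http"
	"net/http/httputil"
	"testing"
)

// checkDownload fetches a release URL and, on a non-2xx status, dumps the
// response headers so we can see whether GitHub is sending rate-limit or
// Retry-After hints alongside the 500/503.
func checkDownload(t *testing.T, url string) {
	t.Helper()

	resp, err := http.Get(url)
	if err != nil {
		t.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode < 200 || resp.StatusCode > 299 {
		dump, _ := httputil.DumpResponse(resp, false) // headers only, skip the body
		t.Fatalf("unexpected status %d for %s\nresponse headers:\n%s", resp.StatusCode, url, dump)
	}
}
```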

Jasstkn (Contributor) commented Feb 2, 2023

> @Jasstkn thanks
>
> Can you have the tests print out headers on error, to see if GitHub is trying to tell us something?

Usually it is a server error (503, 500, and so on). It can be overload on GitHub's side or just a network glitch. I think we may want to introduce this as an option to the arkade get command, e.g. --retry=2. This could cover some cases for CI/CD pipelines as well.
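For illustration, a rough sketch of the behaviour such a --retry flag could enable: retry on transport errors and 5xx responses with exponential backoff. The function name, backoff values, and the default of 2 are assumptions for the sketch, not arkade's current implementation:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// downloadWithRetry retries the download on transport errors and 5xx
// responses, backing off 1s, 2s, 4s, ... between attempts.
func downloadWithRetry(url string, retries int) (*http.Response, error) {
	var lastErr error
	backoff := time.Second

	for attempt := 0; attempt <= retries; attempt++ {
		if attempt > 0 {
			time.Sleep(backoff)
			backoff *= 2
		}

		resp, err := http.Get(url)
		if err != nil {
			lastErr = err
			continue
		}
		if resp.StatusCode >= 500 {
			resp.Body.Close()
			lastErr = fmt.Errorf("incorrect status for downloading tool: %d", resp.StatusCode)
			continue
		}
		return resp, nil
	}

	return nil, fmt.Errorf("giving up after %d attempt(s): %w", retries+1, lastErr)
}

func main() {
	url := "https://github.com/gruntwork-io/terragrunt/releases/download/v0.39.0/terragrunt_linux_amd64"
	if resp, err := downloadWithRetry(url, 2); err != nil {
		fmt.Println(err)
	} else {
		resp.Body.Close()
		fmt.Println("downloaded with status", resp.StatusCode)
	}
}
```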
