
download_cached_file should retry if there is an HTTP error code #3804

Open · jayvdb opened this issue Feb 28, 2017 · 10 comments · May be fixed by #6099

Comments

@jayvdb (Member) commented Feb 28, 2017

Follows on from #3803

The interface of Bear.download_cached_file isn't suited to failure; it needs to retry if possible.

@jayvdb (Member, Author) commented Feb 28, 2017

Note that, unlike #3803, this is not a regression: the pre-requests code didn't retry either.

@jayvdb (Member, Author) commented Feb 28, 2017

Note that requests has native max_retries logic which we can use. It requires working at the requests session and adapter level.

I've also looked for a sane requests wrapper that provides simplified retries and, oddly, have not found one.
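A minimal sketch of what that session-and-adapter wiring might look like, assuming urllib3's Retry class is available (the names and values here are illustrative, not coala's actual code):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Configure retries on transient server errors; values are illustrative.
retry = Retry(total=3,
              backoff_factor=0.5,
              status_forcelist=(500, 502, 503, 504))

# Mount an adapter carrying the retry policy on both URL schemes.
session = requests.Session()
adapter = HTTPAdapter(max_retries=retry)
session.mount('http://', adapter)
session.mount('https://', adapter)
```

Requests made through `session` then retry automatically at the transport level, with no changes to the calling code.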

@jayvdb (Member, Author) commented Mar 1, 2017

Note: if there is a retry, then presumably there needs to be a new status-code check somewhere, but this doesn't appear to be documented anywhere in requests; see #3805 (comment).

@jayvdb (Member, Author) commented Apr 18, 2018

It looks like requests now supports max_retries everywhere, so this should be fairly simple.

@jayvdb (Member, Author) commented Apr 18, 2018

Optional: if a transient error occurred, such as a 5xx status code, the bear should be disabled with a warning.

However, this complexity can be deferred and added later in #3332.

@xurror commented Mar 19, 2019

I would like to be assigned this issue

@gr455 commented Dec 11, 2019

What value of max_retries is expected, and what would be an optimal backoff_factor for the retries? Also, what should be done if the error code persists after max_retries?
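For context on how backoff_factor behaves: urllib3's documented formula sleeps roughly backoff_factor * (2 ** (retry_number - 1)) seconds between attempts. The sketch below is illustrative only, not urllib3's actual code (older urllib3 versions also skip the sleep before the first retry):

```python
def backoff_delays(backoff_factor, retries):
    """Illustrative delays following urllib3's documented formula:
    backoff_factor * (2 ** (retry_number - 1)) for each retry."""
    return [backoff_factor * (2 ** (n - 1)) for n in range(1, retries + 1)]

# With backoff_factor=0.5 and 3 retries the waits grow geometrically:
print(backoff_delays(0.5, 3))  # [0.5, 1.0, 2.0]
```

So even a small backoff_factor keeps total wait time bounded while spacing out the later attempts.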

@gr455 commented Dec 11, 2019

If max_retries is hit and the error code persists, a RetryError is raised, which causes the pytest run to fail. How do I handle that failure?
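One way to keep the test suite green is to assert that failure explicitly with pytest.raises. A minimal sketch, where flaky_download is a hypothetical stand-in for the real download call:

```python
import pytest
from requests.exceptions import RetryError

def flaky_download():
    # Stand-in for download_cached_file when the adapter's retries are
    # exhausted (assumption: it lets RetryError propagate to the caller).
    raise RetryError("max retries exceeded")

def test_download_raises_retry_error():
    # Assert the expected failure instead of letting it crash the run.
    with pytest.raises(RetryError):
        flaky_download()
```

The test then passes precisely because the exhausted retries raise RetryError.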

@gr455 commented Dec 14, 2019

How about handling the RetryError by attempting the download once more (in the except clause), so that HTTPError can be raised with the latest status code?
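That idea could be sketched as follows; fetch_with_fallback and its two callable parameters are hypothetical names used only for illustration, not part of coala's or requests' API:

```python
from requests.exceptions import RetryError

def fetch_with_fallback(fetch_with_retries, fetch_once):
    """Illustrative helper: run the request through the retrying session
    first; if the adapter gives up with RetryError, fall back to a single
    plain attempt so a persistent server error surfaces with the latest
    status code (e.g. as an HTTPError from raise_for_status)."""
    try:
        return fetch_with_retries()
    except RetryError:
        # One extra plain attempt; its error carries the final status.
        return fetch_once()
```

Callers can then catch HTTPError and inspect the status code directly, instead of dealing with an opaque RetryError.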

@gr455 gr455 linked a pull request Dec 16, 2019 that will close this issue
@gr455 commented Dec 16, 2019

Please review the PR and suggest changes


3 participants