Is there currently any support for handling custom retries? #405

Open
jsungg opened this issue Feb 20, 2024 · 1 comment

Comments

jsungg commented Feb 20, 2024

Hi there. I looked through the docs to see if there is any support for explicitly setting the number of retries for the Google Storage client when writing to a cloud bucket using a CloudPath object. Specifically, in my use case I read from and write to a GCS bucket many times and sometimes get a 503 error on PUT requests, and our infra wants to retry more aggressively than the default GCS retry policy.

Seems like this problem happened before in this issue: #267

pjbull (Member) commented Feb 23, 2024

It looks like the GCS retry functionality is per-method (rather than being set at a client level). We just took a PR with a similar structure for chunked downloads, so I could see implementing something like that. We could accept a ConditionalRetryPolicy object as a retry_policy kwarg for the GSClient and then use it to override the default if it is set.

The other option would be to implement it in your code using a library like tenacity wherever you have methods that use CloudPaths to do reading/writing.
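To illustrate the second option: tenacity lets you decorate any function that performs CloudPath reads/writes with a retry policy (e.g. `@retry(wait=wait_exponential(...), retry=retry_if_exception_type(...))`). The sketch below shows the same idea using only the standard library, so it makes no assumptions about installed packages; `TransientServerError`, `retry_on_error`, and `flaky_upload` are hypothetical names for illustration, standing in for a 503-style GCS error and an upload routine.

```python
import random
import time
from functools import wraps


class TransientServerError(Exception):
    """Stand-in for a transient 503-style error from GCS (hypothetical name)."""


def retry_on_error(max_attempts=5, base_delay=0.5, exc_types=(TransientServerError,)):
    """Retry the wrapped call with exponential backoff and jitter."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exc_types:
                    if attempt == max_attempts:
                        raise  # out of attempts; surface the error
                    # Exponential backoff: base, 2*base, 4*base, ... plus jitter.
                    time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.1))
        return wrapper
    return decorator


@retry_on_error(max_attempts=4, base_delay=0.01)
def flaky_upload(state):
    """Simulated upload that fails twice before succeeding."""
    state["calls"] += 1
    if state["calls"] < 3:
        raise TransientServerError("503 PUT failed")
    return "ok"
```

With tenacity you would get equivalent behavior (plus logging hooks, stop conditions, etc.) without writing the decorator yourself; the point is that the retry logic wraps your code that touches CloudPaths rather than the client itself.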
