
Streaming Response #271

Closed
leifg opened this issue Dec 18, 2018 · 3 comments · May be fixed by #498
Comments

leifg commented Dec 18, 2018

According to the README, Tesla supports streaming if the underlying adapter supports it.

From the code and the examples I found, though, it seems that streaming in this instance means streaming of the request body.
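
For reference, the request-body streaming that the README describes roughly amounts to passing an Elixir Stream as the body, assuming the adapter supports it (the client and the /upload path below are placeholders):

# Build a lazy stream of chunks and hand it to Tesla as the request body.
stream = Stream.map(1..3, fn i -> "chunk #{i}\n" end)
Tesla.post(client, "/upload", stream)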

I am interested in how I would be able to stream the response of a request (very useful for something like large downloads).

Is this supported, and if so, how?

teamon added the question label Jan 3, 2019
teamon commented Jan 3, 2019

Unfortunately, streaming of the response body is not implemented, but it could be done.
I can imagine something like:

case Tesla.get(client, "/stream", stream_response: true) do
  # on successful read of status & headers, env.body is a Stream
  {:ok, env} -> env.body
  {:error, reason} -> {:error, reason}
end

teamon commented May 8, 2022

I'd go with Tesla.get(client, "/path", response: :stream) (instead of stream_response: true)

teamon commented Apr 11, 2024

Response streaming was added for the Finch adapter in https://github.com/elixir-tesla/tesla/releases/tag/v1.9.0
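
A rough sketch of what a streamed download could look like with the Finch adapter, following the response: :stream shape discussed above; the exact option name and where it is passed are assumptions here, so check the v1.9.0 changelog and the Tesla.Adapter.Finch docs for the released API:

# Assumes a Finch pool named MyFinch is already running in the supervision tree.
client =
  Tesla.client(
    [{Tesla.Middleware.BaseUrl, "https://example.com"}],
    {Tesla.Adapter.Finch, name: MyFinch}
  )

# `response: :stream` follows the option shape proposed earlier in this thread.
case Tesla.get(client, "/large-file", opts: [adapter: [response: :stream]]) do
  {:ok, env} ->
    # env.body is expected to be enumerable; write chunks to disk as they arrive
    env.body
    |> Stream.into(File.stream!("download.bin"))
    |> Stream.run()

  {:error, reason} ->
    {:error, reason}
end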

teamon closed this as completed Apr 11, 2024