Does PowerShell support the real "Stream" response? #23783
Comments
Hi, if the existing Invoke-WebRequest and Invoke-RestMethod don't support the response mechanisms or formats you need, and you find yourself repeating the same blocks of .NET method calls against the SDK, then I would recommend building a cmdlet in C# that handles all of the networking and streams the response to the pipeline output. You can then have PowerShell functions or script blocks that are fed from the cmdlet's output pipeline. I suggest that is the closest you might get to a stream in PowerShell.
I don't believe Invoke-WebRequest supports streaming of the response, since it returns a complete response object, so you'll have to implement it yourself, either as a C# binary cmdlet or the way you do now. As an aside, it might be a worthwhile addition to add a
I suggest a proof of concept would be needed for this. Essentially you would need the HTTP response, without the content, written to the output pipeline, followed by the chunks of the response as byte[]. The next stage of the pipeline would need to deal with chunks of arbitrary size. I am hard pressed to think of an example where this complexity would be appropriate for the PowerShell programming model and level of abstraction.
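To make the proposed pipeline shape concrete, here is a minimal model sketched in Python rather than as a binary cmdlet; the `ResponseInfo` type and all names are hypothetical, purely to illustrate "response metadata first, then raw byte chunks":

```python
from dataclasses import dataclass
from typing import Iterator, Union

@dataclass
class ResponseInfo:
    """Stand-in for the response object emitted before any content."""
    status: int
    headers: dict

def stream_response(status, headers, chunks) -> Iterator[Union[ResponseInfo, bytes]]:
    # Emit the metadata first, then each content chunk as it arrives.
    yield ResponseInfo(status, headers)
    yield from chunks

# The downstream stage sees one metadata item, then arbitrary-size byte chunks.
items = list(stream_response(200, {"Content-Type": "application/json"},
                             [b"{", b'"ok":true}']))
print(items[0].status, b"".join(items[1:]))  # -> 200 b'{"ok":true}'
```

The downstream consumer has to distinguish the metadata item from the content chunks, which hints at the awkwardness the comment describes.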
I wouldn't send both the response and the content; that would be very annoying to process. Just pass through the string segments (or byte[]) as they come, or maybe split the string line by line for consistency.
Absolutely, hence the suggestion of a proof of concept to show how it would work. I don't think it is practical for the cmdlet to do much processing, given that it has no knowledge of the format of the content other than the chunk sizes read from the response. There are many text formats that have no need for any line splitting (e.g. JSON and XML), and the lines may not align with the chunks read from the response: a chunk may end mid-line. It can also be ambiguous whether the response is a text or binary format. There may be important information in the headers that the caller still needs, along with the status code.
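The mid-line chunk problem described above is usually handled with a small re-buffering stage. A minimal sketch in Python (function name is my own):

```python
def rechunk_lines(chunks):
    """Re-assemble arbitrary byte chunks into complete lines.

    Chunks may end mid-line, so leftover bytes are buffered until the
    next newline (or end of stream) arrives.
    """
    buf = b""
    for chunk in chunks:
        buf += chunk
        while (i := buf.find(b"\n")) != -1:
            yield buf[:i]
            buf = buf[i + 1:]
    if buf:            # trailing partial line with no final newline
        yield buf

# Chunk boundaries deliberately fall mid-line:
chunks = [b"first li", b"ne\nsecond", b" line\n"]
print(list(rechunk_lines(chunks)))  # -> [b'first line', b'second line']
```

Note this only makes sense for line-oriented text; as the comment says, for formats like JSON or XML, or for binary content, line splitting buys nothing.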
Prerequisites
Steps to reproduce
I am writing a PowerShell module for OpenAI (and Azure OpenAI or other GPT services). They support the "stream" mode of chat completions (https://platform.openai.com/docs/api-reference/chat/create). In other languages, for example Python, we can easily get the response with some code like the below.
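With the official `openai` Python package the loop is roughly `for chunk in client.chat.completions.create(..., stream=True): ...`, which yields each delta as it arrives. Since running that needs a live API key, here is an offline sketch of what the streamed wire format looks like and how the chunks are consumed; the payload shape mimics the chat-completions server-sent-events stream and is illustrative only:

```python
import io
import json

def iter_sse_chunks(stream):
    """Yield the JSON payload of each 'data:' line in an SSE byte stream."""
    for raw in stream:
        line = raw.decode("utf-8").strip()
        if not line.startswith("data:"):
            continue                      # skip blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":           # OpenAI's end-of-stream sentinel
            return
        yield json.loads(payload)

# Simulated response body, shaped like the OpenAI chat-completions stream:
body = io.BytesIO(
    b'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n'
    b'data: {"choices":[{"delta":{"content":"lo"}}]}\n\n'
    b'data: [DONE]\n\n'
)
text = "".join(c["choices"][0]["delta"].get("content", "")
               for c in iter_sse_chunks(body))
print(text)  # -> Hello
```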
But I didn't find a similar way to implement this in PowerShell; instead I need to write a lot of code like the code below.
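For reference, the manual workaround in PowerShell typically looks something like the following. This is a hedged sketch, not the author's exact code; the endpoint, model name, and environment variable are illustrative assumptions. The key is `HttpCompletionOption.ResponseHeadersRead`, which makes `SendAsync` return as soon as the headers arrive so the body can be read incrementally:

```powershell
# Sketch only: endpoint, model, and env-var name are illustrative assumptions.
$bodyJson = '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hi"}],"stream":true}'

$client  = [System.Net.Http.HttpClient]::new()
$request = [System.Net.Http.HttpRequestMessage]::new(
    [System.Net.Http.HttpMethod]::Post,
    'https://api.openai.com/v1/chat/completions')
$request.Headers.Add('Authorization', "Bearer $env:OPENAI_API_KEY")
$request.Content = [System.Net.Http.StringContent]::new(
    $bodyJson, [System.Text.Encoding]::UTF8, 'application/json')

# ResponseHeadersRead returns once the headers arrive, before the body is
# complete, so the content stream can be consumed chunk by chunk.
$response = $client.SendAsync(
    $request,
    [System.Net.Http.HttpCompletionOption]::ResponseHeadersRead
).GetAwaiter().GetResult()

$reader = [System.IO.StreamReader]::new(
    $response.Content.ReadAsStreamAsync().GetAwaiter().GetResult())
while (-not $reader.EndOfStream) {
    $line = $reader.ReadLine()
    # Each streamed chunk arrives as an SSE line: "data: {...json...}"
    if ($line -and $line.StartsWith('data: ') -and $line -ne 'data: [DONE]') {
        $line.Substring(6)   # emit the JSON payload to the pipeline as it arrives
    }
}
```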
Expected behavior
I want to read the chunks one by one.
Actual behavior
It seems like PowerShell supports the stream mode, but it doesn't return the response to me until it has read all the chunks.
Error details
No response
Environment data
Visuals
No response