Using PrepareMsg API in ServerStream #3489

Closed
eafzali opened this issue Mar 31, 2020 · 2 comments · Fixed by #3480
Labels: fixit, P2, Type: Feature (new features or improvements in behavior)

Comments

@eafzali (Contributor)

eafzali commented Mar 31, 2020

Use case(s) - what problem will this feature solve?

We have a server that sends very large payloads. Our benchmarks show that most of the CPU time on the server is spent on serialization. We tried caching the PreparedMsgs and sending them, and we saw a huge improvement.
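
For illustration, a rough sketch of how the server-side usage could look, assuming `ServerStream.SendMsg` accepted a `*grpc.PreparedMsg` as proposed in #3480 (the `pb` package, service, and message names below are hypothetical):

```go
package server

import (
	"google.golang.org/grpc"

	pb "example.com/feed/feedpb" // hypothetical generated package
)

// streamSnapshots serializes the large payload once and reuses the
// prepared bytes for every subsequent send on the stream.
func streamSnapshots(stream pb.Feed_SubscribeServer, payload *pb.Snapshot) error {
	// Encode once; the serialized form is cached inside the PreparedMsg.
	prepared := &grpc.PreparedMsg{}
	if err := prepared.Encode(stream, payload); err != nil {
		return err
	}
	// Sending the *PreparedMsg would skip re-serialization on each call,
	// which is where the CPU savings come from.
	for i := 0; i < 100; i++ {
		if err := stream.SendMsg(prepared); err != nil {
			return err
		}
	}
	return nil
}
```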

Proposed Solution

#3480

Alternatives Considered

We are currently using a custom Codec and sending []byte in stream.SendMsg() as a workaround, roughly as sketched below.
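
A minimal sketch of that workaround (not our exact code): a passthrough codec registered under the "proto" name that wraps the default proto codec, so values that are already []byte bypass marshaling.

```go
package server

import (
	"google.golang.org/grpc/encoding"
	_ "google.golang.org/grpc/encoding/proto" // ensure the default proto codec is registered
)

// rawCodec wraps the default proto codec and passes []byte payloads
// through unchanged, so pre-serialized messages skip marshaling.
type rawCodec struct {
	parent encoding.Codec
}

func (c rawCodec) Marshal(v interface{}) ([]byte, error) {
	if b, ok := v.([]byte); ok {
		return b, nil // already serialized; send as-is
	}
	return c.parent.Marshal(v)
}

func (c rawCodec) Unmarshal(data []byte, v interface{}) error {
	if b, ok := v.(*[]byte); ok {
		*b = data
		return nil
	}
	return c.parent.Unmarshal(data, v)
}

// Name keeps the "proto" name so this codec is used for regular proto RPCs too.
func (c rawCodec) Name() string { return "proto" }

func init() {
	encoding.RegisterCodec(rawCodec{parent: encoding.GetCodec("proto")})
}
```

The handler then marshals the proto message once itself and passes the resulting []byte to SendMsg for each send, so serialization happens only once per payload.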

Additional Context

#2432

@eafzali added the Type: Feature label on Mar 31, 2020
@menghanl added the P2 label on Apr 30, 2020
@easwars (Contributor)

easwars commented May 4, 2021

Assigning to @dfawley, who is the assigned reviewer of #3480.

@easwars (Contributor)

easwars commented May 4, 2021

Assigning the fixit label to make a decision on whether we want to proceed with this change or not.

@github-actions bot locked as resolved and limited conversation to collaborators on Nov 18, 2021