
Support missing for contextual deserializer/serializer #357

Closed
mgravell opened this issue Jun 26, 2019 · 9 comments · Fixed by #376

mgravell (Contributor) commented Jun 26, 2019

Right now, the code hooks directly into the legacy .Serializer / .Deserializer, which provides access to the Func<T, byte[]> etc.; however, IMO this goes against the intent of the API.

The primary API is now the contextual serializer, by which I mean:

  • Grpc.Core[.Api] always uses the contextual serializer (except for some tests)
  • the contextual serializer always works (it is emulated when initialized from the legacy API), whereas the legacy serializer does not; see Marshaller.cs 44-47
// gRPC only uses contextual serializer/deserializer internally, so emulating the legacy
// (de)serializer is not necessary.
this.serializer = (msg) => { throw new NotImplementedException(); };
this.deserializer = (payload) => { throw new NotImplementedException(); };
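
For contrast, a marshaller built on the contextual overload looks roughly like this (a sketch only; MyMessage and MySerializer are placeholder names, and the exact Grpc.Core.Api signatures may differ between versions):

```csharp
using Grpc.Core;

// Sketch: a Marshaller<T> built on the contextual API rather than the
// legacy Func<T, byte[]> / Func<byte[], T> pair.
var marshaller = new Marshaller<MyMessage>(
    serializer: (MyMessage message, SerializationContext context) =>
    {
        // Complete(byte[]) hands the finished payload to gRPC.
        context.Complete(MySerializer.ToBytes(message));
    },
    deserializer: (DeserializationContext context) =>
    {
        // PayloadAsNewBuffer() exposes the incoming payload as a byte[].
        return MySerializer.FromBytes(context.PayloadAsNewBuffer());
    });
```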

This has two significant problems:

  1. any custom framework that tries to use the contextual serializer will not work from grpc-dotnet
  2. any improvements made to the core implementation of Marshaller will not be automatically exposed

For context on 2, see this PR, which does an in-place upgrade of how regular protoc-style protobuf deserializes, using the contextual API to significantly reduce buffer allocations. If grpc-dotnet does not use the contextual API, it will not benefit if/when those changes get merged.
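
To illustrate the kind of improvement meant here: a contextual deserializer can consume the payload as a ReadOnlySequence<byte> instead of forcing a fresh byte[] per message. A sketch under assumed names (MyMessage and the sequence-accepting MyMessageParser are illustrations, not the actual PR code):

```csharp
using System;
using System.Buffers;
using Grpc.Core;

Func<DeserializationContext, MyMessage> contextualDeserializer = context =>
{
    // The sequence wraps buffers the transport already holds, so no
    // per-message byte[] allocation or copy is required to read it.
    ReadOnlySequence<byte> payload = context.PayloadAsReadOnlySequence();
    return MyMessageParser.ParseFrom(payload); // hypothetical sequence-aware parser
};
```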

I am aware of #30, but IMO these are two different things.

You can also probably infer from this that I'm keen on using the contextual API, and generally improving this space - presumably looking into a new writer API too.

So, I guess my main point here is: grpc-dotnet should not use the legacy API - it should switch to the contextual API. I can offer some help with that if it would be welcome.


As a caveat / disclosure thing: I will acknowledge the remark:

/// Note: experimental API that can change or be removed without any prior notice.

but... if the API surface changes, consumers need to update anyway. I don't personally see this as a reason not to consume this API.

mgravell (Contributor, Author) commented:

I've added a PR to the core that (if accepted) will fix "1" above; fixing "2" requires taking a dependency on the experimental API.

JamesNK (Member) commented Jun 28, 2019

the contextual serializer always works (it is emulated when initialized from the legacy API), where-as the legacy serializer does not; see Marshaller.cs 44-47

Is there overhead in using the emulation?

mgravell (Contributor, Author) commented:

Basically, no; it shapes the API a little differently, but the emulation layer hides that. The writer API hasn't really been touched yet; the reader API only has two methods:

  • give me the payload as a byte[]
  • give me the payload as a ROS (ReadOnlySequence<byte>)

In either case, the emulation layer just stashes the byte[] that you already provide and re-exposes it directly (zero-copy etc.) as either the byte[] or the ROS; no allocations etc. So, with the PR as written - again: "no".
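
The emulation described above can be pictured like this (an illustrative stand-in, not the real Grpc.Core type):

```csharp
using System.Buffers;
using Grpc.Core;

// Illustrative emulation context: it stashes the byte[] the legacy path
// already produced and re-exposes it, zero-copy, in either shape.
sealed class EmulatedDeserializationContext : DeserializationContext
{
    private readonly byte[] payload;

    public EmulatedDeserializationContext(byte[] payload) => this.payload = payload;

    public override int PayloadLength => payload.Length;

    // The same array is handed straight back: no copy, no new allocation.
    public override byte[] PayloadAsNewBuffer() => payload;

    // ReadOnlySequence<byte> here is just a struct wrapper over the array.
    public override ReadOnlySequence<byte> PayloadAsReadOnlySequence()
        => new ReadOnlySequence<byte>(payload);
}
```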

For context: the Google implementation almost always uses emulation in the other direction (i.e. protobuf subscribes to the legacy API, and the implementation consumes the emulated contextual API). As a side effect, if the PR above gets accepted into the core, then leaving your code alone would mean regular protobuf uses the direct, non-emulated route, and only my custom bindings would use the emulated API.

mgravell (Contributor, Author) commented Jul 3, 2019

Based on feedback on the PR, it seems unlikely to be merged.

jtattermusch (Contributor) commented:

grpc-dotnet should switch to using Marshaller.ContextualSerializer and Marshaller.ContextualDeserializer internally instead of Marshaller.Serializer/Deserializer (there are a few occurrences in the *CallHandler classes and in the managed client).
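
In other words, where a handler currently invokes the legacy delegate, it would invoke the contextual one instead. Roughly (the handler shape and MySerializationContext are illustrative assumptions, not the actual grpc-dotnet code):

```csharp
// Before (legacy): the marshaller must hand back a new byte[] per message.
byte[] payload = method.ResponseMarshaller.Serializer(response);

// After (contextual): the handler supplies a SerializationContext and the
// marshaller writes into it, which lets a smarter marshaller avoid the
// intermediate byte[] entirely.
var serializationContext = new MySerializationContext(); // hypothetical implementation
method.ResponseMarshaller.ContextualSerializer(response, serializationContext);
byte[] written = serializationContext.GetWrittenPayload(); // hypothetical accessor
```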

JamesNK (Member) commented Jul 4, 2019

Ok! I think caching will probably be at a higher level (the entire call context) but that will be in the future. An extra allocation per message isn't the end of the world.

mgravell (Contributor, Author) commented Jul 4, 2019

You and I have very different views on allocations :)

Right now I'm on a treasure-hunt for avoidable allocations in the Grpc.Core/Grpc.Core.Api path (the latter of which you should benefit from indirectly); I would do the same for grpc-dotnet, but it is kinda awkward to work with right now unless I keep separate machines/VMs for preview6 and preview7 - and even then, it isn't the easiest build, since the dependency tree is fast-moving.

But in all seriousness: if you can avoid an alloc per call (without bending the code too far out of shape), IMO you should. They add up surprisingly quickly and can swamp the necessary allocations (i.e. the actual data payloads). On a high-throughput server, allocations and GC can quickly dominate, or at least make a server more expensive than it needs to be (whether in overhead on a box, or in actual hosting costs for cloud/hosted services), even if performance is still acceptable.

JamesNK (Member) commented Jul 4, 2019

I have plans to remove almost all non-message allocations on the server. I don't want to micro-optimize for the serialization context if code will be thrown away post 3.0.

JamesNK (Member) commented Jul 4, 2019

I created an issue for reusing the ServerCallContext. If you're interested in improving performance then you're welcome to take a look 😋

#369

https://twitter.com/JamesNK/status/1107374097846034432

@JamesNK JamesNK added this to the Preview 8 milestone Jul 8, 2019
@JamesNK JamesNK self-assigned this Jul 9, 2019