Add support for HTTP(S) OTLP endpoint #4197
base: main
Conversation
```csharp
private static async Task<byte[]> ReadFullyAsync(HttpContext context)
{
    using var ms = new MemoryStream();
    await context.Request.Body.CopyToAsync(ms).ConfigureAwait(false);
    return ms.ToArray();
}
```
This should be using pipelines to buffer without allocating a `MemoryStream` and a `byte[]`.
How do I achieve this? `Google.Protobuf.MessageParser<T>` wants either a `ReadOnlySequence<byte>`, a `ReadOnlySpan<byte>`, or a `Stream`, but the `HttpContext.Request.Body` stream cannot be passed directly.
@davidfowl I got rid of the byte[] and stream allocation
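A minimal sketch of the pipeline-based approach discussed above (the helper name `ParseRequestAsync` is illustrative, not from the PR): `HttpContext.Request.BodyReader` exposes a `PipeReader`, and `MessageParser<T>.ParseFrom` accepts the `ReadOnlySequence<byte>` it produces, so no intermediate `MemoryStream` or `byte[]` is needed.

```csharp
private static async Task<T> ParseRequestAsync<T>(HttpContext context, MessageParser<T> parser)
    where T : IMessage<T>
{
    var reader = context.Request.BodyReader;
    while (true)
    {
        var result = await reader.ReadAsync().ConfigureAwait(false);
        if (result.IsCompleted)
        {
            try
            {
                // ParseFrom accepts a ReadOnlySequence<byte> directly.
                return parser.ParseFrom(result.Buffer);
            }
            finally
            {
                reader.AdvanceTo(result.Buffer.End);
            }
        }

        // Body not fully received yet: consume nothing, mark everything as
        // examined so the next ReadAsync waits for more data.
        reader.AdvanceTo(result.Buffer.Start, result.Buffer.End);
    }
}
```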
The formatting changes make it really hard to review. I don't know if you can undo those, but with a big change like this it really makes the diff bigger than it needs to be.
I reverted the format changes. Sorry for that.
Are there tests that can be written for this new functionality?
Looks like a good addition, thanks. Left a few comments.
It'd be great to add a playground application that exercises these options, so there is an F5-able test bed for manual testing.
There are integration tests for the current gRPC-based OTLP interactions. Having similar tests for the HTTP interactions would be valuable to ensure this scenario doesn't break.
I added integration tests for all endpoints.
Some small nits but also big issues. I need to read the OTLP HTTP spec to understand what exactly we need to support.
I'm on leave at the moment, but when I get back (beginning of June) are you ok with me making changes to your PR? I think communicating everything via comments will be too much overhead.
```csharp
return await ExportOtlpData(httpContext,
    sequence => service.Export(ExportMetricsServiceRequest.Parser.ParseFrom(sequence), null!))
    .ConfigureAwait(false);
```
Is this API accepting a protobuf request message and then returning a JSON response message? I couldn't see any Protobuf serialization code so I assume that the MapPost func is returning a value and ASP.NET Core is doing the serialization, which is always JSON.
I need to read the specification for OTLP+HTTP. I know it supports both Protobuf and JSON, but both in one request seems very weird. I think to support both the API method would have to look at the request content-type and do some content negotiation. I've yet to do that sort of thing inside minimal API methods so I don't know how hard/easy it is.
Also, I'm surprised the Protobuf response message just works when serialized to JSON if the .NET serializer is just accepting a plain old C# class and serializing it. Protobuf has rules about how it maps Protobuf messages to JSON (and unfortunately OTLP+HTTP/JSON breaks some of them). Maybe it works because the export response message is so simple.
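A rough sketch of what content negotiation inside the minimal API handler could look like, assuming only Protobuf is supported initially (the route and status-code handling here are assumptions, not the PR's actual code). `IMessage.WriteTo` can write directly to `Response.BodyWriter` (a `PipeWriter` implements `IBufferWriter<byte>`), bypassing ASP.NET Core's default JSON serialization of the return value:

```csharp
app.MapPost("/v1/metrics", async (HttpContext httpContext) =>
{
    switch (httpContext.Request.ContentType)
    {
        case "application/x-protobuf":
            var response = new ExportMetricsServiceResponse();
            // The OTLP/HTTP spec requires the response to use the same
            // encoding as the request, so serialize the protobuf message
            // manually instead of returning it for JSON serialization.
            httpContext.Response.ContentType = "application/x-protobuf";
            response.WriteTo(httpContext.Response.BodyWriter);
            await httpContext.Response.BodyWriter.FlushAsync();
            break;
        default:
            // application/json (OTLP/JSON) not supported yet; it has its
            // own Protobuf-to-JSON mapping rules.
            httpContext.Response.StatusCode = StatusCodes.Status415UnsupportedMediaType;
            break;
    }
});
```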
I think it will be much easier to initially only support OTLP+HTTP with Protobuf. Supporting JSON will require a lot of custom JSON serialization.
The OTLP .NET SDK supports only HTTP/Protobuf and gRPC (https://github.com/open-telemetry/opentelemetry-dotnet/blob/main/src/OpenTelemetry.Exporter.OpenTelemetryProtocol/README.md). I suggest waiting on JSON until the OTLP .NET SDK also supports it.
Regarding the return value, you are probably right. I have to check the spec here.
The OTLP .NET HTTP exporter pretty much just checks the status code of the response and stores the headers in the failure case:
```csharp
        .ConfigureAwait(false);
    }
).RequireAuthorization(OtlpAuthorization.PolicyName)
 .RequireHost($"*:{httpEndpoint.Port}");
```
I used connection middleware to ensure OTLP + gRPC request came in on the right endpoint. I'd like OTLP + HTTP to be consistent.
There are probably some relatively modest changes/improvements to reuse the same code for both.
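For reference, Kestrel connection middleware bound to a single listener could look roughly like this (a sketch only; `otlpHttpPort` and the `Items` key are placeholders, and the real gRPC implementation in the dashboard may differ):

```csharp
builder.WebHost.ConfigureKestrel(kestrel =>
{
    kestrel.ListenAnyIP(otlpHttpPort, listenOptions =>
    {
        listenOptions.Use(next => connectionContext =>
        {
            // Only connections that arrive on this listener pass through
            // here, so tag them and let endpoint code verify the tag,
            // mirroring the approach used for the gRPC OTLP endpoint.
            connectionContext.Items["OtlpHttpConnection"] = true;
            return next(connectionContext);
        });
    });
});
```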
@JamesNK any idea how to make this work elegantly?
@JamesNK It's a big honor to receive a review from Newtonsoft.Json's author. Thanks for all your work!
```csharp
// https://docs.microsoft.com/en-us/dotnet/standard/garbage-collection/large-object-heap
private const int MaxSizeLessThanLOH = 84999;

public static WebApplication ConfigureHttpOtlp(this WebApplication app, Uri httpEndpoint)
```
I think you have to configure CORS too, so browsers can send data to this endpoint.
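If CORS were added, a narrowly-scoped policy would be the cautious starting point. This is a hypothetical sketch (the origin, policy name, and route are placeholders), not a proposal from the PR:

```csharp
builder.Services.AddCors(options =>
{
    options.AddPolicy("OtlpHttp", policy =>
        // Restrict to explicitly configured origins; never AllowAnyOrigin
        // for a telemetry-ingestion endpoint.
        policy.WithOrigins("https://my-frontend.example")
              .WithMethods("POST")
              .WithHeaders("Content-Type"));
});

var app = builder.Build();
app.UseCors();
app.MapPost("/v1/traces", () => Results.Ok())
   .RequireCors("OtlpHttp");
```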
Let's avoid the CORS configuration until we understand what it means to do this securely.
Hi Aspire team,
Great work with the Aspire Dashboard.
I'm currently looking for a replacement for our current OTLP backend (https://github.com/SigNoz/signoz), and the Aspire Dashboard seems to be a lightweight solution.
However, we are relying on the HTTP OTLP receiver (#3688), so I tried my best to add HTTP OTLP endpoint support, and it works quite well in our use case.
With these changes the Aspire Dashboard can consume OTLP over gRPC and HTTP(S) simultaneously. Because it uses HTTP/Protobuf, I could reuse the existing gRPC OTLP models and services.