
Extraneous Partial Success Error Message Exporting Traces to OTel Collector #3432

Closed
markhildreth-gravity opened this issue Oct 30, 2022 · 3 comments · Fixed by #3438
Labels: bug (Something isn't working) · help wanted (Extra attention is needed)

Comments

markhildreth-gravity commented Oct 30, 2022

Description

I have set up the Go OpenTelemetry libraries to export trace spans to an OpenTelemetry Collector using OTLP over gRPC. However, even though all spans are exported to the collector correctly, I am getting error messages in the OpenTelemetry log of my Go application that look like the following:

2022/10/30 22:55:09 OTLP partial success: empty message (0 spans rejected)

This message repeats as more spans are sent. I believe what is happening is that the spans are sent and the response is a full success, but the Go library mistakes it for a partial success. See the code below:

return c.requestFunc(ctx, func(iCtx context.Context) error {
    resp, err := c.tsc.Export(iCtx, &coltracepb.ExportTraceServiceRequest{
        ResourceSpans: protoSpans,
    })
    if resp != nil && resp.PartialSuccess != nil {
        otel.Handle(internal.PartialSuccessToError(
            internal.TracingPartialSuccess,
            resp.PartialSuccess.RejectedSpans,
            resp.PartialSuccess.ErrorMessage,
        ))
    }
    // nil is converted to OK.
    if status.Code(err) == codes.OK {
        // Success.
        return nil
    }
    return err
})

I added a fmt.Printf("%#v\n%#v", resp, resp.PartialSuccess) to this code just before line 204 (the if statement) and got this:

&v1.ExportTraceServiceResponse{
    state:impl.MessageState{
        NoUnkeyedLiterals:pragma.NoUnkeyedLiterals{},
        DoNotCompare:pragma.DoNotCompare{},
        DoNotCopy:pragma.DoNotCopy{},
        atomicMessageInfo:(*impl.MessageInfo)(0x4000580948)
    },
    sizeCache:0,
    unknownFields:[]uint8(nil),
    PartialSuccess:(*v1.ExportTracePartialSuccess)(0x40006404c0)
}
&v1.ExportTracePartialSuccess{
    state:impl.MessageState{
        NoUnkeyedLiterals:pragma.NoUnkeyedLiterals{},
        DoNotCompare:pragma.DoNotCompare{},
        DoNotCopy:pragma.DoNotCopy{},
        atomicMessageInfo:(*impl.MessageInfo)(nil)
    },
    sizeCache:0,
    unknownFields:[]uint8(nil),
    RejectedSpans:0,
    ErrorMessage:""
}

As you can see, there were no rejected spans and no error message, so I believe this should be seen as a full success, and thus the log message is not needed.

Environment

  • OS: alpine 3.16.2
  • Architecture: amd64
  • Go Version: 1.18
  • opentelemetry-go version: v1.11.1 (2fe8861)
  • OTel Collector: 0.63.0

Steps To Reproduce

1.) Set up an OTel Collector that accepts traces over gRPC (in my case, without TLS).
2.) Set up a Go application that uses the otel library to export traces to that collector (a minimal sketch follows this list).
3.) Observe error messages on stderr even though the traces are exported correctly.
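
For reference, a minimal reproduction sketch. It assumes a collector listening on localhost:4317 without TLS; the endpoint and the "repro" tracer name are placeholders, and the package paths match the opentelemetry-go version listed above.

package main

import (
    "context"
    "log"
    "time"

    "go.opentelemetry.io/otel"
    "go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc"
    sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
    ctx := context.Background()

    // Exporter pointed at the local collector's OTLP/gRPC endpoint.
    // "localhost:4317" is an assumption (the default OTLP gRPC port).
    exp, err := otlptracegrpc.New(ctx,
        otlptracegrpc.WithEndpoint("localhost:4317"),
        otlptracegrpc.WithInsecure(),
    )
    if err != nil {
        log.Fatal(err)
    }

    tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exp))
    defer tp.Shutdown(ctx)
    otel.SetTracerProvider(tp)

    // Emit a few spans; each exported batch triggers the spurious
    // "OTLP partial success: empty message (0 spans rejected)" log line.
    tracer := otel.Tracer("repro")
    for i := 0; i < 5; i++ {
        _, span := tracer.Start(ctx, "work")
        time.Sleep(10 * time.Millisecond)
        span.End()
    }
}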

Expected behavior

The traces should continue to export successfully, but no error messages should be printed, since a partial success did not actually occur.

@markhildreth-gravity added the "bug (Something isn't working)" label Oct 30, 2022
@markhildreth-gravity changed the title from "Extraneous Error Message Exporting Traces to OTel Collector" to "Extraneous Partial Success Error Message Exporting Traces to OTel Collector" Oct 30, 2022
@dmathieu (Member)

I wonder whether this is an issue in the collector rather than in this library.
If the collector returns a partial success, we should expect at least one error. Otherwise, it should return a full success.

@markhildreth-gravity (Author)

@dmathieu That may certainly be a possibility. Unfortunately, I am not adept enough with Wireshark to see what the collector was actually returning, so I could not verify which side is at fault.

@Aneurysm9 (Member)

I believe this to have been caused by this change in the collector. This comment on the protobuf indicates that we need to check for at least a non-empty error_message field before treating it as an error/warning.
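
A hedged sketch of what such a guard could look like in the snippet quoted above (the exact change that landed in #3438 may differ); GetRejectedSpans and GetErrorMessage are the generated protobuf accessors on the partial-success message:

if resp != nil && resp.PartialSuccess != nil {
    msg := resp.PartialSuccess.GetErrorMessage()
    n := resp.PartialSuccess.GetRejectedSpans()
    // Only report a partial success that actually rejected spans or
    // carries a non-empty error message; an all-zero/empty message is
    // equivalent to the field being unset.
    if n != 0 || msg != "" {
        otel.Handle(internal.PartialSuccessToError(
            internal.TracingPartialSuccess, n, msg,
        ))
    }
}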

@MrAlias added the "help wanted (Extra attention is needed)" label Oct 31, 2022
@MrAlias added this to the Release v1.12.0 milestone Oct 31, 2022
@MrAlias self-assigned this Nov 2, 2022
MrAlias added a commit to MrAlias/opentelemetry-go that referenced this issue Nov 2, 2022
Fix open-telemetry#3432.

The OTLP server will respond with empty partial success responses (i.e.
empty messages and 0 count). Treat these as equivalent to the field not being
set or present, as the documentation in the proto specifies:
https://github.com/open-telemetry/opentelemetry-proto/blob/724e427879e3d2bae2edc0218fff06e37b9eb46e/opentelemetry/proto/collector/trace/v1/trace_service.proto#L58
@MrAlias closed this as completed in b5b6852 Nov 3, 2022
kzys added a commit to kzys/containerd that referenced this issue Dec 20, 2022
This upgrade fixes open-telemetry/opentelemetry-go#3432, which made OpenTelemetry too noisy to use from ctr.

Signed-off-by: Kazuyoshi Kato <katokazu@amazon.com>