
fix: apollo federation tracer was race prone #2366

Merged
merged 1 commit into 99designs:master from roeest:tracing-race on Sep 13, 2022

Conversation

@roeest (Contributor) commented Sep 12, 2022

The Apollo federation tracer was using global state that was accessed from different goroutines, regardless of which request each goroutine was handling.
The request headers are now added to the operation context so the tracer can fetch them in InterceptOperation.
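For context, a minimal sketch of the shape of the fix, assuming the Headers field this PR adds to graphql.OperationContext and the standard Apollo federated tracing opt-in header; the remaining extension boilerplate (ExtensionName, Validate, and the actual trace bookkeeping) is omitted:

package sketch

import (
	"context"

	"github.com/99designs/gqlgen/graphql"
)

// Tracer is a stand-in for the federated tracing extension discussed here.
type Tracer struct{}

// InterceptOperation reads the incoming request headers from the operation
// context, so the decision to trace is made per request instead of through
// a shared package-level variable.
func (t Tracer) InterceptOperation(ctx context.Context, next graphql.OperationHandler) graphql.ResponseHandler {
	oc := graphql.GetOperationContext(ctx)
	// The client opts in to federated tracing with this header.
	if oc.Headers.Get("apollo-federation-include-trace") != "ftv1" {
		return next(ctx)
	}
	// Per-request trace state would be attached to ctx here, so concurrent
	// requests never share mutable state.
	return next(ctx)
}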

When running the following test (via go test -race ./...) I was able to consistently get error messages from Go's race detector.

func TestApolloTracing_Concurrent(t *testing.T) {
	h := testserver.New()
	h.AddTransport(transport.POST{})
	h.Use(&apollofederatedtracingv1.Tracer{})

	// Issue concurrent requests so the race detector can observe any
	// shared mutable state inside the tracer.
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			resp := doRequest(h, http.MethodPost, "/graphql", `{"query":"{ name }"}`)
			assert.Equal(t, http.StatusOK, resp.Code, resp.Body.String())
			var respData struct {
				Extensions struct {
					FTV1 string `json:"ftv1"`
				} `json:"extensions"`
			}
			require.NoError(t, json.Unmarshal(resp.Body.Bytes(), &respData))

			// The ftv1 extension is a base64-encoded protobuf trace.
			pbuf, err := base64.StdEncoding.DecodeString(respData.Extensions.FTV1)
			require.NoError(t, err)

			ftv1 := &generated.Trace{}
			require.NoError(t, proto.Unmarshal(pbuf, ftv1))
			require.NotZero(t, ftv1.StartTime.Nanos)
		}()
	}
	wg.Wait()
}

I also made some additional fixes to the test so it passes with the -race flag.

The tracer was using a global state across different goroutines
Added req headers to operation context to allow it to be fetched in InterceptOperation
@coveralls

Coverage Status

Coverage increased (+0.04%) to 75.475% when pulling 7435c90 on roeest:tracing-race into fc01855 on 99designs:master.

@StevenACoffman (Collaborator) left a comment

Thanks!

@StevenACoffman StevenACoffman merged commit 59493af into 99designs:master Sep 13, 2022
@roeest (Contributor, Author) commented Sep 13, 2022

Thanks for approving! Any idea when the next release will take place? I would love to get this fix into my project.

@StevenACoffman (Collaborator) commented

I'm getting test failures on Windows with some regularity:

--- FAIL: TestApolloTracing (0.01s)
    tracing_test.go:48: 
        	Error Trace:	tracing_test.go:48
        	Error:      	"252605000" is not less than "252605000"
        	Test:       	TestApolloTracing
FAIL

I can re-run it and it eventually passes, but could you take a look to see if there's something that would make it more reliable?
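The identical nanosecond values suggest a coarse clock resolution on Windows, where two timestamps captured back to back can be equal. A minimal, hypothetical sketch of a more tolerant check (the test name and variables below are illustrative; the actual assertion lives in tracing_test.go):

package tracing_test

import (
	"testing"
	"time"

	"github.com/stretchr/testify/require"
)

// Illustration of the flakiness: on a coarse clock, two timestamps taken
// back to back may be identical, so a strict "less than" assertion races
// against the clock resolution.
func TestTimestampOrdering_Sketch(t *testing.T) {
	start := time.Now()
	end := time.Now() // may equal start on Windows

	// Only require that time did not go backwards.
	require.LessOrEqual(t, start.UnixNano(), end.UnixNano())
}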

@StevenACoffman (Collaborator) commented
@roeest I went ahead and released https://github.com/99designs/gqlgen/releases/tag/v0.17.17 but I would like you to look at improving your test reliability if possible.
