This repository has been archived by the owner on Jul 31, 2023. It is now read-only.

Goroutine leak #12

Open
prestonvanloon opened this issue Aug 13, 2019 · 2 comments

Labels
bug Something isn't working

prestonvanloon commented Aug 13, 2019

Please answer these questions before submitting a bug report.

What version of the Exporter are you using?

5b8293c

What version of OpenCensus are you using?

7bbec1755a8162b5923fc214a494773a701d506a

What version of Go are you using?

1.11

What did you do?

I am not sure how to reproduce.

What did you expect to see?

A low number of goroutines.

What did you see instead?

50k+ goroutines.

goroutine profile: total 49263
27310 @ 0x43236f 0x442d29 0x442cff 0x442a9d 0x4764d9 0x474804 0x114618f 0x11466d1 0x4600c1
#	0x442a9c	sync.runtime_SemacquireMutex+0x3c						GOROOT/src/runtime/sema.go:71
#	0x4764d8	sync.(*Mutex).Lock+0x108							GOROOT/src/sync/mutex.go:134
#	0x474803	sync.(*Cond).Wait+0xb3								GOROOT/src/sync/cond.go:57
#	0x114618e	google.golang.org/api/support/bundler.(*Bundler).acquire+0x8e			external/org_golang_google_api/support/bundler/bundler.go:286
#	0x11466d0	google.golang.org/api/support/bundler.(*Bundler).startFlushLocked.func1+0x90	external/org_golang_google_api/support/bundler/bundler.go:271

21851 @ 0x43236f 0x443d19 0x443cef 0x4747ee 0x114618f 0x11466d1 0x4600c1
#	0x443cee	sync.runtime_notifyListWait+0xce						GOROOT/src/runtime/sema.go:510
#	0x4747ed	sync.(*Cond).Wait+0x9d								GOROOT/src/sync/cond.go:56
#	0x114618e	google.golang.org/api/support/bundler.(*Bundler).acquire+0x8e			external/org_golang_google_api/support/bundler/bundler.go:286
#	0x11466d0	google.golang.org/api/support/bundler.(*Bundler).startFlushLocked.func1+0x90	external/org_golang_google_api/support/bundler/bundler.go:271
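
A goroutine dump in this text format can be captured from a running binary with net/http/pprof; a minimal sketch, where the listen address is an arbitrary choice for illustration:

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/* handlers on http.DefaultServeMux
)

func main() {
	// Fetch http://localhost:6060/debug/pprof/goroutine?debug=1 to get a
	// text goroutine profile like the one above.
	log.Fatal(http.ListenAndServe("localhost:6060", nil))
}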

Additional context

This library is the only one in our binary that uses google.golang.org/api/support/bundler.

This is how we have configured it:

exporter, err := jaeger.NewExporter(jaeger.Options{
	CollectorEndpoint: endpoint,
	Process: jaeger.Process{
		ServiceName: serviceName,
		Tags: []jaeger.Tag{
			jaeger.StringTag("process_name", processName),
			jaeger.StringTag("version", version.GetVersion()),
		},
	},
	BufferMaxCount: 256 * 1e6, // 256MB
	OnError: func(err error) {
		log.WithError(err).Error("Failed to process span")
	},
})
prestonvanloon added the bug label on Aug 13, 2019
prestonvanloon (Author) commented

One thing to note is that we set the "count" to 256MB, since it actually refers to a byte limit, not a count of messages.

// Set BufferedByteLimit with the total number of spans that are permissible to be held in memory.
// This needs to be done since the size of messages is always set to 1. Failing to set this would allow
// 1G messages to be held in memory since that is the default value of BufferedByteLimit.
if o.BufferMaxCount != 0 {
	bundler.BufferedByteLimit = o.BufferMaxCount
}
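
For context, the exporter hands each span to the bundler with a size of 1 (that is what the comment above refers to), so BufferedByteLimit effectively caps the number of buffered spans rather than their byte size. A rough sketch of that enqueue pattern, with illustrative names rather than the exporter's exact code:

// Each span is added with size 1, so the bundler's byte accounting really
// counts spans, and BufferedByteLimit behaves as a span-count cap.
if err := bundler.Add(span, 1); err != nil {
	// Add fails (e.g. with bundler.ErrOverflow) once the buffer limits are hit.
	onError(err)
}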


terev commented Jun 2, 2020

@prestonvanloon any luck with your merged change? Setting BufferMaxCount seems odd because, per that comment, the size of each message is set to 1 regardless of its actual size, which makes tuning the process's memory harder than it should be.
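
To illustrate the tuning problem, a back-of-the-envelope sketch (the per-span size here is purely an assumption): because each span counts as 1 toward BufferedByteLimit, the configured limit bounds the span count, not the memory used.

const (
	bufferedByteLimit = 256 * 1e6 // configured "byte" limit, really a max span count
	assumedSpanBytes  = 2 * 1024  // hypothetical average span size

	// Worst-case buffered memory is count x real span size, roughly 0.5 TB here,
	// so the byte limit does not bound memory in bytes.
	worstCaseMemory = bufferedByteLimit * assumedSpanBytes
)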
