Windows Service memory grows indefinitely with Tracing and Profiling enabled. #3375

Closed · jamescrosswell opened this issue May 16, 2024 · 5 comments · Fixed by #3382
Labels: Bug (Something isn't working), Product: Profiling

jamescrosswell (Collaborator) commented May 16, 2024

From jforward5 on Discord.

I've added Sentry to my .NET Core project and see the project using an ever-growing amount of memory. My question is: how do I make it stop using all the memory?

If I comment out this code my app runs normally and only uses about 24 MB of memory, but uncommenting it causes the memory to grow endlessly by 2-3 MB per second:
builder.ConfigureLogging(loggingBuilder =>
{
    loggingBuilder.AddSentry(o =>
    {
        o.Dsn = "https://my_dsn/";
        o.Debug = true;
        o.TracesSampleRate = 1.0;
        o.ProfilesSampleRate = 1.0;
        o.AddIntegration(new ProfilingIntegration(
            TimeSpan.FromMicroseconds(500)
        ));
    });
});

I should also mention this is a Windows Service using .NET 8.

Demo project to reproduce

I've attached a sample app that reproduces the issue. The app is a plain, "empty" Windows Service targeting .NET 8 with Sentry installed. Run the app from Visual Studio and watch Task Manager: the process memory grows for as long as the service is running. If you turn off profiling and tracing, it no longer consumes memory and stabilizes. You'll need to add your own DSN, since I removed mine for security reasons.

Memory Test Worker Service.zip
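For reference, a minimal sketch of an equivalent worker-service setup (hypothetical code mirroring the config above, not the attached sample itself; assumes .NET 8 with implicit usings plus the Sentry.Extensions.Logging, Sentry.Profiling and Microsoft.Extensions.Hosting.WindowsServices packages):

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Sentry.Profiling;

var host = Host.CreateDefaultBuilder(args)
    .UseWindowsService() // requires Microsoft.Extensions.Hosting.WindowsServices
    .ConfigureLogging(logging =>
    {
        logging.AddSentry(o =>
        {
            o.Dsn = "https://my_dsn/"; // replace with your own DSN
            o.Debug = true;
            o.TracesSampleRate = 1.0;   // trace everything
            o.ProfilesSampleRate = 1.0; // profile every traced transaction
            o.AddIntegration(new ProfilingIntegration(TimeSpan.FromMicroseconds(500)));
        });
    })
    .ConfigureServices(services => services.AddHostedService<Worker>())
    .Build();

await host.RunAsync();

// The stock worker from the .NET "Worker Service" template: wakes once a second,
// which keeps the process alive while the profiler collects events.
sealed class Worker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            await Task.Delay(1000, stoppingToken);
        }
    }
}

Running something like this with non-zero TracesSampleRate/ProfilesSampleRate should reproduce the growth described above.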

jamescrosswell added the Bug (Something isn't working) label May 16, 2024
jamescrosswell (Collaborator, Author) commented:
So there does appear to be an issue here. This is what I'm seeing when I run the service in a memory profiler:
[screenshot: memory profiler session showing heap growth]

Although the LOH and POH do get cleaned up occasionally (somewhat), over time they just keep growing and growing. Heap Gen 0, 1 and 2 all appear stable.

The only thing I can think might be using the LOH or POH is profiling, which may need to pin objects when passing pointers to unsafe code. I see a number of places where the PerfView module pins objects (e.g. here).
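For context, this is roughly what that pinning pattern looks like (an illustrative sketch, not PerfView's actual code): a GCHandle-pinned buffer can't be moved or collected until the handle is freed, so any leaked handle keeps its buffer alive indefinitely.

using System;
using System.Runtime.InteropServices;

byte[] buffer = new byte[64 * 1024];

// Pinning fixes the buffer at its current address so native code can hold a
// raw pointer into it; the GC can neither move nor collect it in the meantime.
GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
try
{
    IntPtr ptr = handle.AddrOfPinnedObject();
    // ... hand ptr to unsafe/native code ...
}
finally
{
    // If Free() is never called, the buffer stays reachable and pinned for
    // the life of the process: the kind of retention we're seeing here.
    handle.Free();
}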

That's consistent with the call tree, which indicates 99% of the memory is being used by instances of EventMarker, an internal class defined in modules/perfview/src/TraceEvent/EventPipe/EventCache.cs:
[screenshot: call tree dominated by EventMarker allocations]

The EventMarker constructor takes a PinnedBuffer as an argument.

It looks like the ProcessEventBlock method creates multiple EventMarker instances and never explicitly cleans any of these up.
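To make the suspected pattern concrete, here's a stripped-down sketch (hypothetical type and member names mirroring the ones above, not TraceEvent's real implementation):

using System.Collections.Generic;
using System.Runtime.InteropServices;

sealed class PinnedBuffer
{
    public GCHandle Handle { get; }
    public PinnedBuffer(byte[] data) => Handle = GCHandle.Alloc(data, GCHandleType.Pinned);
}

sealed class EventMarker
{
    public PinnedBuffer Buffer { get; }
    public EventMarker(PinnedBuffer buffer) => Buffer = buffer;
}

sealed class EventCacheSketch
{
    private readonly List<EventMarker> _markers = new();

    public void ProcessEventBlock(byte[] block)
    {
        // One marker (holding a pinned buffer) per block; nothing ever
        // removes entries, so the list and its pinned buffers grow forever.
        _markers.Add(new EventMarker(new PinnedBuffer(block)));
    }
}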

I'll play around to see if I can explicitly free these up.

jamescrosswell (Collaborator, Author) commented:
When I look at the generations I can see:
[screenshot: heap generation breakdown]

Pretty much all of that is accounted for by Microsoft.Diagnostics.Tracing.Etlx.TraceLog+EventsToStackIndex[]

[screenshot: allocations dominated by EventsToStackIndex[]]

I see events get added to that stack here, but the only place I see them being removed is the internal FlushRealtimeEvents method, which in turn gets called from TraceLog's Process method, which we call here when starting profiling:

Task.Factory.StartNew(eventSource.Process, TaskCreationOptions.LongRunning)

As far as I can tell, then, that GrowableArray just gets bigger and bigger, and there are no public APIs we can use to flush out old items.
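Schematically, the growth pattern looks like this (an illustrative sketch with made-up names, not TraceEvent's actual code):

using System.Collections.Generic;

// Every dispatched event appends a mapping entry, and only the internal
// FlushRealtimeEvents path ever trims it, so in a long-running real-time
// session the array's size tracks the total number of events observed.
struct EventsToStackIndexSketch
{
    public long EventIndex;
    public int StackIndex;
}

sealed class RealtimeEventLog
{
    private readonly List<EventsToStackIndexSketch> _eventsToStacks = new();

    public void OnEvent(long eventIndex, int stackIndex) =>
        _eventsToStacks.Add(new EventsToStackIndexSketch
        {
            EventIndex = eventIndex,
            StackIndex = stackIndex
        });

    // No public member ever drains _eventsToStacks; from the caller's
    // perspective it is append-only.
}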

@vaind any ideas?

vaind (Collaborator) commented May 23, 2024

@jamescrosswell any chance you could verify the fix in #3382?

jamescrosswell (Collaborator, Author) commented:

> @jamescrosswell any chance you could verify the fix in #3382?

It works - thanks @vaind!🕺🏻

[screenshot: stable memory usage with the fix applied]

vaind (Collaborator) commented May 23, 2024

Thank you!

For reference, this was the actual fix: microsoft/perfview@36d2e2c#diff-31ffbfdad96fc84f3aa7d25e1980cdca5a4d7870529c669ac9cc975cf24969c8R342
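In other words (a paraphrase of the idea with hypothetical names, reusing the EventsToStackIndexSketch type from the earlier sketch; see the linked commit for the literal patch): once real-time events have been flushed, their bookkeeping entries are dropped too instead of being retained forever.

using System.Collections.Generic;

static void TrimFlushedEvents(List<EventsToStackIndexSketch> eventsToStacks, long flushedUpTo)
{
    // Remove every entry that refers to an event we've already flushed;
    // previously these were retained, so the array grew without bound.
    eventsToStacks.RemoveAll(e => e.EventIndex < flushedUpTo);
}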
