Memory leak in request_queue when sending messages with files #1091

Open
Jaskowicz1 opened this issue Feb 22, 2024 · 8 comments · Fixed by #1112

Comments

Jaskowicz1 (Contributor) commented on Feb 22, 2024:

Git commit reference
9c549f5

Describe the bug
Sending messages with files causes a memory leak. The larger the file, the larger the leak.

The issue seems to be coming from:

/* Check for deletable items every second regardless of select status */
while (responses_to_delete.size() && now >= responses_to_delete.begin()->first) {
	delete responses_to_delete.begin()->second.first;
	delete responses_to_delete.begin()->second.second;
	responses_to_delete.erase(responses_to_delete.begin());
}

and the queue itself.

The data doesn't actually get removed until the final request has been deleted from the queue.

Iteration 11 running to check RAM from queue. 46837760 RAM in use.
responses_to_delete: 1
Deleting response...
Iteration 12 running to check RAM from queue. 27660288 RAM in use

If you're making frequent requests, the map will never empty, meaning the data is never freed (I'm not sure why).

However, even once the queue is empty, the RAM usage doesn't fully recover. For example:

Test 0 started, 14778368 RAM in use.
// .......
Test 11 started, 45486080 RAM in use.
Iteration 0 running to check RAM from queue. 46837760 RAM in use.
// .......
Iteration 11 running to check RAM from queue. 46837760 RAM in use.
// .......
Iteration 12 running to check RAM from queue. 27660288 RAM in use.

To Reproduce

  1. Run the following code:
#include <dpp/dpp.h>
#include <chrono>
#include <thread>

int main() {
	/* Create the bot */
	dpp::cluster bot("Token");
	bot.on_log(dpp::utility::cout_logger());

	bot.on_ready([&bot](const dpp::ready_t& event) {
		const dpp::snowflake channel_id = 0; /* Replace with a real channel ID. */
		for (int i = 0; i < 12; i++) {
			bot.message_create(dpp::message(channel_id, "").add_file("test.jpg", dpp::utility::read_file("image.jpg")));
			std::this_thread::sleep_for(std::chrono::seconds(5)); // Just to avoid rate limits.
		}
	});

	bot.start(dpp::st_wait);

	return 0;
}
  2. Watch the memory increase, then wait 120 seconds (for all 12 messages to clear) and watch the memory decrease massively.

Expected behavior
The memory should be freed gradually with every removal from the queue.

Ideally, we should use unique_ptr so we can avoid the 60-second queue silliness.
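
As a rough sketch of that idea (illustrative only: the type and member names below are assumptions, not the real request_queue internals), holding the queued responses through unique_ptr would free each entry the moment it is erased, with no manual delete calls:

/* Sketch only: placeholder structs standing in for the real request/response types. */
#include <ctime>
#include <map>
#include <memory>
#include <utility>

struct http_request {};
struct http_request_completion_t {};

/* Owning pointers: erase() alone releases the memory. */
std::multimap<time_t, std::pair<std::unique_ptr<http_request_completion_t>, std::unique_ptr<http_request>>> responses_to_delete;

void purge_deletable(time_t now) {
	while (!responses_to_delete.empty() && now >= responses_to_delete.begin()->first) {
		responses_to_delete.erase(responses_to_delete.begin());
	}
}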

System Details:

  • OS: Tested on Ubuntu (WSL).
  • Desktop Discord Client.
Jaskowicz1 added the bug label on Feb 22, 2024.
braindigitalis (Contributor) commented:

As well as unique_ptr, you may have to rehash the maps periodically by copying all the data into a new one. This is the only way to force an unordered map to release its buckets.
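
For reference, a minimal sketch of that rehash-by-copy trick (generic; not tied to D++'s actual types): build a fresh map from the surviving entries and swap it in, so the old map's over-grown bucket array is destroyed with it:

#include <unordered_map>

template <typename K, typename V>
void shrink_by_copy(std::unordered_map<K, V>& m) {
	/* Copy the live entries into a brand-new map, then swap it in.
	 * The old map, together with its bucket array, is destroyed when
	 * 'fresh' goes out of scope; unordered_map has no shrink_to_fit(),
	 * so copying is the blunt way to make it hand the buckets back. */
	std::unordered_map<K, V> fresh(m.begin(), m.end());
	m.swap(fresh);
}

Calling something like shrink_by_copy (a hypothetical helper, named here just for illustration) right after the periodic purge would be one place to hook this in.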

Jaskowicz1 (Contributor, Author) commented:

This does not seem to occur on Windows and currently looks like a Linux-only bug...

Jaskowicz1 (Contributor, Author) commented:

I'm now working on this.

Jaskowicz1 (Contributor, Author) commented:

To make this even more confusing, this leak does not happen on macOS either, so it really is Linux-only...

Neko-Life (Contributor) commented:

:thonk: I think I can look into that, I'll test it on my machine.

Mishura4 self-assigned this on Mar 29, 2024.
Jaskowicz1 linked a pull request on Mar 29, 2024 that will close this issue.
Jaskowicz1 (Contributor, Author) commented:

The fix that @Mishura4 implemented doesn't seem to work. This also seems to be an issue only with libstdc++; upgrading to a newer g++ fixes it (as long as you actually build with that newer g++), as it appears newer versions of libstdc++ have this fixed.

braindigitalis (Contributor) commented:

updating to what? I'm on g++ 12.3.0 on my production machines, where the problem persists...? It's literally the latest g++ available on Ubuntu 22.04.

Jaskowicz1 (Contributor, Author) commented:

Oh? Maybe upgrading isn't a fix then.
