
Converting large files is not feasible due to high memory usage #30

Open
dylanpdx opened this issue Dec 20, 2020 · 1 comment
Labels
enhancement New feature or request

Comments

@dylanpdx

This tool works great for small files, but trying to convert a relatively large file (1 GB) causes a MemoryError, even on a machine with 32 GB of RAM.
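For context, a MemoryError on a 1 GB input usually means the whole file is being read into memory at once. A minimal sketch of chunked reading that caps peak memory at the chunk size (the function name and chunk size here are illustrative, not taken from the project's code):

```python
# Hypothetical sketch: stream the input in fixed-size chunks instead of
# loading it all at once, so peak memory is bounded by CHUNK_SIZE.
CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per chunk; tunable

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield successive chunks of the file at `path`."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:  # empty bytes => end of file
                break
            yield chunk
```

Whether this applies depends on where the tool actually allocates its memory, which is what the profiling request below is meant to establish.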

@Theelx
Collaborator

Theelx commented Jan 16, 2021

Indeed. There's a bit of a tradeoff between memory and speed, and we've mostly gone the speed route since it was envisioned that people wouldn't try to encode a file above a few tens of megabytes. If there's an interest in adding modes for memory or speed, I'd be happy to work on that. For now though, can you run a memory profiler (scalene or memprof work well) to tell us where the code uses most of the memory on your machine, or send the 1GB file here so I can test?
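For the profiling request above, scalene is typically invoked as `scalene your_script.py`. As a zero-install alternative, Python's stdlib `tracemalloc` can also show which source lines allocate the most memory; a minimal sketch (the list comprehension is a stand-in for the real conversion workload):

```python
import tracemalloc

tracemalloc.start()

# Stand-in workload: replace with the actual conversion call being profiled.
workload = [bytes(1024) for _ in range(1000)]

snapshot = tracemalloc.take_snapshot()
top = snapshot.statistics("lineno")  # allocations grouped by source line
for stat in top[:3]:
    print(stat)  # largest allocation sites first

tracemalloc.stop()
```

The output lists file/line locations ranked by allocated bytes, which is the kind of per-line breakdown that would help pinpoint the MemoryError here.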

Theelx added the enhancement label on Jan 16, 2021