
Benchmark results #17

Open
aminya opened this issue Aug 25, 2020 · 1 comment
aminya commented Aug 25, 2020

It would be nice to have a markdown file showing the benchmark results.
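For illustration, a minimal sketch (not from the project itself) of how benchmark results could be rendered as a Markdown table — assuming a hypothetical `{ name, opsPerSec }` result shape, not the project's actual benchmark output:

```javascript
// Format an array of benchmark results as a Markdown table.
// The { name, opsPerSec } shape is assumed for this sketch.
function toMarkdownTable(results) {
  const header = "| Benchmark | ops/sec |\n| --- | ---: |";
  const rows = results.map((r) => `| ${r.name} | ${r.opsPerSec} |`);
  return [header, ...rows].join("\n");
}

// Example with made-up entries:
console.log(
  toMarkdownTable([
    { name: "push", opsPerSec: 1234567 },
    { name: "get", opsPerSec: 987654 },
  ])
);
```

The resulting string could be written to a `BENCHMARKS.md` file as part of a CI step.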

zandaqo (Owner) commented Aug 31, 2020

Yes, indeed, making the benchmarks more accessible and useful would be nice, and I hope to get around to it somewhere down the line.

That said, I doubt the usefulness of simply putting the results of the existing benchmarks in a file. At the moment, the benchmarks serve more as a development tool for catching regressions than as something an end user could use to judge overall performance. Most of the structures are general-purpose and cover many use cases, which makes it (a) hard to tell which cases are common enough for benchmarks to be useful, and (b) extremely easy to engineer the data and use cases to produce a desired result.

2 participants