Show current benchmarking results #123
Coincidentally, I just updated the code size benchmarks this morning, cleaning up the code as well. It's easy to measure code size, not so much the performance. For example, google-protobuf deserializes into an intermediate state. So a simple roundtrip might show different results compared to when all fields are set with the getter / setter methods. That being said, having performance benchmarks for protobuf.js would be great.
Partially address timostamm#123

Results of the perf benchmark on my machine:

### read binary

- google-protobuf : 413.662 ops/s
- ts-proto : 1,324.736 ops/s
- protobuf-ts (speed) : 1,461.452 ops/s
- protobuf-ts (speed, bigint) : 1,475.889 ops/s
- protobuf-ts (size) : 1,250.677 ops/s
- protobuf-ts (size, bigint) : 1,255.167 ops/s
- protobufjs : 1,732.049 ops/s

### write binary

- google-protobuf : 906.883 ops/s
- ts-proto : 3,805.993 ops/s
- protobuf-ts (speed) : 430.632 ops/s
- protobuf-ts (speed, bigint) : 448.063 ops/s
- protobuf-ts (size) : 378.682 ops/s
- protobuf-ts (size, bigint) : 392.511 ops/s
- protobufjs : 1,539.768 ops/s

### from partial

- ts-proto : 4,503.332 ops/s
- protobuf-ts (speed) : 1,568.577 ops/s
- protobuf-ts (size) : 1,555.881 ops/s

### read json

- ts-proto : 3,719.366 ops/s
- protobuf-ts (speed) : 889.613 ops/s
- protobuf-ts (size) : 890.034 ops/s
- protobufjs : 4,120.232 ops/s

### write json

- ts-proto : 13,200.842 ops/s
- protobuf-ts (speed) : 1,865.668 ops/s
- protobuf-ts (size) : 1,862.537 ops/s
- protobufjs : 4,199.372 ops/s

### read json string

- ts-proto : 957.057 ops/s
- protobuf-ts (speed) : 416.284 ops/s
- protobuf-ts (size) : 421.48 ops/s
- protobufjs : 910.256 ops/s

### write json string

- ts-proto : 1,000.572 ops/s
- protobuf-ts (speed) : 943.338 ops/s
- protobuf-ts (size) : 949.784 ops/s
- protobufjs : 1,446.891 ops/s
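For context, ops/s figures like these come from repeatedly timing the operation under test. A minimal hand-rolled sketch of such a measurement loop (hypothetical helper, not the actual `perf.ts` harness from the benchmarks package):

```typescript
// Minimal ops/s harness: run `fn` repeatedly for roughly `durationMs`
// milliseconds and report operations per second.
function opsPerSecond(fn: () => void, durationMs = 500): number {
  // Warm up so the JIT has a chance to optimize the hot path.
  for (let i = 0; i < 10; i++) fn();
  const start = Date.now();
  let ops = 0;
  while (Date.now() - start < durationMs) {
    fn();
    ops++;
  }
  const elapsed = Date.now() - start;
  return (ops / elapsed) * 1000;
}

// Example: measure a JSON round trip of a small payload.
const payload = { id: 1, name: "test", tags: ["a", "b"] };
const result = opsPerSecond(() => {
  JSON.parse(JSON.stringify(payload));
});
console.log(`${result.toFixed(0)} ops/s`);
```

Real benchmark suites additionally run multiple samples and report statistical error margins, which matters when comparing numbers this close together.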
Thanks for the PR! The makefile bug from this comment is fixed in commit 326299f, making sure the performance benchmarks run with the same payload every time. I'm not saying benchmarks should only run with the large payload; this was just fixing the obvious makefile bug. All testees are in the same ballpark with the large payload size.
I think the benchmarks should run on several payload sizes. There are some factor-10 gaps with the smaller payload you were measuring that are worth closer investigation.
For reference: Large payload: 1.2 MiB, a FileDescriptorSet.
Great results! One thing that caught my eye is the massive difference in writing JSON between the libraries.
But look closely at the numbers. They fall into perspective once you realize that they measure turning the internal representation into a JSON object. What you need in practice is a JSON string.
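To illustrate the distinction, a minimal sketch with a plain object standing in for a decoded message (`toJsonObject` and `toJsonString` are hypothetical helper names here, not the actual runtime API):

```typescript
// Stand-in for a decoded protobuf message:
const message = { user: "alice", score: 42, tags: ["x", "y"] };

// 1) "write json object": internal representation -> plain JSON object.
//    A library can be very fast here if its internal representation is
//    already close to the JSON form.
function toJsonObject(msg: typeof message): object {
  return { ...msg };
}

// 2) "write json string": internal representation -> JSON string, which
//    is what actually goes over the wire.
function toJsonString(msg: typeof message): string {
  return JSON.stringify(toJsonObject(msg));
}

// Benchmarking only step 1 hides the cost of JSON.stringify, so the
// string variant is the fairer end-to-end comparison.
console.log(toJsonString(message));
```

This is why the "write json" and "write json string" numbers above rank the libraries so differently.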
I think the manual deserves a performance comparison table at the end of the section Code size vs speed. It should just show numbers for binary I/O and JSON (string) I/O. It should show the generator version number and parameters, preferably in one simple table. It should be mentioned how and where this is measured, and with what payload size. The table should be generated by a script, similar to the code size report. These are the results including protobuf.js:
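Such a report script could be sketched roughly like this (the `Result` shape and function names are hypothetical; a real script would read the output of the perf benchmark rather than hard-coded values):

```typescript
// Hypothetical report generator: turns benchmark results into a markdown
// table, similar in spirit to the existing code size report.
interface Result {
  name: string;      // generator name and parameters
  opsPerSec: number; // measured throughput
}

function toMarkdownTable(title: string, results: Result[]): string {
  const rows = results.map(
    (r) => `| ${r.name} | ${r.opsPerSec.toLocaleString("en-US")} ops/s |`
  );
  return [
    `### ${title}`,
    "",
    "| generator | throughput |",
    "| --- | --- |",
    ...rows,
  ].join("\n");
}

// Example with two of the figures reported above:
const table = toMarkdownTable("read binary", [
  { name: "protobuf-ts (speed)", opsPerSec: 1461.452 },
  { name: "protobufjs", opsPerSec: 1732.049 },
]);
console.log(table);
```

Generating the table from the benchmark output keeps the manual's numbers reproducible instead of hand-maintained.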
Looks like there has been a regression in v2.0.0-alpha.9. We stopped generating
Was this discovery made in some off-GitHub discussion? I'm curious more than anything, since I can't find any issue or PR mentioning this. Glad it was spotted, though! Completely unrelated, but what on Earth is going on with ts-proto's "write json object" benchmark? Being nearly an order of magnitude faster than the underlying library it uses (protobufjs) seems odd.
See #147 (comment) and #147 (comment)
It's impressive, right? I don't think ts-proto is sharing any code with protobufjs for JSON.
The manual currently provides a comparison of code size vs speed, but it only shows the resulting size of the generated code, so we don't know the runtime difference. There is already a benchmarks package which provides code to perform benchmarks, but it leaves something to be desired. Namely: the current results. :)
It would be helpful to show this information (especially the perf.ts results) somewhere. Ideally it could be included in the code size vs speed section of the manual.
P.S. A comparison against protobuf.js would be very nice, since that library tends to be the fastest one out there at the moment.
P.P.S. I've already taken a crack at adding protobuf.js to the perf.ts benchmarks locally and it seems like protobuf.js can decode/encode the binary about twice as fast as protobuf-ts@2.0.0-alpha.27. Any ideas how that gap could be closed? Is protobuf.js taking shortcuts that aren't conformant to the proto spec? Are there any techniques that could be copied from protobuf.js?
Thank you again for this wonderful project!