fix packet encoding/decoding which seems to break after 2 billion packets sent #1743
Hello,
Long story short, I've been using the packet number encoding/decoding logic for a purpose other than encoding packet numbers, and observed that it starts decoding incorrect values once the value reaches 2^31.
I dug into RFC 9000, whose appendix gives the algorithmic details of encoding/decoding, and found that the encoding algorithm in quiche does not match the one suggested by the RFC. I suppose there is a reason for this, but it eludes me.
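For reference, the decoding side suggested by RFC 9000 Appendix A.3 can be sketched in Rust roughly as follows. The function name is mine, not quiche's, and one comparison is rearranged from the RFC pseudocode to avoid unsigned underflow:

```rust
/// Sketch of RFC 9000 Appendix A.3 packet number decoding.
/// `largest_pn` is the largest packet number processed so far,
/// `truncated_pn` is the value taken from the wire, and `pn_nbits`
/// is the size of the truncated field (8, 16, 24 or 32 bits).
fn decode_pkt_num(largest_pn: u64, truncated_pn: u64, pn_nbits: u32) -> u64 {
    let expected_pn = largest_pn + 1;
    let pn_win = 1u64 << pn_nbits;
    let pn_hwin = pn_win / 2;
    let pn_mask = pn_win - 1;

    // Candidate closest to expected_pn within the truncated window.
    let candidate_pn = (expected_pn & !pn_mask) | truncated_pn;

    // The RFC's `candidate_pn <= expected_pn - pn_hwin` is written as
    // `candidate_pn + pn_hwin <= expected_pn` to avoid u64 underflow.
    if candidate_pn + pn_hwin <= expected_pn && candidate_pn < (1u64 << 62) - pn_win {
        candidate_pn + pn_win
    } else if candidate_pn > expected_pn + pn_hwin && candidate_pn >= pn_win {
        candidate_pn - pn_win
    } else {
        candidate_pn
    }
}
```

With the RFC's own example (largest_pn `0xa82f30ea`, truncated value `0x9b32` over 16 bits), this yields `0xa82f9b32`, matching Appendix A.3.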
So, this pull request provides an implementation close to what the RFC suggests (hopefully more efficient), which fixes my problem as well as the packet number bug. I've also added test cases for packet number encoding/decoding using the example values from the RFC.
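The encoding side from Appendix A.2 picks the smallest truncated length whose window covers twice the number of unacknowledged packets. A hedged sketch, again with an illustrative name rather than quiche's actual API:

```rust
/// Sketch of RFC 9000 Appendix A.2: number of bytes needed to encode
/// `full_pn` given the largest acknowledged packet number (if any).
fn pkt_num_len(full_pn: u64, largest_acked: Option<u64>) -> usize {
    // Number of packets potentially in flight and unacknowledged.
    let num_unacked = match largest_acked {
        Some(acked) => full_pn - acked,
        None => full_pn + 1,
    };

    // Smallest bit count such that 2^min_bits >= 2 * num_unacked,
    // i.e. the bit length of 2 * num_unacked - 1 (num_unacked >= 1).
    let min_bits = 64 - (2 * num_unacked - 1).leading_zeros() as usize;

    // Round up to whole bytes. QUIC caps the field at 4 bytes; the RFC
    // pseudocode assumes the sender never exceeds that window.
    (min_bits + 7) / 8
}
```

Checked against the Appendix A.2 example: with largest_acked `0xabe8b3`, sending `0xac5c02` needs 2 bytes, while `0xace8fe` needs 3.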
This passes all unit tests, as well as a simple integration test in which I downloaded a 5 GiB file through the H3 module using quiche-[client/server]. However, I am unsure about: