
Incremental BlockEncoder API #222

Open
Gozala opened this issue Nov 8, 2022 · 0 comments

Comments

Gozala commented Nov 8, 2022

One of the problems we keep running into is: is this data going to fit within the block size limit?

Right now there is no great solution for this; the only thing we can do is try encoding a bunch of times and measure the size. Unfortunately that is not really a great option, because:

  1. Previously encoded data is not reused, so we waste computation and create more data to be GC-ed.
  2. If we go above the block size limit, there is no good way to backtrack.
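The cost of that trial-and-error loop can be sketched as follows. This is illustrative only: `encode` here is a JSON-based stand-in for a real BlockEncoder (e.g. a dag-cbor codec), and `packEntries` and the 1KiB limit are hypothetical names chosen for the example:

```typescript
// Stand-in codec: in practice this would be a real BlockEncoder's
// encode function (e.g. @ipld/dag-cbor), not JSON.
const encode = (value: unknown): Uint8Array =>
  new TextEncoder().encode(JSON.stringify(value))

const BLOCK_SIZE_LIMIT = 1 << 10 // 1KiB, illustrative only

// Naive "encode and measure" loop: keep adding entries until the
// encoded block would exceed the limit, then stop. Every iteration
// re-encodes everything packed so far from scratch — the wasted
// computation and garbage this issue describes.
const packEntries = (entries: string[]): string[] => {
  const packed: string[] = []
  for (const entry of entries) {
    const bytes = encode([...packed, entry])
    if (bytes.byteLength > BLOCK_SIZE_LIMIT) break // "backtrack" by discarding
    packed.push(entry)
  }
  return packed
}
```

Note that for `n` entries this encodes O(n) blocks of growing size, so the total work is quadratic in the final block size.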

I do not know what the answer is here, but I do like the way CARBufferWriter came out, and I think something along the same lines could work here as well. Specifically I would like to:

  1. Allocate and pass in a buffer to encode data into, as opposed to just getting a buffer out for the encoded node.
    • This also provides better control in cases where we want to encode several things into a larger buffer.
    • It also implies you can't accidentally create a block that is greater than the block size limit.
  2. Ideally the API should allow the encoder to avoid re-encoding the same data over and over again.
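A CARBufferWriter-style shape for the two points above might look like the following sketch. All names here (`IncrementalEncoder`, `write`, `close`) are hypothetical, not an existing API:

```typescript
// Hypothetical incremental encoder writing into a caller-allocated
// buffer, mirroring the CARBufferWriter pattern: the caller controls
// allocation, and overflow is an explicit, recoverable signal.
class IncrementalEncoder {
  private offset = 0
  constructor(private readonly buffer: Uint8Array) {}

  // Append one already-encoded chunk. Returns false (without writing)
  // if the chunk would overflow the buffer, letting the caller
  // backtrack cheaply instead of re-encoding everything so far.
  write(chunk: Uint8Array): boolean {
    if (this.offset + chunk.byteLength > this.buffer.byteLength) return false
    this.buffer.set(chunk, this.offset)
    this.offset += chunk.byteLength
    return true
  }

  // View of the bytes written so far — previously encoded data is
  // reused, never re-encoded.
  close(): Uint8Array {
    return this.buffer.subarray(0, this.offset)
  }
}
```

With this shape, the block size limit is enforced by construction: if the buffer is allocated at the limit, a block larger than the limit simply cannot be produced.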