
construct from buffer #5

Open
swiing opened this issue May 4, 2022 · 5 comments


swiing (Owner) commented May 4, 2022

Typed array constructors can take an ArrayBuffer instance as a single argument, e.g.:

let buffer = new ArrayBuffer(12);
new Uint32Array(buffer);

The length of the typed array is the byte length of the buffer divided by TypedArray.BYTES_PER_ELEMENT.
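
For instance, a 12-byte buffer yields a Uint32Array of length 3:

const buffer = new ArrayBuffer(12);
// 12 bytes / Uint32Array.BYTES_PER_ELEMENT (4 bytes) = 3 elements
new Uint32Array(buffer).length; // 3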

If applied to BitArray, that would imply that the length of a BitArray can only be a multiple of 8. However, there is no reason the length of a bit array should be so constrained. Hence, I have written:

I have not yet made up my mind whether it makes sense to construct a BitArray by passing an array buffer to the constructor.

This issue is created to open the discussion.
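
For concreteness, here is a minimal, purely hypothetical sketch of what a buffer-only constructor would have to do (not part of the library); since an ArrayBuffer only counts whole bytes, the inferred length is necessarily a multiple of 8:

// Hypothetical sketch: a buffer-only constructor can only infer
// the bit length from the byte length.
class BufferBackedBitArray {
  constructor(buffer) {
    this.bytes = new Uint8Array(buffer);
    this.length = buffer.byteLength * 8; // new ArrayBuffer(7) -> 56 bits, never 51
  }
}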


swiing (Owner, Author) commented May 4, 2022

@rollie42 wrote:

I actually have a use case! I have a settings object in my React app; it has a great number of boolean or near-boolean enums. I want to store the user settings in a query param, so I load the settings into a bit array and then turn its array buffer into base64. This allows the settings to be shared or to survive a page reload. To deserialize, I load the base64 string back into an array buffer, and then into the bit array to reconstruct the settings.

How do you manage the length of the bit array?


rollie42 commented May 4, 2022

It has padding at the end, but that's fine: if I write 51 bits of data and actually serialize a few extra bits, it's not optimal, but it works, because when I deserialize I never read whatever went into the padding bits.
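
For illustration, a rough sketch of that round trip, assuming the bit array exposes its underlying ArrayBuffer through a hypothetical buffer property (the names here are not the library's actual API):

// Sketch only; bitArray.buffer is an assumed accessor.
function settingsToQueryParam(bitArray) {
  const bytes = new Uint8Array(bitArray.buffer); // e.g. 51 bits stored in 7 bytes; the last 5 bits are padding
  return btoa(String.fromCharCode(...bytes));    // base64 string suitable for a query param
}

function queryParamToBuffer(param) {
  const bytes = Uint8Array.from(atob(param), c => c.charCodeAt(0));
  return bytes.buffer; // feed this back into the bit array; the padding bits are simply never read
}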


rollie42 commented May 4, 2022

Here's a gist of the code. Really, what would be ideal is to allow an arbitrary serialization, say something like encodeWithCharacterSet(char[]). To use:

const bitSet = BitArray.from("111100111010")

// 4 letters, so it encodes 2 bits at a time: [11 11 00 11 10 10]
// yields: "ddadcc"
bitSet.encodeWithCharacterSet("abcd") 

// 8 letters, so it encodes 3 bits at a time: [111 100 111 010]
// yields: "hehc" (I think)
bitSet.encodeWithCharacterSet("abcdefgh") 

The encoding to base64 would just be an instance of this logic. And then, of course, you'd need BitArray.decodeFromCharacterSet(char[], string) as well. Presumably you just throw if the passed-in character set isn't a size that is a power of 2. It would actually be possible to handle that case, but... complicated :P
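
A rough implementation sketch of such an encoder (standalone and illustrative only, assuming the bits are available as a string of '0'/'1' characters):

// Illustrative sketch, not part of the library: encode a bit string with a
// character set whose size is a power of two.
function encodeWithCharacterSet(bits, charset) {
  const bitsPerChar = Math.log2(charset.length);
  if (!Number.isInteger(bitsPerChar)) {
    throw new RangeError("character set size must be a power of two");
  }
  let out = "";
  for (let i = 0; i < bits.length; i += bitsPerChar) {
    // The last group may be short; pad it with zeros (this is the padding discussed above).
    const group = bits.slice(i, i + bitsPerChar).padEnd(bitsPerChar, "0");
    out += charset[parseInt(group, 2)];
  }
  return out;
}

encodeWithCharacterSet("111100111010", "abcd");     // "ddadcc"
encodeWithCharacterSet("111100111010", "abcdefgh"); // "hehc"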

So in my particular case, I don't actually want ArrayBuffer -> BitArray, I just use it as a means to an end that could be achieved more optimally.

Honestly, I don't understand 100% of what the whole proxy thing is doing in the code, but otherwise I'm sure I could implement the logic if you wanted to accept the change.


swiing (Owner, Author) commented May 5, 2022

I see: you don't actually care about the length, because it is predetermined and fixed-size, which is a specific case.

Feel free to give it a try (hopefully it does not need to mess with the proxy thing). If you do, please include test cases as well!


rollie42 commented May 5, 2022

Created PR with this change - let me know :)

I put it on the 'extension' repo instead of this one, as this one seems to be for matching the exact TypedArray interface only.
