Live decoding on Android #206

Open
iPhaeton opened this issue May 25, 2018 · 0 comments

@iPhaeton

Hi. We use Aurora.js in an Android application built with React-Native. It took quite a while to get it working with React-Native, but it works great on iOS. On Android everything is fine when we record a file with the microphone and then decode it in Aurora. However, on Android we can't decode a file while it is still being recorded.
We create an asset with the fromBuffer method (see the sketch below). The only difference between the buffers we pass to Aurora is in bytes 25 to 28, which apparently encode the file length. The problem is that M4ADemuxer can't handle a buffer whose length is not yet defined. Is there any way to make M4ADemuxer work with audio data whose final length is unknown?
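Roughly, the two code paths look like this. This is only a minimal sketch against Aurora's documented Asset API (`AV.Asset.fromBuffer`, `decodeToBuffer`, the `error` event); the two buffer variables and the commented-out decoder plugin are placeholders standing in for our actual React-Native setup, not real identifiers from the library.

```js
// Minimal sketch of the two paths, using Aurora's public Asset API.
// `finishedRecordingBuffer` and `partialRecordingBuffer` are placeholders
// for whatever our React-Native bridge hands back from the recorder.
const AV = require('av'); // aurora.js core, which includes M4ADemuxer
// require('aac');        // a decoder plugin is also loaded in our app

// Finished recording: bytes 25-28 of the header carry the final length,
// so demuxing and decoding succeed on both platforms.
const finishedAsset = AV.Asset.fromBuffer(finishedRecordingBuffer);
finishedAsset.decodeToBuffer(function (pcm) {
  // pcm is a Float32Array of decoded samples
});

// Recording still in progress (Android): the length field is not final yet,
// and M4ADemuxer errors out instead of demuxing the data that is already there.
const liveAsset = AV.Asset.fromBuffer(partialRecordingBuffer);
liveAsset.on('error', function (err) {
  console.log('demuxer error:', err); // fails here while the file is still being written
});
liveAsset.decodeToBuffer(function () {
  // never reached
});
```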
