Block with more than u16 events fails to be decoded #5782

Open
Leouarz opened this issue Jan 16, 2024 · 2 comments

Leouarz commented Jan 16, 2024

  • I'm submitting a ...

    • Bug report
  • What is the current behavior and expected behavior?

When a block has more than 64 * 1024 events, it fails to be decoded by the API and makes other components panic (such as the Polkadot UI).
To reproduce, I connected to a local Polkadot node v1.0.0 and sent 4 batches of 8500 remarks, with all of the events landing in the same block (a reproduction sketch is included after the environment details below).
The node processes the block correctly, but on the API side, trying to decode the events produces this error:
Error: Unable to decode storage system.events:: createType(Vec<FrameSystemEventRecord>):: Vec length 72264 exceeds 65536
This is coming from this file.
Here's also my initial question on Stack Exchange.

  • What is the motivation for changing the behavior?

With the rise of new types of chains, rollups, and even inscriptions, blocks can easily contain a large number of events, and the Polkadot API is the main library for interacting with such chains. This issue is not reproduced in subxt, for example. If the limit is a deliberate choice, I would be interested to know the motivation.

  • Please tell us about your environment:

    • Version:
      • Polkadot API: ^10.10.1
      • Polkadot node: v1.0.0
    • Environment: Ubuntu 22.04, Node.js
    • Language: TypeScript (tsc --version: ^5.2.2)
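A minimal reproduction sketch in TypeScript, assuming a local node at ws://127.0.0.1:9944 whose current block already contains more than 65536 events (the endpoint is a placeholder, not part of the original report):

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';

async function main (): Promise<void> {
  // Placeholder endpoint for a local node whose current block holds more
  // than 65536 events (e.g. produced by large batches of system.remark).
  const provider = new WsProvider('ws://127.0.0.1:9944');
  const api = await ApiPromise.create({ provider });

  // Decoding system.events for such a block is where the failure surfaces:
  // "Unable to decode storage system.events::
  //  createType(Vec<FrameSystemEventRecord>):: Vec length 72264 exceeds 65536"
  const events = await api.query.system.events();

  console.log(`decoded ${events.length} events`);
  await api.disconnect();
}

main().catch(console.error);
```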
@IkerAlus added the bug label Mar 15, 2024
@jamesbayly

Hi @IkerAlus, is there an update here? It's starting to impact us.

@TarikGul
Member

I am happy to start taking a look at this, but that being said, I am hesitant to just change the MAX_LENGTH for Vecs without some heavy testing.

Not sure what the residual effects could be, but I am sure Jaco put that there for good reason (I hope).
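For reference, the error in the report points at a fixed decoding limit of 65536 entries. The sketch below is a simplified, hypothetical illustration of that kind of length guard; checkVecLength is a placeholder name, and the real check in @polkadot/types-codec's Vec decoding may be shaped differently:

```ts
// MAX_LENGTH mirrors the 65536 (64 * 1024) value from the error message.
const MAX_LENGTH = 64 * 1024;

// Hypothetical guard: reject a decoded Vec whose declared length
// exceeds the hard limit, as seen in the reported error.
function checkVecLength (length: number): void {
  if (length > MAX_LENGTH) {
    throw new Error(`Vec length ${length} exceeds ${MAX_LENGTH}`);
  }
}

// A block with 72264 event records trips the guard while decoding
// Vec<FrameSystemEventRecord>:
// checkVecLength(72264); // Error: Vec length 72264 exceeds 65536
```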

@TarikGul self-assigned this May 10, 2024