Cache.add() encountered a network error #389
Comments
I don't think this is related to the typical browser cache, but just for good measure I cleared my cache anyway; same issue.
@thekevinscott I'm facing the same issue on Chrome v124. I cleared my cache but saw no improvement.
cc: @CharlieFRuan
To triage a bit, could you check, in the console, what is stored under Cache storage? Besides, is this encountered for all models? Perhaps you could try, say, TinyLlama? I want to know whether it is due to a single weight shard being too large.
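A minimal sketch of that check: the snippet below enumerates every cache and the URLs stored in it, so you don't have to guess which cache name WebLLM used (the cache names and URLs in the mock below are illustrative, not WebLLM's actual ones).

```javascript
// List every Cache and the request URLs stored in it. Paste into the
// DevTools console on https://webllm.mlc.ai/ (this uses the browser
// CacheStorage API, so it will not run under plain Node).
async function dumpCaches(cacheStorage = caches) {
  const summary = {};
  for (const name of await cacheStorage.keys()) {
    const cache = await cacheStorage.open(name);
    summary[name] = (await cache.keys()).map((req) => req.url);
  }
  return summary;
}
// In the console: console.table(await dumpCaches());
```

If the table shows only some of a model's weight shards, the failure likely happened partway through fetching them.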
@CharlieFRuan I've just tried again (same Chrome version) and, lo and behold, it appears to be working. Perhaps there was a new deploy. I see entries under Cache storage, but since it now appears to be working, I'm not sure if that's helpful. I wonder if @ucalyptus2 is still seeing the issue.
I'm seeing a Cache.add() error when trying to load Llama-3-8B-Instruct-q4f16_1-MLC. This is Chrome 124.0.6367.119, running the demo at https://webllm.mlc.ai/