Thanks for reporting the error. Could you share the log from the console? I'm not sure if there is more info there. Also, does this issue occur with all models? Could you perhaps try a smaller model like Gemma 2B?
CharlieFRuan changed the title from "simple-chat: After system initalize, loading onto webgpu Generate error" to "simple-chat: error during loading params onto WebGPU, GPUPipelineError: A valid external Instance reference no longer exists" on Mar 13, 2024.
This is a bit strange. WizardMath uses q4f16_1; could you also try Llama-2-7B-q4f16_1? Also, how much RAM do you have? My guess is that this is an OOM issue. Looking at the vram_required_MB field in https://github.com/mlc-ai/web-llm/blob/main/examples/simple-chat/src/gh-config.js, Llama-2-7B-q4f32_1 requires roughly 2 GB more VRAM than Llama-2-7B-q4f16_1 (and hence also than WizardMath).
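To make the VRAM comparison concrete, here is a minimal sketch of how one could filter the model list by its vram_required_MB field before picking a model. The record shape mirrors gh-config.js, but the model IDs and MB numbers below are illustrative placeholders, not the actual values from the repo:

```javascript
// Hypothetical excerpt mirroring the shape of the model records in
// examples/simple-chat/src/gh-config.js. Numbers are illustrative only.
const modelList = [
  { model_id: "Llama-2-7b-chat-hf-q4f32_1", vram_required_MB: 9109 },
  { model_id: "Llama-2-7b-chat-hf-q4f16_1", vram_required_MB: 6829 },
];

// Keep only models whose declared VRAM requirement fits the given budget.
function modelsThatFit(models, availableMB) {
  return models
    .filter((m) => m.vram_required_MB <= availableMB)
    .map((m) => m.model_id);
}

// With an ~8 GB budget, only the q4f16_1 variant fits.
console.log(modelsThatFit(modelList, 8000));
// → ["Llama-2-7b-chat-hf-q4f16_1"]
```

If a q4f32_1 model fails to load but the q4f16_1 variant of the same model works, that points strongly at a VRAM shortage rather than a model-specific bug.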