Pytorch support #1625
If you want to do inference, the easiest route would likely be to convert the model to ONNX and then run it in WASM with ONNX.js. See the upstream issue: pytorch/pytorch#25091

Building PyTorch in Pyodide is theoretically possible; however, looking at the list of required build and run dependencies in conda-forge, it would be very complex and time-consuming, and it would be CPU-only. So projects like ONNX.js would be more appropriate, because there you would also be able to use the GPU via WebGL. You can also very well do some pre-processing of your data or post-processing of the predictions in Pyodide, with the main model being run in ONNX.js.

Happy to keep this issue open for future reference, but it's unlikely we would be able to do much about it on the Pyodide side.
ONNX doesn't work for us. ONNX.js does not support all of PyTorch's operators and, most importantly, it does not support some custom functions used in our PyTorch model. It also looks like a dead project judging by its repo (https://github.com/microsoft/onnxjs). I have already built a project with ONNX, and I feel it is not really a good solution. I hope Pyodide will have some way to support PyTorch, so I can deploy something like a custom transformer model, or even train it on GPU. In addition, this post has said that Pyodide does not support PyTorch because of Pyodide itself. Again, I believe that if it supports PyTorch, it will shine like React.js, because PyTorch is easier to use, and deployment is what PyTorch currently lacks.
Yes, I forgot that I previously answered this question in https://stackoverflow.com/a/64407516/1791279, and there is indeed a blocker on ctypes / CFFI support (#728).
ONNX.js is indeed outdated; however, there is apparently a WASM backend for onnxruntime directly: microsoft/onnxjs#292 (comment). Have you tried raising that issue with them? For classical language models with PyTorch it works quite well (at least outside of WASM). I understand the interest; it's just that doing this is a very large project and we have fairly limited resources.
ONNX Runtime Web, the successor of ONNX.js, was released in ONNX Runtime v1.8.0 (2021-06-02). It compiles the whole ONNX Runtime to WebAssembly and supports all operators in the latest ONNX opset. Please check it out at https://github.com/microsoft/onnxruntime/tree/master/js/web.
@JonathanSum You could try Google's TensorFlow.js or Alibaba's MNN.js for inference; both can make use of the GPU. Pyodide is suitable for pre- and post-processing rather than inference.
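To illustrate the pre-/post-processing split suggested above, here is a minimal sketch: the heavy model runs in the JS runtime (ONNX Runtime Web, TensorFlow.js, etc.), and Pyodide only turns the raw logits it returns into probabilities and labels. Pure NumPy; the logit values are made-up placeholders:

```python
# Sketch: post-process raw logits coming back from a model that ran
# in a JS inference runtime, using only NumPy inside Pyodide.
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # Subtract the per-row max for numerical stability.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

# Hypothetical logits for a batch of two 3-class predictions.
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.0]])
probs = softmax(logits)
labels = probs.argmax(axis=-1)
print(labels)  # -> [0 1]
```

The same pattern works in the other direction for pre-processing (tokenizing, normalizing, reshaping) before handing the tensors to the JS side.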
Question: if the blockers for PyTorch were CFFI and ctypes support, and both are now resolved (https://github.com/pyodide/pyodide/tree/main/packages/cffi and #1656), are there any other blockers to running PyTorch via Pyodide? Is this something reasonable to attempt?
@lsb I once tried a naive build of PyTorch for Pyodide with most options disabled, but it was unsuccessful at the time.
Flagging this to @seemethere @malfet |
👀 |
Does Pyodide support PyTorch?
Or does WASM support PyTorch?