
Support for ai.onnx.ml opset #156

Open
redseal798 opened this issue Apr 2, 2023 · 12 comments

Comments

@redseal798

Is your feature request related to a problem? Please describe.

Support ONNX ml operators: https://github.com/onnx/onnx/blob/main/docs/Operators-ml.md

Describe the solution you'd like

Models that use ONNX ML operators should work, for example ONNX models created from sklearn pipelines.

Describe alternatives you've considered

NA

Additional context

@pixelspark
Collaborator

While we encourage anyone interested to work on this (and I'd be happy to review PRs), it is unfortunately not currently in scope, as it is quite a bit of work.

I'm not sure our architecture fits this either. The ML operators appear to be quite a bit more high-level than the ONNX operators we currently implement, and it may not be feasible or practical to implement them in the current architecture, where each operator is translated 1:1 to a GPU shader and the shaders are executed in a predefined order with no room for CPU work in between.

@redseal798
Author

Thanks for the feedback!

@SludgePhD
Contributor

FWIW, tf2onnx seems to add an ai.onnx.ml opset import regardless of whether those operators are actually used by the network; I've had to manually patch the generated networks to remove the import to get them to work. It's possible that other frameworks have the same problem.

@schell

schell commented Jul 12, 2023

I've had to manually patch the generated networks to remove the import to get them to work.

@SludgePhD what method did you use to do this?

@SludgePhD
Contributor

@schell There is no good tooling for this (in fact, as far as I can tell, there is no official tooling for manipulating ONNX files at all), so I had to write a Python script:

import onnx

original_model = onnx.load(model_path)

if len(original_model.opset_import) == 2:
    print("opset imports before:", original_model.opset_import)
    original_model.opset_import.pop()  # assumes the ai.onnx.ml import is the last entry
    print("opset imports after:", original_model.opset_import)

onnx.save(original_model, model_path)

(it's not very robust)
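For what it's worth, a slightly more defensive variant of the same idea (a sketch, not something from this thread): instead of assuming the ai.onnx.ml import sits at a fixed position, filter the opset imports by domain. `strip_ml_import` is a hypothetical helper; it only relies on `opset_import` behaving like a mutable sequence of entries carrying a `.domain` string, which is what the ONNX protobuf exposes.

```python
def strip_ml_import(model):
    """Remove the ai.onnx.ml opset import, regardless of its position.

    `model` is expected to be an onnx.ModelProto; its `opset_import` field
    is a repeated protobuf message whose entries have a `.domain` string.
    Returns True if an import was removed.
    """
    kept = [op for op in model.opset_import if op.domain != "ai.onnx.ml"]
    removed = len(kept) != len(model.opset_import)
    if removed:
        # repeated protobuf fields cannot be reassigned, only mutated in place
        del model.opset_import[:]
        model.opset_import.extend(kept)
    return removed

# Usage (with the onnx package installed):
#   model = onnx.load(model_path)
#   if strip_ml_import(model):
#       onnx.save(model, model_path)
```

Note that this still does not check whether the graph actually uses ai.onnx.ml ops before removing the import, so it has the same caveat as the script above.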

@schell

schell commented Jul 16, 2023

Thank you nonetheless, @SludgePhD :)

@pixelspark
Collaborator

I am not sure from the above whether wonnx throws an error when just the import is present. If so, we could make wonnx a bit more lenient and have it ignore the import as long as no ops from that set are used (unknown ops lead to an error anyway, even ones from the 'basic' set).

@schell

schell commented Jul 21, 2023

That would be nice @pixelspark :)

@schell

schell commented Sep 26, 2023

@pixelspark how would one figure out whether any ai.onnx.ml ops are used in a model? I'm currently writing a CLI tool that would do this and remove the opset import if possible.

@schell

schell commented Sep 26, 2023

@pixelspark sorry, never mind - I figured it out from the wonnx source and the protobuf definitions :) 👍

@Mek101

Mek101 commented Dec 11, 2023

I'm currently writing a cli tool that would do this and remove the opset if possible.

Did that end up anywhere?

@schell

schell commented Dec 12, 2023

Sorry @Mek101 - no. It was part of my day job and just a quick spike, so I didn't push the code anywhere. But it did end up working: the CLI tool traverses the graph looking for uses of the opset, and if none are found it removes the import and saves the graph.
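The check schell describes can be sketched in a few lines. This is not the actual (unpublished) tool, just a hypothetical reconstruction; it assumes the model exposes the standard ONNX protobuf fields `graph.node`, where each node carries a `.domain` string:

```python
def can_drop_ml_import(model):
    """Return True if no top-level graph node uses the ai.onnx.ml domain.

    Caveat: this only inspects top-level nodes; a thorough tool would also
    need to recurse into subgraphs held in If/Loop/Scan node attributes.
    """
    return all(node.domain != "ai.onnx.ml" for node in model.graph.node)

# Usage (with the onnx package installed):
#   model = onnx.load(model_path)
#   if can_drop_ml_import(model):
#       # safe to remove the ai.onnx.ml entry from model.opset_import
#       # and onnx.save(model, model_path)
#       ...
```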
