autogenerate operator status #44
How about we restructure the overview so it can be autogenerated?
@alrevuelta with our current approach (no weak symbols) we may need to generate all onnx operators for this to work.
This makes me think of something we have been avoiding from the beginning. We are currently testing the operators using the onnx "test vectors". However, these "test vectors" don't cover all data types, only a single one (typically float, as far as I have seen). So let's say we implement an operator for a type that the onnx backend is not testing. To me, an operator that is not tested is not implemented. With this statement I'm saying that we should consider an operator implemented only if a set of test cases for that operator is passing. So, first of all, I think we should find a way to get one test vector for each type. As a first idea, we could reuse the onnx testing backend. Secondly, once we have the testcases for each data type, run them and mark with ✔ the ones that are passing.
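The "run the per-type testcases and mark with ✔ the ones that are passing" idea could be tracked with a small table, something like the sketch below. The `<op>_<dtype>` test naming scheme and the dtype list are assumptions for illustration, not the actual onnx conventions:

```python
# Hypothetical sketch: build a per-operator, per-dtype status table from
# the names of passing test cases. The "<op>_<dtype>" naming scheme and
# the DTYPES list are illustrative assumptions.

DTYPES = ["float32", "float64", "int32", "int64"]

def coverage_table(passing, all_ops):
    """Map each operator to {dtype: '✔' or '✘'} based on passing test names."""
    table = {op: {dt: "✘" for dt in DTYPES} for op in all_ops}
    for name in passing:
        op, _, dtype = name.rpartition("_")  # split "Add_float32" -> ("Add", "float32")
        if op in table and dtype in DTYPES:
            table[op][dtype] = "✔"
    return table

status = coverage_table(["Add_float32", "Add_int32", "Relu_float32"],
                        ["Add", "Relu"])
```

A table like this could then be rendered directly into the autogenerated overview.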
Any thoughts on this? As I previously stated, I don't think the default test vectors that onnx provides are sufficient for us. As I already suggested, I think we can sort of reuse them and convert each one to the types that we need.

Quick example: let's say we want to test a given operator. Using the magic of what you have already used, we can programmatically access the input types that each operator has:

```python
all_schemas = [s for s in onnx_cpp2py_export.defs.get_all_schemas_with_history()]
```

Continuing with that example, with something like this we could say that a given operator is implemented if the corresponding testcase(s) are passing. Some thoughts:
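Converting the existing float test vectors into the other types could look roughly like the following numpy sketch. The target dtype list is an illustrative assumption (the real list would come from each operator's type constraints in the schemas), and a naive cast would still need per-type range and precision handling:

```python
import numpy as np

# Illustrative target types only; the real set would be read from each
# operator's type constraints in the onnx schemas.
TARGET_DTYPES = [np.float64, np.int32, np.int64, np.uint8]

def derive_test_vectors(float_inputs):
    """Cast float32 test vectors to each target dtype.

    Note: plain astype() truncates floats toward zero and can overflow
    small types, so a real converter would need per-type adjustments.
    """
    vectors = {}
    for dtype in TARGET_DTYPES:
        vectors[np.dtype(dtype).name] = [x.astype(dtype) for x in float_inputs]
    return vectors

inputs = [np.array([1.5, -2.0, 3.0], dtype=np.float32)]
vectors = derive_test_vectors(inputs)
```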
I ran some "statistics" on the operators: among all 321 operators/versions, a total of 260 could be easily autogenerated (because they match a. and b. above). I'm bringing this up because, as I said, I think the way we can track whether an operator is implemented or not is by looking at the testcases, and so far our testing strategy lacks some things. The main decision I think we need to take is to:
I would go with option 2, but would like to discuss it with you.
@alrevuelta
This will produce a lot of tests and generate a lot of data without producing a massive number of files.
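Generating many cases in memory rather than as committed files could be sketched like this (the operator list, dtype list, and naming scheme are illustrative assumptions):

```python
# Sketch: expand a small hand-written spec into many concrete test cases
# at generation time, instead of committing one file per case.
from itertools import product

OPS = ["Add", "Mul", "Relu"]            # illustrative subset of operators
DTYPES = ["float32", "int32", "int64"]  # illustrative subset of types
SEEDS = range(3)                        # several input variants per combination

def generate_cases():
    """Yield one (name, op, dtype, seed) tuple per generated test case."""
    for op, dtype, seed in product(OPS, DTYPES, SEEDS):
        yield (f"test_{op}_{dtype}_{seed}", op, dtype, seed)

cases = list(generate_cases())
```

Three operators, three dtypes, and three seeds already expand to 27 cases from a handful of lines, which is the appeal of this approach.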
Agree
Can you show where this random float generation is done? What I have seen so far is not randomly generated (example). It's nice to autogenerate as much as possible, but I think it is important to have some "manual" work when writing the testcases, so we can take into account the different particularities of each operator or type. So not just generate some float values and convert them to other types, but also try to find some edge cases.
We are lucky that onnx is already implemented and working, so there is no need to use numpy. We can just use the onnx runtime to calculate the expected values.
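Using onnxruntime as the reference to produce expected outputs could look like this sketch (it assumes onnxruntime is installed; the model path and input names are placeholders):

```python
def feed_dict(input_names, arrays):
    """Pair graph input names with concrete input arrays for sess.run()."""
    return dict(zip(input_names, arrays))

def compute_expected(model_path, feeds):
    """Run a model through onnxruntime and return its outputs, to be saved
    as the expected values of our generated testcases.

    Assumes onnxruntime is installed; imported lazily so the rest of the
    generator works without it.
    """
    import onnxruntime as ort
    sess = ort.InferenceSession(model_path)
    return sess.run(None, feeds)  # None -> return all model outputs
```

For example, `compute_expected("add.onnx", feed_dict(["x", "y"], [a, b]))` would give the reference result for an Add testcase without us reimplementing the operator in numpy.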
It would be nice to autogenerate the operator overview