[RELAY][FRONTEND] ONNX to Relay frontend #2302
Conversation
@zhreshold I had some patches on top of this change. You can apply them.
fea4c06 to c045be0
@srkreddy1238 @nishi-t @Huyuwei @hlu1 please help review this PR
srkreddy1238 left a comment:
Initial review. Will revisit after the CI passes.
    'dilations': ('dilation', (0, 0)),
    'pads': ('padding', (0, 0), revert_caffe2_pad),
    'group': ('groups', 1)},
custom_check=dimension_constraint())(inputs, attr, params)
I think we shouldn't pass 3 inputs (if bias is available).

Will strip to the first 2 inputs.
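The transforms mapping quoted above renames ONNX attribute names to Relay ones, with a default value and an optional per-value transform function. A minimal, self-contained sketch of that idea (this is illustrative only, not TVM's actual AttrCvt class; `convert_attrs` and `halve` are hypothetical names):

```python
def convert_attrs(onnx_attrs, transforms):
    """Rename ONNX attributes to Relay names, applying defaults and
    optional per-value transform functions (illustrative sketch)."""
    relay_attrs = {}
    for onnx_name, spec in transforms.items():
        relay_name, default = spec[0], spec[1]
        transform = spec[2] if len(spec) > 2 else (lambda v: v)
        relay_attrs[relay_name] = transform(onnx_attrs.get(onnx_name, default))
    return relay_attrs

# Toy transform standing in for revert_caffe2_pad in the diff above
halve = lambda pads: tuple(p // 2 for p in pads)
attrs = convert_attrs(
    {'dilations': (1, 1), 'pads': (2, 2)},         # 'group' is absent
    {'dilations': ('dilation', (0, 0)),
     'pads': ('padding', (0, 0), halve),
     'group': ('groups', 1)})
# attrs == {'dilation': (1, 1), 'padding': (1, 1), 'groups': 1}
```

Missing attributes fall back to their declared defaults, which is how the converter tolerates ONNX models that omit optional attributes.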
    },
    disables=['output_shape'],
    extras={'use_bias': len(inputs) == 3},
    custom_check=dimension_constraint())(inputs, attr, params)
return AttrCvt('upsampling')(inputs, attr)

class Shape(OnnxOpConverter):
The Shape op is a workaround in NNVM. We may need to revisit it in Relay.
It can be disabled here and handled outside this PR.
'LRN': LRN.get_converter(opset),

# defs/reduction
#'ReduceMax': AttrCvt('max', transforms={'axes': 'axis'}),
Did you forget to delete the comment line above?

Yes, will clean up the comments.
@srkreddy1238 @nishi-t Can you guys help review again?
jroesch left a comment:
Added a bunch of comments on the PR, mostly issues with documentation; otherwise, if tests pass, looks good.
return default


def get_relay_op(op_name):
    """Get the callable function from relay based on opname:
Suggested change:
- """Get the callable function from relay based on opname:
+ """Get the callable function from Relay based on operator name.
Parameters
----------
op_name : str
    The relay operator name.
Suggested change:
- The relay operator name.
+ The Relay operator name.
return out_shapes


def infer_channels(inputs, transpose=False):
    """A hack for getting 'channels' or 'units' since caffe2 don't provide
Suggested change:
- """A hack for getting 'channels' or 'units' since caffe2 don't provide
+ """A hack for getting 'channels' or 'units' since caffe2 does not provide
@@ -0,0 +1,1077 @@
# pylint: disable=invalid-name, import-self, len-as-condition, unused-argument, too-many-lines
"""ONNX: Open Neural Network Exchange frontend for relay."""
Suggested change:
- """ONNX: Open Neural Network Exchange frontend for relay."""
+ """ONNX: Open Neural Network Exchange frontend for Relay."""
return _impl


def revert_caffe2_pad(pads):
    """Caffe2 require two times the normal padding."""
Suggested change:
- """Caffe2 require two times the normal padding."""
+ """Caffe2 requires two times the normal padding."""
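As the corrected docstring says, the Caffe2-style attribute carries twice the padding values (ONNX stores begin and end pads per spatial axis). A hedged sketch of what such a helper might do — this mirrors the idea behind `revert_caffe2_pad`, not necessarily TVM's exact implementation:

```python
def revert_caffe2_pad(pads):
    """Sketch: ONNX pads list symmetric padding twice (begin..., end...),
    while the caffe2-style convention needs each value only once."""
    if len(pads) == 4:
        # keep only the leading (begin) half for a 2-D spatial layout
        return pads[:2]
    if len(pads) == 2:
        return pads
    raise ValueError("Invalid caffe2-style padding: {}".format(pads))

# revert_caffe2_pad((1, 1, 1, 1)) -> (1, 1)
```

This only makes sense for symmetric padding; asymmetric ONNX pads would need a different lowering.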
              inputs,
              attrs,
              opset):
    """Convert from onnx operator to relay operator.
Suggested change:
- """Convert from onnx operator to relay operator.
+ """Convert ONNX operator into a Relay operator.
def from_onnx(model,
              shape=None,
              dtype="float32"):
    """Convert from ONNX"s model into compatible relay Function.
Suggested change:
- """Convert from ONNX"s model into compatible relay Function.
+ """Convert an ONNX model into an equivalent Relay function.
              shape=None,
              dtype="float32"):
    """Convert from ONNX"s model into compatible relay Function.
    Onnx graph is a python protobuf object. The companion parameters will be handled automatically.
Suggested change:
- Onnx graph is a python protobuf object. The companion parameters will be handled automatically.
+ ONNX graphs are represented as a Python Protobuf object. The companion parameters will be handled automatically.
This article is an introductory tutorial to deploy ONNX models with Relay.

For us to begin with, onnx module is required to be installed.
Suggested change:
- For us to begin with, onnx module is required to be installed.
+ To begin with, the ONNX package must be installed.
with relay.build_config(opt_level=1):
    graph, lib, params = relay.build(sym, target, params=params)

######################################################################
Can we maybe just show how to use the create_executor interface in the tutorial as well? It's much simpler than building a graph runtime by hand.
@jroesch Can you please advise how to elegantly evaluate an executor created with an unordered dict of weights?

exec = relay.build_module.create_executor('graph', mod, tvm.cpu(0), 'llvm')
# params is a dict of {'name': tvm.ndarray}
tvm_output = exec.evaluate(mode)(x, *params.values()).asnumpy()  # wrong order

How do I feed the executor the correct weights without changing the params dict to an odict?
@zhreshold I realize the current API does not actually support keyword-argument style for parameters.
I'm going to update it with a PR right now; it should work like this:

exec = relay.build_module.create_executor('graph', mod, tvm.cpu(0), 'llvm')
tvm_output = exec.evaluate(mode)(x, **params).asnumpy()

Does that seem elegant enough?
Yes, that would perfectly solve my current problem.

Will update once the PR gets merged.
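The fix discussed above works because Python keyword expansion binds values by parameter name, so the dict's iteration order stops mattering. A toy stand-in demonstrating why `**params` is safe where `*params.values()` is not (`fake_executor` and its parameter names are hypothetical, not the TVM API):

```python
def fake_executor(data, conv_weight=None, conv_bias=None):
    # Stand-in for a compiled Relay function taking named parameters.
    return (data, conv_weight, conv_bias)

# Dict insertion order does not match the signature order...
params = {'conv_bias': 'b', 'conv_weight': 'w'}

# ...positional expansion feeds the weights into the wrong slots:
wrong = fake_executor('x', *params.values())   # ('x', 'b', 'w')

# ...keyword expansion binds each value by name, regardless of order:
right = fake_executor('x', **params)           # ('x', 'w', 'b')
```

This is why the proposed `evaluate(...)(x, **params)` API removes the need to convert `params` to an ordered dict.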
Co-authored-by: Siju Samuel <siju.samuel@huawei.com>
def _impl_v1(cls, inputs, attr, params):
    # Result of this operator is prominently used by reshape operator.
    # Just pass the input as it is so that reshape_like can be used there.
    print("Shape: Differently implemented in relay as a bypass (dummy operator)")
Use logging.warning instead of print.
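A sketch of the suggested change: route the message through Python's logging module instead of a bare print, so downstream users can filter or silence it. The surrounding function body and logger name are illustrative, not the final PR code:

```python
import logging

logger = logging.getLogger("onnx_frontend")  # hypothetical logger name

def shape_impl(inputs):
    # Pass the input through so reshape_like can consume it later,
    # warning via the logging module instead of print.
    logger.warning(
        "Shape: differently implemented in Relay as a bypass (dummy operator)")
    return inputs[0]
```

Unlike print, logging.warning output carries a severity level and can be redirected by whatever handlers the embedding application configures.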
@srkreddy1238 @jroesch @nishi-t please help review this PR again
@zhreshold Should we put the tests under https://github.com/dmlc/tvm/tree/master/tests/python/frontend, and add it to CI https://github.com/dmlc/tvm/blob/master/tests/scripts/task_python_frontend.sh/#L30-L34?
@Huyuwei Seems like there is
@zhreshold tests/python/frontend is the right place to hold frontend tests; tests/python/relay/frontend is duplicated and should be removed. In fact, tests/python/relay/frontend is not run in the CI.
@Huyuwei I'll remove the duplicate tests in a separate PR.
OK, working on that.
ONNX -> Relay frontend migration.
This is co-authored by @siju-samuel, with an initial attempt in #2245.
relay/frontend common module.