[Frontend][PaddlePaddle] Add autopad for conv/pool#9295
Conversation
merge to newest code
@areusch @comaniac @AndrewZhaoLuo @mbrookhart Hi, could you help review this pull request? All the tests have passed.

I'll take a look today.
AndrewZhaoLuo left a comment
Some comments, overall LGTM though I need to maybe read the spec a bit more closely.
def _get_pad_size(in_size, dilated_kernel_size, stride_size):
    """Calculate the padding size for Conv/Pool in SAME padding mode."""
def _autopad(
Can you just use the existing helper in tvm/python/tvm/relay/frontend/onnx.py (lines 412 to 472 in d0c6ca5)?
- Refactor _autopad in the onnx.py file to tvm/python/tvm/relay/frontend/common.py
Good advice. I think this function also works for TensorFlow and TFLite to solve the dynamic shape problem. shape_of and autopad have both been moved to common.py.
pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
dilations = [1, 1]
do we mean to override the dilations?
This is a historical issue in the Paddle framework: when padding == SAME, it forces dilations = 1.
Here is the implementation code: https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/fluid/operators/conv_op.h#L113
To avoid confusion, I added a comment on this line of code to explain this behavior.
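The snippet under discussion can be exercised numerically. The sketch below is a hypothetical, self-contained reconstruction (not the PR's actual code): the padding is computed from the dilated kernel extent (k - 1) * d + 1, the result is packed in [top, left, bottom, right] order, and dilations is then reset to [1, 1] to mirror Paddle's SAME-mode behavior noted in the comment above:

```python
import math

def _get_pad_size(in_size, dilated_kernel_size, stride_size):
    # SAME mode: output size is ceil(in_size / stride_size)
    out_size = math.ceil(in_size / stride_size)
    total = max((out_size - 1) * stride_size + dilated_kernel_size - in_size, 0)
    return [total // 2, total - total // 2]

# Example values (assumed for illustration)
in_h, in_w = 224, 224
k_h, k_w = 3, 3
strides = [1, 1]
dilations = [2, 2]

# Effective (dilated) kernel extent is (k - 1) * dilation + 1
pad_h = _get_pad_size(in_h, (k_h - 1) * dilations[0] + 1, strides[0])
pad_w = _get_pad_size(in_w, (k_w - 1) * dilations[1] + 1, strides[1])
# Layout matches the diff: [top, left, bottom, right]
paddings = [pad_h[0], pad_w[0], pad_h[1], pad_w[1]]
# Paddle forces dilations back to 1 in SAME mode (see the linked conv_op.h)
dilations = [1, 1]
print(paddings)  # → [2, 2, 2, 2]
```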
Refactor autopad in the onnx.py and paddlepaddle.py to relay/frontend…
* Add autopad for conv/pool * add autopad for conv/pool * fix pylint warning * add some annotations * add some annotations * add some annotations * Refactor autopad in the onnx.py and paddlepaddle.py to relay/frontend/common.py * add comment for conv2d Co-authored-by: heliqi <1101791222@qq.com>
This will solve the problem with dynamic input shape while padding == SAME in conv2d/pool2d. It also includes pad3d and squeeze; these two operators will be used for padding the tensor. The _autopad function refers to the ONNX frontend. @mbrookhart