[AutoScheduler] New layout rewrite option: Weight pre-transpose#6750
merrymercy merged 15 commits into apache:main
Conversation
Hey, thank you for the contribution! May I know the difference between layout rewrite and this weight pre-transpose? It looks like weight pre-transpose can be done at compile time, so why do we insert a stage instead? Thanks a lot!
```cpp
 * \brief Several options for applying layout rewrite.
 * This is a optimization to rewrite the shape of input tensor according to the schedule we get.
 */
enum class LayoutRewriteOption : int {
```
`enum class LayoutRewriteOption : uint8` should be enough.
This keeps the weight shape the same as before. However, I am also curious what benefit this will bring.
The code looks good to me, but the naming is not intuitive. For example, the "RewriteWithPlaceholder" option in your code actually accepts "PreTransposed" tensors as inputs.

```cpp
/*!
 * \brief Options for applying layout rewrite.
 * This is an optimization to rewrite the layout of input tensors according to the schedule we get.
 */
enum class LayoutRewriteOption : int {
  /*! \brief Do not process layout rewrite. */
  NoRewrite = 0,
  /*! \brief Insert layout transformation stages for input placeholders in the compute DAG */
  InsertTransformStage = 1,
  /*!
   * \brief Do not insert layout transformation stages and assume the input placeholders
   * are pre-transformed.
   * \note The lowered function with this option does not accept the original input shapes,
   * so this option must be used along with a layout conversion pass in Relay.
   */
  RewriteForPreTransformed = 2,
};
```

In addition, I prefer "transform" over "transpose" because we can support other kinds of rewriting besides the current simple "transpose".
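To make the semantics of the three options concrete, here is a conceptual sketch in plain NumPy (not TVM code; shapes and function names are made up for illustration). The key difference is who performs the layout transformation of the weight for a matmul `C = A @ W` whose schedule prefers a transposed weight layout:

```python
import numpy as np

def matmul_no_rewrite(A, W):
    # NoRewrite: compute directly on the original weight layout.
    return A @ W

def matmul_insert_transform_stage(A, W):
    # InsertTransformStage: the function still takes the original weight
    # and transforms it internally as an extra (inserted) stage.
    W_t = W.T                # the inserted layout-transform stage
    return (W_t @ A.T).T     # compute written against the new layout

def matmul_pre_transformed(A, W_t):
    # RewriteForPreTransformed: the function assumes the caller (e.g. a
    # layout-conversion pass in Relay) already supplies the transformed
    # weight, so it no longer accepts the original weight shape.
    return (W_t @ A.T).T

# All three produce the same result when fed consistent inputs.
A = np.random.rand(4, 8)
W = np.random.rand(8, 16)
assert np.allclose(matmul_no_rewrite(A, W),
                   matmul_insert_transform_stage(A, W))
assert np.allclose(matmul_no_rewrite(A, W),
                   matmul_pre_transformed(A, W.T))
```

This is why the lowered function under `RewriteForPreTransformed` cannot be called with the original weight: its expected input shape has already changed.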
@junrushao1994 @FrozenGene Thanks!
comaniac left a comment:
I don't have other comments. Thanks!
@jcf94 Please fix the test cases.
The problem is not about the random seed. It is because of the condition of
Emm ... that may also be a problem. I think these float operations should in theory be exactly the same, even under a very strict precision. At least the schedules generated with the fixed random seed 0 can pass these checks.
@merrymercy @jcf94 The tests added by this PR seem flaky. Please see https://ci.tlcpack.ai/blue/organizations/jenkins/tvm/detail/main/124/pipeline
It turns out that #6828 didn't really disable the flaky tests. It simply commented out the function call in
#6841 filed.
…he#6750)
* Add pre transpose support for layout rewrite
* Update
* Bug fix
* Bug fix
* Update
* Bug fix
* CI Fix
* Update
* Update
* Re-trigger CI
* Update
* Update test_auto_scheduler_layout_rewrite.py
* Update test_auto_scheduler_layout_rewrite.py
* Update task_scheduler ut, re-trigger CI

Co-authored-by: Lianmin Zheng <lianminzheng@gmail.com>
In Ansor, we have an optimization called "layout rewrite", which modifies the input weight of a specific op according to its schedule to get better performance; a previous PR on this is #6297.
This PR adds another option for this feature: besides directly modifying the input placeholder, we now support inserting a transpose stage between the placeholder and the compute op.
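The practical difference between the two options can be sketched in plain NumPy (a hedged illustration with made-up names and shapes, not the TVM implementation; the real rewrite operates on the auto-scheduler's compute DAG): with an inserted transform stage the transpose runs on every call, while with a pre-transformed weight the transpose happens once, e.g. at compile time via a Relay layout-conversion pass.

```python
import numpy as np

def lowered_kernel(A, W_transformed):
    # Hypothetical kernel written against the rewritten (transposed)
    # weight layout; equals A @ W when W_transformed == W.T.
    return (W_transformed @ A.T).T

W = np.random.rand(8, 16)  # weight constant in its original layout

# Option 1: inserted transform stage -- the transpose runs on every call.
def run_with_transform_stage(A):
    return lowered_kernel(A, W.T)

# Option 2: pre-transform once (as a compile-time layout-conversion pass
# would), so every subsequent call skips the transpose.
W_pre = W.T
def run_pre_transformed(A):
    return lowered_kernel(A, W_pre)

A = np.random.rand(4, 8)
assert np.allclose(run_with_transform_stage(A), A @ W)
assert np.allclose(run_pre_transformed(A), A @ W)
```

Since weights are constants during inference, option 2 pays the transformation cost once instead of per invocation, at the price of requiring the caller to supply the new weight shape.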
Others:
cc @merrymercy @comaniac @minminsun