[DOCS] Introduction to Relay IR #2185
Merged
tqchen merged 5 commits into apache:master on Nov 29, 2018
Conversation
Member (Author)

cc @szha
944c947 to 7c64e66
masahi reviewed Nov 28, 2018
framework developer choose the representation they are familiar with.
This does, however, have some implications on how we write passes:

- If you come from a data-flow background and want to handle let, keep a map of var to the expressions so you can perform lookup when encountering a var. This is a likely means a minimum change as we already need a map from expr -> transformed expression anyway. Note that this will effectively remove all the let in the program.
Member
This is a likely means -> This likely means
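The pass strategy in the quoted hunk — keep a map from let-bound vars to their transformed expressions and substitute on lookup — can be sketched with toy classes. This is an illustrative sketch with hypothetical names (`Var`, `Call`, `Let`, `eliminate_let`), not Relay's actual AST or API:

```python
# Minimal sketch of the strategy above: while transforming, keep a map from
# each let-bound var to its (transformed) value expression and substitute it
# wherever the var appears. The result contains no Let nodes.
from dataclasses import dataclass


@dataclass(frozen=True)
class Var:
    name: str


@dataclass(frozen=True)
class Call:
    op: str
    args: tuple


@dataclass(frozen=True)
class Let:
    var: Var
    value: object
    body: object


def eliminate_let(expr, env=None):
    """Rewrite an expression, replacing every let-bound var by its value."""
    env = env or {}
    if isinstance(expr, Var):
        return env.get(expr.name, expr)  # look up let-bound vars
    if isinstance(expr, Call):
        return Call(expr.op, tuple(eliminate_let(a, env) for a in expr.args))
    if isinstance(expr, Let):
        bound = eliminate_let(expr.value, env)
        return eliminate_let(expr.body, {**env, expr.var.name: bound})
    return expr


# let %v1 = log(%x); add(%v1, %v1)  ==>  add(log(%x), log(%x))
x = Var("x")
prog = Let(Var("v1"), Call("log", (x,)), Call("add", (Var("v1"), Var("v1"))))
dataflow = eliminate_let(prog)
```

Note that after the pass both arguments of the add are structurally the same expression; the sharing that the let made explicit becomes implicit in the data-flow form.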
zhiics reviewed Nov 28, 2018
The Module can be viewed as a ``Map<GlobalVar, Function>``. Here GlobalVar is just an id that is used to represent the functions
in the module. ``@muladd`` and ``@myfunc`` are GlobalVars in the above example. When a CallNode is used to call another function,
the corresponding GlobalVar is stored in the op field of the CallNode. It contains a level of indirection -- we need to look up
body of the called function from the modele using the corresponding GlobalVar. In this particular case, we could also directly

Different data structures will impact how you might write transformations, and we need to keep that in mind.
So now, as a deep learning framework developer, you might ask, why do we need let-binding.
Yours PL friends will always tell you that let is important -- as PL is a quite established field,
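The ``Map<GlobalVar, Function>`` view of the Module, and the level of indirection when a CallNode's op field holds a GlobalVar, can be sketched like this. All names here (`GlobalVar`, `Function`, `Module`, `call`) are toy stand-ins, not the real Relay classes:

```python
# Toy sketch: a Module maps GlobalVars to Functions; calling through a
# GlobalVar requires looking the function body up in the module.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class GlobalVar:
    name: str  # e.g. "@muladd"


@dataclass
class Function:
    params: list
    body: object  # a callable stands in for a real function body


@dataclass
class Module:
    functions: dict = field(default_factory=dict)  # GlobalVar -> Function

    def lookup(self, gvar):
        return self.functions[gvar]


mod = Module()
muladd = GlobalVar("@muladd")
mod.functions[muladd] = Function(["x", "y", "z"], lambda x, y, z: x * y + z)


def call(mod, gvar, *args):
    # A CallNode stores the GlobalVar in its op field; evaluation resolves
    # the indirection by looking the body up in the module.
    return mod.lookup(gvar).body(*args)
```

For example, `call(mod, muladd, 2, 3, 4)` resolves `@muladd` through the module and evaluates `2 * 3 + 4`.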
zhiics reviewed Nov 28, 2018
This article introduces Relay IR -- the second generation of NNVM.
We expect readers from two kinds of background -- those who have a programming language background and deep learning
framework developers who are familiar with the computational graph representation.
This article is mainly written for deep learning framework developers who are familiar with the computational graph representation.
zhiics reviewed Nov 28, 2018
Build Computational Graph with Relay
------------------------------------
Traditional deep learning frameworks use computational graphs as their intermediate representation.
A computational graph (or data-flow graph), is a directed acyclic graph (DAG) that represent the computation.
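The DAG structure of a computational graph can be sketched as nodes plus dependency edges, evaluated in topological order. This is a generic illustration (hypothetical `topo_order` helper), not Relay code:

```python
# Minimal sketch: a computational graph as a DAG of named nodes, where each
# node depends on its inputs, and a depth-first topological ordering that
# visits every dependency before the node that uses it.
def topo_order(outputs, deps):
    """Return nodes so every node appears after all nodes it depends on."""
    order, visited = [], set()

    def visit(node):
        if node in visited:
            return
        visited.add(node)
        for dep in deps.get(node, []):
            visit(dep)
        order.append(node)

    for node in outputs:
        visit(node)
    return order


# Two-node graph: %1 = log(%x); %2 = add(%1, %1)
deps = {"%1": ["%x"], "%2": ["%1", "%1"]}
order = topo_order(["%2"], deps)
```

Because the graph is acyclic, this ordering always exists, and shared nodes such as ``%1`` appear exactly once in it.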
zhiics reviewed Nov 28, 2018
construct a simple two-node graph. You can find that the syntax of the example is not that different from existing
computational graph IR like NNVMv1, with the only difference in terms of terminology:

- Existing frameworks usually uses graph and subgraph
zhiics reviewed Nov 28, 2018
Each data-flow node is a CallNode in Relay. The relay python DSL allows you to construct a data-flow quickly.
One thing we want to highlight in the above code -- is that we explicitly constructed an Add node with
both input points to ``%1``. When a deep learning framework evaluates the above program, it will compute
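The point in the quoted hunk — both inputs of the add point to ``%1``, and topological evaluation computes it only once — can be sketched with a memoized evaluator. This is illustrative code with made-up node tuples, not Relay's evaluator:

```python
# Sketch: evaluate a graph with a memo table so a shared node such as %1 is
# computed exactly once, even though both inputs of the add point to it.
import math

LOG_CALLS = 0  # counts how often log actually runs

graph = {
    "%x": ("const", (1.0,)),
    "%1": ("log", ("%x",)),
    "%2": ("add", ("%1", "%1")),  # both inputs point to %1
}


def evaluate(node, graph, memo):
    if node in memo:
        return memo[node]  # already computed: reuse the cached value
    op, args = graph[node]
    if op == "const":
        memo[node] = args[0]
    elif op == "log":
        global LOG_CALLS
        LOG_CALLS += 1
        memo[node] = math.log(evaluate(args[0], graph, memo))
    elif op == "add":
        memo[node] = evaluate(args[0], graph, memo) + evaluate(args[1], graph, memo)
    return memo[node]


result = evaluate("%2", graph, {})
```

The second lookup of ``%1`` hits the memo table, so ``log`` runs once; without the memo, the add would recompute it for each input.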
zhiics reviewed Nov 28, 2018
}

Let binding solves this problem, as the computation of the value happens at the let node. In both programs,
if we change ``%1 = log(%x)`` to ``let %v1 = log(%x)``, we clearly specifies the computation location to
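The claim in the quoted hunk — a let node pins down exactly where its value is computed — can be sketched with a tiny interpreter. The tuple-based AST and the `evaluate` helper are illustrative assumptions, not Relay's representation:

```python
# Sketch: in a let-based program, the value is computed eagerly at the let
# node itself, once, so the evaluation point is explicit in the program text.
import math


def evaluate(expr, env):
    kind = expr[0]
    if kind == "var":
        return env[expr[1]]
    if kind == "let":  # ("let", name, value, body)
        _, name, value, body = expr
        bound = evaluate(value, env)  # the value is computed HERE, once
        return evaluate(body, {**env, name: bound})
    if kind == "log":
        return math.log(evaluate(expr[1], env))
    if kind == "add":
        return evaluate(expr[1], env) + evaluate(expr[2], env)
    raise ValueError(kind)


# let %v1 = log(%x) in add(%v1, %v1)
prog = ("let", "%v1", ("log", ("var", "%x")),
        ("add", ("var", "%v1"), ("var", "%v1")))
result = evaluate(prog, {"%x": 1.0})
```

The body only ever sees the already-computed binding of ``%v1``, so no later pass has to decide where ``log(%x)`` should be evaluated.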
zhiics reviewed Nov 28, 2018
-- we don't need to worry about where to put the let when we generate the code. The dataflow form also gives more freedom
to the later passes to decide where to put the evaluation point. As a result, it might not be a bad idea to use data flow
form of the program in the initial phases of optimizations when you find it is convenient.
As a matter of fact, many optimizations in relay today are written to optimize dataflow programs.
zhiics approved these changes Nov 29, 2018
Member

Cool. I think the tutorial is really helpful for people to understand Relay.
masahi approved these changes Nov 29, 2018
zhiics reviewed Nov 29, 2018
Since program optimizations take these AST data structures and transform them, the two different structure will
affect the compiler code we are going to write. For example, if we want to detect a pattern ``add(log(x), y)``:

- In the data-flow form, we can first access the add node, then directly look at its first arguments to see if it is a log

One thing we want to highlight in the above code -- is that we explicitly constructed an Add node with
both input point to ``%1``. When a deep learning framework evaluates the above program, it will compute
the nodes in topological order, and ``%1`` will only be computed once.
While the this fact is very natural to deep learning framework builders, it is something that might
Member

I found two more typos after another reading.
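The pattern check described in the quoted hunk — in data-flow form, start at the add node and look directly at its first argument to see if it is a log — can be sketched as follows. The tuple-based nodes and the `matches_add_log` helper are illustrative, not Relay's CallNode API:

```python
# Sketch: detecting the pattern add(log(x), y) in data-flow form by direct
# structural inspection, with no var lookup needed.
def matches_add_log(node):
    """Return True if node has the shape add(log(x), y)."""
    if not (isinstance(node, tuple) and len(node) == 3 and node[0] == "add"):
        return False
    first_arg = node[1]
    return isinstance(first_arg, tuple) and len(first_arg) > 0 and first_arg[0] == "log"


x = ("var", "x")
y = ("var", "y")
hit = matches_add_log(("add", ("log", x), y))    # add(log(x), y)
miss = matches_add_log(("add", x, y))            # add(x, y) -- no log
```

In a let-normalized program the first argument would instead be a var, and the matcher would first have to look the var up in a binding map before inspecting it, which is exactly the difference the quoted text is pointing at.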
FrozenGene pushed a commit to FrozenGene/tvm that referenced this pull request Dec 27, 2018
wweic pushed a commit to neo-ai/tvm that referenced this pull request Feb 20, 2019

wweic pushed a commit to neo-ai/tvm that referenced this pull request Feb 20, 2019
This is introductory material on Relay IR for developers who have a background in data-flow and computational graphs. This tutorial is a result of discussions with @jroesch @yzhliu @junrushao1994 @MarisaKirisame @slyubomirsky @joshpoll and other folks. We try to blend the views from deep learning frameworks and PL.