
# Shape Inference
Shape inference as discussed here is considered a specific instance of type inference for `ShapedType`. Type constraints are along (at least) three axes: 1) elemental type, 2) rank (including static or dynamic), 3) dimensions. While some operations have no compile-time fixed shape (e.g., output shape is dictated by data), we could still have some knowledge of constraints/bounds on the shape.
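To make this concrete, here is a minimal sketch (plain Python for illustration, not MLIR API) of how partial shape knowledge can be merged: a dynamic dimension is represented as `None` and is refined against a known dimension where possible.

```python
def merge_dims(a, b):
    """Merge two dimensions, where None means dynamic/unknown."""
    if a is None:
        return b
    if b is None or a == b:
        return a
    raise ValueError(f"incompatible dimensions: {a} vs {b}")

def merge_shapes(lhs, rhs):
    """Refine two ranked shapes of equal rank, e.g. for an elementwise op
    whose operands and result must all have the same shape."""
    if len(lhs) != len(rhs):
        raise ValueError("rank mismatch")
    return [merge_dims(a, b) for a, b in zip(lhs, rhs)]

# A dynamic dimension is refined by a static one:
print(merge_shapes([16, None], [None, 4]))  # [16, 4]
```

The same merge can run at compile time (refining `tensor<16x?xf32>` against `tensor<?x4xf32>`) or at runtime, which is the dual-use property the rest of this document asks of shape functions.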
`InferShapedTypeOpInterface` is used to implement the shape and element type inference. The return type can often be deduced from the deduced return shape and elemental type, and so type inference functions can be derived from this interface where applicable.
## Shape functions
The C++ interfaces are the base mechanism whereby shape inference is queried and executed, but they are not the intended way to specify shape constraints in general.
Initially the shape inference will be declaratively specified using:

*   Constraints on the operands of an operation directly (e.g., constraining the input to be of tensor/vector type, or requiring a specific elemental type).
*   Constraints across operands and results of an operation. For example, specifying equality constraints on the type, or on constituents of a type (shape and elemental type), between operands and results (e.g., the output type of an add is the same as those of the input operands).
NOTE: The C++ shape functions are an intermediate step until the shape dialect is more full-fledged, at which point the C++ functions should become the exceptional case.
## Testing

Shape inference is currently tested alongside type inference by `TestReturnTypeDriver` in the test dialect.
## Shape dialect
This section details the shape type inference dialect (`shape`). The initial focus will be on shape functions that can be used by both runtime and compiler (e.g., for construction of ops, refinement of shapes, and reification of dynamic allocations).
This will focus on the shape functions (e.g., determining the rank and dimensions of the output shape). As shown in the shaped container type, shape will be one of three components, the others being elemental type and attribute.
* Not all shape functions need to provide all the information (e.g., one could define a shape function that only populates the rank but not the dimension sizes).
An argument could be made that these are metadata functions instead of shape functions, with some considering shape and elemental types different and some considering them both part of shape. But `shape function` is IMHO descriptive, while metadata can span too large a range of potential uses/values.
### Requirements

The requirements for the shape inference functions are determined by the requirements of shape inference, but we believe the requirements below still allow freedom to consider different shape inference approaches, and so we do not impose a particular shape inference approach here.
#### Shape inference functions
* **Expressiveness**: shape functions need to support programs where tensors have shapes that are not known statically (e.g., `tensor<16x?xf32>` or `tensor<*xf32>`);
* **Shape error detection**: Many operations will have constraints on their operands; if the constraints are not satisfied, or it cannot be determined whether they are satisfied, an error should be reported.
  * This also aligns with the requirement that the shape function description be usable by both compiler and runtime.
  * Shape error messages should be easy to understand, conveying at least which constraint of the operation is violated. This also requires that error messages be configurable by the author of the shape function (e.g., the author would be able to give a semantic, operation-specific error).
  * Checks that static analysis can prove will always pass should (modulo [Inlining shape checking](#inline)) be elided.
* Shape functions usable by compiler and runtime.
  * This does not mean the exact same C++ function, but rather that the shape function description should be consumable by both.
  * The shape function description should not be constrained by either the runtime's or the compiler's type system: e.g., if a runtime only supports ranked shapes, then it need not consider a more generic shape lattice even though the shape description supports it.
* Shape inference functions are expressible at runtime.
  * The user can define a shape function for a new operation dynamically at runtime; this allows vendors to describe an operation and its shape function dynamically.
* Doesn't require graph-wide shape information (e.g., only requires local information about the operation).
  * Shape functions should be cheap to invoke on each kernel launch.
  * A shape function can be dictated by its arguments (operands, attributes and regions) only, i.e., the same set of inputs the operation could get at runtime.
  * Shape information that needs higher-level/graph information should use richer types (e.g., `TensorList<F32>`).
* Shape functions should be pure functions.
* Should support ops whose shape is only known dynamically (e.g., a `read_from_file` op) without having to invoke the op twice: once for determining the shape and then again to actually consume the output.
* The shape function operation dialect should be interoperable with ops from non-shape-function dialects.
* The shape function should be expandable such that symbolic equality and upper-bound constraints of shaped values can be expressed during shape inference.
  * E.g., the shape functions may contain more information that is only useful when used from shape inference.
* Shape functions are allowed to fail and report an error. The error reporting should be optional, as some invocations may be speculative and the caller may not want an error emitted.
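As an illustrative sketch (hypothetical Python, not any MLIR API), a shape function can produce a descriptive, operation-specific error while leaving it to the caller to decide whether that error is fatal:

```python
def matmul_shape(lhs, rhs):
    """Shape function for a 2-D matmul; None denotes a dynamic dimension.
    Returns (shape, error): on failure, shape is None and error explains
    which constraint of the operation was violated."""
    if len(lhs) != 2 or len(rhs) != 2:
        return None, "matmul operands must be rank-2"
    a, b = lhs
    c, d = rhs
    # The inner dimensions must agree when both are statically known.
    if b is not None and c is not None and b != c:
        return None, f"inner dimensions must match: {b} vs {c}"
    return [a, d], None

print(matmul_shape([4, 8], [8, None]))  # ([4, None], None)
print(matmul_shape([4, 8], [7, 3]))     # error: inner dimensions mismatch
```

Returning the error rather than raising it is one way to keep error emission under the invoker's control, as required above.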
Non-goals:

1.  The shape dialect is an IR representation and not a programming language;
    *   While the shape functions should be readable, they do not carry the conveniences of a programming language.
1.  Describe the shape inference approach that will use the shape functions;
    *   The goal is that the shape functions, and the constraints one could derive from them, are general enough to serve approaches ranging from the simplistic (e.g., only fully static information is used for shape output, unranked for everything else) to the advanced.
1.  Describe the approach whereby error messages will be generated;
    *   While the shape functions will be able to emit errors optionally, it should be possible to dictate when errors are emitted. It has been shown in the literature that the iteration order for shape inference affects the quality of the error message produced, and the shape functions do not mandate a particular iteration order.
1.  Flow-sensitive shape functions;
    *   To enable scalable/cheap shape inference, the shape functions do not intend to support flow-sensitive shape inference. Such a facility could be built as a higher-level analysis that reuses the shape functions/constraints.
#### Inline shape inference checks {#inline}
Shape functions should be lowerable to runtime checks for validity. E.g., verify as much as possible statically, but enable generating instructions to compute the shape dynamically and/or fall back to runtime checks for properties not verifiable at compile time. The checks inserted should ideally cover only what could not be verified statically, and shape inference should not insert constructs that interfere with optimization patterns/passes.
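A minimal sketch of this idea (hypothetical Python, not the actual lowering): emit runtime equality checks only for dimension pairs that could not be verified statically.

```python
def plan_checks(lhs, rhs):
    """Split elementwise-equality constraints between two shapes into those
    verified at compile time and those that must become runtime checks.
    None denotes a statically unknown dimension."""
    runtime_checks = []
    for i, (a, b) in enumerate(zip(lhs, rhs)):
        if a is not None and b is not None:
            if a != b:
                raise ValueError(f"static mismatch at dim {i}: {a} vs {b}")
            # Both static and equal: verified at compile time, no check emitted.
        else:
            # At least one side is dynamic: defer to a runtime check.
            runtime_checks.append(i)
    return runtime_checks

# Only the dynamic dimension needs a runtime check:
print(plan_checks([16, None], [16, 4]))  # [1]
```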
#### What about ODS specifications of operations?

In ODS we have been recording the constraints on the operands and attributes of an operation. Where these are sufficient to constrain the output shape (e.g., `SameOperandsAndResultType` or broadcastable), we should generate the shape function from those constraints. Where they are not, an explicit shape function should be specified.
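For instance, the shape function implied by a broadcastable constraint follows NumPy-style broadcasting rules; a sketch for static shapes (hypothetical Python, not the generated code):

```python
def broadcast_shapes(lhs, rhs):
    """NumPy-style broadcasting of two static shapes: align from the
    trailing dimension; a size-1 dimension stretches to match the other."""
    out = []
    for i in range(max(len(lhs), len(rhs))):
        a = lhs[len(lhs) - 1 - i] if i < len(lhs) else 1
        b = rhs[len(rhs) - 1 - i] if i < len(rhs) else 1
        if a == 1:
            out.append(b)
        elif b == 1 or a == b:
            out.append(a)
        else:
            raise ValueError(f"dimensions {a} and {b} are not broadcastable")
    return out[::-1]

print(broadcast_shapes([8, 1, 6], [7, 1]))  # [8, 7, 6]
```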
#### Why not extract the shape function from the reference implementation?
This could be done in the future! The extracted shape function would use the shape dialect, so we are starting there. Especially for operations described in a structured way, one could autogenerate the shape function.
#### How/in what language will the shape functions be authored?
#### What shape inference approach is being suggested here?

None. There are multiple different shape inference approaches that we could layer on top of these shape functions, from the most basic but still useful (return a fixed shape for constant inputs/arguments) to the more advanced (e.g., propagating symbolic relationships between dimensions).
### Open points

1.  Should shape functions that produce dynamic outputs given all statically shaped inputs be marked specially (e.g., an op that reads from a file)?
Shape functions are determined by attributes and could be arbitrarily complicated, with a wide range of specification possibilities. Equality relationships are common (e.g., the elemental type of the output matches the primitive type of the inputs, or both inputs have exactly the same [type and shape]) and so these should be easy to specify. Algebraic relationships are also common (e.g., a concat of [n,m] and [n,m] matrices along axis 0 is an [n+n,m] matrix), while some ops only have defined shapes under certain conditions (e.g., matrix multiplication of [a,b] and [c,d] is only defined if b == c).
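The concat relationship above can be sketched directly (a hypothetical Python helper for illustration):

```python
def concat_shape(lhs, rhs, axis=0):
    """Concat of two same-rank static shapes along `axis`:
    the sizes along the concat axis add; all other dimensions must match."""
    assert len(lhs) == len(rhs), "rank mismatch"
    out = []
    for i, (a, b) in enumerate(zip(lhs, rhs)):
        if i == axis:
            out.append(a + b)
        else:
            assert a == b, f"dimension {i} must match: {a} vs {b}"
            out.append(a)
    return out

# Concat of [n, m] and [n, m] along axis 0 is [n + n, m]:
print(concat_shape([3, 5], [3, 5]))  # [6, 5]
```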
Instead of specifying an additional mechanism for a shape transfer function, the reference implementation of the operation will be used to derive the shape function. The reference implementation is general and can support the arbitrary computations needed to specify output shapes.