
Searched refs:alignment_size (Results 1 – 4 of 4) sorted by relevance

/external/pytorch/torch/nested/_internal/sdpa.py
  587  tensor: torch.Tensor, alignment_size: int, slice: bool
  595  if last_dim_size % alignment_size == 0:
  597  pad_count = alignment_size - (last_dim_size % alignment_size)
/external/pytorch/torch/_inductor/fx_passes/pad_mm.py
  119  def get_padded_length(x: Union[int, torch.SymInt], alignment_size) -> int:
  121  if isinstance(x, torch.SymInt) or alignment_size == 0 or x % alignment_size == 0:
  128  return int((x // alignment_size + 1) * alignment_size) - x
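The `get_padded_length` hit above rounds a matmul dimension up to the next multiple of `alignment_size` and returns the difference. A plain-integer sketch of that arithmetic (the real helper also returns 0 early for symbolic `torch.SymInt` sizes, which this sketch omits):

```python
def get_padded_length(x: int, alignment_size: int) -> int:
    """Extra elements needed to round x up to a multiple of alignment_size.

    Integer-only sketch of the pad_mm.py helper; the real version also
    short-circuits to 0 for symbolic torch.SymInt sizes.
    """
    # Disabled alignment, or an already-aligned size, needs no padding.
    if alignment_size == 0 or x % alignment_size == 0:
        return 0
    # Next multiple of alignment_size, minus the current size.
    return (x // alignment_size + 1) * alignment_size - x
```

For example, a dimension of 10 with an alignment of 8 needs 6 extra elements to reach the next multiple, 16.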
/external/pytorch/aten/src/ATen/native/transformers/attention.cpp
  578  template <int alignment_size, bool slice>
  581  if (last_dim_size % alignment_size == 0) {  [in pad_last_dim()]
  584  auto pad_count = alignment_size - (last_dim_size % alignment_size);  [in pad_last_dim()]
/external/pytorch/test/test_transformers.py
  1734  def pad_last_dim(input_tensor, alignment_size, slice: bool = False):
  1736  if (last_dim_size % alignment_size == 0):
  1738  pad_count = alignment_size - (last_dim_size % alignment_size)
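The three `pad_last_dim` hits (sdpa.py, attention.cpp, test_transformers.py) all share one pattern: if the last dimension is not already a multiple of `alignment_size`, zero-pad it by `alignment_size - (last_dim_size % alignment_size)`. A list-based sketch of that pattern (the real helpers pad the last dimension of a `torch.Tensor`, and their `slice` flag, which controls slicing the result back to the original size after the computation, is elided here):

```python
def pad_last_dim(row: list, alignment_size: int) -> list:
    """Zero-pad a 1-D sequence to a multiple of alignment_size.

    Plain-Python sketch of the pattern shared by the hits above; the
    real helpers operate on the last dimension of a torch.Tensor and
    optionally slice the padded result back afterwards.
    """
    last_dim_size = len(row)
    # Already aligned: return a copy unchanged.
    if last_dim_size % alignment_size == 0:
        return list(row)
    # Append zeros up to the next multiple of alignment_size.
    pad_count = alignment_size - (last_dim_size % alignment_size)
    return list(row) + [0] * pad_count
```

For example, a length-3 row padded to an alignment of 4 gains one trailing zero, while a length-4 row passes through unchanged.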