
Searched defs:can_use_flash_attention (Results 1 – 2 of 2) sorted by relevance

/external/pytorch/torch/backends/cuda/
__init__.py 357 def can_use_flash_attention(params: SDPAParams, debug: bool = False) -> bool: function
/external/pytorch/aten/src/ATen/native/transformers/cuda/
sdp_utils.cpp 553 bool can_use_flash_attention(sdp_params const& params, bool debug) { function
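
The two hits are the same check at two layers: the Python function in torch/backends/cuda/__init__.py is the public probe, and it defers to the C++ implementation in sdp_utils.cpp, which runs the hardware and input checks (device, dtype, head dim, mask, dropout) that decide whether the FlashAttention SDPA backend can serve a given call. A minimal usage sketch follows; it assumes a CUDA build of PyTorch 2.2-era sources matching these paths, where the SDPAParams constructor takes (query, key, value, attn_mask, dropout, is_causal) positionally (newer releases have added arguments, so treat the constructor call as an assumption, not a stable signature):

    # Sketch: probe flash-attention eligibility before dispatching SDPA.
    import torch
    import torch.nn.functional as F
    from torch.backends.cuda import SDPAParams, can_use_flash_attention

    # Half-precision CUDA tensors in (batch, heads, seq_len, head_dim) layout,
    # the shape SDPA expects.
    q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)

    # Assumed constructor order: query, key, value, attn_mask, dropout, is_causal.
    params = SDPAParams(q, k, v, None, 0.0, False)

    # debug=True asks the C++ checker (sdp_utils.cpp above) to log the
    # reason a configuration is rejected instead of returning False silently.
    if can_use_flash_attention(params, debug=True):
        out = F.scaled_dot_product_attention(q, k, v)
    else:
        print("Flash attention unavailable for this configuration.")

Note that this probe only reports eligibility; actual backend selection during scaled_dot_product_attention happens inside the dispatcher, so the check is most useful for debugging why a model silently fell back to a slower SDPA path.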