Searched for refs:CudnnBatchNormForwardInferenceThunk (Results 1 – 3 of 3), sorted by relevance

/external/tensorflow/tensorflow/compiler/xla/service/gpu/
cudnn_batchnorm_thunk.h
    48  class CudnnBatchNormForwardInferenceThunk : public Thunk {
    50    CudnnBatchNormForwardInferenceThunk(ThunkInfo thunk_info,
    59    CudnnBatchNormForwardInferenceThunk(
    60        const CudnnBatchNormForwardInferenceThunk&) = delete;
    61    CudnnBatchNormForwardInferenceThunk& operator=(
    62        const CudnnBatchNormForwardInferenceThunk&) = delete;

cudnn_batchnorm_thunk.cc
    34  CudnnBatchNormForwardInferenceThunk::CudnnBatchNormForwardInferenceThunk(  [constructor, in xla::gpu::CudnnBatchNormForwardInferenceThunk]
    50  Status CudnnBatchNormForwardInferenceThunk::ExecuteOnStream(  [in ExecuteOnStream()]

ir_emitter_unnested.cc
  1469        CudnnBatchNormForwardInferenceThunk>(  [in EmitBatchNormThunkFromMlir()]
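
Taken together, the hits sketch the thunk's public interface: a constructor whose first parameter is a ThunkInfo (header line 50), deleted copy construction and copy assignment (header lines 59–62), and an ExecuteOnStream method defined in the .cc file, with the class instantiated from EmitBatchNormThunkFromMlir in ir_emitter_unnested.cc. The outline below is a minimal sketch reconstructed only from those fragments; anything not visible in the hits (remaining constructor arguments, the ExecuteOnStream parameter list, member variables) is left elided rather than guessed, so this is illustrative only and not the actual header.

// Illustrative reconstruction from the search hits above; elided parts are
// not taken from the real tensorflow/compiler/xla/service/gpu sources.
namespace xla {
namespace gpu {

class CudnnBatchNormForwardInferenceThunk : public Thunk {
 public:
  // Header line 50: first parameter is a ThunkInfo; the rest are truncated
  // in the hit and therefore omitted here.
  CudnnBatchNormForwardInferenceThunk(ThunkInfo thunk_info /* , ... */);

  // Header lines 59-62: copying is disallowed.
  CudnnBatchNormForwardInferenceThunk(
      const CudnnBatchNormForwardInferenceThunk&) = delete;
  CudnnBatchNormForwardInferenceThunk& operator=(
      const CudnnBatchNormForwardInferenceThunk&) = delete;

  // Defined at cudnn_batchnorm_thunk.cc:50, presumably overriding
  // Thunk::ExecuteOnStream; its parameters are not shown in the hit.
  Status ExecuteOnStream(/* parameters not shown */);
};

}  // namespace gpu
}  // namespace xla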