
Searched refs:is_tpu (Results 1 – 3 of 3) sorted by relevance

/external/tensorflow/tensorflow/python/keras/distribute/

distributed_training_utils.py
  56  is_tpu = backend.is_tpu_strategy(strategy)
  57  if ((not is_tpu) and strategy and ds_context.in_cross_replica_context()):

minimize_loss_test.py
  418  combinations.combine(is_tpu=[False])) + combinations.combine(
  422  is_tpu=[True]))
  423  def testRunStepsWithOutputContext(self, distribution, optimizer_fn, is_tpu):
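The two matched lines in distributed_training_utils.py guard cross-replica handling behind a TPU-strategy check. A minimal stand-in sketch of that guard, with no TensorFlow dependency (the string strategy labels and the function name are hypothetical; the real code calls `backend.is_tpu_strategy` and `ds_context.in_cross_replica_context`):

```python
def needs_cross_replica_handling(strategy_type, in_cross_replica_context):
    """Stand-in for the guard in distributed_training_utils.py.

    Mirrors `(not is_tpu) and strategy and in_cross_replica_context()`:
    the special handling is skipped when the strategy is a TPU strategy,
    when there is no strategy at all, or when not in cross-replica context.
    """
    # strategy_type: e.g. "TPUStrategy" or "MirroredStrategy" (hypothetical labels)
    is_tpu = strategy_type == "TPUStrategy"
    return (not is_tpu) and bool(strategy_type) and in_cross_replica_context
```

Under this sketch, a mirrored strategy in cross-replica context takes the handling path, while a TPU strategy in the same context does not.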
/external/tensorflow/tensorflow/core/common_runtime/eager/

execute.cc (in GetDeviceForInput)
  291  const bool is_tpu = device != nullptr && device->device_type() == "TPU";
  294  is_tpu ? MTypeFromDTypeIntsOnDevice(tensor_handle->dtype)
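The execute.cc match uses `is_tpu` to pick a memory-type rule for an input tensor: on TPU it calls the ints-on-device variant, otherwise the default mapping (which conventionally pins small int tensors such as int32 to host memory). A simplified Python stand-in of that selection, assuming those semantics (the function name and the string dtype/device labels are hypothetical, not the C++ API):

```python
def memory_type_for_input(device_type, dtype):
    """Simplified stand-in for the MTypeFromDType* selection in execute.cc.

    On TPU, int tensors are kept in device memory (the ints-on-device rule);
    on other devices, int32 inputs are treated as host-memory tensors.
    """
    is_tpu = device_type == "TPU"  # same check as line 291 of execute.cc
    if dtype == "int32" and not is_tpu:
        return "HOST_MEMORY"
    return "DEVICE_MEMORY"
```

The point of the branch is that the same dtype can map to different memory types depending on whether the target device is a TPU.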