
Searched refs:replicas (Results 1 – 19 of 19) sorted by relevance

/external/tensorflow/tensorflow/compiler/xla/service/
service.cc:415 TF_ASSIGN_OR_RETURN(auto replicas, Replicas(*backend, device_handles[i])); in ExecuteParallelAndRegisterResult()
416 CHECK_EQ(replicas.size(), arguments[i].size()); in ExecuteParallelAndRegisterResult()
417 for (int64 replica = 0; replica < replicas.size(); ++replica) { in ExecuteParallelAndRegisterResult()
418 device_assignment(replica, i) = replicas[replica]->device_ordinal(); in ExecuteParallelAndRegisterResult()
424 TF_ASSIGN_OR_RETURN(auto replicas, Replicas(*backend, device_handles[i])); in ExecuteParallelAndRegisterResult()
425 CHECK_EQ(replicas.size(), arguments[i].size()); in ExecuteParallelAndRegisterResult()
427 for (int64 replica = 0; replica < replicas.size(); ++replica) { in ExecuteParallelAndRegisterResult()
429 backend->BorrowStream(replicas[replica])); in ExecuteParallelAndRegisterResult()
525 TF_ASSIGN_OR_RETURN(auto replicas, Replicas(*backend, device_handle)); in ExecuteAndRegisterResult()
526 TF_RET_CHECK(!replicas.empty()); in ExecuteAndRegisterResult()
[all …]
hlo.proto:210 // Describes how parameters behave with regards to replicas.
/external/tensorflow/tensorflow/core/api_def/base_api/
api_def_AllToAll.pbtxt:49 summary: "An Op to exchange data across TPU replicas."
52 `split_dimension` and sent to the other replicas given group_assignment. After
53 receiving `split_count` - 1 blocks from other replicas, we concatenate the
56 For example, suppose there are 2 TPU replicas:
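To make the AllToAll exchange described above concrete, here is a host-side NumPy sketch of the data movement it names (split along split_dimension, exchange blocks between replicas, concatenate along concat_dimension). It is an illustration only, not the TPU op; the shapes and values are assumptions.

import numpy as np

def all_to_all(per_replica_inputs, split_dimension, concat_dimension):
    # Simulate one replica group containing every replica: each replica splits
    # its local tensor into len(group) blocks, block j goes to replica j, and
    # each replica concatenates the blocks it receives in replica order.
    split_count = len(per_replica_inputs)
    blocks = [np.split(x, split_count, axis=split_dimension)
              for x in per_replica_inputs]
    return [np.concatenate([blocks[sender][receiver] for sender in range(split_count)],
                           axis=concat_dimension)
            for receiver in range(split_count)]

# Two replicas, each holding a (2, 4) array, exchanging halves along axis 1.
inputs = [np.arange(8).reshape(2, 4), np.arange(8, 16).reshape(2, 4)]
outputs = all_to_all(inputs, split_dimension=1, concat_dimension=1)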
api_def_TPUReplicate.pbtxt:13 additional arguments to broadcast to all replicas. The
41 the number of replicas of the computation to run.
84 replicas.
api_def_TPUReplicateMetadata.pbtxt:7 Number of replicas of the computation
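The TPUReplicate entries above distinguish per-replica arguments from arguments broadcast to every replica. The plain-Python sketch below only illustrates that argument layout; all names and values are invented, and it is not the TPU op.

num_replicas = 2
per_replica_args = [["images_0"], ["images_1"]]  # one argument list per replica
broadcast_args = ["embedding_table"]             # shared by every replica

# Each replica's computation sees its own arguments followed by the broadcast ones.
calls = [per_replica_args[replica] + broadcast_args for replica in range(num_replicas)]
# calls == [['images_0', 'embedding_table'], ['images_1', 'embedding_table']]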
/external/tensorflow/tensorflow/python/distribute/
input_lib.py:135 replicas = []
148 replicas.append(next_element)
184 lambda: replicas[i][j],
189 replicas = results
195 flattened_replicas = nest.flatten(replicas)
198 replicas = nest.pack_sequence_as(replicas, flattened_replicas)
200 return values.regroup(self._input_workers.device_map, replicas)
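The input_lib.py lines above collect one element per replica and regroup them into a single per-replica value. A minimal usage sketch of that behaviour through the public API, assuming TF 2.x and tf.distribute.MirroredStrategy:

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
dataset = tf.data.Dataset.range(8).batch(4)
dist_dataset = strategy.experimental_distribute_dataset(dataset)

for batch in dist_dataset:
    # `batch` is the regrouped per-replica value produced by the logic above;
    # experimental_local_results unpacks it into one tensor per replica.
    print(strategy.experimental_local_results(batch))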
/external/autotest/site_utils/
check_slave_db_delay.py:102 for replica in options.replicas:
104 if not options.replicas:
/external/tensorflow/tensorflow/python/tpu/
device_assignment.py:50 for core, replicas in core_to_replicas.items():
51 core_to_sorted_replicas[core] = sorted(replicas)
tpu.py:824 replicas = [flat_inputs[replica][i] for replica in xrange(num_replicas)]
826 tpu_ops.tpu_replicated_input(replicas, name="input{}".format(i)))
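The tpu.py lines above transpose flat_inputs from [replica][input] order into one list per input position before calling tpu_replicated_input. A plain-Python sketch of that transposition, with invented values:

num_replicas = 2
flat_inputs = [["r0_in0", "r0_in1"], ["r1_in0", "r1_in1"]]  # indexed [replica][input]

per_input = [[flat_inputs[replica][i] for replica in range(num_replicas)]
             for i in range(len(flat_inputs[0]))]
# per_input == [['r0_in0', 'r1_in0'], ['r0_in1', 'r1_in1']]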
/external/tensorflow/tensorflow/compiler/xrt/
xrt.proto:20 // As many replicas as there are in the replicated computation.
30 // The number of replicas the computation will be run on. If this is
/external/tensorflow/tensorflow/core/protobuf/tpu/
optimization_parameters.proto:214 // for storing the replicas of hot IDs for the embedding table. In future, the
215 // number of replicas for a particular hot ID could be adjusted based on its
216 // frequency. The max_slot_count value captures the total number of replicas
/external/autotest/
README.md:32 * Infrastructure to set up miniature replicas of a full lab. A full lab does
/external/tensorflow/tensorflow/compiler/xla/
xla_data.proto:307 // number of replicas.
347 // ComputationDevice represents the device ids assigned to the replicas.
609 // The ids of the replicas that belong to the same group. The ordering of the
635 // Describes whether all data-parallelism replicas will receive the same
xla.proto:299 // Number of replicas of the computation to run. If zero, uses the default
300 // number of replicas for the XLA service.
/external/tensorflow/tensorflow/compiler/xla/g3doc/
operation_semantics.md:49 all replicas belong to one group in the order of 0 - (n-1). Alltoall will be
81 : : : replicas; otherwise, this :
83 : : : of replicas in each group. :
458 replicas.
786 Computes a sum across replicas.
792 `operand` | `XlaOp` | Array to sum across replicas.
796 replicas and the operand has the value `(1.0, 2.5)` and `(3.0, 5.25)`
797 respectively on the two replicas, then the output value from this op will be
798 `(4.0, 7.75)` on both replicas.
801 either be empty (all replicas belong to a single group), or contain the same
[all …]
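The operation_semantics.md excerpt above works through a cross-replica sum over two replicas; a pure-Python check of that arithmetic (not the XLA op itself):

replica_values = [(1.0, 2.5), (3.0, 5.25)]           # per-replica operands from the example
summed = tuple(sum(vals) for vals in zip(*replica_values))
outputs = [summed for _ in replica_values]           # every replica receives the same sum
assert summed == (4.0, 7.75)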
/external/python/cpython2/Doc/library/
turtle.rst:1409 Both are replicas of the corresponding TurtleScreen methods.
/external/jline/src/src/test/resources/jline/example/
english.gz
/external/cldr/tools/java/org/unicode/cldr/util/data/transforms/
internal_raw_IPA.txt:137526 replicas %32411 rˈɛpləkəz
internal_raw_IPA-old.txt:164102 replicas %21841 rˈɛpləkəz