Searched refs:NCCL (Results 1 – 15 of 15) sorted by relevance
63 NCCL = 273, enumerator
160 #define NCCL 273 macro
47 #define LEAF case CCL: case NCCL: case CHAR: case DOT: case FINAL: case ALL:
651 case NCCL: in primary()
652 np = op2(NCCL, NIL, (Node *) cclenter((char *) rlxstr)); in primary()
683 case CHAR: case DOT: case ALL: case EMPTYRE: case CCL: case NCCL: case '$': case '(': in concat()
855 return NCCL; in relex()
887 || (k == NCCL && !member(c, (char *) f->re[p[i]].lval.up) && c != 0 && c != HAT)) { in cgoto()
962 if (f->re[i].ltype == CCL || f->re[i].ltype == NCCL) in freefa()
53 %token <i> FINAL DOT ALL CCL NCCL CHAR OR STAR QUEST PLUS EMPTYRE
131 NCCL = 273, enumerator
228 #define NCCL 273 macro
9 name: "NCCL"
2 # Wrap NVIDIA (https://github.com/NVIDIA/nccl) NCCL with tensorflow ops.
32 # Link NCCL library and header where the build script expects them.
401 cross_device_ops_lib.CollectiveCommunication.NCCL) and
940 NCCL = "NCCL" variable in CollectiveCommunication
306 communication=cross_device_ops_lib.CollectiveCommunication.NCCL))
444 message(FATAL_ERROR "NCCL is required for GPU-build")
144 * NCCL (GPU build on Linux)
443 // If true, use NCCL for CollectiveOps. This feature is highly
8 * Moved NCCL to core.
318 …ilt against NCCL 2.2 and no longer include NCCL in the binary install. TensorFlow usage with multi…