/external/google-cloud-java/java-webrisk/google-cloud-webrisk/src/main/java/com/google/cloud/webrisk/v1/WebRiskServiceClient.java
     8: * https://www.apache.org/licenses/LICENSE-2.0
    39: // AUTO-GENERATED DOCUMENTATION AND CLASS.
    49: * // It will require modifications to work:
    50: * // - It may require correct/in-range values for request initialization.
    51: * // - It may require specifying regional endpoints when creating the service client as shown in
    64: * as threads. In the example above, try-with-resources is used, which automatically calls close().
    82: * <p>Many parameters require resource names to be formatted in a particular way. To assist with
    93: * // It will require modifications to work:
    94: * // - It may require correct/in-range values for request initialization.
    95: * // - It may require specifying regional endpoints when creating the service client as shown in
    [all …]
|
/external/google-cloud-java/java-webrisk/google-cloud-webrisk/src/main/java/com/google/cloud/webrisk/v1beta1/WebRiskServiceV1Beta1Client.java
     8: * https://www.apache.org/licenses/LICENSE-2.0
    37: // AUTO-GENERATED DOCUMENTATION AND CLASS.
    47: * // It will require modifications to work:
    48: * // - It may require correct/in-range values for request initialization.
    49: * // - It may require specifying regional endpoints when creating the service client as shown in
    63: * resources such as threads. In the example above, try-with-resources is used, which automatically
    82: * <p>Many parameters require resource names to be formatted in a particular way. To assist with
    93: * // It will require modifications to work:
    94: * // - It may require correct/in-range values for request initialization.
    95: * // - It may require specifying regional endpoints when creating the service client as shown in
    [all …]
|
/external/mbedtls/docs/architecture/psa-migration/md-cipher-dispatch.md
     1: PSA migration strategy for hashes and ciphers
    50: because this is inefficient and error-prone.
    54: …e). However, this can't be done without breaking [backward compatibility](#backward-compatibility).
    56: The goal of this work is to arrange for more non-PSA interfaces to use PSA interfaces under the hoo…
    58: …likely has better performance, and sometimes better security, than the built-in software implement…
    64: …TLS_SHA256_C` for SHA256, `MBEDTLS_AES_C && MBEDTLS_CIPHER_MODE_CBC` for AES-CBC, etc. In code tha…
    78: ### Non-goals
    92: …-covered modules call PSA, but only [when this will actually work](#why-psa-is-not-always-possible…
    93: …-covered module which calls another module, for example X.509 calling pk for PSS verification whic…
   107: * **Legacy domain**: does not interact with PSA. Implementations of hashes, of cipher primitives, o…
    [all …]

/external/mbedtls/docs/architecture/psa-migration/strategy.md
    10: G2. Allow isolation of long-term secrets (for example, private keys).
    11: G3. Allow isolation of short-term secrets (for example, TLS session keys).
    18: implemented, see `docs/use-psa-crypto.md`, where new APIs are about (G2), and
    28: Compile-time options
    31: We currently have a few compile-time options that are relevant to the migration:
    33: - `MBEDTLS_PSA_CRYPTO_C` - enabled by default, controls the presence of the PSA
    35: - `MBEDTLS_USE_PSA_CRYPTO` - disabled by default (enabled in "full" config),
    38: - `PSA_CRYPTO_CONFIG` - disabled by default, supports builds with drivers and
    43: - it's not fully compatible with `MBEDTLS_ECP_RESTARTABLE`: you can enable
    46: - to avoid a hard/default dependency of TLS, X.509 and PK on
    [all …]
|
/external/pigweed/pw_build/py/pw_build/pip_install_python_deps.py
     7: # https://www.apache.org/licenses/LICENSE-2.0
    32: def _parse_args() -> tuple[argparse.Namespace, list[str]]:
    35: '--python-dep-list-files',
    44: '--gn-packages',
    51: '--editable-pip-install',
    55: '\'--editable\' option.'
    70: ) -> int:
    76: with --require-hashes.
    98: command_args = [sys.executable, "-m", "pip"]
   101: command_args.append('--editable')
    [all …]
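The pip_install_python_deps.py matches show pip being driven as a subprocess, with the command assembled as an argv list (`command_args = [sys.executable, "-m", "pip"]`, and `--editable` appended conditionally). Here is a minimal sketch of that pattern; the function name and parameters are illustrative, not Pigweed's actual API:

```python
import sys


def build_pip_install_args(requirements, editable=False, require_hashes=False):
    """Assemble a pip install argv list (hypothetical helper)."""
    # Start from the interpreter running this script, as the matched line
    # does, so the install targets the same environment.
    command_args = [sys.executable, "-m", "pip", "install"]
    if require_hashes:
        # In this mode pip rejects any requirement lacking a --hash= pin.
        command_args.append("--require-hashes")
    if editable:
        command_args.append("--editable")
    for req in requirements:
        command_args += ["--requirement", req]
    return command_args


args = build_pip_install_args(["requirements.txt"], require_hashes=True)
```

Building the command as a list (rather than a shell string) avoids quoting issues when the result is handed to `subprocess.run`.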
|
/external/pigweed/pw_build/python_dist/setup.bat
     7: :: https://www.apache.org/licenses/LICENSE-2.0
    20: python -m venv "%ROOT_DIR%\python-venv"
    22: set "python=%ROOT_DIR%\python-venv\Scripts\python.exe"
    27: set "CONSTRAINT_ARG=--constraint=%CONSTRAINT_PATH%"
    33: set "EXTRA_REQUIREMENT_ARG=--requirement=%EXTRA_REQUIREMENT_PATH%"
    37: :: Note: pip install --require-hashes will be triggered if any hashes are present
    39: call "%python%" -m pip install ^
    40: "--find-links=%ROOT_DIR%python_wheels" ^
    41: "--requirement=requirements.txt" %EXTRA_REQUIREMENT_ARG% %CONSTRAINT_ARG%

/external/pigweed/pw_build/python_dist/setup.sh
     9: # https://www.apache.org/licenses/LICENSE-2.0
    17: set -o xtrace -o errexit -o nounset
    20: DIR="$(python3 -c "import os; print(os.path.dirname(os.path.abspath(os.path.realpath(\"$SRC\"))))")"
    21: VENV="${DIR}/python-venv"
    26: if [ ! -z "${1-}" ]; then
    27: VENV="${1-}"
    32: if [ -f "${CONSTRAINTS_PATH}" ]; then
    33: CONSTRAINTS_ARG="-c ""${CONSTRAINTS_PATH}"
    37: if [ -f "${EXTRA_REQUIREMENT_PATH}" ]; then
    38: EXTRA_REQUIREMENT_ARG="-r ""${EXTRA_REQUIREMENT_PATH}"
    [all …]
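The note at setup.bat line 37 refers to pip's hash-checking behaviour: once any requirement carries a hash, pip acts as if `--require-hashes` were passed. A sketch of how such a pin is computed; the package name and version are made up, and real pins are normally generated by a tool such as pip-compile with `--generate-hashes`:

```python
import hashlib


def pip_hash_pin(artifact_bytes: bytes) -> str:
    """Return the --hash= value pip compares against a downloaded artifact."""
    return "sha256:" + hashlib.sha256(artifact_bytes).hexdigest()


# A pinned requirements.txt line would look like (name/version illustrative):
#   example-pkg==1.0 --hash=sha256:<64 hex digits>
pin = pip_hash_pin(b"wheel file contents")
```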
|
/external/skia/tools/run-wasm-gm-tests/run-wasm-gm-tests.js
     3: …-wasm-gm-tests --js_file ../../out/wasm_gm_tests/wasm_gm_tests.js --wasm_file ../../out/wasm_gm_te…
     5: const puppeteer = require('puppeteer');
     6: const express = require('express');
     7: const path = require('path');
     8: const bodyParser = require('body-parser');
     9: const fs = require('fs');
    10: const commandLineArgs = require('command-line-args');
    11: const commandLineUsage = require('command-line-usage');
    27: description: '(required) The hashes that should not be written to disk.'
    41: description: 'Whether we should run in non-headless mode with GPU.',
    [all …]
|
/external/swiftshader/third_party/llvm-16.0/llvm/lib/DebugInfo/PDB/Native/TpiStreamBuilder.cpp
     1: //===- TpiStreamBuilder.cpp - -------------------------------------------===//
     5: // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
     7: //===----------------------------------------------------------------------===//
    67: // FIXME: Require it.    [in addTypeRecord()]
    74: ArrayRef<uint32_t> Hashes) {    [in addTypeRecords(), argument]
    75: // Ignore empty type buffers. There should be no hashes or sizes in this case.    [in addTypeRecords()]
    77: assert(Sizes.empty() && Hashes.empty());    [in addTypeRecords()]
    84: assert(Sizes.size() == Hashes.size() && "sizes and hashes should be in sync");    [in addTypeRecords()]
    90: llvm::append_range(TypeHashes, Hashes);    [in addTypeRecords()]
    99: H->Version = VerHeader;    [in finalize()]
    [all …]
|
/external/rappor/bin/README.md
     7: --------------
     9: ### decode-dist
    11: Decode a distribution -- requires a "counts" file (summed bits from reports),
    12: map file, and a params file. See `test.sh decode-dist` in this dir for an
    15: ### decode-assoc
    18: `test.sh decode-assoc-R` or `test.sh decode-assoc-cpp` in this dir for an
    25: Both of these tools are written in R, and require several R libraries to be
    26: installed (see `../setup.sh r-packages`).
    28: `decode-assoc` also shells out to a native binary written in C++ if
    29: `--em-executable` is passed. This requires a C++ compiler (see
    [all …]
|
/external/swiftshader/third_party/llvm-10.0/llvm/lib/DebugInfo/CodeView/TypeStreamMerger.cpp
     1: //===-- TypeStreamMerger.cpp ------------------------------------*- C++ -*-===//
     5: // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
     7: //===----------------------------------------------------------------------===//
    26: return Idx.getIndex() - TypeIndex::FirstNonSimpleIndex;    [in slotForIndex()]
    46: /// - Begin with a new empty stream, and a new empty hash table that maps from
    48: /// - For each new type stream, maintain a map from source type index to
    50: /// - For each record, copy it and rewrite its type indices to be valid in the
    52: /// - If the new type record is not already present in the destination stream
    55: /// - If the type record already exists in the destination stream, discard it
    70: // might potentially back-reference this data. We also don't want to resolve    [in TypeStreamMerger()]
    [all …]
|
/external/swiftshader/third_party/llvm-16.0/llvm/lib/DebugInfo/CodeView/TypeStreamMerger.cpp
     1: //===-- TypeStreamMerger.cpp ------------------------------------*- C++ -*-===//
     5: // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
     7: //===----------------------------------------------------------------------===//
    26: return Idx.getIndex() - TypeIndex::FirstNonSimpleIndex;    [in slotForIndex()]
    46: /// - Begin with a new empty stream, and a new empty hash table that maps from
    48: /// - For each new type stream, maintain a map from source type index to
    50: /// - For each record, copy it and rewrite its type indices to be valid in the
    52: /// - If the new type record is not already present in the destination stream
    55: /// - If the type record already exists in the destination stream, discard it
    70: // might potentially back-reference this data. We also don't want to resolve    [in TypeStreamMerger()]
    [all …]
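The block comment matched in both TypeStreamMerger.cpp copies (lines 46-55) describes the merging algorithm: a fresh destination stream plus, per source stream, a map from source type index to destination index; each record is copied with its type indices rewritten, then either appended or discarded as a duplicate. A toy model of that scheme, using tuples in place of binary CodeView records (the real code hashes serialized records and handles forward references, so this only illustrates the index bookkeeping):

```python
def merge_type_streams(streams):
    """Merge type streams, deduplicating records and remapping indices.

    Records are modeled as (tag, operand_indices) tuples, where operand
    indices refer to earlier records in the same stream.
    """
    dest = []          # the single merged destination stream
    dest_index = {}    # record contents -> its index in dest
    remaps = []        # one {source index -> dest index} map per stream
    for stream in streams:
        remap = {}
        for src_idx, (tag, operands) in enumerate(stream):
            # Rewrite operand type indices to be valid in the destination.
            rewritten = (tag, tuple(remap[i] for i in operands))
            if rewritten in dest_index:
                # Already present: discard the copy, keep only the mapping.
                remap[src_idx] = dest_index[rewritten]
            else:
                dest_index[rewritten] = len(dest)
                remap[src_idx] = len(dest)
                dest.append(rewritten)
        remaps.append(remap)
    return dest, remaps
```

Merging `[("int", ()), ("ptr", (0,))]` with `[("char", ()), ("int", ()), ("ptr", (1,))]` yields three destination records: both `int` records and both `ptr`-to-`int` records collapse to one each, even though their source indices differ.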
|
/external/cronet/net/http/transport_security_state.h
     2: // Use of this source code is governed by a BSD-style license that can be
    31: #include "third_party/abseil-cpp/absl/types/optional.h"
    55: // This object manages the in-memory store. Register a Delegate with
    59: // http://tools.ietf.org/html/ietf-websec-strict-transport-sec.
    99: // |chain|, with the set of hashesh |hashes|. Note that |hashes| and
   100: // |chain| are not guaranteed to be in the same order - that is, the first
   101: // hash in |hashes| is NOT guaranteed to be for the leaf cert in |chain|.
   105: const HashValueVector& hashes) = 0;
   153: const HashedHost& hostname() const { return iterator_->first; }    [in hostname()]
   154: const STSState& domain_state() const { return iterator_->second; }    [in domain_state()]
    [all …]

/external/cronet/net/http/transport_security_state.cc
     2: // Use of this source code is governed by a BSD-style license that can be
    43: #include "third_party/abseil-cpp/absl/types/optional.h"
    66: // true: Unless a delegate says otherwise, require CT.
    75: cert_chain->GetPEMEncodedChain(&pem_encoded_chain);    [in GetPEMEncodedChainAsList()]
    97: // Formats a time compliant to ISO 8601 in UTC, e.g. "2020-12-31T23:59:59.999Z".
   105: "%04d-%02d-%02dT%02d:%02d:%02d.%03dZ", exploded.year, exploded.month,    [in TimeFormatAsIso8601()]
   123: report.Set("include-subdomains", pkp_state.include_subdomains);    [in GetHPKPReport()]
   124: report.Set("noted-hostname", pkp_state.domain);    [in GetHPKPReport()]
   130: report.Set("served-certificate-chain",    [in GetHPKPReport()]
   132: report.Set("validated-certificate-chain",    [in GetHPKPReport()]
    [all …]
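The TimeFormatAsIso8601 helper matched at lines 97-105 formats a UTC timestamp with millisecond precision, e.g. "2020-12-31T23:59:59.999Z". The same format, sketched in Python for comparison (expects a timezone-aware datetime):

```python
from datetime import datetime, timezone


def time_format_as_iso8601(dt: datetime) -> str:
    # Mirror the C++ format string "%04d-%02d-%02dT%02d:%02d:%02d.%03dZ":
    # date, 'T', time, then milliseconds (microseconds truncated) and 'Z'.
    utc = dt.astimezone(timezone.utc)
    return utc.strftime("%Y-%m-%dT%H:%M:%S.") + "%03dZ" % (utc.microsecond // 1000)
```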
|
/external/grpc-grpc/tools/run_tests/sanity/check_bazel_workspace.py
     9: # http://www.apache.org/licenses/LICENSE-2.0
    25: git_hash_pattern = re.compile("[0-9a-f]{40}")
    27: # Parse git hashes from submodules
    62: # TODO(stanleycheung): remove when prometheus-cpp has new release
    92: # TODO(stanleycheung): remove when prometheus-cpp has new release
   141: # we will only be looking for git commit hashes, so concatenating
   157: # Parse git hashes from bazel/grpc_deps.bzl {new_}http_archive rules
   180: print("- GRPC_DEP_NAMES only:", grpc_dep_names_set - names_set)
   181: print("- grpc_deps.bzl only:", names_set - grpc_dep_names_set)
   184: # There are some "bazel-only" deps that are exceptions to this sanity check,
    [all …]
|
/external/rust/crates/grpcio-sys/grpc/tools/run_tests/sanity/check_bazel_workspace.py
     9: # http://www.apache.org/licenses/LICENSE-2.0
    25: git_hash_pattern = re.compile('[0-9a-f]{40}')
    27: # Parse git hashes from submodules
   111: # we will only be looking for git commit hashes, so concatenating
   127: # Parse git hashes from bazel/grpc_deps.bzl {new_}http_archive rules
   149: # There are some "bazel-only" deps that are exceptions to this sanity check,
   150: # we don't require that there is a corresponding git module for these.
   166: if len(workspace_git_hashes - git_submodule_hashes) > 0:
   172: print(("workspace_git_hashes - git_submodule_hashes: %s" %
   173: (workspace_git_hashes - git_submodule_hashes)))
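Both sanity scripts compile the same 40-hex-digit pattern (line 25) and then compare hash sets with set difference (lines 166-173). The core of that check reduced to a few lines; the input strings here are made up:

```python
import re

# The 40-hex-digit git commit hash pattern both scripts compile.
git_hash_pattern = re.compile(r"[0-9a-f]{40}")


def extract_hashes(text: str) -> set:
    """Pull every git commit hash out of concatenated file contents."""
    return set(git_hash_pattern.findall(text))


workspace_git_hashes = extract_hashes("a" * 40 + " " + "c" * 40)
git_submodule_hashes = extract_hashes("a" * 40 + " " + "b" * 40)
# The sanity check fails when the workspace pins a commit no submodule has:
missing = workspace_git_hashes - git_submodule_hashes  # {"c" * 40}
```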
|
/external/pigweed/pw_build/python_venv.gni
     7: # https://www.apache.org/licenses/LICENSE-2.0
    28: # path = "test-venv"
    49: # pip_generate_hashes: (Default: false) Use --generate-hashes When
    50: running pip-compile to compute the final requirements.txt
    52: # source_packages: A list of in-tree pw_python_package targets that will be
   141: "--depfile",
   143: "--destination-dir",
   145: "--stampfile",
   173: "--gn-root-build-dir",
   175: "--output-requirement-file",
    [all …]

/external/pigweed/pw_build/python_dist.gni
     7: # https://www.apache.org/licenses/LICENSE-2.0
    26: # .wheel sub-targets along with the .wheel sub-targets of all dependencies,
    82: "--prefix",
    84: "--suffix",
    86: "--out-dir",
   150: # TODO: b/235245034 - Remove the plumbing-through of invoker's public_deps.
   349: "--repo-root",
   351: "--tree-destination-dir",
   353: "--input-list-files",
   361: "--setupcfg-common-file",
    [all …]
|
/external/harfbuzz_ng/.github/workflows/macos-ci.yml
     1: name: macos-ci
    14: runs-on: macos-latest
    17: - name: Checkout
    19: - name: Setup Ccache
    20: uses: hendrikmuhs/ccache-action@6d1841ec156c39a52b1b23a810da917ab98da1f4 # v1.2.10
    22: key: ${{ github.job }}-${{ runner.os }}-${{ runner.arch }}
    23: - name: Install Dependencies
    31: gobject-introspection \
    36: pkg-config
    37: - name: Install Python Dependencies
    [all …]

/external/harfbuzz_ng/.github/workflows/msvc-ci.yml
    14: runs-on: ${{ matrix.os }}
    17: fail-fast: false
    19: os: [windows-2019, windows-latest]
    21: - name: msvc-2019-x86
    22: os: windows-2019
    24: - name: msvc-2019-amd64
    25: os: windows-latest
    30: - name: Checkout
    32: - name: Setup Ccache
    33: uses: hendrikmuhs/ccache-action@6d1841ec156c39a52b1b23a810da917ab98da1f4 # v1.2.10
    [all …]

/external/harfbuzz_ng/.github/workflows/linux-ci.yml
     1: name: linux-ci
    15: runs-on: ubuntu-20.04
    18: - name: Checkout
    20: - name: Setup Ccache
    21: uses: hendrikmuhs/ccache-action@6d1841ec156c39a52b1b23a810da917ab98da1f4 # v1.2.10
    23: key: ${{ github.job }}-${{ runner.os }}-${{ runner.arch }}
    24: - name: Install Dependencies
    26: sudo apt-get update
    27: sudo apt-get install \
    29: gobject-introspection \
    [all …]
|
/external/google-cloud-java/.github/workflows/readme.yaml
     7: # http://www.apache.org/licenses/LICENSE-2.0
    19: - cron: '0 1 * * *'
    23: runs-on: ubuntu-latest
    26: - uses: actions/checkout@v2
    27: - uses: actions/setup-python@v4
    29: python-version: '3.7'
    31: - run: python3 -m pip install --require-hashes -r .github/requirements.txt
    32: - run: python3 generate-readme.py
    33: - uses: googleapis/code-suggester@v4
    39: upstream_repo: google-cloud-java
    [all …]
|
/external/tink/python/tools/distribution/create_release.sh
     8: # http://www.apache.org/licenses/LICENSE-2.0
    21: set -euox pipefail
    23: declare -a PYTHON_VERSIONS=
    45: # [1] https://packaging.python.org/en/latest/glossary/#term-Built-Distribution
    61: local -r tink_base_dir="/tmp/tink"
    62: local -r tink_py_relative_path="${PWD##*/}"
    63: local -r workdir="${tink_base_dir}/${tink_py_relative_path}"
    66: --volume "${TINK_PYTHON_ROOT_PATH}/..:${tink_base_dir}" \
    67: --workdir "${workdir}" \
    68: -e TINK_PYTHON_SETUPTOOLS_OVERRIDE_BASE_PATH="${tink_base_dir}" \
    [all …]
|
/external/google-auth-library-java/.kokoro/release/publish_javadoc.sh
     8: # http://www.apache.org/licenses/LICENSE-2.0
    16: set -eo pipefail
    18: if [[ -z "${CREDENTIALS}" ]]; then
    22: if [[ -z "${STAGING_BUCKET}" ]]; then
    31: python3 -m pip install --require-hashes -r .kokoro/requirements.txt
    34: mvn clean install -B -q -DskipTests=true
    36: export NAME=google-auth-library
    37: export VERSION=$(grep ${NAME}: versions.txt | cut -d: -f3)
    40: mvn site -B -q
    45: python3 -m docuploader create-metadata \
    [all …]
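Line 37 derives the release version with a shell pipeline, `grep ${NAME}: versions.txt | cut -d: -f3`: keep the line naming the artifact, then take its third colon-separated field. A Python rendering of that pipeline; the versions.txt layout (artifact:released:current) is inferred from the grep/cut usage, not verified against the repo:

```python
def version_from_versions_txt(contents: str, name: str) -> str:
    # grep ${NAME}: versions.txt  -> keep lines containing "name:"
    # cut -d: -f3                 -> third colon-separated field
    for line in contents.splitlines():
        if name + ":" in line:
            return line.split(":")[2]
    raise ValueError(f"no entry for {name}")


versions_txt = (
    "# Format: artifact:released-version:current-version\n"
    "google-auth-library:1.19.0:1.19.1-SNAPSHOT\n"
)
version = version_from_versions_txt(versions_txt, "google-auth-library")
```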
|
/external/google-cloud-java/owl-bot-postprocessor/synthtool/gcp/templates/java_library/.kokoro/release/publish_javadoc.sh
     8: # http://www.apache.org/licenses/LICENSE-2.0
    16: set -eo pipefail
    18: if [[ -z "${CREDENTIALS}" ]]; then
    22: if [[ -z "${STAGING_BUCKET}" ]]; then
    31: python3 -m pip install --require-hashes -r .kokoro/requirements.txt
    34: mvn clean install -B -q -DskipTests=true
    37: export VERSION=$(grep ${NAME}: versions.txt | cut -d: -f3)
    40: mvn site -B -q
    45: python3 -m docuploader create-metadata \
    46: --name ${NAME} \
    [all …]
|