Searched for `--require-hashes` (Results 1 – 25 of 395), sorted by relevance


/external/google-cloud-java/java-webrisk/google-cloud-webrisk/src/main/java/com/google/cloud/webrisk/v1/
WebRiskServiceClient.java
8 * https://www.apache.org/licenses/LICENSE-2.0
39 // AUTO-GENERATED DOCUMENTATION AND CLASS.
49 * // It will require modifications to work:
50 * // - It may require correct/in-range values for request initialization.
51 * // - It may require specifying regional endpoints when creating the service client as shown in
64 * as threads. In the example above, try-with-resources is used, which automatically calls close().
82 * <p>Many parameters require resource names to be formatted in a particular way. To assist with
93 * // It will require modifications to work:
94 * // - It may require correct/in-range values for request initialization.
95 * // - It may require specifying regional endpoints when creating the service client as shown in
[all …]
/external/google-cloud-java/java-webrisk/google-cloud-webrisk/src/main/java/com/google/cloud/webrisk/v1beta1/
WebRiskServiceV1Beta1Client.java
8 * https://www.apache.org/licenses/LICENSE-2.0
37 // AUTO-GENERATED DOCUMENTATION AND CLASS.
47 * // It will require modifications to work:
48 * // - It may require correct/in-range values for request initialization.
49 * // - It may require specifying regional endpoints when creating the service client as shown in
63 * resources such as threads. In the example above, try-with-resources is used, which automatically
82 * <p>Many parameters require resource names to be formatted in a particular way. To assist with
93 * // It will require modifications to work:
94 * // - It may require correct/in-range values for request initialization.
95 * // - It may require specifying regional endpoints when creating the service client as shown in
[all …]
/external/openthread/third_party/mbedtls/repo/docs/architecture/psa-migration/
md-cipher-dispatch.md
1 PSA migration strategy for hashes and ciphers
50 because this is inefficient and error-prone.
54 …e). However, this can't be done without breaking [backward compatibility](#backward-compatibility).
56 The goal of this work is to arrange for more non-PSA interfaces to use PSA interfaces under the hoo…
58 …likely has better performance, and sometimes better security, than the built-in software implement…
64 …TLS_SHA256_C` for SHA256, `MBEDTLS_AES_C && MBEDTLS_CIPHER_MODE_CBC` for AES-CBC, etc. In code tha…
78 ### Non-goals
92 …-covered modules call PSA, but only [when this will actually work](#why-psa-is-not-always-possible…
93 …-covered module which calls another module, for example X.509 calling pk for PSS verification whic…
107 * **Legacy domain**: does not interact with PSA. Implementations of hashes, of cipher primitives, o…
[all …]
strategy.md
10 G2. Allow isolation of long-term secrets (for example, private keys).
11 G3. Allow isolation of short-term secrets (for example, TLS session keys).
18 implemented, see `docs/use-psa-crypto.md`, where new APIs are about (G2), and
28 Compile-time options
31 We currently have a few compile-time options that are relevant to the migration:
33 - `MBEDTLS_PSA_CRYPTO_C` - enabled by default, controls the presence of the PSA
35 - `MBEDTLS_USE_PSA_CRYPTO` - disabled by default (enabled in "full" config),
38 - `PSA_CRYPTO_CONFIG` - disabled by default, supports builds with drivers and
43 - it's not fully compatible with `MBEDTLS_ECP_RESTARTABLE`: you can enable
46 - to avoid a hard/default dependency of TLS, X.509 and PK on
[all …]
/external/mbedtls/docs/architecture/psa-migration/
md-cipher-dispatch.md
1 PSA migration strategy for hashes and ciphers
50 because this is inefficient and error-prone.
54 …e). However, this can't be done without breaking [backward compatibility](#backward-compatibility).
56 The goal of this work is to arrange for more non-PSA interfaces to use PSA interfaces under the hoo…
58 …likely has better performance, and sometimes better security, than the built-in software implement…
64 …TLS_SHA256_C` for SHA256, `MBEDTLS_AES_C && MBEDTLS_CIPHER_MODE_CBC` for AES-CBC, etc. In code tha…
78 ### Non-goals
92 …-covered modules call PSA, but only [when this will actually work](#why-psa-is-not-always-possible…
93 …-covered module which calls another module, for example X.509 calling pk for PSS verification whic…
107 * **Legacy domain**: does not interact with PSA. Implementations of hashes, of cipher primitives, o…
[all …]
strategy.md
10 G2. Allow isolation of long-term secrets (for example, private keys).
11 G3. Allow isolation of short-term secrets (for example, TLS session keys).
18 implemented, see `docs/use-psa-crypto.md`, where new APIs are about (G2), and
28 Compile-time options
31 We currently have a few compile-time options that are relevant to the migration:
33 - `MBEDTLS_PSA_CRYPTO_C` - enabled by default, controls the presence of the PSA
35 - `MBEDTLS_USE_PSA_CRYPTO` - disabled by default (enabled in "full" config),
38 - `PSA_CRYPTO_CONFIG` - disabled by default, supports builds with drivers and
43 - it's not fully compatible with `MBEDTLS_ECP_RESTARTABLE`: you can enable
46 - to avoid a hard/default dependency of TLS, X.509 and PK on
[all …]
/external/pigweed/pw_build/py/pw_build/
pip_install_python_deps.py
7 # https://www.apache.org/licenses/LICENSE-2.0
32 def _parse_args() -> tuple[argparse.Namespace, list[str]]:
35 '--python-dep-list-files',
44 '--gn-packages',
51 '--editable-pip-install',
55 '\'--editable\' option.'
70 ) -> int:
76 with --require-hashes.
98 command_args = [sys.executable, "-m", "pip"]
101 command_args.append('--editable')
[all …]
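The snippet above assembles a pip command line as a list of arguments (`[sys.executable, "-m", "pip"]`, then appending `--editable`). A minimal self-contained sketch of that pattern; the helper name and flag handling are simplified illustrations, not the script's actual interface:

```python
import sys


def build_pip_install_args(requirement_files, editable=False):
    """Assemble a pip install command list, mirroring the snippet above.

    requirement_files: paths passed via --requirement.
    editable: when True, adds pip's --editable flag.
    """
    command_args = [sys.executable, "-m", "pip", "install"]
    if editable:
        command_args.append("--editable")
    for req in requirement_files:
        command_args.extend(["--requirement", req])
    return command_args


args = build_pip_install_args(["requirements.txt"], editable=True)
```

Building the command as a list (rather than a shell string) is what lets the real script pass it safely to `subprocess` without quoting issues.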
/external/bazelbuild-rules_python/python/private/pypi/requirements_parser/
resolve_target_platforms.py
5 may handle more things. We require a `python` interpreter that can run on the
9 re-using the same code that is used in the `whl_library` installer. See
13 - Depends only on `packaging` and core Python.
14 - Produces the same result irrespective of the Python interpreter platform or version.
49 entry, prefix, hashes = requirement_line.partition("--hash")
50 hashes = prefix + hashes
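The two lines above split a requirements entry from its `--hash` arguments using `str.partition`, then re-attach the delimiter so the hash string stays intact. A small illustration of the same idiom; the requirement line and digests are made up:

```python
# str.partition splits at the FIRST "--hash"; if it is absent,
# prefix and hashes both come back as empty strings.
requirement_line = "packaging==23.1 --hash=sha256:aaaa --hash=sha256:bbbb"
entry, prefix, hashes = requirement_line.partition("--hash")
hashes = prefix + hashes  # re-attach "--hash" so hashes is self-describing
```

After this, `entry` holds just the requirement specifier and `hashes` holds the complete `--hash …` tail, which is why the parser can treat them independently.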
/external/pigweed/pw_build/python_dist/
setup.bat
7 :: https://www.apache.org/licenses/LICENSE-2.0
20 python -m venv "%ROOT_DIR%\python-venv"
22 set "python=%ROOT_DIR%\python-venv\Scripts\python.exe"
27 set "CONSTRAINT_ARG=--constraint=%CONSTRAINT_PATH%"
33 set "EXTRA_REQUIREMENT_ARG=--requirement=%EXTRA_REQUIREMENT_PATH%"
37 :: Note: pip install --require-hashes will be triggered if any hashes are present
39 call "%python%" -m pip install ^
40 "--find-links=%ROOT_DIR%python_wheels" ^
41 "--requirement=requirements.txt" %EXTRA_REQUIREMENT_ARG% %CONSTRAINT_ARG%
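The note above refers to pip's hash-checking mode: once any requirement in the file carries a `--hash` option, pip enforces hashes for every requirement, as if `--require-hashes` had been passed explicitly. A requirements.txt fragment of the shape that triggers it (package pin and digest are illustrative placeholders):

```
# Any --hash option switches pip into hash-checking mode for the whole file.
somepackage==1.0.0 \
    --hash=sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```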
setup.sh
9 # https://www.apache.org/licenses/LICENSE-2.0
17 set -o xtrace -o errexit -o nounset
20 DIR="$(python3 -c "import os; print(os.path.dirname(os.path.abspath(os.path.realpath(\"$SRC\"))))")"
21 VENV="${DIR}/python-venv"
26 if [ ! -z "${1-}" ]; then
27 VENV="${1-}"
32 if [ -f "${CONSTRAINTS_PATH}" ]; then
33 CONSTRAINTS_ARG="-c ""${CONSTRAINTS_PATH}"
37 if [ -f "${EXTRA_REQUIREMENT_PATH}" ]; then
38 EXTRA_REQUIREMENT_ARG="-r ""${EXTRA_REQUIREMENT_PATH}"
[all …]
/external/rappor/bin/
README.md
7 --------------
9 ### decode-dist
11 Decode a distribution -- requires a "counts" file (summed bits from reports),
12 map file, and a params file. See `test.sh decode-dist` in this dir for an
15 ### decode-assoc
18 `test.sh decode-assoc-R` or `test.sh decode-assoc-cpp` in this dir for an
25 Both of these tools are written in R, and require several R libraries to be
26 installed (see `../setup.sh r-packages`).
28 `decode-assoc` also shells out to a native binary written in C++ if
29 `--em-executable` is passed. This requires a C++ compiler (see
[all …]
/external/skia/tools/run-wasm-gm-tests/
run-wasm-gm-tests.js
3 …-wasm-gm-tests --js_file ../../out/wasm_gm_tests/wasm_gm_tests.js --wasm_file ../../out/wasm_gm_te…
5 const puppeteer = require('puppeteer');
6 const express = require('express');
7 const path = require('path');
8 const bodyParser = require('body-parser');
9 const fs = require('fs');
10 const commandLineArgs = require('command-line-args');
11 const commandLineUsage = require('command-line-usage');
27 description: '(required) The hashes that should not be written to disk.'
41 description: 'Whether we should run in non-headless mode with GPU.',
[all …]
/external/swiftshader/third_party/llvm-16.0/llvm/lib/DebugInfo/PDB/Native/
TpiStreamBuilder.cpp
1 //===- TpiStreamBuilder.cpp - -------------------------------------------===//
5 // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
7 //===----------------------------------------------------------------------===//
67 // FIXME: Require it. in addTypeRecord()
74 ArrayRef<uint32_t> Hashes) { in addTypeRecords() argument
75 // Ignore empty type buffers. There should be no hashes or sizes in this case. in addTypeRecords()
77 assert(Sizes.empty() && Hashes.empty()); in addTypeRecords()
84 assert(Sizes.size() == Hashes.size() && "sizes and hashes should be in sync"); in addTypeRecords()
90 llvm::append_range(TypeHashes, Hashes); in addTypeRecords()
99 H->Version = VerHeader; in finalize()
[all …]
/external/swiftshader/third_party/llvm-10.0/llvm/lib/DebugInfo/CodeView/
TypeStreamMerger.cpp
1 //===-- TypeStreamMerger.cpp ------------------------------------*- C++ -*-===//
5 // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
7 //===----------------------------------------------------------------------===//
26 return Idx.getIndex() - TypeIndex::FirstNonSimpleIndex; in slotForIndex()
46 /// - Begin with a new empty stream, and a new empty hash table that maps from
48 /// - For each new type stream, maintain a map from source type index to
50 /// - For each record, copy it and rewrite its type indices to be valid in the
52 /// - If the new type record is not already present in the destination stream
55 /// - If the type record already exists in the destination stream, discard it
70 // might potentially back-reference this data. We also don't want to resolve in TypeStreamMerger()
[all …]
/external/swiftshader/third_party/llvm-16.0/llvm/lib/DebugInfo/CodeView/
TypeStreamMerger.cpp
1 //===-- TypeStreamMerger.cpp ------------------------------------*- C++ -*-===//
5 // SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
7 //===----------------------------------------------------------------------===//
26 return Idx.getIndex() - TypeIndex::FirstNonSimpleIndex; in slotForIndex()
46 /// - Begin with a new empty stream, and a new empty hash table that maps from
48 /// - For each new type stream, maintain a map from source type index to
50 /// - For each record, copy it and rewrite its type indices to be valid in the
52 /// - If the new type record is not already present in the destination stream
55 /// - If the type record already exists in the destination stream, discard it
70 // might potentially back-reference this data. We also don't want to resolve in TypeStreamMerger()
[all …]
/external/cronet/tot/net/http/
transport_security_state.h
2 // Use of this source code is governed by a BSD-style license that can be
61 // This object manages the in-memory store. Register a Delegate with
65 // http://tools.ietf.org/html/ietf-websec-strict-transport-sec.
105 // |chain|, with the set of hashes |hashes|. Note that |hashes| and
106 // |chain| are not guaranteed to be in the same order - that is, the first
107 // hash in |hashes| is NOT guaranteed to be for the leaf cert in |chain|.
111 const HashValueVector& hashes) = 0;
159 const HashedHost& hostname() const { return iterator_->first; } in hostname()
160 const STSState& domain_state() const { return iterator_->second; } in domain_state()
195 // Optional; hashes of pinned SubjectPublicKeyInfos.
[all …]
/external/cronet/stable/net/http/
transport_security_state.h
2 // Use of this source code is governed by a BSD-style license that can be
61 // This object manages the in-memory store. Register a Delegate with
65 // http://tools.ietf.org/html/ietf-websec-strict-transport-sec.
105 // |chain|, with the set of hashes |hashes|. Note that |hashes| and
106 // |chain| are not guaranteed to be in the same order - that is, the first
107 // hash in |hashes| is NOT guaranteed to be for the leaf cert in |chain|.
111 const HashValueVector& hashes) = 0;
159 const HashedHost& hostname() const { return iterator_->first; } in hostname()
160 const STSState& domain_state() const { return iterator_->second; } in domain_state()
195 // Optional; hashes of pinned SubjectPublicKeyInfos.
[all …]
/external/grpc-grpc/tools/run_tests/sanity/
check_bazel_workspace.py
9 # http://www.apache.org/licenses/LICENSE-2.0
25 git_hash_pattern = re.compile("[0-9a-f]{40}")
27 # Parse git hashes from submodules
62 # TODO(stanleycheung): remove when prometheus-cpp has new release
92 # TODO(stanleycheung): remove when prometheus-cpp has new release
141 # we will only be looking for git commit hashes, so concatenating
157 # Parse git hashes from bazel/grpc_deps.bzl {new_}http_archive rules
180 print("- GRPC_DEP_NAMES only:", grpc_dep_names_set - names_set)
181 print("- grpc_deps.bzl only:", names_set - grpc_dep_names_set)
184 # There are some "bazel-only" deps that are exceptions to this sanity check,
[all …]
/external/rust/android-crates-io/crates/grpcio-sys/grpc/tools/run_tests/sanity/
check_bazel_workspace.py
9 # http://www.apache.org/licenses/LICENSE-2.0
25 git_hash_pattern = re.compile('[0-9a-f]{40}')
27 # Parse git hashes from submodules
111 # we will only be looking for git commit hashes, so concatenating
127 # Parse git hashes from bazel/grpc_deps.bzl {new_}http_archive rules
149 # There are some "bazel-only" deps that are exceptions to this sanity check,
150 # we don't require that there is a corresponding git module for these.
166 if len(workspace_git_hashes - git_submodule_hashes) > 0:
172 print(("workspace_git_hashes - git_submodule_hashes: %s" %
173 (workspace_git_hashes - git_submodule_hashes)))
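The sanity check above boils down to extracting 40-hex-digit git commit hashes from two sources with the same regex and comparing them as sets. A condensed sketch of that comparison; the input strings are made up stand-ins for the Bazel workspace and `.gitmodules` contents:

```python
import re

# Same pattern as in the snippet: a full 40-character git commit hash.
git_hash_pattern = re.compile("[0-9a-f]{40}")

workspace_text = (
    "http_archive(commit = "
    "'0123456789abcdef0123456789abcdef01234567')"
)
submodule_text = "0123456789abcdef0123456789abcdef01234567 third_party/dep"

workspace_git_hashes = set(git_hash_pattern.findall(workspace_text))
git_submodule_hashes = set(git_hash_pattern.findall(submodule_text))

# An empty difference means every workspace hash has a matching submodule pin.
missing = workspace_git_hashes - git_submodule_hashes
```

Using sets makes the check order-independent, which matters because the workspace rules and the submodule list enumerate dependencies in different orders.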
/external/scapy/scapy/layers/msrpce/
msnrpc.py
1 # SPDX-License-Identifier: GPL-2.0-or-later
7 [MS-NRPC] Netlogon Remote Protocol
9 https://learn.microsoft.com/en-us/openspecs/windows_protocols/ms-nrpc/ff8f970f-3e37-40f7-bd4b-af733…
50 from cryptography.hazmat.primitives import hashes, hmac
54 hashes = hmac = Cipher = algorithms = modes = DES = None variable
63 # --- RFC
65 # [MS-NRPC] sect 3.1.4.2
71 # database is out-of-date.
81 # G: Does not require ValidationLevel 2 for nongeneric passthrough.
89 # K: Supports generic pass-through authentication.
[all …]
/external/google-cloud-java/.github/workflows/
readme.yaml
7 # http://www.apache.org/licenses/LICENSE-2.0
19 - cron: '0 1 * * *'
23 runs-on: ubuntu-latest
26 - uses: actions/checkout@v2
27 - uses: actions/setup-python@v4
29 python-version: '3.7'
31 - run: python3 -m pip install --require-hashes -r .github/requirements.txt
32 - run: python3 generate-readme.py
33 - uses: googleapis/code-suggester@v4
39 upstream_repo: google-cloud-java
[all …]
/external/pigweed/pw_build/
python_dist.gni
7 # https://www.apache.org/licenses/LICENSE-2.0
26 # .wheel sub-targets along with the .wheel sub-targets of all dependencies,
82 "--prefix",
84 "--suffix",
86 "--out-dir",
150 # TODO: b/235245034 - Remove the plumbing-through of invoker's public_deps.
355 "--repo-root",
357 "--tree-destination-dir",
359 "--input-list-files",
367 "--setupcfg-common-file",
[all …]
/external/harfbuzz_ng/.github/workflows/
msvc-ci.yml
14 runs-on: ${{ matrix.os }}
17 fail-fast: false
19 os: [windows-2019, windows-latest]
21 - name: msvc-2019-x86
22 os: windows-2019
24 - name: msvc-2019-amd64
25 os: windows-latest
30 - name: Checkout
32 - name: Setup Ccache
33 uses: hendrikmuhs/ccache-action@ed74d11c0b343532753ecead8a951bb09bb34bc9 # v1.2.14
[all …]
macos-ci.yml
1 name: macos-ci
14 runs-on: macos-latest
17 - name: Checkout
19 - name: Setup Ccache
20 uses: hendrikmuhs/ccache-action@ed74d11c0b343532753ecead8a951bb09bb34bc9 # v1.2.14
22 key: ${{ github.job }}-${{ runner.os }}-${{ runner.arch }}
23 - name: Install Dependencies
27 brew rm -f pkg-config@0.29.2
32 gobject-introspection \
38 - name: Setup Python
[all …]
/external/google-auth-library-java/.kokoro/release/
Dpublish_javadoc.sh8 # http://www.apache.org/licenses/LICENSE-2.0
16 set -eo pipefail
18 if [[ -z "${CREDENTIALS}" ]]; then
22 if [[ -z "${STAGING_BUCKET}" ]]; then
31 python3 -m pip install --require-hashes -r .kokoro/requirements.txt
34 mvn clean install -B -q -DskipTests=true
36 export NAME=google-auth-library
37 export VERSION=$(grep ${NAME}: versions.txt | cut -d: -f3)
40 mvn site -B -q
45 python3 -m docuploader create-metadata \
[all …]
