Searched +full:- +full:- +full:rockset +full:- +full:workspace (Results 1 – 8 of 8) sorted by relevance
22   "test-reports",
25   r"test-reports-test-(?P<name>[\w\-]+)-\d+-\d+-(?P<runner>[\w\.-]+)_(?P<job>\d+).zip"
35   ) -> list[dict[str, Any]]:
73   "workflow_id": workflow_run_id,  # type: ignore[dict-item]
74   "run_attempt": workflow_run_attempt,  # type: ignore[dict-item]
90   def generate_partition_key(repo: str, doc: Dict[str, Any]) -> str:
99   hash_content = hashlib.md5(json.dumps(doc).encode("utf-8")).hexdigest()
105  description="Upload dynamo perf stats from S3 to Rockset"
108  "--workflow-run-id",
114  "--workflow-run-attempt",
[all …]
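The `generate_partition_key` helper in the result above derives a key from an MD5 digest of the JSON-serialized document. A minimal runnable sketch of that idea; the exact key layout (here `repo/digest`) is an assumption, since the truncated result does not show the return statement:

```python
import hashlib
import json
from typing import Any, Dict


def generate_partition_key(repo: str, doc: Dict[str, Any]) -> str:
    # MD5 of the serialized document, as in the snippet; the key format
    # joining repo and digest is assumed for illustration.
    hash_content = hashlib.md5(json.dumps(doc).encode("utf-8")).hexdigest()
    return f"{repo}/{hash_content}"


key = generate_partition_key(
    "pytorch/pytorch", {"workflow_id": 123, "run_attempt": 1}
)
print(key)
```

Hashing the serialized document means two identical documents always produce the same partition key, which makes repeated uploads idempotent per partition.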
14   import rockset  # type: ignore[import]
23   # NB: Rockset has an upper limit of 5000 documents in one request
27   def _get_request_headers() -> dict[str, str]:
34   def _get_artifact_urls(prefix: str, workflow_run_id: int) -> dict[Path, str]:
35       """Get all workflow artifacts with 'test-report' in the name."""
56   ) -> Path:
59   # re-run a workflow and produce a new set of artifacts. To avoid name
60   # collisions, we add `-runattempt1<run #>-` somewhere in the artifact name.
64   atoms = str(artifact_name).split("-")
84   ) -> list[Path]:
[all …]
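The comment in this result notes that Rockset caps a single request at 5000 documents, so an uploader has to split its document list into request-sized batches. A minimal sketch of that chunking step; the helper name `chunk_documents` is hypothetical, and the actual upload call is not visible in the truncated result:

```python
from typing import Any

# Upper bound per request, per the snippet's comment about Rockset.
BATCH_SIZE = 5000


def chunk_documents(
    docs: list[dict[str, Any]], size: int = BATCH_SIZE
) -> list[list[dict[str, Any]]]:
    """Split a document list into batches no larger than `size`."""
    return [docs[i : i + size] for i in range(0, len(docs), size)]


batches = chunk_documents([{"id": i} for i in range(12_001)])
print([len(b) for b in batches])  # → [5000, 5000, 2001]
```

Each batch would then be sent as one request, keeping every request under the documented limit.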
5    …workflows: [inductor-A100-perf-nightly, inductor-perf-nightly-A10g, inductor-perf-nightly-aarch64, …
7      - completed
10   get-conclusion:
11     runs-on: ubuntu-latest
13     conclusion: ${{ fromJson(steps.get-conclusion.outputs.data).conclusion }}
15     - name: Get workflow run conclusion
16       uses: octokit/request-action@v2.1.0
17       id: get-conclusion
23   upload-perf-stats:
24     needs: get-conclusion
[all …]
5    …trunk, periodic, inductor, unstable, slow, unstable-periodic, inductor-periodic, rocm, inductor-mi…
7      - completed
11   …ion adapted from https://github.com/community/community/discussions/21090#discussioncomment-3226271
14     runs-on: ubuntu-latest
18     - name: Get workflow run conclusion
19       uses: octokit/request-action@v2.1.0
26   upload-test-stats:
32     runs-on: ubuntu-22.04
33     environment: upload-stats
36     - name: Print workflow information
[all …]
6    build-environment:
9      description: Top-level label for what's being built/tested.
10   docker-image:
19   run-doxygen:
24   sync-tag:
30     job with the same `sync-tag` is identical.
31   s3-bucket:
35     default: "gha-artifacts"
36   aws-role-to-assume:
41   upload-aws-role-to-assume:
[all …]
5    # This source code is licensed under the BSD-style license found in the
8    # NB: the following functions are used in Meta-internal workflows
379  # This query needs read-org permission
444  RE_GHSTACK_HEAD_REF = re.compile(r"^(gh/[^/]+/[0-9]+/)head$")
448  r"https://github.com/(?P<owner>[^/]+)/(?P<repo>[^/]+)/pull/(?P<number>[0-9]+)",
452  RE_DIFF_REV = re.compile(r"^Differential Revision:.+?(D[0-9]+)", re.MULTILINE)
460  INTERNAL_CHANGES_CHECKRUN_NAME = "Meta Internal-Only Changes Check"
464  # This could be set to -1 to ignore all flaky and broken trunk failures. On the
469  def gh_get_pr_info(org: str, proj: str, pr_no: int) -> Any:
475  def gh_get_team_members(org: str, name: str) -> List[str]:
[all …]
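Two of the compiled patterns in this result are shown in full, so their behavior can be demonstrated directly. A small sketch reproducing them (the example inputs `gh/someuser/123/head` and the commit-message body are made up for illustration):

```python
import re

# Patterns copied verbatim from the search result above.
RE_GHSTACK_HEAD_REF = re.compile(r"^(gh/[^/]+/[0-9]+/)head$")
RE_DIFF_REV = re.compile(r"^Differential Revision:.+?(D[0-9]+)", re.MULTILINE)

# A ghstack head ref: group 1 captures the `gh/<user>/<num>/` prefix.
m = RE_GHSTACK_HEAD_REF.match("gh/someuser/123/head")
print(m.group(1))  # → gh/someuser/123/

# A commit message body: MULTILINE lets ^ anchor at each line start,
# and group 1 captures the Phabricator diff id.
body = "Summary: fix a bug\nDifferential Revision: D12345678\n"
rev = RE_DIFF_REV.search(body)
print(rev.group(1))  # → D12345678
```

The `RE_PULL_REQUEST_RESOLVED` pattern is only partially visible in the truncated result, so it is left out here.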
… xla hash.",
13   "headRefName": "update-xla-commit-hash/5573005593-54-1",
…