Searched full:"upload-stats-to-rockset" (Results 1 – 16 of 16) sorted by relevance

/external/pytorch/tools/stats/
README.md
1 # PyTorch CI Stats
3 We track various stats about each CI job.
5 1. Jobs upload their artifacts to an intermediate data store (either GitHub
7 …/pytorch/blob/a9f6a35a33308f3be2413cc5c866baec5cfe3ba1/.github/workflows/_linux-build.yml#L144-L151
9 …`upload-test-stats.yml`](https://github.com/pytorch/pytorch/blob/d9fca126fca7d7780ae44170d30bda901…
10 3. `upload-test-stats` downloads the raw stats from the intermediate data store
11 and uploads them as JSON to Rockset, our metrics backend.
15 J1[Job with AWS creds<br>e.g. linux, win] --raw stats--> S3[(AWS S3)]
16 J2[Job w/o AWS creds<br>e.g. mac] --raw stats--> GHA[(GH artifacts)]
18 S3 --> uts[upload-test-stats.yml]
[all …]
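Step 3 above boils down to: list the raw-stats objects a job wrote to the intermediate store, parse them, and push the resulting JSON documents to Rockset. A minimal sketch of that flow, assuming the v1 `rockset` Python client used elsewhere in `tools/stats`; the bucket name, key layout, and collection name are placeholders:

```python
import json
import os

import boto3
import rockset  # v1 Python client, as used elsewhere in tools/stats

BUCKET = "raw-job-stats"  # hypothetical bucket name for illustration


def upload_raw_stats_to_rockset(workflow_run_id: int, attempt: int) -> None:
    s3 = boto3.client("s3")
    prefix = f"test-reports/{workflow_run_id}/{attempt}/"  # assumed key layout
    docs = []
    for page in s3.get_paginator("list_objects_v2").paginate(
        Bucket=BUCKET, Prefix=prefix
    ):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            docs.append(json.loads(body))

    rs = rockset.RocksetClient(
        host="api.usw2a1.rockset.com", api_key=os.environ["ROCKSET_API_KEY"]
    )
    # Rockset ingests schemaless JSON documents; no table definition needed.
    rs.Documents.add_documents(workspace="commons", collection="test_run", data=docs)
```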
upload_dynamo_perf_stats.py
13 from tools.stats.upload_stats_lib import (
22 "test-reports",
25 r"test-reports-test-(?P<name>[\w\-]+)-\d+-\d+-(?P<runner>[\w\.-]+)_(?P<job>\d+).zip"
35 ) -> list[dict[str, Any]]:
47 # Unzip to get perf stats csv files
73 "workflow_id": workflow_run_id, # type: ignore[dict-item]
74 "run_attempt": workflow_run_attempt, # type: ignore[dict-item]
90 def generate_partition_key(repo: str, doc: Dict[str, Any]) -> str:
99 hash_content = hashlib.md5(json.dumps(doc).encode("utf-8")).hexdigest()
105 description="Upload dynamo perf stats from S3 to Rockset"
[all …]
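The `generate_partition_key` helper shown above gives every document a content-addressed id, so re-running the uploader overwrites rows instead of duplicating them. A sketch of how such a key can be assembled (the exact key layout is an assumption; the md5-over-JSON hash comes from the snippet):

```python
import hashlib
import json
from typing import Any


def generate_partition_key(repo: str, doc: dict[str, Any]) -> str:
    """Derive a stable, content-addressed key for one perf-stats document."""
    workflow_id = doc["workflow_id"]
    run_attempt = doc["run_attempt"]
    # Identical payloads hash to the same key, making uploads idempotent.
    hash_content = hashlib.md5(json.dumps(doc).encode("utf-8")).hexdigest()
    # Assumed layout: repo/workflow/attempt/content-hash.
    return f"{repo}/{workflow_id}/{run_attempt}/{hash_content}"
```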
upload_test_stat_aggregates.py
11 import rockset # type: ignore[import]
13 from tools.stats.upload_stats_lib import upload_to_s3
16 def get_oncall_from_testfile(testfile: str) -> list[str] | None:
42 def get_test_stat_aggregates(date: datetime.date) -> Any:
43 # Initialize the Rockset client with your API key
45 rockset_api_server = "api.rs2.usw2.rockset.com"
47 rs = rockset.RocksetClient(host="api.usw2a1.rockset.com", api_key=rockset_api_key)
49 # Define the name of the Rockset collection and lambda function
53 rockset.models.QueryParameter(name="startTime", type="string", value=iso_date)
70 description="Upload test stat aggregates to Rockset."
[all …]
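The aggregates script does not scan raw documents itself; it executes a saved Rockset query lambda, parameterized by the start date. A sketch of that call with the v1 client (the lambda name and version are placeholders; the client constructor and `QueryParameter` usage come from the snippet):

```python
import datetime
import os
from typing import Any

import rockset  # type: ignore[import]


def get_test_stat_aggregates(date: datetime.date) -> Any:
    rs = rockset.RocksetClient(
        host="api.usw2a1.rockset.com", api_key=os.environ["ROCKSET_API_KEY"]
    )
    response = rs.QueryLambdas.execute_query_lambda(
        query_lambda="test_stat_aggregates",  # placeholder lambda name
        version="1",  # placeholder version
        workspace="commons",
        parameters=[
            rockset.models.QueryParameter(
                name="startTime", type="string", value=date.isoformat()
            )
        ],
    )
    return response.results
```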
upload_test_stats_intermediate.py
4 from tools.stats.test_dashboard import upload_additional_info
5 from tools.stats.upload_test_stats import get_tests
9 parser = argparse.ArgumentParser(description="Upload test stats to Rockset")
11 "--workflow-run-id",
13 help="id of the workflow to get artifacts from",
16 "--workflow-run-attempt",
27 # Flush stdout so that any errors in Rockset upload show up last in the logs.
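Pieced together, this truncated driver is a thin wrapper: parse the run coordinates, fetch the per-test records, then flush stdout so a crash in the upload step prints after everything else in the logs. A sketch, assuming `get_tests` takes the run id and attempt:

```python
import argparse
import sys

from tools.stats.test_dashboard import upload_additional_info
from tools.stats.upload_test_stats import get_tests

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Upload test stats to Rockset")
    parser.add_argument(
        "--workflow-run-id",
        required=True,
        help="id of the workflow to get artifacts from",
    )
    parser.add_argument("--workflow-run-attempt", type=int, required=True)
    args = parser.parse_args()

    test_cases = get_tests(args.workflow_run_id, args.workflow_run_attempt)

    # Flush stdout so that any errors in the Rockset upload show up last in the logs.
    sys.stdout.flush()

    upload_additional_info(args.workflow_run_id, args.workflow_run_attempt, test_cases)
```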
upload_test_stats.py
12 from tools.stats.test_dashboard import upload_additional_info
13 from tools.stats.upload_stats_lib import (
26 ) -> list[dict[str, Any]]:
27 """Convert a test report xml file into a JSON-serializable list of test cases."""
46 # jit/test_dce.py). For sharding/test selection purposes, we want to
49 # To do this, we leverage an implementation detail of how we write out
58 def process_xml_element(element: ET.Element) -> dict[str, Any]:
59 """Convert a test suite element into a JSON-serializable dict."""
69 # The XML format encodes all values as strings. Convert to ints/floats if
70 # possible to make aggregation possible in Rockset.
[all …]
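The coercion the comment describes is the crux of `process_xml_element`: JUnit XML stores every attribute as a string, so numeric-looking values are converted before upload to make aggregation possible in Rockset. A sketch of that conversion (child-element and text handling are omitted):

```python
import xml.etree.ElementTree as ET
from typing import Any


def process_xml_element(element: ET.Element) -> dict[str, Any]:
    """Convert a test suite element into a JSON-serializable dict."""
    ret: dict[str, Any] = {}
    for key, value in element.attrib.items():
        # The XML format encodes all values as strings. Convert to
        # ints/floats where possible so Rockset can sum/average them.
        try:
            ret[key] = int(value)
        except ValueError:
            try:
                ret[key] = float(value)
            except ValueError:
                ret[key] = value
    return ret
```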
upload_sccache_stats.py
10 from tools.stats.upload_stats_lib import (
18 ) -> list[dict[str, Any]]:
24 download_s3_artifacts("sccache-stats", workflow_run_id, workflow_run_attempt)
34 parser = argparse.ArgumentParser(description="Upload test stats to Rockset")
36 "--workflow-run-id",
39 help="id of the workflow to get artifacts from",
42 "--workflow-run-attempt",
48 stats = get_sccache_stats(args.workflow_run_id, args.workflow_run_attempt)
50 args.workflow_run_id, args.workflow_run_attempt, "sccache_stats", stats
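`get_sccache_stats` follows the same download-then-parse pattern as the other uploaders. A sketch, under the assumption that `download_s3_artifacts` returns paths to JSON files of compiler-cache counters:

```python
import json
from typing import Any

from tools.stats.upload_stats_lib import download_s3_artifacts


def get_sccache_stats(
    workflow_run_id: int, workflow_run_attempt: int
) -> list[dict[str, Any]]:
    stats = []
    # Each "sccache-stats" artifact is assumed to be one JSON file.
    for path in download_s3_artifacts(
        "sccache-stats", workflow_run_id, workflow_run_attempt
    ):
        with open(path) as f:
            stats.append(json.load(f))
    return stats
```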
upload_external_contrib_stats.py
13 from tools.stats.upload_stats_lib import upload_to_s3
18 "facebook-github-bot",
19 "pytorch-bot[bot]",
32 ) -> Any:
44 key in err.headers for key in ["X-RateLimit-Limit", "X-RateLimit-Used"]
47 … f"Rate limit exceeded: {err.headers['X-RateLimit-Used']}/{err.headers['X-RateLimit-Limit']}"
56 ) -> list[dict[str, Any]]:
70 ) -> list[dict[str, Any]]:
77 period_end_date = period_begin_date + datetime.timedelta(days=period_length - 1)
87 … label:"open source" label:Merged -label:Reverted closed:{period_begin_date}..{period_end_date}',
[all …]
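The header check above is how the script tells a GitHub API quota failure apart from any other HTTP error. In isolation, the pattern looks like this (the fetch wrapper itself is assumed; the header names and message come from the snippet):

```python
import urllib.error
import urllib.request


def fetch_github_json(url: str, token: str) -> bytes:
    req = urllib.request.Request(url, headers={"Authorization": f"token {token}"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read()
    except urllib.error.HTTPError as err:
        # GitHub reports quota usage in these headers; when both are
        # present, surface a clearer message than a generic 403.
        if all(key in err.headers for key in ["X-RateLimit-Limit", "X-RateLimit-Used"]):
            raise RuntimeError(
                f"Rate limit exceeded: "
                f"{err.headers['X-RateLimit-Used']}/{err.headers['X-RateLimit-Limit']}"
            ) from err
        raise
```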
check_disabled_tests.py
11 from tools.stats.upload_stats_lib import (
17 from tools.stats.upload_test_stats import process_xml_element
26 ) -> dict[str, dict[str, int]]:
28 Return a list of disabled tests that should be re-enabled and those that are still
35 # * Success test should be re-enable if it's green after rerunning in all platforms
37 # * Failures from pytest because pytest-flakefinder is used to run the same test
41 # We want to keep track of how many times the test fails (num_red) or passes (num_green)
47 # Under --rerun-disabled-tests mode, a test is skipped when:
82 # Under --rerun-disabled-tests mode, if a test is not skipped or failed, it's
86 stats = json.loads(skipped.get("message", ""))
[all …]
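The re-enablement decision reduces to two counters per disabled test, accumulated across flakefinder reruns and platforms. A simplified sketch of that bookkeeping (the script's actual skip handling and thresholds are more involved):

```python
from collections import defaultdict
from typing import Iterable


def tally_rerun_results(
    results: Iterable[tuple[str, bool]],
) -> dict[str, dict[str, int]]:
    """Count passes (num_green) and failures (num_red) per disabled test."""
    counts: dict[str, dict[str, int]] = defaultdict(
        lambda: {"num_green": 0, "num_red": 0}
    )
    for test_name, passed in results:
        counts[test_name]["num_green" if passed else "num_red"] += 1
    return counts


def should_reenable(stats: dict[str, int]) -> bool:
    # Simplified rule: re-enable only if every rerun was green.
    return stats["num_red"] == 0 and stats["num_green"] > 0
```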
upload_metrics.py
15 # Keeping this logic here so that callers don't have to
23 print(f"Unable to import boto3. Will not be emitting metrics.... Reason: {e}")
26 # another, so we need to specify the table's ARN explicitly.
28 "arn:aws:dynamodb:us-east-1:308535385114:table/torchci-metrics"
36 # Used to cast the value of the env_var to the correct type (defaults to str)
45 ) -> None:
51 def value(self) -> Any:
54 # Github CI will set some env vars to an empty string
62 "environment variable to pass in this value."
73 def add_global_metric(metric_name: str, metric_value: Any) -> None:
[all …]
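Two details in this snippet are worth spelling out: metrics emission degrades gracefully when boto3 is unavailable, and env-var-backed metrics carry a caster because CI exports everything as strings, sometimes empty ones. A sketch of both (the comments come from the snippet; the class name and structure are a guess):

```python
import os
from dataclasses import dataclass
from typing import Any, Callable

try:
    import boto3  # type: ignore[import]

    EMIT_METRICS = True
except ImportError as e:
    print(f"Unable to import boto3. Will not be emitting metrics.... Reason: {e}")
    EMIT_METRICS = False


@dataclass
class EnvVarMetric:
    name: str
    env_var: str
    required: bool = True
    # Used to cast the value of the env_var to the correct type (defaults to str).
    type_conversion_fn: Callable[[str], Any] = str

    def value(self) -> Any:
        value = os.environ.get(self.env_var)
        # GitHub CI will set some env vars to an empty string; treat that as unset.
        if not value:
            if self.required:
                raise ValueError(
                    f"Missing {self.name}. Please set the {self.env_var} "
                    "environment variable to pass in this value."
                )
            return None
        return self.type_conversion_fn(value)
```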
/external/pytorch/.github/workflows/
upload-torch-dynamo-perf-stats.yml
1 name: Upload torch dynamo performance stats
5 …workflows: [inductor-A100-perf-nightly, inductor-perf-nightly-A10g, inductor-perf-nightly-aarch64,…
7 - completed
10 get-conclusion:
11 runs-on: ubuntu-latest
13 conclusion: ${{ fromJson(steps.get-conclusion.outputs.data).conclusion }}
15 - name: Get workflow run conclusion
16 uses: octokit/request-action@v2.1.0
17 id: get-conclusion
23 upload-perf-stats:
[all …]
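A `workflow_run` trigger fires whether the upstream run succeeded, failed, or was cancelled, so the `get-conclusion` job first asks the GitHub API what the triggering run concluded. The equivalent of its octokit request, sketched in Python:

```python
import json
import os
import urllib.request


def get_workflow_run_conclusion(repo: str, run_id: int) -> str:
    """GET /repos/{repo}/actions/runs/{run_id} and read its conclusion."""
    req = urllib.request.Request(
        f"https://api.github.com/repos/{repo}/actions/runs/{run_id}",
        headers={"Authorization": f"token {os.environ['GITHUB_TOKEN']}"},
    )
    with urllib.request.urlopen(req) as resp:
        # e.g. "success", "failure", "cancelled"
        return json.load(resp)["conclusion"]


# Downstream jobs gate on this, e.g.:
# if get_workflow_run_conclusion("pytorch/pytorch", run_id) == "success": upload()
```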
nightly-rockset-uploads.yml
1 name: Nightly Upload to rockset
6 - cron: 37 7 * * *
9 - 'tools/stats/upload_external_contrib_stats.py'
10 - 'tools/stats/upload_test_stat_aggregates.py'
11 - '.github/workflows/nightly-rockset-uploads.yml'
14 …group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}-${{ github.eve…
15 cancel-in-progress: true
19 upload-stats-to-rockset:
20 runs-on: ubuntu-22.04
21 environment: upload-stats
[all …]
upload-test-stats.yml
1 name: Upload test stats
5 …trunk, periodic, inductor, unstable, slow, unstable-periodic, inductor-periodic, rocm, inductor-mi…
7 - completed
11 …ion adapted from https://github.com/community/community/discussions/21090#discussioncomment-3226271
14 runs-on: ubuntu-latest
18 - name: Get workflow run conclusion
19 uses: octokit/request-action@v2.1.0
26 upload-test-stats:
32 runs-on: ubuntu-22.04
33 environment: upload-stats
[all …]
upload-alerts.yml
1 # upload alerts every 10 minutes
3 name: Upload Alerts to AWS/Rockset
7 - cron: '*/10 * * * *'
10 - 'tools/alerts/create_alerts.py'
11 - '.github/workflows/upload-alerts.yml'
14 upload-alerts:
16 runs-on: ubuntu-22.04
17 environment: upload-stats
19 - name: Checkout repo
22 fetch-depth: 1
[all …]
_win-test.yml
1 name: win-test
6 build-environment:
9 description: Top-level label for what's being built/tested.
10 cuda-version:
13 description: What CUDA version to build with, "cpu" for none.
14 test-matrix:
17 description: JSON description of what test configs to run.
18 sync-tag:
23 If this is set, our linter will use this to make sure that every other
24 job with the same `sync-tag` is identical.
[all …]
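`sync-tag` gives the linter a cross-file handle: gather every job that declares the same tag and verify the definitions are identical. A rough PyYAML-based sketch of such a check (this is not the actual linter, and the `with:`-level placement of the tag is an assumption):

```python
from collections import defaultdict
from pathlib import Path

import yaml  # pip install pyyaml


def check_sync_tags(workflow_dir: str = ".github/workflows") -> list[str]:
    """Report sync-tag groups whose job definitions have diverged."""
    groups: dict[str, list] = defaultdict(list)
    for path in sorted(Path(workflow_dir).glob("*.yml")):
        jobs = (yaml.safe_load(path.read_text()) or {}).get("jobs") or {}
        for name, job in jobs.items():
            tag = (job.get("with") or {}).get("sync-tag")
            if tag:
                groups[tag].append((f"{path.name}:{name}", job))
    errors = []
    for tag, tagged in groups.items():
        baseline_name, baseline = tagged[0]
        for name, job in tagged[1:]:
            if job != baseline:
                errors.append(f"{name} differs from {baseline_name} (sync-tag: {tag})")
    return errors
```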
/external/pytorch/.github/scripts/
rockset_mocks.json.gz
gql_mocks.json.gz ... xla hash.", 13 "headRefName": "update-xla-commit-hash/5573005593-54-1", ...