# BigQuery
Load JSONL execution traces into a BigQuery table to query across builds.
- Export traces to a JSONL file:

  ```shell
  grog traces export --format=jsonl --output traces.jsonl
  ```

- Create a BigQuery dataset and load the data:

  ```shell
  bq mk --dataset my_project:grog_traces
  bq load \
    --source_format=NEWLINE_DELIMITED_JSON \
    --autodetect \
    my_project:grog_traces.builds \
    traces.jsonl
  ```
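Before running `bq load`, it can help to sanity-check the export. The sketch below is an illustration, not part of grog: it verifies that each JSONL line parses and carries the top-level fields the example queries on this page rely on. The field list is inferred from those queries and may not match your exact trace schema.

```python
import json

# Top-level fields referenced by the example queries on this page.
# Assumed from those queries; adjust to your actual trace schema.
REQUIRED_FIELDS = {
    "start_time_unix_millis",
    "command",
    "total_duration_millis",
    "cache_hit_count",
    "total_targets",
}

def check_traces(path):
    """Return a list of (line_number, problem) pairs for a JSONL export."""
    problems = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            if not line.strip():
                continue  # tolerate a trailing blank line
            try:
                record = json.loads(line)
            except json.JSONDecodeError as e:
                problems.append((lineno, f"invalid JSON: {e}"))
                continue
            missing = REQUIRED_FIELDS - record.keys()
            if missing:
                problems.append((lineno, f"missing fields: {sorted(missing)}"))
    return problems
```

Run it against `traces.jsonl` and skip the upload if `problems` is non-empty.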
## Example queries

Average build duration over the last 30 days:
```sql
SELECT
  DATE(TIMESTAMP_MILLIS(start_time_unix_millis)) AS build_date,
  command,
  AVG(total_duration_millis) / 1000 AS avg_duration_sec,
  COUNT(*) AS build_count
FROM `my_project.grog_traces.builds`
WHERE start_time_unix_millis > UNIX_MILLIS(TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY))
GROUP BY build_date, command
ORDER BY build_date DESC
```

Slowest targets by average command duration:
```sql
SELECT
  span.label,
  AVG(span.command_duration_millis) AS avg_cmd_ms,
  AVG(span.queue_wait_millis) AS avg_queue_ms,
  COUNT(*) AS executions
FROM `my_project.grog_traces.builds`, UNNEST(spans) AS span
WHERE span.cache_result = 'CACHE_MISS'
GROUP BY span.label
HAVING executions > 5
ORDER BY avg_cmd_ms DESC
LIMIT 20
```

Cache hit rate trend:
```sql
SELECT
  DATE(TIMESTAMP_MILLIS(start_time_unix_millis)) AS build_date,
  SUM(cache_hit_count) AS hits,
  SUM(total_targets) AS total,
  SAFE_DIVIDE(SUM(cache_hit_count), SUM(total_targets)) * 100 AS hit_rate_pct
FROM `my_project.grog_traces.builds`
GROUP BY build_date
ORDER BY build_date DESC
```
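The same hit-rate trend can be computed locally from the exported `traces.jsonl`, which is a handy cross-check that the loaded numbers look sane. A minimal sketch using only the fields the query above references:

```python
import json
from collections import defaultdict
from datetime import datetime, timezone

def cache_hit_rate_by_day(path):
    """Local mirror of the BigQuery hit-rate query, over a JSONL export."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            build = json.loads(line)
            # BigQuery's TIMESTAMP_MILLIS/DATE pair, done with datetime:
            day = datetime.fromtimestamp(
                build["start_time_unix_millis"] / 1000, tz=timezone.utc
            ).date().isoformat()
            hits[day] += build["cache_hit_count"]
            totals[day] += build["total_targets"]
    # None on a zero denominator, matching SAFE_DIVIDE
    return {
        day: (100 * hits[day] / totals[day]) if totals[day] else None
        for day in sorted(totals, reverse=True)
    }
```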
## Automating ingestion in CI

Add a step to your CI pipeline that exports and loads traces after each build:
```yaml
- name: Export and upload traces
  run: |
    grog traces export --format=jsonl --limit 1 --output trace.jsonl
    bq load --source_format=NEWLINE_DELIMITED_JSON \
      my_project:grog_traces.builds trace.jsonl
```