@@ -0,0 +1,38 @@
---
title: Query Analytics Engine from the command line
description: Wrangler now supports querying Workers Analytics Engine datasets directly from the CLI.
products:
- workers-analytics-engine
- workers
date: 2025-12-30
---

import { TypeScriptExample } from "~/components";

Wrangler now supports querying [Workers Analytics Engine](/analytics/analytics-engine/) directly from the command line, allowing you to quickly explore your datasets without writing code or ad-hoc `curl` commands.

Workers Analytics Engine lets you ingest high-cardinality data at scale and query it using SQL. Use it to build custom analytics, usage-based billing, or observability tools. With the new `wrangler analytics-engine` command, you can run ad-hoc queries, debug data pipelines, and automate reports from your terminal.

For example, if your Worker tracks product usage:

<TypeScriptExample>

```ts
env.USAGE.writeDataPoint({
	indexes: ["customer_12345"],
	blobs: ["acme-app", "user_789", "file.upload", "enterprise"],
	doubles: [1, 2048000], // count, bytes
});
```

</TypeScriptExample>

You can query that data directly from your terminal:

```sh
npx wrangler analytics-engine run "SELECT blob1 AS app, blob3 AS action, SUM(double1) AS total FROM usage GROUP BY app, action ORDER BY total DESC"
```

Output defaults to table format in interactive terminals and JSON when piped to other tools like `jq`. Use `--file` to run queries from a SQL file.
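
For instance, a quick sketch of both options, reusing the query above (the `top_actions.sql` file name is a hypothetical example):

```sh
# Pipe the JSON output to jq to inspect the first row
npx wrangler analytics-engine run "SELECT blob1 AS app, blob3 AS action, SUM(double1) AS total FROM usage GROUP BY app, action ORDER BY total DESC" | jq '.data[0]'

# Run the same query saved in a SQL file
npx wrangler analytics-engine run --file=top_actions.sql
```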

To get started, see [Querying from CLI](/analytics/analytics-engine/cli-querying/) or the [wrangler analytics-engine command reference](/workers/wrangler/commands/#analytics-engine).
107 changes: 107 additions & 0 deletions src/content/docs/analytics/analytics-engine/cli-querying.mdx
@@ -0,0 +1,107 @@
---
title: Querying from CLI
pcx_content_type: reference
sidebar:
order: 5
head:
- tag: title
content: Querying Workers Analytics Engine from CLI
---

You can query Workers Analytics Engine datasets directly from the command line using Wrangler. This is useful for ad-hoc queries, debugging, and scripting.

## Prerequisites

Before you can query Analytics Engine from the CLI, you need:

1. [Wrangler installed](/workers/wrangler/install-and-update/) (version 4.0.0 or later)
2. Authentication configured via `wrangler login` or a `CLOUDFLARE_API_TOKEN` environment variable with the `Account Analytics Read` permission (see the example below)
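
For example, a minimal sketch of the token-based option (the token value shown is a placeholder):

```sh
# Authenticate with an API token instead of `wrangler login` (placeholder value)
export CLOUDFLARE_API_TOKEN=<your-analytics-read-token>

# Verify access by listing your datasets
npx wrangler analytics-engine run "SHOW TABLES"
```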

## Basic usage

Run a query directly:

```sh
npx wrangler analytics-engine run "SELECT * FROM my_dataset LIMIT 10"
```

Run a query from a file:

```sh
npx wrangler analytics-engine run --file=query.sql
```

## Output formats

By default, results are displayed as a table when running interactively, or as JSON when piped to another command.

You can explicitly set the output format:

```sh
# Table format (default for interactive)
npx wrangler analytics-engine run "SELECT * FROM my_dataset" --format=table

# JSON format (default for non-interactive)
npx wrangler analytics-engine run "SELECT * FROM my_dataset" --format=json
```

JSON output is useful for piping to other tools like `jq`:

```sh
npx wrangler analytics-engine run "SELECT * FROM my_dataset" --format=json | jq '.data[0]'
```

## Example queries

### List all datasets

```sh
npx wrangler analytics-engine run "SHOW TABLES"
```

### Query with time filtering

```sh
npx wrangler analytics-engine run "SELECT blob1 AS city, AVG(double1) AS avg_temp FROM weather WHERE timestamp > NOW() - INTERVAL '1' DAY GROUP BY city"
```

### Query from a file

Create a file `query.sql`:

```sql
SELECT
  intDiv(toUInt32(timestamp), 300) * 300 AS t,
  blob1 AS city,
  SUM(_sample_interval * double1) / SUM(_sample_interval) AS avg_temp
FROM weather
WHERE timestamp >= NOW() - INTERVAL '1' DAY
GROUP BY t, city
ORDER BY t, city
```

Then run:

```sh
npx wrangler analytics-engine run --file=query.sql
```

## Use in scripts

The CLI is useful for automation and scripting. For example, you can create a daily report:

```bash
#!/bin/bash

# Query yesterday's data and save to a file
npx wrangler analytics-engine run \
"SELECT blob1 AS endpoint, SUM(_sample_interval) AS request_count FROM api_requests WHERE timestamp >= NOW() - INTERVAL '1' DAY GROUP BY endpoint ORDER BY request_count DESC" \
--format=json > daily_report.json
```
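
As a further sketch, you could post-process that JSON with `jq` into a plain-text summary. This assumes the rows appear as objects keyed by column name under `.data`, as in the `jq '.data[0]'` example above:

```bash
# Turn the JSON report into a simple "endpoint: request_count" summary
jq -r '.data[] | "\(.endpoint): \(.request_count)"' daily_report.json > daily_report.txt
```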

## Related resources

- [SQL API reference](/analytics/analytics-engine/sql-api/) - Direct HTTP API access
- [SQL reference](/analytics/analytics-engine/sql-reference/) - Full SQL syntax documentation
- [Querying from a Worker](/analytics/analytics-engine/worker-querying/) - Query from within a Worker
- [Querying from Grafana](/analytics/analytics-engine/grafana/) - Visualize data in Grafana
19 changes: 13 additions & 6 deletions src/content/docs/analytics/analytics-engine/get-started.mdx
@@ -61,16 +61,17 @@ Currently, the `writeDataPoint()` API accepts ordered arrays of values. This mea

## 3. Query data using the SQL API

You can query the data you have written in two ways:
You can query the data you have written in several ways:

* [**SQL API**](/analytics/analytics-engine/sql-api) — Best for writing your own queries and integrating with external tools like Grafana.
* [**GraphQL API**](/analytics/graphql-api/) — This is the same API that powers the Cloudflare dashboard.
* [**Wrangler CLI**](/analytics/analytics-engine/cli-querying/) — Query directly from the command line using `wrangler analytics-engine`.
* [**SQL API**](/analytics/analytics-engine/sql-api/) — Best for writing your own queries and integrating with external tools like Grafana.
* [**GraphQL API**](/analytics/graphql-api/) — This is the same API that powers the Cloudflare dashboard.

For the purpose of this example, we will use the SQL API.
For the purpose of this example, we will use Wrangler or the SQL API.

### Create an API token

Create an [API Token](https://dash.cloudflare.com/profile/api-tokens) that has the `Account Analytics Read` permission.
Create an [API Token](https://dash.cloudflare.com/profile/api-tokens) that has the `Account Analytics Read` permission. If you are using Wrangler and have already run `wrangler login`, you can skip this step.

### Write your first query

@@ -92,7 +93,13 @@ LIMIT 10
We are using a custom averaging function to take [sampling](/analytics/analytics-engine/sql-api/#sampling) into account.
:::

You can run this query by making an HTTP request to the SQL API:
You can run this query using Wrangler:

```sh
npx wrangler analytics-engine run "SELECT blob1 AS city, SUM(_sample_interval * double2) / SUM(_sample_interval) AS avg_humidity FROM WEATHER WHERE double1 > 0 GROUP BY city ORDER BY avg_humidity DESC LIMIT 10"
```

Or by making an HTTP request to the SQL API:

```bash
curl "https://api.cloudflare.com/client/v4/accounts/{account_id}/analytics_engine/sql" \
44 changes: 44 additions & 0 deletions src/content/docs/workers/wrangler/commands.mdx
@@ -47,6 +47,7 @@
- [`pages`](#pages) - Configure Cloudflare Pages.
- [`pipelines`](#pipelines) - Configure Cloudflare Pipelines.
- [`queues`](#queues) - Configure Workers Queues.
- [`analytics-engine`](#analytics-engine) - Query Workers Analytics Engine datasets.
- [`login`](#login) - Authorize Wrangler with your Cloudflare account using OAuth.
- [`logout`](#logout) - Remove Wrangler's authorization for accessing your account.
- [`whoami`](#whoami) - Retrieve your user information and test your authentication configuration.
@@ -496,7 +497,7 @@
✓ Select an account: › My account
| Name | ID | StoreID | Comment | Scopes | Status | Created | Modified |
|-----------------------------|-------------------------------------|-------------------------------------|---------|---------|---------|------------------------|------------------------|
| ServiceA_key-1 | 13bc7498c6374a4e9d13be091c3c65f1 | 8f7a1cdced6342c18d223ece462fd88d | | workers | active | 4/9/2025, 10:06:01 PM | 4/15/2025, 09:13:05 AM |
```

<WranglerCommand command="secrets-store secret delete" headingLevel={3} />
@@ -550,7 +551,7 @@
┌─────────┬──────────────────────────────────┬──────────────────────────────────┬──────────────────────┬──────────────────────┐
│ Name │ ID │ AccountID │ Created │ Modified │
├─────────┼──────────────────────────────────┼──────────────────────────────────┼──────────────────────┼──────────────────────┤
│ default │ 8876bad33f164462bf0743fe8adf98f4 │ REDACTED │ 4/9/2025, 1:11:48 PM │ 4/9/2025, 1:11:48 PM │
└─────────┴──────────────────────────────────┴──────────────────────────────────┴──────────────────────┴──────────────────────┘
```

@@ -613,6 +614,49 @@

---

## `analytics-engine`

Query [Workers Analytics Engine](/analytics/analytics-engine/) datasets using the SQL API.

:::note
`ae` is available as a shorthand alias for `analytics-engine`.
:::

```txt
wrangler analytics-engine run [QUERY] [OPTIONS]
```

- `QUERY` <Type text="string" /> <MetaInfo text="optional" />
- The SQL query to execute. Either `QUERY` or `--file` must be provided.
- `--file` <Type text="string" /> <MetaInfo text="optional" />
- Path to a file containing the SQL query to execute. Either `--file` or `QUERY` must be provided.
- `--format` <Type text="'json' | 'table'" /> <MetaInfo text="optional" />
- Output format for query results. Defaults to `table` when running in a TTY, `json` otherwise.

<Render file="wrangler-commands/global-flags" product="workers" />

### Examples

Query a dataset directly:

```sh
npx wrangler analytics-engine run "SELECT * FROM my_dataset LIMIT 10"
```

Query from a file:

```sh
npx wrangler analytics-engine run --file=query.sql
```

Output as JSON:

```sh
npx wrangler analytics-engine run "SELECT * FROM my_dataset" --format=json
```
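
Use the `ae` alias (shorthand for `analytics-engine`, as noted above):

```sh
npx wrangler ae run --file=query.sql
```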

---

## `login`

Authorize Wrangler with your Cloudflare account using OAuth. Wrangler will attempt to automatically open your web browser to login with your Cloudflare account.
@@ -818,7 +862,7 @@
Success! Uploaded mTLS Certificate my-origin-cert
ID: 99f5fef1-6cc1-46b8-bd79-44a0d5082b8d
Issuer: CN=my-secured-origin.com,OU=my-team,O=my-org,L=San Francisco,ST=California,C=US
Expires: 1/01/2025
```

You can then add this certificate as a [binding](/workers/runtime-apis/bindings/) in your [Wrangler configuration file](/workers/wrangler/configuration/):
@@ -847,13 +891,13 @@
ID: 99f5fef1-6cc1-46b8-bd79-44a0d5082b8d
Name: my-origin-cert
Issuer: CN=my-secured-origin.com,OU=my-team,O=my-org,L=San Francisco,ST=California,C=US
Created on: 1/01/2023
Expires: 1/01/2025

ID: c5d004d1-8312-402c-b8ed-6194328d5cbe
Issuer: CN=another-origin.com,OU=my-team,O=my-org,L=San Francisco,ST=California,C=US
Created on: 1/01/2023
Expires: 1/01/2025
```

<WranglerCommand command="mtls-certificate delete" headingLevel={3} />
@@ -892,7 +936,7 @@
Success! Uploaded mTLS Certificate my-origin-cert
ID: 99f5fef1-6cc1-46b8-bd79-44a0d5082b8d
Issuer: CN=my-secured-origin.com,OU=my-team,O=my-org,L=San Francisco,ST=California,C=US
Expires: 1/01/2025
```

Note that the certificate and private keys must be in separate (typically `.pem`) files when uploading.