---
title: Log Explorer
description: Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third-party tools.
image: https://developers.cloudflare.com/core-services-preview.png
---

# Log Explorer

Store and explore your Cloudflare logs directly within the Cloudflare dashboard or API.

Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third-party tools.

Log Explorer provides access to Cloudflare logs with all the context available within the Cloudflare platform. You can monitor security and performance issues with custom dashboards or investigate and troubleshoot issues with log search. Benefits include:

* **Reduced cost and complexity**: Drastically reduce the expense and operational overhead associated with forwarding, storing, and analyzing terabytes of log data in external tools.
* **Faster detection and triage**: Access Cloudflare-native logs directly, eliminating cumbersome data pipelines and the ingest lags that delay critical security insights.
* **Accelerated investigations with full context**: Investigate incidents with Cloudflare's unparalleled contextual data, accelerating your analysis and understanding of "What exactly happened?" and "How did it happen?"
* **Minimal recovery time**: Seamlessly transition from investigation to action with direct mitigation capabilities via the Cloudflare platform.

Contract customers can choose to store their logs in Log Explorer for up to two years, at an additional cost of $0.10 per GB per month. Customers interested in this feature can contact their account team to have it added to their contract.

## Permissions

Access to Log Explorer features is controlled through specific permissions. Each permission grants users the ability to perform certain actions, such as querying logs, managing datasets, or creating dashboards.

| Feature                     | Required Permission | Description                             |
| --------------------------- | ------------------- | --------------------------------------- |
| **Manage datasets**         | Logs Edit           | Add, enable, or disable datasets.       |
| **Log Search**              | Logs Read           | Query logs in the dashboard or via API. |
| **Log Search (save query)** | Logs Write          | Save log search queries.                |
| **Custom dashboards**       | Analytics Read      | Create and view custom dashboards.      |

These permissions apply across both the dashboard and the API, and must be granted at either the account or zone level depending on which datasets you need to access.

Authentication with the API can be done via an API token or API key with an email. Refer to [Create API token](https://developers.cloudflare.com/fundamentals/api/get-started/create-token/) for further instructions.
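For example, a minimal token-authenticated query might look like the following. `ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN` are placeholders, and the endpoint path shown is an assumption; confirm it against the Log Explorer API documentation before use.

```sh
# Placeholders: ACCOUNT_ID, CLOUDFLARE_API_TOKEN.
# The endpoint path below is illustrative; verify it in the Log Explorer API docs.
curl "https://api.cloudflare.com/client/v4/accounts/$ACCOUNT_ID/logs/explorer/query/sql" \
  --header "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  --get --data-urlencode "query=SELECT clientRequestHost FROM http_requests LIMIT 5"
```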

## Features

### Log Search

Explore your Cloudflare logs directly within the Cloudflare dashboard or [API](https://developers.cloudflare.com/log-explorer/api/).

[ Use Log Search ](https://developers.cloudflare.com/log-explorer/log-search/) 

### Custom dashboards

Design customized views for tracking application security, performance, and usage metrics.

[ Use Custom dashboards ](https://developers.cloudflare.com/log-explorer/custom-dashboards/) 

### Manage datasets

Manage the data you want to store within Log Explorer.

[ Use Manage datasets ](https://developers.cloudflare.com/log-explorer/manage-datasets/) 

### API

Manage configuration and perform queries via the API.

[ Use API ](https://developers.cloudflare.com/log-explorer/api/) 

## Related products

**[Logpush](https://developers.cloudflare.com/logs/)** 

Forward Cloudflare logs to third-party tools for debugging, identifying configuration adjustments, and creating analytics dashboards.

**[Analytics](https://developers.cloudflare.com/analytics/)** 

Visualize the metadata collected by our products in the Cloudflare dashboard.

---

---
title: Log Search
description: Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API, giving you visibility into your logs without the need to forward them to third-party services. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the dashboard or SQL API.
image: https://developers.cloudflare.com/core-services-preview.png
---

# Log Search

Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API, giving you visibility into your logs without the need to forward them to third-party services. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the dashboard or SQL API.

## When to use Log Explorer

Use Log Explorer when you need to investigate what actually happened with real production traffic:

* Analyzing historical data and trends
* Investigating security incidents after they occur
* Searching for patterns across thousands of requests
* Monitoring application performance over time
* Providing forensic evidence to support teams

Use [Trace](https://developers.cloudflare.com/rules/trace-request/) when you need to test what would happen with a simulated request:

* Understanding why a rule did not trigger as expected
* Testing how your rules handle different request scenarios
* Seeing the evaluation order of your rules
* Simulating requests from different geolocations or conditions

The key difference is that Log Explorer shows actual traffic, while Trace shows simulated "what-if" scenarios.

## Use Log Explorer

You can filter and view your logs via the Cloudflare dashboard or the API.

1. In the Cloudflare dashboard, go to the **Log Explorer** \> **Log Search** page.  
[ Go to **Log search** ](https://dash.cloudflare.com/?to=/:account/log-explorer/log-search)
2. Select the **Dataset** you want to use and, in **Columns**, select the dataset fields. If you selected a zone-scoped dataset, also select the zone you want to query.
3. Enter a **Limit**. A limit is the maximum number of results to return, for example, 50.
4. Select the **Time period** from which you want to query, for example, the previous 12 hours.
5. Select **Add filter** to create your query. Select a **Field**, an **Operator**, and a **Value**, then select **Apply**.
6. A query preview is displayed. Select **Custom SQL** to change the query.
7. Select **Run query** when you are done. The results are displayed below within the **Query results** section.

For example, to find an HTTP request with a specific [Ray ID](https://developers.cloudflare.com/fundamentals/reference/cloudflare-ray-id/), go to **Custom SQL**, and enter the following SQL query:

```sql
SELECT
  clientRequestScheme,
  clientRequestHost,
  clientRequestMethod,
  edgeResponseStatus,
  clientRequestUserAgent
FROM http_requests
WHERE RayID = '806c30a3cec56817'
LIMIT 1
```

As another example, to find Cloudflare Access requests with selected columns from a specific timeframe you could perform the following SQL query:

```sql
SELECT
  CreatedAt,
  AppDomain,
  AppUUID,
  Action,
  Allowed,
  Country,
  RayID,
  Email,
  IPAddress,
  UserUID
FROM access_requests
WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'
```

### Headers and cookies

To query request headers, response headers, and cookies you must first enable logging for these fields using [Custom fields](https://developers.cloudflare.com/logs/logpush/logpush-job/custom-fields/). Configure the list of custom fields using the API or the dashboard; there is no need to modify the Logpush job itself.

The example below shows how to query HTTP requests by date, timestamp, client country, and a custom request header. Be sure to log the specific headers or cookies you plan to query in advance.

```sql
SELECT clientip, clientrequesthost, clientrequestmethod, edgeendtimestamp, edgestarttimestamp, rayid, clientcountry, requestheaders
FROM http_requests
WHERE Date >= '2025-07-17'
  AND Date <= '2025-07-17'
  AND edgeendtimestamp >= '2025-07-17T07:54:19Z'
  AND edgeendtimestamp <= '2025-07-18T07:54:19Z'
  AND clientcountry = 'us'
  AND requestheaders."x-test-header" like '%654AM%';
```

### Save queries

After selecting all the fields for your query, you can save it by selecting **Save query**. Provide a name and description to help identify it later. To view your saved and recent queries, select **Queries**. They appear in a side panel, where you can insert a query into the editor or delete it.

## Integration with Security Analytics

You can also access the Log Explorer dashboard directly from the [Security Analytics dashboard](https://developers.cloudflare.com/waf/analytics/security-analytics/#logs). When doing so, the filters you applied in Security Analytics will automatically carry over to your query in Log Explorer.

## Optimize your queries

All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`.

1. Log in to the [Cloudflare dashboard ↗](https://dash.cloudflare.com/login) and select your account.
2. Go to **Log Explorer** \> **Log Search** \> **Custom SQL**.
3. Enter the following SQL query:

```sql
SELECT
  clientip,
  clientrequesthost,
  clientrequestmethod,
  clientrequesturi,
  edgeendtimestamp,
  edgeresponsestatus,
  originresponsestatus,
  edgestarttimestamp,
  rayid,
  clientcountry,
  clientrequestpath,
  date
FROM
  http_requests
WHERE
  date = '2023-10-12'
LIMIT 500
```

### Additional query optimization tips

* Narrow your query time frame. Focus on a smaller time window to reduce the volume of data processed. This helps avoid querying excessive amounts of data and speeds up response times.
* Omit `ORDER BY` and `LIMIT` clauses. These clauses can slow down queries, especially when dealing with large datasets. For queries that return a large number of records, reduce the time frame instead of limiting to the newest `N` records from a broader time frame.
* Select only necessary columns. For example, replace `SELECT *` with the list of specific columns you need. You can also use `SELECT RayId` as a first iteration and follow up with a query that filters by the Ray IDs to retrieve additional columns. Additionally, you can use `SELECT COUNT(*)` to probe for time frames with matching records without retrieving the full dataset.
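The Ray ID two-step approach above can be sketched as two separate queries. The date value and the status-code filter are illustrative, and the Ray ID shown is the example value used earlier in this page:

```sql
-- Step 1: a cheap probe that returns only Ray IDs within a narrow window.
SELECT RayID
FROM http_requests
WHERE date = '2023-10-12'
  AND edgeresponsestatus >= 500;

-- Step 2: fetch the full columns only for the Ray IDs of interest.
SELECT clientip, clientrequesturi, edgeresponsestatus, edgestarttimestamp
FROM http_requests
WHERE date = '2023-10-12'
  AND RayID IN ('806c30a3cec56817');
```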

---

---
title: SQL queries supported
description: This page outlines the SQL features supported by Log Explorer, including common aggregation functions, expressions, and query clauses.
image: https://developers.cloudflare.com/core-services-preview.png
---

# SQL queries supported

This page outlines the SQL features supported by Log Explorer, including common aggregation functions, expressions, and query clauses.

The diagram below illustrates the general shape of a valid query supported in Log Explorer. It shows how standard SQL clauses — such as `SELECT`, `WHERE`, `GROUP BY`, and `ORDER BY` — can be composed to form supported queries.

![Supported SQL grammar](https://developers.cloudflare.com/_astro/supported-sql-grammar-graph.bOILnB7v_1BnHeS.webp) 

Examples of queries include:

* `SELECT * FROM table WHERE (a = 1 OR b = "hello") AND c < 25.89`
* `SELECT a, b, c FROM table WHERE d >= "GB" LIMIT 10`

Note

* A default `LIMIT` of 10,000 is applied if the `LIMIT` clause is omitted.
* The `WHERE` clause supports up to 25 predicates, which can be grouped using parentheses.

### SQL Clauses in detail

The following SQL clauses define the structure and logic of queries in Log Explorer:

* `SELECT` - Specifies the columns that you want to retrieve from the database tables. It can include individual column names, expressions, or a wildcard character to select all columns.
* `FROM` - Specifies the tables from which to retrieve data. It indicates the source of the data for the `SELECT` statement.
* `WHERE` - Filters the rows returned by a query based on specified conditions. A row must meet these conditions to be included in the result set.
* `SELECT DISTINCT` - Removes duplicate rows from the result set.
* `GROUP BY` - Groups rows that have the same values into summary rows for aggregation.
* `ORDER BY` - Sorts the result set by one or more columns in ascending or descending order.
* `LIMIT` - Constrains the number of rows returned by a query. It is often used together with `ORDER BY` to retrieve the top `N` rows or to implement pagination.
* `OFFSET` - Skips a specified number of rows before returning results.

The sections that follow break down the remaining components shown in the diagram — such as aggregation functions, string and numeric expressions, and supported operators — in more detail.

## Functions

Log Explorer supports a range of SQL functions to transform, evaluate, or summarize data. These include scalar and aggregation functions.

### Scalar functions

These help manipulate or evaluate values (often strings):

* `ARRAY_CONTAINS(array, element)` – Checks if the array contains the element.  
Example  
`ARRAY_CONTAINS(['US', 'CA'], ClientCountry)`  
Returns rows where `ClientCountry` is either `US` or `CA`.
* `SUBSTRING(string, from_number, for_number)` – Extracts part of a string.  
Example  
`SUBSTRING(ClientRequestPath, 0, 5)`  
Extracts the first `5` characters from `ClientRequestPath`.
* `LOWER(string)` – Converts to lowercase.  
Example  
`LOWER(ClientRequestUserAgent)`  
Converts the user agent string to lowercase.
* `UPPER(string)` – Converts to uppercase.  
Example  
`UPPER(ClientCountry)`  
Converts the country code to uppercase.

### Aggregation functions

Used to perform calculations on sets of rows:

* `SUM(expression)` – Total of values.  
Example  
`SUM(ClientRequestBytes)`  
Adds up the total number of bytes requested by clients.
* `MIN(expression)` – Minimum value.  
Example  
`MIN(OriginResponseDurationMs)`  
Finds the shortest response time from origin servers.
* `MAX(expression)` – Maximum value.  
Example  
`MAX(OriginResponseDurationMs)`  
Finds the longest response time.
* `COUNT(expression)` – Number of rows (can be all rows or non-null values).  
Example  
`COUNT(ClientRequestUserAgent)`  
Counts how many rows have a user agent value.
* `COUNT(DISTINCT expression)` – Number of distinct non-null values.  
Example  
`COUNT(DISTINCT ClientIP)`  
Counts how many unique client IPs made requests.
* `AVG(expression)` – Average of numeric values.  
Example  
`AVG(OriginResponseDurationMs)`  
Computes the average origin response time in milliseconds.

The diagram below represents the grammar for SQL expressions including scalar and aggregate functions.

![Scalar and aggregate functions](https://developers.cloudflare.com/_astro/scalar-aggregate-functions.ucmFeJbw_Z172y6U.webp) 

## Expressions

Conditions or logic used in queries:

* `CASE WHEN` – Conditional logic (like if-else).
* `AS` – Alias for columns or tables.
* `LIKE` – Pattern matching.
* `IN (list)` – Checks if a value is in a list.
* `BETWEEN ... AND ...` – Checks if a value is within a range.
* `Unary operator` – Operates on one operand (for example, `-5`).
* `Binary operator` – Operates on two operands (for example, `5 + 3`).
* `Nested Expressions` – Expression wrapped with parentheses, like `( x > y )` or `( 1 )`.
* `Compound identifier` – Multi-part name (for example, `schema.table.column`).
* `Array` – A collection of values (supported differently across SQL dialects).
* `Literals` – Represent values such as strings, numbers, or arrays.

The diagram below represents the grammar for SQL expressions, detailing the various forms an expression can take, including columns, literals, functions, operators, and aliases.

![SQL expressions](https://developers.cloudflare.com/_astro/expressions.BHSBeoXm_1Sx8ac.webp) 
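A sketch combining several of these expression forms (`CASE WHEN`, `AS`, `IN`, `BETWEEN`, and `LIKE`); the date, country, and path values are placeholders:

```sql
SELECT
  clientrequestpath AS path,
  CASE
    WHEN edgeresponsestatus BETWEEN 500 AND 599 THEN 'server_error'
    ELSE 'other'
  END AS status_class
FROM http_requests
WHERE date = '2025-01-15'
  AND clientcountry IN ('us', 'gb')
  AND clientrequestpath LIKE '/api/%'
LIMIT 100
```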

The diagram below defines the grammar for unary operators, which operate on a single operand (for example, negation or logical `NOT`):

![Grammar for unary operators](https://developers.cloudflare.com/_astro/not.BmwQbTYc_Z1Idv4u.webp) 

## Binary operators

Used for arithmetic, comparison, logical operations:

* Arithmetic: `+`, `-`, `*`, `/`, `%` (modulo)
* Comparison: `>`, `<`, `>=`, `<=`, `=`, `!=` (or `<>`)
* Logical: `AND`, `OR`, `XOR`
* Bitwise: `&`, `|`, `^`, `>>`, `<<`
* String concat: `||`
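For illustration, a query using a few of these operators on the `http_requests` fields seen earlier (the date value is a placeholder):

```sql
SELECT
  clientrequesthost || clientrequestpath AS full_url,  -- string concatenation
  edgeresponsebytes / 1024 AS response_kib,            -- arithmetic
  edgeresponsestatus % 100 AS status_suffix            -- modulo
FROM http_requests
WHERE date = '2025-01-15'
  AND edgeresponsestatus >= 400 AND edgeresponsestatus != 404
LIMIT 50
```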

---

---
title: Custom dashboards
description: Custom dashboards allow you to create tailored dashboards to monitor application security, performance, and usage. You can create monitors for ongoing monitoring of a previous incident, use them to identify indicators of suspicious activity, and access templates to help you get started.
image: https://developers.cloudflare.com/core-services-preview.png
---

# Custom dashboards

Custom dashboards allow you to create tailored dashboards to monitor application security, performance, and usage. You can create monitors for ongoing monitoring of a previous incident, use them to identify indicators of suspicious activity, and access templates to help you get started.

Note

Enterprise customers can create up to 100 dashboards.

Customers on Pro and Business plans can create up to 5 dashboards.

Dashboards provide a visual interface that displays key metrics and analytics, helping you monitor and analyze data efficiently. Different dashboards serve different purposes. For example, a security dashboard tracks attack attempts and threats, a performance dashboard monitors API latency and uptime, and a usage dashboard analyzes traffic patterns and user behavior.

Different metrics serve distinct roles in providing insights into your application's performance. For example, total HTTP requests offer an overview of traffic volume, while average response time helps assess application speed. Additionally, usage metrics such as traffic patterns and user behavior provide insight into how users interact with your application. These metrics together enable you to spot trends, identify problems, and make informed, data-driven decisions.

## Create a new dashboard

To create a new dashboard, go to the **Log Explorer** \> **Dashboards** page.

[ Go to **Custom dashboards** ](https://dash.cloudflare.com/?to=/:account/log-explorer/dashboards) 

When creating a dashboard, you have two options: building one from scratch or using a pre-designed template.

* **Templates** provide a faster way to set up a dashboard with commonly used metrics and charts. They are useful for standard use cases, such as monitoring security threats, API performance, or bot traffic. Templates help you get started quickly while still allowing modifications to fit your requirements.
* A **from-scratch dashboard** gives you full control over its structure, allowing you to choose the exact datasets, metrics, and visualizations that fit your needs. This approach is ideal if you have specific monitoring goals or need a highly customized view of your data.

Choosing between these options depends on whether you need a quick setup with predefined insights or a fully customized dashboard tailored to your unique analysis needs.

### Create a dashboard from scratch

When creating a dashboard from scratch, select the option **Create new**. You can follow the instructions in the following sections to start adding charts to your dashboard.

#### Create a new chart

To create a new chart, select **Add chart**. There are two ways to create a chart:

* **Use a prompt**: Enter a query like `Compare status code ranges over time.` The AI model decides the most appropriate visualization and constructs your chart configuration.
* **Customize your chart**: Select the chart elements manually, including the chart type, title, dataset to query, metrics, and filters. This option gives you full control over your chart's structure.

Refer to the following sections for more information about the charts, datasets, fields, metrics, and filters available.

##### Chart types

The available chart types include:

* **Timeseries**: Displays trends over time, enabling comparisons across multiple series.
* **Categorical**: Compares proportions across different series.
* **Stat**: Highlights a single value, showing its delta and sparkline for quick insights.
* **Percentage**: Represents one value as a percentage of another.
* **Top N**: Identifies the highest-ranking values for a given attribute.

##### Datasets and metrics

The available metrics and filters vary based on the dataset you want to use. For example, when using the HTTP Requests dataset, you can select **origin response duration** as a metric. You can then choose your preferred aggregation method for that metric, such as total, median, or quantiles. The following table outlines the datasets, fields, and available metrics:

| Dataset         | Field                        | Definition                                                                                                          | Metrics                                                  |
| --------------- | ---------------------------- | ------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------- |
| HTTP Requests   | Requests                     | The number of requests sent by a client to a server over the HTTP protocol.                                         | Total                                                    |
| HTTP Requests   | DNS Response Time            | The time taken for a DNS query to be resolved, measured from when a request is made to when a response is received. | Total, Average, Median, 95th percentile, 99th percentile |
| HTTP Requests   | Time to First Byte           | The duration from when a request is made to when the first byte of the response is received from the server.        | Total, Average, Median, 95th percentile, 99th percentile |
| HTTP Requests   | Bytes returned to the Client | The amount of data (in bytes) sent from the server to the client in response to requests.                           | Total, Average, Median, 95th percentile, 99th percentile |
| HTTP Requests   | Number of visits             | Unique visits or sessions to a website or application.                                                              | Total                                                    |
| HTTP Requests   | Origin response duration     | The time taken by the origin server to process and respond to a request.                                            | Total, Average, Median, 95th percentile, 99th percentile |
| Security Events | Security events              | Actions taken by Application Security products such as WAF and Bot Management.                                      | Total                                                    |

##### Filters

You can also adjust the scope of your analytics by entering filter conditions. This allows you to focus on the most relevant data.

1. Select **Add filter**.
2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the IP address.
3. Select **Apply**.

### Create a dashboard from a template

Alternatively, you can choose to create your dashboard using a pre-designed dashboard template. The templates available are:

* **Bot monitoring**: Allows you to identify automated traffic accessing your website.
* **API Security**: Allows you to monitor data transfers and exceptions for API endpoints in your application.
* **Account takeover**: Allows you to monitor login attempts, usage of leaked credentials, and account takeover attacks.
* **API Performance**: Allows you to view timing data for API endpoints in your application, along with error rates.
* **Performance monitoring**: Allows you to identify slow hosts and paths on your origin server, and view time to first byte metrics over time.

Choose one of the templates and select **Use template**.

## Edit a dashboard or chart

To view your saved dashboards, select **Back to all dashboards** to access the full list. Regardless of how you created a dashboard, you can edit its existing charts and add new ones as needed.

## Further analysis

For each chart, you can:

* Review related traffic in [Security Analytics](https://developers.cloudflare.com/waf/analytics/security-analytics/).
* Explore detailed logs in [Log Search](https://developers.cloudflare.com/log-explorer/log-search/).

This ensures deeper insights into your application's security, performance, and usage patterns.

---

---
title: Example SQL queries
description: SQL queries for traffic, security, and performance analysis.
image: https://developers.cloudflare.com/core-services-preview.png
---

# Example SQL queries

The following examples show practical SQL queries you can use with the `http_requests` dataset in Log Explorer. For the full list of supported SQL syntax, refer to [SQL queries supported](https://developers.cloudflare.com/log-explorer/sql-queries/).

Adjust the date ranges in each example to match the time period you want to query.

## Summarize CDN usage

Get a high-level summary of total requests and data transfer for a specific time period. Results include total bytes transferred and conversions to megabytes and gigabytes.

```sql
SELECT
  COUNT(*) AS total_requests,
  SUM(EdgeResponseBytes) AS total_data_transfer,
  SUM(EdgeResponseBytes) / (1024.0 * 1024.0 * 1024.0) AS total_data_transfer_gb,
  SUM(EdgeResponseBytes) / (1024.0 * 1024.0) AS total_data_transfer_mb
FROM
  http_requests
WHERE {{ timeFilter }}
```

## Review distribution of security actions

Understand how security actions, such as blocks and challenges, are distributed across your traffic and identify the most common security responses applied to requests.

```sql
SELECT
  SecurityAction,
  COUNT(*) AS ActionCount
FROM http_requests
WHERE SecurityAction != 'unknown'
  AND SecurityAction IS NOT NULL
GROUP BY SecurityAction
ORDER BY ActionCount DESC
```

## Find IPs that triggered challenges

Identify the top client IP addresses and request URIs that triggered managed, JavaScript, or interactive challenges to investigate potential bot activity or targeted attacks.

```sql
SELECT
  ClientIP,
  ClientRequestURI,
  SecurityActions,
  COUNT(*) AS Count
FROM http_requests
WHERE {{ timeFilter }}
  AND (
    ARRAY_CONTAINS(SecurityActions, 'challenge')
    OR ARRAY_CONTAINS(SecurityActions, 'managedChallenge')
    OR ARRAY_CONTAINS(SecurityActions, 'jsChallenge')
    OR ARRAY_CONTAINS(SecurityActions, 'challengeSolved')
  )
GROUP BY
  ClientIP,
  ClientRequestURI,
  SecurityActions
ORDER BY Count DESC
LIMIT 20
```

## Find highest bandwidth consumers by URI

Identify which request URIs consume the most bandwidth to pinpoint large assets or endpoints that drive the most data transfer.

```sql
SELECT
  ClientRequestURI,
  SUM(EdgeResponseBytes) / (1024 * 1024) AS MegabytesTransferred
FROM http_requests
WHERE {{ timeFilter }}
GROUP BY ClientRequestURI
ORDER BY MegabytesTransferred DESC
LIMIT 10
```

## Analyze client round-trip time by country

Analyze client TCP round-trip time (RTT) across different countries to identify regions with high latency that might benefit from additional optimization.

```sql
SELECT
  ClientCountry,
  COUNT(*) AS requests,
  AVG(ClientTCPRttMs) AS avg_rtt,
  MIN(ClientTCPRttMs) AS min_rtt,
  MAX(ClientTCPRttMs) AS max_rtt
FROM http_requests
WHERE {{ timeFilter }}
GROUP BY ClientCountry
ORDER BY avg_rtt DESC
LIMIT 20
```

## Summarize CDN traffic by cache status

Break down traffic by cache status and measure the average time to first byte (TTFB) for each status to evaluate cache effectiveness and identify opportunities to improve cache hit ratios.

```sql
SELECT
  CacheCacheStatus,
  COUNT(*) AS requests,
  SUM(EdgeResponseBytes) AS total_bytes,
  AVG(EdgeTimeToFirstByteMs) AS avg_ttfb
FROM http_requests
WHERE {{ timeFilter }}
GROUP BY CacheCacheStatus
ORDER BY requests DESC
```

## Find slowest paths by time to first byte

Find request paths with the highest average time to first byte (TTFB), along with request counts and server error counts, to identify slow endpoints that may need optimization.

```
SELECT
  ClientRequestPath,
  AVG(EdgeTimeToFirstByteMs) AS avg_ttfb,
  COUNT(*) AS requests,
  SUM(CASE WHEN EdgeResponseStatus >= 500 THEN 1 ELSE 0 END) AS errors
FROM http_requests
WHERE {{ timeFilter }}
GROUP BY ClientRequestPath
ORDER BY avg_ttfb DESC
LIMIT 10
```
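## Break down traffic by response status

Count requests by HTTP response status to spot error spikes at a glance. This is an illustrative sketch in the same style as the queries above, reusing the `EdgeResponseStatus` field and the `{{ timeFilter }}` placeholder already shown.

```
SELECT
  EdgeResponseStatus,
  COUNT(*) AS requests
FROM http_requests
WHERE {{ timeFilter }}
GROUP BY EdgeResponseStatus
ORDER BY requests DESC
LIMIT 10
```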


---

---
title: Manage datasets
description: Log Explorer allows you to enable or disable which datasets are available to query in Log Search.
image: https://developers.cloudflare.com/core-services-preview.png
---


# Manage datasets

Log Explorer allows you to enable or disable which datasets are available to query in Log Search.

Note

Canceling a Log Explorer subscription stops renewal, but it does not automatically stop log ingestion during the current billing cycle. To completely turn off Log Explorer, refer to [How do I turn off Log Explorer?](https://developers.cloudflare.com/log-explorer/faq/#how-do-i-turn-off-log-explorer).

## Supported datasets

Log Explorer currently supports the following datasets:

### Zone level

* [HTTP Requests](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/http%5Frequests/) (`http_requests`)
* [Firewall Events](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/firewall%5Fevents/) (`firewall_events`)
* [DNS Logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/dns%5Flogs/) (`dns_logs`)
* [NEL Reports](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/nel%5Freports/) (`nel_reports`)
* [Page Shield Events](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/page%5Fshield%5Fevents/) (`page_shield_events`) (events for client-side security)
* [Spectrum Events](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/spectrum%5Fevents/) (`spectrum_events`)
* [Zaraz Events](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/zaraz%5Fevents/) (`zaraz_events`)

### Account level

* [Access requests](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/access%5Frequests/) (`access_requests`)
* [CASB findings](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/casb%5Ffindings/) (`casb_findings`)
* [Device posture results](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/device%5Fposture%5Fresults/) (`device_posture_results`)
* [Gateway DNS](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/gateway%5Fdns/) (`gateway_dns`)
* [Gateway HTTP](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/gateway%5Fhttp/) (`gateway_http`)
* [Gateway Network](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/gateway%5Fnetwork/) (`gateway_network`)
* [Zero Trust Network Session Logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/zero%5Ftrust%5Fnetwork%5Fsessions/) (`zero_trust_network_sessions`)
* [Audit Logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/audit%5Flogs/) (`audit_logs`)
* [Audit\_logs\_v2](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/audit%5Flogs%5Fv2/) (`audit_logs_v2`)
* [Browser Isolation User Actions](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/biso%5Fuser%5Factions/) (`biso_user_actions`)
* [DNS firewall logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/dns%5Ffirewall%5Flogs/) (`dns_firewall_logs`)
* [Email security alerts](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/email%5Fsecurity%5Falerts/) (`email_security_alerts`)
* [Magic IDS Detections](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/magic%5Fids%5Fdetections/) (`magic_ids_detections`)
* [Network Analytics](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/network%5Fanalytics%5Flogs/) (`network_analytics_logs`)
* [Sinkhole HTTP Logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/sinkhole%5Fhttp%5Flogs/) (`sinkhole_http_logs`)
* [IP Sec Logs](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/account/ipsec%5Flogs/) (`ipsec_logs`)

## Enable Log Explorer

In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API.

1. In the Cloudflare dashboard, go to the **Log Explorer** \> **Manage datasets** page.  
[ Go to **Manage datasets** ](https://dash.cloudflare.com/?to=/:account/log-explorer/manage-sources)
2. Select **Add dataset** to select the datasets you want to query.
3. Choose a dataset and then a zone. Then, select **Add**. You can always return to this page to enable more datasets or manage your existing ones.

Note

It may take a few minutes for the logs to become available for querying.

If you are using the API, use the Log Explorer API to enable each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs.

The following `curl` command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.

Terminal window

```
curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets \
--header "Authorization: Bearer <API_TOKEN>" \
--json '{
  "dataset": "http_requests"
}'
```

```json
{
  "result": {
    "dataset": "http_requests",
    "object_type": "zone",
    "object_id": "<ZONE ID>",
    "created_at": "2025-06-03T14:33:16Z",
    "updated_at": "2025-06-03T14:33:16Z",
    "dataset_id": "01973635f7e273a1964a02f4d4502499",
    "enabled": true
  },
  "success": true,
  "errors": [],
  "messages": []
}
```

To enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the `curl` command. For example:

Terminal window

```
curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/datasets \
--header "Authorization: Bearer <API_TOKEN>" \
--json '{
  "dataset": "access_requests"
}'
```


---

---
title: Log Explorer API
description: Configuration and Log searches are also available via a public API.
image: https://developers.cloudflare.com/core-services-preview.png
---


# Log Explorer API

Configuration and log searches are also available via a public API.

## Authentication

Authentication with the API can be done via an API token or API key with an email. Refer to [Create API token](https://developers.cloudflare.com/fundamentals/api/get-started/create-token/) for further instructions.

For detailed information on permissions required for each Log Explorer feature, refer to the [Permissions](https://developers.cloudflare.com/log-explorer/#permissions) section.

## Query data

Log Explorer includes a SQL API for submitting queries.

For example, to find an HTTP request with a specific [Ray ID](https://developers.cloudflare.com/fundamentals/reference/cloudflare-ray-id/), use the following SQL query:

Terminal window

```
curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \
--header "Authorization: Bearer <API_TOKEN>" \
--url-query query="SELECT clientRequestScheme, clientRequestHost, clientRequestMethod, edgeResponseStatus, clientRequestUserAgent FROM http_requests WHERE RayID = '806c30a3cec56817' LIMIT 1"
```

This command returns the following HTTP request details:

```json
{
  "result": [
    {
      "clientrequestscheme": "https",
      "clientrequesthost": "example.com",
      "clientrequestmethod": "GET",
      "clientrequestuseragent": "curl/7.88.1",
      "edgeresponsestatus": 200
    }
  ],
  "success": true,
  "errors": [],
  "messages": []
}
```

As another example, you could find Cloudflare Access requests with selected columns from a specific timeframe by performing the following SQL query:

Terminal window

```
curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/query/sql \
--header "Authorization: Bearer <API_TOKEN>" \
--url-query query="SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID FROM access_requests WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'"
```

This command returns the following request details:

```json
{
  "result": [
    {
      "createdat": "2025-01-14T18:17:55Z",
      "appdomain": "example.com",
      "appuuid": "a66b4ab0-ccdf-4d60-a6d0-54a59a827d92",
      "action": "login",
      "allowed": true,
      "country": "us",
      "rayid": "90fbb07c0b316957",
      "email": "user@example.com",
      "ipaddress": "1.2.3.4",
      "useruid": "52859e81-711e-4de0-8b31-283336060e79"
    }
  ],
  "success": true,
  "errors": [],
  "messages": []
}
```


---

---
title: Pricing and managing usage
description: Log Explorer billing is based on the volume of logs ingested and stored, measured in gigabytes (GB). Your charges scale with the amount of log data you choose to retain in Log Explorer.
image: https://developers.cloudflare.com/core-services-preview.png
---


# Pricing and managing usage

Log Explorer billing is based on the volume of logs ingested and stored, measured in gigabytes (GB). Your charges scale with the amount of log data you choose to retain in Log Explorer.

Unlike query-based billing models, charges are not based on how often you search or scan your data. Once logs are ingested and stored, you can query them without additional cost.

## Availability

Log Explorer is available as a paid add-on for any Application Services or Zero Trust purchase. There is no free version or trial available at this time.

## Billable usage

Log Explorer billing is strictly consumption-based, calculated by the GBs ingested and stored.

### Attack traffic

Because Log Explorer is a forensics product, attack traffic is considered valuable data for analysis and is included in your billable usage.

Note

Logs generated from Layer 7 (L7) DDoS attack traffic are not ingested by default and do not count toward your Log Explorer usage.

## Estimate usage

To estimate your Log Explorer usage, review your request volumes in **Analytics** for specific Cloudflare log datasets.

### Record size by dataset

The following table provides average and maximum record sizes for each dataset to help you estimate potential storage needs:

| Dataset                        | Average Record Size | Maximum Record Size |
| ------------------------------ | ------------------- | ------------------- |
| audit\_logs                    | 2.69 kB             | 172 kB              |
| email\_security\_alerts        | 6.74 kB             | 74.9 kB             |
| firewall\_events               | 1.36 kB             | 47.2 kB             |
| audit\_logs\_v2                | 1.73 kB             | 28.5 kB             |
| zaraz\_events                  | 7.30 kB             | 11.7 kB             |
| http\_requests                 | 1.56 kB             | 9.76 kB             |
| gateway\_dns                   | 1.44 kB             | 6.23 kB             |
| dex\_application\_tests        | 3.29 kB             | 5.67 kB             |
| casb\_findings                 | 2.67 kB             | 3.80 kB             |
| gateway\_http                  | 1.47 kB             | 2.60 kB             |
| dex\_device\_state\_events     | 1.98 kB             | 2.57 kB             |
| page\_shield\_events           | 443 B               | 2.02 kB             |
| network\_analytics\_logs       | 1.31 kB             | 1.87 kB             |
| zero\_trust\_network\_sessions | 1.21 kB             | 1.52 kB             |
| gateway\_network               | 877 B               | 1.17 kB             |
| device\_posture\_results       | 730 B               | 944 B               |
| spectrum\_events               | 685 B               | 925 B               |
| sinkhole\_http\_logs           | 705 B               | 705 B               |
| access\_requests               | 446 B               | 541 B               |
| dns\_firewall\_logs            | 387 B               | 469 B               |
| dns\_logs                      | 199 B               | 409 B               |
| magic\_ids\_detections         | 334 B               | 349 B               |
| warp\_toggle\_changes          | 327 B               | 335 B               |
| ipsec\_logs                    | 207 B               | 260 B               |
| nel\_reports                   | 204 B               | 224 B               |
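For a back-of-envelope estimate, multiply your expected record count by the average record size for the dataset. The sketch below uses a hypothetical volume of 100 million `http_requests` records per month together with the ~1.56 kB average size from the table above:

```shell
# Hypothetical example: 100 million http_requests records per month
# at the ~1.56 kB (1560-byte) average record size from the table above.
requests_per_month=100000000
avg_record_bytes=1560
# Integer arithmetic: bytes -> GB (decimal)
monthly_gb=$(( requests_per_month * avg_record_bytes / 1000 / 1000 / 1000 ))
echo "Estimated ingestion: ~${monthly_gb} GB/month"  # prints ~156 GB/month
```

Substitute your own request volumes from **Analytics** to size each dataset you plan to enable.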

## Monitor usage

Cloudflare provides three primary ways to track your consumption and maintain financial oversight:

* **In-product quick indicator**: View your current month's usage directly within the Log Explorer interface at the top of the **Log Search** and **Manage Datasets** sections.
* **Account-level billing**: Access a detailed view of current and previous months' cumulative usage under **Manage Account** \> **Billing**.
* **Usage alerts**: Set up automated notifications to trigger when billable usage exceeds a defined threshold.

### Configure a usage alert

1. Log in to the [Cloudflare dashboard ↗](https://dash.cloudflare.com/) and select **Manage account**.
2. Go to **Notifications** \> **Add**.
3. Select **Usage-based Billing**.
4. Define your threshold and the notification destination (email, PagerDuty, or webhooks).

## Deactivate Log Explorer

To stop using Log Explorer and end associated charges, you must complete both of the following steps:

### 1\. Stop log ingestion

Disabling datasets stops additional ingestion charges immediately.

1. Go to the [Manage datasets ↗](https://dash.cloudflare.com/?to=/:account/log-explorer/manage-sources) page at the account level.
2. Use the toggle to turn off each dataset you no longer need.
3. Select **Stop ingesting logs** to confirm.

### 2\. Cancel the subscription

This prevents the subscription from renewing at the next billing cycle.

1. Go to the [Billing ↗](https://dash.cloudflare.com/?to=/:account/billing) page.
2. In the **Subscriptions** tab, find the **Log Explorer** subscription and select **Cancel**.


---

---
title: FAQ
description: Find answers to common questions about Log Explorer.
image: https://developers.cloudflare.com/core-services-preview.png
---


# FAQ

## Which fields (or columns) are available for querying?

All fields listed in [Datasets](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/) for the [supported datasets](https://developers.cloudflare.com/log-explorer/manage-datasets/#supported-datasets) are viewable in Log Explorer.

## Why does my query not complete or time out?

Log Explorer performs best when query parameters focus on narrower ranges of time. You may experience query timeouts when your query would return a large quantity of data. Consider refining your query to improve performance.
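As an illustrative sketch of a narrow query (the `Date` partition column here is an assumption, mirroring the Access example in the API docs; `Timestamp` is a core field), bound the date, constrain the timestamp to a short window, and keep an explicit limit:

```
SELECT RayID, ClientIP, EdgeResponseStatus
FROM http_requests
WHERE Date = '2025-02-06'
  AND Timestamp >= '2025-02-06T12:00:00Z'
  AND Timestamp < '2025-02-06T12:15:00Z'
LIMIT 100
```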

## Why do I not see any logs in my queries after enabling the dataset?

Log Explorer starts ingesting logs from the moment you enable the dataset. It will not display logs for events that occurred before the dataset was enabled. Make sure that new events have been generated since enabling the dataset, and check again.

## My query returned an error. How do I figure out what went wrong?

We are actively working on improving error codes. If you receive a generic error, check your SQL syntax (if you are using the custom SQL feature), and make sure you have included a date and a limit. If the query still fails, it is likely timing out. Try refining your filters.

## Where is the data stored?

The data is stored in Cloudflare R2\. Each Log Explorer dataset is stored on a per-customer level, similar to Cloudflare D1, ensuring that your data is kept separate from that of other customers. In the future, this single-tenant storage model will provide you with the flexibility to create your own retention policies and decide in which regions you want to store your data.

## Does Log Explorer support Customer Metadata Boundary?

Customer Metadata Boundary is currently not supported for Log Explorer.

## Are there any constraints on the log volume that Log Explorer can support?

We are continually scaling the Log Explorer data platform. At present, Log Explorer supports log ingestion rates of up to 50,000 records per second. If your needs exceed this, contact your account team.

## How is Log Explorer different from Logpush? Do I need both?

Log Explorer allows you to search and analyze your Cloudflare logs directly in the dashboard or via API. [Logpush](https://developers.cloudflare.com/logs/logpush/), on the other hand, delivers raw logs to third-party SIEMs or storage systems. You generally do not need both, but some customers choose to use Log Explorer for quick investigation and Logpush for long-term storage or integration with other tools.

## Is there a free version or trial of Log Explorer?

Log Explorer is available as a paid add-on for any Application Services or Zero Trust purchase. There is no free version at this time.

## How is Log Explorer billed?

Log Explorer billing is based on the volume of logs ingested and stored, measured in gigabytes (GB). Your charges scale with the amount of log data you choose to retain in Log Explorer. Unlike query-based billing models (for example, BigQuery), charges are not based on how often you search or scan your data. Once logs are ingested and stored, you can query them without additional cost.

## Are logs from attack traffic included in my Log Explorer usage?

Yes. In general, Log Explorer bills based on the total volume of logs ingested and stored, including attack traffic. Since these logs are often critical for investigating security incidents, they are treated the same as all other log data.

However, logs generated from Layer 7 (L7) DDoS attack traffic are not ingested by default and therefore do not count toward your Log Explorer usage.

## How does Log Explorer store data in R2, and why do I not see it in my own R2 bucket?

Log Explorer uses Cloudflare Logpush and R2 behind the scenes to stream and store logs. For technical and performance reasons, the data is stored in internal, customer-specific R2 buckets managed by Cloudflare. These buckets are single-tenant to keep your data isolated, but they are not visible in your account's R2 interface. You are not billed separately for this storage — it is included in your Log Explorer usage.

## Are Custom Dashboards based on R2 Log Explorer data, or on GraphQL?

Custom Dashboards currently run on [GraphQL](https://developers.cloudflare.com/analytics/graphql-api/sampling/). Over time, this will evolve to include deeper integration between the two features, such as building charts directly from logs.

## How can I track my Log Explorer usage?

Your monthly usage is displayed at the top of the Log Search and Manage Datasets dashboard sections within Log Explorer.

![Usage display in the dashboard](https://developers.cloudflare.com/_astro/log-explorer-usage.CTcGXtWV_Z1AOepV.webp) 

## How do I turn off Log Explorer?

To turn off Log Explorer you must:

1. **Stop log ingestion to immediately stop incurring additional charges.** To stop log ingestion, disable any enabled datasets at both the account level and zone level.
2. **Cancel the Log Explorer subscription to stop renewal.** Your subscription may remain active until the end of the current billing cycle.

### 1\. Stop log ingestion

After performing the following steps, you will immediately stop incurring additional charges for Log Explorer.

#### Review and disable account-level datasets

1. In the Cloudflare dashboard, go to the account-level **Manage datasets** page.  
[ Go to **Manage datasets** ](https://dash.cloudflare.com/?to=/:account/log-explorer/manage-sources)
2. Turn off each dataset you no longer need using the toggle. To confirm each operation, select **Stop ingesting logs**.

#### Review and disable zone-level datasets

1. In the Cloudflare dashboard, go to the zone-level **Manage datasets** page.  
[ Go to **Manage datasets** ](https://dash.cloudflare.com/?to=/:account/:zone/log-explorer/manage-sources)
2. Turn off each dataset you no longer need using the toggle. To confirm each operation, select **Stop ingesting logs**.
3. Repeat for all relevant zones.

### 2\. Cancel the Log Explorer subscription

This operation will stop Log Explorer's renewal.

1. In the Cloudflare dashboard, go to the **Billing** page.  
[ Go to **Billing** ](https://dash.cloudflare.com/?to=/:account/billing)
2. In the **Subscriptions** tab, find the Log Explorer subscription and select **Cancel**.


---

---
title: Changelog
description: Cloudflare Log Explorer now allows you to customize exactly which data fields are ingested and stored when enabling or managing log datasets.
image: https://developers.cloudflare.com/core-services-preview.png
---


# Changelog

[ Subscribe to RSS ](https://developers.cloudflare.com/changelog/rss/log-explorer.xml) 

## 2026-03-11

  
**Ingest field selection for Log Explorer**   

Cloudflare Log Explorer now allows you to customize exactly which data fields are ingested and stored when enabling or managing log datasets.

Previously, ingesting logs often meant taking an "all or nothing" approach to data fields. With **Ingest Field Selection**, you can now choose from a list of available and recommended fields for each dataset. This allows you to reduce noise, focus on the metrics that matter most to your security and performance analysis, and manage your data footprint more effectively.

#### Key capabilities

* **Granular control:** Select only the specific fields you need when enabling a new dataset.
* **Dynamic updates:** Update fields for existing, already enabled log streams at any time.
* **Historical consistency:** Even if you disable a field later, you can still query and receive results for that field for the period it was captured.
* **Data integrity:** Core fields, such as `Timestamp`, are automatically retained to ensure your logs remain searchable and chronologically accurate.

#### Example configuration

When configuring a dataset via the dashboard or API, you can define a specific set of fields. The `Timestamp` field remains mandatory to ensure data indexability.

```json
{
  "dataset": "firewall_events",
  "enabled": true,
  "fields": [
    "Timestamp",
    "ClientRequestHost",
    "ClientIP",
    "Action",
    "EdgeResponseStatus",
    "OriginResponseStatus"
  ]
}
```

For more information, refer to the [Log Explorer documentation](https://developers.cloudflare.com/log-explorer/).

## 2026-02-09

  
**Tabs and pivots**   

Log Explorer now supports multiple concurrent queries with the new Tabs feature. Work with multiple queries simultaneously and pivot between datasets to investigate malicious activity more effectively.

#### Key capabilities

* **Multiple tabs:** Open and switch between multiple query tabs to compare results across different datasets.
* **Quick filtering:** Select the filter button from query results to add a value as a filter to your current query.
* **Pivot to new tab:** Use Cmd + click on the filter button to start a new query tab with that filter applied.
* **Preserved progress:** Your query progress is preserved on each tab if you navigate away and return.

For more information, refer to the [Log Explorer documentation](https://developers.cloudflare.com/log-explorer/).

## 2025-11-13

  
**Fixed custom SQL date picker inconsistencies**   

We've resolved a bug in Log Explorer that caused inconsistencies between the custom SQL date field filters and the date picker dropdown. Previously, users attempting to filter logs based on a custom date field via a SQL query sometimes encountered unexpected results or mismatching dates when using the interactive date picker.

This fix ensures that the custom SQL date field filters now align correctly with the selection made in the date picker dropdown, providing a reliable and predictable filtering experience for your log data. This is particularly important for users creating custom log views based on time-sensitive fields.

## 2025-11-13

  
**Log Explorer adds 14 new datasets**   

We've significantly enhanced Log Explorer by adding support for 14 additional Cloudflare product datasets.

This expansion enables Operations and Security Engineers to gain deeper visibility and telemetry across a wider range of Cloudflare services. By integrating these new datasets, users can now access full context to efficiently investigate security incidents, troubleshoot application performance issues, and correlate logged events across different layers (like application and network) within a single interface. This capability is crucial for a complete and cohesive understanding of event flows across your Cloudflare environment.

The newly supported datasets include:

#### Zone Level

* `dns_logs`
* `nel_reports`
* `page_shield_events`
* `spectrum_events`
* `zaraz_events`

#### Account Level

* `audit_logs`
* `audit_logs_v2`
* `biso_user_actions`
* `dns_firewall_logs`
* `email_security_alerts`
* `magic_ids_detections`
* `network_analytics_logs`
* `sinkhole_http_logs`
* `ipsec_logs`

Note

The `audit_logs` and `audit_logs_v2` datasets require the `audit-log.read` permission for querying.

The `biso_user_actions` dataset requires either the `Super Admin` or `ZT PII` role for querying.

#### Example: Correlating logs

You can now use Log Explorer to query and filter with each of these datasets. For example, you can identify an IP address exhibiting suspicious behavior in the `firewall_events` logs, and then instantly pivot to the `network_analytics_logs` or `access_requests` logs to see its network-level traffic profile or whether it bypassed a corporate policy.

To learn more and get started, refer to the [Log Explorer documentation](https://developers.cloudflare.com/log-explorer/) and the [Cloudflare Logs documentation](https://developers.cloudflare.com/logs/).

## 2025-11-11

  
**Resize your custom SQL window in Log Explorer**   

We're excited to announce a quality-of-life improvement for Log Explorer users. You can now resize the custom SQL query window to accommodate longer and more complex queries.

Previously, if you were writing a long custom SQL query, the fixed-size window required excessive scrolling to view the full query. This update allows you to easily drag the bottom edge of the query window to make it taller. This means you can view your entire custom query at once, improving the efficiency and experience of writing and debugging complex queries.

To learn more and get started, refer to the [Log Explorer documentation](https://developers.cloudflare.com/log-explorer/).

## 2025-11-04

  
**Log Explorer now supports query cancellation**   

We're excited to announce that Log Explorer users can now cancel queries that are currently running.

This new feature addresses a common pain point: waiting for a long, unintended, or misconfigured query to complete before you can submit a new, correct one. With query cancellation, you can immediately stop the execution of any undesirable query, allowing you to quickly craft and submit a new query, significantly improving your investigative workflow and productivity within Log Explorer.

## 2025-11-04

  
**Log Explorer now shows query result distribution**   

We're excited to announce a new feature in Log Explorer that significantly enhances how you analyze query results: the Query results distribution chart.

This new chart provides a graphical distribution of your results over the time window of the query. Immediately after running a query, you will see the distribution chart above your result table. This visualization allows Log Explorer users to quickly spot trends, identify anomalies, and understand the temporal concentration of log events that match their criteria. For example, you can visually confirm if a spike in traffic or errors occurred at a specific time, allowing you to focus your investigation efforts more effectively. This feature makes it faster and easier to extract meaningful insights from your vast log data.

The chart will dynamically update to reflect the logs matching your current query.

## 2025-09-11

  
**Contextual pivots**   

Directly from [Log Search](https://developers.cloudflare.com/log-explorer/log-search/) results, customers can pivot to other parts of the Cloudflare dashboard to immediately take action as a result of their investigation.

From the `http_requests` or `firewall_events` dataset results, right-click an IP address or JA3 fingerprint to pivot to the Investigate portal and look up its reputation.

![Investigate IP address](https://developers.cloudflare.com/_astro/investigate-ip-address.BMVSMzDi_Z1KASOQ.webp) 

Easily learn about error codes by linking directly to our documentation from the **EdgeResponseStatus** or **OriginResponseStatus** fields.

![View documentation](https://developers.cloudflare.com/_astro/view-documentation.Cem5QgeO_Z1vzjwR.webp) 

From the `gateway_http` dataset, click on a **policyid** to link directly to the Zero Trust dashboard to review or make changes to a specific Gateway policy.

![View policy](https://developers.cloudflare.com/_astro/policyid.CVjEdahj_1GFFHp.webp) 

## 2025-09-11

**New results table view**

The results table view of **Log Search** has been updated with additional functionality and a more streamlined user experience. Users can now easily:

* Add or remove columns.
* Resize columns.
* Sort columns.
* Copy values from any field.

![New results table view](https://developers.cloudflare.com/_astro/new-table.C2Q8mWJ9_ZFs2Aq.webp) 

## 2025-09-03

**Logging headers and cookies using custom fields**

[Log Explorer](https://developers.cloudflare.com/log-explorer/) now supports logging and filtering on header or cookie fields in the [http\_requests dataset](https://developers.cloudflare.com/logs/logpush/logpush-job/datasets/zone/http%5Frequests/).

Create a custom field to log desired header or cookie values into the `http_requests` dataset and Log Explorer will import these as searchable fields. Once configured, use the custom SQL editor in Log Explorer to view or filter on these requests.

![Edit Custom fields](https://developers.cloudflare.com/_astro/edit-custom-fields.Cy4qXSpL_1ma19s.webp) 

For more details, refer to [Headers and cookies](https://developers.cloudflare.com/log-explorer/log-search/#headers-and-cookies).
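As an illustration, once a custom field is configured, a query in the custom SQL editor might look like the following. This is a sketch, not an exact recipe: the field names are taken from the Logpush `http_requests` schema, and the filter value is hypothetical; match the names and casing shown in the Log Explorer field picker for your dataset.

```sql
-- Illustrative sketch: assumes a custom field has been configured to
-- log request headers into the http_requests dataset. Field names and
-- casing should match the Log Explorer field picker for your account.
SELECT
  EdgeStartTimestamp,
  ClientIP,
  RequestHeaders
FROM http_requests
WHERE RequestHeaders LIKE '%curl%'  -- hypothetical header value filter
LIMIT 50
```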

## 2025-08-15

**Extended retention**

Customers can now rely on Log Explorer to meet their log retention compliance requirements.

Contract customers can choose to store their logs in Log Explorer for up to two years, at an additional cost of $0.10 per GB per month. Customers interested in this feature can contact their account team to have it added to their contract.

## 2025-07-09

**Usage tracking**

[Log Explorer](https://developers.cloudflare.com/log-explorer/) customers can now monitor their data ingestion volume to keep track of their billing. Monthly usage is displayed at the top of the [Log Search](https://developers.cloudflare.com/log-explorer/log-search/) and [Manage Datasets](https://developers.cloudflare.com/log-explorer/manage-datasets/) screens in Log Explorer.

![Ingested data](https://developers.cloudflare.com/_astro/ingested-data.D2flqRIu_Z2v4FHF.webp) 

## 2025-06-18

**Log Explorer is GA**

[Log Explorer](https://developers.cloudflare.com/log-explorer/) is now GA, providing native observability and forensics for traffic flowing through Cloudflare.

Search and analyze your logs natively in the Cloudflare dashboard. These logs are also stored on Cloudflare's network, eliminating many of the costs associated with other log providers.

![Log Explorer dashboard](https://developers.cloudflare.com/_astro/log-explorer-dash.CJSVLZ7Y_ZXS1TD.webp) 

With Log Explorer, you can now:

* **Monitor security and performance issues with custom dashboards** – use natural language to define charts for measuring response time, error rates, top statistics, and more.
* **Investigate and troubleshoot issues with Log Search** – use data type-aware search filters or custom SQL to investigate detailed logs.
* **Save time and collaborate with saved queries** – save Log Search queries for repeated use or sharing with other users in your account.
* **Access Log Explorer at the account and zone level** – easily find Log Explorer at the account and zone level for querying any dataset.
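As a sketch of what a custom SQL query might look like against the `http_requests` dataset (field names here follow the Logpush schema and are illustrative; adjust them to match the field list shown in Log Explorer):

```sql
-- Sketch: surface hosts with the most 5xx responses in the
-- queried time window. Field names are illustrative; confirm
-- them against the dataset schema in Log Explorer.
SELECT
  ClientRequestHost,
  COUNT(*) AS error_count
FROM http_requests
WHERE EdgeResponseStatus >= 500
GROUP BY ClientRequestHost
ORDER BY error_count DESC
LIMIT 10
```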

For help getting started, refer to [our documentation](https://developers.cloudflare.com/log-explorer/).

