# Indexing Dashboard

The **Indexing Dashboard** gives you a complete view of how Google indexes your website. It uses the Google URL Inspection API to check each page individually, tracks status changes over time, and shows you exactly which pages are indexed, which aren't, and why.

The dashboard has **three tabs**:

* **Overview** — High-level indexing status, trends, and recent movements
* **Visual Diagnostics** — Charts that expose structural site issues (velocity, directory health, crawl budget, internal links, funnel)
* **Log Analyzer** — A combined data table merging GSC data, internal links, and server log hits with actionable insights

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Indexing Dashboard with three tabs: Overview, Visual Diagnostics, and Log Analyzer</p></figcaption></figure>

***

## Getting Started

To use the Indexing Dashboard, you need a Google Search Console property connected to SEO Utils with at least one sitemap added.

{% stepper %}
{% step %}
**Connect a GSC Property**

If you haven't already, connect your Google Search Console property using either a **Google OAuth Token** or a **Google Service Account**. See the [Google Search Console setup guide](https://help.seoutils.app/guide/google-search-console) for instructions.
{% endstep %}

{% step %}
**Add a Sitemap**

Navigate to **Sitemaps** from the property dropdown and add your sitemap URL (e.g., `https://yoursite.com/sitemap.xml`). SEO Utils will automatically fetch URLs from your sitemap daily.
{% endstep %}

{% step %}
**Open the Indexing Dashboard**

Select your property from the dropdown, then click **Indexing** in the navigation menu. The dashboard will begin collecting data automatically.
{% endstep %}
{% endstepper %}

{% hint style="warning" %}
On first visit, the dashboard shows "Based on 0/N URLs inspected" because no URLs have been inspected yet. Click **Run Initial Scan** in the top-right to start the first inspection batch, or wait for the automatic inspection to run (every 4 hours).
{% endhint %}

***

## Shared Controls

These controls appear at the top of the dashboard and apply across all three tabs:

### Date Filter

The date dropdown in the top-right controls the time range for the chart, the table's impressions/clicks columns, movements, the funnel, and velocity data.

Available presets: **7 days, 14 days, 30 days, 2 months, 3 months, 6 months, 12 months**. The default is 30 days.

### Sitemap Errors Banner

If any sitemap has a fetch error (e.g., HTTP 404, timeout), a dismissible banner appears at the top showing the errors. This is visible on all tabs.

### Settings

Click the **cog icon** in the top-right to open the Indexing Settings modal. See the [Settings](#settings) section below for details.

***

## Tab 1: Overview

The Overview tab shows your high-level indexing status, historical trends, and recent status changes.

### Status Tabs

The three tabs at the top show your indexing breakdown:

| Tab             | Description                                                |
| --------------- | ---------------------------------------------------------- |
| **All**         | Total number of URLs discovered from your sitemap          |
| **Indexed**     | URLs with "Submitted and indexed" status (with percentage) |
| **Not indexed** | URLs that are not indexed, with percentage                 |

Switching tabs filters both the **chart** and the **pages table** below.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Status tabs showing All, Indexed, and Not indexed counts with percentages</p></figcaption></figure>

#### Sub-Status Chips

When you select the **Not indexed** tab, clickable chips appear showing the breakdown by reason:

* Crawled - currently not indexed
* Discovered - currently not indexed
* Duplicate, Google chose different canonical
* Not found (404)
* Server error (5xx)
* URL is unknown to Google
* And more...

Click any chip to filter the chart and table to that specific status. You can select multiple chips at once. Chips automatically reset when you switch to another tab.

### Percent Toggle

The **% toggle switch** (next to the tabs, aligned right) switches between absolute counts and percentages:

* **Off** — Tabs show counts (e.g., "Indexed 49"), chart y-axis shows numbers
* **On** — Tabs show percentages (e.g., "Indexed 49%"), chart y-axis and tooltips show %

### Indexing Overview Chart

The stacked bar chart inside the **Indexing Overview** card shows your indexing trend over time:

* **All tab** — Green (indexed) + orange (not indexed) stacked bars
* **Indexed tab** — Green bars only
* **Not indexed tab** — Multi-color bars by sub-status

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Stacked bar chart showing indexed vs not-indexed trend over 30 days</p></figcaption></figure>

Hover over any bar to see the exact counts (or percentages) in a tooltip.

### Pages Table

The pages table shows detailed indexing information for every URL in your sitemap.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Pages table with URL, clicks, impressions, status, last crawl, and inspection columns</p></figcaption></figure>

#### Columns

| Column              | Description                                                     |
| ------------------- | --------------------------------------------------------------- |
| **URL**             | The page path (click to open in browser)                        |
| **Clicks**          | Search clicks within the selected date period                   |
| **Impressions**     | Search impressions within the selected date period              |
| **Status**          | Coverage state with colored badge (e.g., Submitted and indexed) |
| **Last Crawl**      | When Google last crawled the page (relative + absolute date)    |
| **Rich Results**    | Shows "FAIL with N errors" if structured data issues are found  |
| **Last Inspection** | When the URL was last inspected, with next scheduled date       |

#### Filters

The table supports two levels of filtering:

**Basic filters** (shown inline in the toolbar):

* **Page** — Filter by URL pattern (contains, does not contain, regex, exactly matches)
* **Status** — Select one or more coverage states
* **Content Group** — Filter by content cluster (if configured)
* **Pages at risk of de-indexing** — Toggle to show indexed pages that Google hasn't crawled in over 90 days

**Advanced filters** (click "Advanced" to expand):

* **Clicks** — Range filter (from/to)
* **Impressions** — Range filter (from/to)
* **Last Crawl** — Date range picker

#### Row Actions

Click the **...** menu on any row for these actions:

| Action               | Description                                                                                         |
| -------------------- | --------------------------------------------------------------------------------------------------- |
| **Check Index**      | Re-inspects the URL via the URL Inspection API and shows the updated status in a toast notification |
| **Request Indexing** | Submits the URL to Google for indexing via the Indexing API                                         |

#### Bulk Actions

Select multiple rows using the checkboxes, then use the **Bulk Actions** dropdown:

* **Check Index** — Re-inspect all selected URLs
* **Submit Index** — Submit all selected URLs for indexing

### Recent Movements

The movements section tracks changes in indexing status over time, helping you spot trends and catch issues early.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Recent Movements section with status changes and field type badges</p></figcaption></figure>

| Tab                                  | What it shows                                                               |
| ------------------------------------ | --------------------------------------------------------------------------- |
| **All Movements**                    | Every status change across all tracked fields                               |
| **Indexing Changes**                 | Only coverage state and indexing state changes (most relevant)              |
| **Recently Published & Not Indexed** | URLs first discovered in the selected date range that are still not indexed |

The **All Movements** tab includes a **Type** column showing which field changed (Coverage, Indexing, Robots, Canonical). You can filter by type and by page URL.

***

## Tab 2: Visual Diagnostics

The Visual Diagnostics tab shows five charts that help you identify structural indexing problems at a glance. Data is loaded only when you navigate to this tab.

### Internal Link Crawler

At the top of the Diagnostics tab, a **Crawler Progress Card** shows the status of the internal link crawler:

* **Not run yet** — "Internal link crawler has not been run yet." with a **Run Link Crawler** button
* **Running** — "Crawling... X/Y URLs" with live progress updates
* **Completed** — "Last crawl: Apr 5, 2026 — 2000 URLs"

Click **Run Link Crawler** to start a manual crawl. The crawler uses headless Chrome to visit each sitemap URL, extract all internal `<a>` links, and store the link graph (anchor text, rel attributes, follow/nofollow). This data powers the scatter plot and the inlink counts in the Log Analyzer table.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Crawler progress card showing last crawl date and Run Link Crawler button</p></figcaption></figure>

{% hint style="info" %}
The crawler processes up to **2,000 URLs per cycle**, prioritizing uncrawled URLs first. For larger sites, the full site is covered over multiple weekly cycles. You can configure the crawl rate and schedule in [Settings](#internal-link-crawler-settings).
{% endhint %}
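Conceptually, the link-extraction step works like the sketch below. This uses Python's built-in HTML parser instead of headless Chrome, so unlike the real crawler it won't see JavaScript-rendered links; the class and field names are illustrative, not SEO Utils internals:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkParser(HTMLParser):
    """Collects internal <a> links with their anchor text and rel attributes."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.links = []        # list of dicts: href, rel, text
        self._current = None   # the link currently being read

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if not attrs.get("href"):
            return
        href = urljoin(self.base_url, attrs["href"])
        # keep only links pointing at the same host (internal links)
        if urlparse(href).netloc == self.base_host:
            self._current = {"href": href, "rel": attrs.get("rel", ""), "text": ""}

    def handle_data(self, data):
        if self._current is not None:
            self._current["text"] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self._current["text"] = self._current["text"].strip()
            self.links.append(self._current)
            self._current = None

parser = InternalLinkParser("https://yoursite.com/blog/post")
parser.feed('<a href="/pricing" rel="nofollow">Pricing</a> '
            '<a href="https://other.com/x">External</a>')
print(parser.links)
# → [{'href': 'https://yoursite.com/pricing', 'rel': 'nofollow', 'text': 'Pricing'}]
```

The external link is dropped because its host differs from the base URL's host; only same-host links enter the link graph.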

### Indexing Velocity

This chart answers: **"Is Google getting faster or slower at indexing my new content?"**

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Indexing Velocity chart showing median time-to-index with P90 band over weeks</p></figcaption></figure>

* **Blue line** — Median (P50) time-to-index in days
* **Red shaded area** — 90th percentile (P90) band, showing how long the slowest 10% of URLs take

The X-axis shows weeks. Each data point represents all URLs published that week and how long they took to reach "Submitted and indexed" status.

{% hint style="info" %}
This is a **forward-looking metric**. It only tracks URLs discovered after you set up SEO Utils, so the data won't include historical pages that existed before your first sitemap sync. New users will see an empty state: *"Awaiting new content. Velocity tracking begins when you publish new URLs to your sitemap."*
{% endhint %}

### Directory Health

Horizontal stacked bars showing the **indexed vs. not-indexed ratio** for each subdirectory of your site.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Directory Health chart showing indexed percentage per directory with green/red stacked bars</p></figcaption></figure>

* **Green** = indexed percentage
* **Red** = not-indexed percentage
* Shows the **top 10 directories** by URL count, with an "Other" bucket for the rest
* Root-level pages (like `/about`, `/contact`) are grouped into the `/` bucket

Use the **Group By** dropdown (top-right of the chart) to switch between:

* **Directory Path (Auto)** — Groups by first URL path segment (default)
* **Content Groups** — Groups by the content groups you've created in the [Insights tab](https://help.seoutils.app/guide/google-search-console) (using URL filters or manual page selection)
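One plausible reading of the default "Directory Path (Auto)" grouping, sketched in Python — the exact bucketing rule is an assumption based on the description above, including how root-level pages fall into the `/` bucket:

```python
from urllib.parse import urlparse
from collections import Counter

def directory_bucket(url):
    """First URL path segment, or '/' for root-level pages."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return f"/{segments[0]}/" if len(segments) > 1 else "/"

urls = [
    "https://yoursite.com/",             # root           -> "/"
    "https://yoursite.com/about",        # root-level page -> "/"
    "https://yoursite.com/blog/post-1",  #                -> "/blog/"
    "https://yoursite.com/blog/post-2",
    "https://yoursite.com/docs/setup",   #                -> "/docs/"
]
counts = Counter(directory_bucket(u) for u in urls)
print(counts.most_common())
# e.g. [('/', 2), ('/blog/', 2), ('/docs/', 1)]
```

The chart then computes the indexed/not-indexed split within each bucket and keeps the ten largest buckets, folding the rest into "Other".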

### Crawl Budget by Directory

A bar chart showing how **Googlebot allocates its crawl budget** across your site's directories.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Crawl Budget chart showing Googlebot hit distribution by directory</p></figcaption></figure>

This chart is only available when you have a **log report linked** to this domain. If no log report is linked, you'll see: *"Link a log report to see crawl budget allocation by directory."*

{% hint style="success" %}
**The "aha moment"**: Compare the Directory Health chart with the Crawl Budget chart side by side. If `/search/` is only 15% indexed but consumes 75% of all Googlebot hits, you've found a crawl trap that needs fixing.
{% endhint %}

### Internal Links vs. Indexation

A scatter plot that shows the **correlation between internal link count and indexation success**.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Scatter plot showing internal links vs indexation with two horizontal bands</p></figcaption></figure>

* **X-axis** — Number of internal inlinks (from the local crawler)
* **Y-axis** — Two bands: "Indexed" (top) and "Not Indexed" (bottom)
* **Green dots** = indexed URLs, **Red dots** = not-indexed URLs
* Dot sizes are uniform (no bubble chart)

You'll typically see orphan pages (low inlinks) clustered in the "Not Indexed" band, showing visually how strongly internal linking correlates with getting indexed.

When log data is available, a **Hide Zero-Hit Pages** toggle appears to filter out URLs that Googlebot hasn't visited.

{% hint style="warning" %}
This chart requires the internal link crawler to have run. If the crawler hasn't been run yet, you'll see: *"Run the internal link crawler to see this chart."*
{% endhint %}

### Indexing Funnel

A horizontal bar chart showing **where your URLs currently sit** in the indexing pipeline. Each bar's width is proportional to the total sitemap count, so you can instantly see the distribution.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Indexing Funnel showing distribution of URLs across pipeline stages with color-coded bars</p></figcaption></figure>

| Stage           | Color  | What it means                                                                     |
| --------------- | ------ | --------------------------------------------------------------------------------- |
| **In Sitemap**  | Gray   | Total URLs from your sitemap (baseline — always 100%)                             |
| **Discovered**  | Amber  | URLs Google knows about but hasn't crawled yet — may need better internal linking |
| **Crawled**     | Red    | URLs Google crawled but rejected — content quality or technical issues            |
| **Indexed**     | Green  | Successfully indexed and eligible to appear in search                             |
| **Impressions** | Violet | Indexed pages that received at least 1 impression in the selected period          |

Each bar shows the count and percentage of total. Hover over any bar for a description of what the stage means.

**Intentional Exclusions** are listed below the bars as a text summary:

* **Blocked by robots.txt** — URLs your robots.txt prevents Google from crawling
* **Excluded (noindex / duplicate canonical)** — URLs intentionally excluded from the index

{% hint style="info" %}
A large red "Crawled" bar relative to the green "Indexed" bar means Google is visiting your pages but rejecting the content. This is usually the most actionable insight — check those pages for thin content, duplicate issues, or soft 404s.
{% endhint %}

***

## Tab 3: Log Analyzer

The Log Analyzer tab combines data from **three sources** into a single actionable table:

1. **GSC URL Inspection API** — Coverage state for each URL
2. **Internal Link Crawler** — Inlink count per URL
3. **Server Log Files** — Googlebot hit counts from your server logs

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Log Analyzer table showing URL, GSC Status, Inlinks, Log Hits, and Actionable Insight columns</p></figcaption></figure>

### Columns

| Column                 | Source             | Description                                                                  |
| ---------------------- | ------------------ | ---------------------------------------------------------------------------- |
| **URL**                | Any                | The page path (click to open in browser)                                     |
| **GSC Status**         | URL Inspection API | Coverage state badge. Shows **"Not in Sitemap"** for URLs found only in logs |
| **Inlinks**            | Local Crawler      | Internal link count. Shows "—" if crawler hasn't run                         |
| **Log Hits (30d)**     | Server Logs        | Googlebot hit count. Shows "—" if no log report linked                       |
| **Actionable Insight** | Computed           | Color-coded badge based on the rules below                                   |

### Actionable Insights

Each URL gets an automatically computed insight based on its data:

| Insight              | Color  | Condition                                                                     |
| -------------------- | ------ | ----------------------------------------------------------------------------- |
| **Healthy**          | Green  | URL is indexed                                                                |
| **High Crawl Waste** | Red    | High log hits + not indexed (Googlebot keeps visiting but Google won't index) |
| **Low Crawl Waste**  | Orange | Moderate log hits + not indexed                                               |
| **Orphan Page**      | Amber  | Few or zero internal links + not indexed (needs internal links)               |
| **Not in Sitemap**   | Purple | URL found in server logs but not in your sitemap                              |

The thresholds for "high" and "low" crawl waste and the orphan inlink threshold are configurable in [Settings](#actionable-insight-thresholds).
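The rules above can be sketched as a single function. This illustrates the documented conditions with the default thresholds — it is not SEO Utils' actual code, and the precedence between the crawl-waste and orphan rules is an assumption:

```python
def actionable_insight(indexed, log_hits, inlinks, in_sitemap,
                       high_waste=20, low_waste=5, orphan_inlinks=2):
    """Compute an insight badge. log_hits/inlinks may be None when that
    data source (server logs / link crawler) isn't available."""
    if not in_sitemap:
        return "Not in Sitemap"          # found only in server logs
    if indexed:
        return "Healthy"
    # not indexed from here on
    if log_hits is not None and log_hits > high_waste:
        return "High Crawl Waste"        # Googlebot keeps visiting, no index
    if log_hits is not None and log_hits > low_waste:
        return "Low Crawl Waste"
    if inlinks is not None and inlinks < orphan_inlinks:
        return "Orphan Page"             # needs internal links
    return None                          # not indexed, but no specific badge

print(actionable_insight(indexed=False, log_hits=42, inlinks=10, in_sitemap=True))
# → High Crawl Waste
```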

### Filters

* **URL search** — Text search across all URLs
* **GSC Status** — Faceted filter (select one or more coverage states, or "Not in Sitemap")
* **Insight** — Faceted filter (Healthy, High Crawl Waste, Orphan, etc.)
* **Inlinks** — Range filter (from/to)
* **Log Hits** — Range filter (from/to)

### CSV Export

Click the **Export CSV** button to download the full merged dataset as a CSV file. The export includes all columns and all rows (not just the current page).

{% hint style="info" %}
The Log Analyzer table works with whatever data is available. If you haven't run the crawler, the Inlinks column shows "—". If no log report is linked, the Log Hits column shows "—". The table is still useful with just GSC data alone.
{% endhint %}

***

## Settings

Click the **cog icon** in the top-right to open the Indexing Settings modal. The settings are organized into four sections.

<figure><img src="https://1176579443-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F2DwV6sJBiKjUHMDggb4d%2Fuploads%2Fgit-blob-23706452980da93a610748dc3288e115c241b73c%2Fplaceholder.png?alt=media" alt=""><figcaption><p>Indexing Settings modal with auto-submit, normalization, thresholds, and crawler settings</p></figcaption></figure>

### Auto Submit for Indexing

| Setting                      | Description                                                                                                                                 |
| ---------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------- |
| **Auto Submit for Indexing** | When enabled, automatically submits not-indexed URLs to Google via the Indexing API. URL inspection always runs regardless of this setting. |
| **Resubmission Cooldown**    | Minimum days between resubmission attempts for the same URL. Default: 7 days. Only shown when auto-submit is on.                            |

### URL Normalization

These settings control how URLs are normalized for matching across GSC data, server logs, and crawler results.

| Setting             | Description                                                                                                                               |
| ------------------- | ----------------------------------------------------------------------------------------------------------------------------------------- |
| **Trailing Slash**  | How to handle trailing slashes: **Follow Sitemap** (default — strips trailing slashes), **Always Add**, or **Always Strip**               |
| **Keep Parameters** | Comma-separated list of query parameters to preserve (e.g., `p, page, id, lang`). All other parameters (including UTM tags) are stripped. |

The **Recalculate Normalized Paths** button recomputes all normalized paths when you change these settings. This affects how URLs are joined across the three data sources.

### Actionable Insight Thresholds

Configure the sensitivity of the Log Analyzer's insight engine:

| Setting                       | Default | Description                                                          |
| ----------------------------- | ------- | -------------------------------------------------------------------- |
| **High Crawl Waste (hits >)** | 20      | Log hits above this trigger "High Crawl Waste" for not-indexed pages |
| **Low Crawl Waste (hits >)**  | 5       | Log hits above this trigger "Low Crawl Waste" for not-indexed pages  |
| **Orphan Page (inlinks <)**   | 2       | Pages with fewer inlinks than this are flagged as orphan pages       |

### Internal Link Crawler Settings

| Setting                 | Default | Description                                                                                          |
| ----------------------- | ------- | ---------------------------------------------------------------------------------------------------- |
| **Requests per Second** | 1       | How fast the crawler visits your pages. Keep low to avoid overwhelming your server.                  |
| **Max Chrome Tabs**     | 5       | Number of concurrent headless Chrome tabs. Each uses \~50-100MB RAM.                                 |
| **Auto-Crawl Schedule** | Weekly  | How often the crawler runs automatically: **Weekly**, **Bi-weekly**, **Monthly**, or **Manual Only** |

{% hint style="warning" %}
The crawler uses headless Chrome to render pages (capturing JavaScript-rendered links). At the default rate of 1 request/second with 5 tabs, it processes \~300 pages/minute. A 2,000-URL batch takes about 7 minutes.
{% endhint %}

***

## How It Works Behind the Scenes

### URL Inspection

SEO Utils automatically inspects your URLs using the Google URL Inspection API every 4 hours. URLs are prioritized in this order:

1. **Never inspected** — New URLs from sitemap
2. **Content updated** — URLs where the sitemap lastmod changed
3. **Not indexed** — URLs that aren't indexed yet
4. **Indexed** — Re-checked on a rotating basis (lowest priority)

{% hint style="info" %}
Google enforces a limit of **2,000 URL Inspection API requests per day per property**. For sites with more URLs, SEO Utils automatically spreads inspections across multiple days.
{% endhint %}
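The priority order can be sketched as a sort key over the per-URL state. Field names like `last_inspected` and `lastmod` are illustrative, not the actual schema:

```python
from datetime import datetime, timedelta, timezone

def inspection_priority(url):
    """Lower tuples sort first; within a tier, oldest inspection first."""
    if url["last_inspected"] is None:
        tier = 0   # never inspected (new sitemap URLs)
    elif url["lastmod"] and url["lastmod"] > url["last_inspected"]:
        tier = 1   # content updated since the last inspection
    elif not url["indexed"]:
        tier = 2   # still not indexed
    else:
        tier = 3   # indexed: rotating re-check, lowest priority
    oldest = url["last_inspected"] or datetime.min.replace(tzinfo=timezone.utc)
    return (tier, oldest)

now = datetime.now(timezone.utc)
urls = [
    {"url": "/a", "last_inspected": now - timedelta(days=3), "lastmod": now, "indexed": True},
    {"url": "/b", "last_inspected": None, "lastmod": None, "indexed": False},
    {"url": "/c", "last_inspected": now - timedelta(days=9), "lastmod": None, "indexed": False},
]
batch = sorted(urls, key=inspection_priority)[:2000]   # daily API quota cap
print([u["url"] for u in batch])
# → ['/b', '/a', '/c']
```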

### URL Normalization Pipeline

To join data from GSC (full URLs), server logs (request paths only), and the crawler (full URLs), all URLs pass through a normalization pipeline:

1. Strip protocol and hostname → `/blog/post`
2. Standardize trailing slashes (based on your setting)
3. Remove tracking parameters (UTM, gclid, fbclid, etc.)
4. Keep only configured structural parameters
5. Lowercase the path

This produces a `normalized_path` used for exact-match joins across all three data sources.
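A minimal sketch of this pipeline, assuming the default trailing-slash behavior (strip) and the example keep-list from Settings — function and parameter names here are illustrative:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode

def normalize(url, keep_params=("p", "page", "id", "lang"), trailing_slash="strip"):
    # 1. strip protocol and hostname by working with the path only
    parts = urlsplit(url)
    path = parts.path or "/"
    # 2. standardize trailing slashes
    if trailing_slash == "strip" and path != "/":
        path = path.rstrip("/")
    elif trailing_slash == "add" and not path.endswith("/"):
        path += "/"
    # 3-4. the keep-list whitelist drops tracking params (utm_*, gclid, ...)
    #      and everything else not explicitly configured
    query = [(k, v) for k, v in parse_qsl(parts.query) if k in keep_params]
    # 5. lowercase the path
    normalized = path.lower()
    if query:
        normalized += "?" + urlencode(sorted(query))
    return normalized

print(normalize("https://YourSite.com/Blog/Post/?utm_source=x&page=2"))
# → /blog/post?page=2
```

Because all three sources run through the same function, `https://yoursite.com/Blog/Post/` from GSC and `/blog/post?utm_source=x` from a log line both resolve to the same `normalized_path` and join cleanly.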

### Googlebot Verification

When importing server logs, SEO Utils verifies Googlebot requests using **Google's officially published IP ranges** (CIDR blocks). This prevents spoofed Googlebot user agents from polluting your data. IP ranges are refreshed daily from Google's public endpoint.
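Conceptually, the verification is a CIDR membership check on the request IP, ignoring whatever the user-agent header claims. The ranges below are illustrative only; the real list is refreshed daily from Google's published endpoint:

```python
import ipaddress

# Illustrative CIDR blocks; the authoritative list comes from Google's
# published JSON of Googlebot ranges and changes over time.
GOOGLEBOT_RANGES = [ipaddress.ip_network(c) for c in (
    "66.249.64.0/19",
    "2001:4860:4801::/48",
)]

def is_verified_googlebot(ip: str) -> bool:
    """True only if the request IP falls inside a published Googlebot range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in GOOGLEBOT_RANGES)

print(is_verified_googlebot("66.249.66.1"))    # True  — inside a Googlebot range
print(is_verified_googlebot("203.0.113.50"))   # False — spoofed UA, wrong IP
```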

***

## Troubleshooting

<details>

<summary><strong>Chart shows "No chart data available"</strong></summary>

The chart requires at least one inspection batch to complete. Either:

* Click **Run Initial Scan** in the top-right to start an immediate inspection
* Wait for the automatic inspection cycle (runs every 4 hours)

Chart data is generated from inspection results, so URLs must be inspected before the chart has data to display.

</details>

<details>

<summary><strong>Dashboard shows "Based on 0/N URLs inspected"</strong></summary>

This means URLs have been discovered from your sitemap but haven't been inspected yet. This is normal on first setup. Click **Run Initial Scan** or wait for the automatic cycle.

</details>

<details>

<summary><strong>"No sitemaps found" error</strong></summary>

Your property doesn't have a sitemap configured in SEO Utils. Navigate to **Sitemaps** from the property navigation and add your sitemap URL.

</details>

<details>

<summary><strong>Velocity chart is empty</strong></summary>

The velocity chart only tracks URLs discovered **after** you set up SEO Utils. If all your URLs were already in the sitemap when you first connected, the chart has no data to show. Publish new content to your sitemap and the chart will populate as those URLs get indexed.

</details>

<details>

<summary><strong>Scatter plot says "Run the internal link crawler"</strong></summary>

The Internal Links vs. Indexation chart requires crawl data. Go to the **Visual Diagnostics** tab and click **Run Link Crawler** at the top. The scatter plot will populate once the crawl completes.

</details>

<details>

<summary><strong>Crawl Budget chart says "Link a log report"</strong></summary>

The Crawl Budget chart requires server log data. Create a log report in the **Log Analyzer** section of the sidebar, import your server logs, and make sure the report's domain matches your GSC property domain.

</details>

<details>

<summary><strong>Log Analyzer columns show "—"</strong></summary>

A dash means no data is available from that source:

* **Inlinks showing "—"** → Run the internal link crawler
* **Log Hits showing "—"** → Link a log report with server log data for this domain

The table still works with partial data — you don't need all three sources to use it.

</details>

<details>

<summary><strong>Inspection seems slow</strong></summary>

Each URL requires a network call to Google's API. Large sites with hundreds or thousands of URLs may take several minutes to complete a full inspection cycle. This is normal.

</details>

<details>

<summary><strong>Quota exceeded — inspections stopped</strong></summary>

Google limits URL Inspection API to 2,000 requests per day per property. When the quota is reached, remaining URLs are scheduled for the next day. The dashboard shows partial data based on what was inspected.

</details>

<details>

<summary><strong>Clicks and Impressions show 0 for all URLs</strong></summary>

Clicks and impressions come from Google Search Console performance data, which requires the GSC data sync to be running. Make sure your property has synced data (check the Performance page). The indexing dashboard joins with the performance data for the selected date range.

</details>
