What the &num=100 Parameter Means in Google Search (and Why It Shook Up GSC Metrics)
Introduction
In mid-September 2025, many site owners and SEO professionals woke up to a startling shift: Google Search Console (GSC) impressions plunged, average rankings “improved,” and click-through rates (CTR) often rose — even though actual site traffic and rankings remained stable. This was not a widespread algorithmic penalty, but rather a major measurement change — one tied to Google quietly disabling the &num=100 parameter.
For years, that parameter allowed users and SEO tools to fetch up to 100 search results in a single page load. But removing it dismantled a hidden source of artificial impression counts. What you’re seeing now is a cleaner, more human-driven view of search visibility — and a recalibration of how we interpret SEO performance.
What Was &num=100 — And Why SEOs Used It
By default, Google returns about 10 organic search results per page for a standard web query. However, the &num=100 URL parameter, appended to the search URL, forced Google to return up to 100 results on one page. This was especially useful to SEO tools and scrapers, because:
- It compacted what would otherwise require 10 paginated requests (pages 1 through 10) into one.
- It enabled crawling deeper into the search results (positions 11–100) quickly.
- It allowed rank-tracking platforms to assess how many keywords a site might have “visibility” for across deeper positions.
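To make the mechanics concrete, the difference between the two crawl strategies is just URL construction. The sketch below assumes Google's standard `q`, `start`, and `num` query parameters; it builds URLs only and is not a working scraper:

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def paginated_urls(query: str, depth: int = 100, per_page: int = 10) -> list[str]:
    """One request per results page: ten URLs to cover positions 1-100."""
    return [
        f"{BASE}?{urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, per_page)
    ]

def single_deep_url(query: str) -> str:
    """The now-disabled shortcut: one request for up to 100 results."""
    return f"{BASE}?{urlencode({'q': query, 'num': 100})}"

print(len(paginated_urls("example query")))  # 10 requests, versus...
print(single_deep_url("example query"))      # ...one URL ending in &num=100
```

With the parameter ignored, tools fall back to the left-hand approach: roughly ten times as many requests to cover the same depth.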
So far, so practical. But it carried a hidden consequence: those bot- and tool-driven "views" of deep results could be counted (or inferred) as impressions in GSC.
Why Disabling &num=100 Matters to GSC Reporting
1. Impressions Drop Because Bot-Driven Deep Impressions Disappear
Many SEO tools and scrapers made heavy use of &num=100 to scan dozens or hundreds of SERP results in one go. Some of those crawler loads were being logged (or inferred) as impressions in GSC for pages in deeper positions (e.g., position 40, 60, 80) even though real human users rarely, if ever, scroll that far.
When Google disabled or ignored the &num=100 parameter around September 12–14, 2025, those bot-driven “impression” counts vanished from GSC reports. Suddenly, impression totals dropped, often steeply, because a large skew of non-human counts was removed (Search Engine Roundtable, Locomotive Agency, Search Engine Journal).
In an analysis of 319 properties, 87.7% experienced declines in impressions post-change (Search Engine Land).
Was this nefarious on the part of Google or third-party tools? No. The &num=100 parameter simply gave SEO practitioners a way to see deeper into the top 100 results. The impressions from those positions weren’t “fake,” but they did create a skewed view of performance once folded into GSC averages.
2. Average Position Shifts Up (Numerically Better)
With lower-ranking (deep) results no longer included in counts, the weighted average position metric in GSC adjusts upward. In simpler terms: when the “deep tail” of many URLs is removed, the average of the remaining positions looks better (Locomotive Agency, Search Engine Roundtable, Search Engine Journal).
Thus, it might falsely look like your site’s average position has improved — even though your real rankings (in visible zones) are unchanged.
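A toy calculation shows why. GSC’s average position is impression-weighted, so removing deep, bot-driven rows pulls the average sharply toward the real ranking. The numbers below are invented purely for illustration:

```python
def average_position(rows):
    """GSC-style average position: impression-weighted mean of positions."""
    total_impressions = sum(impressions for _, impressions in rows)
    return sum(pos * impressions for pos, impressions in rows) / total_impressions

# Hypothetical query-level rows as (position, impressions)
with_bots   = [(3, 1000), (45, 400), (72, 300)]  # deep rows inflated by scrapers
humans_only = [(3, 1000)]                        # deep impressions removed

print(round(average_position(with_bots), 1))    # 25.1
print(round(average_position(humans_only), 1))  # 3.0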
3. Clicks Remain Largely Unaffected
Because human users rarely click results beyond the first few pages, your actual organic clicks and traffic tend to stay stable through this transition. The key shift is in how many “impressions” are registered — not how many real visits occur.
4. CTR (Click-Through Rate) Often Rises
CTR is a ratio of Clicks ÷ Impressions. Since the denominator (impressions) shrank while the numerator (clicks) held steady, many sites saw their CTRs increase — again, not because more people clicked, but because fewer “phantom” impressions were being counted.
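The arithmetic is straightforward: with clicks fixed, halving the impression count doubles the reported CTR. The figures here are illustrative, not from any real property:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions

clicks = 500                   # real human clicks, unchanged by the update
before = ctr(clicks, 20_000)   # bot-inflated impression count
after  = ctr(clicks, 10_000)   # post-change, human-driven count

print(f"{before:.1%} -> {after:.1%}")  # 2.5% -> 5.0%
```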
5. Keyword / Query Visibility Contracts
Because deep (low-ranking) queries are no longer being counted, sites saw reductions in the number of unique ranking keywords and their perceived “visibility.” In other words, your site may appear to have lost keywords in GSC — but many of those were never driving meaningful volume (Search Engine Roundtable).
In the same dataset, 77.6% of sites saw drops in unique query count (Search Engine Land).
Timeline & Evidence
- Reports began around September 10, 2025, of rank trackers failing, missing positions, or returning partial result sets when using &num=100 (Search Engine Journal, Locomotive Agency).
- By September 12–14, widespread impression drops across GSC surfaced, especially for desktop (MeasureMinds, Trial Guides, Locomotive Agency).
- Tool vendors (e.g. Semrush, Ahrefs) acknowledged disruptions in their crawling / ranking modules (Logical Position, Search Engine Journal).
- Multiple SEO analysts and agencies published guides to interpret the data shifts and their implications (Logical Position, Found, Locomotive Agency).
What This Doesn’t Imply (But Some May Fear)
- It does not necessarily mean your site’s actual organic visibility collapsed overnight.
- It does not mean Google penalized you or downgraded your rankings (unless, independently, that happened).
- It does not mean that future impressions will return to prior inflated levels — the baseline is now different.
In most cases, the traffic, conversions, or sessions you care about remain stable — what changed is the data lens.
What You Should Do Now: Strategy & Reporting Adjustments
1. Annotate Your Data
Mark mid-September 2025 (e.g., Sept 12) as the point of the “num=100 removal adjustment” in your dashboards and reports. Avoid comparing pre- and post-change metrics without context.
2. Focus on Signals That Matter
Instead of giving too much weight to raw impressions, lean more heavily on:
- Click volume
- Organic sessions / users
- Conversion / goal performance
- Trends in position buckets (e.g. share of keywords in positions 1–3, 4–10, 11–20)
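One way to track those buckets is to classify each query’s average position from a GSC export. The bucket thresholds and sample rows below are assumptions; adjust them to match your own reporting:

```python
from collections import Counter

def bucket(position: float) -> str:
    """Map an average position to a reporting bucket."""
    if position <= 3:
        return "1-3"
    if position <= 10:
        return "4-10"
    if position <= 20:
        return "11-20"
    return "21+"

# Hypothetical (query, average position) rows from a GSC export
rankings = [("query a", 2.1), ("query b", 7.4),
            ("query c", 14.9), ("query d", 18.0), ("query e", 55.0)]

shares = Counter(bucket(pos) for _, pos in rankings)
print(dict(shares))  # {'1-3': 1, '4-10': 1, '11-20': 2, '21+': 1}
```

Tracking the bucket counts over time is robust to the num=100 change, because the buckets that matter (1–20) were always dominated by human-visible impressions.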
3. Rebaseline KPIs
Assume your impression totals will be permanently lower (compared to pre-September 2025). Reset your expectations and comparisons to post-change baselines.
4. Check Tool Vendor Methodologies
Ask your rank-tracking or SEO platforms how they now gather data (pagination, API, caching) and what coverage (top 10, top 20, top 50) they guarantee. Many have had to redesign workflows in response to &num=100 removal.
5. Prioritize Your Efforts
Because deep-SERP visibility (positions 50–100) rarely drives meaningful traffic anyway, refocus your efforts on lifting pages that rank in the 11–20 zone into the first page (1–10). That’s where you’ll get the best ROI.
Use content audits, internal linking, UX improvements, intent matching, and query refinement in that window.
6. Monitor for Ripples
- Watch for continued shifts in how GSC defines impressions (e.g., new experiment flags, or mobile/desktop splits).
- Check whether Google issues an official statement clarifying this or adjusts reporting again.
- Be alert to tool vendors altering their reporting outputs or coverage levels.
Why Google Probably Did This — The Leading Theories
- Remove Bot / Scraper Distortion in Metrics
Disabling &num=100 helps strip out artificial impression inflation from mass scraping operations. Many believe this was a key motivator (Delante, Logical Position, Search Engine Journal).
- Reduce Load / Infrastructure Abuse
Allowing a single query to fetch 100 results is more resource-intensive and opens the door to aggressive automated scraping. Disabling it is a deterrent (Locomotive Agency).
- Align Impressions with Human Behaviour
Since most users only see the first page (10 results), enforcing that default makes GSC’s reported impressions more reflective of what humans actually see (Logical Position, MeasureMinds, Locomotive Agency).
- Anti-Scraping / Anti-AI Strategy
With the rise of large language models and AI systems that rely on scraping search results, removing such parameters may help control or throttle abuse (Delante).
- Measurement Consolidation Ahead of Future SERP Models
Google may be preparing its reporting framework for AI-augmented search, where impressions and result sets will behave differently. Cleaning up the measurement now sets a more stable foundation (Found, Search Engine Journal).
It’s worth noting: Google has not published (as of now) an official, detailed explanation confirming any single theory. Many of the above are inferred from timing, patterns, and tool vendor responses.