Desktop Impressions Drop After Google Modifies Results Parameters



For a long time, SEOs and data tools have relied on Google’s &num=100 search parameter to pull full pages of results in one go. But recently, something changed. Google quietly stopped supporting the parameter around mid-September 2025, with SEOs noticing the shift throughout common tracking tools and reporting dashboards.

This is what we know so far, what it means for SEO tracking and how Brafton has adapted to continue delivering SEO success for clients.

What Happened & Why It Matters

Around mid-September 2025, SEOs going about their usual business noticed that adding &num=100 to a Google search URL no longer returned 100 results. Instead, pages often stopped after two sets of listings, regardless of the number requested. There’s even a Google Search Help thread from as far back as March 2025 indicating that something was off with the parameter.
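To make the mechanics concrete, here's a minimal sketch of how a results URL with the parameter was typically constructed. The helper name is hypothetical; the `num` query parameter is the one Google no longer honors:

```python
from urllib.parse import urlencode

def build_search_url(query, num=None):
    """Illustrative helper: build a Google results URL, optionally
    appending the (now-unsupported) num results-per-page parameter."""
    params = {"q": query}
    if num is not None:
        # num=100 used to return up to 100 listings in a single page;
        # as of mid-September 2025 Google ignores it.
        params["num"] = num
    return "https://www.google.com/search?" + urlencode(params)
```

Previously, a single request to a URL like this could return an entire top-100 snapshot; now the same URL returns only the default result depth.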

Soon after, Barry Schwartz of Search Engine Land asked Google for a statement on the matter; Google confirmed that the parameter “is not something that we formally support.”

Not only was &num=100 convenient for thousands of SEOs, but many rank-tracking and visibility tools used it to collect search data efficiently. Now, those same tools are showing dramatic changes in tracking data. Going forward, it’s likely that marketers and analysts will start seeing some discrepancies or unusual metrics for impressions, average position and keyword coverage.

There isn’t currently a detailed understanding of why Google made this change. As of this writing, Google hasn’t published an official statement or blog post explaining the decision, but some speculate it’s tied to the ongoing Google antitrust lawsuit: an effort to stop competing companies from using the market dominator’s search results and data, which Google will soon be required to share.

How To Navigate the &num=100 Parameter Change

Google’s quiet withdrawal of support for the &num=100 parameter doesn’t spell the end of rank tracking, but it does signal big changes in how SEOs collect, measure and interpret data moving forward.

Since this tweak affects both the volume of search results returned and the accuracy of data sampling from SERPs, SEOs and marketers will need to realign their expectations and update practices to maintain reliable insights.

1. Recalibrate Data Expectations

First and foremost, this change will cause domain performance to appear more volatile than it might actually be. Tools that relied on fetching 100 results at once now need to run multiple smaller queries, which can cause:

  • Apparent drops in impressions and visibility, especially for keywords that previously benefited from broader sampling.
  • Shifts in average position metrics due to reduced result depth.

Marketers should therefore treat post-September 2025 data as a new baseline and avoid comparing it directly with earlier periods without noting the methodology change.
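The sampling change above can be sketched in a few lines. Assuming a tracker pages through results with Google's `start` offset parameter (the standard pagination offset in results URLs), one former &num=100 fetch becomes a series of smaller requests:

```python
def paginated_offsets(total=100, page_size=10):
    """Result-page offsets a rank tracker must now request one by one,
    where a single num=100 request previously sufficed."""
    return list(range(0, total, page_size))

# Ten separate fetches (start=0, 10, 20, ... 90) replace one request,
# which is why many tools now sample less depth, less often.
offsets = paginated_offsets()
```

The tenfold request count is exactly why vendors may trade off result depth or crawl frequency, producing the apparent impression and position shifts described above.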

2. Audit Your Rank-Tracking Tools

Many rank trackers and SEO platforms (like Semrush, Ahrefs, AccuRanker, etc.) are already working on internal fixes, such as paginating results through multiple smaller requests. Each vendor will approach this differently and at a different pace, so it’s worthwhile to audit your rank-tracking tools to learn how they’re handling the change:

  • Have they adapted to the new limit?
  • Are they showing data gaps or sudden visibility losses that might be artificial?

Knowing the answers to these questions will help you separate actual performance changes from tool-related artifacts.

3. Expect To Rely More on Google’s Native Data Through Search Console

With scraping becoming less reliable, SEOs should expect a stronger pivot toward Google Search Console as a primary, trustworthy data source. While it has its own limitations, Search Console is Google’s own product, so it remains consistent with Google’s data ecosystem and is therefore more predictable.

If you haven’t already, encourage SEO teams to use Search Console metrics as an anchor for reporting, rather than scraped SERP datasets.
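For teams anchoring reports on Search Console, the public Search Analytics API (`searchanalytics.query`) accepts a JSON request body like the one below. This is a sketch of just the body; actually executing the query requires an authenticated client (e.g., via google-api-python-client with Search Console credentials), which is omitted here:

```python
def search_console_query_body(start_date, end_date):
    """Request body for the Search Console Search Analytics API
    (searchanalytics.query). Field names follow the public API
    reference; dates use YYYY-MM-DD strings."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        # Break results out by query and page to get keyword- and
        # URL-level clicks, impressions, CTR and position.
        "dimensions": ["query", "page"],
        "rowLimit": 1000,  # the API allows up to 25,000 rows per request
    }
```

Because these numbers come from Google’s own logging rather than scraped SERPs, they’re unaffected by the &num=100 change, which is what makes them a stable reporting anchor.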

4. Expect Tighter Data Controls from Google To Come

This change may be part of a broader trend of Google reducing automated access to SERPs in an effort to combat scraping, AI data harvesting and API overuse. Future changes could limit other URL parameters or impact how third-party tools interact with results pages. As such, SEO teams should expect:

  • More throttling or obfuscation of SERP data.
  • Continued emphasis on first-party data and authenticated APIs.
  • Greater emphasis on data interpretation, rather than collection volume.

5. Refocus Strategy on Impact, Not Volume

Instead of tracking hundreds of ranking positions, emphasis should shift to page-level performance, conversions and content quality signals. On their own, rankings are now a more volatile and less reliable metric, but user engagement and business outcomes remain stable indicators of success.

“SEOs should shift their focus from volume to value keywords,” says Philip Weafer, associate vice president of SEO at Brafton. “Place a greater emphasis on keywords that directly drive, or assist, revenue.”

On the upside, this shift may actually free SEOs to focus more on storytelling, search intent and topical authority rather than raw ranking counts.

How Brafton Has Prepared for Continued SEO Success

Brafton has weathered some of the industry’s biggest changes for nearly two decades, shifting our approach and strategy as necessary to continue delivering consistent client wins. This most recent &num=100 parameter change means we’re:

  • Continuing to emphasize quality over quantity in all content areas.
  • Increasing our focus on the top 10 results and on generative engine optimization.
  • Driving improvements for our clients that build topical authority and ownership.
  • Building sturdy generative engine optimization campaigns as zero-click searches continue growing.

If you’re curious about these changes, or about how SEO in general can help your website better reach your audience, check out our SEO resources.