Sunday, February 5, 2023

A Complete Guide for SEOs

URL parameters or query strings are the part of a URL that typically comes after a question mark (?) and are used to pass data along with the URL. They can be active parameters that modify page content or passive parameters that are mostly used for tracking and don't change the content.

They're made up of key-value pairs, where the key tells you what data is being passed and the value is the data you're passing, such as an identifier. They look like ?key=value but may be separated by ampersands (&) like ?key=value&key2=value2 if there is more than one pair.
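As a quick sketch, the standard URL API splits a query string into these key-value pairs for you (the domain and parameters here are placeholders):

```javascript
// Parse the query string of a URL into its key-value pairs.
// "example.com" and the parameter names are illustrative placeholders.
const url = new URL("https://example.com/shoes?color=red&sort=price&page=2");
const params = Object.fromEntries(url.searchParams);

console.log(params); // { color: 'red', sort: 'price', page: '2' }
```

Note that every value comes back as a string, even numeric-looking ones like the page number.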

Parts of a URL parameter
Explanation of URL parameter parts.

In this guide, we'll be covering what you need to know about URL parameters.

As I mentioned in the intro, parameters can be active or passive. Let's look at some examples of each.

Active parameters

Active parameters modify the content of the page in some way.

Filter. Removes some of the content, leaving more specific content on the page that a user wants to see. An example of this is faceted navigation in e-commerce.

Sort. Reorders the content in some way, such as by price or rating.

Paginate. Divides content into a series of related pages.

Translate. Changes the language of the content.

Search. Queries a website for information that a user is looking for.

On our search engine, we use the key "q" for the query, and the value contains info about the user query.
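As a rough sketch, a search parameter like "q" can be built with the standard URL API, which handles the encoding for you (the domain here is a placeholder):

```javascript
// Build a search URL by setting the "q" key; URLSearchParams
// takes care of encoding the space in the query.
const search = new URL("https://example.com/search");
search.searchParams.set("q", "running shoes");

console.log(search.toString());
// https://example.com/search?q=running+shoes
```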


Passive parameters

Passive parameters don't change the content. They're typically used for tracking. Let's look at some examples.

Affiliate IDs. Passes an identifier used to track where sales and signups come from.

Advertising tags. Tracks advertising campaigns.

Session IDs. Identifies a particular user. It's not common on modern websites to use session IDs to track users.

Video timestamps. Jumps to the designated timestamp in a video.
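To make the distinction concrete, here's a minimal sketch that strips passive parameters from a URL while leaving active ones alone. The parameter names below (UTM tags, a session ID, an affiliate key) are illustrative examples, not a definitive list:

```javascript
// Remove common passive/tracking parameters from a URL.
// The list of keys is illustrative, not exhaustive.
const TRACKING_KEYS = ["utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"];

function stripTracking(href) {
  const url = new URL(href);
  for (const key of TRACKING_KEYS) url.searchParams.delete(key);
  return url.toString();
}

console.log(stripTracking("https://example.com/page?utm_source=newsletter&ref=abc&color=red"));
// https://example.com/page?color=red  (the active "color" filter survives)
```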


URL parameters can cause a number of different issues when it comes to SEO, especially in cases where multiple parameters are used. Here are some of the problems you may encounter.

Passive parameters can cause issues with duplicate content. Typically, you want them to be crawled, and each page should have a canonical set to the main version.

There may be times where you want to block these parameters from being crawled completely using robots.txt, but only in situations where you have issues with crawl budget. We'll cover this more later.

Google will choose a version of the page to index in a process called canonicalization, and signals such as links will consolidate to that indexed version.

Active parameters might create pages with near-duplicate content or content that's very similar to other content. They may also be completely different content. You'll need to check what your parameters are actually used for.

Internal links

You want to avoid passive parameters like those used for tracking on internal links (links from one page on your site to another).

This is still an all-too-common practice on larger sites, but I want to emphasize that this is an old and outdated practice that you should not be doing.

Most analytics systems have event tracking you can use instead that still records the data without adding parameters to your URLs.

It's fine to use active parameters on internal links in most cases.


Infinite URL paths with parameters or tons of different combinations can cause issues with crawling. Keep a consistent order, and don't have paths that allow for adding additional parameters.
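One way to keep a consistent order is to normalize URLs by sorting the query keys, so the same combination of parameters always produces the same URL string. A minimal sketch using the standard URL API (placeholder URLs):

```javascript
// Normalize a parameterized URL by sorting its query keys so the
// same filter combination always maps to a single URL string.
function normalize(href) {
  const url = new URL(href);
  url.searchParams.sort(); // sorts pairs by key name
  return url.toString();
}

console.log(normalize("https://example.com/shoes?size=9&color=red"));
// https://example.com/shoes?color=red&size=9
console.log(normalize("https://example.com/shoes?color=red&size=9"));
// https://example.com/shoes?color=red&size=9  (same either way)
```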

You can easily find potentially infinite paths using the Depth report under the Structure Explorer tool in Site Audit. It's not common for websites to have 9+ levels, so this is a strong indicator that there may, in fact, be infinite paths or some other issue.

Structure Explorer's Depth report
Depth report in Structure Explorer.

Google will make adjustments as it recognizes infinite paths or certain patterns when crawling. It will try to limit the crawling of URLs that it thinks won't be useful or are repetitive.


URL parameters are sometimes used for international websites. These are listed as an option for locale-specific URLs. But even Google says it's not recommended. It adds another layer of complexity where more things can go wrong. You also won't be able to geo-target these URLs in Google Search Console.


Parameters are commonly used in e-commerce for everything, from tracking to pagination to faceted navigation. These topics can be quite complex, so I recommend reading through the blog posts I linked to better understand them.


There's a growing trend where people are using # instead of ? as the fragment identifier, especially for passive parameters like those used for tracking. This is generally not a good idea. But in specific cases, it may be OK to do this to replace unnecessary parameters. I tend to recommend against it because of all the issues.

The problem is that anything after a # is ignored by servers, and a lot of systems simply won't or can't recognize parameters using a #.
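A quick demonstration of that behavior with the standard URL API (placeholder URL): anything after the # lands in the fragment, not the query string, so the server never receives it.

```javascript
// The fragment stays on the client; it is not part of the query
// string that gets sent to the server.
const url = new URL("https://example.com/page#utm_source=newsletter");

console.log(url.search); // ""  (the server would see no parameters)
console.log(url.hash);   // "#utm_source=newsletter"  (client-side only)
```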

Additionally, # already has a designated use case, which is to scroll to a part of the page. This is done on the client side, and JavaScript devs may also use it for "routing" to a page with different content.

It's a good idea to check what parameters are used on your site. In Site Audit's Page Explorer tool, you can search for URLs that contain a question mark (?).

Searching for parameters in Page Explorer.

You can use the advanced filters to find pages with multiple parameters or to start excluding parameters to help you identify all the various parameters used on your website.

Once you know what parameters are used, I recommend checking a few of the pages to see what the parameters actually do.

You can also check the Duplicates report for exact or near-duplicates. The visual makes it easy to see if you have a lot of versions of the same or similar pages and whether or not they have matching canonical tags to choose a preferred version. You can click into each cluster to get more info.

Duplicate content tree map
Duplicate content tree map view to show clusters.

There's also an option under "Bulk export" that lets you export all of the duplicate content at once. I find this option easier to use for larger sets of data.

In the past, Google had a URL parameter tool in Google Search Console where you could choose how to treat different parameters based on whether or not they changed the page content. The tool was deprecated in early 2022. Here's what Google had to say about it:

When the URL Parameters tool launched in 2009 in Search Console's predecessor, Webmaster Tools, the internet was a much wilder place than it is today. SessionID parameters were very common, CMSes had trouble organizing parameters, and browsers often broke links. With the URL Parameters tool, site owners had granular control over how Google crawled their site by specifying how certain parameters affect the content on their site.

Over the years, Google became much better at guessing which parameters are useful on a site and which are, plainly put, useless. In fact, only about 1% of the parameter configurations currently specified in the URL Parameters tool are useful for crawling. Due to the low value of the tool both for Google and Search Console users, we're deprecating the URL Parameters tool in 1 month.

While not mentioned, I suspect that some users might have been hurting themselves with the tool. I ran into this in the past where someone put in a wrong setting that said the content didn't change, but it did. This knocked a few hundred thousand pages out of the index for that site. Whoops!

You can let Google crawl and figure out how to handle the parameters for you, but you also have some controls you can leverage. Let's look at your options.

Canonical tags

A canonical tag can help consolidate signals to a chosen URL but requires each additional version of a page to be crawled. As I mentioned earlier, Google may make adjustments as it recognizes patterns, and these canonicalized URLs may be crawled less over time.
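As a sketch, the canonical tag is just a link element in the head of the parameterized page pointing at the preferred version (the URLs here are placeholders):

```html
<!-- In the <head> of https://example.com/shoes?sessionid=123 -->
<link rel="canonical" href="https://example.com/shoes">
```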

This is what I'd go for by default. But if a site has a ton of issues and parameters are out of control, I may look at some of the other options.


Noindex

A noindex meta robots tag removes a page from the index. This requires a page to be crawled. But again, it may be crawled less over time. If you need signals to consolidate to other pages, avoid using noindex.
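For reference, the tag goes in the head of the page you want removed from the index:

```html
<meta name="robots" content="noindex">
```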

Blocking in robots.txt

Blocking parameters in robots.txt means that the pages might still get indexed. They're not likely to show in normal searches.

The problem is that these pages won't be crawled and won't consolidate signals. If you want to consolidate signals, avoid blocking the parameters.
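If you've weighed that trade-off and still want to block parameterized URLs (e.g., for crawl budget, as mentioned earlier), a minimal robots.txt sketch using the wildcard pattern Google's crawler supports:

```
# Block crawling of any URL that contains a "?".
# Use with care: blocked pages cannot consolidate signals.
User-agent: *
Disallow: /*?
```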

Site Audit

When setting up a project in Site Audit, there's a toggle in the crawl settings called "Remove URL Parameters" that you can use to ignore any URLs with parameters.

You can also exclude parameterized URLs in the crawl setup using pattern matching.

Blocking a parameter in the crawl setup
Blocking a parameter in Site Audit.


Fun fact: We only count the canonicalized version of pages toward your crawl credits.

Final thoughts

Just to summarize, URL parameters have a lot of different use cases, and they may or may not cause issues for your site. Everything is situational.

Message me on Twitter if you have any questions.


