A Beginner’s Guide to URL Parameters

Michelle Ofiwe

Jul 17, 2023 · 6 min read

Although URL parameters are invaluable in the hands of seasoned SEO professionals, they often present serious challenges for your website’s rankings.

In this guide, we’ll share the most common SEO issues to watch out for when working with URL parameters.

But before that, let's go over some basics.

What Are URL Parameters? 

URL parameters (also known as query strings or URL query parameters) are elements inserted into your URLs to help you filter and organize content or implement tracking on your website.

To identify a URL parameter, look at the portion of the URL that comes after a question mark (?). 

URL parameters include a key and a value that are separated by an equals sign (=). Multiple parameters are then separated by an ampersand (&).

A complete URL string with parameters looks like this:

“https://www.domain.com/page?color=blue&sort=newest”

In the example above, there are two parameters: 

  1. “color” with the value “blue”
  2. “sort” with the value “newest”

This filters a webpage to display products that are blue and arranges them starting with the most recent ones. 

URL parameters vary depending on the specific keys and values, and can include many different combinations.

But the basic structure (shown below) will always be something like “https://www.domain.com/page?key1=value1&key2=value2.” 

an example of the basic structure of a URL with parameters

And here’s what each part means:

  • ?: Marks the beginning of the query string
  • key1: First variable (parameter) name
  • =: Separates a key from its value
  • value1: Value assigned to the first key
  • &: Separates one parameter from the next
  • key2: Second variable name
  • value2: Value assigned to the second key

There can also be additional keys and values to form more complex URL parameters.
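
If you want to see how this structure breaks down programmatically, here’s a minimal sketch using Python’s built-in urllib.parse module (the URL reuses the example structure above; swap in your own):

from urllib.parse import urlparse, parse_qs

url = "https://www.domain.com/page?color=blue&sort=newest"

parsed = urlparse(url)
print(parsed.path)   # /page
print(parsed.query)  # color=blue&sort=newest

# parse_qs splits the query string into a dictionary of keys and values
print(parse_qs(parsed.query))  # {'color': ['blue'], 'sort': ['newest']}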

How to Use URL Parameters (with Examples)

URL parameters are commonly used to sort content on a page, making it easier for users to navigate products, like in an online store.

These query strings allow users to order a page according to their specific needs. 

Tracking parameters are just as common.

They’re often used by digital marketers to monitor where traffic comes from, so they can track how their social media posts, advertising campaigns, and email newsletters contribute to website visits.

Here’s what both tracking and sorting parameters look like:

an example of tracking and sorting parameter

How Do URL Parameters Work? 

According to Google, there are two main types of URL parameters, and the way they work depends on the type:

  1. Content-modifying parameters (active): Parameters that modify the content displayed on the page. For example, “https://domain.com/t-shirts?color=black” will update the page to show black T-shirts.
  2. Tracking parameters (passive): Parameters that record information, such as which network users came from, which campaign or ad group a user clicked on, etc., but don’t change the content on the page. Custom URLs can be used for advanced tracking.
  • For example, “https://www.domain.com/?utm_source=newsletter&utm_medium=email” will track traffic from an email newsletter.
  • And “https://www.domain.com/?utm_source=twitter&utm_medium=tweet&utm_campaign=summer-sale” will track traffic from a Twitter campaign.
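
If you assemble tracking links by hand, it’s easy to introduce typos. Here’s a minimal sketch of how the newsletter example above could be built with Python’s urllib.parse.urlencode (the values simply mirror that example):

from urllib.parse import urlencode

base_url = "https://www.domain.com/"

# Tracking parameters from the newsletter example
utm_params = {
    "utm_source": "newsletter",
    "utm_medium": "email",
}

print(f"{base_url}?{urlencode(utm_params)}")
# https://www.domain.com/?utm_source=newsletter&utm_medium=email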

URL Query String Examples

We’ve already covered a few different ways query strings can be beneficial. 

But there are many common uses for URL parameters, including:

  • Tracking (e.g., “?utm_source=newsletter”)
  • Sorting (e.g., “?sort=newest”)
  • Searching (e.g., “?q=shoes”)
  • Identifying (e.g., “?id=123”)
  • Paginating (e.g., “?page=2”)
  • Translating (e.g., “?lang=en”)
  • Filtering (e.g., “?color=blue”)

When Do URL Parameters Become an SEO Issue?

Many SEO professionals suggest staying away from URL parameters as much as possible. 

This is because no matter how useful URL parameters are, they create crawlability and indexability issues.

Poorly structured, passive URL parameters that don’t change the content on the page can create endless URLs with the same content. 

The most common SEO issues caused by URL parameters are:

1. Duplicate content: Search engines treat every URL as a separate page, so multiple versions of the same page created by URL parameters might be considered duplicate content. A page reordered according to a URL parameter is often very similar to the original page, and some parameters return exactly the same content as the original (see the short example after this list).

2. Crawl budget waste: Complex URLs with multiple parameters create many different URLs that point to identical (or similar) content. According to Google, crawlers might end up wasting bandwidth or have trouble indexing all content on the website.

3. Keyword cannibalization: Filtered versions of the original URL target the same keyword group. This leads to multiple pages competing for the same keywords. This can confuse search engines about which of the competing pages should be ranking for the keyword.

4. Diluted ranking signals: When you have multiple URLs with the same content, people might link to any parameterized version of the page. This splits link equity across several URLs, which may lead to your main page not ranking well overall.

5. Poor URL readability: A parameterized URL is virtually unreadable for users. When displayed in the SERPs, the parameterized URL looks untrustworthy, making it less likely for users to click on the page.
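
As a quick illustration of the duplicate content problem, two URLs that differ only in parameter order are separate URLs to a search engine, even though they usually serve identical content. A minimal sketch, reusing the earlier example URL:

from urllib.parse import parse_qs, urlsplit

# Two different URL strings...
url_a = "https://www.domain.com/page?color=blue&sort=newest"
url_b = "https://www.domain.com/page?sort=newest&color=blue"
print(url_a == url_b)  # False -> crawlers see two distinct URLs

# ...that carry exactly the same parameters (and likely the same content)
print(parse_qs(urlsplit(url_a).query) == parse_qs(urlsplit(url_b).query))  # True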

How to Manage URL Parameters for Good SEO

Most of the SEO issues mentioned above stem from the same root cause: search engines crawling and indexing every parameterized URL.

But thankfully, SEOs aren’t powerless against the endless creation of new URLs via parameters.

Here are some solutions you can implement.

Use Consistent Internal Linking

If your website has many parameterized URLs, it’s important to signal to crawlers which version of a page should be indexed by consistently linking to the static, non-parameterized URL. 

For example, here are a few parameterized URLs from an online shoe store: 

example of a few parameterized URLs from an online shoe store

In cases like these, it’s important to be careful and consistently add internal links only to the static page—never to the versions with parameters. 

This way, you’ll send consistent signals to search engines as to which version of the page is important and should be indexed.
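
One way to keep that signal consistent is to periodically check whether any internal links still point at parameterized URLs. Here’s a minimal sketch using only Python’s standard library; the sample HTML and shoe-store paths are purely illustrative:

from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkChecker(HTMLParser):
    """Collects <a href> values and flags links that carry query parameters."""

    def __init__(self):
        super().__init__()
        self.parameterized_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if urlparse(href).query:  # non-empty query string -> parameterized link
            self.parameterized_links.append(href)

html = '<a href="/shoes/women-shoes">All</a> <a href="/shoes/women-shoes?color=blue">Blue</a>'
checker = InternalLinkChecker()
checker.feed(html)
print(checker.parameterized_links)  # ['/shoes/women-shoes?color=blue'] -> link to the static page instead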

Canonicalize One Version of the URL

Set up canonical tags on the parameterized URLs, referencing your preferred URL for indexing.

If you’ve created parameters to help users navigate your online shoe shop, all URL variations should include the canonical tag identifying the main page as the canonical page. 

This means that in the image below, “https://www.domain.com/shoes/women-shoes?color=blue” and “https://www.domain.com/shoes/women-shoes?type=high-heels” should reference a canonical link to “https://www.domain.com/shoes/women-shoes.”

example of URL variations with canonical page /shoes/women-shoes/

This sends a signal to crawlers that only the canonical, main page should be indexed, not the parameterized URLs.
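
If your pages are generated dynamically, the canonical URL can be derived by stripping the query string from the requested URL. Here’s a minimal sketch in Python; the helper name and the shoe-store URL are just for illustration:

from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url: str) -> str:
    # Drop the query string and fragment, then build the canonical link tag
    parts = urlsplit(url)
    canonical_url = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{canonical_url}" />'

print(canonical_tag("https://www.domain.com/shoes/women-shoes?color=blue"))
# <link rel="canonical" href="https://www.domain.com/shoes/women-shoes" />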

Block Crawlers via Disallow

If you’re facing crawl budget issues, you can choose to block crawlers from accessing your parameterized URLs using your robots.txt file.

Crawlers check a site’s robots.txt file before crawling it, and well-behaved bots follow its instructions about which pages to avoid.

The following robots.txt rule will disallow any URLs featuring a question mark—i.e., your parameterized URLs:

User-agent: *
Disallow: /*?*
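
If you want to sanity-check what a wildcard rule like this actually matches, here’s a minimal sketch of simplified wildcard matching in Python (an approximation for testing, not the exact algorithm crawlers use):

import re

def robots_rule_matches(pattern: str, url_path: str) -> bool:
    # Escape regex metacharacters, then turn robots.txt "*" wildcards into ".*"
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, url_path) is not None

rule = "/*?*"  # the Disallow pattern from above
print(robots_rule_matches(rule, "/shoes/women-shoes?color=blue"))  # True  -> blocked
print(robots_rule_matches(rule, "/shoes/women-shoes"))             # False -> still crawlable

Keep in mind that this rule blocks every URL containing a question mark, so use it only if none of your parameterized URLs need to be crawled.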

Using Semrush’s Site Audit Tool

When you want to get an overview of your website’s SEO health, it’s important to avoid crawling parameterized URLs so you only audit the URLs that matter. 

When setting up a Semrush Site Audit, you can configure the tool so that it excludes parameterized URLs from crawling. Here’s what the setup process looks like.

First, open the tool. Enter your domain and click “Start Audit.”

Site Audit tool with an arrow pointing to “Start Audit” button

The “Site Audit Settings” window will pop up.

“Site Audit Settings” window

Click “Remove URL parameters” and list the parameters you want to avoid crawling. 

For example, if you want to exclude your pagination parameters (“?page=1,” “?page=2,” “?page=3,” etc.), enter “page” in the box to the right of the tab.

“Remove URL parameters” section in Site Audit’s Settings

This will ensure the tool avoids crawling URLs that include the key “page” in their URL parameters.

After you list all the parameters you want to ignore, click “Start Site Audit.”

The tool will generate a report, providing you with an overview of your site’s technical health.

"Site Health" section highlighted in the report

It will also show some of the top issues it found on your site.

"Top issues" section in Site Audit results

Then, you can review each issue and take steps to fix it.

Handling URL Parameters for SEO

Parameterized URLs make it easier to modify content or implement tracking on your site, so it’s worth using them when you need to.

You’ll need to let web crawlers know whether to crawl URLs with parameters and highlight the version of the page that’s most valuable.

Take your time and decide which of your URLs shouldn’t be indexed. With time, web crawlers will better understand how to navigate and treat your site’s pages.
