Google is Eliminating URL Parameters Tool Soon!

April 9, 2022

A Google tool that has been available since 2009 is about to be phased out: the URL Parameters tool used by webmasters will soon be decommissioned.

The URL Parameters tool will be retired around the end of April 2022, about a month after the announcement. URL parameters are appended to a core URL to help filter or organise content.

Additional query strings that follow the main URL for a page, usually after a question mark, are referred to as URL parameters. Multiple query string parameters can be used within a single URL, which can have several negative effects on a website’s SEO and organic search performance, including duplicate content and crawling issues.
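
To make this concrete, here is a minimal sketch using Python’s standard urllib.parse module; the example.com URL and its parameters are hypothetical:

    from urllib.parse import urlparse, parse_qs

    # A product page URL followed by filter, sort, and session parameters.
    url = "https://example.com/shoes?colour=red&sort=price&sessionid=abc123"
    parsed = urlparse(url)

    print(parsed.path)             # /shoes -- the core URL
    print(parse_qs(parsed.query))  # {'colour': ['red'], 'sort': ['price'], 'sessionid': ['abc123']}

From a crawler’s point of view, every distinct combination of these parameters is a distinct URL, even when the underlying page is the same.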

What is the URL Parameter Tool?

In 2009, Google announced the URL Parameters tool as a parameter management tool that let site owners tell Google to ignore particular URL parameters or combinations of parameters. Two years later, in 2011, Google enhanced it to handle a larger range of parameter situations. You can use the tool to stop Google from crawling parameter-based URLs on your website, and it remains available until April 26, 2022.

The URL Parameters tool gave site owners granular control over how Google crawled their site by letting them specify how certain parameters affected the content. Over time, Google has become much better at predicting which parameters are useful on a site and which are, to put it bluntly, meaningless. Announcing the retirement, Google said that crawling benefits from only about 1% of the parameter configurations currently specified in the tool, and that it is retiring the feature because of its low value for both Google and Search Console users.

Issues with URL Parameters

  • Parameters Create Duplicate Content: URL parameters frequently leave a page’s content unchanged. A re-ordered version of a page is usually identical to the original, and a page URL with tracking elements or a session ID serves the same content as the clean original URL (see the sketch after this list).
  • Parameters Split Page Ranking Signals: If many permutations of a page exist, links and social shares may arrive on different copies of the same content, diluting your ranking signals. A perplexed crawler also has trouble deciding which of the competing pages to index for a given search query.
  • Parameters Waste Crawl Budget: Crawling redundant parameter pages depletes your crawl budget, makes it harder to get SEO-relevant pages indexed, and increases server load. Google has summed it up well: “Overly intricate URLs, particularly ones with many parameters, may cause crawlers problems by generating an excessive number of URLs that point to the same or comparable content on your site.”
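
As a rough illustration of the duplicate-content problem, here is a hypothetical Python sketch that normalises such URLs; the TRACKING_PARAMS set and the URLs are assumptions for illustration, not anything Google prescribes:

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    # Hypothetical set of parameters that never change the page content.
    TRACKING_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign"}

    def canonicalise(url: str) -> str:
        parts = urlparse(url)
        # Drop tracking parameters and sort the rest for a stable order.
        params = sorted((k, v) for k, v in parse_qsl(parts.query)
                        if k not in TRACKING_PARAMS)
        return urlunparse(parts._replace(query=urlencode(params)))

    a = "https://example.com/shoes?sort=price&colour=red&sessionid=abc123"
    b = "https://example.com/shoes?colour=red&sort=price"
    print(canonicalise(a) == canonicalise(b))  # True -- both collapse to one canonical URL

In practice, sites usually express the same intent declaratively with a rel="canonical" link element rather than in application code.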

A Few Basic Scenarios

  • Removing a Single URL: In general, for a removal request to succeed, the owner of the URL(s) in question, whether that is you or someone else, must have indicated that the content can be removed. This can be indicated for a specific URL in one of three ways (each is illustrated after the list):
  1. Using a robots.txt file, you can prevent the page from being crawled.
  2. Using a noindex meta tag, you can prevent the page from being indexed.
  3. Returning a 404 or 410 status code indicates that the page no longer exists.
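
For reference, the three signals look roughly like this; the page path is hypothetical:

    # robots.txt -- prevents the page from being crawled
    User-agent: *
    Disallow: /private-page.html

    <!-- noindex meta tag, placed inside the page's <head> -->
    <meta name="robots" content="noindex">

    HTTP/1.1 404 Not Found    (status line returned for a page that no longer exists)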

Before submitting a removal request, make sure the URL is properly blocked:

  1. Robots.txt: To check if the URL is correctly blocked, utilise Webmaster Tools’ Fetch as Googlebot or Test robots.txt functionality.
  2. Noindex Meta Tag: Use Fetch as Googlebot to ensure that the meta tag appears somewhere between the <head> and </head> tags. If you want to double-check a page that you can’t verify in Webmaster Tools, open it in a browser, navigate to View > Page Source, and check that the meta tag sits between the <head> and </head> tags.
  3. 404/410 Status Code: To check whether the URL is returning the correct code, use Fetch as Googlebot or tools like Live HTTP Headers or web-sniffer.net. It’s common for “deleted” pages to display “404” or “Not Found” in the page body yet return a 200 status code in the header, so it’s a good idea to double-check with a proper header-checking tool (a rough command-line sketch follows this list).
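
If you want a quick local check alongside those tools, here is a rough Python sketch under the same assumptions (hypothetical URLs; requests is a third-party library):

    import urllib.robotparser
    import requests

    # 1. Is the URL disallowed for Googlebot in robots.txt?
    rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()
    print(rp.can_fetch("Googlebot", "https://example.com/private-page.html"))  # False if blocked

    # 2 and 3. What status code does the server actually return? A page whose
    # body says "404" can still send 200 in its headers.
    resp = requests.get("https://example.com/removed-page.html")
    print(resp.status_code)                   # expect 404 or 410 for a removed page
    print(resp.headers.get("X-Robots-Tag"))   # noindex can also be sent as an HTTP header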

Final Thought

At Alkye Services, we help our clients stay ahead of the market, competitors, and technological trends. We can also help you design and develop new apps and websites using cutting-edge technology. Our goal is to help you grow your business while staying one step ahead of the competition and relevant to your customers.

Tags: Google, Machine Learning, SEO

Words by
Nicola Bond
