
Fetch and Render: How to Inspect Any URL for Maximum Coverage

Learn how to use Google's Fetch and Render tool
Holly Stanley
Freelance Content Writer for B2B, SaaS, and Digital Marketing Brands
March 23, 2023
October 14, 2020

While ensuring your site has relevant and useful content should be an integral part of any SEO strategy, sometimes it’s easy to forget about the finer points of technical SEO.

Fetch and Render, now known as “Inspect any URL,” is a free Google SEO tool that you can use to streamline your SEO efforts and discover glitches that may be holding your site back.

With so many details to consider, technical SEO can seem overwhelmingly complex. It doesn’t have to be though.

At its core, technical SEO is all about ensuring your web pages are crawlable by Google search bots so that they appear when users search for them and rank well. 

Find out how to use Fetch and Render and why it’s so important for boosting your pages’ rankings. 

What Is Fetch and Render (Inspect any URL)? 

The “Inspect any URL” tool is located within Google Search Console and allows you to check how Google crawls a web page as well as how it will appear on both desktop and mobile devices. 

Since the tool was first created, Google has made several upgrades to Search Console, and the platform now looks quite different.

A screenshot of Google search console

Instead of using the terms Fetch and Render, the tool is now called ‘Inspect any URL’. 

Why Is Inspecting URLs Important?

Using Search Console’s “Inspect any URL” allows you to detect and fix errors that may be occurring across your web pages. It shows you the page elements that search spiders are blocked from seeing. The bots also run through all of the resources the page references, meaning Google will tell you whether linked images, JavaScript, and separate CSS files are accessible. 
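A blocked resource usually traces back to a robots.txt rule. As a hypothetical illustration (the paths here are placeholders, not from any real site), rules like these would stop Googlebot from fetching a site’s scripts and stylesheets, which the tool would then flag:

```txt
# Hypothetical robots.txt rules that hide resources from Googlebot.
# “Inspect any URL” would report the blocked scripts and styles.
User-agent: Googlebot
Disallow: /assets/js/
Disallow: /assets/css/
```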

The tool will help you build a clear picture of how Google views your site. You can: 

  • Identify elements that may be hidden within the page
  • Debug crawl issues
  • Check hacked pages without compromising safety

An essential step in optimising your technical SEO, it’s a highly useful tool for pinpointing the finer technical details that could be holding your site back from performing well in Google rankings. 

How to Use the “Inspect any URL” Tool 

While Search Console’s “Inspect any URL” tool is fairly straightforward once you’ve got your head around it, it’s still important to know how to use it to prevent any confusion further down the line. 

1. Log in to Google Search Console and select URL Inspection in the left-hand navigation bar. Paste the URL you want to inspect in the search bar.

A screenshot of Google search console URL inspection

2. Check whether Google could fetch your URL, when the last crawl took place, and whether the page is indexed. 

A screenshot of Google search console page fetch

3. View the crawled page as Google sees the code. If your page doesn’t appear on Google in the way you want, it’s likely the answer lies in the code that the search bots crawl. 

Google search console - view crawled page as Google sees the code

4. Check if Google considers your page to be mobile-friendly. Google now uses mobile-first indexing with priority given to pages that are easy to use from a mobile device. If your page isn’t mobile-friendly, it’s time to make some improvements so Google starts rewarding it with rankings. 

Mobile usability check on Google search console

5. Request indexing if you find that your page doesn’t show up in Google search results. 

Request indexing on Google search console
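One of the most common reasons a page fails the mobile usability check is a missing viewport declaration, which Search Console reports as “Viewport not set”. A minimal fix, assuming a standard responsive layout, is a viewport meta tag in the page’s head:

```html
<!-- A responsive viewport meta tag; its absence is a common reason a
     page fails Google's mobile usability check. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```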

How to Decode Page Fetch Statuses 

Now you’ve gone through the process, you’re probably wondering what the different statuses mean and how you can use them to make active improvements to your site. 

Complete: This is the ideal status that you want to obtain. Complete means that Google was able to connect with your site and any referenced resources. 

Partial: Having a partial status means Google was able to connect with and fetch your site but the robots.txt file blocked a resource referenced by the page. By clicking on the status, the tool will show you which resource was blocked and how severe the blockage is. This is key information for figuring out which technical glitches need fixing. 

Not Authorized: This is usually caused by a 403 error or any other type of restricted access.

Redirected: The tool only checks the exact URL requested. If the web page includes a redirect to another page, it will receive a redirect status. 

Not Found: This status normally comes from a 404 error, but it can also appear when a site is hacked and Google doesn’t want to index the page. 

Unreachable: This status is usually caused by a timeout issue. 

Unreachable robots.txt: Google is unable to reach the robots.txt file on the host serving the resource; you probably need to test and update the robots.txt file. 

DNS Not Found: Google couldn’t resolve your domain name. Check the URL for typos and confirm the site isn’t experiencing downtime.

Temporarily Unreachable: This status is usually down to a timeout error or too many fetch requests.

Error: This is the least common status. It is the result of an unknown error. If you keep having problems with an error status appearing, it’s best to post on the Google help forum. 
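As a rough mental model, many of these statuses map onto ordinary HTTP responses. The sketch below is a hypothetical helper, not Google’s actual logic: it pairs common HTTP status codes with the fetch status you would most likely see for them.

```python
def likely_fetch_status(http_status):
    """Map an HTTP response code to the fetch status it most often
    produces. A rough illustration only, not Google's real decision tree.
    Pass None for a request that timed out or never connected."""
    if http_status is None:           # timeout / connection failure
        return "Unreachable"
    if http_status == 403:            # restricted access
        return "Not Authorized"
    if http_status == 404:            # missing (or deliberately removed) page
        return "Not Found"
    if 300 <= http_status < 400:      # the tool only checks the exact URL
        return "Redirected"
    if http_status == 200:            # fetched successfully
        return "Complete"
    return "Error"                    # anything else: investigate manually

print(likely_fetch_status(403))   # Not Authorized
print(likely_fetch_status(301))   # Redirected
```

Remember this is only a heuristic: a 200 response can still yield a Partial status if robots.txt blocks a referenced resource.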

When to Use Inspect Any URL if You Don’t Want Pages to Be Crawled by Google

There are some specific situations when you may not want Google to crawl your web pages and rank them in the SERPs. This is particularly true if your site has a members-only area or requires a subscription to access specific content. 

In these cases, you can use the tool to confirm that these pages are kept out of Google’s index and won’t surface for people clicking through from search results. 
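Keeping a subscriber page out of search results is typically done with a noindex directive, which you can then verify with “Inspect any URL”. A hypothetical snippet for such a page:

```html
<!-- Hypothetical tag for a members-only page: noindex keeps the page
     out of Google's search results even if Googlebot crawls it. -->
<meta name="robots" content="noindex">
```

Note that a robots.txt Disallow rule alone only blocks crawling; noindex is what actually keeps an already-discovered page out of the results.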

The Bottom Line on Fetch and Render or Inspect any URL

To ensure your pages can rank well, they first need to be crawlable by Google. It’s best to check your pages can be crawled by Google search bots every so often. If you make any changes or updates to your site, you should run these pages through Inspect any URL to verify that Google hasn’t detected any issues. 

The more crawlable your web pages are and the easier you make life for Google, the more Google will reward your content. 

You're reading Human-Led SEO

A regular column dedicated to illustrating how a searcher-first approach to SEO enables businesses to generate more revenue in less time from organic search.
