Crawl Budget

What Is a Crawl Budget?

A crawl budget is the total number of pages or URLs on a website that a crawler bot, or spider, crawls and indexes in a day. It is based on two things:

1. Crawl Rate Limit

The crawl rate limit determines how many pages a crawler bot can crawl on a site, and how fast it can do so. This depends on the site's 'crawl health' and on whether the site owner has set a limit in Google Search Console.

2. Crawl Demand

The crawl demand measures how much the crawler bot wants to crawl a website. This is determined by how popular the site is and how regularly it is updated.

Why Is Crawl Budget so Important?

Crawl budget is important because it determines how readily a website’s pages are found by crawler bots and how quickly new content is identified and indexed.

How Does a Crawler Work?

A crawler spider is a lot less frightening than its real-life equivalent. Crawlers are actually very helpful to website owners and end users.

They work by scouring the web, jumping from link to link, looking for new pages or updated content.

When they find a new page, they copy its content and store it in the index. This information is later processed by Google’s algorithms.
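
To make this concrete, here is a minimal, hypothetical crawler sketch in Python. Real crawlers such as Googlebot are far more sophisticated (they respect robots.txt, schedule requests politely and can render JavaScript); the start URL and page limit below are placeholders.

```python
# A minimal, hypothetical crawler sketch: fetch a page, store it in a simple
# index, extract its links and follow them, up to a small page limit.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, max_pages=10):
    queue, seen, index = [start_url], set(), {}
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # skip pages that fail to load
        index[url] = html  # "copy its content and store it in the index"
        parser = LinkExtractor()
        parser.feed(html)
        # Jump from link to link: queue every URL found on this page.
        queue.extend(urljoin(url, link) for link in parser.links)
    return index

pages = crawl("https://example.com")
print(f"Crawled and indexed {len(pages)} page(s)")
```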

What Factors Affect Crawl Budget?

A wide range of factors can negatively affect crawl budget. Try to avoid the following: faceted navigation and session-identifier URLs, on-site duplicate content, soft 404 error pages, hacked pages, infinite spaces (such as endless calendar pages), and low-quality or spam content.

How Can You Optimise Your Crawl Budget?

As indicated above, spiders are drawn to quality content, so to optimise your crawl budget, make sure your pages deliver on that front and are crawlable.

Below are some of the key strategies for optimising crawl budget.

1. Update Content 

Rewrite any pages with weak content and update them regularly. Make sure all your content is unique, and add new pages. This will increase crawl demand.

2. Improve Site Speed

The faster your site responds, the more requests it can handle. This will improve the crawl rate limit.
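
As a rough do-it-yourself check, you can time how long a page takes to respond; the URL below is a placeholder, and tools like Google PageSpeed Insights will give you far more detail.

```python
# A rough spot-check of server response time; the URL is a placeholder.
import time
from urllib.request import urlopen

url = "https://example.com"
start = time.perf_counter()
with urlopen(url, timeout=10) as response:
    response.read()  # download the full page body
elapsed = time.perf_counter() - start
print(f"{url} responded in {elapsed:.2f}s")
```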

[Screenshot: Google PageSpeed Insights diagnostics]

3. Watch out for Flash, AJAX and JavaScript

You need to ensure that a web crawler bot can crawl your site as easily as possible. For this reason, avoid using Flash, AJAX and JavaScript in your site’s navigation.
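
One simple way to check what a basic crawler sees: fetch the page's raw HTML, without executing any JavaScript, and list the anchor links in it. This hypothetical Python sketch assumes the page is publicly reachable; any navigation built purely with Flash, AJAX or JavaScript won't appear in its output.

```python
# Sketch: list the links a simple, non-JavaScript-rendering crawler would see.
from html.parser import HTMLParser
from urllib.request import urlopen

class AnchorLister(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                print(href)

raw = urlopen("https://example.com", timeout=5).read().decode("utf-8", "replace")
AnchorLister().feed(raw)
```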

4. Make Sure You Use Internal Links

Crawler bots like plenty of internal links because they can then navigate your site easily and index it more quickly.
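
If you want a quick, hypothetical way to gauge this, the sketch below classifies a page's links as internal or external; the base URL and hrefs are placeholder sample data, and in practice you would extract the hrefs from the page (for example with the parser shown earlier).

```python
# Sketch: classify a page's links as internal or external.
from urllib.parse import urljoin, urlparse

base_url = "https://example.com/blog/"  # placeholder page URL
hrefs = ["/about", "pricing", "https://other-site.com/", "#top"]  # sample data

site = urlparse(base_url).netloc
for href in hrefs:
    absolute = urljoin(base_url, href)  # resolve relative links
    kind = "internal" if urlparse(absolute).netloc == site else "external"
    print(f"{kind:8} {absolute}")
```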

5. Block Sections in Your Site

If there are parts of your website that are no longer relevant or in use, block them so the bots don’t crawl them. You can do this using robots.txt.
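
For example, a robots.txt file at the root of your site might contain rules like these (the paths are placeholders; substitute the sections you actually want to block):

```
User-agent: *
Disallow: /old-archive/
Disallow: /internal-search/
```

Bear in mind that robots.txt only stops compliant bots from crawling those URLs; if other sites link to a blocked page, its address can still end up in the index.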

How Can We Help?

Feeling overwhelmed by the tasks needed to improve your crawl budget and need some help?

Get in touch and we could team you up with some awesome marketers to work through them step by step.

Alternatively, if you think you can handle it but just need that push, check out our training courses, where we’ll show you our favourite and most effective SEO practices that you can then implement in your own time.