Peak Positions SEO

Crawl Rate Limiter Tool in Search Console to be Deprecated

November 24th, 2023

Google has announced that the crawl rate limiter tool in Search Console will be deprecated on January 8th, 2024, ending more than a decade of the tool's availability to Search Console users.

The crawl rate refers to the number of requests made by Googlebot to your site per second while it is actively crawling, such as 5 requests per second.

Googlebot’s crawling behavior is influenced by the responses it receives from a website’s server. For instance, persistent HTTP 500 status codes or prolonged response times prompt Googlebot to automatically slow down its crawling activities. If a site experiences overwhelming crawling that it can’t handle, Google has a help article for more information on how to reduce crawl rates.

In contrast, the now-deprecated rate limiter tool had a slower impact, often taking more than a day for new limits to take effect. The tool was rarely used by site owners, and those who did use it typically set the crawling speed to the minimum. With the tool's deprecation, Google is lowering the minimum crawling speed to a rate comparable to the old crawl rate limits. This ensures that, even with low Search interest, Google's crawlers respect past settings, preventing unnecessary bandwidth consumption for the site.

Citing advancements in automated crawl rate management and a commitment to user-friendly simplicity, Google has decided to phase out the crawl rate limiter tool in Search Console. While the Googlebot report form will still be available for reporting unusual Googlebot activity and emergencies, the most efficient way to adjust crawl rates is by instructing Googlebot through server responses.

How to Reduce Crawl Rate with Server Responses

Google has stated that if you urgently need to decrease the crawl rate temporarily, for a few hours or up to 1-2 days, you can return an informational error page with a 500, 503, or 429 HTTP response status code instead of serving regular content. Googlebot naturally lowers the crawling rate for your site when it encounters a significant number of URLs with these error codes. This adjustment affects both the crawling of the URLs returning these errors and crawling of the site overall. Once the frequency of these errors decreases, the crawl rate will automatically increase again. Be sure to work with an experienced web developer to make these changes, and only keep them active for 1-2 days to prevent drops in rankings.
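As a minimal sketch of the approach above, the following Python snippet (using only the standard-library `http.server`) returns a 503 error page with a `Retry-After` header while a site is temporarily overloaded, and regular content otherwise. The `TEMPORARILY_OVERLOADED` flag and the port number are illustrative assumptions; a real site would wire this into its actual server stack (nginx, Apache, a CDN rule, etc.) rather than a toy handler.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed flag for this sketch: flip back to False within 1-2 days,
# per Google's guidance, to avoid ranking drops.
TEMPORARILY_OVERLOADED = True

def crawl_limit_status(overloaded: bool) -> int:
    """Return 503 while overloaded so Googlebot backs off, else 200."""
    return 503 if overloaded else 200

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        status = crawl_limit_status(TEMPORARILY_OVERLOADED)
        self.send_response(status)
        if status == 503:
            # Hint when crawlers may retry; Googlebot also slows
            # its crawl rate automatically when it sees many 503s.
            self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        if status == 503:
            self.wfile.write(b"<h1>Temporarily unavailable</h1>")
        else:
            self.wfile.write(b"<h1>Regular content</h1>")

# To run locally (blocks forever), uncomment:
# HTTPServer(("", 8080), Handler).serve_forever()
```

Returning the error only while the site is genuinely overloaded, rather than hard-coding it, keeps the slowdown self-correcting: once the server recovers and starts serving 200s again, Googlebot's crawl rate ramps back up on its own.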


Our SEO firm comprises a world-class team of qualified coders, web designers, developers, and SEO copywriters who strive to bring clients the highest-quality SEO services in the industry.
